What were the estimated economic costs of World War I?
Economic history of World War I (Wikipedia). The economic history of World War I covers the methods used by the combatant nations of the First World War (1914–1918), as well as related postwar issues such as war debts and reparations. It also covers the economic mobilization of labor, industry and agriculture. It deals with economic warfare such as the blockade of Germany, and with some issues closely related to the economy, such as military issues of transportation. For a broader perspective see Home front during World War I. All of the powers in 1914 expected a short war; none had made any economic preparations for a long war, such as stockpiling food or critical raw materials. The longer the war went on, the more the advantages went to the Allies, with their larger, deeper, more versatile economies and better access to global supplies. As Broadberry and Harrison conclude, once stalemate set in late in 1914, the Allies had much more potential wealth they could spend on the war. One estimate (using 1913 US dollars) is that the Allies spent $147 billion on the war and the Central Powers only $61 billion. Among the Allies, Britain and its Empire spent $47 billion and the U.S. $27 billion (America joined after the war started), while among the Central Powers, Germany spent $45 billion. Total war demanded total mobilization of all the nation's resources for a common goal. Manpower had to be channeled into the front lines (all the powers except the United States and Britain had large trained reserves designed just for that). Behind the lines, labor power had to be redirected away from less necessary activities that were luxuries during a total war. In particular, vast munitions industries had to be built up to provide shells, guns, warships, uniforms, airplanes, and a hundred other weapons both old and new.
Agriculture had to be mobilized as well, to provide food both for civilians and for soldiers (many of whom had been farmers and needed to be replaced by old men, boys and women) and for the horses that moved supplies. Transportation in general was a challenge, especially when Britain and Germany each tried to intercept merchant ships headed for the enemy. Finance was a special challenge. Germany financed the Central Powers. Britain financed the Allies until 1916, when it ran out of money and had to borrow from the United States. The U.S. took over the financing of the Allies in 1917 with loans that it insisted be repaid after the war. The victorious Allies looked to defeated Germany in 1919 to pay reparations that would cover some of their costs. Above all, it was essential to conduct the mobilization in such a way that the short-term confidence of the people was maintained, the long-term power of the political establishment was upheld, and the long-term economic health of the nation was preserved. Gross domestic product (GDP) increased for three Allies (Britain, Italy, and the U.S.), but decreased in France and Russia, in the neutral Netherlands, and in the three main Central Powers. The shrinkage in GDP in Austria, Russia, France, and the Ottoman Empire reached 30 to 40%. In Austria, for example, most pigs were slaughtered, so at war's end there was no meat. The Western Front quickly stabilized, with almost no movement of more than a few hundred yards. The greatest single expenditure on both sides was for artillery shells, the chief weapon of the war. Since the front was highly stable, both sides built elaborate railway networks that brought supplies within a mile or two of the front lines, with horse-drawn wagons used for the final deliveries. In the ten-month battle at Verdun, the French and Germans fired some 10 million shells in all, weighing 1.4 million tons of steel. Economic warfare against Germany worked: the British blockade was effective.
The German counter-blockade with U-boats was defeated by the convoy system and massive American shipbuilding. Britain paid the war costs of most of its Allies until it ran out of money; then the US took over, funding those Allies and Britain as well. The British economy (in terms of GDP) grew about 7% from 1914 to 1918 despite the absence of so many men in the services; by contrast the German economy shrank 27%. The war saw a decline of civilian consumption, with a major reallocation to munitions. The government share of GDP soared from 8% in 1913 to 38% in 1918 (compared to 50% in 1943). Despite fears in 1916 that munitions production was lagging, the output was more than adequate. The annual output of artillery grew from 91 guns in 1914 to 8,039 in 1918. Warplane output soared from 200 in 1914 to 3,200 in 1918, while production of machine guns went from 300 to 121,000. In 1915, the Anglo-French Financial Commission agreed a $500 million loan from private American banks. By 1916, Britain was funding most of the Empire's war expenditures, all of Italy's, and two thirds of the war costs of France and Russia, plus smaller nations as well. The gold reserves, overseas investments and private credit then ran out, forcing Britain to borrow $4 billion from the U.S. Treasury in 1917–18. Shipments of American raw materials and food allowed Britain to feed itself and its army while maintaining its productivity. The financing was generally successful, as the City's strong financial position minimized the damaging effects of inflation, as opposed to much worse conditions in Germany. Overall consumer consumption declined 18% from 1914 to 1919. Trade unions were encouraged as membership grew from 4.1 million in 1914 to 6.5 million in 1918, peaking at 8.3 million in 1920 before relapsing to 5.4 million in 1923. Women were available, and many entered munitions factories and took other home front jobs vacated by men. Energy was a critical factor in the British war effort.
Most of the energy supplies came from coal mines in Britain, where the main issue was labour supply. Critical, however, was the flow of oil for ships, lorries and industrial use. There were no oil wells in Britain, so everything was imported. The U.S. pumped two-thirds of the world's oil. In 1917, total British consumption was 827 million barrels, of which 85 percent was supplied by the United States and 6 percent by Mexico. The great issue in 1917 was how many tankers would survive the German U-boats. Convoys and the construction of new tankers countered the German threat, while tight government controls guaranteed that all essential needs were covered. An oil crisis occurred in Britain due to the 1917 German submarine campaign; Standard Oil of New Jersey, for example, lost six tankers (including the brand-new John D. Archbold) between May and September. The only solution to the crisis lay in increased oil shipments from America. The Allies therefore formed the Inter-Allied Petroleum Conference, with the USA, Britain, France, and Italy as members, to allocate American supplies to Britain, France and Italy; Standard Oil and Royal Dutch/Shell ran it and made it work. The introduction of convoys as an antidote to the German U-boats and the joint management system by Standard Oil and Royal Dutch/Shell helped to solve the Allies' supply problems. The close working relationship that evolved was in marked contrast to the feud between the government and Standard Oil years earlier. In 1917 and 1918 there was increased domestic demand for oil, partly due to a cold winter that created a shortage of coal. Inventories and imported oil from Mexico were used to close the gap. In January 1918, the U.S. Fuel Administrator ordered industrial plants east of the Mississippi to close for a week to free up oil for Europe. Fuel oil for the Royal Navy was the highest priority.
In 1917, the Royal Navy consumed 12,500 tons a month, but had a supply of 30,000 tons a month from the Anglo-Persian Oil Company, using its oil wells in Persia. Clydeside shipyards before 1914 had been the busiest in the world, turning out more than a third of the entire British output. They expanded by a third during the war, primarily to produce transports of the sort that German U-boats were busy sinking. Confident of postwar expansion, the companies borrowed heavily to expand their facilities. But after the war, employment tumbled as the yards proved too big, too expensive, and too inefficient; in any case, world demand was down. The most skilled craftsmen were especially hard hit, because there were few alternative uses for their specialized skills. Ireland was on the verge of civil war in 1914 after Parliament voted a home rule law that was intensely opposed by Protestants, especially those in Ulster. When the war broke out the law was suspended, and Protestants gave very strong support to the war in terms of military service and industrial output. Occurring during Ireland's revolutionary period, the Irish Catholic experience of the war was complex, and its memory of it divisive. At the outbreak of the war, most Irish people, regardless of political affiliation, supported the war in much the same way as their British counterparts, and both nationalist and unionist leaders initially backed the British war effort. Their followers, both Catholic and Protestant, served extensively in the British forces, many in three specially raised divisions. Over 200,000 Irishmen fought in the war, in several theatres, with 30,000 deaths. In 1916, Catholic supporters of Irish independence from the United Kingdom took the opportunity of the ongoing war to proclaim an Irish Republic and to defend it in an armed rebellion against British rule in Dublin. The rebellion was poorly planned and quickly suppressed.
The British executed most of the prisoners, which caused Catholic opinion to surge in favour of independence. Britain's intention to impose conscription in Ireland in 1918 provoked widespread resistance, and as a result it remained unimplemented. The Commonwealth nations and India all played major roles. The Asian and African colonies provided large numbers of civilian workers, as well as some soldiers. The Indian Army during World War I contributed a large number of divisions and independent brigades to the European, Mediterranean and Middle East theatres of war. Over one million Indian troops served overseas, of whom 62,000 died and another 67,000 were wounded. Canada was prosperous during the war, but ethnic conflict escalated almost out of control. In terms of long-run economic trends, the war hardly affected the direction or the speed of change: the trajectory of the main economic factors, the business and financial system, and the technology continued on their way. Women temporarily took war jobs, and at the end of the war there was a great deal of unrest among union members and farmers for a few years. In Australia, Billy Hughes, prime minister from October 1915, expanded the government's role in the economy while dealing with intense debates over the issue of conscription. Historian Gerhard Fischer argues that the Hughes government aggressively promoted economic, industrial, and social modernization. However, Fischer also says it was done by means of exclusion and repression. He says the war turned a peaceful nation into "one that was violent, aggressive, angst- and conflict-ridden, torn apart by invisible front lines of sectarian division, ethnic conflict and socio-economic and political upheaval." In 1914 the Australian economy was small, but its population of five million was very nearly the most prosperous in the world per capita. The nation depended on the export of wool, mutton, wheat and minerals.
London provided assurances that it would underwrite the war risk insurance for shipping, in order to allow trade amongst the Commonwealth to continue in the face of the German U-boat threat. London imposed controls so that no exports would wind up in German hands. The British government protected prices by buying Australian products, even though the shortage of shipping meant there was no chance it would ever receive them. On the whole, Australian commerce expanded. In terms of value, Australian exports rose almost 45 per cent, while the number of Australians employed in manufacturing increased over 11 per cent. Iron mining and steel manufacture grew enormously. Inflation became a factor as consumer prices went up, while the cost of exports was deliberately kept lower than market value in an effort to prevent further inflationary pressures worldwide. As a result, the cost of living for many average Australians increased. The trade union movement, already powerful, grew rapidly, though it split on the political question of conscription. Despite the considerable rises in the costs of many basic items, the government sought to stabilize wages, much to the anger of unionists. Although the average weekly wage increased by between 8 and 12 per cent during the war, it was not enough to keep up with inflation, and as a result there was considerable discontent amongst workers, to the extent that industrial action followed. Not all of these disputes were due to economic factors; in some part they were the result of violent opposition to conscription, which many trade unionists opposed. Nevertheless, the result was very disruptive, and it has been estimated that between 1914 and 1918 there were 1,945 industrial disputes, resulting in 8,533,061 working days lost and £4,785,607 in lost wages. The cost of the war was £377 million, of which 70% was borrowed and the rest came from taxes.
Overall, the war had a significantly negative impact on the Australian economy. Real aggregate gross domestic product (GDP) declined by 9.5 percent over the period 1914 to 1920, while the mobilization of personnel resulted in a 6 percent decline in civilian employment. Meanwhile, although population growth continued during the war years, it was only half the prewar rate. Per capita incomes also declined sharply, falling by 16 percent. South Africa's main economic role was in the supply of two-thirds of the gold production in the British Empire (most of the remainder came from Australia). When the war began, Bank of England officials worked with the government of South Africa to block any gold shipments to Germany and to force the mine owners to sell only to the Treasury, at prices set by the Treasury. This facilitated purchases of munitions and food in the U.S. and other neutral countries. By 1919 London had lost control to the mining companies (now backed by the South African government), which wanted the higher prices and sales to New York that a free market would provide. The Germans invaded Belgium at the start of the war and held the entire country (except for a tiny sliver) for the entire war. They left Belgium stripped and barren. Over 1.4 million refugees fled to France or to the neutral Netherlands. Over half the German regiments in Belgium were involved in major incidents. After the systematic atrocities by the German army in the first few weeks of the war, German civil servants took control and were generally correct, albeit strict and severe. There was never an armed resistance movement, but there was large-scale spontaneous passive resistance: a refusal to work for the benefit of Germany. Belgium was heavily industrialized; while farms operated and small shops stayed open, most large establishments shut down or drastically reduced their output. The faculty closed the universities; many publishers shut down their newspapers.
Most Belgians "turned the four war years into a long and extremely dull vacation," according to Kossmann. In 1916 Germany deported 120,000 men and boys to work in Germany; this set off a storm of protest from neutral countries, and the workers were returned. Germany then stripped the factories of all useful machinery and used the rest as scrap iron for its steel mills. At the start of the war, silver 5-franc coins were collected and melted down by the National Bank to augment its silver reserves. They were exchangeable for paper banknotes, and later zinc coins, although many demonetized silver coins were hoarded. With the German invasion, the National Bank's reserves were transferred to Antwerp and eventually to England, where they were deposited at the Bank of England. Throughout the German occupation there was a shortage of official coins and banknotes in circulation, and so around 600 communes, local governments and companies issued their own unofficial "necessity money" to enable the continued functioning of the local economies. The Belgian franc was fixed at an exchange rate of 1 franc to 1.25 German marks, and the mark was also introduced as legal tender. Neutral countries led by the United States set up the Commission for Relief in Belgium, headed by American engineer Herbert Hoover. It shipped in large quantities of food and medical supplies, which it tried to reserve for civilians and keep out of the hands of the Germans. Many businesses collaborated with the Germans, and some women cohabited with them. They were treated roughly in a wave of popular violence in November and December 1918, and the government set up judicial proceedings to punish the collaborators. Rubber had long been the main export of the Belgian Congo, and although production levels held up during the war, its importance fell from 77% of exports (by value) to only 15%. New resources were opened, especially copper mining in Katanga Province.
The Union Minière du Haut Katanga company dominated the copper industry, exporting its product along a direct rail line to the sea at Beira. The war caused heavy demand for copper, and production soared from 997 tons in 1911 to 27,000 tons in 1917, then fell off to 19,000 tons in 1920. Smelters operated at Elisabethville. Before the war the copper had been sold to Germany; in order to prevent loss of capacity, the British purchased all the Congo's wartime output, with the revenues going to the Belgian government in exile. Diamond and gold mining also expanded during the war. The Anglo-Dutch firm Lever Bros. greatly expanded the palm oil business during the war, and there was an increased output of cocoa, rice and cotton. New rail and steamship lines opened to handle the expanded export traffic. In France, no one had planned for a long war, so the learning process was slow. The German invasion captured 40% of France's heavy industry in 1914, especially in steel and coal. The French GDP in 1918 was 24% smaller than in 1913; since a third of it went into the war effort, the civilian standard of living fell by half. But thousands of little factories opened up across France, hiring older men, women, youth, disabled veterans, and even some soldiers. Algerian and Vietnamese laborers were brought in. By standardizing on basic but effective models early on, the French produced enough artillery, tanks and airplanes to equip not only their own army but the United States as well. The network of small plants produced 200,000 75mm shells a day. The US provided much food, steel, coal and machine tools, and $3.6 billion in loans to finance it all; the British loaned another $3 billion. Considerable relief came with the influx of American food, money and raw materials in 1917. The economy was supported after 1917 by American government loans, which were used to purchase foods and manufactured goods that allowed a decent standard of living.
The arrival of over a million American soldiers in 1918 brought heavy spending on food and construction materials. Labor shortages were in part alleviated by the use of volunteer workers from the colonies. France's diverse regions suffered in different ways. The northeast was occupied and exploited by the Germans during the entire war and was left in ruins. While the occupied area in 1913 contained only 14% of France's industrial workers, it produced 58% of the steel and 40% of the coal. Combat never reached the Massif Central region, but its farms and industries were hurt nonetheless. The army's heavy drain on manpower was partly made up on the farms and in the construction industry by using prisoners of war, migratory workers, women, and older children. War contracts made some firms prosperous but on the whole did not compensate for the loss of foreign markets. There was a permanent loss of population caused by battle deaths and emigration. The economy of Algeria was severely disrupted. Internal lines of communication and transportation were disrupted, and shipments of the main export, cheap wine, had to be cut back. Crime soared as French forces were transferred to the Western Front, and there was rioting in the province of Batna. Shortages mounted, inflation soared, banks cut off credit, and the provincial government was ineffective. The French government floated four war bond issues on the London market and raised 55 million pounds. These bonds were denominated in francs instead of pounds or gold and were not guaranteed against exchange rate fluctuations. After the war the franc lost value, and the British bondholders tried, and failed, to get restitution. J.P. Morgan & Co. of New York was the major American financier for the Allies and worked closely with French bankers. However, its dealings became strained because of growing misunderstandings between the Wall Street bankers and French bankers and diplomats.
France relied heavily on its worldwide empire for manpower to work in munitions factories and other civilian jobs inside France. A famous example was Ho Chi Minh, who worked in Paris and was highly active in organizing fellow Vietnamese, even demanding a voice for them at the Paris Peace Conference in 1919. The French army also enlisted hundreds of thousands of colonials: from Africa came 212,000 soldiers, of whom 160,000 fought on the Western Front. The rapid, unplanned buildup of French military operations in Africa disrupted normal trade relations in all the colonies, especially disrupting food supplies for the cities and distorting the local labor markets. French administrators, focused on supporting the armies on the Western Front, disregarded or suppressed protest movements. The Russian economy was far too backward to sustain a major war, and conditions deteriorated rapidly, despite financial aid from Britain. By late 1915 there was a severe shortage of artillery shells. The very large but poorly equipped Russian army fought tenaciously and desperately despite its poor organisation and lack of munitions. Casualties were enormous. By 1915, many soldiers were sent to the front unarmed and told to pick up whatever weapons they could from the battlefield. The onset of World War I exposed the poor administrative skills of the tsarist government under Nicholas II. A show of national unity had accompanied Russia's entrance into the war, with defense of the Slavic Serbs the main battle cry. In the summer of 1914, the Duma and the zemstva expressed full support for the government's war effort. The initial conscription was well organized and peaceful, and the early phase of Russia's military buildup showed that the empire had learned lessons from the Russo-Japanese War. But military reversals and the government's incompetence soon soured much of the population.
Enemy control of the Baltic Sea and the Black Sea severed Russia from most of its foreign supplies and markets. Russia had not prepared for a major war and reacted very slowly as problems mounted in 1914–16. Inflation became a serious problem. Because of inadequate material support for military operations, the War Industry Committees were formed to ensure that necessary supplies reached the front. But army officers quarreled with civilian leaders, seized administrative control of front areas, and refused to cooperate with the committees. The central government distrusted the independent war-support activities organized by the zemstva and the cities. The Duma quarreled with the war bureaucracy of the government, and center and center-left deputies eventually formed the Progressive Bloc to create a genuinely constitutional government. While the central government was hampered by court intrigue, the strain of the war began to cause popular unrest. Food shortages increasingly affected urban areas, caused by military purchases, transportation bottlenecks, financial confusion, and administrative mismanagement. By 1915 high food prices and fuel shortages had caused strikes in some cities. Food riots became more common and more violent, readying the angry populace for withering political attacks on the tsarist regime. Workers, who had won the right to representation in sections of the War Industries Committee, used those sections to mobilize political opposition. The countryside was also becoming restive. Soldiers were increasingly insubordinate, particularly the newly recruited peasants, who faced the prospect of being used as cannon fodder in the inept conduct of the war. The bad situation continued to deteriorate. Increasing conflict between the tsar and the Duma destroyed popular and elite support for the old regime. In early 1917, deteriorating rail transport caused acute food and fuel shortages, which resulted in escalating riots and strikes.
Authorities summoned troops to quell the disorders in Petrograd (as St. Petersburg had been called since September 1914, to Russianize the Germanic name). In 1905 troops had fired on demonstrators and saved the monarchy, but in 1917 the troops turned their guns over to the angry crowds. Public support for the tsarist regime simply evaporated in 1917, ending three centuries of Romanov rule. Italy joined the Allies in 1915 but was poorly prepared for war. Loans from Britain paid for nearly all its war expenses. The Italian army of 875,000 men was poorly led and lacked heavy artillery and machine guns. The industrial base was too small to provide adequate amounts of modern equipment, and the old-fashioned rural economy did not produce much of a food surplus. Before the war the government had ignored labor issues, but now it had to intervene to mobilize war production. With the main working-class Socialist party reluctant to support the war effort, strikes were frequent and cooperation was minimal, especially in the Socialist strongholds of Piedmont and Lombardy. The government imposed high wage scales, as well as collective bargaining and insurance schemes. Many large firms expanded dramatically: the workforce at the Ansaldo munitions company grew from 6,000 to 110,000 as it manufactured 10,900 artillery pieces, 3,800 warplanes, 95 warships and 10 million artillery shells, while at Fiat the workforce grew from 4,000 to 40,000. Inflation doubled the cost of living. Industrial wages kept pace, but wages for farm workers did not. Discontent was high in rural areas, since so many men were taken for service, industrial jobs were unavailable, wages grew slowly, and inflation was just as bad. In the United States, the 15 months of munitions production after April 1917 involved an amazing parade of mistakes, misguided enthusiasm, and confusion. Americans were willing enough, but they did not know their proper role.
Wilson was unable to figure out what to do when, or even to decide who was in charge. Typical of the confusion was the coal shortage that hit in December 1917. Because coal was by far the major source of energy and heat, a grave crisis ensued. There was in fact plenty of coal being mined, but 44,000 loaded freight and coal cars were tied up in horrendous traffic jams in the rail yards of the East Coast. Two hundred ships were waiting in New York harbor for cargo that was delayed by the mess. The solution included nationalizing the coal mines and the railroads for the duration, shutting down factories one day a week to save fuel, and enforcing a strict system of priorities. Only in March 1918 did Wilson finally take control of the crisis. The war saw many women taking on what were traditionally men's jobs. Many worked on the assembly lines of factories, producing trucks and munitions. For the first time, department stores employed African American women as elevator operators and cafeteria waitresses. The Food Administration helped housewives prepare nutritious meals with less waste and with optimum use of the foods available. Most important, the morale of the women remained high, as millions joined the Red Cross as volunteers to help soldiers and their families. With rare exceptions, the women did not protest the draft. Samuel Gompers, head of the AFL, and nearly all labor unions were strong supporters of the war effort. They minimized strikes as wages soared and full employment was reached. The AFL unions strongly encouraged their young men to enlist in the military, and fiercely opposed efforts to reduce recruiting and slow war production by the anti-war labor union called the Industrial Workers of the World (IWW) and also by left-wing Socialists. President Wilson appointed Gompers to the powerful Council of National Defense, where he set up the War Committee on Labor. AFL membership soared to 2.4 million in 1917.
In 1919, the unions tried to make their gains permanent and called a series of major strikes in meat, steel and other industries. The strikes, all of which failed, forced the unions back to their position of around 1910. While Germany rapidly mobilized its soldiers, it had to improvise the mobilization of the civilian economy for the war effort. It was severely handicapped by the British blockade, which cut off food supplies, machinery and raw materials. Walther Rathenau played the key role in convincing the War Ministry to set up the War Raw Materials Department (Kriegsrohstoffabteilung, "KRA"); he was in charge of it from August 1914 to March 1915 and established the basic policies and procedures. His senior staff were on loan from industry. The KRA focused on raw materials threatened by the British blockade, as well as supplies from occupied Belgium and France. It set prices and regulated the distribution to vital war industries, and it began the development of ersatz raw materials. The KRA suffered many inefficiencies caused by the complexity and selfishness it encountered from commerce, industry, and the government. Some two dozen additional agencies were created dealing with specific products; the agencies could confiscate supplies and redirect them to the munitions factories. Cartels were created, and small firms were merged into larger ones for greater efficiency and ease of central control. The military took an increasingly dominant role in setting economic priorities and in direct control of vital industries. It was usually inefficient, but it performed very well in aircraft. The army set prices and wages, gave out draft exemptions, guaranteed the supply of credit and raw materials, limited patent rights, and supervised management–labor relationships. The industry expanded very rapidly with high-quality products and many innovations, and paid wages well above the norm for skilled workers.
Total spending by the national government reached 170 billion marks during the war, of which taxes covered only 8%; the rest was borrowed from German banks and private citizens. Eight national war loans reached out to the entire population and raised 100 billion marks. It proved almost impossible to borrow money from abroad. The national debt rose from only 5 billion marks in 1914 to 156 billion in 1918. These bonds became worthless in 1923 because of hyperinflation. As the war went on, conditions deteriorated rapidly on the home front, with severe food shortages reported in all urban areas by 1915. Causes included the transfer of many farmers and food workers into the military, an overburdened railroad system, shortages of coal, and the British blockade that cut off imports from abroad. The winter of 1916–1917 was known as the "turnip winter," because that vegetable, usually fed to livestock, was used by people as a substitute for potatoes and meat, which were increasingly scarce. Thousands of soup kitchens were opened to feed the hungry, who grumbled that the farmers were keeping the food for themselves. Even the army had to cut rations for soldiers. Morale of both civilians and soldiers continued to sink. In the Ottoman Empire, Turkish nationalists took control before the war began. They drove out the Greeks and Armenians who had been the backbone of the business community, replacing them with ethnic Turks who were given favorable contracts but who lacked the international connections, credit sources, and entrepreneurial skills needed for business. The Ottoman economy was based on subsistence agriculture; there was very little industry. Turkish wheat was in high demand, but transportation was rudimentary, and not much of it reached Germany. The war cut off imports except from Germany. Prices quadrupled. The Germans provided loans and supplied the army with hardware, especially captured Belgian and Russian equipment.
Other supplies were also scarce; the soldiers were often in rags. Medical services were very bad and illness and death rates were high. Most of the Ottoman soldiers deserted when they had the opportunity, so the force level shrank from a peak strength of 800,000 in 1916 to only 100,000 in 1918. The Austro - Hungarian monarchy was a personal union of two countries created by the Compromise of 1867. The Kingdom of Hungary had lost its former status after the Hungarian Revolution of 1848. However, following the 1867 reforms, the Austrian and the Hungarian states became co-equal within the Empire. Austria - Hungary was geographically the second - largest country in Europe after the Russian Empire, at 621,538 km² (239,977 sq mi), and the third - most populous (after Russia and the German Empire). In comparison with Germany and Britain, the Austro - Hungarian economy lagged behind considerably, as sustained modernization had begun much later in Austria - Hungary. The Empire built up the fourth - largest machine building industry of the world, after the United States, Germany, and Britain. Austria - Hungary was also the world 's third largest manufacturer and exporter of electric home appliances, electric industrial appliances and facilities for power plants, after the United States and the German Empire. The Empire of Austria and the Kingdom of Hungary had always maintained separate parliaments: the Imperial Council (Austria) and the Diet of Hungary. Except for the Pragmatic Sanction of 1713, common laws never existed in the Empire of Austria and the Kingdom of Hungary. There was no common citizenship: one was either an Austrian citizen or a Hungarian citizen, never both. Austria and Hungary were fiscally sovereign and independent entities. The Kingdom of Hungary was able to preserve its separate and independent budget. 
However, by the end of the 19th century, economic differences gradually began to even out as economic growth in the eastern parts of the Empire consistently surpassed that in the western. The strong agriculture and food industry of the Kingdom of Hungary with the centre of Budapest became predominant within the empire and made up a large proportion of the export to the rest of Europe. Meanwhile, western areas, concentrated mainly around Prague and Vienna, excelled in various manufacturing industries. This division of labour between the east and west, besides the existing economic and monetary union, led to an even more rapid economic growth throughout Austria - Hungary by the early 20th century. Austria could preserve its dominance within the empire in the sectors of the first industrial revolution, but Hungary had a better position in the industries of the second industrial revolution; in these modern industrial sectors the Austrian competition could not become overwhelming. The empire 's heavy industry had mostly focused on machine building, especially for the electric power industry, locomotive industry and automotive industry, while in light industry the precision mechanics industry was the most dominant. During the war the national governments of Vienna and Budapest set up a highly centralized war economy, resulting in a bureaucratic dictatorship. It drafted skilled workers and engineers without realizing the damage it did to the economy. The Czech region had a more advanced economy, but was reluctant to support the war effort. Czechs rejected any customs union with Germany, because it threatened their language and culture. Czech bankers had an eye to early independence; they purchased many securities from the Czech lands, thus ensuring their strong domestic position in what became Czechoslovakia in 1918. Bulgaria, a poor rural nation of 4.5 million people, at first stayed neutral. In 1915 it joined the Central Powers. 
It mobilized a very large army of 800,000 men, using equipment supplied by Germany. Bulgaria was ill - prepared for a long war; absence of so many soldiers sharply reduced agricultural output. Much of its best food was smuggled out to feed lucrative black markets elsewhere. By 1918 the soldiers were not only short of basic equipment like boots but they were being fed mostly corn bread with a little meat. The peace treaty in 1919 stripped Bulgaria of its conquests, reduced its army to 20,000 men, and demanded reparations of £ 100 million. Conditions on the Continent were bad for every belligerent. Britain sustained the lightest damage to its civilian economy, apart from its loss of men. The major damage was to its merchant marine and to its financial holdings. The United States and Canada prospered during the war. The reparations levied on Germany by the Treaty of Versailles were designed to restore the damage to the civilian economies, but little of the reparations money went for that. Most of Germany 's reparations payments were funded by loans from American banks, and the recipients used them to pay off loans they had from the U.S. Treasury. Between 1919 and 1932, Germany paid out 19 billion goldmarks in reparations, and received 27 billion goldmarks in loans from New York bankers and others. These loans were eventually paid back by West Germany after World War II.
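The reparations and loan figures above imply that, on net, capital flowed into rather than out of Germany over 1919 -- 1932, which is why so little reparations money ended up funding reconstruction. A quick arithmetic check (variable names are illustrative only):

```python
reparations_paid_bn = 19   # goldmarks paid out in reparations, 1919-1932
loans_received_bn = 27     # goldmarks received in loans, largely from New York banks

# Net flow of capital into Germany over the period
net_inflow_bn = loans_received_bn - reparations_paid_bn
print(net_inflow_bn)  # -> 8 (billion goldmarks net inflow)
```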
when does it snow in north carolina mountains
Climate of North Carolina - wikipedia North Carolina 's climate varies from the Atlantic coast in the east to the Appalachian Mountain range in the west. The mountains often act as a "shield '', blocking low temperatures and storms from the Midwest from entering the Piedmont of North Carolina. Most of the state has a humid subtropical climate (Köppen climate classification Cfa), except in the higher elevations of the Appalachians which have a subtropical highland climate (Köppen Cfb). For most areas in the state, the temperatures in July during the daytime are approximately 90 ° F (32 ° C). In January the average temperatures range near 50 ° F (10 ° C). (However, a polar vortex or "cold blast '' can significantly bring down average temperatures, as seen in the winters of 2014 and 2015.) There is an average of forty - five inches of rain a year (fifty in the mountains). July storms account for much of this precipitation. As much as 15 % of the rainfall during the warm season in the Carolinas can be attributed to tropical cyclones. Mountains usually see some snow in the fall and winter. Moist winds from the southwest drop an average of 80 inches (2,000 mm) of precipitation on the western side of the mountains, while the northeast - facing slopes average less than half that amount. Snow in North Carolina is seen on a regular basis in the mountains. North Carolina averages 5 inches (130 mm) of snow a year. However, this also varies greatly across the state. Along the coast, most areas register less than 2 inches (51 mm) per year while the state capital, Raleigh, averages 7.5 inches (190 mm). Farther west in the Piedmont - Triad, the average grows to approximately 9 inches (230 mm). The Charlotte area averages approximately 6.5 inches (170 mm). The mountains in the west act as a barrier, preventing most snowstorms from entering the Piedmont. When snow does make it past the mountains, it is usually light and is seldom on the ground for more than two or three days. 
However, several storms have dropped 18 inches (460 mm) or more of snow within normally warm areas. The 1993 Storm of the Century that lasted from March 11 to March 15 affected locales from Canada to Central America, and brought a significant amount of snow to North Carolina. Newfound Gap received more than 36 inches (0.91 m) of snow with drifts more than 5 feet (1.5 m), while Mount Mitchell measured over 4 feet (1.2 m) of snow with drifts to 14 feet (4.3 m). Most of the northwestern part of the state received somewhere between 2 feet (0.61 m) and 3 feet (0.91 m) of snow. Another significant snowfall hit the Raleigh area in January 2000 when more than 20 inches (510 mm) of snow fell. There was also a heavy snowfall totaling 18 inches (460 mm) that hit the Wilmington area on December 22 - 23, 1989. This storm affected only the Southeastern US coast, as far south as Savannah, GA, with little to no snow measured west of I - 95. Most big snows that impact areas east of the mountains come from extratropical cyclones which approach from the south across Georgia and South Carolina and move off the coast of North or South Carolina. These storms typically throw Gulf or Atlantic moisture over cold Arctic air at ground level, usually propelled southward from Arctic high pressure over the Northeastern or New England states. If the storms track sufficiently far to the east, snow will be limited to the eastern part of the state (as with the December 22 - 23, 1989 storm). If the cyclones travel close to the coast, warm air will get pulled into eastern North Carolina due to increasing flow off the milder Atlantic Ocean, bringing a rain / snow line well inland with heavy snow restricted to the Piedmont, foothills and mountains, as with the January 22, 1987 storm. If the storm tracks inland into eastern North Carolina, the rain / snow line ranges between Raleigh and Greensboro. Located along the Atlantic Coast, North Carolina is no stranger to hurricanes. 
Many hurricanes that come up from the Caribbean Sea make their way up the coast of the eastern United States, passing by North Carolina. On October 15, 1954, Hurricane Hazel struck North Carolina; at that time it was a category 4 hurricane on the Saffir - Simpson Hurricane Scale. Hazel caused significant damage due to its strong winds. A weather station at Oak Island reported maximum sustained winds of 140 miles per hour (230 km / h), while in Raleigh winds of 90 miles per hour (140 km / h) were measured. The hurricane caused 19 deaths and significant destruction. One person at Long Beach claimed that "of the 357 buildings that existed in the town, 352 were totally destroyed and the other five damaged ''. Hazel was described as "the most destructive storm in the history of North Carolina '' in a 1989 report. In 1996, Hurricane Fran made landfall in North Carolina. As a category 3 hurricane, Fran caused a great deal of damage, mainly through winds. Fran 's maximum sustained wind speeds were 115 miles per hour (185 km / h), while North Carolina 's coast saw surges of 8 feet (2.4 m) to 12 feet (3.7 m) above sea level. The amount of damage caused by Fran ranged from $1.275 billion to $2 billion in North Carolina. Heavy rains accompany tropical cyclones and their remnants which move northeast from the Gulf of Mexico coastline, as well as inland from the western subtropical Atlantic ocean. Over the past 30 years, the wettest tropical cyclone to strike the coastal plain was Hurricane Floyd of September 1999, which dropped over 24 inches (610 mm) of rainfall north of Southport. Unlike Hazel and Fran, the main force of destruction was from precipitation. Before Hurricane Floyd reached North Carolina, the state had already received large amounts of rain from Hurricane Dennis less than two weeks before Floyd. This saturated much of the Eastern North Carolina soil and allowed heavy rains from Hurricane Floyd to turn into floods. Over 35 people died from Floyd. 
In the mountains, Hurricane Frances of September 2004 was nearly as wet, bringing over 23 inches (580 mm) of rainfall to Mount Mitchell. In most years, the greatest weather - related economic loss incurred in North Carolina is due to severe weather spawned by summer thunderstorms. These storms affect limited areas, with their hail and wind accounting for an average annual loss of over US $5 million. North Carolina averages 31 tornadoes a year, with May seeing the most on average (5 a month). June, July and August each average 3 tornadoes a month, rising to an average of 4 in September. From September into early November, North Carolina can typically expect a smaller, secondary severe weather season. While severe weather season is technically from March through May, tornadoes have touched down in North Carolina in every month of the year. On November 28, 1988, an early morning F4 tornado smashed across northwestern Raleigh, continuing 84 miles further, killing 4 and injuring 157 (http://www4.ncsu.edu/~nwsfo/storage/cases/19881128/). In winter, North Carolina is somewhat protected by the Appalachian Mountains to the west. Cold fronts from Canada are typically reduced in intensity by the mountains. However, occasionally cold air can move from the north or northeast, east of the Appalachian Mountains, from Arctic high pressure systems that settle over the Northeastern or New England states. Other polar and Arctic outbreaks can cross the mountains and force temperatures to drop to about 12 ° F (− 11 ° C) in central North Carolina. Still, temperatures below zero degrees Fahrenheit are extremely rare outside of the mountains. The coldest ever recorded temperature in North Carolina was − 34 ° F (− 37 ° C) on January 21, 1985, at Mount Mitchell. The winter temperatures on the coast are milder due to the warming influence of the Atlantic Ocean and the Gulf Stream. 
The average ocean temperature in Southport in January is still higher than the average ocean temperature in Maine during July. Snow is common in the mountains, although many ski resorts use snowmaking equipment to make sure there is always snow on their land. North Carolina 's relative humidity is highest in the winter. Tornadoes are most likely in the spring. Major tornado outbreaks affected parts of eastern North Carolina on March 28, 1984, and April 16, 2011. The month of May experiences the greatest rise in temperatures. During the spring, there are warm days and cool nights in the Piedmont. Temperatures are somewhat cooler in the mountains and warmer, particularly at night, near the coast. North Carolina 's humidity is lowest in the spring. North Carolina experiences high summer temperatures. Sometimes, cool, dry air from the north will invade North Carolina for brief periods of time, with temperatures quickly rebounding. It remains colder at high elevations, with the average summer temperature in Mount Mitchell lying at 68 ° F (20 ° C). Morning temperatures are on average 20 ° F (11 ° C) lower than afternoon temperatures, except along the Atlantic Coast. The largest economic loss from severe weather in North Carolina is due to severe thunderstorms in the summer, although they usually only hit small areas. Tropical cyclones can impact the state during the summer as well. Fogs are also frequent in the summer. Fall is the most rapidly changing season temperature wise, especially in October and November. Tropical cyclones remain a threat until late in the season. The Appalachian Mountains are frequently visited at this time of year, due to the leaves changing color in the trees. During El Niño events, winter and early spring temperatures are cooler than average with above average precipitation in the central and eastern parts of the state and drier weather in the western part. 
La Niña usually brings warmer than average temperatures with above average precipitation in the western part of the state while the central and coastal regions stay drier than average. The water level on North Carolina 's shores has risen 2 inches (50 mm), but the cause is heavily debated. Temperatures in North Carolina have risen as well. Over the last 100 years, the average temperature in Chapel Hill has gone up 1.2 ° F (0.7 ° C) and precipitation in some parts of the state has increased by 5 percent.
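The Fahrenheit/Celsius pairs quoted throughout this climate section follow two different conversion rules: absolute temperatures use (F − 32) × 5 / 9, while temperature *differences* (such as the 1.2 ° F warming above) scale by 5 / 9 with no offset. A minimal sketch (function names are illustrative, not from any source):

```python
def f_to_c(temp_f):
    """Convert an absolute Fahrenheit temperature to Celsius."""
    return (temp_f - 32) * 5 / 9

def f_delta_to_c(delta_f):
    """Convert a Fahrenheit temperature *difference* to Celsius.
    No -32 offset: intervals only scale by the 5/9 factor."""
    return delta_f * 5 / 9

print(round(f_to_c(50), 1))         # -> 10.0 (January average near 50 F)
print(round(f_delta_to_c(1.2), 1))  # -> 0.7  (the 1.2 F Chapel Hill rise)
```

Applying the absolute-temperature formula to a difference would wrongly subtract 32 degrees, which is why a 1.2 ° F rise is 0.7 ° C rather than a large negative number.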
where is washington-dc located on the united states map
Washington, D.C. - Wikipedia Washington, D.C., formally the District of Columbia and commonly referred to as "Washington '', "the District '', or simply "D.C. '', is the capital of the United States. The signing of the Residence Act on July 16, 1790, approved the creation of a capital district located along the Potomac River on the country 's East Coast. The U.S. Constitution provided for a federal district under the exclusive jurisdiction of the Congress and the District is therefore not a part of any state. The states of Maryland and Virginia each donated land to form the federal district, which included the pre-existing settlements of Georgetown and Alexandria. Named in honor of President George Washington, the City of Washington was founded in 1791 to serve as the new national capital. In 1846, Congress returned the land originally ceded by Virginia; in 1871, it created a single municipal government for the remaining portion of the District. Washington had an estimated population of 681,170 as of July 2016. Commuters from the surrounding Maryland and Virginia suburbs raise the city 's population to more than one million during the workweek. The Washington metropolitan area, of which the District is the principal city, has a population of over 6 million, the sixth - largest metropolitan statistical area in the country. The centers of all three branches of the federal government of the United States are in the District, including the Congress, President, and Supreme Court. Washington is home to many national monuments and museums, which are primarily situated on or around the National Mall. The city hosts 176 foreign embassies as well as the headquarters of many international organizations, trade unions, non-profit organizations, lobbying groups, and professional associations. A locally elected mayor and a 13 ‐ member council have governed the District since 1973. However, the Congress maintains supreme authority over the city and may overturn local laws. D.C. 
residents elect a non-voting, at - large congressional delegate to the House of Representatives, but the District has no representation in the Senate. The District receives three electoral votes in presidential elections as permitted by the Twenty - third Amendment to the United States Constitution, ratified in 1961. Various tribes of the Algonquian - speaking Piscataway people (also known as the Conoy) inhabited the lands around the Potomac River when Europeans first visited the area in the early 17th century. One group known as the Nacotchtank (also called the Nacostines by Catholic missionaries) maintained settlements around the Anacostia River within the present - day District of Columbia. Conflicts with European colonists and neighboring tribes forced the relocation of the Piscataway people, some of whom established a new settlement in 1699 near Point of Rocks, Maryland. In his Federalist No. 43, published January 23, 1788, James Madison argued that the new federal government would need authority over a national capital to provide for its own maintenance and safety. Five years earlier, a band of unpaid soldiers besieged Congress while its members were meeting in Philadelphia. Known as the Pennsylvania Mutiny of 1783, the event emphasized the need for the national government not to rely on any state for its own security. Article One, Section Eight, of the Constitution permits the establishment of a "District (not exceeding ten miles square) as may, by cession of particular states, and the acceptance of Congress, become the seat of the government of the United States ''. However, the Constitution does not specify a location for the capital. In what is now known as the Compromise of 1790, Madison, Alexander Hamilton, and Thomas Jefferson came to an agreement that the federal government would pay each state 's remaining Revolutionary War debts in exchange for establishing the new national capital in the Southern United States. 
On July 9, 1790, Congress passed the Residence Act, which approved the creation of a national capital on the Potomac River. The exact location was to be selected by President George Washington, who signed the bill into law on July 16. Formed from land donated by the states of Maryland and Virginia, the initial shape of the federal district was a square measuring 10 miles (16 km) on each side, totaling 100 square miles (259 km²). Two pre-existing settlements were included in the territory: the port of Georgetown, Maryland, founded in 1751, and the city of Alexandria, Virginia, founded in 1749. During 1791 -- 92, Andrew Ellicott and several assistants, including a free African American astronomer named Benjamin Banneker, surveyed the borders of the federal district and placed boundary stones at every mile point. Many of the stones are still standing. A new federal city was then constructed on the north bank of the Potomac, to the east of Georgetown. On September 9, 1791, the three commissioners overseeing the capital 's construction named the city in honor of President Washington. The federal district was named Columbia, which was a poetic name for the United States commonly in use at that time. Congress held its first session in Washington on November 17, 1800. Congress passed the Organic Act of 1801, which officially organized the District and placed the entire territory under the exclusive control of the federal government. Further, the unincorporated area within the District was organized into two counties: the County of Washington to the east of the Potomac and the County of Alexandria to the west. After the passage of this Act, citizens living in the District were no longer considered residents of Maryland or Virginia, which therefore ended their representation in Congress. On August 24 -- 25, 1814, in a raid known as the Burning of Washington, British forces invaded the capital during the War of 1812. 
The Capitol, Treasury, and White House were burned and gutted during the attack. Most government buildings were repaired quickly; however, the Capitol was largely under construction at the time and was not completed in its current form until 1868. In the 1830s, the District 's southern territory of Alexandria went into economic decline partly due to neglect by Congress. The city of Alexandria was a major market in the American slave trade, and pro-slavery residents feared that abolitionists in Congress would end slavery in the District, further depressing the economy. Alexandria 's citizens petitioned Virginia to take back the land it had donated to form the District, through a process known as retrocession. The Virginia General Assembly voted in February 1846 to accept the return of Alexandria and on July 9, 1846, Congress agreed to return all the territory that had been ceded by Virginia. Therefore, the District 's current area consists only of the portion originally donated by Maryland. Confirming the fears of pro-slavery Alexandrians, the Compromise of 1850 outlawed the slave trade in the District, although not slavery itself. The outbreak of the American Civil War in 1861 led to expansion of the federal government and notable growth in the District 's population, including a large influx of freed slaves. President Abraham Lincoln signed the Compensated Emancipation Act in 1862, which ended slavery in the District of Columbia and freed about 3,100 enslaved persons, nine months prior to the Emancipation Proclamation. In 1868, Congress granted the District 's African American male residents the right to vote in municipal elections. By 1870, the District 's population had grown 75 % from the previous census to nearly 132,000 residents. Despite the city 's growth, Washington still had dirt roads and lacked basic sanitation. Some members of Congress suggested moving the capital further west, but President Ulysses S. Grant refused to consider such a proposal. 
Congress passed the Organic Act of 1871, which repealed the individual charters of the cities of Washington and Georgetown, and created a new territorial government for the whole District of Columbia. President Grant appointed Alexander Robey Shepherd to the position of governor in 1873. Shepherd authorized large - scale projects that greatly modernized Washington, but ultimately bankrupted the District government. In 1874, Congress replaced the territorial government with an appointed three - member Board of Commissioners. The city 's first motorized streetcars began service in 1888 and generated growth in areas of the District beyond the City of Washington 's original boundaries. Washington 's urban plan was expanded throughout the District in the following decades. Georgetown was formally annexed by the City of Washington in 1895. However, the city had poor housing conditions and strained public works. Washington was the first city in the nation to undergo urban renewal projects as part of the "City Beautiful movement '' in the early 1900s. Increased federal spending as a result of the New Deal in the 1930s led to the construction of new government buildings, memorials, and museums in Washington. World War II further increased government activity, adding to the number of federal employees in the capital; by 1950, the District 's population reached its peak of 802,178 residents. The Twenty - third Amendment to the United States Constitution was ratified in 1961, granting the District three votes in the Electoral College for the election of president and vice president, but still no voting representation in Congress. After the assassination of civil rights leader Dr. Martin Luther King, Jr., on April 4, 1968, riots broke out in the District, primarily in the U Street, 14th Street, 7th Street, and H Street corridors, centers of black residential and commercial areas. The riots raged for three days until more than 13,600 federal troops stopped the violence. 
Many stores and other buildings were burned; rebuilding was not completed until the late 1990s. In 1973, Congress enacted the District of Columbia Home Rule Act, providing for an elected mayor and 13 - member council for the District. In 1975, Walter Washington became the first elected and first black mayor of the District. On September 11, 2001, terrorists hijacked American Airlines Flight 77 and deliberately crashed the plane into the Pentagon in nearby Arlington, Virginia. United Airlines Flight 93, believed to be destined for Washington, D.C., crashed in Pennsylvania when passengers tried to recover control of the plane from hijackers. Washington, D.C., is located in the mid-Atlantic region of the U.S. East Coast. Due to the District of Columbia retrocession, the city has a total area of 68.34 square miles (177.0 km²), of which 61.05 square miles (158.1 km²) is land and 7.29 square miles (18.9 km²) (10.67 %) is water. The District is bordered by Montgomery County, Maryland, to the northwest; Prince George 's County, Maryland, to the east; and Arlington and Alexandria, Virginia, to the south and west. The south bank of the Potomac River forms the District 's border with Virginia and has two major tributaries: the Anacostia River and Rock Creek. Tiber Creek, a natural watercourse that once passed through the National Mall, was fully enclosed underground during the 1870s. The creek also formed a portion of the now - filled Washington City Canal, which allowed passage through the city to the Anacostia River from 1815 until the 1850s. The Chesapeake and Ohio Canal starts in Georgetown and was used during the 19th century to bypass the Little Falls of the Potomac River, located at the northwest edge of Washington at the Atlantic Seaboard fall line. The highest natural elevation in the District is 409 feet (125 m) above sea level at Fort Reno Park in upper northwest Washington. The lowest point is sea level at the Potomac River. 
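The District 's area figures above are internally consistent: the land and water areas sum to the stated total, and the water fraction matches the stated percentage. A quick arithmetic check (variable names are illustrative only):

```python
land_sq_mi = 61.05    # stated land area, square miles
water_sq_mi = 7.29    # stated water area, square miles

total_sq_mi = land_sq_mi + water_sq_mi
water_share = 100 * water_sq_mi / total_sq_mi

print(round(total_sq_mi, 2))  # -> 68.34, the stated total area
print(round(water_share, 2))  # -> 10.67, the stated water percentage
```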
The geographic center of Washington is near the intersection of 4th and L Streets NW. The District has 7,464 acres (30.21 km²) of parkland, about 19 % of the city 's total area and the second - highest percentage among high - density U.S. cities. The National Park Service manages most of the 9,122 acres (36.92 km²) of city land owned by the U.S. government. Rock Creek Park is a 1,754 - acre (7.10 km²) urban forest in Northwest Washington, which extends 9.3 miles (15.0 km) through a stream valley that bisects the city. Established in 1890, it is the country 's fourth - oldest national park and is home to a variety of plant and animal species including raccoon, deer, owls, and coyotes. Other National Park Service properties include the C&O Canal National Historical Park, the National Mall and Memorial Parks, Theodore Roosevelt Island, Columbia Island, Fort Dupont Park, Meridian Hill Park, Kenilworth Park and Aquatic Gardens, and Anacostia Park. The D.C. Department of Parks and Recreation maintains the city 's 900 acres (3.6 km²) of athletic fields and playgrounds, 40 swimming pools, and 68 recreation centers. The U.S. Department of Agriculture operates the 446 - acre (1.80 km²) U.S. National Arboretum in Northeast Washington. Washington is in the northern part of the humid subtropical climate zone (Köppen: Cfa). However, under the Trewartha climate classification, the city has a temperate maritime climate (Do). Winters are usually chilly with light snow, and summers are hot and humid. The District is in plant hardiness zone 8a near downtown, and zone 7b elsewhere in the city, indicating a humid subtropical climate. Spring and fall are mild to warm, while winter is chilly with annual snowfall averaging 15.5 inches (39 cm). Winter temperatures average around 38 ° F (3.3 ° C) from mid-December to mid-February. 
Summers are hot and humid with a July daily average of 79.8 ° F (26.6 ° C) and average daily relative humidity around 66 %, which can cause moderate personal discomfort. The combination of heat and humidity in the summer brings very frequent thunderstorms, some of which occasionally produce tornadoes in the area. Blizzards affect Washington on average once every four to six years. The most violent storms are called "nor'easters '', which often affect large sections of the East Coast. From January 27 to 28, 1922, the city officially received 28 inches (71 cm) of snowfall, the largest snowstorm since official measurements began in 1885. According to notes kept at the time, the city received between 30 and 36 inches (76 and 91 cm) from a snowstorm in January 1772. Hurricanes (or their remnants) occasionally track through the area in late summer and early fall, but are often weak by the time they reach Washington, partly due to the city 's inland location. Flooding of the Potomac River, however, caused by a combination of high tide, storm surge, and runoff, has been known to cause extensive property damage in the neighborhood of Georgetown. Precipitation occurs throughout the year. The highest recorded temperature was 106 ° F (41 ° C) on August 6, 1918, and on July 20, 1930, while the lowest recorded temperature was − 15 ° F (− 26 ° C) on February 11, 1899, during the Great Blizzard of 1899. During a typical year, the city averages about 37 days at or above 90 ° F (32.2 ° C) and 64 nights at or below freezing. Washington, D.C., is a planned city. In 1791, President Washington commissioned Pierre (Peter) Charles L'Enfant, a French - born architect and city planner, to design the new capital. He enlisted Scottish surveyor Alexander Ralston to help lay out the city plan. The L'Enfant Plan featured broad streets and avenues radiating out from rectangles, providing room for open space and landscaping. 
He based his design on plans of cities such as Paris, Amsterdam, Karlsruhe, and Milan that Thomas Jefferson had sent to him. L'Enfant's design also envisioned a garden - lined "grand avenue '' approximately 1 mile (1.6 km) in length and 400 feet (120 m) wide in the area that is now the National Mall. President Washington dismissed L'Enfant in March 1792 due to conflicts with the three commissioners appointed to supervise the capital 's construction. Andrew Ellicott, who had worked with L'Enfant surveying the city, was then tasked with completing the design. Though Ellicott made revisions to the original plans, including changes to some street patterns, L'Enfant is still credited with the overall design of the city. By the early 1900s, L'Enfant's vision of a grand national capital had become marred by slums and randomly placed buildings, including a railroad station on the National Mall. Congress formed a special committee charged with beautifying Washington 's ceremonial core. What became known as the McMillan Plan was finalized in 1901 and included re-landscaping the Capitol grounds and the National Mall, clearing slums, and establishing a new citywide park system. The plan is thought to have largely preserved L'Enfant's intended design. By law, Washington 's skyline is low and sprawling. The federal Heights of Buildings Act of 1910 allows buildings that are no taller than the width of the adjacent street, plus 20 feet (6.1 m). Despite popular belief, no law has ever limited buildings to the height of the United States Capitol or the 555 - foot (169 m) Washington Monument, which remains the District 's tallest structure. City leaders have criticized the height restriction as a primary reason why the District has limited affordable housing and traffic problems caused by urban sprawl. The District is divided into four quadrants of unequal area: Northwest (NW), Northeast (NE), Southeast (SE), and Southwest (SW). The axes bounding the quadrants radiate from the U.S. 
Capitol building. All road names include the quadrant abbreviation to indicate their location and house numbers generally correspond with the number of blocks away from the Capitol. Most streets are set out in a grid pattern with east -- west streets named with letters (e.g., C Street SW), north -- south streets with numbers (e.g., 4th Street NW), and diagonal avenues, many of which are named after states. The City of Washington was bordered by Boundary Street to the north (renamed Florida Avenue in 1890), Rock Creek to the west, and the Anacostia River to the east. Washington 's street grid was extended, where possible, throughout the District starting in 1888. Georgetown 's streets were renamed in 1895. Some streets are particularly noteworthy, such as Pennsylvania Avenue, which connects the White House to the Capitol and K Street, which houses the offices of many lobbying groups. Washington hosts 177 foreign embassies, constituting approximately 297 buildings beyond the more than 1,600 residential properties owned by foreign countries, many of which are on a section of Massachusetts Avenue informally known as Embassy Row. The architecture of Washington varies greatly. Six of the top 10 buildings in the American Institute of Architects ' 2007 ranking of "America 's Favorite Architecture '' are in the District of Columbia: the White House; the Washington National Cathedral; the Thomas Jefferson Memorial; the United States Capitol; the Lincoln Memorial; and the Vietnam Veterans Memorial. The neoclassical, Georgian, gothic, and modern architectural styles are all reflected among those six structures and many other prominent edifices in Washington. Notable exceptions include buildings constructed in the French Second Empire style such as the Eisenhower Executive Office Building. Outside downtown Washington, architectural styles are even more varied. 
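As a rough illustration of the grid logic described above, the following Python sketch parses a simplified D.C. street address into its quadrant and approximate distance in blocks from the Capitol. The parsing rules and block arithmetic here are simplified assumptions for illustration only, not an official addressing scheme; real addresses have many exceptions.

```python
# Hypothetical helper illustrating the D.C. quadrant/house-number convention.
# Assumptions: the address ends in a quadrant suffix (NW, NE, SE, SW) and the
# house number encodes the block (e.g. 400 -> roughly the 4th block out).

def parse_dc_address(address: str) -> dict:
    """Split a simplified D.C. address into number, street, and quadrant."""
    parts = address.split()
    number = int(parts[0])             # e.g. 400
    quadrant = parts[-1]               # e.g. "NW", "SE"
    street = " ".join(parts[1:-1])     # e.g. "4th Street"
    # House numbers generally correspond to blocks away from the Capitol:
    # the hundreds digit(s) give the approximate block count.
    block = number // 100
    return {
        "number": number,
        "street": street,
        "quadrant": quadrant,
        "blocks_from_capitol": block,
    }

print(parse_dc_address("400 4th Street NW"))
# {'number': 400, 'street': '4th Street', 'quadrant': 'NW', 'blocks_from_capitol': 4}
```

For example, "400 4th Street NW" would sit roughly four blocks from the Capitol in the northwest quadrant, which is why the quadrant abbreviation is a mandatory part of every road name: the same number-and-street combination can occur in up to four quadrants.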
Historic buildings are designed primarily in the Queen Anne, Châteauesque, Richardsonian Romanesque, Georgian revival, Beaux - Arts, and a variety of Victorian styles. Rowhouses are especially prominent in areas developed after the Civil War and typically follow Federalist and late Victorian designs. Georgetown 's Old Stone House was built in 1765, making it the oldest - standing original building in the city. Founded in 1789, Georgetown University features a mix of Romanesque and Gothic Revival architecture. The Ronald Reagan Building is the largest building in the District with a total area of approximately 3.1 million square feet (288,000 m²). The U.S. Census Bureau estimates that the District 's population was 681,170 on July 1, 2016, a 13.2 % increase since the 2010 United States Census. The increase continues a growth trend since 2000, following a half - century of population decline. The city was the 24th most populous place in the United States as of 2010. According to data from 2010, commuters from the suburbs increase the District 's daytime population to over one million people. If the District were a state it would rank 49th in population, ahead of Vermont and Wyoming. The Washington Metropolitan Area, which includes the District and surrounding suburbs, is the sixth - largest metropolitan area in the United States with an estimated 6 million residents in 2014. When the Washington area is included with Baltimore and its suburbs, the Baltimore -- Washington Metropolitan Area had a population exceeding 9.5 million residents in 2014, the fourth - largest combined statistical area in the country. According to 2016 Census Bureau data, the population of Washington, D.C., was 47.7 % Black or African American, 44.6 % White (36.4 % non-Hispanic White), 4.1 % Asian, 0.6 % American Indian or Alaska Native, and 0.2 % Native Hawaiian or Other Pacific Islander. Individuals from two or more races made up 2.7 % of the population.
Hispanics of any race made up 10.9 % of the District 's population. Washington has had a significant African American population since the city 's foundation. African American residents composed about 30 % of the District 's total population between 1800 and 1940. The black population reached a peak of 70 % by 1970, but has since steadily declined due to many African Americans moving to the surrounding suburbs. Partly as a result of gentrification, there was a 31.4 % increase in the non-Hispanic white population and an 11.5 % decrease in the black population between 2000 and 2010. About 17 % of D.C. residents were age 18 or younger in 2010; lower than the U.S. average of 24 %. However, the District 's median age of 34 was lower than that of any of the 50 states. As of 2010, there were an estimated 81,734 immigrants living in Washington, D.C. Major sources of immigration include El Salvador, Vietnam, and Ethiopia, with a concentration of Salvadorans in the Mount Pleasant neighborhood. Researchers found that there were 4,822 same - sex couples in the District of Columbia in 2010; about 2 % of total households. Legislation authorizing same - sex marriage passed in 2009 and the District began issuing marriage licenses to same - sex couples in March 2010. A 2007 report found that about one - third of District residents were functionally illiterate, compared to a national rate of about one in five. This is attributed in part to immigrants who are not proficient in English. As of 2011, 85 % of D.C. residents age 5 and older spoke English at home as a primary language. Half of residents had at least a four - year college degree in 2006. D.C. residents had a personal income per capita of $55,755; higher than any of the 50 states. However, 19 % of residents were below the poverty level in 2005, higher than any state except Mississippi.
Of the District 's population, 17 % is Baptist, 13 % is Catholic, 6 % is Evangelical Protestant, 4 % is Methodist, 3 % is Episcopalian / Anglican, 3 % is Jewish, 2 % is Eastern Orthodox, 1 % is Pentecostal, 1 % is Buddhist, 1 % is Adventist, 1 % is Lutheran, 1 % is Muslim, 1 % is Presbyterian, 1 % is Mormon, and 1 % is Hindu. Over 90 % of D.C. residents have health insurance coverage, the second - highest rate in the nation. This is due in part to city programs that help provide insurance to low - income individuals who do not qualify for other types of coverage. A 2009 report found that at least 3 % of District residents have HIV or AIDS, which the Centers for Disease Control and Prevention (CDC) characterizes as a "generalized and severe '' epidemic. Crime in Washington, D.C., is concentrated in areas associated with poverty, drug abuse, and gangs. A 2010 study found that 5 % of city blocks accounted for over one - quarter of the District 's total crime. The more affluent neighborhoods of Northwest Washington are typically safe, but reports of violent crime increase in poorer neighborhoods generally concentrated in the eastern portion of the city. Approximately 60,000 residents are ex-convicts. Washington was often described as the "murder capital '' of the United States during the early 1990s. The number of murders peaked in 1991 at 479, but the level of violence then began to decline significantly. By 2012, Washington 's annual murder count had dropped to 88, the lowest total since 1961. The murder rate has since risen from that historic low, though it remains close to half the rate of the early 2000s. In 2016, the District 's Metropolitan Police Department tallied 135 homicides, a 53 % increase from 2012 but a 17 % decrease from 2015. Many neighborhoods such as Columbia Heights and Logan Circle are becoming safer and more vibrant.
However, incidents of robberies and thefts have remained higher in these areas because of increased nightlife activity and greater numbers of affluent residents. Even still, citywide reports of both property and violent crimes have declined by nearly half since their most recent highs in the mid-1990s. On June 26, 2008, the Supreme Court of the United States held in District of Columbia v. Heller that the city 's 1976 handgun ban violated the right to keep and bear arms as protected under the Second Amendment. However, the ruling does not prohibit all forms of gun control; laws requiring firearm registration remain in place, as does the city 's assault weapon ban. In addition to the District 's own Metropolitan Police Department, many federal law enforcement agencies have jurisdiction in the city as well; most visibly the U.S. Park Police, founded in 1791. Washington has a growing, diversified economy with an increasing percentage of professional and business service jobs. The gross state product of the District in 2010 was $103.3 billion, which would rank it No. 34 compared to the 50 states. The gross product of the Washington Metropolitan Area was $435 billion in 2014, making it the sixth - largest metropolitan economy in the United States. Between 2009 and 2016, GDP per capita in Washington, D.C., consistently ranked highest among U.S. states. In 2016, at $160,472, its GDP per capita was almost three times that of Massachusetts, which ranked second in the country. As of June 2011, the Washington Metropolitan Area had an unemployment rate of 6.2 %; the second - lowest rate among the 49 largest metro areas in the nation. The District of Columbia itself had an unemployment rate of 9.8 % during the same time period. In 2012, the federal government accounted for about 29 % of the jobs in Washington, D.C.
This is thought to immunize Washington against national economic downturns because the federal government continues operations even during recessions. Many organizations such as law firms, independent contractors (both defense and civilian), non-profit organizations, lobbying firms, trade unions, industry trade groups, and professional associations have their headquarters in or near D.C. to be close to the federal government. Tourism is Washington 's second largest industry. Approximately 18.9 million visitors contributed an estimated $4.8 billion to the local economy in 2012. The District also hosts nearly 200 foreign embassies and international organizations such as the World Bank, the International Monetary Fund (IMF), the Organization of American States, the Inter-American Development Bank, and the Pan American Health Organization. In 2008, the foreign diplomatic corps in Washington employed about 10,000 people and contributed an estimated $400 million annually to the local economy. The District has growing industries not directly related to government, especially in the areas of education, finance, public policy, and scientific research. Georgetown University, George Washington University, Washington Hospital Center, Children 's National Medical Center and Howard University are the top five non-government - related employers in the city as of 2009. According to statistics compiled in 2011, four of the largest 500 companies in the country were headquartered in the District. In the 2017 Global Financial Centres Index, Washington was ranked as having the 12th most competitive financial center in the world, and fifth most competitive in the United States (after New York City, San Francisco, Chicago, and Boston). The National Mall is a large, open park in downtown Washington between the Lincoln Memorial and the United States Capitol. Given its prominence, the mall is often the location of political protests, concerts, festivals, and presidential inaugurations.
The Washington Monument and the Jefferson Pier are near the center of the mall, south of the White House. Also on the mall are the National World War II Memorial at the east end of the Lincoln Memorial Reflecting Pool, the Korean War Veterans Memorial, and the Vietnam Veterans Memorial. Directly south of the mall, the Tidal Basin features rows of Japanese cherry blossom trees that originated as gifts from the nation of Japan. The Franklin Delano Roosevelt Memorial, George Mason Memorial, Jefferson Memorial, Martin Luther King Jr. Memorial, and the District of Columbia War Memorial are around the Tidal Basin. The National Archives houses thousands of documents important to American history including the Declaration of Independence, the United States Constitution, and the Bill of Rights. Located in three buildings on Capitol Hill, the Library of Congress is the largest library complex in the world with a collection of over 147 million books, manuscripts, and other materials. The United States Supreme Court Building was completed in 1935; before then, the court held sessions in the Old Senate Chamber of the Capitol. The Smithsonian Institution is an educational foundation chartered by Congress in 1846 that maintains most of the nation 's official museums and galleries in Washington, D.C. The U.S. government partially funds the Smithsonian and its collections are open to the public free of charge. The Smithsonian 's locations had a combined total of 30 million visits in 2013. The most visited museum is the National Museum of Natural History on the National Mall. Other Smithsonian Institution museums and galleries on the mall are: the National Air and Space Museum; the National Museum of African Art; the National Museum of American History; the National Museum of the American Indian; the Sackler and Freer galleries, which both focus on Asian art and culture; the Hirshhorn Museum and Sculpture Garden; the Arts and Industries Building; the S. 
Dillon Ripley Center; and the Smithsonian Institution Building (also known as "The Castle ''), which serves as the institution 's headquarters. The Smithsonian American Art Museum and the National Portrait Gallery are housed in the Old Patent Office Building, near Washington 's Chinatown. The Renwick Gallery is officially part of the Smithsonian American Art Museum but is in a separate building near the White House. Other Smithsonian museums and galleries include: the Anacostia Community Museum in Southeast Washington; the National Postal Museum near Union Station; and the National Zoo in Woodley Park. The National Gallery of Art is on the National Mall near the Capitol and features works of American and European art. The gallery and its collections are owned by the U.S. government but are not a part of the Smithsonian Institution. The National Building Museum, which occupies the former Pension Building near Judiciary Square, was chartered by Congress and hosts exhibits on architecture, urban planning, and design. There are many private art museums in the District of Columbia, which house major collections and exhibits open to the public such as the National Museum of Women in the Arts; the Corcoran Gallery of Art, the largest private museum in Washington; and The Phillips Collection in Dupont Circle, the first museum of modern art in the United States. Other private museums in Washington include the Newseum, the O Street Museum Foundation, the International Spy Museum, the National Geographic Society Museum, and the Marian Koshland Science Museum. The United States Holocaust Memorial Museum near the National Mall maintains exhibits, documentation, and artifacts related to the Holocaust. Washington, D.C., is a national center for the arts. The John F. Kennedy Center for the Performing Arts is home to the National Symphony Orchestra, the Washington National Opera, and the Washington Ballet. 
The Kennedy Center Honors are awarded each year to those in the performing arts who have contributed greatly to the cultural life of the United States. The historic Ford 's Theatre, site of the assassination of President Abraham Lincoln, continues to operate as a functioning performance space as well as a museum. The Marine Barracks near Capitol Hill houses the United States Marine Band; founded in 1798, it is the country 's oldest professional musical organization. American march composer and Washington - native John Philip Sousa led the Marine Band from 1880 until 1892. Founded in 1925, the United States Navy Band has its headquarters at the Washington Navy Yard and performs at official events and public concerts around the city. Washington has a strong local theater tradition. Founded in 1950, Arena Stage achieved national attention and spurred growth in the city 's independent theater movement that now includes organizations such as the Shakespeare Theatre Company, Woolly Mammoth Theatre Company, and the Studio Theatre. Arena Stage opened its newly renovated home in the city 's emerging Southwest waterfront area in 2010. The GALA Hispanic Theatre, now housed in the historic Tivoli Theatre in Columbia Heights, was founded in 1976 and is a National Center for the Latino Performing Arts. The U Street Corridor in Northwest D.C., known as "Washington 's Black Broadway '', is home to institutions like the Howard Theatre, Bohemian Caverns, and the Lincoln Theatre, which hosted music legends such as Washington - native Duke Ellington, John Coltrane, and Miles Davis. Washington has its own native music genre called go - go; a post-funk, percussion - driven flavor of rhythm and blues that was popularized in the late 1970s by D.C. band leader Chuck Brown. The District is an important center for indie culture and music in the United States.
The label Dischord Records, formed by Ian MacKaye, was one of the most crucial independent labels in the genesis of 1980s punk and eventually indie rock in the 1990s. Modern alternative and indie music venues like The Black Cat and the 9: 30 Club bring popular acts to the U Street area. Washington is one of 13 cities in the United States with teams from all four major professional men 's sports and is home to one major professional women 's team. The Washington Wizards (National Basketball Association), the Washington Capitals (National Hockey League), and the Washington Mystics (Women 's National Basketball Association), play at the Capital One Arena in Chinatown. Nationals Park, which opened in Southeast D.C. in 2008, is home to the Washington Nationals (Major League Baseball). D.C. United (Major League Soccer) plays at RFK Stadium. The Washington Redskins (National Football League) play at FedExField in nearby Landover, Maryland. Current D.C. teams have won a combined ten professional league championships: the Washington Redskins have won five; D.C. United has won four; and the Washington Wizards (then the Washington Bullets) have won a single championship. Other professional and semi-professional teams in Washington include: the Washington Kastles (World TeamTennis); the Washington D.C. Slayers (USA Rugby League); the Baltimore Washington Eagles (U.S. Australian Football League); the D.C. Divas (Independent Women 's Football League); and the Potomac Athletic Club RFC (Rugby Super League). The William H.G. FitzGerald Tennis Center in Rock Creek Park hosts the Citi Open. Washington is also home to two major annual marathon races: the Marine Corps Marathon, which is held every autumn, and the Rock ' n ' Roll USA Marathon held in the spring. The Marine Corps Marathon began in 1976 and is sometimes called "The People 's Marathon '' because it is the largest marathon that does not offer prize money to participants. 
The District 's four NCAA Division I teams, American Eagles, George Washington Colonials, Georgetown Hoyas and Howard Bison and Lady Bison, have a broad following. The Georgetown Hoyas men 's basketball team is the most notable and also plays at the Capital One Arena. From 2008 to 2012, the District hosted an annual college football bowl game at RFK Stadium, called the Military Bowl. The D.C. area is home to one regional sports television network, Comcast SportsNet (CSN), based in Bethesda, Maryland. Washington, D.C., is a prominent center for national and international media. The Washington Post, founded in 1877, is the oldest and most - read local daily newspaper in Washington. It is probably most notable for its coverage of national and international politics and for exposing the Watergate scandal. "The Post '', as it is popularly called, had the sixth - highest readership of all news dailies in the country in 2011. The Washington Post Company also publishes a daily free commuter newspaper called the Express, which summarizes events, sports and entertainment, as well as the Spanish - language paper El Tiempo Latino. Another popular local daily is The Washington Times, the city 's second general interest broadsheet and also an influential paper in political circles. The alternative weekly Washington City Paper also has a substantial readership in the Washington area. Some community and specialty papers focus on neighborhood and cultural issues, including the weekly Washington Blade and Metro Weekly, which focus on LGBT issues; the Washington Informer and The Washington Afro American, which highlight topics of interest to the black community; and neighborhood newspapers published by The Current Newspapers. Congressional Quarterly, The Hill, Politico and Roll Call newspapers focus exclusively on issues related to Congress and the federal government.
Other publications based in Washington include the National Geographic magazine and political publications such as The Washington Examiner, The New Republic and Washington Monthly. The Washington Metropolitan Area is the ninth - largest television media market in the nation, with two million homes, approximately 2 % of the country 's population. Several media companies and cable television channels have their headquarters in the area, including C - SPAN; Black Entertainment Television (BET); Radio One; the National Geographic Channel; Smithsonian Networks; National Public Radio (NPR); Travel Channel (in Chevy Chase, Maryland); Discovery Communications (in Silver Spring, Maryland); and the Public Broadcasting Service (PBS) (in Arlington, Virginia). The headquarters of Voice of America, the U.S. government 's international news service, is near the Capitol in Southwest Washington. Article One, Section Eight of the United States Constitution grants the United States Congress "exclusive jurisdiction '' over the city. The District did not have an elected local government until the passage of the 1973 Home Rule Act. The Act devolved certain Congressional powers to an elected mayor, currently Muriel Bowser, and the thirteen - member Council of the District of Columbia. However, Congress retains the right to review and overturn laws created by the council and intervene in local affairs. Each of the city 's eight wards elects a single member of the council and residents elect four at - large members to represent the District as a whole. The council chair is also elected at - large. There are 37 Advisory Neighborhood Commissions (ANCs) elected by small neighborhood districts. ANCs can issue recommendations on all issues that affect residents; government agencies take their advice under careful consideration. The Attorney General of the District of Columbia, currently Karl Racine, is elected to a four - year term. 
Washington, D.C., observes all federal holidays and also celebrates Emancipation Day on April 16, which commemorates the end of slavery in the District. The flag of Washington, D.C., was adopted in 1938 and is a variation on George Washington 's family coat of arms. The mayor and council set local taxes and a budget, which must be approved by Congress. The Government Accountability Office and other analysts have estimated that the city 's high percentage of tax - exempt property and the Congressional prohibition of commuter taxes create a structural deficit in the District 's local budget of anywhere between $470 million and over $1 billion per year. Congress typically provides additional grants for federal programs such as Medicaid and the operation of the local justice system; however, analysts claim that the payments do not fully resolve the imbalance. The city 's local government, particularly during the mayoralty of Marion Barry, was criticized for mismanagement and waste. During his administration in 1989, The Washington Monthly magazine claimed that the District had "the worst city government in America. '' In 1995, at the start of Barry 's fourth term, Congress created the District of Columbia Financial Control Board to oversee all municipal spending. Mayor Anthony Williams won election in 1998 and oversaw a period of urban renewal and budget surpluses. The District regained control over its finances in 2001 and the oversight board 's operations were suspended. The District is not a state and therefore has no voting representation in the Congress. D.C. residents elect a non-voting delegate to the House of Representatives, currently Eleanor Holmes Norton (D - D.C. At - Large), who may sit on committees, participate in debate, and introduce legislation, but can not vote on the House floor. The District has no official representation in the United States Senate. Neither chamber seats the District 's elected "shadow '' representative or senators. 
Unlike residents of U.S. territories such as Puerto Rico or Guam, which also have non-voting delegates, D.C. residents are subject to all federal taxes. In the financial year 2012, D.C. residents and businesses paid $20.7 billion in federal taxes; more than the taxes collected from 19 states and the highest federal taxes per capita. A 2005 poll found that 78 % of Americans did not know that residents of the District of Columbia have less representation in Congress than residents of the 50 states. Efforts to raise awareness about the issue have included campaigns by grassroots organizations and featuring the city 's unofficial motto, "Taxation Without Representation '', on D.C. vehicle license plates. There is evidence of nationwide approval for D.C. voting rights; various polls indicate that 61 to 82 % of Americans believe that D.C. should have voting representation in Congress. Despite public support, attempts to grant the District voting representation, including the D.C. statehood movement and the proposed District of Columbia Voting Rights Amendment, have been unsuccessful. Opponents of D.C. voting rights argue that the Founding Fathers never intended for District residents to have a vote in Congress since the Constitution makes clear that representation must come from the states. Those opposed to making D.C. a state claim that such a move would destroy the notion of a separate national capital and that statehood would unfairly grant Senate representation to a single city. Washington, D.C., has fourteen official sister city agreements.
Listed in the order each agreement was first established, they are: Bangkok, Thailand (1962, renewed 2002); Dakar, Senegal (1980, renewed 2006); Beijing, China (1984, renewed 2004); Brussels, Belgium (1985, renewed 2002); Athens, Greece (2000); Paris, France (2000 as a friendship and cooperation agreement, renewed 2005); Pretoria, South Africa (2002, renewed 2008); Seoul, South Korea (2006); Accra, Ghana (2006); Sunderland, United Kingdom (2006); Rome, Italy (2011); Ankara, Turkey (2011); Brasília, Brazil (2013); and Addis Ababa, Ethiopia (2013). Each of the listed cities is a national capital except for Sunderland, which includes the town of Washington, the ancestral home of George Washington 's family. Paris and Rome are each formally recognized as a "partner city '' due to their policy of maintaining only one sister city. District of Columbia Public Schools (DCPS) operates the city 's 123 public schools. The number of students in DCPS steadily decreased for 39 years until 2009. In the 2010 -- 11 school year, 46,191 students were enrolled in the public school system. DCPS has one of the highest - cost yet lowest - performing school systems in the country, both in terms of infrastructure and student achievement. Mayor Adrian Fenty 's administration made sweeping changes to the system by closing schools, replacing teachers, firing principals, and using private education firms to aid curriculum development. The District of Columbia Public Charter School Board monitors the 52 public charter schools in the city. Due to the perceived problems with the traditional public school system, enrollment in public charter schools has steadily increased. As of fall 2010, D.C. charter schools had a total enrollment of about 32,000, a 9 % increase from the prior year. The District is also home to 92 private schools, which enrolled approximately 18,000 students in 2008. The District of Columbia Public Library operates 25 neighborhood locations including the landmark Martin Luther King Jr.
Memorial Library. Private universities include American University (AU), the Catholic University of America (CUA), Gallaudet University, George Washington University (GW), Georgetown University (GU), Howard University, the Johns Hopkins University School of Advanced International Studies (SAIS), and Trinity Washington University. The Corcoran College of Art and Design provides specialized arts instruction and other higher - education institutions offer continuing, distance and adult education. The University of the District of Columbia (UDC) is a public university providing undergraduate and graduate education. D.C. residents may also be eligible for a grant of up to $10,000 per year to offset the cost of tuition at any public university in the country. The District is known for its medical research institutions such as Washington Hospital Center and the Children 's National Medical Center, as well as the National Institutes of Health in Bethesda, Maryland. In addition, the city is home to three medical schools and associated teaching hospitals at George Washington, Georgetown, and Howard universities. There are 1,500 miles (2,400 km) of streets, parkways, and avenues in the District. Due to the freeway revolts of the 1960s, much of the proposed interstate highway system through the middle of Washington was never built. Interstate 95 (I - 95), the nation 's major east coast highway, therefore bends around the District to form the eastern portion of the Capital Beltway. A portion of the proposed highway funding was directed to the region 's public transportation infrastructure instead. The interstate highways that continue into Washington, including I - 66 and I - 395, both terminate shortly after entering the city. The Washington Metropolitan Area Transit Authority (WMATA) operates the Washington Metro, the city 's rapid transit system, as well as Metrobus. Both systems serve the District and its suburbs. 
Metro opened on March 27, 1976 and, as of July 2014, consists of 91 stations and 117 miles (188 km) of track. With an average of about one million trips each weekday, Metro is the second - busiest rapid transit system in the country. Metrobus serves over 400,000 riders each weekday and is the nation 's fifth - largest bus system. The city also operates its own DC Circulator bus system, which connects commercial areas within central Washington. Union Station is the city 's main train station and services approximately 70,000 people each day. It is Amtrak 's second - busiest station with 4.6 million passengers annually and is the southern terminus for the Northeast Corridor and Acela Express routes. Maryland 's MARC and Virginia 's VRE commuter trains and the Metrorail Red Line also provide service into Union Station. Following renovations in 2011, Union Station became Washington 's primary intercity bus transit center. Three major airports serve the District. Ronald Reagan Washington National Airport is across the Potomac River from downtown Washington in Arlington, Virginia and primarily handles domestic flights. Major international flights arrive and depart from Washington Dulles International Airport, 26.3 miles (42.3 km) west of the District in Fairfax and Loudoun counties in Virginia. Baltimore - Washington International Thurgood Marshall Airport is 31.7 miles (51.0 km) northeast of the District in Anne Arundel County, Maryland. According to a 2010 study, Washington - area commuters spent 70 hours a year in traffic delays, which tied with Chicago for having the nation 's worst road congestion. However, 37 % of Washington - area commuters take public transportation to work, the second - highest rate in the country. An additional 12 % of D.C. commuters walked to work, 6 % carpooled, and 3 % traveled by bicycle in 2010. 
A 2011 study by Walk Score found that Washington was the seventh-most walkable city in the country with 80 % of residents living in neighborhoods that are not car dependent. An expected 32 % increase in transit usage within the District by 2030 has spurred construction of a new DC Streetcar system to interconnect the city 's neighborhoods. Construction has also started on an additional Metro line that will connect Washington to Dulles airport. The District is part of the regional Capital Bikeshare program. Started in 2010, it is currently one of the largest bicycle sharing systems in the country with over 4,351 bicycles and more than 395 stations all provided by PBSC Urban Solutions. By 2012, the city 's network of marked bicycle lanes covered 56 miles (90 km) of streets. The District of Columbia Water and Sewer Authority (i.e. WASA or D.C. Water) is an independent authority of the D.C. government that provides drinking water and wastewater collection in Washington. WASA purchases water from the historic Washington Aqueduct, which is operated by the Army Corps of Engineers. The water, sourced from the Potomac River, is treated and stored in the city 's Dalecarlia, Georgetown, and McMillan reservoirs. The aqueduct provides drinking water for a total of 1.1 million people in the District and Virginia, including Arlington, Falls Church, and a portion of Fairfax County. The authority also provides sewage treatment services for an additional 1.6 million people in four surrounding Maryland and Virginia counties. Pepco is the city 's electric utility and services 793,000 customers in the District and suburban Maryland. An 1889 law prohibits overhead wires within much of the historic City of Washington. As a result, all power lines and telecommunication cables are located underground in downtown Washington, and traffic signals are placed at the edge of the street. 
A plan announced in 2013 would bury an additional 60 miles (97 km) of primary power lines throughout the District. Washington Gas is the city 's natural gas utility and serves over one million customers in the District and its suburbs. Incorporated by Congress in 1848, the company installed the city 's first gas lights in the Capitol, the White House, and along Pennsylvania Avenue.
how many countries is a serbian film banned in
A Serbian Film - Wikipedia

A Serbian Film (Serbian: Српски филм, translit. Srpski film) is a 2010 Serbian horror film produced and directed by Srđan Spasojević, in his feature film debut. Spasojević also co-wrote the film with Aleksandar Radivojević. It tells the story of a financially struggling porn star who agrees to participate in an "art film", only to discover that he has been drafted into a snuff film with pedophilic and necrophilic themes. The film stars Serbian actors Srđan Todorović and Sergej Trifunović. Upon its debut on the art film circuit, the film received substantial attention for its graphic depictions of rape, necrophilia and child sexual abuse. The Serbian state investigated the film for crimes against sexual morals and crimes related to the protection of minors. The film has been banned in Spain, Germany, Australia, New Zealand, Malaysia, Singapore, Norway, and South Korea, and was temporarily banned from screening in Brazil. Financially struggling Miloš, a former porn star known for his talent, lives with his wife, Marija, and six-year-old son, Petar. His brother, Marko, a corrupt police officer, is attracted to Marija and is jealous of Miloš's sexual prowess. Marija is curious about her husband's past and is concerned about the family's income. Lejla, a former co-star, offers Miloš a starring role in an art film directed by Vukmir, an independent pornographer, who wishes to cast Miloš for his powerful erection. Having already caught Petar watching one of his films and unaware of the details of Vukmir's film, Miloš is hesitant to participate and continue his career, but accepts to secure his family's financial future. While meeting Vukmir, Miloš passes a bald man and his entourage, regarding them warily. Filming begins at an orphanage, where Vukmir feeds Miloš instructions through an earpiece given by Vukmir's driver, Raša, while a film crew follows him.
Miloš sees a young woman being physically abused and scolded by her mother, having disgraced her deceased war hero husband's memory by becoming a prostitute. In a dark room, screens show Jeca, a girl of indeterminate age, seductively eating an ice pop, while Miloš is fellated by a nurse. Then, Miloš is instructed to receive fellatio from the mother, while Jeca watches. Miloš refuses, but is forced to continue. Marko later informs him that Vukmir is a former psychologist and has worked in children's television and state security. Miloš meets with Vukmir, announcing that he is retiring and dropping out of the film, but Vukmir explains to a hesitant Miloš his artistic style of pornography, showing a film of a woman giving birth to a newborn baby, which is then immediately raped by Raša, much to the joy of the mother. The disgusted and horrified Miloš storms out and drives away as Vukmir boasts to him that this is "a new genre", which he terms "newborn porn". At a road junction, in a disturbed state of mind, he is approached and seduced by an attractive woman who, unbeknownst to him, is Vukmir's doctor. A bloodied Miloš wakes up in his bed the next morning with no memory of what has happened. He returns to the now abandoned set and finds a number of tapes. Viewing them, Miloš discovers that he was drugged to induce an aggressive, sexually aroused, and suggestible state. At Vukmir's manipulative direction, Miloš beat and decapitated Jeca's mother while raping her and was later raped by one of the guards. He then watches footage of Lejla voicing concern for Miloš to Vukmir, stating that she is quitting and taking Miloš with her. A bloodied Lejla is then shown restrained, with a blood puddle and several teeth on the floor right in front of her. A masked man appears and she is forced to fellate him, suffocating to death.
The footage continues as Miloš is led to Jeca's home, where an elderly woman praises him for killing Jeca's mother, laments that Jeca's father died before he "made her a woman", and offers Jeca as a "virgin commune". Miloš refuses, threatens to cut off his penis with a knife, and escapes through a window. After wandering the streets for a while, he ends up huddling in an alleyway, where he watches as a teenage girl passes by while being tailed by a pair of thugs. He begins masturbating and is assaulted by the thugs before they are killed by Raša, who along with Vukmir takes Miloš to a warehouse. At the warehouse, Vukmir's doctor administers more drugs to Miloš, who in an angry outburst sticks a syringe into her neck, rendering her unconscious from the overdose. He is then taken into a large room, where he is made to have intercourse with two hidden bodies placed under sheets and with bags on their heads. Miloš furiously begins penetrating them while keeping them restrained, and as he swaps from one to the other, the masked man from Lejla's film enters and begins raping the first. Vukmir then reveals the masked man to be Marko, his victim to be Marija, and finally, that Miloš is raping Petar. At this moment, the agonized female doctor enters the room, her crotch entirely covered in blood and a bloody pipe in her hand, attracting everyone's attention before collapsing dead. Snapping, an enraged Miloš lunges at Vukmir and repeatedly smashes his head against the floor, initiating a brawl during which Marija bites off a piece of Marko's neck, then bludgeons him to death with a sculpture. Miloš wrestles with the guards and seizes one of their guns, shooting both of them and injuring the one-eyed Raša, whom he kills by ramming his erect penis into his empty eye socket. During all of this, a dying Vukmir praises Miloš's actions as truly worthy of cinema.
Miloš, having recalled his actions up to that point, including locking his wife and son in their basement before passing out earlier, smashes Marko's head with the sculpture in a fit of impotence and despair, before returning home to find both of them in shock, with Petar totally unresponsive. After coping with the matter for hours, Miloš and his wife ultimately agree, in silence, that they and their son should die together, so the three gather in bed and embrace before Miloš fires a fatal shot through himself, Petar and Marija. Sometime later, a new film crew, including the bald man from the beginning of the film, is shown recording in the bedroom. One of the security guards begins to unzip his pants and the director, the unnamed bald man, advises him to "start with the little one". Spasojević and Radivojević have stated that the film is a parody of modern politically correct films made in Serbia, which are financially supported by foreign funds. When asked why they chose the title 'Srpski Film', Radivojević answered, "We have become synonyms for chaos and lunacy. The title is a cynical reference to that image. Srpski Film is also a metaphor for our national cinema -- boring, predictable and altogether unintentionally hilarious -- which throughout our film is to some extent commented on and subtly parodied." Similarly, Radivojević describes Serbian cinema as "... pathetic state financed films made by people who have no sense or connection to film, but are strongly supported by foreign funds. Quality of the film is not their concern, only the bureaucratic upholding of the rule book on political correctness." According to Spasojević, the character of Vukmir is "an exaggerated representation of the new European film order... In Eastern Europe, you cannot get your film financed unless you have a barefoot girl who cries on the streets, or some story about war victims in our region...
the Western world has lost feelings, so they 're searching for false ones, they want to buy feelings. '' In another interview Spasojević is quoted as saying the film "denounces the fascism of political correctness. '' Questioned by the Croatian media on whether the violence depicted deals with crimes committed by Serbian soldiers during the Yugoslav Wars, Spasojević answered, "' Srpski Film ' does not touch upon war themes, but in a metaphorical way deals with the consequences of postwar society and a man that is exploited to the extreme in the name of securing the survival of his family. '' The first ever showing of A Serbian Film took place on 15 March 2010 at midnight in Austin as part of the 2010 South by Southwest. During the introduction by Alamo Drafthouse Cinema 's owner Tim League, the audience in the theater was once again warned about the extreme nature of the scenes they were about to see and given one last chance to leave the screening. He also coaxed a handful of audience members to join him on the stage -- where they jointly snorted lines of salt, squeezed lime juice into their eyes and took shots of tequila in order to "understand what Serbians have been through to create a culture of A Serbian Film ''. The following day, the film played once more. Next was the screening at the Brussels International Festival of Fantasy Film in April. On 11 June 2010, the film screened in Serbia as part of the Cinema City festival in Novi Sad. The film was run on 16 -- 19 July 2010 during the Fantasia Festival in Montreal as part of the Subversive Serbia program. The film was due to screen on 29 August 2010 at the Film Four FrightFest in London, UK but was pulled by the organizers following the intervention of Westminster Council. Films shown at this festival are usually shown pre-certificate but in this case Westminster Council refused to grant permission for its exhibition until it had been classified by the BBFC. 
Following its DVD submission to the BBFC (there were no theatrical materials available in the time frame requested for a proper theatrical classification), 49 cuts totaling four minutes and eleven seconds were requested for DVD certification. The UK distributor, Revolver Entertainment, initially looked into the possibilities of the process, but it became clear that the film would then have to be resubmitted to the BBFC and further cuts may then have been required. It was decided that to show a heavily edited version was not in the spirit of the festival and consequently its exhibition was pulled from the schedule. The film was replaced at the festival by Rodrigo Cortés ' Buried starring Ryan Reynolds. The Raindance Film Festival, that picked up the film at the Cannes Film Festival in May, subsequently held the UK premiere and "found a way around the ban by billing the screening as a ' private event ' ''. The Sun tabloid described the film as ' sick ' and ' vile ' following the festival 's 2010 Press Launch and Westminster Council requested to monitor the invitations to the screening. The 35mm print was shipped from the BBFC for the 8 October 2010 premiere. On 21 October 2010, the film had a single screening at Toronto 's Bloor Cinema. It took place as part of the monthly event called Cinemacabre Movie Nights organized by the Rue Morgue magazine. The publication also spotlighted the film and featured it on its cover. On 26 November 2010, the film was refused classification by the Australian Classification Board, banning sales and public showings of the film in Australia. However, on 5 April 2011, the Australian Classification Board approved a censored version of the film. Later in 2011, the censored version was also re-refused classification after review. On 12 and 16 July 2011, the film was screened at FANTASPOA in Porto Alegre, Brazil and at least at one other film festival in the country, before being banned just before a screening in Rio de Janeiro. 
Initially the ban applied only in Rio, but afterwards the decision became valid throughout the country, pending further judgement of the film. In March 2011, A Serbian Film won the Special Jury Prize at the 31st edition of Fantasporto, Portugal's biggest film festival, in Porto. On 24 September 2010, A Serbian Film was released uncensored (104 minutes) in Serbian theaters, with screening times scheduled late at night. The film had a limited release in UK theaters on 10 December 2010 in the edited form (99 minutes), with four minutes and eleven seconds of its original content removed by the British Board of Film Classification due to "elements of sexual violence that tend to eroticize or endorse sexual violence". A Serbian Film thus became the most censored cinema release in Britain since the 1994 Indian film Nammavar, which had five minutes and eight seconds of its violent content removed. The film had a limited release in the United States on 6 May 2011, edited to 98 minutes with an NC-17 rating. It was released on VOD at the website FlixFling on the same day, only slightly edited, at 103 minutes. The film's North American DVD and Blu-ray release was on 25 October 2011 through Invincible Pictures. Netflix has refused to carry the film, as have wholesale outlets Ingram and VPD. It is available on demand at FlixFling.com. Through Invincible Pictures, a limited edition uncut version was released via DVD on 22 May 2012. Tom Ashley, CEO of the distribution company, had this to say: "Of course we would have preferred an uncut release last year. Unfortunately, the charges brought against Mr. Sala (director of the Sitges Film Festival) were something we had to seriously factor into that release. Now that those charges have been dropped, we can bring A Serbian Film to its fans as its director had intended.
'' In September 2011, without any official explanation, Netflix removed the film from their list of titles available for viewing, and from their in - site search results. It remains available in uncensored form on other major online DVD sites. A Serbian Film was banned by a court in San Sebastián, Spain for "threatening sexual freedom '' and thus could not be shown in the XXI Semana de Cine Fantástico y de Terror (21st Horror and Fantasy Film Festival). The film was shown at an adults - only screening at the Spanish Sitges Film Festival during October 2010. As a result, the festival 's director Ángel Sala was charged with exhibiting child pornography by the Spanish prosecutor who decided to take action in May 2011 after receiving a complaint from a Roman Catholic organization over a pair of scenes involving the rapes of a young child and a newborn. The charges were later dropped. Upon initial release, the FSK (German motion picture rating organisation) ordered that the film be refused classification due to concerns that the content may violate German federal law. On 30 June 2011, a version was allowed with 13 minutes cut, and was rated "No release to youths '' (released to age 18 or older, German: Keine Jugendfreigabe). The film was banned in Norway after two months of sales as it was found to be in violation of criminal law (namely sections 204a and 382, which deal with the sexual representation of children and extreme violence). The film was temporarily banned for screening in Brazil. Although the film was given a "not recommended for those under the age of 18, due to depictions of sex, pedophilia, violence and cruelty '' rating by the Dejus, a legal decision banned it temporarily due to its content "offending the government of Brazil ''. This was the first time a film was banned in Brazil since the promulgation of the 1988 Constitution. On 5 July 2012, this decision was overturned. 
In Australia, the uncut version and a second version with 2 minutes cut were refused classification before a third version, with 3 minutes and 55 seconds cut, was passed with an R18+ classification. Before its release, major Australian DVD retailer JB Hi-Fi announced that it would not be distributing the film, either online or in physical stores, attributing this to the "disturbing content of the film" and to a disagreement with the (then) R18+ rating. However, the film was available from this retailer for a time. It was refused classification, and thus effectively banned, in South Australia just days before its release date. On 19 September 2011, the Australian Classification Review Board also rated the film "Refused Classification", effectively banning the film from distribution Australia-wide. According to the Review Board, "A Serbian Film could not be accommodated within the R18+ classification as the level of depictions of sexual violence, themes of incest, and depictions of child sexual abuse in the film has an impact which is very high and not justified by context." Accordingly, the film is banned in Australia. On 25 May 2012, the film was banned outright by the New Zealand Office of Film & Literature Classification. On 24 August 2012, the film was rejected and banned without question by the Film Censorship Board of Malaysia. On the same day, it was banned in Singapore due to its content being "likely to cause controversy in Singapore". The film was twice given a restricted ("Limited screening") rating by South Korea's Korea Media Rating Board (KMRB). The first edit, submitted on 9 August 2011 with a duration of 94 minutes, was rejected for extreme violence. The second edit, trimmed to 88 minutes and labelled the director's edition, was submitted on 6 October 2011 but was also given the same restricted rating, this time for extreme themes. The film was released to great controversy over its portrayal of sexual violence.
Spasojević has responded to the controversy with "This is a diary of our own molestation by the Serbian government... It 's about the monolithic power of leaders who hypnotize you to do things you do n't want to do. You have to feel the violence to know what it 's about. '' While acknowledging some level of conservatism among the public and theater owners, Spasojević says that government - enforced censorship in Serbia is non-existent and was not the driving force behind the making of A Serbian Film: "In Serbia we do n't have ratings, there is no law forbidding anything from being shown in a film and there is no law forbidding anyone from buying a ticket. '' Blic 's Milan Vlajčić penned a middle - of - the - road review, praising the direction, technical aspects, "effective iconography '', and "video game pacing '' while saying that the film was taken to the edges of self - parody. Đorđe Bajić and Zoran Janković of the web magazine Popboks gave the film a highly affirmative review, summing it up as "the dark Grand Guignol that shreds its celluloid victims with unconcealed intensity while showing in full color and detail, the collapse of the last bastions of decency, morality, and rationality '' and concluding that "it has a lot to say outside of the mere and unrestrained exploitation. '' In an interview, Serbian actor and film director Dragan Bjelogrlić criticized the film: "Shallow and plain wrong -- sum up my feelings about this movie. I have a problem with A Serbian Film. Its director in particular. I 've got a serious problem with the boy whose father got wealthy during the 1990s -- nothing against making money, but I know how money was made in Serbia during the 1990s -- and then pays for his son 's education abroad and eventually the kid comes back to Serbia to film his view of the country using his dad 's money and even calls the whole thing A Serbian Film. To me that 's a metaphor for something unacceptable. 
The second generation comes back to the country and using the money that was robbed from the people of Serbia, smears the very same people by portraying them as the worst scum of the earth. You know, when the first generation of the Rockefellers finished robbing America, the second one built museums, galleries, charitable organizations, and financed America. But in Serbia we 're seeing every segment of society continually being taken apart and for me this movie is a paradigm of that. I 've never met this kid and I really do n't want to since that meeting would n't be pleasant at all. '' Based on 29 reviews collected by the film review aggregator Rotten Tomatoes, 45 % of critics gave A Serbian Film a positive review, with an average rating of 5.1 out of 10. A.O. Scott of the New York Times wrote in his review, "At first glance -- and few are likely to dare a second -- it belongs in the high - concept shock - horror tradition whose most recent and notorious specimen is probably The Human Centipede. As is often the case with movies like this, A Serbian Film revels in its sheer inventive awfulness and dares the viewer to find a more serious layer of meaning. '' Karina Longworth of the Village Voice called the film "a passionate argument against a no - holds - barred exploration of extreme human sexuality and violence '' and referred to the film 's supposed commentary on the sad state of post-Milošević Serbian society as "specious lip service. '' She concludes: "That this film exists at all is a more cogent commentary on the nation 's collective trauma than any of the direct statements or potential metaphors contained within. '' Scott Weinberg wrote "I think the film is tragic, sickening, disturbing, twisted, absurd, infuriated, and actually quite intelligent. There are those who will be unable (or unwilling) to decipher even the most basic of ' messages ' buried within A Serbian Film, but I believe it 's one of the most legitimately fascinating films I 've ever seen. 
I admire and detest it at the same time. And I will never watch it again. Ever." Alison Willmore wrote that "Movies can use transgressive topics and imagery toward great artistic resonance. They can also just use them for pure shock / novelty / boundary-pushing, which is where I'd group Serbian Film. That it comes from a country that's spent decades deep in violent conflict, civil unrest, corruption and ethnic tensions makes it tempting to read more into the film than I think it actually offers -- ultimately, it has as much to say about its country of origin as (Eli Roth's) Hostel does about America, which is a little, but nothing on the scale its title suggests." Ain't It Cool News' Harry Knowles listed it in his Top 10 films of 2010, stating "This is a fantastic, brilliant film -- that given time, will eventually outgrow the absurd reactions of people that think it is a far harder film than it actually is." Time Out New York's Joshua Rothkopf accuses A Serbian Film of pandering to "mouth-breathing gorehounds who found Hostel a bit too soft (i.e., fanatics who would hijack the horror genre into extremity because deeper thinking is too hard)" before concluding that "the movie says as much about Eastern Europe as Twilight does about the Pacific Northwest." Tim Anderson of horror review site Bloody Disgusting strongly discouraged anyone from ever viewing the film, writing, "If what I have written here is enough to turn your feelings of wonder into a burning desire to watch this monstrosity, then perhaps I haven't been clear enough. You don't want to see Serbian Film. You just think you do." In his very negative review of A Serbian Film, BBC Radio 5 Live's Mark Kermode called it a "nasty piece of exploitation trash in the mould of Jörg Buttgereit and Ruggero Deodato", going on to add that "if it is somehow an allegory of Serbian family and Serbian politics then the allegory gets lost amidst the increasingly stupid splatter.
'' Furthermore, he mentioned A Serbian Film again in his review of Fred: The Movie, pairing the two as his least favorite viewing experiences of the year. Calum Waddell of Total Sci - Fi in a negative review took issue with the filmmakers ' statements that their film says something about the politics of Serbia, writing, "if you want to learn about Serbia, chances are, you wo n't be watching a movie whose main claim to fame is that a man rapes a newborn baby '', before concluding that "Srđan Spasojević will go to his grave being known as the guy who filmed a grown man having sex with a baby. And that 's something that -- despite all of the money, attention and champagne parties at Cannes -- I would never want on my conscience. Good luck to him in regaining some humanity. '' Total Film awarded the film two stars out of five, finding the film 's shock hype not to be fully deserved: "... a film that was slightly silly and none - too - distressing to begin with. Works best as a reflection on modern day porn 's obsession with masochism and humiliation. ''
who was involved in apartheid in south africa
Apartheid - Wikipedia

Apartheid (South African English pronunciation: /əˈpɑːrteɪd/; Afrikaans: (aˈpartɦəit)) was a system of institutionalised racial segregation and discrimination in South Africa between 1948 and 1991. Broadly speaking, apartheid was delineated into petty apartheid, which entailed the segregation of public facilities and social events, and grand apartheid, which dictated housing and employment opportunities by race. Prior to the 1940s, some aspects of apartheid had already emerged in the form of minority rule by white South Africans and the socially enforced separation of black South Africans from other races, which later extended to pass laws and land apportionment. Apartheid was adopted as a formal policy by the South African government after the rise of the National Party (NP) in the country's 1948 general elections. A codified system of racial stratification began to take form in South Africa under the Dutch Empire in the late eighteenth century, although informal segregation was present much earlier due to social cleavages between Dutch colonists and a creolised, ethnically diverse slave population. With the rapid growth and industrialisation of the British Cape Colony in the nineteenth century, racial policies and laws became increasingly rigid. Cape legislation that discriminated specifically against black Africans began appearing shortly before 1900. The policies of the Boer republics were also racially exclusive; for instance, the Transvaal constitution barred nonwhite participation in church and state. The first apartheid law was the Prohibition of Mixed Marriages Act, 1949, followed closely by the Immorality Act of 1950, which made it illegal for most South African citizens to marry or pursue sexual relationships across racial lines.
The Population Registration Act, 1950 classified all South Africans into one of four racial groups based on appearance, known ancestry, socioeconomic status, and cultural lifestyle: "black", "white", "coloured", and "Indian", the last two of which included several sub-classifications. Places of residence were determined by racial classification. From 1960 to 1983, 3.5 million nonwhite South Africans were removed from their homes and forced into segregated neighbourhoods, in one of the largest mass removals in modern history. Most of these targeted removals were intended to restrict the black population to ten designated "tribal homelands", also known as bantustans, four of which became nominally independent states. The government announced that relocated persons would lose their South African citizenship as they were absorbed into the bantustans. Apartheid sparked significant international and domestic opposition, resulting in some of the most influential global social movements of the twentieth century. It was the target of frequent condemnation in the United Nations and brought about an extensive arms and trade embargo on South Africa. During the 1970s and 1980s, internal resistance to apartheid became increasingly militant, prompting brutal crackdowns by the National Party administration and protracted sectarian violence that left thousands dead or in detention. Some reforms of the apartheid system were undertaken, including allowing Indian and coloured political representation in parliament, but these measures failed to appease most activist groups. Between 1987 and 1993 the National Party entered into bilateral negotiations with the African National Congress, the leading anti-apartheid political movement, to end segregation and introduce majority rule. In 1990, prominent ANC leaders such as Nelson Mandela were released from detention. Apartheid legislation was abolished in mid-1991, pending multiracial elections set for April 1994.
Apartheid is an Afrikaans word meaning "separateness '', or "the state of being apart '', literally "apart - hood ''. Its first recorded use was in 1929. Under the 1806 Cape Articles of Capitulation the new British colonial rulers were required to respect previous legislation enacted under Roman Dutch law and this led to a separation of the law in South Africa from English Common Law and a high degree of legislative autonomy. The governors and assemblies that governed the legal process in the various colonies of South Africa were launched on a different and independent legislative path from the rest of the British Empire. In the days of slavery, slaves required passes to travel away from their masters. In 1797 the Landdrost and Heemraden of Swellendam and Graaff - Reinet extended pass laws beyond slaves and ordained that all Khoikhoi (designated as Hottentots) moving about the country for any purpose should carry passes. This was confirmed by the British Colonial government in 1809 by the Hottentot Proclamation, which decreed that if a Khoikhoi were to move they would need a pass from their master or a local official. Ordinance No. 49 of 1828 decreed that prospective black immigrants were to be granted passes for the sole purpose of seeking work. These passes were to be issued for Coloureds and Khoikhoi, but not for other Africans, who were still forced to carry passes. The United Kingdom 's Slavery Abolition Act 1833 (3 & 4 Will. IV c. 73) abolished slavery throughout the British Empire and overrode the Cape Articles of Capitulation. To comply with the act the South African legislation was expanded to include Ordinance 1 in 1835, which effectively changed the status of slaves to indentured labourers. This was followed by Ordinance 3 in 1848, which introduced an indenture system for Xhosa that was little different from slavery. 
The various South African colonies passed legislation throughout the rest of the nineteenth century to limit the freedom of unskilled workers, to increase the restrictions on indentured workers and to regulate the relations between the races. The Franchise and Ballot Act of 1892 instituted limits based on financial means and education to the black franchise, and the Natal Legislative Assembly Bill of 1894 deprived Indians of the right to vote. The Glen Grey Act of 1894, instigated by the government of Prime Minister Cecil John Rhodes, limited the amount of land Africans could hold. In 1905 the General Pass Regulations Act denied blacks the vote, limited them to fixed areas and inaugurated the infamous Pass System. The Asiatic Registration Act (1906) required all Indians to register and carry passes. In 1910 the Union of South Africa was created as a self-governing dominion, which continued the legislative programme. The South Africa Act (1910) enfranchised whites, giving them complete political control over all other racial groups while removing the right of blacks to sit in parliament. The Natives Land Act (1913) prevented blacks, except those in the Cape, from buying land outside "reserves". The Natives in Urban Areas Bill (1918) was designed to force blacks into "locations". The Urban Areas Act (1923) introduced residential segregation and provided cheap labour for white-led industry. The Colour Bar Act (1926) prevented black mine workers from practising skilled trades. The Native Administration Act (1927) made the British Crown, rather than paramount chiefs, the supreme head over all African affairs. The Native Land and Trust Act (1936) complemented the 1913 Natives Land Act and, in the same year, the Representation of Natives Act removed previous black voters from the Cape voters' roll and allowed them to elect three whites to Parliament.
One of the first pieces of segregating legislation enacted by Jan Smuts ' United Party government was the Asiatic Land Tenure Bill (1946), which banned land sales to Indians. The United Party government began to move away from the rigid enforcement of segregationist laws during World War II. Amid fears integration would eventually lead to racial assimilation, the legislature established the Sauer Commission to investigate the effects of the United Party 's policies. The commission concluded that integration would bring about a "loss of personality '' for all racial groups. The Union of South Africa had allowed social custom and law to govern the consideration of multiracial affairs and of the allocation, in racial terms, of access to economic, social, and political status. Most white South Africans, regardless of their own differences, accepted the prevailing pattern. Nevertheless, by 1948 it remained apparent that there were occasional gaps in the social structure, whether legislated or otherwise, concerning the rights and opportunities of nonwhites. The rapid economic development of World War II attracted black migrant workers in large numbers to chief industrial centres, where they compensated for the wartime shortage of white labour. However, this escalated rate of black urbanisation went unrecognised by the South African government, which failed to accommodate the influx with parallel expansion in housing or social services. Overcrowding, spiking crime rates, and disillusionment resulted; urban blacks came to support a new generation of leaders influenced by the principles of self - determination and popular freedoms enshrined in such statements as the Atlantic Charter. 
Whites reacted negatively to the changes, allowing the Herenigde Nasionale Party (or simply National Party) to convince a large segment of the voting bloc that the impotence of the United Party in curtailing the evolving position of nonwhites indicated that the organisation had fallen under the influence of Western liberals. Many Afrikaners, whites chiefly of Dutch descent but with early infusions of Germans and French Huguenots who were soon assimilated, also resented what they perceived as disempowerment by an underpaid black workforce and the superior economic power and prosperity of white English speakers. In addition, Jan Smuts, as a strong advocate of the United Nations, lost domestic support when South Africa was criticised by other UN member states for its colour bar and its continued mandate of South West Africa. Afrikaner nationalists proclaimed that they offered the voters a new policy to ensure continued white domination. This policy was initially developed from a theory drafted by Hendrik Verwoerd and was presented to the National Party by the Sauer Commission. It called for a systematic effort to organise the relations, rights, and privileges of the races as officially defined through a series of parliamentary acts and administrative decrees. Until then, segregation had been pursued only in major matters, such as separate schools, with local society rather than law depended upon to enforce most separation; it was now to be extended to everything. The party gave this policy a name: apartheid (apartness). Apartheid was to be the basic ideological and practical foundation of Afrikaner politics for the next quarter of a century. The National Party's election platform stressed that apartheid would preserve a market for white employment in which nonwhites could not compete.
On the issues of black urbanisation, the regulation of nonwhite labour, influx control, social security, farm tariffs, and nonwhite taxation the United Party 's policy remained contradictory and confused. Its traditional bases of support not only took mutually exclusive positions, but found themselves increasingly at odds with each other. Smuts ' reluctance to consider South African foreign policy against the mounting tensions of the Cold War also stirred up discontent, while the nationalists promised to purge the state and public service of communist sympathisers. First to desert the United Party were Afrikaner farmers, who wished to see a change in influx control due to problems with squatters, as well as higher prices for their maize and other produce in the face of the mineowners ' demand for cheap food policies. Always identified with the affluent and capitalist, the party also failed to appeal to its working class constituents. Populist rhetoric allowed the National Party to sweep eight constituencies in the mining and industrial centres of the Witwatersrand and five more in Pretoria. Barring the predominantly English - speaking landowner electorate of the Natal, the United Party was defeated in almost every rural district. Its urban losses in the nation 's most populous province, the Transvaal, proved equally devastating. As the voting system was disproportionately weighted in favour of rural constituencies and the Transvaal in particular, the 1948 election catapulted the Herenigde Nasionale Party from a small minority party to a commanding position with an eight - vote parliamentary lead. Daniel François Malan became the first nationalist prime minister, with the aim of implementing the apartheid philosophy and silencing liberal opposition. 
NP leaders argued that South Africa did not comprise a single nation, but
was made up of four distinct racial groups: white, black, coloured and Indian. Such groups were split into 13 nations or racial federations. White people encompassed the English and Afrikaans language groups; the black populace was divided into ten such groups. The state passed laws that paved the way for "grand apartheid", which was centred on separating races on a large scale, by compelling people to live in separate places defined by race. This strategy was in part adopted from segregationist practices left over from British rule, which had separated different racial groups after Britain took control of the Boer republics in the Anglo-Boer War. This created the black-only "townships" or "locations", where blacks were relocated to their own towns. In addition, "petty apartheid" laws were passed. The principal apartheid laws were as follows. The first grand apartheid law was the Population Registration Act of 1950, which formalised racial classification and introduced an identity card for all persons over the age of 18, specifying their racial group. Official boards were established to rule on the classification of those people whose race was unclear. This caused difficulty, especially for coloured people, whose families were separated when members were allocated different races. The second pillar of grand apartheid was the Group Areas Act of 1950. Until then, most settlements had people of different races living side by side. This Act put an end to diverse areas and determined where one lived according to race. Each race was allotted its own area, which was used in later years as a basis for forced removal. The Prevention of Illegal Squatting Act of 1951 allowed the government to demolish black shanty-town slums and forced white employers to pay for the construction of housing for those black workers who were permitted to reside in cities otherwise reserved for whites.
The Prohibition of Mixed Marriages Act of 1949 prohibited marriage between persons of different races, and the Immorality Act of 1950 made sexual relations with a person of a different race a criminal offence. Under the Reservation of Separate Amenities Act of 1953, municipal grounds could be reserved for a particular race, creating, among other things, separate beaches, buses, hospitals, schools and universities. Signboards such as "whites only '' applied to public areas, even including park benches. Blacks were provided with services greatly inferior to those of whites, and, to a lesser extent, to those of Indian and coloured people. Further laws had the aim of suppressing resistance, especially armed resistance, to apartheid. The Suppression of Communism Act of 1950 banned any party subscribing to Communism. The act defined Communism and its aims so sweepingly that anyone who opposed government policy risked being labelled as a Communist. Since the law specifically stated that Communism aimed to disrupt racial harmony, it was frequently used to gag opposition to apartheid. Disorderly gatherings were banned, as were certain organisations that were deemed threatening to the government. Education was segregated by the 1953 Bantu Education Act, which crafted a separate system of education for black South African students and was designed to prepare black people for lives as a labouring class. In 1959 separate universities were created for black, coloured and Indian people. Existing universities were not permitted to enroll new black students. The Afrikaans Medium Decree of 1974 required the use of Afrikaans and English on an equal basis in high schools outside the homelands. The Bantu Authorities Act of 1951 created separate government structures for blacks and whites and was the first piece of legislation to support the government 's plan of separate development in the bantustans. 
The Promotion of Black Self-Government Act of 1959 entrenched the NP policy of nominally independent "homelands" for blacks. So-called "self-governing Bantu units" were proposed, which would have devolved administrative powers, with the promise later of autonomy and self-government. It also abolished the seats of white representatives of black South Africans and removed from the rolls the few blacks still qualified to vote. The Bantu Investment Corporation Act of 1959 set up a mechanism to transfer capital to the homelands to create employment there. Legislation of 1967 allowed the government to stop industrial development in "white" cities and redirect such development to the "homelands". The Black Homeland Citizenship Act of 1970 marked a new phase in the Bantustan strategy. It changed the status of blacks to citizens of one of the ten autonomous territories. The aim was to ensure a demographic majority of white people within South Africa by having all ten Bantustans achieve full independence. Interracial contact in sport was frowned upon, but there were no segregatory sports laws. The government tightened pass laws compelling blacks to carry identity documents, to prevent the immigration of blacks from other countries. To reside in a city, blacks had to be in employment there. Until 1956 women were for the most part excluded from these pass requirements, as attempts to introduce pass laws for women were met with fierce resistance. In 1950, D.F. Malan announced the NP's intention to create a Coloured Affairs Department. J.G. Strijdom, Malan's successor as Prime Minister, moved to strip voting rights from black and coloured residents of the Cape Province. The previous government had introduced the Separate Representation of Voters Bill into Parliament in 1951; however, four voters, G Harris, WD Franklin, WD Collins and Edgar Deane, challenged its validity in court with support from the United Party.
The Cape Supreme Court upheld the act, but the Appeal Court reversed the decision, finding the act invalid because a two-thirds majority in a joint sitting of both Houses of Parliament was needed to change the entrenched clauses of the Constitution. The government then introduced the High Court of Parliament Bill (1952), which gave Parliament the power to overrule decisions of the court. The Cape Supreme Court and the Appeal Court declared this invalid too. In 1955 the Strijdom government increased the number of judges in the Appeal Court from five to 11, and appointed pro-Nationalist judges to fill the new places. In the same year it introduced the Senate Act, which increased the Senate from 49 seats to 89. Adjustments were made such that the NP controlled 77 of these seats. The parliament met in a joint sitting and passed the Separate Representation of Voters Act in 1956, which transferred coloured voters from the common voters' roll in the Cape to a new coloured voters' roll. Immediately after the vote, the Senate was restored to its original size. The Senate Act was contested in the Supreme Court, but the recently enlarged Appeal Court, packed with government-supporting judges, upheld both the act and the Act to remove coloured voters. The 1956 law allowed Coloureds to elect four people to Parliament, but a 1969 law abolished those seats and stripped Coloureds of their right to vote. Since Asians had never been allowed to vote, this resulted in whites being the sole enfranchised group. A 2016 study in the Journal of Politics suggests that disenfranchisement in South Africa had a significant negative impact on basic service delivery to the disenfranchised. Before South Africa became a republic in 1961, politics among white South Africans was typified by the division between mainly Afrikaner pro-republic conservative sentiment and largely English anti-republican liberal sentiment, with the legacy of the Boer War still a factor for some people.
Once South Africa became a republic, Prime Minister Hendrik Verwoerd called for improved relations and greater accord between people of British descent and the Afrikaners. He claimed that the only difference was between those in favor of apartheid and those against it. The ethnic division would no longer be between Afrikaans and English speakers, but between blacks and whites. Most Afrikaners supported the notion of unanimity of white people to ensure their safety. White voters of British descent were divided. Many had opposed a republic, leading to a majority "no '' vote in Natal. Later, some of them recognised the perceived need for white unity, convinced by the growing trend of decolonisation elsewhere in Africa, which concerned them. British Prime Minister Harold Macmillan 's "Wind of Change '' speech left the British faction feeling that Britain had abandoned them. The more conservative English speakers supported Verwoerd; others were troubled by the severing of ties with Britain and remained loyal to the Crown. They were displeased by having to choose between British and South African nationalities. Although Verwoerd tried to bond these different blocs, the subsequent voting illustrated only a minor swell of support, indicating that a great many English speakers remained apathetic and that Verwoerd had not succeeded in uniting the white population. Under the homeland system, the government attempted to divide South Africa into a number of separate states, each of which was supposed to develop into a separate nation - state for a different ethnic group. Territorial separation was hardly a new institution. There were, for example, the "reserves '' created under the British government in the nineteenth century. Under apartheid, 13 percent of the land was reserved for black homelands, a relatively small amount compared with the total population, and generally in economically unproductive areas of the country. 
The Tomlinson Commission of 1954 justified apartheid and the homeland system, but stated that additional land ought to be given to the homelands, a recommendation that was not carried out. When Verwoerd became Prime Minister in 1958, the policy of "separate development" came into being, with the homeland structure as one of its cornerstones. Verwoerd came to believe in the granting of independence to these homelands. The government justified its plans on the basis that "(the) government's policy is, therefore, not a policy of discrimination on the grounds of race or colour, but a policy of differentiation on the ground of nationhood, of different nations, granting to each self-determination within the borders of their homelands -- hence this policy of separate development". Under the homelands system, blacks would no longer be citizens of South Africa but of the nominally independent homelands, working in South Africa only as foreign migrant labourers on temporary work permits. In 1959 the Promotion of Black Self-Government Act was passed, and border industries and the Bantu Investment Corporation were established to promote economic development and the provision of employment in or near the homelands. Many black South Africans who had never resided in their identified homeland were forcibly removed from the cities to the homelands. Ten homelands were allocated to different black ethnic groups: Lebowa (North Sotho, also referred to as Pedi), QwaQwa (South Sotho), Bophuthatswana (Tswana), KwaZulu (Zulu), KaNgwane (Swazi), Transkei and Ciskei (Xhosa), Gazankulu (Tsonga), Venda (Venda) and KwaNdebele (Ndebele). Four of these were declared independent by the South African government: Transkei in 1976, Bophuthatswana in 1977, Venda in 1979, and Ciskei in 1981 (known as the TBVC states). Once a homeland was granted its nominal independence, its designated citizens had their South African citizenship revoked and replaced with citizenship in their homeland.
These people were then issued passports instead of passbooks. Citizens of the nominally autonomous homelands also had their South African citizenship circumscribed, meaning they were no longer legally considered South African. The South African government attempted to draw an equivalence between their view of black citizens of the homelands and the problems which other countries faced through entry of illegal immigrants. Bantustans within the borders of South Africa were classified as "self - governing '' or "independent ''. In theory, self - governing Bantustans had control over many aspects of their internal functioning but were not yet sovereign nations. Independent Bantustans (Transkei, Bophutatswana, Venda and Ciskei; also known as the TBVC states) were intended to be fully sovereign. In reality, they had no significant economic infrastructure and with few exceptions encompassed swaths of disconnected territory. This meant all the Bantustans were little more than puppet states controlled by South Africa. Throughout the existence of the independent Bantustans, South Africa remained the only country to recognise their independence. Nevertheless, internal organisations of many countries, as well as the South African government, lobbied for their recognition. For example, upon the foundation of Transkei, the Swiss - South African Association encouraged the Swiss government to recognise the new state. In 1976, leading up to a United States House of Representatives resolution urging the President to not recognise Transkei, the South African government intensely lobbied lawmakers to oppose the bill. Each TBVC state extended recognition to the other independent Bantustans while South Africa showed its commitment to the notion of TBVC sovereignty by building embassies in the TBVC capitals. During the 1960s, 1970s and early 1980s, the government implemented a policy of "resettlement '', to force people to move to their designated "group areas ''. 
Millions of people were forced to relocate. These removals included people relocated due to slum clearance programmes, labour tenants on white - owned farms, the inhabitants of the so - called "black spots '' (black - owned land surrounded by white farms), the families of workers living in townships close to the homelands, and "surplus people '' from urban areas, including thousands of people from the Western Cape (which was declared a "Coloured Labour Preference Area '') who were moved to the Transkei and Ciskei homelands. The best - publicised forced removals of the 1950s occurred in Johannesburg, when 60,000 people were moved to the new township of Soweto (an abbreviation for South Western Townships). Until 1955, Sophiatown had been one of the few urban areas where blacks were allowed to own land, and was slowly developing into a multiracial slum. As industry in Johannesburg grew, Sophiatown became the home of a rapidly expanding black workforce, as it was convenient and close to town. It had the only swimming pool for black children in Johannesburg. As one of the oldest black settlements in Johannesburg, it held an almost symbolic importance for the 50,000 blacks it contained, both in terms of its sheer vibrancy and its unique culture. Despite a vigorous ANC protest campaign and worldwide publicity, the removal of Sophiatown began on 9 February 1955 under the Western Areas Removal Scheme. In the early hours, heavily armed police forced residents out of their homes and loaded their belongings onto government trucks. The residents were taken to a large tract of land 19 kilometres (12 mi) from the city centre, known as Meadowlands, which the government had purchased in 1953. Meadowlands became part of a new planned black city called Soweto. Sophiatown was destroyed by bulldozers, and a new white suburb named Triomf (Triumph) was built in its place. 
This pattern of forced removal and destruction was to repeat itself over the next few years, and was not limited to black South Africans alone. Forced removals from areas like Cato Manor (Mkhumbane) in Durban, and District Six in Cape Town, where 55,000 coloured and Indian people were forced to move to new townships on the Cape Flats, were carried out under the Group Areas Act of 1950. Nearly 600,000 coloured, Indian and Chinese people were moved under the Group Areas Act. Some 40,000 whites were also forced to move when land was transferred from "white South Africa '' into the black homelands. The NP passed a string of legislation that became known as petty apartheid. The first of these was the Prohibition of Mixed Marriages Act 55 of 1949, prohibiting marriage between whites and people of other races. The Immorality Amendment Act 21 of 1950 (as amended in 1957 by Act 23) forbade "unlawful racial intercourse '' and "any immoral or indecent act '' between a white and a black, Indian or coloured person. Blacks were not allowed to run businesses or professional practices in areas designated as "white South Africa '' unless they had a permit. They were required to move to the black "homelands '' and set up businesses and practices there. Transport and civil facilities were segregated. Trains, hospitals and ambulances were segregated. Because of the smaller numbers of white patients and the fact that white doctors preferred to work in white hospitals, conditions in white hospitals were much better than those in often overcrowded and understaffed black hospitals. Blacks were excluded from living or working in white areas, unless they had a pass, nicknamed the dompas, also spelt dompass or dom pass. The most likely origin of this name is from the Afrikaans "verdomde pas '' (meaning accursed pass), although some commentators ascribe it to the Afrikaans words meaning "dumb pass ''. 
Only blacks with "Section 10" rights (those who had migrated to the cities before World War II) were exempt from this provision. A pass was issued only to a black with approved work. Spouses and children had to be left behind in black homelands. A pass was issued for one magisterial district (usually one town), confining the holder to that area only. Being without a valid pass made a person subject to arrest and trial for being an illegal migrant. This was often followed by deportation to the person's homeland and prosecution of the employer for employing an illegal migrant. Police vans patrolled white areas to round up blacks without passes. Blacks were not allowed to employ whites in white South Africa. Although trade unions for black and coloured (mixed race) workers had existed since the early 20th century, it was not until the reforms of the 1980s that a mass black trade union movement developed. Trade unions under apartheid were racially segregated, with 54 unions being white only, 38 for Indian and coloured people and 19 for black people. The Industrial Conciliation Act (1956) legislated against the creation of multi-racial trade unions and attempted to split existing multi-racial unions into separate branches or organisations along racial lines. In the 1970s the state spent ten times more per child on the education of white children than on black children within the Bantu Education system (the education system in black schools within white South Africa). Higher education was provided in separate universities and colleges after 1959. Eight black universities were created in the homelands. Fort Hare University in the Ciskei (now Eastern Cape) was to register only Xhosa-speaking students. Sotho, Tswana, Pedi and Venda speakers were placed at the newly founded University College of the North at Turfloop, while the University College of Zululand was launched to serve Zulu students.
Coloureds and Indians were to have their own establishments in the Cape and Natal respectively. Each black homeland controlled its own education, health and police systems. Blacks were not allowed to buy hard liquor; they could buy only poor-quality, state-produced beer (although this restriction was relaxed later). Public beaches were racially segregated, as were public swimming pools, some pedestrian bridges, drive-in cinema parking spaces, graveyards, parks, and public toilets. Cinemas and theatres in white areas were not allowed to admit blacks, and there were practically no cinemas in black areas. Most restaurants and hotels in white areas were not allowed to admit blacks except as staff. Blacks were prohibited from attending white churches under the Churches Native Laws Amendment Act of 1957, but this was never rigidly enforced, and churches were one of the few places where the races could mix without interference from the law. Blacks earning 360 rand a year or more had to pay taxes, while the white threshold was more than twice as high, at 750 rand a year; on the other hand, the taxation rate for whites was considerably higher than that for blacks. Blacks could never acquire land in white areas. In the homelands, much of the land belonged to a "tribe", with the local chieftain deciding how it had to be used. The result was that whites owned almost all the industrial and agricultural land and much of the prized residential land. Most blacks were stripped of their South African citizenship when the "homelands" became "independent", and were no longer able to apply for South African passports. Even before then, eligibility requirements had been difficult for blacks to meet: the government contended that a passport was a privilege, not a right, and granted few passports to blacks. Apartheid pervaded culture as well as the law, and was entrenched by most of the mainstream media.
The population was classified into four groups: Black, White, Indian, and Coloured (capitalised to denote their legal definitions in South African law). The Coloured group included people regarded as being of mixed descent, including of Bantu, Khoisan, European and Malay ancestry. Many were descended from people brought to South Africa from other parts of the world, such as India, Madagascar, and China as slaves and indentured workers. The apartheid bureaucracy devised complex (and often arbitrary) criteria at the time that the Population Registration Act was implemented to determine who was Coloured. Minor officials would administer tests to determine if someone should be categorised either Coloured or Black, or if another person should be categorised either Coloured or White. Different members of the same family found themselves in different race groups. Further tests determined membership of the various sub-racial groups of the Coloureds. Many of those who formerly belonged to this racial group are opposed to the continuing use of the term "coloured '' in the post-apartheid era, though the term no longer signifies any legal meaning. The expressions "so - called Coloured '' (Afrikaans sogenaamde Kleurlinge) and "brown people '' (bruinmense) acquired a wide usage in the 1980s. Discriminated against by apartheid, Coloureds were as a matter of state policy forced to live in separate townships, in some cases leaving homes their families had occupied for generations, and received an inferior education, though better than that provided to Blacks. They played an important role in the anti-apartheid movement: for example the African Political Organization established in 1902 had an exclusively Coloured membership. Voting rights were denied to Coloureds in the same way that they were denied to Blacks from 1950 to 1983. However, in 1977 the NP caucus approved proposals to bring Coloureds and Indians into central government. 
In 1982, final constitutional proposals were put to a referendum among Whites, and the Tricameral Parliament was approved. The Constitution was reformed the following year to allow the Coloured and Asian minorities participation in separate Houses in a Tricameral Parliament, and Botha became the first Executive State President. The idea was that the Coloured minority could be granted voting rights, but the Black majority were to become citizens of independent homelands. These separate arrangements continued until the abolition of apartheid. The Tricameral reforms led to the formation of the (anti-apartheid) United Democratic Front as a vehicle to try to prevent the co-option of Coloureds and Indians into an alliance with Whites. The battles between the UDF and the NP government from 1983 to 1989 were to become the most intense period of struggle between left - wing and right - wing South Africans. Colonialism and apartheid had a major impact on black and coloured women, since they suffered both racial and gender discrimination. Jobs were often hard to find. Many black and coloured women worked as agricultural or domestic workers, but wages were extremely low, if paid at all. Children suffered from diseases caused by malnutrition and sanitation problems, and mortality rates were therefore high. The controlled movement of black and coloured workers within the country through the Natives Urban Areas Act of 1923 and the pass laws separated family members from one another, because men could prove their employment in urban centres while most women were merely dependents; consequently, they risked being deported to rural areas. 
By the 1930s, association football mirrored the balkanised society of South Africa; football was divided into numerous institutions based on race: the (White) South African Football Association, the South African Indian Football Association (SAIFA), the South African African Football Association (SAAFA) and its rival the South African Bantu Football Association, and the South African Coloured Football Association (SACFA). The lack of funds for proper equipment at black amateur football matches revealed the unequal lives to which black South Africans were subject, in contrast to Whites, who were far better off financially. Apartheid 's social engineering made it more difficult to compete across racial lines. In an effort to centralise finances, the federations merged in 1951, creating the South African Soccer Federation (SASF), which brought the Black, Indian, and Coloured national associations into one body that opposed apartheid. The apartheid government increasingly opposed the SASF, and -- with urban segregation reinforced by ongoing racist policies -- it became ever harder to play football across racial lines. In 1956, the government in Pretoria, South Africa 's administrative capital, passed the first apartheid sports policy, emphasising the White - led government 's opposition to inter-racialism. While football was plagued by racism, it also played a role in protesting apartheid and its policies. With the international bans from FIFA and other major sporting events, South Africa came under the international spotlight. In a 1977 survey, white South Africans ranked the lack of international sport as one of the three most damaging consequences of apartheid. By the mid-1950s, Black South Africans were also using the media to challenge the "racialisation '' of sports in South Africa; anti-apartheid forces had begun to pinpoint sport as a "weakness '' in white national morale. 
Black journalists for the Johannesburg Drum magazine were the first to give the issue public exposure, with an intrepid special issue in 1955 that asked, "Why shouldn't our blacks be allowed in the SA team? '' As time progressed, South Africa 's international standing continued to deteriorate. In the 1980s, as the oppressive system was slowly collapsing, the ANC and the National Party started negotiations on the end of apartheid. Football associations also discussed the formation of a single, non-racial controlling body. This unity process accelerated in the late 1980s and led to the creation, in December 1991, of an incorporated South African Football Association. On 3 July 1992, FIFA finally welcomed South Africa back into international football. Sport has long been an important part of life in South Africa, and the boycotting of games by international teams had a profound effect on the white population, perhaps more so than the trade embargoes did. After the re-acceptance of South Africa 's sports teams by the international community, sport played a major unifying role between the country 's diverse ethnic groups. Mandela 's open support of the predominantly white rugby fraternity during the 1995 Rugby World Cup was considered instrumental in bringing together South African sports fans of all races. Defining its Asian population, a minority that did not appear to belong to any of the initial three designated non-white groups, was a constant dilemma for the apartheid government. For political reasons, the classification of "honorary white '' was granted to immigrants from Japan, South Korea and Taiwan -- countries with which South Africa maintained diplomatic and economic relations -- and to their descendants. 
Indian South Africans during apartheid were classified under a range of categories, from "Asian '' to "black '' to "Coloured '' and even the mono - ethnic category of "Indian '', but never as white, having been considered "nonwhite '' throughout South Africa 's history. The group faced severe discrimination during the apartheid regime and was subject to numerous racialist policies. Chinese South Africans -- descendants of migrant workers who came to work in the gold mines around Johannesburg in the late 19th century -- were initially classified either as "Coloured '' or as "Other Asian '' and were subject to numerous forms of discrimination and restriction. It was not until 1984 that South African Chinese, by then numbering about 10,000, were given the same official rights as the Japanese: to be treated as whites in terms of the Group Areas Act, although they still faced discrimination and did not receive all the benefits or rights of their newly obtained honorary white status, such as voting. Indonesians arrived at the Cape of Good Hope as slaves until the abolition of slavery during the 1800s. They were predominantly Muslim, were allowed religious freedom, and formed their own ethnic community, known as Cape Malays. They were classified as part of the Coloured racial group. The same applied to South Africans of Malaysian descent, who were also classified as part of the Coloured race and thus considered "not white ''. South Africans of Filipino descent were classified as "black '' because of the historical attitudes of White South Africans towards Filipinos, and many of them lived in Bantustans. Alongside apartheid, the NP government implemented a programme of social conservatism. Pornography and gambling were banned. Cinemas, shops selling alcohol and most other businesses were forbidden from operating on Sundays. Abortion, homosexuality and sex education were also restricted; abortion was legal only in cases of rape or if the mother 's life was threatened. 
Television was not introduced until 1976 because the government viewed English programming as a threat to the Afrikaans language. Television was run on apartheid lines -- TV1 broadcast in Afrikaans and English (geared to a white audience), TV2 in Zulu and Xhosa and TV3 in Sotho, Tswana and Pedi (both geared to a black audience), and TV4 mostly showed programmes for an urban - black audience. Apartheid sparked significant internal resistance. The government responded to a series of popular uprisings and protests with police brutality, which in turn increased local support for the armed resistance struggle. Internal resistance to the apartheid system in South Africa came from several sectors of society and saw the creation of organisations dedicated variously to peaceful protests, passive resistance and armed insurrection. In 1949, the youth wing of the African National Congress (ANC) took control of the organisation and started advocating a radical black nationalist programme. The new young leaders proposed that white authority could only be overthrown through mass campaigns. In 1950 that philosophy saw the launch of the Programme of Action, a series of strikes, boycotts and civil disobedience actions that led to occasional violent clashes with the authorities. In 1959, a group of disenchanted ANC members formed the Pan Africanist Congress (PAC), which organised a demonstration against pass books on 21 March 1960. One of those protests was held in the township of Sharpeville, where 69 people were killed by police in the Sharpeville massacre. In the wake of Sharpeville, the government declared a state of emergency. More than 18,000 people were arrested, including leaders of the ANC and PAC, and both organisations were banned. The resistance went underground, with some leaders in exile abroad and others engaged in campaigns of domestic sabotage and terrorism. 
In May 1961, before the declaration of South Africa as a Republic, an assembly representing the banned ANC called for negotiations between the members of the different ethnic groupings, threatening demonstrations and strikes during the inauguration of the Republic if their calls were ignored. When the government ignored them, the strikers (among the main organisers was the 42 - year - old Nelson Mandela, of Thembu origin) carried out their threats. The government countered swiftly by giving police the authority to detain people for up to twelve days and detaining many strike leaders amid numerous cases of police brutality. Defeated, the protesters called off their strike. The ANC then chose to launch an armed struggle through a newly formed military wing, Umkhonto we Sizwe (MK), which would perform acts of sabotage on tactical state structures. Its first sabotage plans were carried out on 16 December 1961, the anniversary of the Battle of Blood River. In the 1970s, the Black Consciousness Movement (BC) was created by tertiary students influenced by the American Black Power movement. BC endorsed black pride and African customs and did much to alter the feelings of inadequacy instilled among black people by the apartheid system. The leader of the movement, Steve Biko, was taken into custody on 18 August 1977 and was beaten to death in detention. In 1976, secondary students in Soweto took to the streets in the Soweto uprising to protest against the imposition of Afrikaans as the only language of instruction. On 16 June, police opened fire on students protesting peacefully. According to official reports 23 people were killed, but the number of people who died is usually given as 176, with estimates of up to 700. In the following years several student organisations were formed to protest against apartheid, and these organisations were central to urban school boycotts in 1980 and 1983 and rural boycotts in 1985 and 1986. 
In parallel with student protests, labour unions started protest action in 1973 and 1974. After 1976, unions and workers are considered to have played an important role in the struggle against apartheid, filling the gap left by the banning of political parties. In 1979, black trade unions were legalised and could engage in collective bargaining, although strikes were still illegal. Economist Thomas Sowell wrote that basic supply and demand led to violations of apartheid "on a massive scale '' throughout the nation, simply because there were not enough white South African business owners to meet the demand for various goods and services. Large portions of the garment industry and construction of new homes, for example, were effectively owned and operated by blacks, who either worked surreptitiously or circumvented the law with a white person as a nominal, figurehead manager. In 1983, anti-apartheid leaders, determined to resist the tricameral parliament, assembled to form the United Democratic Front (UDF) in order to coordinate anti-apartheid activism inside South Africa. The first presidents of the UDF were Archie Gumede, Oscar Mpetha and Albertina Sisulu; patrons were Archbishop Desmond Tutu, Dr Allan Boesak, Helen Joseph, and Nelson Mandela. Basing its platform on abolishing apartheid and creating a nonracial democratic South Africa, the UDF provided a legal way for domestic human rights groups and individuals of all races to organise demonstrations and campaign against apartheid inside the country. Churches and church groups also emerged as pivotal points of resistance. Church leaders were not immune to prosecution, and certain faith - based organisations were banned, but the clergy generally had more freedom to criticise the government than militant groups did. 
The UDF, coupled with the protection of the church, accordingly permitted a major role for Archbishop Desmond Tutu, who served both as a prominent domestic voice and international spokesperson denouncing apartheid and urging the creation of a shared nonracial state. Although the majority of whites supported apartheid, some 20 percent did not. Parliamentary opposition was galvanised by Helen Suzman, Colin Eglin and Harry Schwarz, who formed the Progressive Federal Party. Extra-parliamentary resistance was largely centred in the South African Communist Party and the women 's organisation the Black Sash. Women were also notable in their involvement in trade union organisations and banned political parties. South Africa 's policies were subject to international scrutiny in 1960, when British Prime Minister Harold Macmillan criticised them during his celebrated Wind of Change speech in Cape Town. Weeks later, tensions came to a head in the Sharpeville Massacre, resulting in more international condemnation. Soon afterwards, Prime Minister Hendrik Verwoerd announced a referendum on whether the country should become a republic. Verwoerd lowered the voting age for whites to 18 and included whites in South West Africa on the roll. The referendum on 5 October that year asked whites, "Are you in favour of a Republic for the Union? '', and 52 percent voted "Yes ''. As a consequence of this change of status, South Africa needed to reapply for continued membership of the Commonwealth, with which it had privileged trade links. India had become a republic within the Commonwealth in 1950, but it became clear that African and Asian member states would oppose South Africa due to its apartheid policies. As a result, South Africa withdrew from the Commonwealth on 31 May 1961, the day that the Republic came into existence. 
We stand here today to salute the United Nations Organisation and its Member States, both singly and collectively, for joining forces with the masses of our people in a common struggle that has brought about our emancipation and pushed back the frontiers of racism. At the first UN gathering in 1946, South Africa was placed on the agenda. The primary subject in question was the handling of South African Indians, a major cause of dispute between South Africa and India. In 1952, apartheid was again discussed in the aftermath of the Defiance Campaign, and the UN set up a task team to keep watch on the progress of apartheid and the racial state of affairs in South Africa. Although South Africa 's racial policies were a cause for concern, most countries in the UN concurred that this was a domestic affair, which fell outside the UN 's jurisdiction. In April 1960, the UN 's conservative stance on apartheid changed following the Sharpeville massacre, and the Security Council for the first time agreed on concerted action against the apartheid regime, demanding an end to racial separation and discrimination. From 1960 the ANC began a campaign of armed struggle, for which it would later be charged with 193 acts of terrorism committed between 1961 and 1963, mainly bombings and murders of civilians. The South African government responded with further suppression, banning the ANC and the PAC. In 1961, UN Secretary - General Dag Hammarskjöld stopped over in South Africa and subsequently stated that he had been unable to reach agreement with Prime Minister Verwoerd. In 1961, dismissing an Israeli vote against South African apartheid at the United Nations, Verwoerd famously said, "Israel is not consistent in its new anti-apartheid attitude... they took Israel away from the Arabs after the Arabs lived there for a thousand years. In that, I agree with them. Israel, like South Africa, is an apartheid state. 
'' On 6 November 1962, the United Nations General Assembly passed Resolution 1761, condemning apartheid policies. In 1966, the UN held the first of many colloquiums on apartheid. The General Assembly announced 21 March as the International Day for the Elimination of Racial Discrimination, in memory of the Sharpeville massacre. In 1971, the General Assembly formally denounced the institution of homelands, and a motion was passed in 1974 to expel South Africa from the UN, but this was vetoed by France, the United Kingdom and the United States, all key trading partners of South Africa. On 7 August 1963 the United Nations Security Council passed Resolution 181, calling for a voluntary arms embargo against South Africa. In the same year a Special Committee Against Apartheid was established to encourage and oversee plans of action against the regime. From 1964 the US and Britain discontinued their arms trade with South Africa. The Security Council also condemned the Soweto massacre in Resolution 392. In 1977, the voluntary UN arms embargo became mandatory with the passing of Resolution 418. Economic sanctions against South Africa were also frequently debated as an effective way of putting pressure on the apartheid government. In 1962, the UN General Assembly requested that its members sever political, fiscal and transportation ties with South Africa. In 1968, it proposed ending all cultural, educational and sporting connections as well. Economic sanctions, however, were not made mandatory, because of opposition from South Africa 's main trading partners. In 1973, the UN adopted the Apartheid Convention, which defines apartheid and qualifies it as a crime against humanity, opening the way to international criminal prosecution of the individuals responsible for perpetrating it. As of August 2008, however, the convention had been ratified by only 107 of the 193 member states. 
The convention was initially drafted by the former USSR and Guinea, before being presented to the UN General Assembly. It was adopted by a vote of 91 in favour to 4 against (Portugal, South Africa, the United Kingdom and the United States). In 1978 and 1983 the UN condemned South Africa at the World Conference Against Racism. After much debate, by the late 1980s the United States, the United Kingdom, and 23 other nations had passed laws placing various trade sanctions on South Africa. A disinvestment from South Africa movement in many countries was similarly widespread, with individual cities and provinces around the world implementing various laws and local regulations forbidding registered corporations under their jurisdiction from doing business with South African firms, factories, or banks. Pope John Paul II was an outspoken opponent of apartheid. In 1985, while visiting the Netherlands, he gave an impassioned speech at the International Court of Justice condemning apartheid, proclaiming that "no system of apartheid or separate development will ever be acceptable as a model for the relations between peoples or races. '' In September 1988 he made a pilgrimage to countries bordering South Africa, while demonstratively avoiding South Africa itself. During his visit to Zimbabwe, he called for economic sanctions against South Africa 's government. The Organisation of African Unity (OAU) was created in 1963. Its primary objectives were to eradicate colonialism and improve social, political and economic situations in Africa. It censured apartheid and demanded sanctions against South Africa. African states agreed to aid the liberation movements in their fight against apartheid. In 1969, fourteen nations from Central and East Africa gathered in Lusaka, Zambia, and formulated the Lusaka Manifesto, which was signed on 13 April by all of the countries in attendance except Malawi. 
This manifesto was later taken on by both the OAU and the United Nations. The Lusaka Manifesto summarised the political situations of self - governing African countries, condemning racism and inequity, and calling for black majority rule in all African nations. It did not rebuff South Africa entirely, though, adopting an appeasing manner towards the apartheid government, and even recognising its autonomy. Although African leaders supported the emancipation of black South Africans, they preferred this to be attained through peaceful means. South Africa 's negative response to the Lusaka Manifesto and rejection of a change to its policies brought about another OAU announcement in October 1971. The Mogadishu Declaration stated that South Africa 's rebuffing of negotiations meant that its black people could only be freed through military means, and that no African state should converse with the apartheid government. In 1966, B.J. Vorster became Prime Minister. He was not prepared to dismantle apartheid, but he did try to redress South Africa 's isolation and revitalise the country 's global reputation by improving relations, even with black - ruled nations in Africa. This he called his "Outward - Looking '' policy. Vorster 's willingness to talk to African leaders stood in contrast to Verwoerd 's refusal to engage with leaders such as Abubakar Tafawa Balewa of Nigeria in 1962 and Kenneth Kaunda of Zambia in 1964. In 1966, he met the heads of the neighbouring states of Lesotho, Swaziland and Botswana. In 1967, he offered technological and financial aid to any African state prepared to receive it, asserting that no political strings were attached, aware that many African states needed financial aid despite their opposition to South Africa 's racial policies. Many were also tied to South Africa economically because of their migrant labour population working on the South African mines. 
Botswana, Lesotho and Swaziland remained outspoken critics of apartheid, but depended on South Africa 's economic aid. Malawi was the first country not on South African borders to accept South African aid. In 1967, the two states set out their political and economic relations, and, in 1969, Malawi became the only country at the assembly which did not sign the Lusaka Manifesto condemning South Africa 's apartheid policy. In 1970, Malawian president Hastings Banda made his first and most successful official stopover in South Africa. Associations with Mozambique followed suit and were sustained after that country won its sovereignty in 1975. Angola was also granted South African loans. Other countries which formed relationships with South Africa were Liberia, Ivory Coast, Madagascar, Mauritius, Gabon, Zaire (now the Democratic Republic of the Congo) and the Central African Republic. Although these states condemned apartheid (more than ever after South Africa 's denunciation of the Lusaka Manifesto), South Africa 's economic and military dominance meant that they remained dependent on South Africa to varying degrees. South Africa 's isolation in sport began in the mid-1950s and increased throughout the 1960s. Apartheid forbade multiracial sport, which meant that overseas teams, by virtue of their having players of diverse races, could not play in South Africa. In 1956, the International Table Tennis Federation severed its ties with the all - white South African Table Tennis Union, preferring the non-racial South African Table Tennis Board. The apartheid government responded by confiscating the passports of the Board 's players so that they were unable to attend international games. In 1959, the non-racial South African Sports Association (SASA) was formed to secure the rights of all players on the global field. 
After failing to gain recognition through collaboration with white establishments, SASA approached the International Olympic Committee (IOC) in 1962, calling for South Africa 's expulsion from the Olympic Games. The IOC sent South Africa a caution to the effect that, if there were no changes, it would be barred from the 1964 Olympic Games. Changes were initiated, and in January 1963, the South African Non-Racial Olympic Committee (SANROC) was set up. The Anti-Apartheid Movement persisted in its campaign for South Africa 's exclusion, and the IOC agreed to bar the country from the 1964 Games in Tokyo. South Africa selected a multi-racial team for the next Games, and the IOC opted to readmit it for the 1968 Games in Mexico City. Because of protests from anti-apartheid movements and African nations, however, the IOC was forced to retract the invitation. Foreign complaints about South Africa 's racially exclusive sport brought more isolation. Racially selected New Zealand sports teams toured South Africa until the 1970 All Blacks rugby tour, when Maori players were allowed to go under the status of "honorary whites ''. Huge and widespread protests occurred in New Zealand in 1981 against the Springbok tour -- the government spent $8 million protecting games using the army and police force. A planned All Black tour to South Africa in 1985 remobilised the New Zealand protesters and was cancelled. A "rebel tour '' -- not government sanctioned -- went ahead in 1986, but after that sporting ties were cut, and New Zealand decided not to send an authorised rugby team to South Africa until the end of apartheid. Vorster replaced Verwoerd as Prime Minister in 1966 following the latter 's assassination and declared that South Africa would no longer dictate to other countries what their teams should look like. Although this reopened the gate for international sporting meets, it did not signal the end of South Africa 's racist sporting policies. 
In 1968 Vorster went against his policy by refusing to permit Basil D'Oliveira, a Coloured South African - born cricketer, to join the English cricket team on its tour to South Africa. Vorster said that the side had been chosen only to prove a point, and not on merit. After protests, however, "Dolly '' was eventually included in the team. Protests against certain tours brought about the cancellation of a number of other visits, including that of an England rugby team touring South Africa in 1969 / 70. The first of the "White Bans '' occurred in 1971 when the Chairman of the Australian Cricketing Association -- Sir Don Bradman -- flew to South Africa to meet Vorster. Vorster had expected Bradman to allow the tour of the Australian cricket team to go ahead, but things became heated after Bradman asked why black sportsmen were not allowed to play cricket. Vorster stated that blacks were intellectually inferior and had no finesse for the game. Bradman -- thinking this ignorant and repugnant -- asked Vorster if he had heard of a man named Garry Sobers. On his return to Australia, Bradman released a one sentence statement: "We will not play them until they choose a team on a non-racist basis. '' In South Africa, Vorster vented his anger publicly against Bradman, while the African National Congress rejoiced. This was the first time a predominantly white nation had taken the side of multiracial sport, producing an unsettling resonance that more "White '' boycotts were coming. Almost twenty years later, on his release from prison, Nelson Mandela asked a visiting Australian statesman if Donald Bradman, his childhood hero, was still alive (Bradman lived until 2001). In 1971, Vorster altered his policies even further by distinguishing multiracial from multinational sport. 
Multiracial sport, between teams with players of different races, remained outlawed; multinational sport, however, was now acceptable: international sides would not be subject to South Africa 's racial stipulations. In 1978, Nigeria boycotted the Commonwealth Games because New Zealand 's sporting contacts with the South African government were not considered to be in accordance with the 1977 Gleneagles Agreement. Nigeria also led the 32 - nation boycott of the 1986 Commonwealth Games because of British prime minister Margaret Thatcher 's ambivalent attitude towards sporting links with South Africa, significantly affecting the quality and profitability of the Games and thus thrusting apartheid into the international spotlight. In the 1960s, the Anti-Apartheid Movements began to campaign for cultural boycotts of apartheid South Africa. Artists were requested not to present or let their works be hosted in South Africa. In 1963, 45 British writers put their signatures to an affirmation approving of the boycott, and, in 1964, American actor Marlon Brando called for a similar affirmation for films. In 1965, the Writers ' Guild of Great Britain called for a proscription on the sending of films to South Africa. Over sixty American artists signed a statement against apartheid and against professional links with the state. The presentation of some South African plays in Britain and the United States was also vetoed. After the arrival of television in South Africa in 1975, the British actors ' union Equity boycotted the service, and no British programme involving its members could be sold to South Africa. Sporting and cultural boycotts did not have the same impact as economic sanctions, but they did much to raise consciousness among ordinary South Africans of the global condemnation of apartheid. While international opposition to apartheid grew, the Nordic countries -- and Sweden in particular -- provided both moral and financial support for the ANC. 
On 21 February 1986 -- a week before he was murdered -- Sweden 's prime minister Olof Palme made the keynote address to the Swedish People 's Parliament Against Apartheid held in Stockholm. In addressing the hundreds of anti-apartheid sympathisers as well as leaders and officials from the ANC and the Anti-Apartheid Movement such as Oliver Tambo, Palme declared: "Apartheid can not be reformed; it has to be eliminated. '' Other Western countries adopted a more ambivalent position. In Switzerland, the Swiss - South African Association lobbied on behalf of the South African government. In the 1980s, the US Reagan and UK Thatcher administrations followed a "constructive engagement '' policy with the apartheid government, vetoing the imposition of UN economic sanctions, justified by a belief in free trade and a vision of South Africa as a bastion against Marxist forces in Southern Africa. Thatcher declared the ANC a terrorist organisation, and in 1987 her spokesman, Bernard Ingham, famously said that anyone who believed that the ANC would ever form the government of South Africa was "living in cloud cuckoo land ''. The American Legislative Exchange Council (ALEC), a conservative lobbying organisation, actively campaigned against divesting from South Africa throughout the 1980s. By the late 1980s, with the tide of the Cold War turning and no sign of a political resolution in South Africa, Western patience began to run out. By 1989, a bipartisan Republican / Democratic initiative in the US favoured economic sanctions (realised as the Comprehensive Anti-Apartheid Act of 1986), the release of Nelson Mandela and a negotiated settlement involving the ANC. Thatcher too began to take a similar line, but insisted on the suspension of the ANC 's armed struggle. The UK 's significant economic involvement in South Africa may have provided some leverage with the South African government, with both the UK and the US applying pressure and pushing for negotiations. 
However, neither Britain nor the US was willing to apply economic pressure upon their multinational interests in South Africa, such as the mining company Anglo American. Although a high - profile compensation claim against these companies was thrown out of court in 2004, the US Supreme Court in May 2008 upheld an appeal court ruling allowing another lawsuit that seeks damages of more than US $400 billion from major international companies which are accused of aiding South Africa 's apartheid system. During the 1950s, South African military strategy was decisively shaped by fears of communist espionage and a conventional Soviet threat to the strategic Cape trade route between the south Atlantic and Indian Oceans. The apartheid government supported the U.S. - led North Atlantic Treaty Organization (NATO), as well as its policy of regional containment against Soviet - backed regimes and insurgencies worldwide. By the late 1960s, the rise of Soviet client states on the African continent, as well as Soviet aid for militant anti-apartheid movements, was considered one of the primary external threats to the apartheid system. South African officials frequently accused domestic opposition groups of being communist proxies. For its part the Soviet Union viewed South Africa as a bastion of neocolonialism and a regional Western ally, which helped fuel its support for various anti-apartheid causes. From 1973 onward much of South Africa 's white population increasingly looked upon their country as a bastion of the free world besieged militarily, politically, and culturally by communism and radical black nationalism. The apartheid government perceived itself as being locked in a proxy struggle with the Warsaw Pact and by implication, armed wings of black nationalist forces such as Umkhonto we Sizwe (MK) and the People 's Liberation Army of Namibia (PLAN), which often received Soviet arms and training. This was described as "Total Onslaught ''. 
South African initiatives designed to counter "Total Onslaught '' were known as "Total Strategy '' and involved building up a formidable conventional military and counter-intelligence capability. Total Strategy was built on the principles of counter-revolution as espoused by noted French tactician André Beaufre. Considerable effort was devoted towards circumventing international arms sanctions, and the government even went so far as to develop nuclear weapons, allegedly with covert assistance from Israel. In 2010, The Guardian released South African government documents that revealed an Israeli offer to sell the apartheid regime nuclear weapons. Israel categorically denied these allegations and claimed that the documents were minutes from a meeting which did not indicate any concrete offer for a sale of nuclear weapons. Shimon Peres said that The Guardian 's article was based on "selective interpretation... and not on concrete facts. '' From the late 1970s to the late 1980s, defence budgets in South Africa were raised exponentially. Covert operations focused on espionage and domestic political manipulation became common, the number of special forces units swelled, and the South African Defence Force had amassed enough sophisticated conventional weaponry to pose a serious threat to the "front - line states '', a regional alliance of neighbouring countries opposed to apartheid. South Africa had a policy of attacking insurgent bases and safe houses of PLAN and MK in neighbouring countries beginning in the early 1980s. These attacks were in retaliation for acts of sabotage, urban terrorism, and guerrilla raids by MK, PLAN, and the Azanian People 's Liberation Army (APLA). The country also aided organisations in surrounding countries who were actively combating the spread of communism in southern Africa. The results of these policies included: In 1984, Mozambican president Samora Machel signed the Nkomati Accord with South Africa 's president P.W. 
Botha, in an attempt to end South African support for the opposition group RENAMO. South Africa agreed to cease supporting anti-government forces, while the MK was prohibited from operating in Mozambique. This was a setback for the ANC. Machel hoped the agreement would end the civil war and allow Mozambique to rebuild its economy. Two years later, President Machel was killed in an air crash in mountainous terrain in South Africa near the Mozambican border after returning from a meeting in Zambia. South Africa was accused by the Mozambican government and US Secretary of State George P. Shultz of continuing its aid to RENAMO. The Mozambican government also made an unproven allegation that the accident was caused intentionally by a false radio navigation beacon that scrambled the aircraft 's navigational system. These charges were never proven and remain a subject of some controversy, despite the South African Margo Commission finding that the crash was an accident. A Soviet delegation that did not participate in the investigation issued a minority report implicating South Africa. Beginning in 1966, PLAN, the armed wing of the South West African People 's Organisation (SWAPO), contested South Africa 's occupation of South West Africa (now Namibia). This conflict deepened after Angola gained its independence in 1975 under the leadership of the leftist Popular Movement for the Liberation of Angola (MPLA), aided by Cuba. South Africa, Zaire and the United States sided with the rival UNITA party against the MPLA 's armed force, FAPLA (People 's Armed Forces for the Liberation of Angola). The ensuing struggle became one of several late Cold War flashpoints. The Angolan civil war developed into a conventional war with South Africa and UNITA on one side against the MPLA government, the Soviet Union, the Cubans and SWAPO on the other. During the 1980s the government, led by P.W. Botha, became increasingly preoccupied with security. 
It set up a powerful state security apparatus to "protect '' the state against an anticipated upsurge in political violence that the reforms were expected to trigger. The 1980s became a period of considerable political unrest, with the government becoming increasingly dominated by Botha 's circle of generals and police chiefs (known as securocrats), who managed the successive States of Emergency. Botha 's years in power were also marked by numerous military interventions in the states bordering South Africa, as well as an extensive military and political campaign to eliminate SWAPO in Namibia. Within South Africa, meanwhile, vigorous police action and strict enforcement of security legislation resulted in hundreds of arrests and bans, and an effective end to the ANC 's sabotage campaign. The government punished political offenders brutally: some 40,000 people annually were subjected to whipping as a form of punishment. The vast majority had committed political offences and were lashed ten times for their crime. If convicted of treason, a person could be hanged, and the government executed numerous political offenders in this way. As the 1980s progressed, more and more anti-apartheid organisations were formed and affiliated with the UDF. Led by the Reverend Allan Boesak and Albertina Sisulu, the UDF called for the government to abandon its reforms and instead abolish apartheid and eliminate the homelands completely. Serious political violence was a prominent feature from 1985 to 1989, as black townships became the focus of the struggle between anti-apartheid organisations and the Botha government. Throughout the 1980s, township people resisted apartheid by acting against the local issues that faced their particular communities. The focus of much of this resistance was against the local authorities and their leaders, who were seen to be supporting the government. 
By 1985, it had become the ANC 's aim to make black townships "ungovernable '' (a term later replaced by "people 's power '') by means of rent boycotts and other militant action. Numerous township councils were overthrown or collapsed, to be replaced by unofficial popular organisations, often led by militant youth. People 's courts were set up, and residents accused of being government agents were dealt extreme and occasionally lethal punishments. Black town councillors and policemen, and sometimes their families, were attacked with petrol bombs, beaten, and murdered by necklacing, where a burning tyre was placed around the victim 's neck, after they were restrained by wrapping their wrists with barbed wire. This signature act of torture and murder was embraced by the ANC and its leaders. On 20 July 1985, Botha declared a State of Emergency in 36 magisterial districts. Areas affected were the Eastern Cape, and the PWV region ("Pretoria, Witwatersrand, Vereeniging ''). Three months later the Western Cape was included. An increasing number of organisations were banned or listed (restricted in some way); many individuals had restrictions such as house arrest imposed on them. During this state of emergency about 2,436 people were detained under the Internal Security Act. This act gave police and the military sweeping powers. The government could implement curfews controlling the movement of people. The president could rule by decree without referring to the constitution or to parliament. It became a criminal offence to threaten someone verbally or possess documents that the government perceived to be threatening, to advise anyone to stay away from work or oppose the government, and to disclose the name of anyone arrested under the State of Emergency until the government released that name, with up to ten years ' imprisonment for these offences. 
Detention without trial became a common feature of the government 's reaction to growing civil unrest and by 1988, 30,000 people had been detained. The media was censored, thousands were arrested and many were interrogated and tortured. On 12 June 1986, four days before the tenth anniversary of the Soweto uprising, the state of emergency was extended to cover the whole country. The government amended the Public Security Act, including the right to declare "unrest '' areas, allowing extraordinary measures to crush protests in these areas. Severe censorship of the press became a dominant tactic in the government 's strategy and television cameras were banned from entering such areas. The state broadcaster, the South African Broadcasting Corporation (SABC), provided propaganda in support of the government. Media opposition to the system increased, supported by the growth of a pro-ANC underground press within South Africa. In 1987, the State of Emergency was extended for another two years. Meanwhile, about 200,000 members of the National Union of Mineworkers commenced the longest strike (three weeks) in South African history. 1988 saw the banning of the activities of the UDF and other anti-apartheid organisations. Much of the violence in the late 1980s and early 1990s was directed at the government, but a substantial amount was between the residents themselves. Many died in violence between members of Inkatha and the UDF - ANC faction. It was later proven that the government manipulated the situation by supporting one side or the other when it suited it. Government agents assassinated opponents within South Africa and abroad; they undertook cross-border army and air - force attacks on suspected ANC and PAC bases. The ANC and the PAC in return exploded bombs at restaurants, shopping centres and government buildings such as magistrates courts. 
Between 1960 and 1994, according to statistics from the Truth and Reconciliation Commission, the Inkatha Freedom Party was responsible for 4,500 killings, South African security forces were responsible for 2,700 killings and the ANC was responsible for 1,300 killings. The state of emergency continued until 1990, when it was lifted by State President F.W. de Klerk. Apartheid developed out of colonial - era racism and South Africa 's "unique industrialisation ''. The policies of industrialisation led to the segregation and classing of people, which was "specifically developed to nurture early industry such as mining and capitalist culture ''. Cheap labour was the basis of the economy, and this was taken from what the state classed as peasant groups and migrants. Furthermore, Philip Bonner highlights the "contradictory economic effects '', as the economy did not have a manufacturing sector, thereby promoting short - term profitability but limiting labour productivity and the size of local markets. This also contributed to apartheid 's collapse: "Clarke emphasises the economy could not provide and compete with foreign rivals as they failed to master cheap labour and complex chemistry ''. The contradictions in the traditionally capitalist economy of the apartheid state led to considerable debate about racial policy, and division and conflicts in the central state. To a large extent, the political ideology of apartheid had emerged from the colonisation of Africa by European powers, which institutionalised racial discrimination and exercised a paternal philosophy of "civilising inferior natives. '' Some scholars have argued that this can be reflected in Afrikaner Calvinism, with its parallel traditions of racialism; for example, as early as 1933 the executive council of the Broederbond formulated a recommendation for mass segregation. 
External Western influence, particularly through colonisation, can be seen as one of the factors that shaped the country 's political ideology. South Africa in particular has been described as an "unreconstructed example of western civilisation twisted by racism ''. However, Western influence also helped end apartheid: "Once the power of the Soviet Union declined along with its Communist influence, western nations felt Apartheid could no longer be tolerated and spoke out, encouraging a move towards democracy and self - determination ''. In the 1960s, South Africa experienced economic growth second only to that of Japan. Trade with Western countries grew, and investment from the United States, France and Britain poured in. Resistance to apartheid was encouraged by Portugal 's withdrawal from Mozambique and Angola after the 1974 Carnation Revolution. South African troops withdrew from Angola in early 1976, failing to prevent the MPLA from gaining power there, and black students in South Africa celebrated. The Mahlabatini Declaration of Faith, signed by Mangosuthu Buthelezi and Harry Schwarz in 1974, enshrined the principles of peaceful transition of power and equality for all. Its purpose was to provide a blueprint for governing South Africa by consent and for racial peace in a multi-racial society, stressing opportunity for all, consultation, the federal concept and a Bill of Rights. It caused a split in the United Party that ultimately realigned opposition politics in South Africa, with the formation of the Progressive Federal Party in 1977. It was the first such agreement by acknowledged black and white political leaders in South Africa. In 1978, the defence minister of the NP, Pieter Willem Botha, became Prime Minister. Botha 's white regime was worried about the Soviet Union helping revolutionaries in South Africa, and the economy had slowed down. 
The new government noted that it was spending too much money trying to maintain the segregated homelands created for blacks, and that the homelands were proving uneconomical. Nor was maintaining blacks as a third class working well: their labour remained vital to the economy, and illegal black labour unions were flourishing. Many blacks remained too poor to make much of a contribution to the economy through their purchasing power -- although they made up more than 70 percent of the population. Botha 's regime feared that an antidote was needed to prevent blacks from being attracted to Communism. In July 1979, the Nigerian government claimed that the Shell - BP Petroleum Development Company of Nigeria Limited (SPDC) was selling Nigerian oil to South Africa, although there was little evidence or commercial logic for such sales. The alleged sanctions - breaking was used to justify the seizure of some of BP 's assets in Nigeria, including their stake in SPDC, although it appears the real reasons were economic nationalism and domestic politics ahead of the Nigerian elections. Many South Africans attended schools in Nigeria, and Nelson Mandela several times acknowledged the role of Nigeria in the struggle against apartheid. In the 1980s, the anti-apartheid movements in the United States and Europe were gaining support for boycotts against South Africa, for the withdrawal of US firms from South Africa and for the release of Mandela. South Africa was becoming an outlaw in the world community of nations. Investment in South Africa by Americans and others was coming to an end, and an active policy of disinvestment ensued. In the early 1980s, Botha 's National Party government started to recognise the inevitability of the need to reform apartheid. 
Early reforms were driven by a combination of internal violence, international condemnation, changes within the National Party 's constituency, and changing demographics -- whites constituted only 16 percent of the total population, in comparison to 20 percent fifty years earlier. In 1983, a new constitution was passed implementing what was called the Tricameral Parliament, giving coloureds and Indians voting rights and parliamentary representation in separate houses -- the House of Assembly (178 members) for whites, the House of Representatives (85 members) for coloureds and the House of Delegates (45 members) for Indians. Each House handled laws pertaining to its racial group 's "own affairs '', including health, education and other community issues. All laws relating to "general affairs '' (matters such as defence, industry, taxation and Black affairs) were handled by a cabinet made up of representatives from all three houses. However, the white chamber had a large majority on this cabinet, ensuring that effective control of the country remained in white hands. Blacks, although making up the majority of the population, were excluded from representation; they remained nominal citizens of their homelands. The first Tricameral elections were largely boycotted by Coloured and Indian voters, amid widespread rioting. Concerned over the popularity of Mandela, Botha denounced him as an arch - Marxist committed to violent revolution, but to appease black opinion and nurture Mandela as a benevolent leader of blacks, the government moved him from Robben Island to Pollsmoor Prison in a rural area just outside Cape Town, where prison life was easier. The government allowed Mandela more visitors, including visits and interviews by foreigners, to let the world know that he was being treated well. Black homelands were declared nation - states and pass laws were abolished. 
Black labour unions were legitimised, the government recognised the right of blacks to live in urban areas permanently and gave blacks property rights there. Interest was expressed in rescinding the laws against interracial marriage and against sex between the races, which were under ridicule abroad. Spending on black schools increased to one - seventh of what was spent per white child, up from one - sixteenth in 1968. At the same time, attention was given to strengthening the effectiveness of the police apparatus. In January 1985, Botha addressed the government 's House of Assembly and stated that the government was willing to release Mandela on condition that Mandela pledge opposition to acts of violence to further political objectives. Mandela 's reply was read in public by his daughter Zinzi -- his first words distributed publicly since his sentence to prison twenty - one years before. Mandela described violence as the responsibility of the apartheid regime and said that with democracy there would be no need for violence. The crowd listening to the reading of his speech erupted in cheers and chants. This response helped to further elevate Mandela 's status in the eyes of those, both internationally and domestically, who opposed apartheid. Between 1986 and 1988, some petty apartheid laws were repealed. Botha told white South Africans to "adapt or die '', and twice he wavered on the eve of what were billed as "rubicon '' announcements of substantial reforms, although on both occasions he backed away from substantial changes. Ironically, these reforms served only to trigger intensified political violence through the remainder of the eighties as more communities and political groups across the country joined the resistance movement. 
Botha 's government stopped short of substantial reforms, such as lifting the ban on the ANC, PAC and SACP and other liberation organisations, releasing political prisoners, or repealing the foundation laws of grand apartheid. The government 's stance was that they would not contemplate negotiating until those organisations "renounced violence ''. By 1987, South Africa 's economy was growing at one of the lowest rates in the world, and the ban on South African participation in international sporting events was frustrating many whites in South Africa. Examples of African states with black leaders and white minorities existed in Kenya and Zimbabwe. Whispers of South Africa one day having a black President sent more hardline whites into Rightist parties. Mandela was moved to a four - bedroom house of his own, with a swimming pool and shaded by fir trees, on a prison farm just outside Cape Town. He had an unpublicised meeting with Botha. Botha impressed Mandela by walking forward, extending his hand and pouring Mandela 's tea. The two had a friendly discussion, with Mandela comparing the African National Congress ' rebellion with that of the Afrikaner rebellion and talking about everyone being brothers. A number of clandestine meetings were held between the ANC - in - exile and various sectors of the internal struggle, such as women and educationalists. More overtly, a group of white intellectuals met the ANC in Senegal for talks. Early in 1989, Botha suffered a stroke; he was prevailed upon to resign in February 1989. He was succeeded as president later that year by F.W. de Klerk. Despite his initial reputation as a conservative, de Klerk moved decisively towards negotiations to end the political stalemate in the country. 
In his opening address to parliament on 2 February 1990, de Klerk announced that he would repeal discriminatory laws and lift the 30 - year ban on leading anti-apartheid groups such as the African National Congress, the Pan Africanist Congress, the South African Communist Party (SACP) and the United Democratic Front. The Land Act was brought to an end. De Klerk also made his first public commitment to release Nelson Mandela, to return to press freedom and to suspend the death penalty. Media restrictions were lifted and political prisoners not guilty of common law crimes were released. On 11 February 1990, Nelson Mandela was released from Victor Verster Prison after more than 27 years of confinement. Having been instructed by the UN Security Council to end its long - standing involvement in South West Africa / Namibia, and facing military stalemate in southern Angola, an escalation in the size and cost of the fighting with Cuban, Angolan and SWAPO forces, and the growing cost of the border war, South Africa negotiated a change of control; Namibia became independent on 21 March 1990. Apartheid was dismantled in a series of negotiations from 1990 to 1991, culminating in a transitional period which resulted in the country 's 1994 general elections, the first in South Africa held with universal suffrage. In 1990, negotiations began in earnest, with two meetings between the government and the ANC. The purpose of the negotiations was to pave the way for talks towards a peaceful transition to majority rule. These meetings were successful in laying down the preconditions for negotiations, despite the considerable tensions still abounding within the country. Apartheid legislation was abolished in 1991. At the first meeting, the NP and ANC discussed the conditions for negotiations to begin. The meeting was held at Groote Schuur, the President 's official residence. 
They released the Groote Schuur Minute, which said that before negotiations commenced political prisoners would be freed and all exiles allowed to return. There were fears that the change of power would be violent. To avoid this, it was essential that a peaceful resolution between all parties be reached. In December 1991, the Convention for a Democratic South Africa (CODESA) began negotiations on the formation of a multiracial transitional government and a new constitution extending political rights to all groups. CODESA adopted a Declaration of Intent and committed itself to an "undivided South Africa ''. Reforms and negotiations to end apartheid led to a backlash among the right - wing white opposition, leading to the Conservative Party winning a number of by - elections against NP candidates. De Klerk responded by calling a whites - only referendum in March 1992 to decide whether negotiations should continue. A 68 per cent majority gave its support, and the victory instilled in de Klerk and the government a lot more confidence, giving the NP a stronger position in negotiations. When negotiations resumed in May 1992, under the tag of CODESA II, stronger demands were made. The ANC and the government could not reach a compromise on how power should be shared during the transition to democracy. The NP wanted to retain a strong position in a transitional government, and the power to change decisions made by parliament. Persistent violence added to the tension during the negotiations. This was due mostly to the intense rivalry between the Inkatha Freedom Party (IFP) and the ANC and the eruption of some traditional tribal and local rivalries between the Zulu and Xhosa historical tribal affinities, especially in the Southern Natal provinces. Although Mandela and Buthelezi met to settle their differences, they could not stem the violence. 
One of the worst cases of ANC - IFP violence was the Boipatong massacre of 17 June 1992, when 200 IFP militants attacked the Gauteng township of Boipatong, killing 45. Witnesses said that the men had arrived in police vehicles, supporting claims that elements within the police and army contributed to the ongoing violence. Subsequent judicial inquiries found the evidence of the witnesses to be unreliable or discredited, and that there was no evidence of National Party or police involvement in the massacre. When de Klerk visited the scene of the incident he was initially warmly welcomed, but he was suddenly confronted by a crowd of protesters brandishing stones and placards. The motorcade sped from the scene as police tried to hold back the crowd. Shots were fired by the police, and the PAC stated that three of its supporters had been gunned down. Nonetheless, the Boipatong massacre offered the ANC a pretext to engage in brinkmanship. Mandela argued that de Klerk, as head of state, was responsible for bringing an end to the bloodshed. He also accused the South African police of inciting the ANC - IFP violence. This formed the basis for ANC 's withdrawal from the negotiations, and the CODESA forum broke down completely at this stage. The Bisho massacre on 7 September 1992 brought matters to a head. The Ciskei Defence Force killed 29 people and injured 200 when they opened fire on ANC marchers demanding the reincorporation of the Ciskei homeland into South Africa. In the aftermath, Mandela and de Klerk agreed to meet to find ways to end the spiralling violence. This led to a resumption of negotiations. Right - wing violence also added to the hostilities of this period. The assassination of Chris Hani on 10 April 1993 threatened to plunge the country into chaos. 
Hani, the popular general secretary of the South African Communist Party (SACP), was assassinated in 1993 in Dawn Park in Johannesburg by Janusz Waluś, an anti-communist Polish refugee who had close links to the white nationalist Afrikaner Weerstandsbeweging (AWB). Hani enjoyed widespread support beyond his constituency in the SACP and ANC and had been recognised as a potential successor to Mandela; his death brought forth protests throughout the country and across the international community, but ultimately proved a turning point, after which the main parties pushed for a settlement with increased determination. On 25 June 1993, the AWB used an armoured vehicle to crash through the doors of the Kempton Park World Trade Centre where talks were still going ahead under the Negotiating Council, though this did not derail the process. In addition to the continuing "black - on - black '' violence, there were a number of attacks on white civilians by the PAC 's military wing, the Azanian People 's Liberation Army (APLA). The PAC was hoping to strengthen their standing by attracting the support of the angry, impatient youth. In the St James Church massacre on 25 July 1993, members of the APLA opened fire in a church in Cape Town, killing 11 members of the congregation and wounding 58. In 1993 de Klerk and Mandela were jointly awarded the Nobel Peace Prize "for their work for the peaceful termination of the apartheid regime, and for laying the foundations for a new democratic South Africa ''. Violence persisted right up to the 1994 elections. Lucas Mangope, leader of the Bophuthatswana homeland, declared that it would not take part in the elections. It had been decided that, once the temporary constitution had come into effect, the homelands would be incorporated into South Africa, but Mangope did not want this to happen. 
There were strong protests against his decision, leading to a coup d'état in Bophuthatswana on 10 March that deposed Mangope, despite the intervention of white right - wingers hoping to maintain him in power. Three AWB militants were killed during this intervention, and harrowing images were shown on national television and in newspapers across the world. Two days before the elections, a car bomb exploded in Johannesburg, killing nine. The day before the elections, another one went off, injuring 13. At midnight on 26 -- 27 April 1994 the old flag was lowered, and the old (now co-official) national anthem Die Stem ("The Call '') was sung, followed by the raising of the new rainbow flag and singing of the other co-official anthem, Nkosi Sikelel ' iAfrika ("God Bless Africa ''). The election was held on 27 April 1994 and went off peacefully throughout the country as 20 million South Africans cast their votes. There was some difficulty in organising the voting in rural areas, but people waited patiently for many hours to vote amidst a palpable feeling of goodwill. An extra day was added to give everyone the chance. International observers agreed that the elections were free and fair. The European Union 's report on the election compiled at the end of May 1994, published two years after the election, criticised the Independent Electoral Commission 's lack of preparedness for the polls, the shortages of voting materials at many voting stations, and the absence of effective safeguards against fraud in the counting process. In particular, it expressed disquiet that "no international observers had been allowed to be present at the crucial stage of the count when party representatives negotiated over disputed ballots. '' This meant that both the electorate and the world were "simply left to guess at the way the final result was achieved. '' The ANC won 62.65 percent of the vote, less than the 66.7 percent that would have allowed it to rewrite the constitution. 
252 of the 400 seats went to members of the African National Congress. The NP captured most of the white and coloured votes and became the official opposition party. As well as deciding the national government, the election decided the provincial governments, and the ANC won in seven of the nine provinces, with the NP winning in the Western Cape and the IFP in KwaZulu - Natal. On 10 May 1994, Mandela was sworn in as South Africa 's president. The Government of National Unity was established, its cabinet made up of 12 ANC representatives, six from the NP, and three from the IFP. Thabo Mbeki and de Klerk were made deputy presidents. The anniversary of the elections, 27 April, is celebrated as a public holiday known as Freedom Day. The following individuals, who had previously supported apartheid, made public apologies:
how old do you have to be to enter triwizard tournament
Harry Potter and the Goblet of Fire - wikipedia Harry Potter and the Goblet of Fire is a fantasy book written by British author J.K. Rowling and the fourth novel in the Harry Potter series. It follows Harry Potter, a wizard in his fourth year at Hogwarts School of Witchcraft and Wizardry and the mystery surrounding the entry of Harry 's name into the Triwizard Tournament, in which he is forced to compete. The book was published in the United Kingdom by Bloomsbury and in the United States by Scholastic; in both countries the release date was 8 July 2000, the first time a book in the series was published in both countries at the same time. The novel won a Hugo Award, the only Harry Potter novel to do so, in 2001. The book was adapted into a film, which was released worldwide on 18 November 2005, and a video game by Electronic Arts. Throughout the three previous novels in the Harry Potter series, the main character, Harry Potter, has struggled with the difficulties of growing up, and the added challenge of being a famed wizard: when Harry was a baby, Lord Voldemort, the most powerful Dark wizard in history, killed Harry 's parents but mysteriously vanished after unsuccessfully trying to kill Harry, which left a lightning - shaped scar on Harry 's forehead. This results in Harry 's immediate fame and his being placed in the care of his abusive muggle, or non-magical, aunt and uncle, Aunt Petunia Dursley and Uncle Vernon Dursley, who have a son named Dudley Dursley. Harry learns that he is a wizard when he is 11 years old and enrols in Hogwarts School of Witchcraft and Wizardry. He befriends Ron Weasley and Hermione Granger, and is confronted by Lord Voldemort who is trying to regain power. In Harry 's first year he has to protect the Philosopher 's Stone from Voldemort and one of his faithful followers at Hogwarts. 
After returning to Hogwarts following the summer break, students are attacked by the legendary monster of the "Chamber of Secrets '' after the chamber is opened. Harry ends the attacks by killing a Basilisk and defeating another attempt by Lord Voldemort to return to full strength. The following year, Harry hears that he has been targeted by escaped mass murderer Sirius Black. Despite stringent security measures at Hogwarts, Harry is confronted by Black at the end of his third year of schooling, and Harry learns that Black was framed and is actually Harry 's godfather. He also learns that it was his father 's old school friend Peter Pettigrew who actually betrayed his parents. The book opens with Harry seeing Frank Bryce killed by Lord Voldemort in a vision; he wakes with his scar hurting. The Weasleys then take Harry and Hermione Granger to the Quidditch World Cup, using a Portkey, to watch Ireland versus Bulgaria, with Ireland emerging victorious. There, Harry meets Cedric Diggory, who is attending the match with his father. After the match, Voldemort 's followers attack the site, destroying spectators ' tents and wreaking havoc. The Dark Mark is fired into the sky, causing a panic, since it is the first time the sign has been seen in 13 years. Winky, Barty Crouch Senior 's house elf, is falsely accused of casting the Mark after she is found holding Harry 's wand, which is revealed to have been used to cast it; Harry had lost the wand during the chaos of the Death Eaters ' attack. Hermione, angry at this injustice, forms a society known as S.P.E.W. (Society for the Promotion of Elvish Welfare) to promote the rights of house elves. At Hogwarts, Professor Dumbledore announces that Alastor "Mad - Eye '' Moody will be the Defence Against the Dark Arts teacher for the year, and also that Hogwarts will host the Triwizard Tournament, with a prize of one thousand gold Galleons.
However, only those over 17 -- the age of majority in the wizarding world -- will be allowed to enter. It is the first time in 202 years that the Triwizard Tournament will be held. Students from Beauxbatons Academy and the Durmstrang Institute, other wizarding academies, will travel to Hogwarts, where they will stay for the year, in hopes of competing. At Halloween, the Goblet of Fire picks Fleur Delacour from Beauxbatons Academy; Viktor Krum (who is also the Seeker on Bulgaria 's Quidditch team) from Durmstrang Institute; and Cedric Diggory from Hogwarts to compete in the tournament. However, it additionally gives a fourth name -- Harry Potter -- leading to suspicion and indignation from everyone and magically binding Harry to compete. Ron is jealous that Harry is once again in the limelight and refuses to speak to him. Hagrid reveals to Harry that the first task involves dragons, and since Fleur and Krum 's headmasters are also aware of this, and will surely tell them in advance, Harry informs Cedric as well. At the task, Harry has to get past a Hungarian Horntail to retrieve a golden egg that contains a hint to the next task, which he does by summoning his Firebolt broomstick with the Accio spell, and finishes the task tied for first with Krum. Ron and Harry subsequently reconcile, Ron now understanding the full danger of the tournament. When Harry opens the egg, though, it merely shrieks loudly. Hermione then takes Harry and Ron to the school kitchens, where house elves work. There, they meet a distraught Winky, who is struggling to come to terms with her sacking. They also meet Harry 's old friend Dobby, who has been employed at Hogwarts to work in the kitchens; he is the only known house elf to appreciate his freedom, despite his hardworking nature.
Meanwhile, gossipy reporter Rita Skeeter is writing scandalous articles of half - truths and outright fabrications in The Daily Prophet about those at Hogwarts, including Hermione, Harry, Hagrid, and Madame Maxime of Beauxbatons. With the Yule Ball approaching, Harry must find a partner, but when he finally approaches his crush Cho Chang, Cedric has beaten him to her, so Harry and Ron ask Parvati and Padma Patil. Ron is shocked and jealous to see that Hermione is attending with Krum. Cedric gives Harry a tip on the egg, telling him to take it to the prefects ' bathroom, but Harry refuses to listen, jealous over Cho. Finally acting on the tip, Harry takes the egg to the prefects ' bathroom, where Moaning Myrtle tells him to listen to the egg underwater; there the words become understandable. Harry learns that the task is to recover something he will "sorely miss '', and starts looking for spells to help him breathe where the objects will be taken: the Black Lake. By the morning of the task, Harry still has not found a solution, but Dobby gives him some Gillyweed to give Harry gills. Harry completes the task by rescuing Ron from under the lake. Harry then takes a risk by also rescuing Fleur 's younger sister, Gabrielle, after Fleur is unable to. After the judges confer, he earns enough points to tie with Cedric for the lead. One month before the final task, Harry and Krum are talking when they encounter Crouch, who appears to have gone insane, but manages to tell Harry to get Dumbledore. Leaving Krum with Crouch, Harry fetches Dumbledore but returns to find Krum stunned and Crouch gone. Harry returns to preparing for the final task, a hedge maze. Inside the maze, Harry is forced to incapacitate Krum, who has been bewitched, to save Cedric. Working together, the two reach the cup. They agree to touch it at the same time, and doing so, discover that it is a Portkey that transports them to a graveyard.
There, Peter Pettigrew kills Cedric and uses Harry 's blood (along with his own hand and Tom Riddle Sr. 's bone) to resurrect Lord Voldemort. Voldemort summons his Death Eaters, berating them for thinking he was dead, before he reveals that he has a single "faithful servant '' concealed at Hogwarts, who has been working to ensure that Harry would make it to the graveyard, and then challenges Harry to a duel. However, when he and Harry fire curses at each other, their wands connect due to their identical cores. Voldemort 's wand releases the most recent spells it performed, resulting in imprints of his last victims appearing in the graveyard, including Harry 's parents, who provide a distraction so that Harry can escape back to Hogwarts using the Portkey, taking Cedric 's body with him. When he returns, Moody takes him to his office, and reveals himself to be Voldemort 's ' faithful servant '; he was the one who put Harry 's name into the Goblet of Fire, and has been guiding him through the tournament from behind the scenes to ensure that he would grab the Portkey first. Before Moody can kill Harry, Dumbledore, McGonagall and Snape intervene. They learn that Moody is in fact Barty Crouch Jr., Mr. Crouch 's son, disguised by Polyjuice Potion. Crouch had sentenced Crouch Jr. to life imprisonment in Azkaban over alleged ties to the Death Eaters but smuggled him out as a last favour to his dying wife. Crouch Jr. was the one who set off the Dark Mark at the Quidditch World Cup, doing it to scare the Death Eaters he felt had abandoned Voldemort. Eventually, Voldemort had gotten in contact with Crouch Jr. and had him impersonate Moody as part of his plan. Crouch Jr. also admits to killing Crouch Sr., to prevent him telling Dumbledore about Voldemort. The real Moody is found inside Crouch Jr. 's enchanted trunk and rescued. Harry is then declared the winner of the Triwizard Tournament and given his winnings. 
Many people, including Fudge, do not believe Harry and Dumbledore about Voldemort 's return, and because Fudge has the Dementor 's Kiss performed on Crouch Jr., he is unable to give testimony. Hermione discovers Rita Skeeter is an unregistered Animagus, who can take the form of a beetle, and blackmails her into stopping her libellous stories. Not wanting his tournament winnings, Harry gives them to Fred and George to start their joke shop and returns home with the Dursleys. Harry Potter and the Goblet of Fire is the fourth book in the Harry Potter series. The first, Harry Potter and the Philosopher 's Stone, was published by Bloomsbury on 26 June 1997; the second, Harry Potter and the Chamber of Secrets, was published on 2 July 1998; and the third, Harry Potter and the Prisoner of Azkaban, followed on 8 July 1999. Goblet of Fire is considerably longer than the first three, at almost twice their size (the paperback edition was 636 pages). Rowling stated that she "knew from the beginning it would be the biggest of the first four ''. She said there needed to be a "proper run - up '' for the conclusion and that rushing the "complex plot '' could confuse readers. She also stated that "everything is on a bigger scale '', which was symbolic, as Harry 's horizons widened both literally and metaphorically as he grew up. She also wanted to explore more of the magical world. Until the official title 's announcement on 27 June 2000, the book was called by its working title, ' Harry Potter IV. ' Previously, in April, the publisher had listed it as Harry Potter and the Doomspell Tournament. However, J. K. Rowling expressed her indecision about the title in an Entertainment Weekly interview. "I changed my mind twice on what (the title) was. The working title had got out -- Harry Potter and the Doomspell Tournament. Then I changed Doomspell to Triwizard Tournament. Then I was teetering between Goblet of Fire and Triwizard Tournament.
In the end, I preferred Goblet of Fire because it 's got that kind of cup of destiny feel about it, which is the theme of the book. '' Rowling mentioned that she originally had a Weasley relative named Mafalda, who, according to Rowling, "was the daughter of the ' second cousin who 's a stockbroker ' mentioned in Philosopher 's Stone. This stockbroker had been very rude to Mr. and Mrs. Weasley in the past, but now he and his (Muggle) wife had inconveniently produced a witch, they came back to the Weasleys asking for their help in introducing her to wizarding society before she starts at Hogwarts ''. Mafalda was supposed to be a Slytherin who would fill the Rita Skeeter role, but she was eventually removed as "there were obvious limitations to what an eleven year old closeted at school could discover ''. Rowling considered Rita Skeeter to be "much more flexible ''. Rowling also admitted that the fourth book was the most difficult to write at the time, because she noticed a giant plot hole halfway through writing. In particular, Rowling had trouble with the ninth chapter, "The Dark Mark '', which she rewrote 13 times. Jeff Jensen, who interviewed Rowling for Entertainment Weekly in 2000, pointed out that bigotry is a big theme in the Harry Potter novels and Goblet of Fire in particular. He mentioned how Voldemort and his followers are prejudiced against Muggles and how in Goblet of Fire Hermione forms a group to liberate Hogwarts ' house - elves who have "been indentured servants so long they lack desire for anything else ''. When asked why she explored this theme, Rowling replied, Because bigotry is probably the thing I detest most. All forms of intolerance, the whole idea of that which is different from me is necessarily evil. I really like to explore the idea that difference is equal and good. But there 's another idea that I like to explore, too.
Oppressed groups are not, generally speaking, people who stand firmly together -- no, sadly, they kind of subdivide among themselves and fight like hell. That 's human nature, so that 's what you see here. This world of wizards and witches, they 're already ostracized, and then within themselves, they 've formed a loathsome pecking order. She also commented that she did not feel this was too "heavy '' for children, as it was one of those things that a "huge number of children at that age start to think about ''. Goblet of Fire was the first book in the Harry Potter series to be released in the United States on the same date as the United Kingdom, on 8 July 2000, strategically on a Saturday so children did not have to worry about school conflicting with buying the book. It had a combined first - printing of over five million copies. It was given a record - breaking print run of 3.9 million. Three million copies of the book were sold over the first weekend in the US alone. FedEx dispatched more than 9,000 trucks and 100 planes to fulfil book deliveries. Pressure during editing caused a mistake in which Harry 's father emerges first from Voldemort 's wand; however, as confirmed in Prisoner of Azkaban, James died first, so Harry 's mother ought to have come out first. This was corrected in later editions. To publicise the book, a special train named Hogwarts Express was organised by Bloomsbury, and run from King 's Cross to Perth, carrying J.K. Rowling and a consignment of books for her to sign and sell, as well as representatives of Bloomsbury and the press. The book was launched on 8 July 2000, on platform 1 at King 's Cross -- which had been given "Platform 9¾ '' signs for the occasion -- following which the train departed. En route it called at Didcot Railway Centre, Kidderminster, the Severn Valley Railway, Crewe (overnight stop), Manchester, Bradford, York, the National Railway Museum (overnight stop), Newcastle, Edinburgh, arriving at Perth on 11 July.
The locomotive was West Country class steam locomotive no. 34027 Taw Valley, which was specially repainted red for the tour; it later returned to its normal green livery (the repaints were requested and paid for by Bloomsbury). The coaches of the train included a sleeping car. A diesel locomotive was coupled at the other end, for use when reversals were necessary, such as the first stage of the journey as far as Ferme Park, just south of Hornsey. The tour generated considerably more press interest than the launch of the film Thomas and the Magic Railroad, which was premièred in London the same weekend. Harry Potter and the Goblet of Fire has received mostly positive reviews. In The New York Times Book Review, author Stephen King stated that Goblet of Fire was "every bit as good as Potters 1 through 3 '' and praised the humour and subplots, although he commented that "there 's also a moderately tiresome amount of adolescent squabbling... it 's a teenage thing ''. Kirkus Reviews called it "another grand tale of magic and mystery... and clicking along so smoothly that it seems shorter than it is ''. However, they commented that it did tend to lag, especially at the end where two "bad guys '' stopped the action to give extended explanations, and that the issues to be resolved in sequels would leave "many readers, particularly American ones, uncomfortable ''. For The Horn Book Magazine, Martha V. Parravano gave a mixed review, saying "some will find (it) wide - ranging, compellingly written, and absorbing; others, long, rambling, and tortuously fraught with adverbs ''. A Publishers Weekly review praised the book 's "red herrings, the artful clues and tricky surprises that disarm the most attentive audience '' and said it "might be her most thrilling yet ''. Writing for The New Yorker, Joan Acocella noted that "where the prior volumes moved like lightning, here the pace is slower, the energy more dispersed. At the same time, the tone becomes more grim.
'' Kristin Lemmerman of CNN said that it is not great literature: ' Her prose has more in common with your typical beach - blanket fare and the beginning contained too much recap to introduce characters to new readers, although Rowling quickly gets back on track, introducing readers to a host of well - drawn new characters. ' Writing for Salon.com, Charles Taylor was generally positive about the change of mood and development of characters. Entertainment Weekly 's reviewer Kristen Baldwin gave Goblet of Fire the grade of A -, praising the development of the characters as well as the many themes presented. However, she did worry that a shocking climax may be a "nightmare factory '' for young readers. Harry Potter and the Goblet of Fire won several awards, including the 2001 Hugo Award for Best Novel. It won the 2002 Indian Paintbrush Book Award, the third after Philosopher 's Stone and Prisoner of Azkaban. The novel also won an Oppenheim Toy Portfolio Platinum Award for one of the best books; the citation claimed it was "more intense than the first three books ''. In addition, Entertainment Weekly listed Goblet of Fire in second place on their list of The New Classics: Books -- The 100 best reads from 1983 to 2008. Harry Potter and the Goblet of Fire was adapted into a film, released worldwide on 18 November 2005, which was directed by Mike Newell and written by Steve Kloves. The film grossed $102.7 million for the opening weekend, and eventually grossed $896 million worldwide. The film was also nominated for Best Art Direction at the 78th Academy Awards. It was also made into a video game for PC, PlayStation 2, Nintendo DS, Nintendo GameCube, Xbox, Game Boy Advance, and PlayStation Portable by Electronic Arts. It was released just before the film.
Much of the plot of Harry Potter and the Cursed Child involves revisiting scenes from Goblet of Fire, with younger protagonists born long after these events travelling back in time in a misguided effort to change history and save Cedric Diggory - which only damages events in the present and worsens the situation.
how many stars make up the big dipper
Big Dipper - wikipedia The Big Dipper (US) or the Plough (UK) is an asterism consisting of seven bright stars of the constellation Ursa Major; six of them are of second magnitude and one, Megrez (δ), of third magnitude. Four define a "bowl '' or "body '' and three define a "handle '' or "head ''. It is recognized as a distinct grouping in many cultures. The North Star (Polaris), the current northern pole star and the tip of the handle of the Little Dipper (Little Bear), can be located by extending an imaginary line through the front two stars of the asterism, Merak (β) and Dubhe (α). This makes it useful in celestial navigation. The constellation of Ursa Major has been seen as a bear, a wagon, or a ladle. The "bear '' tradition is Greek, but apparently the name "bear '' has parallels in Siberian or North American traditions. The name "Bear '' is Homeric, and apparently native to Greece, while the "Wain '' tradition is Mesopotamian. Book XVIII of Homer 's Iliad mentions it as "the Bear, which men also call the Wain ''. In Latin, these seven stars were known as the "Seven Oxen '' (septentriones, from septem triōnēs). The classical mythographer identified the "Bear '' as the nymph Callisto, changed into a she - bear by Hera, the jealous wife of Zeus. In Ireland and the United Kingdom, this pattern is known as the Plough. The symbol of the Starry Plough has been used as a political symbol by Irish Republican and left wing movements. Former names include the Great Wain (i.e., wagon) or Butcher 's Cleaver. The terms Charles 's Wain and Charles his Wain are derived from the still older Carlswæn. A folk etymology holds that this derived from Charlemagne, but the name is common to all the Germanic languages and intended the churls ' wagon (i.e., "the men 's wagon ''), in contrast with the women 's wagon (the Little Dipper). An older "Odin 's Wain '' may have preceded these Nordic designations. 
In German, it is known as the "Great Wagon '' (Großer Wagen) and, less often, the "Great Bear '' (Großer Bär). In Scandinavia, it is known by variations of "Charles 's Wagon '' (Karlavagnen, Karlsvogna, or Karlsvognen), but also the "Great Bear '' (Stora Björn). In Dutch, its official name is the "Great Bear '' (Grote Beer), but it is popularly known as the "Saucepan '' (Steelpannetje). In Italian, too, it is called the "Great Wagon '' (Grande Carro). In Romanian and most Slavic languages, it is known as the "Great Wagon '' as well, but, in Hungarian, it is commonly called "Göncöl 's Wagon '' (Göncölszekér) or, less often, "Big Göncöl '' (Nagy Göncöl) after a táltos (shaman) in Hungarian mythology who carried medicine that could cure any disease. In Finnish, the figure is known as Otava with established etymology in the archaic meaning ' salmon net ', although other uses of the word refer to ' bear ' and ' wheel '. The bear relation is claimed to stem from the animal 's resemblance to -- and mythical origin from -- the asterism rather than vice versa. In the Lithuanian language, the stars of Ursa Major are known as Didieji Grįžulo Ratai ("Stars of the Riding Hall 's Wheels ''). Other names for the constellation include Perkūno Ratai ("Wheels of Perkūnas ''), Kaušas ("Bucket ''), Vežimas ("Carriage ''), and Samtis ("Summit ''). In traditional Chinese astronomy, which continues to be used throughout East Asia (e.g., in astrology), these stars are generally considered to compose the Right Wall of the Purple Forbidden Enclosure which surrounds the Northern Celestial Pole, although numerous other groupings and names have been made over the centuries. Similarly, each star has a distinct name, which likewise has varied over time and depending upon the asterism being constructed. 
The Western asterism is now known as the "Northern Dipper '' (北斗) or the "Seven Stars of the Northern Dipper '' (Chinese and Japanese: 北斗 七星; pinyin: Běidǒu Qīxīng; Cantonese Yale: Bak1 - dau2 Cat1 - sing1; rōmaji: Hokutō Shichisei; Korean: 북두칠성; romaja: Bukdu Chilseong; Vietnamese: Sao Bắc Đẩu). The personification of the Big Dipper itself is also known as "Doumu '' (斗 母) in Chinese folk religion and Taoism, and Marici in Buddhism. In Shinto, the seven largest stars of Ursa Major belong to Amenominakanushi, the oldest and most powerful of all kami. In North Korea, the constellation is featured on the flag of the country 's special forces. In South Korea, the constellation is referred to as "the seven stars of the north ''. In the related myth, a widow with seven sons found comfort with a widower, but to get to his house required crossing a stream. The seven sons, sympathetic to their mother, placed stepping stones in the river. Their mother, not knowing who put the stones in place, blessed them and, when they died, they became the constellation. In Malaysian, it is known as the "Dipper Stars '' (Buruj Biduk); in Indonesian, as the "Canoe Stars '' (Bintang Biduk). In Burmese, these stars are known as Pucwan Tārā (ပုဇွန် တာရာ, pronounced "bazun taya ''). Pucwan (ပုဇွန်) is a general term for a crustacean, such as a prawn, shrimp, crab, or lobster. In Javanese, it is known as "Bintang Kartika ''. This name comes from the Sanskrit "krttikã '', which refers to the same star cluster. In ancient Javanese, these seven brightest stars were known as Lintang Wuluh, literally "seven stars ''. The grouping is well known because its appearance in the sky serves as a time marker for planting. In Hindu astronomy, it is referred to as the "Collection of Seven Great Sages '' (Saptarshi Mandala), as each star is named after a mythical Hindu sage. An Arabian story has the four stars of the Plough 's bowl as a coffin, with the three stars in the handle as mourners following it.
In Mongolian, it is known as the "Seven Gods '' (Долоон бурхан). In Kazakh, they are known as the Jetiqaraqshi (Жетіқарақшы) and, in Kyrgyz, as the Jetigen (Жетиген). Within Ursa Major the stars of the Big Dipper have Bayer designations in consecutive Greek alphabetical order from the bowl to the handle. In the same line of sight as Mizar, but about one light - year beyond it, is the star Alcor (80 UMa). Together they are known as the "Horse and Rider ''. At fourth magnitude, Alcor would normally be relatively easy to see with the unaided eye, but its proximity to Mizar renders it more difficult to resolve, and it has served as a traditional test of sight. Mizar itself has four components and thus enjoys the distinction of being part of an optical binary as well as being the first - discovered telescopic binary (1617) and the first - discovered spectroscopic binary (1889). Five of the stars of the Big Dipper are at the core of the Ursa Major Moving Group. The two at the ends, Dubhe and Alkaid, are not part of the swarm, and are moving in the opposite direction. Relative to the central five, they are moving down and to the right in the map. This will slowly change the Dipper 's shape, with the bowl opening up and the handle becoming more bent. In 50,000 years the Dipper will no longer exist as we know it, but will be re-formed into a new Dipper facing the opposite way. The stars Alkaid to Phecda will then constitute the bowl, while Phecda, Merak, and Dubhe will be the handle. Not only are the stars in the Big Dipper easily found themselves, they may also be used as guides to yet other stars, and the asterism is thus often the starting point for introducing Northern Hemisphere beginners to the night sky. Additionally, the Dipper may be used as a guide to telescopic objects. The "Seven Stars '' referenced in the Bible 's Book of Amos may refer to these stars or, more likely, to the Pleiades. In addition, the constellation has also been used in corporate logos and the Alaska flag.
The seven stars on a red background of the flag of the Community of Madrid, Spain, are the stars of the Big Dipper asterism. The same is true of the seven stars pictured in the bordure azure of the coat of arms of Madrid, the capital of Spain.
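The pointer - star method described earlier -- extending the line from Merak through Dubhe roughly five further pointer - lengths to reach Polaris -- can be checked numerically. A minimal sketch in Python; the rounded J2000 coordinates and the five - length rule of thumb are the only inputs, and the coordinates are approximate, purely for illustration:

```python
import math

def radec_to_unit(ra_deg, dec_deg):
    """Convert equatorial coordinates (degrees) to a 3D unit vector."""
    ra, dec = math.radians(ra_deg), math.radians(dec_deg)
    return (math.cos(dec) * math.cos(ra),
            math.cos(dec) * math.sin(ra),
            math.sin(dec))

def angle_between(a, b):
    """Angular separation between two unit vectors, in degrees."""
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.degrees(math.acos(dot))

def extend_great_circle(a, b, t):
    """Point on the great circle through a and b at parameter t
    (t=0 gives a, t=1 gives b, t>1 extrapolates beyond b)."""
    omega = math.radians(angle_between(a, b))
    s = math.sin(omega)
    ka = math.sin((1 - t) * omega) / s
    kb = math.sin(t * omega) / s
    return tuple(ka * x + kb * y for x, y in zip(a, b))

# Approximate J2000 positions (rounded, illustrative values)
merak = radec_to_unit(165.46, 56.38)   # beta UMa, the lower pointer
dubhe = radec_to_unit(165.93, 61.75)   # alpha UMa, the upper pointer
pole  = (0.0, 0.0, 1.0)                # north celestial pole

sep = angle_between(merak, dubhe)      # pointer separation, roughly 5 degrees
# Follow the Merak -> Dubhe line five more pointer-lengths (t = 6 in total)
guess = extend_great_circle(merak, dubhe, 6.0)
# The extended point lands within a few degrees of the celestial pole,
# close enough to pick out Polaris by eye.
print(round(sep, 1), round(angle_between(guess, pole), 1))
```

The slerp - style extrapolation keeps the extended point on the great circle through the two pointers, which is what the naked - eye trick approximates.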
list of disney movies and when they came out
List of Walt Disney Pictures films - wikipedia This is a list of films released theatrically under the Walt Disney Pictures banner (known as that since 1983, with Never Cry Wolf as its first release) and films released before that under the former name of the parent company, Walt Disney Productions (1929 -- 1983). Most films listed here were distributed in the United States by the company 's distribution division, Walt Disney Studios Motion Pictures (formerly known as Buena Vista Distribution Company (1953 -- 1987) and Buena Vista Pictures Distribution (1987 -- 2007)). The Disney features produced before Peter Pan (1953) were originally distributed by RKO Radio Pictures, and are now distributed by Walt Disney Studios Motion Pictures. This list is organized by release date and includes live action feature films, animated feature films (including films developed and produced by Walt Disney Animation Studios and Pixar Animation Studios), and documentary films (including titles from the True - Life Adventures series and films produced by the Disneynature label). For a list covering only the animated films released by Walt Disney Pictures and its previous entities, see List of Disney theatrical animated features. This list is only for theatrical films released under the main Disney banner. It does not include films released by other existing, defunct or divested labels or subsidiaries owned by Walt Disney Studios (e.g. Marvel Studios, Lucasfilm, Touchstone Pictures, Hollywood Pictures, Miramax Films, Dimension Films, ESPN Films; unless they are credited as co-production partners), nor any direct - to - video releases, TV films, theatrical re-releases, or films originally released by other non-Disney studios.
when does rainy season start in the caribbean
Wet season - wikipedia The wet season, sometimes called the rainy season or monsoon season, is the time of year when most of a region 's average annual rainfall occurs. Generally the season lasts at least a month. The term "green season '' is also sometimes used as a euphemism by tourist authorities. Areas with wet seasons are dispersed across portions of the tropics and subtropics. Under the Köppen climate classification, for tropical climates, a wet season month is defined as a month where average precipitation is 60 millimetres (2.4 in) or more. In contrast to areas with savanna climates and monsoon regimes, Mediterranean climates have wet winters and dry summers. Dry and rainy months are characteristic of tropical seasonal forests: in contrast to tropical rainforests, which do not have dry or wet seasons, since their rainfall is equally distributed throughout the year. Some areas with pronounced rainy seasons will see a break in rainfall mid-season, when the intertropical convergence zone or monsoon trough moves to higher latitudes in the middle of the warm season. When the wet season occurs during a warm season, or summer, precipitation falls mainly during the late afternoon and early evening. In the wet season, air quality improves, fresh water quality improves, and vegetation grows substantially, leading to crop yields late in the season. Rivers overflow their banks, and some animals retreat to higher ground. Soil nutrients diminish and erosion increases. The incidence of malaria increases in areas where the rainy season coincides with high temperatures, particularly in tropical areas. Some animals have adaptation and survival strategies for the wet season. Often, the previous dry season leads to food shortages in the wet season, as the crops have yet to mature. In areas where the heavy rainfall is associated with a wind shift, the wet season is known as the monsoon.
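The Köppen definition above reduces to a simple monthly filter. A minimal sketch in Python -- the rainfall figures are invented for illustration; only the 60 mm threshold comes from the text:

```python
KOPPEN_WET_MM = 60  # Köppen threshold: >= 60 mm average monthly precipitation

def wet_season_months(monthly_mm):
    """Return the month numbers (1-12) meeting the Köppen wet-month threshold."""
    return [month for month, mm in enumerate(monthly_mm, start=1)
            if mm >= KOPPEN_WET_MM]

# Hypothetical monthly averages (mm) for a tropical savanna station
rainfall = [10, 5, 20, 45, 90, 160, 210, 230, 180, 95, 30, 12]
print(wet_season_months(rainfall))  # months 5-10 (May-October) reach 60 mm
```

The contiguous run of qualifying months is what the article calls the wet season; a station with two separated runs would correspond to the two - wet - season regimes discussed below.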
Rainfall in the wet season is mainly due to daytime heating which leads to diurnal thunderstorm activity within a pre-existing moist airmass, so the rain mainly falls in late afternoon and early evening in savannah and monsoon regions. Further, much of the total rainfall each day occurs in the first minutes of the downpour, before the storms mature into their stratiform stage. Most places have only one wet season, but areas of the tropics can have two wet seasons, because the monsoon trough, or Intertropical Convergence Zone, can pass over locations in the tropics twice per year. However, since rain forests have rainfall spread evenly through the year, they do not have a wet season. It is different for places with a Mediterranean climate. In the western United States, during the cold season from September -- May, extratropical cyclones from the Pacific Ocean move inland into the region due to a southward migration of the jet stream during the cold season. This shift in the jet stream brings much of the annual precipitation to the region, and sometimes also brings heavy rain and strong low pressure systems. The peninsula of Italy has weather very similar to the western United States in this regard. Areas with a savanna climate in Sub-Saharan Africa, such as Ghana, Burkina Faso, Darfur, Eritrea, Ethiopia, and Botswana have a distinct rainy season. Also within the savanna climate regime, Florida and South Texas have a rainy season. Monsoon regions include the Indian subcontinent, Southeast Asia (including Indonesia and Philippines), northern sections of Australia 's North, Polynesia, Central America, western and southern Mexico, the Desert Southwest of the United States, southern Guyana, portions of northeast Brazil. Northern Guyana has two wet seasons: one in early spring and the other in early winter. In western Africa, there are two rainy seasons across southern sections, but only one across the north. 
Within the Mediterranean climate regime, the west coast of the United States and the Mediterranean coastline of Italy, Greece, and Turkey experience a wet season in the winter months. Similarly, the wet season in the Negev desert of Israel extends from October through May. At the boundary between the Mediterranean and monsoon climates lies the Sonoran desert, which receives the two rainy seasons associated with each climate regime. The wet season is known by many different local names throughout the world. For example, in Mexico it is known as "storm season ''. Different names are given to the various short "seasons '' of the year by the Aboriginal tribes of Northern Australia: the wet season typically experienced there from December to March is called Gudjewg. The precise meaning of the word is disputed, although it is widely accepted to relate to the severe thunderstorms, flooding, and abundant vegetation growth commonly experienced at this time. In tropical areas, when the monsoon arrives, daytime high temperatures drop and overnight low temperatures increase, thus reducing diurnal temperature variation. During the wet season, a combination of heavy rainfall and, in some places such as Hong Kong, an onshore wind improves air quality. In Brazil, the wet season is correlated with weaker trade winds off the ocean. The pH level of water becomes more balanced due to the charging of local aquifers during the wet season. Water also softens, as the concentration of dissolved materials reduces during the rainy season. Erosion is also increased during rainy periods. Arroyos that are dry at other times of the year fill with runoff, in some cases with water as deep as 10 feet (3.0 m). Leaching of soils during periods of heavy rainfall depletes nutrients. The higher runoff from land masses affects nearby ocean areas, which are more stratified, or less mixed, due to stronger surface currents forced by the heavy rainfall runoff.
High rainfall can cause widespread flooding, which can lead to landslides and mudflows in mountainous areas. Such floods cause rivers to burst their banks and submerge homes. The Ghaggar - Hakra River, which only flows during India 's monsoon season, can flood and severely damage local crops. Floods can be exacerbated by fires that occurred during the previous dry season, which cause sandy or loamy soils to become hydrophobic, or water repellent. Governments may help people deal with wet season floods in various ways. Flood plain mapping identifies which areas are more prone to flooding. Information on controlling erosion is also provided through outreach by telephone or over the internet. The wet season is the main period of vegetation growth within the Savanna climate regime. However, this also means that the wet season is a time of food shortages before crops reach their full maturity. This causes seasonal weight changes for people in developing countries, with a drop occurring during the wet season until the time of the first harvest, when weights rebound. Malaria incidence increases during periods of high temperature and heavy rainfall. Cows calve, or give birth, at the beginning of the wet season. The onset of the rainy season signals the departure of the monarch butterfly from Mexico. Tropical species of butterflies show larger dot markings on their wings to fend off possible predators and are more active during the wet season than the dry season. Within the tropics and warmer areas of the subtropics, decreased salinity of near shore wetlands due to the rains causes an increase in crocodile nesting. Other species, such as the arroyo toad, spawn within a couple of months after the seasonal rains. Armadillos and rattlesnakes seek higher ground.
where do states get their power in the constitution
States ' rights - wikipedia In American political discourse, states ' rights are political powers held for the state governments rather than the federal government according to the United States Constitution, reflecting especially the enumerated powers of Congress and the Tenth Amendment. The enumerated powers that are listed in the Constitution include exclusive federal powers, as well as concurrent powers that are shared with the states, and all of those powers are contrasted with the reserved powers -- also called states ' rights -- that only the states possess. The balance of federal powers and those powers held by the states as defined in the Supremacy Clause of the U.S. Constitution was first addressed in the case of McCulloch v. Maryland (1819). The Court 's decision by Chief Justice John Marshall asserted that the laws adopted by the federal government, when exercising its constitutional powers, are generally paramount over any conflicting laws adopted by state governments. After McCulloch, the primary legal issues in this area concerned the scope of Congress ' constitutional powers, and whether the states possess certain powers to the exclusion of the federal government, even if the Constitution does not explicitly limit them to the states. In The Federalist Papers, ratification proponent Alexander Hamilton explained the limitations the Supremacy Clause placed on the proposed federal government, describing that acts of the federal government were binding on the states and the people therein only if the act was in pursuance of constitutionally granted powers, and characterizing acts which exceeded those bounds as "void and of no force ''. In the period between the American Revolution and the ratification of the United States Constitution, the states had united under a much weaker federal government and a much stronger state and local government, pursuant to the Articles of Confederation. 
The Articles gave the central government very little, if any, authority to overrule individual state actions. The Constitution subsequently strengthened the central government, authorizing it to exercise powers deemed necessary to exercise its authority, with an ambiguous boundary between the two co-existing levels of government. In the event of any conflict between state and federal law, the Constitution resolved the conflict via the Supremacy Clause of Article VI in favor of the federal government, which declares federal law the "supreme Law of the Land '' and provides that "the Judges in every State shall be bound thereby, any Thing in the Constitution or Laws of any State to the Contrary notwithstanding. '' However, the Supremacy Clause only applies if the federal government is acting in pursuit of its constitutionally authorized powers, as noted by the phrase "in pursuance thereof '' in the actual text of the Supremacy Clause itself. When the Federalists passed the Alien and Sedition Acts in 1798, Thomas Jefferson and James Madison secretly wrote the Kentucky and Virginia Resolutions, which provided a classic statement in support of states ' rights and called on state legislatures to nullify unconstitutional federal laws. (The other states, however, did not follow suit and several rejected the notion that states could nullify federal law.) According to this theory, the federal union is a voluntary association of states, and if the central government goes too far each state has the right to nullify that law. 
As Jefferson said in the Kentucky Resolutions: Resolved, that the several States composing the United States of America, are not united on the principle of unlimited submission to their general government; but that by compact under the style and title of a Constitution for the United States and of amendments thereto, they constituted a general government for special purposes, delegated to that government certain definite powers, reserving each State to itself, the residuary mass of right to their own self - government; and that whensoever the general government assumes undelegated powers, its acts are unauthoritative, void, and of no force: That to this compact each State acceded as a State, and is an integral party, its co-States forming, as to itself, the other party... each party has an equal right to judge for itself, as well of infractions as of the mode and measure of redress. The Kentucky and Virginia Resolutions, which became part of the Principles of ' 98, along with the supporting Report of 1800 by Madison, became founding documents of Jefferson 's Democratic - Republican Party. Gutzman argued that Governor Edmund Randolph designed the protest in the name of moderation. Gutzman argues that in 1798, Madison espoused states ' rights to defeat national legislation that he maintained was a threat to republicanism. During 1831 -- 33, the South Carolina Nullifiers quoted Madison in their defense of states ' rights. But Madison feared that the growing support for this doctrine would undermine the union and argued that by ratifying the Constitution states had transferred their sovereignty to the federal government. The most vociferous supporters of states ' rights, such as John Randolph of Roanoke, were called "Old Republicans '' into the 1820s and 1830s. Tate (2011) undertook a literary criticism of a major book by John Taylor of Caroline, New Views of the Constitution of the United States. 
Tate argues it is structured as a forensic historiography modeled on the techniques of 18th - century Whig lawyers. Taylor believed that evidence from American history gave proof of state sovereignty within the union, against the arguments of nationalists such as U.S. Chief Justice John Marshall. Another states ' rights dispute occurred over the War of 1812. At the Hartford Convention of 1814 -- 15, New England Federalists voiced opposition to President Madison 's war, and discussed secession from the Union. In the end they stopped short of calls for secession, but when their report appeared at the same time as news of the great American victory at the Battle of New Orleans, the Federalists were politically ruined. One major and continuous strain on the union, from roughly 1820 through the Civil War, was the issue of trade and tariffs. Heavily dependent upon international trade, the almost entirely agricultural and export - oriented South imported most of its manufactured goods from Europe or obtained them from the North. The North, by contrast, had a growing domestic industrial economy that viewed foreign trade as competition. Trade barriers, especially protective tariffs, were viewed as harmful to the Southern economy, which depended on exports. In 1828, Congress passed protective tariffs that benefited trade in the northern states but were detrimental to the South. Southerners vocally expressed their tariff opposition in documents such as the South Carolina Exposition and Protest in 1828, written in response to the "Tariff of Abominations. '' Exposition and Protest was the work of South Carolina senator and former vice president John C. Calhoun, formerly an advocate of protective tariffs and internal improvements at federal expense. South Carolina 's Nullification Ordinance declared that both the tariff of 1828 and the tariff of 1832 were null and void within the state borders of South Carolina. This action initiated the Nullification Crisis. 
Passed by a state convention on November 24, 1832, it led, on December 10, to President Andrew Jackson 's proclamation against South Carolina; acting under color of national authority, Jackson sent a naval flotilla and threatened to send federal troops to enforce the tariffs, claiming in his 1832 Proclamation Regarding Nullification that "our social compact in express terms declares, that the laws of the United States, its Constitution, and treaties made under it, are the supreme law of the land '' and for greater caution adds, "that the judges in every State shall be bound thereby, anything in the Constitution or laws of any State to the contrary notwithstanding. '' Over the following decades, another central dispute over states ' rights moved to the forefront. The issue of slavery polarized the union, with the Jeffersonian principles often being used by both sides -- anti-slavery Northerners, and Southern slaveholders and secessionists -- in debates that ultimately led to the American Civil War. Supporters of slavery often argued that one of the rights of the states was the protection of slave property wherever it went, a position endorsed by the U.S. Supreme Court in the 1857 Dred Scott decision. In contrast, opponents of slavery argued that the non-slave - states ' rights were violated both by that decision and by the Fugitive Slave Law of 1850. Exactly which -- and whose -- states ' rights were the casus belli in the Civil War remain in controversy. A major Southern argument in the 1850s was that banning slavery in the territories discriminated against states that allowed slavery, making them second - class states. In 1857 the Supreme Court sided with the states ' rights supporters, declaring in Dred Scott v. Sandford that Congress had no authority to regulate slavery in the territories. 
Jefferson Davis used the following argument in favor of the equal rights of states: Resolved, That the union of these States rests on the equality of rights and privileges among its members, and that it is especially the duty of the Senate, which represents the States in their sovereign capacity, to resist all attempts to discriminate either in relation to person or property, so as, in the Territories -- which are the common possession of the United States -- to give advantages to the citizens of one State which are not equally secured to those of every other State. Southern states sometimes argued against states ' rights. For example, Texas challenged the right of some northern states to protect fugitive slaves. Economists such as Thomas DiLorenzo and Charles Adams argue that the Southern secession and the ensuing conflict was much more of a fiscal quarrel than a war over slavery. Northern - inspired tariffs benefited Northern interests but were detrimental to Southern interests and were destroying the economy in the South. Such tariffs would be less subject to states ' rights arguments. The historian James McPherson noted that Southerners were inconsistent on the states ' rights issue, and that Northern states tried to protect the rights of their states against the South during the Gag Rule and fugitive slave law controversies. The historian William H. Freehling noted that the South 's argument for a state 's right to secede was different from Thomas Jefferson 's, in that Jefferson based such a right on the unalienable equal rights of man. The South 's version of such a right was modified to be consistent with slavery, and with the South 's blend of democracy and authoritarianism. 
Historian Henry Brooks Adams explains that the anti-slavery North took a consistent and principled stand on states ' rights against federal encroachment throughout its history, while the Southern states, whenever they saw an opportunity to expand slavery and the reach of the slave power, often conveniently forgot the principle of states ' rights -- and fought in favor of federal centralization: Between the slave power and states ' rights there was no necessary connection. The slave power, when in control, was a centralizing influence, and all the most considerable encroachments on states ' rights were its acts. The acquisition and admission of Louisiana; the Embargo; the War of 1812; the annexation of Texas "by joint resolution '' (rather than treaty); the war with Mexico, declared by the mere announcement of President Polk; the Fugitive Slave Law; the Dred Scott decision -- all triumphs of the slave power -- did far more than either tariffs or internal improvements, which in their origin were also southern measures, to destroy the very memory of states ' rights as they existed in 1789. Whenever a question arose of extending or protecting slavery, the slaveholders became friends of centralized power, and used that dangerous weapon with a kind of frenzy. Slavery in fact required centralization in order to maintain and protect itself, but it required to control the centralized machine; it needed despotic principles of government, but it needed them exclusively for its own use. Thus, in truth, states ' rights were the protection of the free states, and as a matter of fact, during the domination of the slave power, Massachusetts appealed to this protecting principle as often and almost as loudly as South Carolina. Sinha and Richards both argue that the South only used states ' rights when they disagreed with a policy. Examples given are a state 's right to engage in slavery or to suppress freedom of speech. 
They argue that it was instead the result of the increasing cognitive dissonance in the minds of Northerners and (some) Southern non-slaveowners between the ideals that the United States was founded upon and identified itself as standing for, as expressed in the Declaration of Independence, the Constitution of the United States, and the Bill of Rights, and the reality that the slave power represented: what they describe as an anti-democratic, counter-republican, oligarchic, despotic, authoritarian, if not totalitarian, movement for ownership of human beings as the personal chattels of the slaver. As this cognitive dissonance increased, the people of the Northern states, and the Northern states themselves, became increasingly inclined to resist the encroachments of the slave power upon their states ' rights and upon the federal government of the United States. The slave power, having failed to maintain its dominance of the federal government through democratic means, sought other means of maintaining that dominance: military aggression, by right of force and coercion, and thus the Civil War occurred. In Texas v. White, 74 U.S. 700 (1869) the Supreme Court ruled that Texas had remained a state ever since it first joined the Union, despite claims to have joined the Confederate States of America; the court further held that the Constitution did not permit states to unilaterally secede from the United States, and that the ordinances of secession, and all the acts of the legislatures within seceding states intended to give effect to such ordinances, were "absolutely null '' under the constitution. A series of Supreme Court decisions developed the state action constraint on the Equal Protection Clause. 
The state action theory weakened the effect of the Equal Protection Clause against state governments, in that the clause was held not to apply to unequal protection of the laws caused in part by complete lack of state action in specific cases, even if state actions in other instances form an overall pattern of segregation and other discrimination. The separate but equal theory further weakened the effect of the Equal Protection Clause against state governments. With United States v. Cruikshank (1876), a case which arose out of the Colfax Massacre of blacks contesting the results of a Reconstruction era election, the Supreme Court held that the Fourteenth Amendment did not apply the First Amendment or Second Amendment to state governments in respect to their own citizens, only to acts of the federal government. In McDonald v. City of Chicago (2010), the Supreme Court held that the Second Amendment right of an individual to "keep and bear arms '' is incorporated by the Due Process Clause of the Fourteenth Amendment, and therefore fully applicable to states and local governments. Furthermore, United States v. Harris (1883) held that the Equal Protection Clause did not apply to a prison lynching, on the basis that the Fourteenth Amendment applied only to state acts, not to individual criminal actions. In the Civil Rights Cases (1883), the Supreme Court allowed segregation by striking down the Civil Rights Act of 1875, a statute that prohibited racial discrimination in public accommodation. It again held that the Equal Protection Clause applied only to acts done by states, not to those done by private individuals, and as the Civil Rights Act of 1875 applied to private establishments, the Court said, it exceeded congressional enforcement power under Section 5 of the Fourteenth Amendment. By the beginning of the 20th century, greater cooperation began to develop between the state and federal governments and the federal government began to accumulate more power. 
Early in this period, a federal income tax was imposed, first during the Civil War as a war measure and then permanently with the Sixteenth Amendment in 1913. Before this, the states played a larger role in government. States ' rights were affected by the fundamental alteration of the federal government resulting from the Seventeenth Amendment, depriving state governments of an avenue of control over the federal government via the representation of each state 's legislature in the U.S. Senate. This change has been described by legal critics as the loss of a check and balance on the federal government by the states. Following the Great Depression, the New Deal and then World War II saw further growth in the authority and responsibilities of the federal government. The case of Wickard v. Filburn allowed the federal government to enforce the Agricultural Adjustment Act, providing subsidies to farmers for limiting their crop yields, on the grounds that agriculture affected interstate commerce and thus came under the jurisdiction of the Commerce Clause, even when a farmer grew his crops not for sale but for his own private use. After World War II, President Harry Truman supported a civil rights bill and desegregated the military. The reaction was a split in the Democratic Party that led to the formation of the "States ' Rights Democratic Party '' -- better known as the Dixiecrats -- led by Strom Thurmond. Thurmond ran as the States ' Rights candidate for President in the 1948 election, losing to Truman. During the 1950s and 1960s, the Civil Rights Movement was confronted by proponents of racial segregation and Jim Crow laws in the Southern states, who denounced federal interference in these state - level laws as an assault on states ' rights. Though Brown v. Board of Education (1954) overruled the Plessy v. Ferguson (1896) decision, the Fourteenth and Fifteenth amendments were largely inactive in the South until the Civil Rights Act of 1964 (42 U.S.C. 
§ 21) and the Voting Rights Act of 1965. Several states passed Interposition Resolutions to declare that the Supreme Court 's ruling in Brown usurped states ' rights. States ' rights advocates also opposed the voting rights campaign at the Edmund Pettus Bridge, part of the Selma to Montgomery marches, which resulted in the Voting Rights Act of 1965. In 1964, the issue of fair housing in California involved the boundary between state laws and federalism. California Proposition 14 overturned the Rumford Fair Housing Act in California and allowed discrimination in any type of housing sale or rental. Martin Luther King, Jr. and others saw this as a backlash against civil rights. Actor Ronald Reagan gained popularity by supporting Proposition 14, and was later elected governor of California. The U.S. Supreme Court 's Reitman v. Mulkey decision overturned Proposition 14 in 1967 in favor of the Equal Protection Clause of the Fourteenth Amendment. Conservative historians Thomas E. Woods, Jr. and Kevin R.C. Gutzman argue that when politicians come to power they exercise all the power they can get, in the process trampling states ' rights. Gutzman argues that the Kentucky and Virginia resolutions of 1798 by Jefferson and Madison were not only responses to immediate threats but were legitimate responses based on the long - standing principles of states ' rights and strict adherence to the Constitution. Another concern is that on more than one occasion, the federal government has threatened to withhold highway funds from states which did not pass certain articles of legislation. Any state which lost highway funding for any extended period would face financial impoverishment, infrastructure collapse or both. 
Although the first such action (the enactment of a national speed limit) was directly related to highways and done in the face of a fuel shortage, most subsequent actions have had little or nothing to do with highways and have not been done in the face of any compelling national crisis. An example of this would be the federally mandated drinking age of 21, upheld in South Dakota v. Dole. Critics of such actions feel that when the federal government does this they upset the traditional balance between the states and the federal government. More recently, the issue of states ' rights has come to a head when the Base Realignment and Closure Commission (BRAC) recommended that Congress and the Department of Defense implement sweeping changes to the National Guard by consolidating some Guard installations and closing others. These recommendations in 2005 drew strong criticism from many states, and several states sued the federal government on the basis that Congress and the Pentagon would be violating states ' rights should they force the realignment and closure of Guard bases without the prior approval of the governors from the affected states. After Pennsylvania won a federal lawsuit to block the deactivation of the 111th Fighter Wing of the Pennsylvania Air National Guard, defense and Congressional leaders chose to try to settle the remaining BRAC lawsuits out of court, reaching compromises with the plaintiff states. Current states ' rights issues include the death penalty, assisted suicide, same - sex marriage, gun control, and cannabis, the last of which is in direct violation of federal law. In Gonzales v. Raich, the Supreme Court ruled in favor of the federal government, permitting the Drug Enforcement Administration (DEA) to arrest medical marijuana patients and caregivers. In Gonzales v. Oregon, the Supreme Court ruled the practice of physician - assisted suicide in Oregon is legal. In Obergefell v. 
Hodges, the Supreme Court ruled that states could not refuse to recognize same - sex marriages. In District of Columbia v. Heller (2008), the United States Supreme Court ruled that gun ownership is an individual right under the Second Amendment of the United States Constitution, and the District of Columbia could not completely ban gun ownership by law - abiding private citizens. Two years later, the court ruled that the Heller decision applied to states and territories via the Second and 14th Amendments in McDonald v. Chicago, stating that states, territories and political divisions thereof, could not impose total bans on gun ownership by law - abiding citizens. These concerns have led to a movement sometimes called the State Sovereignty movement or "10th Amendment Sovereignty Movement ''. Some, such as former representative Ron Paul (R - TX), have proposed repealing the 17th Amendment of the United States Constitution. In 2009 -- 2010 thirty - eight states introduced resolutions to reaffirm the principles of sovereignty under the Constitution and the 10th Amendment; 14 states have passed the resolutions. These non-binding resolutions, often called "state sovereignty resolutions '', do not carry the force of law. Instead, they are intended to be a statement to demand that the federal government halt its practices of assuming powers and imposing mandates upon the states for purposes not enumerated by the Constitution. The Supreme Court 's University of Alabama v. Garrett (2001) and Kimel v. Florida Board of Regents (2000) decisions allowed states to use a rational basis review for discrimination against the aged and disabled, arguing that these types of discrimination were rationally related to a legitimate state interest, and that no "razorlike precision '' was needed. The Supreme Court 's United States v. Morrison (2000) decision limited the ability of rape victims to sue their attackers in federal court. Chief Justice William H. 
Rehnquist explained that "States historically have been sovereign '' in the area of law enforcement, which in the Court 's opinion required narrow interpretations of the Commerce Clause and Fourteenth Amendment. Kimel, Garrett and Morrison indicated that the Court 's previous decisions in favor of enumerated powers and limits on Congressional power over the states, such as United States v. Lopez (1995), Seminole Tribe v. Florida (1996) and City of Boerne v. Flores (1997) were more than one - time flukes. In the past, Congress relied on the Commerce Clause and the Equal Protection Clause for passing civil rights bills, including the Civil Rights Act of 1964. Lopez limited the Commerce Clause to things that directly affect interstate commerce, which excludes issues like gun control laws, hate crimes, and other crimes that affect commerce but are not directly related to commerce. Seminole reinforced the "sovereign immunity of states '' doctrine, which makes it difficult to sue states for many things, especially civil rights violations. The Flores "congruence and proportionality '' requirement prevents Congress from going too far in requiring states to comply with the Equal Protection Clause, which replaced the ratchet theory advanced in Katzenbach v. Morgan (1966). The ratchet theory held that Congress could ratchet up civil rights beyond what the Court had recognized, but that Congress could not ratchet down judicially recognized rights. An important precedent for Morrison was United States v. Harris (1883), which ruled that the Equal Protection Clause did not apply to a prison lynching because the state action doctrine applies Equal Protection only to state action, not private criminal acts. Since the ratchet principle was replaced with the "congruence and proportionality '' principle by Flores, it was easier to revive older precedents for preventing Congress from going beyond what Court interpretations would allow. 
Critics such as Associate Justice John Paul Stevens accused the Court of judicial activism (i.e., interpreting law to reach a desired conclusion). The tide against federal power in the Rehnquist court was stopped in the case of Gonzales v. Raich, 545 U.S. 1 (2005), in which the court upheld the federal power to prohibit medicinal use of cannabis even if states have permitted it. Rehnquist himself was a dissenter in the Raich case. Since the 1940s, the term "states ' rights '' has often been considered a loaded term because of its use in opposition to federally mandated racial desegregation and more recently, same - sex marriage. During the heyday of the civil rights movement, defenders of segregation used the term "states ' rights '' as a code word -- in what is now referred to as dog - whistle politics -- political messaging that appears to mean one thing to the general population but has an additional, different or more specific resonance for a targeted subgroup. In 1948 it was the official name of the "Dixiecrat '' party led by white supremacist presidential candidate Strom Thurmond. Democratic governor George Wallace of Alabama, who famously declared in his inaugural address in 1963, "Segregation now! Segregation tomorrow! Segregation forever! '' -- later remarked that he should have said, "States ' rights now! States ' rights tomorrow! States ' rights forever! '' Wallace, however, claimed that segregation was but one issue symbolic of a larger struggle for states ' rights; in that view, which some historians dispute, his replacement of segregation with states ' rights would be more of a clarification than a euphemism. In 2010, Texas governor Rick Perry 's use of the expression "states ' rights '', to some, was reminiscent of "an earlier era when it was a rallying cry against civil rights. '' During an interview with The Dallas Morning News, Perry made it clear that he supports the end of segregation, including passage of the Civil Rights Act. 
The Texas president of the NAACP, Gary Bledsoe, stated that he understood that Perry was not speaking of "states ' rights '' in a racial context, but others still felt offended by the term because of its past misuse.
american tennis player who holds the most grand slams
List of Grand Slam men 's singles champions - wikipedia This article lists the men 's singles champions of the Grand Slam tennis tournaments. Several major changes in history have affected the number of titles that have been won by various players. These have included the opening of the French national championships to international players in 1925, the elimination of the challenge round in 1922, and the admission of professional players in 1968 (the start of the Open era). Note: All of these tournaments have been listed since they began, rather than when they officially became majors. The Australian and US tournaments have only been officially regarded as majors by the ILTF (now the ITF) since 1924 (though many regarded the US Championships as a major before then). The French Championships have only been a major since 1925 (when it became open to all amateurs internationally). Before 1924 (from 1912 / 1913 to 1923), there were three official majors: Wimbledon, the World Hard Court Championships (played on clay) and the World Covered Court Championships (played on an indoor wood surface). These players won all four majors. The year listed is the year the player first won each tournament; the last one is marked in bold. The age listed is the age at the end of that last tournament, i.e., the age at which the player completed his Career Grand Slam.
when was congress supposed to meet according to the original constitution
Article One of the United States Constitution - wikipedia Article One of the United States Constitution establishes the legislative branch of the federal government, the United States Congress. The Congress is a bicameral legislature consisting of a House of Representatives and a Senate. All legislative Powers herein granted shall be vested in a Congress of the United States, which shall consist of a Senate and House of Representatives. Section 1 is a vesting clause that bestows federal legislative power exclusively on Congress. Similar clauses are found in Articles II and III. The former confers executive power upon the President alone, and the latter grants judicial power solely to the federal judiciary. These three articles create a separation of powers among the three branches of the federal government. This separation of powers, by which each department may exercise only its own constitutional powers and no others, is fundamental to the idea of a limited government accountable to the people. The separation of powers principle is particularly noteworthy in regard to the Congress. The Constitution declares that the Congress may exercise only those legislative powers "herein granted" within Article I (as later limited by the Tenth Amendment). It also, by implied extension, prohibits Congress from delegating its legislative authority to either of the other branches of government, a rule known as the nondelegation doctrine. However, the Supreme Court has ruled that Congress does have latitude to delegate regulatory powers to executive agencies as long as it provides an "intelligible principle" which governs the agency's exercise of the delegated regulatory authority. That the power assigned to each branch must remain with that branch, and may be expressed only by that branch, is central to the theory.
The nondelegation doctrine is primarily used now as a way of interpreting a congressional delegation of authority narrowly, in that the courts presume Congress intended only to delegate that which it certainly could have, unless it clearly demonstrates it intended to "test the waters" of what the courts would allow it to do. Although not specifically mentioned in the Constitution, Congress has also long asserted the power to investigate and the power to compel cooperation with an investigation. The Supreme Court has affirmed these powers as an implication of Congress's power to legislate. Since the power to investigate is an aspect of Congress's power to legislate, it is as broad as Congress's powers to legislate. However, it is also limited to inquiries that are "in aid of the legislative function"; Congress may not "expose for the sake of exposure". It is uncontroversial that a proper subject of Congress's investigation power is the operations of the federal government, but Congress's ability to compel the submission of documents or testimony from the President or his subordinates is often discussed and sometimes controversial (see executive privilege), although not often litigated. As a practical matter, the limitation of Congress's ability to investigate only for a proper purpose ("in aid of" its legislative powers) functions as a limit on Congress's ability to investigate the private affairs of individual citizens; matters that simply demand action by another branch of government, without implicating an issue of public policy necessitating legislation by Congress, must be left to those branches due to the doctrine of separation of powers. The courts are highly deferential to Congress's exercise of its investigation powers, however. Congress has the power to investigate that which it could regulate, and the courts have interpreted Congress's regulatory powers broadly since the Great Depression.
The House of Representatives shall be composed of Members chosen every second Year by the People of the several States, and the Electors in each State shall have the Qualifications requisite for Electors of the most numerous Branch of the State Legislature. Section Two provides for the election of the House of Representatives every second year. Since Representatives are to be "chosen... by the People", State Governors are not allowed to appoint temporary replacements when vacancies occur in a state's delegation to the House of Representatives; instead, the Governor of the state is required by clause 4 to issue a writ of election calling a special election to fill the vacancy. At the time of its creation, the Constitution did not explicitly give citizens an inherent right to vote. Rather, it provided that those qualified to vote in elections for the largest chamber of a state's legislature may vote in Congressional (House of Representatives) elections. Since the Civil War, several constitutional amendments have been enacted that have curbed the states' broad powers to set voter qualification standards. Though never enforced, clause 2 of the Fourteenth Amendment provides that when the right to vote at any election for the choice of electors for President and Vice President of the United States, Representatives in Congress, the Executive and Judicial officers of a State, or the members of the Legislature thereof, is denied to any of the male inhabitants of such State, being twenty-one years of age, and citizens of the United States, or in any way abridged, except for participation in rebellion, or other crime, the basis of representation therein shall be reduced in the proportion which the number of such male citizens shall bear to the whole number of male citizens twenty-one years of age in such State. The Fifteenth Amendment prohibits the denial of the right to vote based on race, color, or previous condition of servitude.
The Nineteenth Amendment prohibits the denial of the right to vote based on sex. The Twenty-fourth Amendment prohibits the revocation of voting rights due to the non-payment of a poll tax. The Twenty-sixth Amendment prohibits the denial of the right of US citizens, eighteen years of age or older, to vote on account of age. Moreover, since the Supreme Court has recognized voting as a fundamental right, the Equal Protection Clause places very tight limitations (albeit with uncertain limits) on the states' ability to define voter qualifications; it is fair to say that qualifications beyond citizenship, residency, and age are usually questionable. In the 1960s, the Supreme Court started to view voting as a fundamental right covered by the Equal Protection Clause of the Fourteenth Amendment. In a dissenting opinion of a 1964 Supreme Court case involving reapportionment in the Alabama state legislature, Associate Justice John Marshall Harlan II included Minor in a list of past decisions about voting and apportionment which were no longer being followed. In Oregon v. Mitchell (1970), the Supreme Court held that the Qualifications Clause did not prevent Congress from overriding state-imposed minimum age restrictions for voters in Congressional elections. Since clause 3 provides that Members of the House of Representatives are apportioned state-by-state and that each state is guaranteed at least one Representative, exact population equality between all districts is not guaranteed and, in fact, is currently impossible, because while the size of the House of Representatives is fixed at 435, several states had less than 1/435 of the national population at the time of the last reapportionment in 2010.
However, the Supreme Court has interpreted the provision of Clause One that Representatives shall be elected "by the People" to mean that, in those states with more than one member of the House of Representatives, each congressional election district within the state must have nearly identical populations. No Person shall be a Representative who shall not have attained to the Age of twenty five Years, and been seven Years a Citizen of the United States, and who shall not, when elected, be an Inhabitant of that State in which he shall be chosen. The Constitution provides three requirements for Representatives: A Representative must be at least 25 years old, must be an inhabitant of the state in which he or she is elected, and must have been a citizen of the United States for the previous seven years. There is no requirement that a Representative reside within the district that he or she represents; although this is usually the case, there have been occasional exceptions. The Supreme Court has interpreted the Qualifications Clause as an exclusive list of qualifications that cannot be supplemented by a house of Congress exercising its Section 5 authority to "judge... the... qualifications of its own members" or by a state in its exercise of its Section 4 authority to prescribe the "times, places and manner of holding elections for Senators and Representatives". The Supreme Court, as well as other federal courts, has repeatedly barred states from imposing additional restrictions, such as term limits on members of Congress, allowing members of Congress to be subject to recall elections, or requiring that Representatives live in the congressional district that they represent. A 2002 Congressional Research Service report also found that no state could implement a qualification that a Representative not be a convicted felon or incarcerated.
However, the United States Supreme Court has ruled that certain ballot access requirements, such as filing fees and submitting a certain number of valid petition signatures, do not constitute additional qualifications, and thus few Constitutional restrictions exist as to how harsh ballot access laws can be. Representatives and direct Taxes shall be apportioned among the several States which may be included within this Union, according to their respective Numbers, which shall be determined by adding to the whole Number of free Persons, including those bound to Service for a Term of Years, and excluding Indians not taxed, three fifths of all other Persons. The actual Enumeration shall be made within three Years after the first Meeting of the Congress of the United States, and within every subsequent Term of ten Years, in such Manner as they shall by Law direct. The number of Representatives shall not exceed one for every thirty Thousand, but each State shall have at Least one Representative; and until such enumeration shall be made, the State of New Hampshire shall be entitled to chuse (sic) three, Massachusetts eight, Rhode-Island and Providence Plantations one, Connecticut five, New-York six, New Jersey four, Pennsylvania eight, Delaware one, Maryland six, Virginia ten, North Carolina five, South Carolina five, and Georgia three. After much debate, the framers of the Constitution decided to make population the basis of apportioning the seats in the House of Representatives and the tax liability among the states. To facilitate this, the Constitution mandates that a census be conducted every ten years to determine the population of each state and of the nation as a whole, and establishes a rule for who shall be counted or excluded from the count. As the new form of government would become operational prior to the completion of a national census, the Constitution also provides for a temporary apportionment of seats.
Originally, the population of each state and of the nation as a whole was ascertained by adding to the whole number of free Persons three-fifths the number of all other Persons (i.e., slaves), but excluding non-taxed Native Americans. This Constitutional rule, known as the three-fifths compromise, was a compromise between Southern and Northern states in which three-fifths of the population of slaves would be counted for enumeration purposes and for the apportionment of seats in the House of Representatives and of taxes among the states. It was, according to Supreme Court Justice Joseph Story (writing in 1833), a "matter of compromise and concession, confessedly unequal in its operation, but a necessary sacrifice to that spirit of conciliation, which was indispensable to the union of states having a great diversity of interests, and physical condition, and political institutions". Following the completion of each census, Congress is empowered to use the aggregate population in all the states (according to the prevailing Constitutional rule for determining population) to determine the relative population of each state to the population of the whole, and, based on its calculations, to establish the appropriate size of the House and to allocate a particular number of representatives to each state according to its share of the national population. Since enactment of the Reapportionment Act of 1929, a constant 435 House seats have been apportioned among the states according to each census, and determining the size of the House is not presently part of the apportionment process. With one exception, the apportionment of 1842, the House of Representatives had been enlarged by various degrees from sixty-five members in 1788 to 435 members by 1913.
The determination of size was made based on the aggregate national population, so long as the size of the House did not exceed one member for every 30,000 of the country's total population nor the size of any state's delegation exceed one for every 30,000 of that state's population. With the size of the House still fixed at 435, the current ratio, as of the 2010 Census, is around one Representative for every 700,000 citizens. Although the first sentence in this clause originally concerned apportionment of both House seats and taxes among the several states, the Fourteenth Amendment sentence that replaced it in 1868 mentioned only the apportionment of House seats. Even so, the constraint placed upon Congress's taxation power remained, as the restriction was reiterated in Article 1, Section 9, Clause 4. The amount of direct taxes that could be collected by the federal government from the people in any State would still be tied directly to that state's share of the national population. Due to this restriction, application of the income tax to income derived from real estate, and specifically income in the form of dividends from personal property ownership such as stock shares, was found to be unconstitutional because it was not apportioned among the states; that is to say, there was no guarantee that a State with 10% of the country's population paid 10% of those income taxes collected, because Congress had not fixed an amount of money to be raised and apportioned it between the States according to their respective shares of the national population. To permit the levying of such an income tax, Congress proposed and the states ratified the Sixteenth Amendment, which removed the restriction by specifically providing that Congress could levy a tax on income "from whatever source derived" without it being apportioned among the States or otherwise based on a State's share of the national population.
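The allocation step described above (a fixed 435 seats divided among the states by their shares of the national population, with each state guaranteed at least one Representative) has, since 1941, been carried out with the "method of equal proportions", also known as the Huntington-Hill method. A minimal sketch, using hypothetical state populations rather than real census counts:

```python
import heapq
from math import sqrt

def apportion(populations, total_seats=435):
    """Apportion House seats by the method of equal proportions."""
    # Each state is constitutionally guaranteed one seat.
    seats = {state: 1 for state in populations}
    # Remaining seats go, one at a time, to the state with the highest
    # priority value pop / sqrt(n * (n + 1)), where n is its current count.
    heap = [(-pop / sqrt(1 * 2), state) for state, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(total_seats - len(populations)):
        _, state = heapq.heappop(heap)
        seats[state] += 1
        n = seats[state]
        heapq.heappush(heap, (-populations[state] / sqrt(n * (n + 1)), state))
    return seats

# Hypothetical three-state union sharing 10 seats:
print(apportion({'A': 21_000_000, 'B': 10_000_000, 'C': 1_000_000}, 10))
# -> {'A': 6, 'B': 3, 'C': 1}
```

Note how the guaranteed minimum illustrates the point made earlier about exact district equality being impossible: state C keeps a seat even though its population share (about 3%) is well below one tenth of the total.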
When vacancies happen in the Representation from any State, the Executive Authority thereof shall issue Writs of Election to fill such Vacancies. Section Two, Clause Four, provides that when vacancies occur in the House of Representatives, it is not the job of the House of Representatives to arrange for a replacement, but the job of the State whose vacant seat is up for refilling. Moreover, the State Governor may not appoint a temporary replacement, but must instead arrange for a special election to fill the vacancy. The original qualifications and procedures for holding that election are still valid. The House of Representatives shall chuse (sic) their Speaker and other Officers; and shall have the sole Power of Impeachment. Section Two further provides that the House of Representatives may choose its Speaker and its other officers. Though the Constitution does not mandate it, every Speaker has been a member of the House of Representatives. The Speaker rarely presides over routine House sessions, choosing instead to deputize a junior member to accomplish the task. Finally, Section Two grants to the House of Representatives the sole power of impeachment. Although the Supreme Court has not had an occasion to interpret this specific provision, the Court has suggested that the grant to the House of the "sole" power of impeachment makes the House the exclusive interpreter of what constitutes an impeachable offense. This power, which is analogous to the bringing of criminal charges by a grand jury, has been used only rarely. The House of Representatives has initiated impeachment proceedings 62 times since 1789, and nineteen federal officials have been formally impeached as a result, including: two Presidents (Andrew Johnson and Bill Clinton), one Cabinet Secretary (William W. Belknap), one Senator (William Blount), one Supreme Court Associate Justice (Samuel Chase), and fourteen federal judges.
The Constitution does not specify how impeachment proceedings are to be initiated. Until the early 20th century, a House member could rise and propose an impeachment, which would then be assigned to a committee for investigation. Presently, it is the House Judiciary Committee that initiates the process and then, after investigating the allegations, prepares recommendations for the whole House's consideration. If the House votes to adopt an impeachment resolution, the Chairman of the Judiciary Committee recommends a slate of "managers", whom the House subsequently approves by resolution. These Representatives subsequently become the prosecution team in the impeachment trial in the Senate (see Section 3, Clause 6 below). The Senate of the United States shall be composed of two Senators from each State, chosen by the Legislature thereof, for six Years; and each Senator shall have one Vote. The first Clause of Section Three provides that each state is entitled to have two Senators, who would be elected by its state legislature (now by the people of each state), serve for staggered six-year terms, and have one vote each. By these provisions, the framers of the Constitution intended to protect the interests of the states as states. This clause has been superseded by the Seventeenth Amendment, ratified in 1913, which, in part, provides as amended, that The Senate of the United States shall be composed of two Senators from each State, elected by the people thereof, for six years; and each Senator shall have one vote. Article Five specifies the means by which the Constitution of the United States can be amended. It ends by temporarily shielding three Article I clauses from being amended. This clause is among them. (The others are the first and fourth clauses in Section 9.) Article Five provides that "no State, without its Consent, shall be deprived of its equal Suffrage in the Senate".
Thus, no individual state may have its individual representation in the Senate adjusted without its consent. That is to say, an amendment that changed this clause to provide that all states would get only one Senator (or three Senators, or any other number) could become valid as part of the Constitution if ratified by three-fourths of the states; however, one that provided for some basis of representation other than strict numerical equality (for example, population, wealth, or land area) would require the unanimous consent of all the states. Denying the states their intended role as joint partners in the federal government by abolishing their equality in the Senate would, according to Chief Justice Salmon P. Chase (in Texas v. White), destroy the grounding of the Union. This Article V provision has been employed by those opposed to contemplated constitutional amendments that would grant the District of Columbia full representation in the Congress without also granting it statehood. Their argument is that an amendment that would allow a non-state district to have two Senators would deprive the states of their equal suffrage in the Senate and would therefore require unanimous ratification by all the states. Whether unanimous consent of the 50 states would be required for such an amendment to become operative remains an unanswered political question. Immediately after they shall be assembled in Consequence of the first Election, they shall be divided as equally as may be into three Classes.
The Seats of the Senators of the first Class shall be vacated at the Expiration of the second Year, of the second Class at the Expiration of the fourth Year, and of the third Class at the Expiration of the sixth Year, so that one third may be chosen every second Year; and if Vacancies happen by Resignation, or otherwise, during the Recess of the Legislature of any State, the Executive thereof may make temporary Appointments until the next Meeting of the Legislature, which shall then fill such Vacancies. After the first group of Senators was elected to the First Congress (1789–1791), the Senators were divided into three "classes" as nearly equal in size as possible, as required by this section. This was done in May 1789 by lot. It was also decided that each state's Senators would be assigned to two different classes. Those Senators grouped in the first class had their term expire after only two years; those Senators in the second class had their term expire after only four years, instead of six. After this, all Senators from those States have been elected to six-year terms, and as new States have joined the Union, their Senate seats have been assigned to two of the three classes, maintaining each grouping as nearly equal in size as possible. In this way, election is staggered; approximately one-third of the Senate is up for re-election every two years, but the entire body is never up for re-election in the same year (as contrasted with the House, where its entire membership is up for re-election every two years). As originally established, Senators were elected by the Legislature of the State they represented in the Senate. If a senator died, resigned, or was expelled, the legislature of the state would appoint a replacement to serve out the remainder of the senator's term. If the State Legislature was not in session, its Governor could appoint a temporary replacement to serve until the legislature could elect a permanent replacement.
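The staggered rotation described above reduces to a small calculation. Anchoring the cycle at 1790 (Class 1 terms first expired in 1791) reproduces the modern pattern, in which Class 1 seats were contested in 2018, Class 2 in 2020, and Class 3 in 2022; the anchor year is a simplification, since the earliest Senate elections were in fact spread over 1788-1791:

```python
def senate_class_up(election_year):
    """Return which of the three Senate classes is contested in a given
    even-numbered federal election year: one class every two years,
    cycling through all three classes every six years."""
    if election_year % 2:
        raise ValueError("federal general elections fall in even years")
    return ((election_year - 1790) // 2) % 3 + 1

print([senate_class_up(y) for y in (2018, 2020, 2022, 2024)])  # [1, 2, 3, 1]
```

Because only one class is returned for any given year, the entire body is never up for re-election at once, exactly as the clause intends.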
This arrangement was superseded by the Seventeenth Amendment, which provided for the popular election of Senators instead of their appointment by the State Legislature. In a nod to the less populist nature of the Senate, the Amendment tracks the vacancy procedures for the House of Representatives in requiring that the Governor call a special election to fill the vacancy, but (unlike in the House) it vests in the State Legislature the authority to allow the Governor to appoint a temporary replacement until the special election is held. Note, however, that under the original Constitution, the Governors of the states were expressly allowed by the Constitution to make temporary appointments. The current system, under the Seventeenth Amendment, allows Governors to appoint a replacement only if their state legislature has previously decided to allow the Governor to do so; otherwise, the seat must remain vacant until the special election is held to fill the seat, as in the case of a vacancy in the House. No Person shall be a Senator who shall not have attained to the Age of thirty Years, and been nine Years a Citizen of the United States, and who shall not, when elected, be an Inhabitant of that State for which he shall be chosen. A Senator must be at least 30 years of age, must have been a citizen of the United States for at least nine years before being elected, and must reside in the State he or she will represent at the time of the election. The Supreme Court has interpreted the Qualifications Clause as an exclusive list of qualifications that cannot be supplemented by a House of Congress exercising its Section 5 authority to "Judge... the... Qualifications of its own Members", or by a state in its exercise of its Section 4 authority to prescribe the "Times, Places and Manner of holding Elections for Senators and Representatives...". The Vice President of the United States shall be President of the Senate, but shall have no Vote, unless they be equally divided.
Section Three provides that the Vice President is the President of the Senate. Excepting the duty to receive the tally of electoral votes for President, this is the only regular responsibility assigned to the office of the Vice President by the Constitution. When serving in this capacity, the Vice President, who is not a member of the Senate, may cast tie-breaking votes. Early in the nation's history, Vice Presidents frequently presided over the Senate. In modern times, the Vice President usually does so only during ceremonial occasions or when a tie in the voting is anticipated. Through March 30, 2017, a tie-breaking vote had been cast 258 times by 36 different Vice Presidents. The Senate shall chuse (sic) their other Officers, and also a President pro tempore, in the Absence of the Vice President, or when he shall exercise the Office of the President of the United States. Clause Five provides for a President pro tempore of the Senate, a Senator elected to the post by the Senate, to preside over the body when the Vice President is either absent or exercising the Office of the President. Although the Constitutional text seems to suggest the contrary, the Senate's current practice is to elect a full-time President pro tempore at the beginning of each Congress, as opposed to making it a temporary office existing only during the Vice President's absence. Since World War II, the senior (longest-serving) member of the majority party has filled this position. As is true of the Speaker of the House, the Constitution does not require that the President pro tempore be a senator, but by convention, a senator is always chosen. The Senate shall have the sole Power to try all Impeachments. When sitting for that Purpose, they shall be on Oath or Affirmation. When the President of the United States is tried, the Chief Justice shall preside: And no Person shall be convicted without the Concurrence of two thirds of the Members present.
Clause Six grants to the Senate the sole power to try impeachments and spells out the basic procedures for impeachment trials. The Supreme Court has interpreted this clause to mean that the Senate has exclusive and unreviewable authority to determine what constitutes an adequate impeachment trial. Of the nineteen federal officials formally impeached by the House of Representatives, eleven were acquitted and seven were convicted by the Senate. On one occasion the Senate declined to hold a trial. The Constitution's framers vested the Senate with this power for several reasons. First, they believed Senators would be better educated, more virtuous, and more high-minded than Members of the House of Representatives and thus uniquely able to decide responsibly the most difficult of political questions. Second, they believed that the Senate, being a numerous body, would be well suited to handle the procedural demands of an impeachment trial, in which it, unlike judges and the judiciary system, would "never be tied down by such strict rules, either in the delineation of the offense by the prosecutor, or in the construction of it by judges, as in the common cases serve to limit the discretion of courts in favor of personal security" (Alexander Hamilton, The Federalist No. 65). There are three Constitutionally mandated requirements for impeachment trials. The provision that Senators must sit on oath or affirmation was designed to impress upon them the extreme seriousness of the occasion. The stipulation that the Chief Justice is to preside over presidential impeachment trials underscores the solemnity of the occasion and aims to avoid the possible conflict of interest of a Vice President presiding over the proceeding for the removal of the one official standing between him (or her) and the presidency.
The specification that a two-thirds supermajority of those Senators present is necessary in order to convict was designed to facilitate serious deliberation and to make removal possible only through a consensus that cuts across factional divisions. Judgment in Cases of Impeachment shall not extend further than to removal from Office, and disqualification to hold and enjoy any Office of honor, Trust or Profit under the United States: but the Party convicted shall nevertheless be liable and subject to Indictment, Trial, Judgment and Punishment, according to Law. If any officer is convicted on impeachment, he or she is immediately removed from office and may be barred from holding any public office in the future. No other punishments may be inflicted pursuant to the impeachment proceeding, but the convicted party remains liable to trial and punishment in the courts for civil and criminal charges. The Times, Places and Manner of holding Elections for Senators and Representatives, shall be prescribed in each State by the Legislature thereof; but the Congress may at any time by Law make or alter such Regulations, except as to the Places of chusing (sic) Senators. The purpose of this clause is twofold. First, it makes clear the division of responsibility with respect to the conduct of the election of federal Senators and Representatives. That responsibility lies primarily with the states and secondarily with Congress. Second, the clause lodges the power to regulate elections in the respective legislative branches of the states and the federal government, not with the executive or the judiciary. As authorized by this clause, Congress has set a uniform date for federal elections: the Tuesday following the first Monday in November. Presently, as there are no on-point federal regulations, the states retain the authority to regulate the dates on which other aspects of the election process are held (registration, primary elections, etc.) and where elections will be held.
As for regulating the "manner" of elections, the Supreme Court has interpreted this to mean "matters like notices, registration, supervision of voting, protection of voters, prevention of fraud and corrupt practices, counting of votes, duties of inspectors and canvassers, and making and publication of election returns". The Supreme Court has held that States may not exercise their power to determine the "manner" of holding elections to impose term limits on their congressional delegation. One of the most significant ways that each state regulates the "manner" of elections is through its power to draw electoral districts. Although in theory Congress could draw the district map for each State, it has not exercised this level of oversight. Congress has, however, required the States to conform to certain practices when drawing districts. States are currently required to use a single-member district scheme, whereby the State is divided into as many election districts for Representatives in the House of Representatives as the size of its representation in that body (that is to say, Representatives cannot be elected at-large from the whole State unless the State has only one Representative in the House, nor can districts elect more than one Representative). The Supreme Court has interpreted "by the Legislature thereof" to include voters using the initiative process, in those states whose constitutions provide for it, to create an independent redistricting commission. Congress first exercised its power to regulate elections nationwide in 1842, when the 27th Congress passed a law requiring the election of Representatives by districts. In subsequent years, Congress expanded on the requirements, successively adding contiguity, compactness, and substantial equality of population to the districting requirements. These standards were all later deleted in the Reapportionment Act of 1929.
Congress subsequently reinstated the requirement that districts be composed of contiguous territory, be "compact, '' and have equal populations within each State. Congress has allowed those requirements to lapse, but the Supreme Court has re-imposed the population requirement on the States under the Equal Protection Clause and is suspicious of districts that do not meet the other "traditional '' districting criteria of compactness and contiguity. In 1866, Congress legislated a remedy for a situation under which deadlocks in state legislatures over the election of Senators were creating vacancies in the office. The act required the two houses of each legislature to meet in joint session on a specified day and to meet every day thereafter until a Senator was selected. The first comprehensive federal statute dealing with elections was adopted in 1870 as a means of enforcing the Fifteenth Amendment 's guarantee against racial discrimination in granting suffrage rights. Under the Enforcement Act of 1870, and subsequent laws, false registration, bribery, voting without legal right, making false returns of votes cast, interference in any manner with officers of election, and the neglect by any such officer of any duty required by state or federal law were made federal offenses. Provision was made for the appointment by federal judges of persons to attend at places of registration and at elections with authority to challenge any person proposing to register or vote unlawfully, to witness the counting of votes, and to identify by their signatures the registration of voters and election tally sheets. Beginning with the Tillman Act of 1907, Congress has imposed a growing number of restrictions on elections and campaign financing. The most significant piece of legislation has been the 1971 Federal Election Campaign Act. It was this legislation that was at issue in the Supreme Court 's seminal decision, Buckley v. 
Valeo (1976), which, in the face of a First Amendment challenge, set the ground rules for campaign finance legislation, generally disallowing restrictions on expenditures by candidates, but permitting restrictions on contributions by individuals and corporations. In addition to statutory constraints, Congress and the States have altered the electoral process through amending the Constitution (first in the above - mentioned Fifteenth Amendment). The Seventeenth Amendment altered the manner of conducting the elections of Senators, establishing that they are to be elected by the people of the states. Also, the Nineteenth Amendment prohibits any U.S. citizen from being denied the right to vote on the basis of sex; the Twenty - fourth Amendment prohibits both Congress and the states from conditioning the right to vote in federal elections on payment of a poll tax or other types of tax; and the Twenty - sixth Amendment prohibits the states and the federal government from using age as a reason for denying the right to vote to U.S. citizens who are at least eighteen years old. The Congress shall assemble at least once in every Year, and such Meeting shall be on the first Monday in December, unless they shall by Law appoint a different Day. Clause 2 fixes an annual date upon which Congress must meet. By doing so, the Constitution empowers Congress to meet, whether or not the President called it into session, and ensures that Congress convenes at least once a year to enact legislation on behalf of the people. Article II, Section 3 does grant the president limited authority to convene and adjourn both Houses (or either of them). Some delegates to the 1787 constitutional convention believed yearly meetings were not necessary, for there would not be enough legislative business for Congress to deal with annually. 
Nathaniel Gorham of Massachusetts argued that the time should be fixed to prevent disputes from arising within the legislature, and to allow the states to adjust their elections to correspond with the fixed date. A fixed date also corresponded to the tradition in the states of having annual meetings. Finally, Gorham concluded that the legislative branch should be required to meet at least once a year to act as a check upon the executive department. Although this clause provides that the annual meeting was to be on the first Monday in December, the government established by the 1787 Constitution did not begin operations until March 4, 1789. As the 1st Congress held its initial meeting on March 4, that became the date on which new representatives and senators took office in subsequent years. Therefore, every other year, although a new Congress was elected in November, it did not come into office until the following March, with a "lame duck '' session convening in the interim. This practice was altered in 1933 following ratification of the Twentieth Amendment, which states (in Section 2) that, "The Congress shall assemble at least once in every year, and such meeting shall begin at noon on the third day of January, unless they shall by law appoint a different day ''. This change virtually eliminated the necessity of there being a lame duck session of Congress. Each House shall be the Judge of the Elections, Returns and Qualifications of its own Members, and a Majority of each shall constitute a Quorum to do Business; but a smaller Number may adjourn from day to day, and may be authorized to compel the Attendance of absent Members, in such Manner, and under such Penalties as each House may provide. Section Five states that a majority of each House constitutes a quorum to do business; a smaller number may adjourn the House or compel the attendance of absent members. In practice, the quorum requirement is all but ignored. 
A quorum is assumed to be present unless a quorum call, requested by a member, proves otherwise. Rarely do members ask for quorum calls to demonstrate the absence of a quorum; more often, they use the quorum call as a delaying tactic. Sometimes, unqualified individuals have been admitted to Congress. For instance, the Senate once admitted John Henry Eaton, a twenty - eight - year - old, in 1818 (the admission was inadvertent, as Eaton 's birth date was unclear at the time). In 1934, a twenty - nine - year - old, Rush Holt, was elected to the Senate; he agreed to wait six months, until his thirtieth birthday, to take the oath. The Senate ruled in that case that the age requirement applied as of the date of the taking of the oath, not the date of election. Each House may determine the Rules of its Proceedings, punish its Members for disorderly Behavior, and, with the Concurrence of two thirds, expel a member. Each House can determine its own Rules (assuming a quorum is present), and may punish any of its members. A two - thirds vote is necessary to expel a member. Section 5, Clause 2 does not provide specific guidance to each House regarding when and how each House may change its rules, leaving details to the respective chambers. Each House shall keep a Journal of its Proceedings, and from time to time publish the same, excepting such Parts as may in their Judgment require Secrecy; and the Yeas and Nays of the Members of either House on any question shall, at the desire of one fifth of those present, be entered on the Journal. Each House must keep and publish a Journal, though it may choose to keep any part of the Journal secret. The decisions of the House -- not the words spoken during debates -- are recorded in the Journal; if one - fifth of those present (assuming a quorum is present) request it, the votes of the members on a particular question must also be entered. 
Neither House, during the Session of Congress, shall, without the Consent of the other, adjourn for more than three days, nor to any other Place than that in which the two Houses shall be sitting. Neither House may adjourn, without the consent of the other, for more than three days. Often, a House will hold pro forma sessions every three days; such sessions are merely held to fulfill the constitutional requirement, and not to conduct business. Furthermore, neither House may meet in any place other than that designated for both Houses (the Capitol), without the consent of the other House. The Senators and Representatives shall receive a Compensation for their Services, to be ascertained by Law, and paid out of the Treasury of the United States. They shall in all Cases, except Treason, Felony and Breach of the Peace, be privileged from Arrest during their Attendance at the Session of their respective Houses, and in going to and returning from the same; and for any Speech or Debate in either House, they shall not be questioned in any other Place. Senators and Representatives set their own compensation. Under the Twenty - seventh Amendment, any change in their compensation will not take effect until after the next congressional election. Members of both Houses have certain privileges, based on those enjoyed by the members of the British Parliament. Members attending, going to or returning from either House are privileged from arrest, except for treason, felony or breach of the peace. One may not sue a Senator or Representative for slander occurring during Congressional debate, nor may speech by a member of Congress during a Congressional session be the basis for criminal prosecution. The latter was affirmed when Mike Gravel published over 4,000 pages of the Pentagon Papers in the Congressional Record, which might have otherwise been a criminal offense. This clause has also been interpreted in Gravel v. United States, 408 U.S. 
606 (1972) to provide protection to aides and staff of sitting members of Congress, so long as their activities relate to legislative matters. No Senator or Representative shall, during the Time for which he was elected, be appointed to any civil Office under the Authority of the United States, which shall have been created, or the Emoluments whereof shall have been increased during such time; and no Person holding any Office under the United States, shall be a Member of either House during his Continuance in Office. Senators and Representatives may not simultaneously serve in Congress and hold a position in the executive branch. This restriction is meant to protect legislative independence by preventing the president from using patronage to buy votes in Congress. It is a major difference from the political system in the British Parliament, where cabinet ministers are required to be members of parliament. Furthermore, Senators and Representatives can not resign to take newly created or higher - paying political positions; rather, they must wait until the conclusion of the term for which they were elected. If Congress increases the salary of a particular officer, it may later reduce that salary to permit an individual to resign from Congress and take that position (known as the Saxbe fix). The effects of the clause were discussed in 1937, when Senator Hugo Black was appointed an Associate Justice of the Supreme Court with some time left in his Senate term. Just prior to the appointment, Congress had increased the pension available to Justices retiring at the age of seventy. It was therefore suggested by some that the office 's emolument had been increased during Black 's Senatorial term, and that therefore Black could not take office as a Justice. The response, however, was that Black was fifty - one years old, and would not receive the increased pension until at least 19 years later, long after his Senate term had expired. 
All Bills for raising Revenue shall originate in the House of Representatives; but the Senate may propose or concur with Amendments as on other Bills. This establishes the method for making Acts of Congress that involve taxation. Accordingly, any bill may originate in either House of Congress, except for a revenue bill, which may originate only in the House of Representatives. In practice, the Senate sometimes circumvents this requirement by substituting the text of a revenue bill previously passed by the House with a substitute text. Either House may amend any bill, including revenue and appropriation bills. This clause of the U.S. Constitution stemmed from an English parliamentary practice that all money bills must have their first reading in the House of Commons. This practice was intended to ensure that the power of the purse is possessed by the legislative body most responsive to the people, although the English practice was modified in America by allowing the Senate to amend these bills. The clause was part of the Great Compromise between small and large states; the large states were unhappy with the lopsided power of small states in the Senate, and so the clause theoretically offsets the unrepresentative nature of the Senate, and compensates the large states for allowing equal voting rights to Senators from small states. Every Bill which shall have passed the House of Representatives and the Senate, shall, before it become a Law, be presented to the President of the United States; If he approve he shall sign it, but if not he shall return it, with his Objections to that House in which it shall have originated, who shall enter the Objections at large on their Journal, and proceed to reconsider it. If after such Reconsideration two thirds of that House shall agree to pass the Bill, it shall be sent, together with the Objections, to the other House, by which it shall likewise be reconsidered, and if approved by two thirds of that House, it shall become a Law. 
But in all such Cases the Votes of both Houses shall be determined by yeas and Nays, and the Names of the Persons voting for and against the Bill shall be entered on the Journal of each House respectively. If any Bill shall not be returned by the President within ten Days (Sundays excepted) after it shall have been presented to him, the Same shall be a Law, in like Manner as if he had signed it, unless the Congress by their Adjournment prevent its Return, in which Case it shall not be a Law. This clause is known as the Presentment Clause. Before a bill becomes law, it must be presented to the President, who has ten days (excluding Sundays) to act upon it. If the President signs the bill, it becomes law. If he disapproves of the bill, he must return it to the House in which it originated together with his objections. This procedure has become known as the veto, although that particular word does not appear in the text of Article One. The bill does not then become law unless both Houses, by two - thirds votes, override the veto. If the President neither signs nor returns the bill within the ten - day limit, the bill becomes law, unless the Congress has adjourned in the meantime, thereby preventing the President from returning the bill to the House in which it originated. In the latter case, the President, by taking no action on the bill towards the end of a session, exercises a "pocket veto '', which Congress may not override. In the former case, where the President allows a bill to become law unsigned, there is no common name for the practice, but recent scholarship has termed it a "default enactment. '' What exactly constitutes an adjournment for the purposes of the pocket veto has been unclear. 
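The presentment procedure described above is, in effect, a small decision tree: sign, return with objections, or do nothing, with the outcome of inaction turning on whether Congress has adjourned. As a minimal illustrative sketch (not part of the source; the function and parameter names are hypothetical), it can be modeled like this:

```python
def fate_of_bill(signed, returned_with_objections, days_elapsed,
                 congress_adjourned, house_override_frac=0.0,
                 senate_override_frac=0.0):
    """Model the Presentment Clause outcomes for a bill sent to the President.

    days_elapsed counts days since presentment, Sundays excepted, per the
    ten-day window in the clause.
    """
    if signed:
        return "law"
    if returned_with_objections:
        # A returned bill (a "veto") becomes law only if both Houses
        # repass it by two-thirds.
        if house_override_frac >= 2 / 3 and senate_override_frac >= 2 / 3:
            return "law (veto overridden)"
        return "not law (vetoed)"
    if days_elapsed > 10:
        # The President neither signed nor returned the bill in time.
        if congress_adjourned:
            # Adjournment prevented its return: the "pocket veto".
            return "not law (pocket veto)"
        # Otherwise the bill becomes law unsigned ("default enactment").
        return "law (default enactment)"
    return "pending"
```

For example, `fate_of_bill(False, False, 11, True)` yields the pocket-veto outcome, while the same inaction with Congress still in session yields default enactment; the fractions model the two - thirds override votes in each House.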
In the Pocket Veto Case (1929), the Supreme Court held that "the determinative question in reference to an ' adjournment ' is not whether it is a final adjournment of Congress or an interim adjournment, such as an adjournment of the first session, but whether it is one that ' prevents ' the President from returning the bill to the House in which it originated within the time allowed. '' Since neither House of Congress was in session, the President could not return the bill to one of them, thereby permitting the use of the pocket veto. In Wright v. United States (1938), however, the Court ruled that adjournments of one House only did not constitute an adjournment of Congress required for a pocket veto. In such cases, the Secretary or Clerk of the House in question was ruled competent to receive the bill. Every Order, Resolution, or Vote to which the Concurrence of the Senate and House of Representatives may be necessary (except on a question of Adjournment) shall be presented to the President of the United States; and before the Same shall take Effect, shall be approved by him, or being disapproved by him, shall be repassed by two thirds of the Senate and House of Representatives, according to the Rules and Limitations prescribed in the Case of a Bill. In 1996, Congress passed the Line Item Veto Act, which permitted the President, at the time of the signing of the bill, to rescind certain expenditures. The Congress could disapprove the cancellation and reinstate the funds. The President could veto the disapproval, but the Congress, by a two - thirds vote in each House, could override the veto. In the case Clinton v. City of New York, the Supreme Court found the Line Item Veto Act unconstitutional because it violated the Presentment clause. First, the procedure delegated legislative powers to the President, thereby violating the nondelegation doctrine. 
Second, the procedure violated the terms of Section Seven, which state, "if he approve (the bill) he shall sign it, but if not he shall return it. '' Thus, the President may sign the bill, veto it, or do nothing, but he may not amend the bill and then sign it. Every bill, order, resolution, or vote that must be passed by both Houses, except on a question of adjournment, must be presented to the President before becoming law. However, to propose a constitutional amendment, two - thirds of both Houses may submit it to the states for ratification, without any consideration by the President, as prescribed in Article V. Some Presidents have made very extensive use of the veto, while others have not used it at all. Grover Cleveland, for instance, vetoed over four hundred bills during his first term in office; Congress overrode only two of those vetoes. Meanwhile, seven Presidents have never used the veto power. There have been 2,560 vetoes, including pocket vetoes. Congress 's legislative powers are enumerated in Section Eight: The Congress shall have power (...) Many powers of Congress have been interpreted broadly. Most notably, the Taxing and Spending, Interstate Commerce, and Necessary and Proper Clauses have been deemed to grant expansive powers to Congress. Congress may lay and collect taxes for the "common defense '' or "general welfare '' of the United States. The U.S. Supreme Court has not often defined "general welfare, '' leaving the political question to Congress. In United States v. Butler (1936), the Court for the first time construed the clause. The dispute centered on a tax collected from processors of agricultural products such as meat; the funds raised by the tax were not paid into the general funds of the treasury, but were rather specially earmarked for farmers. The Court struck down the tax, ruling that the general welfare language in the Taxing and Spending Clause related only to "matters of national, as distinguished from local, welfare ''. 
Congress continues to make expansive use of the Taxing and Spending Clause; for instance, the social security program is authorized under the Taxing and Spending Clause. Congress has the power to borrow money on the credit of the United States. In 1871, when deciding Knox v. Lee, the Court ruled that this clause permitted Congress to emit bills and make them legal tender in satisfaction of debts. Whenever Congress borrows money, it is obligated to repay the sum as stipulated in the original agreement. However, such agreements are only "binding on the conscience of the sovereign '', as the doctrine of sovereign immunity prevents a creditor from suing in court if the government reneges on its commitment. The Congress shall have Power (...) To regulate Commerce with foreign Nations, and among the several States, and with the Indian Tribes; The Supreme Court has seldom restrained the use of the commerce clause for widely varying purposes. The first important decision related to the commerce clause was Gibbons v. Ogden, decided by a unanimous Court in 1824. The case involved conflicting federal and state laws: Thomas Gibbons had a federal permit to navigate steamboats in the Hudson River, while Aaron Ogden had a monopoly to do the same granted by the state of New York. Ogden contended that "commerce '' included only buying and selling of goods and not their transportation. Chief Justice John Marshall rejected this notion. Marshall suggested that "commerce '' included navigation of goods, and that it "must have been contemplated '' by the Framers. Marshall added that Congress 's power over commerce "is complete in itself, may be exercised to its utmost extent, and acknowledges no limitations other than are prescribed in the Constitution ''. The expansive interpretation of the Commerce Clause was restrained during the late nineteenth and early twentieth centuries, when a laissez - faire attitude dominated the Court. In United States v. E.C. 
Knight Company (1895), the Supreme Court limited the newly enacted Sherman Antitrust Act, which had sought to break up the monopolies dominating the nation 's economy. The Court ruled that Congress could not regulate the manufacture of goods, even if they were later shipped to other states. Chief Justice Melville Fuller wrote, "commerce succeeds to manufacture, and is not a part of it. '' The U.S. Supreme Court sometimes ruled New Deal programs unconstitutional because they stretched the meaning of the commerce clause. In Schechter Poultry Corp. v. United States (1935), the Court unanimously struck down industrial codes regulating the slaughter of poultry, declaring that Congress could not regulate commerce relating to the poultry, which had "come to a permanent rest within the State. '' As Chief Justice Charles Evans Hughes put it, "so far as the poultry here in question is concerned, the flow of interstate commerce has ceased. '' Judicial rulings against attempted use of Congress 's Commerce Clause powers continued during the 1930s. In 1937, the Supreme Court began moving away from its laissez - faire attitude concerning Congressional legislation and the Commerce Clause, when it ruled in National Labor Relations Board v. Jones & Laughlin Steel Company that the National Labor Relations Act of 1935 (commonly known as the Wagner Act) was constitutional. The legislation under scrutiny prevented employers from engaging in "unfair labor practices '' such as firing workers for joining unions. In sustaining this act, the Court signaled its return to the philosophy espoused by John Marshall, that Congress could pass laws regulating actions that even indirectly influenced interstate commerce. This new attitude became firmly set into place in 1942. In Wickard v. 
Filburn, the Court ruled that production quotas under the Agricultural Adjustment Act of 1938 were constitutionally applied to agricultural production (in this instance, home - grown wheat for private consumption) that was consumed purely intrastate, because its effect upon interstate commerce placed it within the power of Congress to regulate under the Commerce Clause. This decision marked the beginning of the Court 's total deference to Congress ' claims of Commerce Clause powers, which lasted into the 1990s. United States v. Lopez (1995) was the first decision in six decades to invalidate a federal statute on the grounds that it exceeded the power of the Congress under the Commerce Clause. The Court held that while Congress had broad lawmaking authority under the Commerce Clause, the power was limited, and did not extend so far from "commerce '' as to authorize the regulation of the carrying of handguns, especially when there was no evidence that carrying them affected the economy on a massive scale. In a later case, United States v. Morrison (2000), the justices ruled that Congress could not make such laws even when there was evidence of aggregate effect. In contrast to these rulings, the Supreme Court also continues to follow the precedent set by Wickard v. Filburn. In Gonzales v. Raich it ruled that the Commerce Clause granted Congress the authority to criminalize the production and use of home - grown cannabis even where states approve its use for medicinal purposes. The court held that, as with the agricultural production in the earlier case, home - grown cannabis is a legitimate subject of federal regulation because it competes with marijuana that moves in interstate commerce. Congress may establish uniform laws relating to naturalization and bankruptcy. It may also coin money, regulate the value of American or foreign currency and punish counterfeiters. Congress may fix the standards of weights and measures. 
Furthermore, Congress may establish post offices and post roads (the roads, however, need not be exclusively for the conveyance of mail). Congress may promote the progress of science and useful arts by granting copyrights and patents of limited duration. Section eight, clause eight of Article One, known as the Copyright Clause, is the only instance of the word "right '' used in the original constitution (though the word does appear in several Amendments). Though perpetual copyrights and patents are prohibited, the Supreme Court has ruled in Eldred v. Ashcroft (2003) that repeated extensions to the term of copyright do not constitute perpetual copyright; also note that this is the only power granted where the means to accomplish its stated purpose is specifically provided for. Courts inferior to the Supreme Court may be established by Congress. Congress has several powers related to war and the armed forces. Under the War Powers Clause, only Congress may declare war, but in several cases it has, without declaring war, granted the President the authority to engage in military conflicts. Five wars have been declared in the United States ' history: the War of 1812, the Mexican -- American War, the Spanish -- American War, World War I and World War II. Some historians argue that the legal doctrines and legislation passed during the operations against Pancho Villa constitute a sixth declaration of war. Congress may grant letters of marque and reprisal. Congress may establish and support the armed forces, but no appropriation made for the support of the army may be used for more than two years. This provision was inserted because the Framers feared the establishment of a standing army, beyond civilian control, during peacetime. Congress may regulate or call forth the state militias, but the states retain the authority to appoint officers and train personnel. Congress also has exclusive power to make rules and regulations governing the land and naval forces. 
Although the executive branch and the Pentagon have asserted an ever - increasing measure of involvement in this process, the U.S. Supreme Court has often reaffirmed Congress 's exclusive hold on this power (e.g. Burns v. Wilson, 346 U.S. 137 (1953)). Congress used this power twice soon after World War II with the enactment of two statutes: the Uniform Code of Military Justice to improve the quality and fairness of courts martial and military justice, and the Federal Tort Claims Act which among other rights had allowed military service persons to sue for damages until the U.S. Supreme Court effectively nullified that section of the statute in a divisive series of cases, known collectively as the Feres Doctrine. Congress has the exclusive right to legislate "in all cases whatsoever '' for the nation 's capital, the District of Columbia. Congress has chosen to devolve some of that authority to the elected mayor and council of the District of Columbia. Nevertheless, Congress remains free to enact any legislation for the District so long as constitutionally permissible, to overturn any legislation by the city government, and technically to revoke the city government at any time. Congress may also exercise such jurisdiction over land purchased from the states for the erection of forts and other buildings. The Congress shall have Power (...) To make all Laws which shall be necessary and proper for carrying into Execution the foregoing Powers, and all other Powers vested by this Constitution in the Government of the United States, or in any Department or Officer thereof. Finally, Congress has the power to do whatever is "necessary and proper '' to carry out its enumerated powers and, crucially, all others vested in it. This has been interpreted to authorize criminal prosecution of those whose actions have a "substantial effect '' on interstate commerce in Wickard v. 
Filburn; however, Thomas Jefferson, in the Kentucky Resolutions, supported by James Madison, maintained that a penal power could not be inferred from a power to regulate, and that the only penal powers were for treason, counterfeiting, piracy and felony on the high seas, and offenses against the law of nations. The necessary and proper clause has been interpreted extremely broadly, thereby giving Congress wide latitude in legislation. The first landmark case involving the clause was McCulloch v. Maryland (1819), which involved the establishment of a national bank. Alexander Hamilton, in advocating the creation of the bank, argued that there was "a more or less direct '' relationship between the bank and "the powers of collecting taxes, borrowing money, regulating trade between the states, and raising and maintaining fleets and navies ''. Thomas Jefferson countered that Congress 's powers "can all be carried into execution without a national bank. A bank therefore is not necessary, and consequently not authorized by this phrase ''. Chief Justice John Marshall agreed with the former interpretation. Marshall wrote that a Constitution listing all of Congress 's powers "would partake of a prolixity of a legal code and could scarcely be embraced by the human mind ''. Since the Constitution could not possibly enumerate the "minor ingredients '' of the powers of Congress, Marshall "deduced '' that Congress had the authority to establish a bank from the "great outlines '' of the general welfare, commerce and other clauses. Under this doctrine of the necessary and proper clause, Congress has sweepingly broad powers (known as implied powers) not explicitly enumerated in the Constitution. However, Congress can not enact laws based solely on implied powers; any action must be necessary and proper in the execution of the enumerated powers. 
The ninth section of Article One places limits on Congress ' powers: The Migration or Importation of such Persons as any of the States now existing shall think proper to admit, shall not be prohibited by the Congress prior to the Year one thousand eight hundred and eight, but a Tax or duty may be imposed on such Importation, not exceeding ten dollars for each Person. The Privilege of the Writ of Habeas Corpus shall not be suspended, unless when in Cases of Rebellion or Invasion the public Safety may require it. No Bill of Attainder or ex post facto Law shall be passed. No Capitation, or other direct, Tax shall be laid, unless in Proportion to the Census or Enumeration herein before directed to be taken. No Tax or Duty shall be laid on Articles exported from any State. No Preference shall be given by any Regulation of Commerce or Revenue to the Ports of one State over those of another: nor shall Vessels bound to, or from, one State, be obliged to enter, clear, or pay Duties in another. No Money shall be drawn from the Treasury, but in Consequence of Appropriations made by Law; and a regular Statement and Account of Receipts and Expenditures of all public Money shall be published from time to time. No Title of Nobility shall be granted by the United States: And no Person holding any Office of Profit or Trust under them, shall, without the Consent of the Congress, accept of any present, Emolument, Office, or Title, of any kind whatever, from any King, Prince, or foreign State. The first clause in this section prevents Congress from passing any law that would restrict the importation of slaves into the United States prior to 1808. Congress could, however, levy a per capita duty of up to ten dollars for each slave imported into the country. This clause was further entrenched into the Constitution by Article V, where it is explicitly shielded from constitutional amendment prior to 1808. 
On March 2, 1807, Congress approved legislation prohibiting the importation of slaves into the United States, which went into effect January 1, 1808, the first day permitted by the Constitution. A writ of habeas corpus is a legal action against unlawful detainment that commands a law enforcement agency or other body that has a person in custody to have a court inquire into the legality of the detention. The court may order the person released if the reason for detention is deemed insufficient or unjustifiable. The Constitution further provides that the privilege of the writ of habeas corpus may not be suspended "unless when in cases of rebellion or invasion the public safety may require it ''. In Ex parte Milligan (1866), the Supreme Court ruled that the suspension of habeas corpus in a time of war was lawful, but military tribunals did not apply to citizens in states that had upheld the authority of the Constitution and where civilian courts were still operating. A bill of attainder is a law by which a person is immediately convicted without trial. An ex post facto law is a law which applies retroactively, punishing someone for an act that was only made criminal after it was done. The ex post facto clause does not apply to civil matters. Section Nine reiterates the provision from Section Two that direct taxes must be apportioned by state populations. This clause was also explicitly shielded from constitutional amendment prior to 1808 by Article V. In 1913, the 16th Amendment exempted all income taxes from this clause. This overcame the ruling in Pollock v. Farmers ' Loan & Trust Co. that the income tax could only be applied to regular income and could not be applied to dividends and capital gains. Furthermore, no tax may be imposed on exports from any state. Congress may not, by revenue or commerce legislation, give preference to ports of one state over those of another; neither may it require ships from one state to pay duties in another. 
Funds may not be withdrawn from the Treasury except according to law. Modern practice is that Congress annually passes a number of appropriations bills authorizing the expenditure of public money. The Constitution requires that a regular statement of such expenditures be published. The Title of Nobility Clause prohibits Congress from granting any title of nobility. In addition, it specifies that no civil officer may accept, without the consent of Congress, any gift, payment, office or title from a foreign ruler or state. However, a U.S. citizen may receive foreign office before or after their period of public service. No State shall enter into any Treaty, Alliance, or Confederation; grant Letters of Marque and Reprisal; coin Money; emit Bills of Credit; make any Thing but gold and silver Coin a Tender in Payment of Debts; pass any Bill of Attainder, ex post facto Law, or Law impairing the Obligation of Contracts, or grant any Title of Nobility. States may not exercise certain powers reserved for the federal government: they may not enter into treaties, alliances or confederations, grant letters of marque or reprisal, coin money or issue bills of credit (such as currency). Furthermore, no state may make anything but gold and silver coin a tender in payment of debts, which expressly forbids any state government (but not the federal government) from "making a tender '' (i.e., authorizing something that may be offered in payment) of any type or form of money to meet any financial obligation, unless that form of money is coins made of gold or silver (or a medium of exchange backed by and redeemable in gold or silver coins, as noted in Farmers & Merchants Bank v. Federal Reserve Bank). Much of this clause is devoted to preventing the States from using or creating any currency other than that created by Congress. In Federalist No. 44, Madison explains that "... 
it may be observed that the same reasons which shew the necessity of denying to the States the power of regulating coin, prove with equal force that they ought not to be at liberty to substitute a paper medium in the place of coin. Had every State a right to regulate the value of its coin, there might be as many different currencies as States; and thus the intercourse among them would be impeded. '' Moreover, the states may not pass bills of attainder, enact ex post facto laws, impair the obligation of contracts, or grant titles of nobility. The Contract Clause was the subject of much contentious litigation in the 19th century. It was first interpreted by the Supreme Court in 1810, when Fletcher v. Peck was decided. The case involved the Yazoo land scandal, in which the Georgia legislature authorized the sale of land to speculators at low prices. The bribery involved in the passage of the authorizing legislation was so blatant that a Georgia mob attempted to lynch the corrupt members of the legislature. Following elections, the legislature passed a law that rescinded the contracts granted by the corrupt legislators. The validity of the annulment of the sale was questioned in the Supreme Court. In writing for a unanimous court, Chief Justice John Marshall asked, "What is a contract? '' His answer was: "a compact between two or more parties. '' Marshall argued that the sale of land by the Georgia legislature, though fraught with corruption, was a valid "contract ''. He added that the state had no right to annul the purchase of the land, since doing so would impair the obligations of contract. The definition of a contract propounded by Chief Justice Marshall was not as simple as it may seem. In 1819, the Court considered whether a corporate charter could be construed as a contract. The case of Trustees of Dartmouth College v. Woodward involved Dartmouth College, which had been established under a Royal Charter granted by King George III. 
The Charter created a board of twelve trustees for the governance of the College. In 1815, however, New Hampshire passed a law increasing the board 's membership to twenty - one, with the aim that public control could be exercised over the College. The Court, including Marshall, ruled that New Hampshire could not amend the charter, which was ruled to be a contract since it conferred "vested rights '' on the trustees. The Marshall Court determined another dispute in Sturges v. Crowninshield. The case involved a debt that was contracted in early 1811. Later in that year, the state of New York passed a bankruptcy law, under which the debt was later discharged. The Supreme Court ruled that a retroactively applied state bankruptcy law impaired the obligation to pay the debt, and therefore violated the Constitution. In Ogden v. Saunders (1827), however, the court decided that state bankruptcy laws could apply to debts contracted after the passage of the law. State legislation on the issue of bankruptcy and debtor relief has not been much of an issue since the adoption of a comprehensive federal bankruptcy law in 1898. No State shall, without the Consent of the Congress, lay any Imposts or Duties on Imports or Exports, except what may be absolutely necessary for executing it 's (sic) inspection Laws: and the net Produce of all Duties and Imposts, laid by any State on Imports or Exports, shall be for the Use of the Treasury of the United States; and all such Laws shall be subject to the Revision and Controul (sic) of the Congress. Still more powers are denied to the states. States may not, without the consent of Congress, tax imports or exports except for the fulfillment of state inspection laws (which may be revised by Congress). The net revenue of the tax is paid not to the state, but to the federal Treasury. 
No State shall, without the Consent of Congress, lay any Duty of Tonnage, keep Troops, or Ships of War in time of Peace, enter into any Agreement or Compact with another State, or with a foreign Power, or engage in War, unless actually invaded, or in such imminent Danger as will not admit of delay. Under the Compact Clause, states may not, without the consent of Congress, keep troops or armies during times of peace. They may not enter into alliances nor compacts with foreign states, nor engage in war unless invaded. States may, however, organize and arm a militia according to the discipline prescribed by Congress. (Article I, Section 8, enumerated powers of Congress.) The National Guard, whose members are also members of the militia of the United States as defined by 10 U.S.C. § 311, fulfills this function, as do persons serving in State Militias with federal oversight under 32 U.S.C. § 109. The idea of allowing Congress to have a say over agreements between states traces back to the numerous controversies that arose between various colonies. Eventually compromises would be reached between the colonies involved, and these compromises would be submitted to the Crown for approval. After the American Revolutionary War, the Articles of Confederation allowed states to appeal to Congress to settle disputes between the states over boundaries or "any cause whatever ''. The Articles of Confederation also required Congressional approval for "any treaty or alliance '' in which a state was one of the parties. There have been a number of Supreme Court cases concerning what constitutes valid congressional consent to an interstate compact. In Virginia v. Tennessee, 148 U.S. 503 (1893), the Court found that some agreements among states stand even when lacking the explicit consent of Congress. (One example the court gave was a state moving goods from a distant state: it would not require Congressional approval to contract with another state to use its canals for transport.) 
According to the Court, the Compact Clause requires congressional consent only if the agreement among the states is "directed to the formation of any combination tending to the increase of political power in the States, which may encroach upon or interfere with the just supremacy of the United States ''. The congressional consent issue is at the center of the current debate over the constitutionality of the not yet effective National Popular Vote Interstate Compact entered into by several states plus the District of Columbia.
where did they shoot three billboards outside ebbing missouri
Three Billboards Outside Ebbing, Missouri - Wikipedia Three Billboards Outside Ebbing, Missouri is a 2017 drama film written, directed, and produced by Martin McDonagh. It stars Frances McDormand as a mother who rents three billboards to call attention to her daughter 's unsolved murder. Woody Harrelson, Sam Rockwell, John Hawkes, and Peter Dinklage appear in supporting roles. The film was released in the United States on November 10, 2017, and in the United Kingdom on January 12, 2018, by Fox Searchlight Pictures; it has grossed $156 million worldwide. At the 90th Academy Awards, the film won Best Actress (McDormand) and Best Supporting Actor (Rockwell); it had seven total nominations, including Best Picture, Best Original Screenplay and Best Supporting Actor (Harrelson). At the 75th Golden Globe Awards, the film won Best Motion Picture -- Drama, Best Actress -- Drama (McDormand), Best Supporting Actor (Rockwell) and Best Screenplay. It also won three SAG Awards, including Outstanding Performance by a Cast in a Motion Picture, and five BAFTA Film Awards, including Best Film and Outstanding British Film. In the town of Ebbing, Missouri, Mildred Hayes is grieving the rape and murder of her teenage daughter, Angela, seven months earlier. Angry over the lack of progress in the investigation, Mildred rents three abandoned billboards near her home, and posts on them: "Raped While Dying '', "Still No Arrests? '', and "How Come, Chief Willoughby? '' The billboards upset the townspeople, including Chief Bill Willoughby and Officer Jason Dixon, the latter being a racist and a violent alcoholic. The open secret that Willoughby suffers from terminal pancreatic cancer adds to everyone 's disapproval. Mildred and her son Robbie are harassed and threatened, but to Robbie 's chagrin, she stays firm about keeping the billboards up. While Willoughby is sympathetic to Mildred 's frustration, he finds the billboards an unfair attack on his character. 
Angered by Mildred 's lack of respect for his authority, Dixon threatens businessman Red Welby, who rented Mildred the billboards, and he arrests her friend and coworker, Denise, on trivial marijuana possession charges. Mildred is also visited by her abusive ex-husband Charlie, who blames her for their daughter 's death. Willoughby brings Mildred in for questioning after she drills a hole in her dentist 's thumb when he threatens her. During the interview, Willoughby coughs up blood. He leaves the hospital against medical advice and spends an idyllic day with his wife Anne and their two daughters, then commits suicide due to his illness. He leaves suicide notes for several people, including Mildred, in which he explains that she was not a factor in his suicide and that he secretly paid to keep the billboards up for another month, amused at the trouble this will bring her and hoping that they will keep attention on the murder. Dixon reacts to the news of Willoughby 's death by assaulting Welby and throwing him out of a window. This is witnessed by Willoughby 's replacement, Abercrombie, who fires Dixon. Meanwhile, Mildred is threatened by a crop - haired stranger in her store. The billboards are destroyed by arson. Mildred retaliates by tossing Molotov cocktails at the police station, which she believes is unoccupied for the night. However, Dixon is there to read Willoughby 's letter to him, which advises him to let go of hate and learn to love, as the only way to realize his wish to become a detective. Dixon escapes with Angela 's case file but suffers severe burns. Mildred 's acquaintance James witnesses the incident and provides Mildred with an alibi, claiming they were on a date. Dixon is treated for his burns, and he is temporarily confined in the same hospital room as Welby, to whom he apologizes. Discharged from the hospital, Dixon overhears the man who threatened Mildred bragging in a bar about an incident similar to Angela 's murder. 
He notes the Idaho license plate number of the man 's vehicle, then provokes a fight by scratching the man 's face. At home later, he removes a sample of the man 's DNA from under his fingernails. Meanwhile, Mildred goes on a date with James to thank him for the alibi. Charlie enters with his 19 - year - old girlfriend Penelope and admits to burning the billboards while intoxicated. After accidentally causing James to leave, Mildred apologetically tells Charlie to treat Penelope well and leaves. Though commending him, Abercrombie informs Dixon that the DNA sample does not match DNA found on Angela 's body, and that the man was overseas on military duty at the time of the murder. Dixon concludes that the man must be guilty of some other rape, and joins Mildred on a trip to Idaho in order to kill him. On the way, Mildred confesses to Dixon that she set the police station on fire. He indicates that he knew already. They express reservations about their mission but agree to decide what to do along the way. While traveling through the Southern United States in around 1998, Martin McDonagh came across a couple of accusatory billboards about an unsolved crime, which he described as "raging and painful and tragic (sic) ''. The billboards highlighted the incompetence of police work and deeply affected McDonagh; he said that the image "stayed in my mind (...) kept gnawing at me '' and presumed that they were put up by the victim 's mother. This incident, combined with his desire to create strong female characters, inspired him to write the story for Three Billboards Outside Ebbing, Missouri. McDonagh discussed the creative process, saying that it took him about ten years to "(decide) that it was a mother who had taken these things out. It all became fiction (...) based on a couple of actual billboards ''. The character of Mildred was written with Frances McDormand in mind, and likewise the character of Dixon was written specifically for Rockwell. 
McDormand initially felt that she was older than the character as it was written, and suggested that Mildred instead be Angela 's grandmother; McDonagh disagreed, feeling that it would change the story too much. McDormand 's husband Joel Coen persuaded her to take the part regardless. McDormand took inspiration for her character from John Wayne; and Rockwell, wanting to make his character "the exact opposite '' of Mildred, took inspiration for his character in part from Wayne 's co-star in The Man Who Shot Liberty Valance, Lee Marvin. Principal photography began on May 2, 2016, in Sylva, North Carolina, and ran for 33 days. Allison Outdoor Advertising of Sylva built the actual billboards, which were put in a pasture near Black Mountain, North Carolina because that location was better. Most of the time the billboards were covered because people in the area found them upsetting. David Penix of Arden, North Carolina bought the billboards and used the wood for a roof in Douglas Lake in Tennessee, though the messages are no longer in order. Town Pump Tavern in Black Mountain, which had appeared in The World Made Straight, was used as a set and was closed for three days during filming. A pool table and booths were added. The bar 's actual sign appeared in the movie. The musical score was written by Carter Burwell, who had also supplied the score for McDonagh 's films In Bruges and Seven Psychopaths. As well as Burwell 's score, the film features songs by Joan Baez, Monsters of Folk, Townes Van Zandt, and the Four Tops. Three Billboards Outside Ebbing, Missouri premiered in competition at the 74th Venice International Film Festival on September 4, 2017. It also had screenings at the 2017 Toronto International Film Festival, the 2017 San Sebastián International Film Festival (where it won the Audience Award), the BFI London Film Festival, and the 2017 Zurich Film Festival. It was also screened at the Mar del Plata International Film Festival. 
In the United States, the film was released by Fox Searchlight Pictures on November 10, 2017, beginning with a limited release before "going wide '' on December 1. The film was released on 4K Ultra HD, Blu - ray and DVD on February 27, 2018. Six Shooter, McDonagh 's Academy Award - winning short film, is included as a bonus. As of March 16, 2018, Three Billboards Outside Ebbing, Missouri has grossed $53.6 million in the United States and Canada, and $91 million in other countries, for a worldwide total of $144.7 million. In its limited opening weekend, the film made $322,168 from four theaters for a per - theater average of $80,542, the fourth best of 2017. The film made $1.1 million from 53 theaters in its second weekend and $4.4 million from 614 in its third, finishing a respective 9th and 10th at the box office. In the weekend following its four Golden Globe wins the film was added to 712 theaters (for a total of 1,022) and grossed $2.3 million, an increase of 226 % from the previous week 's $706,188. The weekend of January 27, 2018, following the announcement of the film 's seven Oscar nominations, it made $3.6 million (an increase of 87 % over the previous week 's $1.9 million), finishing 13th. The weekend of March 9 -- 11, following its two Oscar wins, the film made $705,000, down 45 % from the previous week 's $1.3 million. The film had an approval rating of 92 % based on 317 reviews on the review aggregator website Rotten Tomatoes, and an average rating of 8.5 / 10. The website 's critical consensus reads, "Three Billboards Outside Ebbing, Missouri deftly balances black comedy against searing drama -- and draws unforgettable performances from its veteran cast along the way. '' On Metacritic, which assigns a normalized rating to reviews, the film had a weighted average score of 88 out of 100, based on 50 critics, indicating "universal acclaim. '' Audiences polled by CinemaScore gave the film an average grade of "A -- '' on an A+ to F scale. 
Owen Gleiberman of Variety praised the film 's performances, stating "It 's Mildred 's glowering refusal to back down that defines her, and McDormand brilliantly spotlights the conflicted humanity beneath the stony façade, '' and called Rockwell 's performance a "revelation. '' Steve Pond, writing for TheWrap, praised McDonagh 's writing, calling it "Very funny, very violent and surprisingly moving. '' Divesh Mirchandani from C'est le Cinema hailed the film for its screenplay, rating the film 4 / 5. Conversely, others have criticized the film 's script. In her review for The New York Times, Manohla Dargis wrote "(McDonagh 's) jokes can be uninterestingly glib with tiny, bloodless pricks that are less about challenging the audience than about obscuring the material 's clichés and overriding theatricality. '' In a column for The Daily Beast, blogger Ira Madison III called the film 's treatment of Rockwell 's character "altogether offensive, '' arguing that the character 's described racism was handled insensitively, and writing "McDonagh 's attempts to script the black experience in America are often fumbling and backward and full of outdated tropes. '' A scathing critique by Tim Parks in The New Yorker asked "what does it (this film) tell us about the U.S.A.? '' After praising the "magnificently photographed images '', Parks ridiculed the plot as containing "a thousand cheap coincidences ''. "Caricatures, conflict, and political correctness mesh, with the greatest of ease ''. Parks ' theme is that "below the surface of this narrative, a deeper conflict is being waged: the fight of the liberal intelligentsia against the redneck, racist Trump voters of Missouri ''. At the 75th Golden Globe Awards, Three Billboards Outside Ebbing, Missouri won for Best Motion Picture -- Drama, Best Actress -- Drama (McDormand), Best Supporting Actor (Rockwell), and Best Screenplay, and was nominated for Best Director and Best Original Score. 
At the 71st British Academy Film Awards, it received nine nominations, including Best Film, Best Director, Best Actress in a Leading Role (McDormand), and Best Actor in a Supporting Role for both Rockwell and Harrelson. It won five awards, including Best Film and Outstanding British Film (making it and The King 's Speech the only films to have won both awards since the latter category was reintroduced in 1992), while McDormand and Rockwell won the Lead Actress and Supporting Actor awards respectively. It was nominated for four awards at the 24th Screen Actors Guild Awards, winning three, including Outstanding Performance by a Cast in a Motion Picture. At the 90th Academy Awards it received seven nominations, including Best Picture, Best Actress for Frances McDormand, Best Original Screenplay for Martin McDonagh and two Best Supporting Actor nominations, for Sam Rockwell and Woody Harrelson. McDormand and Rockwell took home their respective awards. It was named one of the top 10 films of the year by the American Film Institute. At the 2017 Toronto International Film Festival, the film won its top prize, the People 's Choice Award. At the 2017 San Sebastián International Film Festival, it won the Audience Award. On February 15, 2018, Justice4Grenfell, an advocacy group created in response to the Grenfell Tower fire, hired three vans with electronic screens in a protest against perceived inaction in response to the fire. The vans were driven around London, and displayed messages in the style of the billboards in the film: ' 71 Dead ', ' And Still No Arrests? ', ' How Come? ' In response to the Stoneman Douglas High School shooting that took place on February 14, 2018, in Parkland, Florida, activist group Avaaz had three vans circle Florida senator Marco Rubio 's offices displaying ' Slaughtered in School ', ' And Still No Gun Control? ', ' How Come, Marco Rubio? 
' On the night of February 15, 2018, the movement # OccupyJustice set up three billboards and a number of banners in Malta, marking the four - month anniversary of the murder of the journalist Daphne Caruana Galizia. The billboards bore the text ' A Journalist Killed. No Justice. ', ' A Country Robbed. No Justice. ', and ' No Resignations. No Justice. ' The authorities removed the billboards the following day, stating that they were illegal. The government was criticized for this move, and a day after their removal, activists laid down banners with similar text near Auberge de Castille, the Office of the Prime Minister. Outside the Bristol city centre on February 3, 2018, a mural was erected depicting three billboards reading ' Our NHS is dying ', ' And still no more funding ', and ' How come, Mrs May '. It was installed by the groups People 's Republic of Stokes Croft and Protect Our NHS in response to the alleged privatization of the National Health Service (NHS) and the death of a 15 - year - old girl attributed by some to a purported lack of resources by the NHS. On February 22, 2018, the Union of Medical Care and Relief Organizations, protesting the inaction of the UN 's role within the Syrian Civil War, set up three billboards outside the United Nations building in New York that read ' 500,000 Dead in Syria ', ' And still no action? ', and ' How come, Security Council '. On or around March 1, around the time of the 2018 Oscars, three billboards were taken out in Los Angeles, stating "WE ALL KNEW AND STILL NO ARRESTS '', "AND THE OSCAR FOR BIGGEST PEDOPHILE GOES TO... '' and "NAME NAMES ON STAGE OR SHUT THE HELL UP! '', as an attempt to protest both the Oscars and the # MeToo movement. On March 8, on International Women 's Day, three billboards were put in downtown Pristina, Kosovo, to protest the death of two women as a result of domestic violence. 
Both McDormand and McDonagh have responded positively to the protests, with McDonagh saying "You could n't ask for anything more than for an angry film to be adopted by protests, '' and McDormand saying she is "thrilled that activists all over the world have been inspired by the set decoration of the three billboards in Martin 's film. '' On March 24, 2018, signs inspired by Three Billboards appeared at March for Our Lives gun safety rallies across the US and around the world.
when did israel adopt the star of david
Star of David - wikipedia The Star of David (✡), known in Hebrew as the Shield of David or Magen David (Hebrew מָגֵן דָּוִד; Biblical Hebrew Māḡēn Dāwīḏ (maːˈɣeːn daːˈwiːð), Tiberian (mɔˈɣen dɔˈvið), Modern Hebrew (maˈɡen daˈvid), Ashkenazi Hebrew and Yiddish Mogein Dovid (ˈmɔɡeɪn ˈdɔvid) or Mogen Dovid), is a generally recognized symbol of modern Jewish identity and Judaism. Its shape is that of a hexagram, the compound of two equilateral triangles. Unlike the menorah, the Lion of Judah, the shofar and the lulav, the Star of David was never a uniquely Jewish symbol, although it had been used in that way as a printer 's colophon since the sixteenth century. During the 19th century the symbol began to proliferate among the Jewish communities of Eastern Europe, ultimately being used among the Jewish communities in the Pale of Settlement. A significant motivating factor, according to scholar Gershom Scholem, was the desire to represent Jewish religion and / or identity in the same manner the Christian cross identified that religion 's believers. The earliest Jewish usage of the symbol was inherited from medieval Arabic literature by Kabbalists for use in talismanic protective amulets (segulot), where it was known as a Seal of Solomon. The symbol was also used in Christian churches as a decorative motif many centuries before its first known use in a Jewish synagogue. Before the 19th century, official use in Jewish communities was generally known only in the region of today 's Czech Republic, Austria and possibly parts of Southern Germany, having begun in medieval Prague. The symbol became representative of the worldwide Zionist community, and later the broader Jewish community, after it was chosen as the central symbol on a flag at the First Zionist Congress in 1897. The identification of the term "Star of David '' or "Shield of David '' with the hexagram shape dates to the 17th century. 
The term "Shield of David '' is also used in the Siddur (Jewish prayer book) as a title of the God of Israel. The hexagram does appear occasionally in Jewish contexts since antiquity, apparently as a decorative motif. For example, in Israel, there is a stone bearing a hexagram from the arch of a 3rd -- 4th century synagogue in the Galilee. Originally, the hexagram may have been employed as an architectural ornament on synagogues, as it is, for example, on the cathedrals of Brandenburg and Stendal, and on the Marktkirche at Hanover. A pentagram in this form is found on the ancient synagogue at Tell Hum. In the synagogues, perhaps, it was associated with the mezuzah. The use of the hexagram in a Jewish context as a possibly meaningful symbol may occur as early as the 11th century, in the decoration of the carpet page of the famous Tanakh manuscript, the Leningrad Codex dated 1008. Similarly, the symbol illuminates a medieval Tanakh manuscript dated 1307 belonging to Rabbi Yosef bar Yehuda ben Marvas from Toledo, Spain. A Siddur dated 1512 from Prague displays a large hexagram on the cover with the phrase, "He will merit to bestow a bountiful gift on anyone who grasps the Shield of David. '' A hexagram has been noted on a Jewish tombstone in Taranto, Apulia in Southern Italy, which may date as early as the third century CE. The Jews of Apulia were noted for their scholarship in Kabbalah, which has been connected to the use of the Star of David. Medieval Kabbalistic grimoires show hexagrams among the tables of segulot, but without identifying them as "Shield of David ''. In the Renaissance Period, in the 16th - century Land of Israel, the book Ets Khayim conveys the Kabbalah of Ha - Ari (Rabbi Isaac Luria) who arranges the traditional items on the seder plate for Passover into two triangles, where they explicitly correspond to Jewish mystical concepts. 
The six sfirot of the masculine Zer Anpin correspond to the six items on the seder plate, while the seventh sfira being the feminine Malkhut corresponds to the plate itself. However, these seder - plate triangles are parallel, one above the other, and do not actually form a hexagram. According to G.S. Oegema (1996) Isaac Luria provided the hexagram with a further mystical meaning. In his book Etz Chayim he teaches that the elements of the plate for the Seder evening have to be placed in the order of the hexagram: above the three sefirot "Crown '', "Wisdom '', and "Insight '', below the other seven. Similarly, M. Costa wrote that M. Gudemann and other researchers in the 1920s claimed that Isaac Luria was influential in turning the Star of David into a national Jewish emblem by teaching that the elements of the plate for the Seder evening have to be placed in the order of the hexagram. Gershom Scholem (1990) disagrees with this view, arguing that Isaac Luria talked about parallel triangles one beneath the other and not about the hexagram. The Star of David at least since the 20th century remains associated with the number seven and thus with the Menorah, and popular accounts associate it with the six directions of space plus the center (under the influence of the description of space found in the Sefer Yetsira: Up, Down, East, West, South, North, and Center), or the Six Sefirot of the Male (Zeir Anpin) united with the Seventh Sefirot of the Female (Nukva). Some say that one triangle represents the ruling tribe of Judah and the other the former ruling tribe of Benjamin. It is also seen as a dalet and yud, the two letters assigned to Judah. There are 12 Vav, or "men, '' representing the 12 tribes or patriarchs of Israel. 
In 1354, King of Bohemia Charles IV prescribed for the Jews of Prague a red flag with both David 's shield and Solomon 's seal, while the red flag with which the Jews met King Matthias of Hungary in the 15th century showed two pentagrams with two golden stars. In 1460, the Jews of Ofen (Budapest, Hungary) received King Matthias Corvinus with a red flag on which were two Shields of David and two stars. In the first Hebrew prayer book, printed in Prague in 1512, a large hexagram appears on the cover. In the colophon is written: "Each man beneath his flag according to the house of their fathers... and he will merit to bestow a bountiful gift on anyone who grasps the Shield of David. '' In 1592, Mordechai Maizel was allowed to affix "a flag of King David, similar to that located on the Main Synagogue '' on his synagogue in Prague. Following the Battle of Prague (1648), the Jews of Prague were again granted a flag, in recognition in their contribution to the city 's defense. That flag showed a yellow hexagram on a red background, with a star placed in the center of the hexagram. The symbol became representative of the worldwide Zionist community, and later the broader Jewish community, after it was chosen to represent the First Zionist Congress in 1897. A year before the congress, Herzl had written in his 1896 Der Judenstaat: We have no flag, and we need one. If we desire to lead many men, we must raise a symbol above their heads. I would suggest a white flag, with seven golden stars. The white field symbolizes our pure new life; the stars are the seven golden hours of our working - day. For we shall march into the Promised Land carrying the badge of honor. 
David Wolffsohn (1856 -- 1914), a businessman prominent in the early Zionist movement, aware that the nascent movement had no official flag and that the design proposed by Theodor Herzl was gaining no significant support, wrote: At the behest of our leader Herzl, I came to Basle to make preparations for the Zionist Congress. Among many other problems that occupied me then was one that contained something of the essence of the Jewish problem. What flag would we hang in the Congress Hall? Then an idea struck me. We have a flag -- and it is blue and white. The talith (prayer shawl) with which we wrap ourselves when we pray: that is our symbol. Let us take this Talith from its bag and unroll it before the eyes of Israel and the eyes of all nations. So I ordered a blue and white flag with the Shield of David painted upon it. That is how the national flag, that flew over Congress Hall, came into being. In the early 20th century, the symbol began to be used to express Jewish affiliations in sports. Hakoah Vienna was a Jewish sports club founded in Vienna, Austria, in 1909 whose teams competed with the Star of David on the chest of their uniforms, and won the 1925 Austrian League soccer championship. Similarly, the Philadelphia Sphas basketball team (whose name was an acronym of its founding South Philadelphia Hebrew Association) wore a large Star of David on their jerseys to proudly proclaim their Jewish identity, as they competed in the first half of the 20th century. In boxing, Benny "the Ghetto Wizard '' Leonard (who said he felt as though he was fighting for all Jews) fought with a Star of David embroidered on his trunks in the 1910s. World heavyweight boxing champion Max Baer fought with a Star of David on his trunks as well, notably wearing it for the first time as he knocked out Nazi Germany 's hero Max Schmeling in 1933; Hitler never permitted Schmeling to fight a Jew again.
A Star of David, often yellow, was used by the Nazis during the Holocaust to identify Jews. After the German invasion of Poland in 1939, there initially were different local decrees forcing Jews to wear distinct signs (e.g. in the General Government, a white armband with a blue Star of David; in the Warthegau, a yellow badge, in the form of a Star of David, on the left breast and on the back). If a Jew was found in public without the star, he could be severely punished. The requirement to wear the Star of David with the word Jude (German for Jew) was then extended to all Jews over the age of six in the Reich and in the Protectorate of Bohemia and Moravia (by a decree issued on September 1, 1941 and signed by Reinhard Heydrich) and was gradually introduced in other Nazi - occupied areas. Others, however, wore the Star of David as a symbol of defiance against Nazi antisemitism, as in the case of United States Army private Hal Baumgarten, who wore a Star of David emblazoned on his back during the 1944 invasion of Normandy. The flag of Israel, depicting a blue Star of David on a white background between two horizontal blue stripes, was adopted on October 28, 1948, five months after the country 's establishment. The origins of the flag 's design date from the First Zionist Congress in 1897; the flag has subsequently been known as the "flag of Zion ''. Many Modern Orthodox synagogues, and many synagogues of other Jewish movements, have the Israeli flag with the Star of David prominently displayed at the front of the synagogues near the Ark containing the Torah scrolls. Magen David Adom (MDA) ("Red Star of David '' or, translated literally, "Red Shield of David '') is Israel 's only official emergency medical, disaster, and ambulance service. It has been an official member of the International Committee of the Red Cross since June 2006.
According to the Israel Ministry of Foreign Affairs, Magen David Adom was boycotted by the International Committee of the Red Cross, which refused to grant the organization membership because "it was (...) argued that having an emblem used by only one country was contrary to the principles of universality. '' Other commentators said the ICRC did not recognize the medical and humanitarian use of this Jewish symbol, a Red Shield, alongside the Christian cross and the Muslim crescent. Since 1948, the Star of David has carried the dual significance of representing both the state of Israel, and Jewish identity in general. In the United States especially, it continues to be used in the latter sense by a number of athletes. In baseball, Jewish major leaguer Gabe Kapler had a Star of David tattooed on his left calf in 2000, with the words "strong - willed '' and "strong - minded ''; major leaguer Mike "SuperJew '' Epstein drew a Star of David on his baseball glove; and major leaguer Ron Blomberg had a Star of David emblazoned on the knob of his bat, which is on display at the Baseball Hall of Fame. NBA basketball star Amar'e Stoudemire, who says he is spiritually and culturally Jewish, had a Star of David tattoo put on his left hand in 2010. NFL football defensive end Igor Olshansky has Star of David tattoos on each side of his neck, near his shoulders. Israeli golfer Laetitia Beck displays a blue - and - white magen david symbol on her golf apparel. In boxing, Jewish light heavyweight world champion Mike "The Jewish Bomber '' Rossman fought with a Star of David embroidered on his boxing trunks, and also has a blue Star of David tattoo on the outside of his right calf.
Other boxers who fought with Stars of David embroidered on their trunks include world lightweight champion, world light heavyweight boxing champion Battling Levinsky, Barney Ross (world champion as a lightweight, as a junior welterweight, and as a welterweight), world flyweight boxing champion Victor "Young '' Peres, world bantamweight champion Alphonse Halimi, and more recently World Boxing Association super welterweight champion Yuri Foreman, light welterweight champion Cletus Seldin, and light middleweight Boyd Melson. Welterweight Zachary "Kid Yamaka '' Wohlman has a tattoo of a Star of David across his stomach, and welterweight Dmitriy Salita even boxes under the nickname "Star of David ''. Maccabi clubs still use the Star of David in their emblems. The Jewish Encyclopedia cites a 12th - century Karaite document as the earliest Jewish literary source to mention a symbol called "Magen Dawid '' (without specifying its shape). The name ' Shield of David ' was used by at least the 11th century as a title of the God of Israel, independent of the use of the symbol. The phrase occurs independently as a Divine title in the Siddur, the traditional Jewish prayer book, where it poetically refers to the Divine protection of ancient King David and the anticipated restoration of his dynastic house, perhaps based on Psalm 18, which is attributed to David, and in which God is compared to a shield (v. 31 and v. 36). The term occurs at the end of the "Samkhaynu / Gladden us '' blessing, which is recited after the reading of the Haftara portion on Saturday and holidays. The earliest known text related to Judaism which mentions a sign called the "Shield of David '' is Eshkol Ha - Kofer by the Karaite Judah Hadassi, in the mid-12th century CE: Seven names of angels precede the mezuzah: Michael, Gabriel, etc.... Tetragrammaton protect you! And likewise the sign, called the "Shield of David '', is placed beside the name of each angel.
This book is of Karaite, not Rabbinic Jewish, origin, and it does not describe the shape of the sign in any way. Gallery captions: Star in the Schneider Synagogue, Istanbul; Star in the Ari Ashkenazi Synagogue, Safed; the Magen David Adom emblem; a synagogue in Karlsruhe, Germany, with the outline of a Star of David; a recruitment poster published in American Jewish magazines during WWI ("Daughter of Zion (representing the Jewish people): Your Old New Land must have you! Join the Jewish regiment. ''); roundel displayed on Israeli Air Force aircraft, 1948 - today; stained glass Star of David; USVA headstone emblems 3 and 44; Morocco fly mask embroidery.
what's the record for most points scored in nba game
NBA Regular season records - wikipedia This article lists all - time records achieved in the NBA regular season in major statistical categories recognized by the league, including those set by teams and individuals in a game, season, and career. The NBA also recognizes records from its original incarnation, the Basketball Association of America (BAA). In 2006, the NBA introduced age requirement restrictions. Prospective high school players must wait a year before entering the NBA, making age - related records harder to break. Note: Other than the longest game and disqualifications in a game, all records in this section are from the 1954 -- 55 season onward, when the 24 - second shot clock was instituted. * This award has only been given since the 1968 -- 69 season. * * This award has only been given since the 1982 -- 83 season.
when will the oscar nominations for 2018 be announced
90th Academy Awards - Wikipedia The 90th Academy Awards ceremony, presented by the Academy of Motion Picture Arts and Sciences (AMPAS), honored the best films of 2017 and took place at the Dolby Theatre in Hollywood, Los Angeles, California. The ceremony was held on March 4, 2018 rather than its usual late - February date to avoid conflicting with the 2018 Winter Olympics. During the ceremony, AMPAS presented Academy Awards (commonly referred to as Oscars) in 24 categories. The ceremony was televised in the United States by the American Broadcasting Company (ABC), produced by Michael De Luca and Jennifer Todd and directed by Glenn Weiss. Comedian Jimmy Kimmel hosted for the second consecutive year, making him the first person to host back - to - back ceremonies since Billy Crystal in 1997 and 1998. In related events, the Academy held its 9th Annual Governors Awards ceremony at the Grand Ballroom of the Hollywood and Highland Center on November 11, 2017. On February 10, 2018, in a ceremony at the Beverly Wilshire Hotel in Beverly Hills, California, the Academy Scientific and Technical Awards were presented by host actor Sir Patrick Stewart. The Shape of Water won a leading four awards, including Best Picture and Best Director for Guillermo del Toro. Dunkirk won three awards; Blade Runner 2049, Coco, Darkest Hour, and Three Billboards Outside Ebbing, Missouri won two awards each. Call Me by Your Name, Dear Basketball, A Fantastic Woman, Get Out, Heaven Is a Traffic Jam on the 405, I, Tonya, Icarus, Phantom Thread, and The Silent Child received one each. With a U.S. viewership of 26.5 million, it was the least - watched show in the Academy 's history. The nominees for the 90th Academy Awards were announced on January 23, 2018, at 5: 22 a.m. PST (13: 22 UTC), at the Samuel Goldwyn Theater in Beverly Hills, California, via global live stream, by the Academy and actors Tiffany Haddish and Andy Serkis.
The Shape of Water led all nominees with thirteen nominations; Dunkirk came in second with eight, and Three Billboards Outside Ebbing, Missouri came in third with seven. Winners are listed first, highlighted in boldface, and indicated with a double dagger (). The Academy held its ninth annual Governors Awards ceremony on November 11, 2017, during which the following awards were presented: The following individuals presented awards or performed musical numbers. Despite the mixed reception received by the preceding year 's ceremony, the Academy rehired Michael De Luca and Jennifer Todd as producers for the second consecutive year. In May 2017, it was announced that Jimmy Kimmel would return as host for a second consecutive year. Kimmel expressed that he was thrilled to be selected to MC the gala again, commenting, "Hosting the Oscars was a highlight of my career and I am grateful to Cheryl (Boone Isaacs), Dawn (Hudson), and the Academy for asking me to return to work with two of my favorite people, Mike De Luca and Jennifer Todd. If you think we screwed up the ending this year, wait until you see what we have planned for the 90th anniversary show! '' Kimmel promoted the ceremony extensively, shooting several promos and holding discussions on his talk show. On December 4, 2017, it was announced that the timing of the ceremony and its pre-show had been changed and both would be scheduled to broadcast a half - hour earlier than prior telecasts. In the first half of the nominations announcement, pre-taped category introductions were included that featured actresses Priyanka Chopra, Rosario Dawson, Gal Gadot, Salma Hayek, Michelle Rodriguez, Zoe Saldana, Molly Shannon, Rebel Wilson and Michelle Yeoh.
As per the tradition of the Academy, the previous year 's Best Actor winner usually presents the Best Actress award at the next year 's ceremony; however, the previous year 's Best Actor winner Casey Affleck reportedly decided not to attend the ceremony due to the sexual harassment allegations against him. Jodie Foster and Jennifer Lawrence presented the award together in place of Affleck. The Best Actor award was presented by Jane Fonda and Helen Mirren. Warren Beatty and Faye Dunaway returned to present the Best Picture award for the second year in a row, after the previous year 's announcement error. For the sixth year in a row, Derek McLane designed the stage, which incorporated forty - five million Swarovski crystals. At the time of the nominations announcement on January 23, 2018, the combined gross of the nine Best Picture nominees at the North American box offices was $568.2 million, with an average of $63.1 million per film (although Dunkirk and Get Out were the only films with a gross above $46 million). When the nominations were announced, Dunkirk was the highest - grossing film among the Best Picture nominees with $188 million in domestic box office receipts. Get Out was the second - highest - grossing film with $175.6 million, followed by The Post ($45.7 million), Darkest Hour ($41 million), Lady Bird ($39.1 million), Three Billboards Outside Ebbing, Missouri ($32.2 million), The Shape of Water ($30.4 million), Call Me by Your Name ($9.1 million), and Phantom Thread ($6.3 million). From the date of the announcements to the time of the ceremony on March 4, 2018, the total made by the Best Picture nominees at the North American box offices was $126.7 million, with an average of $14.1 million per film.
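The nomination - day figures above can be cross-checked with a few lines of Python. A minimal sketch (per - film grosses as listed in this section; the sum comes out slightly under the stated $568.2 million combined total because each listed gross is rounded to the nearest $0.1 million):

```python
# Domestic grosses (in millions of US dollars) of the nine Best Picture
# nominees at the time of the nominations announcement, as listed above.
grosses = {
    "Dunkirk": 188.0,
    "Get Out": 175.6,
    "The Post": 45.7,
    "Darkest Hour": 41.0,
    "Lady Bird": 39.1,
    "Three Billboards Outside Ebbing, Missouri": 32.2,
    "The Shape of Water": 30.4,
    "Call Me by Your Name": 9.1,
    "Phantom Thread": 6.3,
}

combined = sum(grosses.values())   # ~567.4, vs. the stated 568.2
average = combined / len(grosses)  # ~63.0, vs. the stated 63.1
print(f"combined: ${combined:.1f}M, average: ${average:.1f}M per film")
```

The roughly $0.8 million discrepancy against the stated combined total is consistent with each per - film figure having been rounded before being listed.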
The Post ($34.6 million) and The Shape of Water ($27 million) posted the highest grosses during that frame, followed by Three Billboards Outside Ebbing, Missouri ($19.8 million), Darkest Hour ($14.5 million), Phantom Thread ($13.8 million), Lady Bird ($9.2 million), Call Me by Your Name ($7.5 million) and Get Out ($353,795 from a one - week re-release). Thirty - six nominations went to 15 films on the list of the top 50 grossing movies of the year. Of those 15 films, only Coco (12th), Logan (15th), Dunkirk (16th), Get Out (18th), The Boss Baby (19th), and Ferdinand (35th) were nominated for Best Picture, Best Animated Feature or any of the directing, acting or screenwriting awards. The other top 50 box - office hits that earned nominations were Star Wars: The Last Jedi (1st), Beauty and the Beast (2nd), Guardians of the Galaxy Vol. 2 (8th), Kong: Skull Island (17th), War for the Planet of the Apes (20th), Wonder (33rd), The Greatest Showman (29th), Baby Driver (36th), and Blade Runner 2049 (41st). Right after her win, actress Frances McDormand 's Oscar was briefly stolen for fifteen minutes at the Governors Ball by a man named Terry Bryant, who had a ticket to the after - party. Bryant filmed himself with the statuette and reportedly told other guests he was a winner, before being apprehended by chef Wolfgang Puck 's photographer, who did not recognize Bryant as a winner and retrieved the statuette, returning it to the actress. The Academy said in a statement, "Best Actress winner Frances McDormand and her Oscar were happily reunited after a brief separation at last night 's Governors Ball. The alleged thief was quickly apprehended by a photographer and members of our fast - acting Academy and security teams. '' Despite McDormand 's consent to let Bryant go, he was arrested by the LAPD and charged with grand theft, but was released without bail following Wednesday 's hearing after the judge ruled that "he did not pose a flight risk.
'' He appeared in court on March 28, 2018, where his hearing was rescheduled for May 1, 2018. The show received a mixed reception from media publications. On the review aggregator website Rotten Tomatoes, the show holds an approval rating of 46 % based on 28 reviews, and its consensus reads, "The 90th Academy Awards played it safe and hit no major snags -- but by clocking in at over four hours, wore out its welcome long before the surprise ending. '' Hank Stuever of The Washington Post remarked, "In his second year, Kimmel has shown that the telecast need n't be anything but sharp and sure, with a funny host whose bits are manageable, shareable and -- best of all -- forgotten. We 're not making showbiz history here; we 're just trying to get through another Oscar night. '' Chief critic David Edelstein of Vulture wrote, "This was the best, most inspiring, and most sheerly likable Academy Awards telecast I 've ever seen... It was also -- in terms of the actual awards -- among the most disappointing. '' Vanity Fair 's Richard Lawson wrote, "As a host, Kimmel struck a careful, appropriately measured tone... All told, Sunday 's ceremony did an admirable job of recognizing all the turmoil surrounding it while maintaining the silly, chintzy trappings that so many of us tune into the Oscars for. '' CNN 's Brian Lowry quipped, "The Oscars are a big, unwieldy beast, which invariably try to serve too many masters. Yet if the intent was ultimately to maintain a celebratory tone without ignoring either the outside world or the elephant in the room throughout this year 's awards, host Jimmy Kimmel and the show itself largely succeeded. '' Others were more critical of the show. Television critic Maureen Ryan of Variety said, "All things considered, the show had a more or less low - key vibe.
Normally it takes about two hours for the numbing effect to set in, but despite host Jimmy Kimmel 's best efforts, Sunday 's telecast started to feel a bit languid and low - energy far earlier. '' Television critic James Poniewozik of The New York Times said, "despite the recent upheaval in Hollywood, the ceremony at large still focused mainly on celebration and glitter, literally in the case of the blinding set, which looked as if the ceremony were encased in an enormous geode. There 's also the perennial problem of bloat. The hitch, of course, is that every part of the show has its constituency. '' Darren Franich of Entertainment Weekly wrote, "What fun we had at this year 's Oscars! Long show, sure, but where to cut it? '' Writing for Deadline, Greg Evans said, "Did the nearly four - hour running time contain any moments for the Oscar ages? Probably not. '' David Wiegand of the San Francisco Chronicle said, "Even the hope that the noise of clapping might keep the audience at home and in the theater awake, there was little of that for anything except the entrance of actors of advance age. '' The Oregonian columnist Kristi Turnquist wrote, "Was it respectful? Absolutely. Did it make for kind of a dull, earnest Oscars show? Yeah, kind of. '' Attaining 26.5 million U.S. viewers according to Nielsen ratings, the ceremony 's telecast had a 16 - percent drop in viewership from the previous year 's ceremony and had the lowest U.S. viewership in Oscar history. On March 6, after the final ratings were confirmed, President Donald Trump took to his Twitter account, saying, "Lowest rated Oscars in HISTORY. Problem is, we do n't have stars anymore -- except your President (just kidding, of course)! ''. In response, Kimmel tweeted, "Thanks, lowest rated President in HISTORY. '' The annual In Memoriam segment was introduced by Jennifer Garner, with Eddie Vedder performing a rendition of Tom Petty 's song "Room at the Top ''.
The segment paid tribute to forty - four artists in the montage. On the Academy 's website there is a gallery focusing on several other artists who were not included in the segment.
where is disney land located in united states
Disneyland - Wikipedia Disneyland Park, originally Disneyland, is the first of two theme parks built at the Disneyland Resort in Anaheim, California; it opened on July 17, 1955. It is the only theme park designed and built under the direct supervision of Walt Disney. It was originally the only attraction on the property; its official name was changed to Disneyland Park to distinguish it from the expanding complex in the 1990s. Walt Disney came up with the concept of Disneyland after visiting various amusement parks with his daughters in the 1930s and 1940s. He initially envisioned building a tourist attraction adjacent to his studios in Burbank to entertain fans who wished to visit; however, he soon realized that the proposed site was too small. After hiring a consultant to help him determine an appropriate site for his project, Disney bought a 160 - acre (65 ha) site near Anaheim in 1953. Construction began in 1954 and the park was unveiled during a special televised press event on the ABC Television Network on July 17, 1955. Since its opening, Disneyland has undergone a number of expansions and major renovations, including the addition of New Orleans Square in 1966, Bear Country (now Critter Country) in 1972, Mickey 's Toontown in 1993, and the forthcoming Star Wars: Galaxy 's Edge in 2019. Opened in 2001, Disney California Adventure Park was built on the site of Disneyland 's original parking lot. Disneyland has a larger cumulative attendance than any other theme park in the world, with over 650 million guests since it opened. In 2013, the park hosted approximately 16.2 million guests, making it the third most visited park in the world that calendar year. According to a March 2005 Disney report, 65,700 jobs are supported by the Disneyland Resort, including about 20,000 direct Disney employees and 3,800 third - party employees (independent contractors or their employees). To all who come to this happy place: Welcome. Disneyland is your land.
Here age relives fond memories of the past, and here youth may savor the challenge and promise of the future. Disneyland is dedicated to the ideals, the dreams, and the hard facts that have created America, with the hope that it will be a source of joy and inspiration to all the world. The concept for Disneyland began when Walt Disney was visiting Griffith Park in Los Angeles with his daughters Diane and Sharon. While watching them ride the merry - go - round, he came up with the idea of a place where adults and their children could go and have fun together, though his dream lay dormant for many years. He may have also been influenced by his father 's memories of the World 's Columbian Exposition of 1893 in Chicago (his father worked at the Exposition). The Midway Plaisance there included a set of attractions representing various countries from around the world and others representing various periods of man; it also included many rides, including the first Ferris wheel, a "sky '' ride, a passenger train that circled the perimeter, and a Wild West Show. Another likely influence was Benton Harbor, Michigan 's nationally famous House of David 's Eden Springs Park. Disney visited the park and ultimately bought one of the older miniature trains originally used there; the colony had the largest miniature railway setup in the world at the time. The earliest documented draft of Disney 's plans was sent as a memo to studio production designer Dick Kelsey on August 31, 1948, where it was referred to as a "Mickey Mouse Park '', based on notes Walt made during his and Ward Kimball 's trip to the Chicago Railroad Fair the same month, with a two - day stop at Henry Ford 's Museum and Greenfield Village, a place with attractions like a Main Street and steamboat rides, which he had visited eight years earlier.
While people wrote letters to Disney about visiting the Walt Disney Studios, he realized that a functional movie studio had little to offer visiting fans, and began to foster ideas of building a site near the Burbank studios for tourists to visit. His ideas evolved to a small play park with a boat ride and other themed areas. The initial concept, the Mickey Mouse Park, started with an 8 - acre (3.2 ha) plot across Riverside Drive. He started to visit other parks for inspiration and ideas, including Tivoli Gardens in Denmark, Efteling in the Netherlands, and Greenfield Village, Playland, and Children 's Fairyland in the United States, as well as (according to the film director Ken Annakin, in his autobiography So You Want to Be a Film Director?) Bekonscot Model Village & Railway in Beaconsfield, England. His designers began working on concepts, though the project grew much larger than the land could hold. Disney hired Harrison Price from Stanford Research Institute to gauge the proper area to locate the theme park based on the area 's potential growth. Based on Price 's analysis (for which he would be recognized as a Disney Legend in 2003), Disney acquired 160 acres (65 ha) of orange groves and walnut trees in Anaheim, southeast of Los Angeles in neighboring Orange County. The Burbank site originally considered by Disney is now home to Walt Disney Animation Studios and ABC Studios. Difficulties in obtaining funding prompted Disney to investigate new methods of fundraising, and he decided to create a show named Disneyland. It was broadcast on the then - fledgling ABC. In return, the network agreed to help finance the park. For its first five years of operation, Disneyland was owned by Disneyland, Inc., which was jointly owned by Walt Disney Productions, Walt Disney, Western Publishing and ABC. In addition, Disney rented out many of the shops on Main Street, U.S.A. to outside companies.
By 1960, Walt Disney Productions had bought out all other shares; the partnership with ABC would eventually lead to The Walt Disney Company 's acquisition of ABC in the mid-1990s. In 1952, the proposed project had been called Disneylandia, but Disney followed ABC 's advice and changed it to Disneyland two years later, when excavation of the site began. Construction began on July 16, 1954 and cost $17 million to complete. The park was opened one year and one day later. U.S. Route 101 (later Interstate 5) was under construction at the same time just north of the site; in preparation for the traffic Disneyland was expected to bring, two more lanes were added to the freeway before the park was finished. Disneyland was dedicated at an "International Press Preview '' event held on Sunday, July 17, 1955, which was only open to invited guests and the media. Although 28,000 people attended the event, only about half of those were actual invitees, the rest having purchased counterfeit tickets or even sneaked into the park by climbing over the fence. The following day, it opened to the public, featuring twenty attractions. The special Sunday events, including the dedication, were televised nationwide and anchored by three of Walt Disney 's friends from Hollywood: Art Linkletter, Bob Cummings, and Ronald Reagan. ABC broadcast the event live, during which many guests tripped over the television camera cables. In Frontierland, a camera caught Cummings kissing a dancer. When Disney started to read the plaque for Tomorrowland, he read partway then stopped when a technician off - camera said something to him, and after realizing he was on - air, said, "I thought I got a signal '', and began the dedication from the start. At one point, while in Fantasyland, Linkletter tried to give coverage to Cummings, who was on the pirate ship. He was not ready, and tried to give the coverage back to Linkletter, who had lost his microphone.
Cummings then did a play - by - play of Linkletter trying to find it in front of Mr. Toad 's Wild Ride. Traffic was delayed on the two - lane Harbor Boulevard. Famous figures who were scheduled to show up every two hours showed up all at once. The temperature was an unusually high 101 ° F (38 ° C), and because of a local plumbers ' strike, Disney was given a choice of having working drinking fountains or running toilets. He chose the latter, leaving many drinking fountains dry. This generated negative publicity, since Pepsi sponsored the park 's opening; disappointed guests believed the inoperable fountains were a cynical way to sell soda. Vendors also ran out of food. The asphalt that had been poured that morning was soft enough to let women 's high - heeled shoes sink into it. Some parents threw their children over the crowd 's shoulders to get them onto rides, such as the King Arthur Carrousel. In later years, Disney and his 1955 executives referred to July 17, 1955, as "Black Sunday ''. After the extremely negative press from the preview opening, Walt Disney invited attendees back for a private "second day '' to experience Disneyland properly. The next day, the park 's official public opening day, crowds gathered in line as early as 2: 00 am. The first person to buy a ticket and enter the park was David MacPherson with ticket number 2, as Roy O. Disney had arranged to pre-purchase ticket number 1 from Curtis Lineberry, the manager of admissions. However, an official picture of Walt Disney and two children, Christine Vess Watkins (age 5) and Michael Schwartner (7), inaccurately identifies them as the first two guests of Disneyland. Both received lifetime passes to Disneyland that day, and MacPherson was awarded one shortly thereafter, which was later expanded to cover every Disney - owned park in the world. Approximately 50,000 guests attended the Monday opening day.
At the time, and during the lifetimes of Walt and Roy Disney, July 17 was considered merely a preview, with July 18 the official opening day. Since then, aided by memories of the television broadcast, the company has adopted July 17 as the official date, the one commemorated every year as Disneyland 's birthday. In September 1959, Soviet Premier Nikita Khrushchev spent thirteen days in the United States, with two requests: to visit Disneyland and to meet John Wayne, Hollywood 's top box - office draw. Due to Cold War tensions and security concerns, he was famously denied an excursion to Disneyland. The Shah of Iran and Empress Farah were invited to Disneyland by Walt Disney in the early 1960s. There was moderate controversy over the lack of African American employees. As late as 1963, civil rights activists were pressuring Disneyland to hire black people, with executives responding that they would "consider '' the requests. The park did, however, hire people of Asian descent, such as Ty Wong and Bob Kuwahara. As part of the Casa de Fritos operation at Disneyland, "Doritos '' (Spanish for "little golden things '') were created at the park to recycle old tortillas that would have been discarded. The Frito - Lay Company saw the popularity of the item and began selling them regionally in 1964, and then nationwide in 1966. In the late 1990s, work began to expand the one - park, one - hotel property. Disneyland Park, the Disneyland Hotel, the site of the original parking lot, and acquired surrounding properties were earmarked to become part of the Disneyland Resort. At that time, the property saw the addition of the Disney California Adventure theme park; a shopping, dining and entertainment complex named Downtown Disney; a remodeled Disneyland Hotel; the construction of Disney 's Grand Californian Hotel & Spa; and the acquisition and re-branding of the Pan Pacific Hotel as Disney 's Paradise Pier Hotel.
The park was renamed "Disneyland Park '' to distinguish it from the larger complex under construction. Because the existing parking lot (south of Disneyland) was repurposed by these projects, the six - level, 10,250 - space Mickey and Friends parking structure was constructed in the northwest corner. Upon completion in 2000, it was the largest parking structure in the United States. The park 's management team during the mid-1990s was a source of controversy among fans and employees. In an effort to boost profits, various changes were begun by then - executives Cynthia Harriss and Paul Pressler. While their initiatives provided a short - term increase in shareholder returns, they drew widespread criticism for their lack of foresight. The retail backgrounds of Harriss and Pressler led to a gradual shift in Disneyland 's focus from attractions to merchandising. Outside consultants McKinsey & Company were brought in to help streamline operations, resulting in many changes and cutbacks. After nearly a decade of deferred maintenance, the original park was showing signs of neglect. Fans of the park decried the perceived decline in customer value and park quality and rallied for the dismissal of the management team. Matt Ouimet, the former president of the Disney Cruise Line, was promoted to assume leadership of the Disneyland Resort in late 2003. Shortly afterward, he selected Greg Emmer as Senior Vice President of Operations. Emmer was a long - time Disney cast member who had worked at Disneyland in his youth prior to moving to Florida and held multiple executive leadership positions at the Walt Disney World Resort. Ouimet quickly set about reversing certain trends, especially concerning cosmetic maintenance and a return to the original infrastructure maintenance schedule, in hopes of restoring Disneyland 's former safety record.
Similarly to Disney himself, Ouimet and Emmer could often be seen walking the park during business hours with members of their respective staff, wearing cast member name badges, standing in line for attractions, and welcoming guests ' comments. In July 2006, Matt Ouimet left The Walt Disney Company to become president of Starwood Hotels & Resorts Worldwide. Soon after, Ed Grier, executive managing director of Walt Disney Attractions Japan, was named president of the resort. In October 2009, Grier announced his retirement, and was replaced by George Kalogridis. The "Happiest Homecoming on Earth '' was an eighteen - month - long celebration (held through 2005 and 2006) of the fiftieth anniversary of Disneyland Park, a milestone also celebrated throughout Disney parks worldwide. In 2004, the park underwent major renovations in preparation, restoring many classic attractions, notably Space Mountain, Jungle Cruise, the Haunted Mansion, Pirates of the Caribbean, and Walt Disney 's Enchanted Tiki Room. Attractions that had been in the park on opening day had one ride vehicle painted gold, and the park was decorated with fifty Golden Mickey Ears. The celebration started on May 5, 2005, and ended on September 30, 2006, and was followed by the "Year of a Million Dreams '' celebration, lasting twenty - seven months and ending on December 31, 2008. Beginning on January 1, 2010, Disney Parks hosted the Give a Day, Get a Disney Day volunteer program, in which Disney encouraged people to volunteer with a participating charity and receive a free Disney Day at either a Disneyland Resort or Walt Disney World park. On March 9, 2010, Disney announced that it had reached its goal of one million volunteers and closed the promotion to anyone who had not yet registered and signed up for a specific volunteer opportunity. In July 2015, Disneyland marked its 60th anniversary with the Diamond Celebration.
Disneyland Park introduced the Paint the Night parade and the Disneyland Forever fireworks show, and Sleeping Beauty Castle was decorated in diamonds with a large "60 '' logo. The Diamond Celebration concluded in September 2016, and the anniversary decorations were removed around Halloween 2016. Disneyland Park consists of eight themed "lands '' and a number of concealed backstage areas, and occupies approximately 85 acres (34 ha) (expanding to about 100 acres with Star Wars Land). The park opened with Main Street, U.S.A., Adventureland, Frontierland, Fantasyland, and Tomorrowland, and has since added New Orleans Square in 1966, Bear Country (now known as Critter Country) in 1972, and Mickey 's Toontown in 1993. In 1957, Holidayland opened to the public with a 9 acres (3.6 ha) recreation area including a circus and baseball diamond, but was closed in late 1961. It is often referred to as the "lost '' land of Disneyland. Throughout the park are ' Hidden Mickeys ', representations of Mickey Mouse heads inserted subtly into the design of attractions and environmental decor. An elevated berm supports the 3 ft (914 mm) narrow gauge Disneyland Railroad that circumnavigates the park. A new 14 - acre land to be constructed at the park, Star Wars Land, was announced on August 15, 2015, at the 2015 D23 Expo by Disney CEO Bob Iger.

The eight lands (with gallery captions):
Main Street, U.S.A. (2010)
Adventureland (themed for a 1950s view of adventure, capitalizing on the post-war Tiki craze)
Frontierland (Big Thunder Mountain Railroad in 2008)
New Orleans Square (the Haunted Mansion and Fantasmic! viewing area in 2010)
Critter Country (Splash Mountain in 2010)
Fantasyland (Peter Pan 's Flight and the Matterhorn Bobsleds)
Mickey 's Toontown (2010)
Tomorrowland (Space Mountain in 2009)

Main Street, U.S.A. is patterned after a typical Midwest town of the early 20th century.
It is a popular myth that Walt Disney derived inspiration from his boyhood town of Marceline, Missouri, but it was actually more closely based on Imagineer Harper Goff 's hometown of Fort Collins, Colorado. It is the first area guests see when they enter the park (if not entering by monorail), and is how guests reach Central Plaza. At the center of Disneyland, immediately north of Central Plaza, stands Sleeping Beauty Castle, which provides entrance to Fantasyland. "For those of us who remember the carefree time it recreates, Main Street will bring back happy memories. For younger visitors, it is an adventure in turning back the calendar to the days of grandfather 's youth, '' as Walt Disney put it. Main Street, U.S.A. is reminiscent of the Victorian period of America, with the train station, town square, movie theater, city hall, firehouse complete with a steam - powered pump engine, emporium, shops, arcades, double - decker bus, horse - drawn streetcar, jitneys and other bits of memorabilia. Main Street is also home to the Disney Art Gallery and the Opera House, which showcases Great Moments with Mr. Lincoln, a show featuring an Audio - Animatronic version of the president. There are many specialty stores on Main Street, including a candy store, a jewelry and watch shop, a silhouette station, a store that sells Disney collectible items created by various artists, and a hat shop where guests can create their own ear hat with personalized embroidery. At the far end of Main Street, U.S.A. are Sleeping Beauty Castle, the Partners statue, and the Central Plaza (also known as the Hub), which is a portal to most of the themed lands: the entrance to Fantasyland is by way of a drawbridge across a moat and through the castle. Adventureland, Frontierland, and Tomorrowland are arrayed on both sides of the castle. Several lands are not directly connected to the Central Plaza -- namely, New Orleans Square, Critter Country and Mickey 's Toontown. The design of Main Street, U.S.A.
uses the technique of forced perspective to create an illusion of height. Buildings along Main Street are built at 3⁄4 scale on the first level, then 5⁄8 on the second story, and 1⁄2 scale on the third -- reducing the scale by 1⁄8 each level up. Adventureland is designed to recreate the feel of an exotic tropical place in a far - off region of the world. "To create a land that would make this dream reality '', said Walt Disney, "we pictured ourselves far from civilization, in the remote jungles of Asia and Africa. '' Attractions include opening day 's Jungle Cruise, the Indiana Jones Adventure, and Tarzan 's Treehouse, which is a conversion of the Swiss Family Treehouse from the Walt Disney film Swiss Family Robinson. Walt Disney 's Enchanted Tiki Room, located at the entrance to Adventureland, was the first feature attraction to employ Audio - Animatronics, a computer synchronization of sound and robotics. New Orleans Square, based on 19th - century New Orleans, opened on July 24, 1966. It is very popular with Disneyland guests, as it is home to some of the park 's most popular attractions: Pirates of the Caribbean and the Haunted Mansion, with nighttime entertainment in Fantasmic!. This area is also the home of the famous Club 33. Frontierland recreates the setting of pioneer days along the American frontier. According to Walt Disney, "All of us have cause to be proud of our country 's history, shaped by the pioneering spirit of our forefathers. Our adventures are designed to give you the feeling of having lived, even for a short while, during our country 's pioneer days. '' Frontierland is home to the Pinewood Indians band of animatronic Native Americans, who live on the banks of the Rivers of America. Entertainment and attractions include Big Thunder Mountain Railroad, the Mark Twain Riverboat, the Sailing Ship Columbia, Pirate 's Lair on Tom Sawyer Island, and the Frontierland Shootin ' Exposition.
Frontierland is also home to the Golden Horseshoe Saloon, an Old West - style show palace, where the comedic troupe "Billy Hill and the Hillbillies '' entertains guests. Critter Country opened in 1972 as "Bear Country '', and was renamed in 1988. Formerly the area was home to the Indian Village, where indigenous tribespeople demonstrated their dances and other customs. Today, the main draw of the area is Splash Mountain, a log - flume journey inspired by the Uncle Remus stories of Joel Chandler Harris and the animated segments of Disney 's Academy Award - winning 1946 film, Song of the South. In 2003, a dark ride called The Many Adventures of Winnie the Pooh replaced the Country Bear Jamboree, which closed in 2001; the Country Bear show remains open in Walt Disney World 's Magic Kingdom. Fantasyland is the area of Disneyland of which Walt Disney said, "What youngster has not dreamed of flying with Peter Pan over moonlit London, or tumbling into Alice 's nonsensical Wonderland? In Fantasyland, these classic stories of everyone 's youth have become realities for youngsters -- of all ages -- to participate in. '' Fantasyland was originally styled in a medieval European fairground fashion, but its 1983 refurbishment turned it into a Bavarian village. Attractions include several dark rides, the King Arthur Carousel, and various family attractions. Fantasyland has the most fiber optics in the park; more than half of them are in Peter Pan 's Flight. Sleeping Beauty Castle features a walk - through telling of Briar Rose 's adventure as Sleeping Beauty. The walk - through opened in 1959, was redesigned in 1972, closed in 1992 -- both for security reasons and for the installation of pneumatic ram mortars for the "Believe, There 's Magic in the Stars '' fireworks show -- and reopened in 2008 with new renditions and methods of storytelling and the restored work of Eyvind Earle.
Mickey 's Toontown opened in 1993 and was partly inspired by the fictional Los Angeles suburb of Toontown in Touchstone Pictures ' 1988 release Who Framed Roger Rabbit. Mickey 's Toontown is based on a 1930s cartoon aesthetic and is home to Disney 's most popular cartoon characters. Toontown features two main attractions: Gadget 's Go Coaster and Roger Rabbit 's Car Toon Spin. The "city '' is also home to cartoon characters ' houses, such as those of Mickey Mouse, Minnie Mouse and Goofy, as well as Donald Duck 's boat. The 3 ft (914 mm) gauge Jolly Trolley can also be found in this area, though it closed as an attraction in 2003 and is now present only for display purposes. During the 1955 inauguration, Walt Disney dedicated Tomorrowland with these words: "Tomorrow can be a wonderful age. Our scientists today are opening the doors of the Space Age to achievements that will benefit our children and generations to come. The Tomorrowland attractions have been designed to give you an opportunity to participate in adventures that are a living blueprint of our future. '' Disneyland producer Ward Kimball had rocket scientists Wernher von Braun, Willy Ley, and Heinz Haber serve as technical consultants during the original design of Tomorrowland. Initial attractions included Rocket to the Moon, Astro - Jets and Autopia; later, the first incarnation of the Submarine Voyage was added. The area underwent a major transformation in 1967 to become New Tomorrowland, and then again in 1998 when its focus was changed to present a "retro - future '' theme reminiscent of the illustrations of Jules Verne. Current attractions include Space Mountain, Star Wars Launch Bay, Autopia, Jedi Training: Trials of the Temple, the Disneyland Monorail Tomorrowland Station, Astro Orbitor, and Buzz Lightyear Astro Blasters. Finding Nemo Submarine Voyage opened on June 11, 2007, resurrecting the original Submarine Voyage, which had closed in 1998.
Star Tours was closed in July 2010, and replaced with Star Tours -- The Adventures Continue in June 2011. At the D23 Expo in August 2015, Disney CEO Bob Iger announced the addition of a Star Wars Land. The 14 - acre land -- which will also be built at Disney 's Hollywood Studios -- will open in 2019 and include two new attractions: a Millennium Falcon - inspired attraction that will put guests in control of a "customized secret mission '', and a second attraction that places guests in "a climactic battle between the First Order and the resistance ''. On April 14, 2016, construction began north of Frontierland, on the site where Big Thunder Ranch was located, in addition to some backstage areas. It is scheduled to open in 2019. Disneyland originated many concepts which have become part of the corporate culture of Disney Parks as a whole, and which in turn spread to its other parks. Most importantly, Disneyland staff use theatrical terminology to emphasize that a visit to the park is intended to be similar to witnessing a performance. Visitors are referred to as "guests '' and park employees as "cast members ''. "On stage '' refers to any area of the resort that is open to guests. "Backstage '' refers to any area of the resort that is closed to guests. A crowd is referred to as an "audience ''. "Costume '' is the attire that cast members who perform the day - to - day operations of the park must wear. "Show '' is the resort 's presentation to its guests, such as the color and façades of buildings, the placement of rides and attractions, and costumes to match the themed lands. When signing credit card receipts, guests are asked for their "autograph ''. "Stage managers '' are responsible for overseeing the operation of the park. Cast members who are in charge of a specific team are called "leads '', as in a film or theater "lead role ''. In earlier years, the offices where administrative work took place were referred to as "production offices ''.
"Production schedulers '' build employee work schedules to meet the necessary workload, while "stage schedulers '' handle day - to - day changes in that work schedule (such as a change in park hours, necessitating a change in everybody 's shifts.) Each cast member 's job is called a "role ''. When working in their roles, cast members must follow a "script '', a code of conduct and approved, themed phraseology that cast members may use when at work. "No '' and "I do n't know '' are notably absent from scripts. Backstage areas are closed areas of attraction, store, and restaurant buildings, as well as outdoor service areas located behind such buildings. Although some areas of the park, particularly New Orleans Square, have underground operations and storage areas, there is no park - wide network of subterranean tunnels, such as Walt Disney World 's utilidors. There are several points of entry from outside the park to the backstage areas: Ball Gate (from Ball Road), T.D.A. Gate (adjacent to the Team Disney Anaheim building), Harbor Pointe (from Harbor Boulevard), and Winston Gate (from Disneyland Drive). Berm Road encircles the park from Firehouse Gate (behind the Main Street Fire Station) to Egghouse Gate (adjacent to the Disneyland Opera House). The road is so called because it generally follows outside the path of Disneyland 's berm. A stretch of the road, wedged between Tomorrowland and Harbor Boulevard, is called Schumacher Road. It has two narrow lanes and runs underneath the Monorail track. There are also two railroad bridges that cross Berm Road: one behind City Hall and the other behind Tomorrowland. Major buildings backstage include the Frank Gehry - designed Team Disney Anaheim, where most of the division 's administration currently works, as well as the Old Administration Building, behind Tomorrowland. The Old Administration Building additionally houses the Grand Canyon and Primeval World dioramas visible on the Disneyland Railroad. 
The northwest corner of the park is home to most of the park 's maintenance facilities: company vehicle services (including parking lot trams and Main Street vehicles); the scrap yard, where the resort 's garbage and recyclables are sorted for collection; Circle D Corral, where the resort 's horses and other animals are stabled; parade float storage and maintenance; the distribution center for all resort merchandise; ride vehicle service areas; the paint shop; and the sign shop. Backstage also refers to parts of show buildings that are normally not seen by guests. Backstage areas are generally off - limits to park guests. This prevents guests from seeing the industrial areas that violate the "magic '' of on - stage and keeps them safe from the potentially dangerous machinery. Cast members can also find some solace while they work or rest, as backstage offers alternate routes between the park 's various areas. Many attractions are housed in large, soundstage - like buildings, some of which are partially or completely disguised by external theming. Generally, these buildings are painted a dull green color in areas not seen by guests; this choice helps to disguise the buildings among the foliage and make them less visually obtrusive. Walt Disney Imagineering has termed this color "Go Away Green. '' Most of them have off - white flat roofs that support HVAC units and footpaths for cast members. Inside are the rides, as well as hidden walkways, service areas, control rooms, and other behind - the - scenes operations. Photography is forbidden in these areas, both inside and outside, although some photos have found their way to a variety of web sites. Guests who attempt to explore backstage are warned and often escorted from the property. The boundary between on and off - stage is demarcated at every access point. Everything within guest view when a door or gateway is open is also considered on stage. It is from this point that characters start playing their part.
That way, when the door is open, guests will not accidentally see a person out of character backstage. Various amenities exist for Cast Members backstage when they are on breaks, or before and after their scheduled shifts. A number of cafeterias, now run by SodexoMAGIC, offer discounted meals throughout the day. These include Inn Between (behind the Plaza Inn), Eat Ticket (near the Team Disney Anaheim building behind Mickey 's Toontown), and Westsider Grill (located approximately behind New Orleans Square). Partners Federal Credit Union, the credit union for employees of The Walt Disney Company, provides nearly 20 ATMs backstage for cast member use and maintains an express branch at the Team Disney Anaheim building. Walt Disney had a longtime interest in transportation, and trains in particular. Disney 's passion for the "iron horse '' led to him building a miniature live steam backyard railroad -- the "Carolwood Pacific Railroad '' -- on the grounds of his Holmby Hills estate. Throughout all the iterations of Disneyland during the seventeen or so years when Disney was conceiving it, one element remained constant: a train encircling the park. The primary designer for the park transportation vehicles was Bob Gurr who gave himself the title of Director of Special Vehicle Design in 1954. Encircling Disneyland and providing a grand circle tour is the Disneyland Railroad (DRR), a 3 ft (914 mm) narrow gauge short - line railway consisting of five oil - fired and steam - powered locomotives, in addition to three passenger trains and one passenger - carrying freight train. Originally known as the Disneyland and Santa Fe Railroad, the DRR was presented by the Atchison, Topeka and Santa Fe Railway until 1974. From 1955 to 1974, the Santa Fe Rail Pass was accepted in lieu of a Disneyland "D '' coupon. With a 3 ft (914 mm) gauge, the most common narrow track gauge used in North America, the track runs in a continuous loop around Disneyland through each of its realms. 
Each 1900s - era train departs Main Street Station on an excursion that includes scheduled station stops at New Orleans Square Station, Toontown Depot, and Tomorrowland Station. The Grand Circle Tour then concludes with a visit to the "Grand Canyon / Primeval World '' dioramas before returning passengers to Main Street, U.S.A. One of Disneyland 's signature attractions is its Disneyland Monorail System monorail service, which opened in Tomorrowland in 1959 as the first daily - operating monorail train system in the Western Hemisphere. The monorail guideway has remained almost exactly the same since 1961, aside from small alterations while Indiana Jones Adventure was being built. Five generations of monorail trains have been used in the park, since their lightweight construction means they wear out quickly. The most recent operating generation, the Mark VII, was installed in 2008. The monorail shuttles visitors between two stations, one inside the park in Tomorrowland and one in Downtown Disney. It follows a 2.5 - mile (4.0 km) long route designed to show the park from above. Currently, the Mark VII is running with the colors red, blue and orange. The monorail was originally a loop built with just one station in Tomorrowland. Its track was extended and a second station opened at the Disneyland Hotel in 1961. With the creation of Downtown Disney in 2001, the second station became the Downtown Disney station rather than the Disneyland Hotel station. The physical location of the monorail station did not change, but the original station building was demolished as part of the hotel downsizing, and the new station is now separated from the hotel by several Downtown Disney buildings, including ESPN Zone and the Rainforest Café.
All of the vehicles found on Main Street, U.S.A., grouped together as the Main Street Vehicles attraction, were designed to accurately reflect turn - of - the - century vehicles, including a 3 ft (914 mm) gauge tramway featuring horse - drawn streetcars, a double - decker bus, a fire engine, and an automobile. They are available for one - way rides along Main Street, U.S.A. The horse - drawn streetcars are also used by the park entertainment, including The Dapper Dans. The horseless carriages are modeled after cars built in 1903, and have two - cylinder, four - horsepower (3 kW) engines with manual transmission and steering. Walt Disney used to drive the fire engine around the park before it opened, and it has been used to host celebrity guests and in the parades. Most of the original Main Street vehicles were designed by Bob Gurr. From the late 1950s to 1968, Los Angeles Airways provided regularly scheduled helicopter passenger service between Disneyland and Los Angeles International Airport (LAX) and other cities in the area. The helicopters initially operated from the Anaheim / Disneyland Heliport, located behind Tomorrowland. Service later moved, in 1960, to a new heliport north of the Disneyland Hotel. Arriving guests were transported to the Disneyland Hotel via tram. The service ended after two fatal crashes in 1968: the first, in Paramount, California, on May 22, killed 23 people (the worst helicopter accident in aviation history at that time); the second, in Compton, California, on August 14, killed 21. In addition to the attractions, Disneyland provides live entertainment throughout the park. Most of this entertainment is not offered daily, but only on selected days of the week or selected periods of the year. Many Disney characters can be found throughout the park, greeting visitors, interacting with children, and posing for photos. Some characters have specific areas where they are scheduled to appear, but can be found wandering as well.
Some of the rarest are characters like Rabbit (from Winnie - the - Pooh), Max, Mushu, and Agent P. Periodically through recent decades (and most recently during the summers of 2005 and 2006), Mickey Mouse would climb the Matterhorn attraction several times a day with the support of Minnie, Goofy, and other performers. Other mountain climbers could also be seen on the Matterhorn from time to time. As of March 2007, Mickey and his "toon '' friends no longer climb the Matterhorn, but the climbing program continues. Every evening at dusk, there is a military - style flag retreat to lower the U.S. flag, performed by a ceremonial detail of Disneyland 's security staff. The ceremony is usually held between 4: 00 and 5: 00 pm, depending on the entertainment being offered on Main Street, U.S.A., to prevent conflicts with crowds and music. Disney lists the scheduled time of the Flag Retreat in its Times Guide, offered at the entrance turnstiles and other locations. The Disneyland Band, which has been part of the park since its opening, plays the role of the Town Band on Main Street, U.S.A. It also breaks out into smaller groups like the Main Street Strawhatters, the Hook and Ladder Co., and the Pearly Band in Fantasyland. However, on March 31, 2015, the Disneyland Resort notified the band members of an "end of run '', citing plans to launch a new, higher - energy band. The veteran band members were invited to audition for the new Disneyland band, and were told that even if they did not make the new band or audition, they would still play in small groups around the park. This sparked some controversy with supporters of the traditional band. Fantasmic!, which debuted in 1992, is a popular multimedia nighttime show on the Rivers of America. The star, Mickey Mouse, summons the characters and spirit of beloved Disney cartoons and uses the power of imagination to defeat evil villains trying to turn his dream into a nightmare.
The presentation is made at the Laffite 's Tavern end of Pirate 's Lair on Tom Sawyer Island, with the Rivers of America imagery integral to the stage and Frontierland and New Orleans Square serving as the spectator arena. The show combines synchronized lighting and special effects with floating barges, the Mark Twain Riverboat, the Sailing Ship Columbia, fountains, lasers, fireworks, thirty - foot - tall "mist screens '' upon which animated scenes are projected, and an automated 45 - foot fire - breathing dragon. Elaborate fireworks shows are synchronized with Disney songs and often feature appearances from Tinker Bell flying in the sky above Sleeping Beauty Castle (during "Magical '' 's run, Dumbo was featured along with Tinker Bell, and during Disneyland Forever 's run, Nemo was featured as well). Since 2000, presentations have become more elaborate, featuring new pyrotechnics, launch techniques and story lines. In 2004, Disneyland introduced a new air launch pyrotechnics system, reducing ground level smoke and noise and decreasing negative environmental impacts. At the time the technology debuted, Disney announced it would donate the patents to a non-profit organization for use throughout the industry. Since 2009, Disneyland has moved to a rotating repertoire of firework spectaculars. During the holiday season, there is a special fireworks presentation called Believe... In Holiday Magic, which has been running since 2000, except for a hiatus in 2005 during the park 's 50th anniversary celebration, and in 2015 during the park 's 60th. Scheduling of fireworks shows depends on the time of year. During the slower off - season periods, the fireworks are only offered on weekends. During the busier times, Disney offers additional nights. The park offers fireworks nightly during its busy periods, which include Easter / Spring Break, summer and Christmas time.
Disneyland spends about $41,000 per night on the fireworks show. The show is normally offered at 8: 45 or 9: 30 pm if the park is scheduled to close at 10 pm or later, but shows have started as early as 5: 45 pm. A major consideration is weather / winds, especially at higher elevations, which can, and often will, force the delay or cancellation of the show. With a few minor exceptions, such as July 4 and New Year 's Eve, shows must finish by 10 pm due to the conditions of the permit issued by the City of Anaheim. The Golden Horseshoe Saloon offers a live stage show with an Old West feel. The Golden Horseshoe Revue was an American frontier - themed vaudeville show starring Sluefoot Sue and Pecos Bill. It ran until the mid-1980s, when it was replaced by a similar show starring Lily Langtree (or Miss Lily) and Sam the Bartender. Billy Hill and the Hillbillies have played their guitars and banjos in a bluegrass - and - comedy show, but recently moved to Knott 's Berry Farm. Since their departure, The Laughing Stock Co. enacts small humorous skits with an Old West theme inside the Saloon. Disneyland has featured a number of different parades traveling down the park 's central Main Street -- Fantasyland corridor. There have been daytime and nighttime parades that celebrated Disney films or seasonal holidays with characters, music, and large floats. One of the most popular parades was the Main Street Electrical Parade, which recently ended a limited - time return engagement after an extended run at the Magic Kingdom at Walt Disney World in Lake Buena Vista, Florida. From May 5, 2005 through November 7, 2008, as part of Disneyland 's 50th anniversary, Walt Disney 's Parade of Dreams was presented, celebrating several of the classic Disney films including The Lion King, The Little Mermaid, Alice in Wonderland, and Pinocchio. In 2009, Walt Disney 's Parade of Dreams was replaced by Celebrate! A Street Party, which premiered on March 27, 2009. Disney did not call Celebrate! 
A Street Party a parade, but rather a "street event ''. During the Christmas season, Disneyland presents "A Christmas Fantasy '' Parade. Celebrate! A Street Party was in turn replaced by Mickey 's Soundsational Parade, which debuted on May 27, 2011. Disneyland debuted a new nighttime parade, "Paint the Night '', on May 22, 2015, as part of the park 's 60th anniversary. The Tomorrowland Terrace is a stage in Tomorrowland. It is a two - story stage where the lower stage rises from below floor level. It was popular in the 1960s with music performers of the day. Over the years, it was eventually replaced with Club Buzz, a Buzz Lightyear - themed stage and show featuring the space character from the Toy Story films. In 2006, it was restored to the Tomorrowland Terrace with the same style and design as the original. It is now home to the Jedi Training Academy interactive stage show, where children are chosen as Jedi padawans and taught how to use a lightsaber. Each child then has the opportunity to face Star Wars antagonists Darth Vader or Darth Maul. Also, local bands have returned to play in the evenings, just as Tomorrowland Terrace hosted in the 1960s. Various other street performers appear throughout the park, some seasonally; special holiday - themed groups are also added each year, such as the Main Street Carolers during the Christmas season. From Disneyland 's opening day until 1982, the price of the attractions was in addition to the price of park admission. Guests paid a small admission fee to get into the park, but admission to most of the rides and attractions required guests to purchase tickets, either individually or in a book that consisted of several coupons, initially labeled "A '' through "C ''. "A '' coupons allowed admission to the smaller rides and attractions such as the Main Street Vehicles, whereas "C '' coupons were used for the most common attractions like Peter Pan 's Flight or the Mad Tea Party.
As more thrilling rides were introduced, such as the Disneyland Monorail or the Matterhorn Bobsleds, "D '' and then eventually "E '' coupons were introduced. Coupons could be combined to equal the equivalent of another ticket (e.g., two "A '' tickets equal one "B '' ticket). From the thrill ride experience at Disneyland, the colloquial expression "an E ticket ride '' came to describe any exceptionally thrilling experience. Disneyland later featured a "Keys to the Kingdom '' booklet of tickets, which consisted of 10 unvalued coupons sold for a single flat rate. These coupons could be used for any attraction regardless of its regular value. In 1982, Disney dropped individual ride tickets in favor of a single admission price with unlimited access to all attractions, "except shooting galleries ''. While this idea was not original to Disney, its business advantages were obvious: in addition to guaranteeing that everyone paid a large sum even if they stayed for only a few hours and rode only a few rides, the park no longer had to print tickets or ticket books, staff ticket booths, or provide staff to collect tickets or monitor attractions for people sneaking on without tickets. Later, Disney introduced other entry options such as multi-day passes, Annual Passes (which allow unlimited entry to the park for an annual fee), and Southern California residents ' discounts. On February 28, 2016, Disneyland adopted a demand - based pricing system for single - day admission, charging different prices for "value '', "regular '', and "peak '' days, based on projected attendance. Approximately 30 % of days will be designated as "value '', mainly weekdays when school is in session, 44 % will be designated as "regular '', and 26 % will be designated as "peak '', mostly during holidays and weekends in July. (Before 1982, passport tickets were available to groups only.)
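The coupon arithmetic described above -- two coupons of one tier standing in for one of the next tier up -- can be sketched in a few lines. The tier letters "A '' through "E '' come from the article; the function itself is purely illustrative, not an actual Disney system:

```python
# Hypothetical sketch of the pre-1982 coupon system described above.
# Per the article, two coupons of one tier could be combined to equal
# one coupon of the next tier up ("A" smallest rides, "E" biggest thrills).
TIERS = ["A", "B", "C", "D", "E"]

def combine(coupons):
    """Reduce a list of coupon letters to the highest tiers reachable,
    trading two of each tier for one of the next tier up."""
    counts = {t: coupons.count(t) for t in TIERS}
    for lower, upper in zip(TIERS, TIERS[1:]):
        counts[upper] += counts[lower] // 2
        counts[lower] %= 2
    return counts

# Four "A" coupons are worth one "C" ride (A -> B -> C).
print(combine(["A", "A", "A", "A"]))
# {'A': 0, 'B': 0, 'C': 1, 'D': 0, 'E': 0}
```

Under this reading, the article's example (two "A '' tickets equal one "B '' ticket) falls out of the first iteration of the loop.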
Disneyland has had four unscheduled closures: Additionally, Disneyland has had numerous planned closures: Every year in October, Disneyland has a Halloween promotion. During this promotion, which Disneyland calls a "party '', areas in the park are decorated in a Halloween theme. Space Mountain and the Haunted Mansion are temporarily re-themed as part of the promotion. A Halloween party is offered on selected nights in late September and October for a separate fee, with a special fireworks show that is only shown at the party. From early November until the beginning of January, the park is decorated for the holidays. Seasonal entertainment includes the Believe... In Holiday Magic firework show and A Christmas Fantasy Parade, while the Haunted Mansion and It 's a Small World are temporarily redecorated in a holiday theme. Sleeping Beauty Castle also becomes snow - capped and is decorated with colorful lights during the holidays. Gallery captions: plaque at the entrance (Disneyland, June 1962); the original red Mark I ALWEG Monorail train, with one car added and then designated Mark II, one of two trains created especially for Disneyland, the other identical but blue in color (August 1963); the blue Mark II ALWEG Monorail train (August 1963); a Los Angeles Airways S - 61L helicopter lifting off from the Disneyland heliport (August 1963); aerial photo of Disneyland and the surrounding area, including the Disneyland Hotel with its Monorail Station, the Disneyland Heliport, orange groves, the Santa Ana Freeway and Melodyland Theater (May 1965); the Haunted Mansion 's antebellum architecture, styled as a Southern plantation home; Downtown Disney; Sleeping Beauty Castle during the Happiest Homecoming on Earth; the Disney California Adventure park entrance on July 4, 2010 (this entrance was removed and remodeled in 2012).
where are the limits on the 14th amendment
Fourteenth Amendment to the United States Constitution - wikipedia The Fourteenth Amendment (Amendment XIV) to the United States Constitution was adopted on July 9, 1868, as one of the Reconstruction Amendments. The amendment addresses citizenship rights and equal protection of the laws and was proposed in response to issues related to former slaves following the American Civil War. The amendment was bitterly contested, particularly by the states of the defeated Confederacy, which were forced to ratify it in order to regain representation in Congress. The Fourteenth Amendment, particularly its first section, is one of the most litigated parts of the Constitution, forming the basis for landmark decisions such as Brown v. Board of Education (1954) regarding racial segregation, Roe v. Wade (1973) regarding abortion, Bush v. Gore (2000) regarding the 2000 presidential election, and Obergefell v. Hodges (2015) regarding same - sex marriage. The amendment limits the actions of all state and local officials, including those acting on behalf of such an official. The amendment 's first section includes several clauses: the Citizenship Clause, Privileges or Immunities Clause, Due Process Clause, and Equal Protection Clause. The Citizenship Clause provides a broad definition of citizenship, nullifying the Supreme Court 's decision in Dred Scott v. Sandford (1857), which had held that Americans descended from African slaves could not be citizens of the United States. The Privileges or Immunities Clause has been interpreted in such a way that it does very little. The Due Process Clause prohibits state and local government officials from depriving persons of life, liberty, or property without legislative authorization. This clause has also been used by the federal judiciary to make most of the Bill of Rights applicable to the states, as well as to recognize substantive and procedural requirements that state laws must satisfy. 
The Equal Protection Clause requires each state to provide equal protection under the law to all people, including all non-citizens, within its jurisdiction. This clause has been the basis for many decisions rejecting irrational or unnecessary discrimination against people belonging to various groups. The second, third, and fourth sections of the amendment are seldom litigated. However, the second section 's reference to "rebellion and other crime '' has been invoked as a constitutional ground for felony disenfranchisement. The fourth section was held, in Perry v. United States (1935), to prohibit a current Congress from abrogating a contract of debt incurred by a prior Congress. The fifth section gives Congress the power to enforce the amendment 's provisions by "appropriate legislation ''; however, under City of Boerne v. Flores (1997), this power may not be used to contradict a Supreme Court decision interpreting the amendment. Section 1. All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside. No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws. Section 2. Representatives shall be apportioned among the several States according to their respective numbers, counting the whole number of persons in each State, excluding Indians not taxed. 
But when the right to vote at any election for the choice of electors for President and Vice President of the United States, Representatives in Congress, the Executive and Judicial officers of a State, or the members of the Legislature thereof, is denied to any of the male inhabitants of such State, being twenty - one years of age, and citizens of the United States, or in any way abridged, except for participation in rebellion, or other crime, the basis of representation therein shall be reduced in the proportion which the number of such male citizens shall bear to the whole number of male citizens twenty - one years of age in such State. Section 3. No person shall be a Senator or Representative in Congress, or elector of President and Vice President, or hold any office, civil or military, under the United States, or under any State, who, having previously taken an oath, as a member of Congress, or as an officer of the United States, or as a member of any State legislature, or as an executive or judicial officer of any State, to support the Constitution of the United States, shall have engaged in insurrection or rebellion against the same, or given aid or comfort to the enemies thereof. But Congress may, by a vote of two - thirds of each House, remove such disability. Section 4. The validity of the public debt of the United States, authorized by law, including debts incurred for payment of pensions and bounties for services in suppressing insurrection or rebellion, shall not be questioned. But neither the United States nor any State shall assume or pay any debt or obligation incurred in aid of insurrection or rebellion against the United States, or any claim for the loss or emancipation of any slave; but all such debts, obligations and claims shall be held illegal and void. Section 5. The Congress shall have power to enforce, by appropriate legislation, the provisions of this article. 
In the final years of the American Civil War and the Reconstruction Era that followed, Congress repeatedly debated the rights of black former slaves freed by the 1863 Emancipation Proclamation and the 1865 Thirteenth Amendment, the latter of which had formally abolished slavery. Following the passage of the Thirteenth Amendment by Congress, however, Republicans grew concerned over the increase it would create in the congressional representation of the Democratic - dominated Southern States. Because the full population of freed slaves would now be counted for determining congressional representation, rather than the three - fifths previously mandated by the Three - Fifths Compromise, the Southern States would dramatically increase their power in the population - based House of Representatives, regardless of whether the former slaves were allowed to vote. Republicans began looking for a way to offset this advantage, either by protecting and attracting votes of former slaves, or at least by discouraging their disenfranchisement. In 1865, Congress passed what would become the Civil Rights Act of 1866, guaranteeing citizenship without regard to race, color, or previous condition of slavery or involuntary servitude. The bill also guaranteed equal benefits and access to the law, a direct assault on the Black Codes passed by many post-war states. The Black Codes attempted to return ex-slaves to something like their former condition by, among other things, restricting their movement, forcing them to enter into year - long labor contracts, prohibiting them from owning firearms, and preventing them from suing or testifying in court. Although strongly urged by moderates in Congress to sign the bill, President Andrew Johnson vetoed it on March 27, 1866. 
In his veto message, he objected to the measure because it conferred citizenship on the freedmen at a time when 11 out of 36 states were unrepresented in the Congress, and because it discriminated in favor of African - Americans and against whites. Three weeks later, Johnson 's veto was overridden and the measure became law. Despite this victory, even some Republicans who had supported the goals of the Civil Rights Act began to doubt that Congress really possessed constitutional power to turn those goals into laws. The experience also encouraged both radical and moderate Republicans to seek Constitutional guarantees for black rights, rather than relying on temporary political majorities. Over 70 proposals for an amendment were drafted. In late 1865, the Joint Committee on Reconstruction proposed an amendment stating that any citizens barred from voting on the basis of race by a state would not be counted for purposes of representation of that state. This amendment passed the House, but was blocked in the Senate by a coalition of Radical Republicans led by Charles Sumner, who believed the proposal was a "compromise with wrong '', and Democrats opposed to black rights. Consideration then turned to a proposed amendment by Representative John A. Bingham of Ohio, which would enable Congress to safeguard "equal protection of life, liberty, and property '' of all citizens; this proposal failed to pass the House. In April 1866, the Joint Committee forwarded a third proposal to Congress, a carefully negotiated compromise that combined elements of the first and second proposals as well as addressing the issues of Confederate debt and voting by ex-Confederates. The House of Representatives passed House Resolution 127, 39th Congress, several weeks later and sent it to the Senate for action. The resolution was debated and several amendments to it were proposed. Amendments to Sections 2, 3, and 4 were adopted on June 8, 1866, and the modified resolution passed by a 33 to 11 vote. 
The House agreed to the Senate amendments on June 13 by a 138 -- 36 vote. A concurrent resolution requesting the President to transmit the proposal to the executives of the several states was passed by both houses of Congress on June 18. The Radical Republicans were satisfied that they had secured civil rights for blacks, but were disappointed that the amendment would not also secure political rights for blacks; in particular, the right to vote. For example, Thaddeus Stevens, a leader of the disappointed Radical Republicans, said: "I find that we shall be obliged to be content with patching up the worst portions of the ancient edifice, and leaving it, in many of its parts, to be swept through by the tempests, the frosts, and the storms of despotism. '' Abolitionist Wendell Phillips called it a "fatal and total surrender ''. This point would later be addressed by the Fifteenth Amendment. Ratification of the amendment was bitterly contested. State legislatures in every formerly Confederate state, with the exception of Tennessee, refused to ratify it. This refusal led to the passage of the Reconstruction Acts. The existing state governments were disregarded, and military government was imposed until new civil governments were established and the Fourteenth Amendment was ratified. It also prompted Congress to pass a law on March 2, 1867, requiring that a former Confederate state must ratify the Fourteenth Amendment before "said State shall be declared entitled to representation in Congress ''. The first twenty - eight states to ratify the Fourteenth Amendment were: If rescission by Ohio and New Jersey were invalid, South Carolina would have been the 28th State. Rescission by Oregon did not occur until later. These rescissions caused significant controversy. However, ratification by other states continued during the course of the debate: On July 20, 1868, Secretary of State William H. 
Seward certified that if withdrawals of ratification by New Jersey and Ohio were ineffective, then the amendment had become part of the Constitution on July 9, 1868, with ratification by South Carolina. The following day, Congress adopted and transmitted to the Department of State a concurrent resolution declaring the Fourteenth Amendment to be a part of the Constitution and directing the Secretary of State to promulgate it as such. Both New Jersey and Ohio were named in the congressional resolution as having ratified the amendment, although Alabama was also named, making 29 states total. On the same day, one more State ratified: On July 27, Secretary Seward received the formal ratification from Georgia. The following day, July 28, Secretary Seward issued his official proclamation certifying the ratification of the 14th Amendment. Secretary Seward stated that his proclamation was "in conformance '' to the resolution by Congress, but his official list of States included both Alabama and Georgia, as well as Ohio and New Jersey. The inclusion of Ohio and New Jersey has led some to question the validity of rescission of a ratification. The inclusion of Alabama and Georgia has called that conclusion into question. While there have been Supreme Court cases dealing with ratification issues, this particular question has never been adjudicated. The Fourteenth Amendment was subsequently ratified: Since Ohio and New Jersey re-ratified the Fourteenth Amendment in 2003, all U.S. states that existed during Reconstruction have ratified the amendment. Section 1 of the amendment formally defines United States citizenship and also protects various civil rights from being abridged or denied by any state or state actor. 
Abridgment or denial of those civil rights by private persons is not addressed by this amendment; the Supreme Court held in the Civil Rights Cases (1883) that the amendment was limited to "state action '' and, therefore, did not authorize the Congress to outlaw racial discrimination by private individuals or organizations (though Congress can sometimes reach such discrimination via other parts of the Constitution). U.S. Supreme Court Justice Joseph P. Bradley commented in the Civil Rights Cases that "individual invasion of individual rights is not the subject - matter of the (14th) Amendment. It has a deeper and broader scope. It nullifies and makes void all state legislation, and state action of every kind, which impairs the privileges and immunities of citizens of the United States, or which injures them in life, liberty or property without due process of law, or which denies to any of them the equal protection of the laws. '' The Radical Republicans who advanced the Thirteenth Amendment hoped to ensure broad civil and human rights for the newly freed people -- but its scope was disputed before it even went into effect. The framers of the Fourteenth Amendment wanted these principles enshrined in the Constitution to protect the new Civil Rights Act from being declared unconstitutional by the Supreme Court and also to prevent a future Congress from altering it by a mere majority vote. This section was also in response to violence against black people within the Southern States. The Joint Committee on Reconstruction found that only a Constitutional amendment could protect black people 's rights and welfare within those states. This first section of the amendment has been the most frequently litigated part of the amendment, and this amendment in turn has been the most frequently litigated part of the Constitution. 
The Citizenship Clause overruled the Supreme Court 's Dred Scott decision that black people were not citizens and could not become citizens, nor enjoy the benefits of citizenship. Some members of Congress voted for the Fourteenth Amendment in order to eliminate doubts about the constitutionality of the Civil Rights Act of 1866, or to ensure that no subsequent Congress could later repeal or alter the main provisions of that Act. The Civil Rights Act of 1866 had granted citizenship to all persons born in the United States if they were not subject to a foreign power, and this clause of the Fourteenth Amendment constitutionalized this rule. There are varying interpretations of the original intent of Congress and of the ratifying states, based on statements made during the congressional debate over the amendment, as well as the customs and understandings prevalent at that time. Some of the major issues that have arisen about this clause are the extent to which it included Native Americans, its coverage of non-citizens legally present in the United States when they have a child, whether the clause allows revocation of citizenship, and whether the clause applies to illegal immigrants. Historian Eric Foner, who has explored the question of U.S. birthright citizenship to other countries, argues that: Many things claimed as uniquely American -- a devotion to individual freedom, for example, or social opportunity -- exist in other countries. But birthright citizenship does make the United States (along with Canada) unique in the developed world. (...) Birthright citizenship is one expression of the commitment to equality and the expansion of national consciousness that marked Reconstruction. (...) Birthright citizenship is one legacy of the titanic struggle of the Reconstruction era to create a genuine democracy grounded in the principle of equality. During the original congressional debate over the amendment Senator Jacob M. 
Howard of Michigan -- the author of the Citizenship Clause -- described the clause as having the same content, despite different wording, as the earlier Civil Rights Act of 1866, namely, that it excludes Native Americans who maintain their tribal ties and "persons born in the United States who are foreigners, aliens, who belong to the families of ambassadors or foreign ministers. '' According to historian Glenn W. LaFantasie of Western Kentucky University, "A good number of his fellow senators supported his view of the citizenship clause. '' Others also agreed that the children of ambassadors and foreign ministers were to be excluded. Senator James Rood Doolittle of Wisconsin asserted that all Native Americans were subject to United States jurisdiction, so that the phrase "Indians not taxed '' would be preferable, but Senate Judiciary Committee Chairman Lyman Trumbull and Howard disputed this, arguing that the federal government did not have full jurisdiction over Native American tribes, which govern themselves and make treaties with the United States. In Elk v. Wilkins (1884), the clause 's meaning was tested regarding whether birth in the United States automatically extended national citizenship. The Supreme Court held that Native Americans who voluntarily quit their tribes did not automatically gain national citizenship. The issue was resolved with the passage of the Indian Citizenship Act of 1924, which granted full U.S. citizenship to indigenous peoples. The Fourteenth Amendment provides that children born in the United States become American citizens regardless of the citizenship of their parents. At the time of the amendment 's passage, three Senators, including Trumbull, the author of the Civil Rights Act, as well as President Andrew Johnson, asserted that both the Civil Rights Act and the Fourteenth Amendment would confer citizenship on such children at birth; however, Senator Edgar Cowan of Pennsylvania had a definitively contrary opinion. 
These congressional remarks applied to non-citizens lawfully present in the United States, as the problem of unauthorized immigration did not exist in 1866, and some scholars dispute whether the Citizenship Clause applies to unauthorized immigrants, although the law of the land continues to be based on the standard interpretation. Congress during the 21st century has occasionally discussed revising the clause to reduce the practice of "birth tourism '', in which a pregnant foreign national gives birth in the United States for purposes of the child 's citizenship. The clause 's meaning with regard to a child of legal immigrants was tested in United States v. Wong Kim Ark (1898). The Supreme Court held that under the Fourteenth Amendment, a man born within the United States to Chinese citizens who have a permanent domicile and residence in the United States and are carrying on business in the United States -- and whose parents were not employed in a diplomatic or other official capacity by a foreign power -- was a citizen of the United States. Subsequent decisions have applied the principle to the children of foreign nationals of non-Chinese descent. Loss of national citizenship is possible only under the following circumstances: For much of the country 's history, voluntary acquisition or exercise of a foreign citizenship was considered sufficient cause for revocation of national citizenship. This concept was enshrined in a series of treaties between the United States and other countries (the Bancroft Treaties). However, the Supreme Court repudiated this concept in Afroyim v. Rusk (1967), as well as Vance v. Terrazas (1980), holding that the Citizenship Clause of the Fourteenth Amendment barred the Congress from revoking citizenship. However, Congress can revoke citizenship that it had previously granted to a person not born in the United States. 
The Privileges or Immunities Clause, which protects the privileges and immunities of national citizenship from interference by the states, was patterned after the Privileges and Immunities Clause of Article IV, which protects the privileges and immunities of state citizenship from interference by other states. In the Slaughter - House Cases (1873), the Supreme Court concluded that the Constitution recognized two separate types of citizenship -- "national citizenship '' and "state citizenship '' -- and the Court held that the Privileges or Immunities Clause prohibits states from interfering only with privileges and immunities possessed by virtue of national citizenship. The Court concluded that the privileges and immunities of national citizenship included only those rights that "owe their existence to the Federal government, its National character, its Constitution, or its laws. '' The Court recognized few such rights, including access to seaports and navigable waterways, the right to run for federal office, the protection of the federal government while on the high seas or in the jurisdiction of a foreign country, the right to travel to the seat of government, the right to peaceably assemble and petition the government, the privilege of the writ of habeas corpus, and the right to participate in the government 's administration. This decision has not been overruled and has been specifically reaffirmed several times. Largely as a result of the narrowness of the Slaughter - House opinion, this clause subsequently lay dormant for well over a century. In Saenz v. 
Roe (1999), the Court ruled that a component of the "right to travel '' is protected by the Privileges or Immunities Clause: Despite fundamentally differing views concerning the coverage of the Privileges or Immunities Clause of the Fourteenth Amendment, most notably expressed in the majority and dissenting opinions in the Slaughter - House Cases (1873), it has always been common ground that this Clause protects the third component of the right to travel. Writing for the majority in the Slaughter - House Cases, Justice Miller explained that one of the privileges conferred by this Clause "is that a citizen of the United States can, of his own volition, become a citizen of any State of the Union by a bona fide residence therein, with the same rights as other citizens of that State. '' (emphasis added) Justice Miller actually wrote in the Slaughter - House Cases that the right to become a citizen of a state (by residing in that state) "is conferred by the very article under consideration '' (emphasis added), rather than by the "clause '' under consideration. In McDonald v. Chicago (2010), Justice Clarence Thomas, while concurring with the majority in incorporating the Second Amendment against the states, declared that he reached this conclusion through the Privileges or Immunities Clause instead of the Due Process Clause. Randy Barnett has referred to Justice Thomas 's concurring opinion as a "complete restoration '' of the Privileges or Immunities Clause. In the 1884 case of Hurtado v. California, the U.S. 
Supreme Court said: Due process of law in the (Fourteenth Amendment) refers to that law of the land in each state which derives its authority from the inherent and reserved powers of the state, exerted within the limits of those fundamental principles of liberty and justice which lie at the base of all our civil and political institutions, and the greatest security for which resides in the right of the people to make their own laws, and alter them at their pleasure. The Due Process Clause of the Fourteenth Amendment applies only against the states, but it is otherwise textually identical to the Due Process Clause of the Fifth Amendment, which applies against the federal government; both clauses have been interpreted to encompass identical doctrines of procedural due process and substantive due process. Procedural due process is the guarantee of a fair legal process when the government tries to interfere with a person 's protected interests in life, liberty, or property, and substantive due process is the guarantee that the fundamental rights of citizens will not be encroached on by government. The Due Process Clause of the Fourteenth Amendment also incorporates most of the provisions in the Bill of Rights, which were originally applied against only the federal government, and applies them against the states. Beginning with Allgeyer v. Louisiana (1897), the Court interpreted the Due Process Clause as providing substantive protection to private contracts, thus prohibiting a variety of social and economic regulation; this principle was referred to as "freedom of contract ''. Thus, the Court struck down a law decreeing maximum hours for workers in a bakery in Lochner v. New York (1905) and struck down a minimum wage law in Adkins v. Children 's Hospital (1923). In Meyer v. Nebraska (1923), the Court stated that the "liberty '' protected by the Due Process Clause (w)ithout doubt... 
denotes not merely freedom from bodily restraint but also the right of the individual to contract, to engage in any of the common occupations of life, to acquire useful knowledge, to marry, establish a home and bring up children, to worship God according to the dictates of his own conscience, and generally to enjoy those privileges long recognized at common law as essential to the orderly pursuit of happiness by free men. However, the Court did uphold some economic regulation, such as state Prohibition laws (Mugler v. Kansas, 1887), laws declaring maximum hours for mine workers (Holden v. Hardy, 1898), laws declaring maximum hours for female workers (Muller v. Oregon, 1908), and President Woodrow Wilson 's intervention in a railroad strike (Wilson v. New, 1917), as well as federal laws regulating narcotics (United States v. Doremus, 1919). The Court repudiated, but did not explicitly overrule, the "freedom of contract '' line of cases in West Coast Hotel v. Parrish (1937). In Poe v. Ullman (1961), dissenting Justice John Marshall Harlan II adopted a broad view of the "liberty '' protected by the Fourteenth Amendment Due Process clause: (T)he full scope of the liberty guaranteed by the Due Process Clause can not be found in or limited by the precise terms of the specific guarantees elsewhere provided in the Constitution. This ' liberty ' is not a series of isolated points pricked out in terms of the taking of property; the freedom of speech, press, and religion; the right to keep and bear arms; the freedom from unreasonable searches and seizures; and so on. It is a rational continuum which, broadly speaking, includes a freedom from all substantial arbitrary impositions and purposeless restraints,... and which also recognizes, what a reasonable and sensitive judgment must, that certain interests require particularly careful scrutiny of the state needs asserted to justify their abridgment. This broad view of liberty was adopted by the Supreme Court in Griswold v. 
Connecticut (for further information see below). Although the "freedom of contract '' described above has fallen into disfavor, by the 1960s, the Court had extended its interpretation of substantive due process to include other rights and freedoms that are not enumerated in the Constitution but that, according to the Court, extend or derive from existing rights. For example, the Due Process Clause is also the foundation of a constitutional right to privacy. The Court first ruled that privacy was protected by the Constitution in Griswold v. Connecticut (1965), which overturned a Connecticut law criminalizing birth control. While Justice William O. Douglas wrote for the majority that the right to privacy was found in the "penumbras '' of various provisions in the Bill of Rights, Justices Arthur Goldberg and John Marshall Harlan II wrote in concurring opinions that the "liberty '' protected by the Due Process Clause included individual privacy. The right to privacy was the basis for Roe v. Wade (1973), in which the Court invalidated a Texas law forbidding abortion except to save the mother 's life. Like Goldberg 's and Harlan 's concurring opinions in Griswold, the majority opinion authored by Justice Harry Blackmun located the right to privacy in the Due Process Clause 's protection of liberty. The decision disallowed many state and federal abortion restrictions, and it became one of the most controversial in the Court 's history. In Planned Parenthood v. Casey (1992), the Court decided that "the essential holding of Roe v. Wade should be retained and once again reaffirmed. '' In Lawrence v. Texas (2003), the Court found that a Texas law against same - sex sexual intercourse violated the right to privacy. In Obergefell v. Hodges (2015), the Court ruled that the fundamental right to marriage included same - sex couples being able to marry. 
When the government seeks to burden a person 's protected liberty interest or property interest, the Supreme Court has held that procedural due process requires that, at a minimum, the government provide the person notice, an opportunity to be heard at an oral hearing, and a decision by a neutral decision maker. For example, such process is due when a government agency seeks to terminate civil service employees, expel a student from public school, or cut off a welfare recipient 's benefits. The Court has also ruled that the Due Process Clause requires judges to recuse themselves in cases where the judge has a conflict of interest. For example, in Caperton v. A.T. Massey Coal Co. (2009), the Court ruled that a justice of the Supreme Court of Appeals of West Virginia had to recuse himself from a case involving a major contributor to his campaign for election to that court. While many state constitutions are modeled after the United States Constitution and federal laws, those state constitutions did not necessarily include provisions comparable to the Bill of Rights. In Barron v. Baltimore (1833), the Supreme Court unanimously ruled that the Bill of Rights restrained only the federal government, not the states. However, the Supreme Court has subsequently held that most provisions of the Bill of Rights apply to the states through the Due Process Clause of the Fourteenth Amendment under a doctrine called "incorporation. '' Whether incorporation was intended by the amendment 's framers, such as John Bingham, has been debated by legal historians. According to legal scholar Akhil Reed Amar, the framers and early supporters of the Fourteenth Amendment believed that it would ensure that the states would be required to recognize the same individual rights as the federal government; all of these rights were likely understood as falling within the "privileges or immunities '' safeguarded by the amendment. 
By the latter half of the 20th century, nearly all of the rights in the Bill of Rights had been applied to the states. The Supreme Court has held that the amendment 's Due Process Clause incorporates all of the substantive protections of the First, Second, Fourth, Fifth (except for its Grand Jury Clause) and Sixth Amendments and the Cruel and Unusual Punishment Clause of the Eighth Amendment. While the Third Amendment has not been applied to the states by the Supreme Court, the Second Circuit ruled that it did apply to the states within that circuit 's jurisdiction in Engblom v. Carey. The Seventh Amendment right to jury trial in civil cases has been held not to be applicable to the states, but the amendment 's Re-Examination Clause applies not only to federal courts, but also to "a case tried before a jury in a state court and brought to the Supreme Court on appeal. '' The Equal Protection Clause was created largely in response to the lack of equal protection provided by law in states with Black Codes. Under Black Codes, blacks could not sue, give evidence, or be witnesses. They also were punished more harshly than whites. The Supreme Court affirmed this purpose of the Clause in Strauder v. West Virginia (1880). The Clause mandates that individuals in similar situations be treated equally by the law. Although the text of the Fourteenth Amendment applies the Equal Protection Clause only against the states, the Supreme Court, since Bolling v. Sharpe (1954), has applied the Clause against the federal government through the Due Process Clause of the Fifth Amendment under a doctrine called "reverse incorporation. '' In Yick Wo v. 
Hopkins (1886), the Supreme Court clarified that the meaning of "person '' and "within its jurisdiction '' in the Equal Protection Clause would not be limited to discrimination against African Americans, but would extend to other races, colors, and nationalities such as (in this case) legal aliens in the United States who are Chinese citizens: These provisions are universal in their application to all persons within the territorial jurisdiction, without regard to any differences of race, of color, or of nationality, and the equal protection of the laws is a pledge of the protection of equal laws. Persons "within its jurisdiction '' are entitled to equal protection from a state. Largely because the Privileges and Immunities Clause of Article IV has from the beginning guaranteed the privileges and immunities of citizens in the several states, the Supreme Court has rarely construed the phrase "within its jurisdiction '' in relation to natural persons. In Plyler v. Doe (1982), where the Court held that aliens illegally present in a state are within its jurisdiction and may thus raise equal protection claims, the Court explicated the meaning of the phrase "within its jurisdiction '' as follows: "(U)se of the phrase "within its jurisdiction '' confirms the understanding that the Fourteenth Amendment 's protection extends to anyone, citizen or stranger, who is subject to the laws of a State, and reaches into every corner of a State 's territory. '' The Court reached this understanding among other things from Senator Howard, a member of the Joint Committee of Fifteen, and the floor manager of the amendment in the Senate. 
Senator Howard was explicit about the broad objectives of the Fourteenth Amendment and the intention to make its provisions applicable to all who "may happen to be '' within the jurisdiction of a state: The last two clauses of the first section of the amendment disable a State from depriving not merely a citizen of the United States, but any person, whoever he may be, of life, liberty, or property without due process of law, or from denying to him the equal protection of the laws of the State. This abolishes all class legislation in the States and does away with the injustice of subjecting one caste of persons to a code not applicable to another... It will, if adopted by the States, forever disable every one of them from passing laws trenching upon those fundamental rights and privileges which pertain to citizens of the United States, and to all persons who may happen to be within their jurisdiction. (emphasis added by the U.S. Supreme Court) The relationship between the Fifth and Fourteenth Amendments was addressed by Justice Field in Wong Wing v. United States (1896). He observed with respect to the phrase "within its jurisdiction '': "The term ' person, ' used in the Fifth Amendment, is broad enough to include any and every human being within the jurisdiction of the republic. A resident, alien born, is entitled to the same protection under the laws that a citizen is entitled to. He owes obedience to the laws of the country in which he is domiciled, and, as a consequence, he is entitled to the equal protection of those laws... The contention that persons within the territorial jurisdiction of this republic might be beyond the protection of the law was heard with pain on the argument at the bar -- in face of the great constitutional amendment which declares that no State shall deny to any person within its jurisdiction the equal protection of the laws. 
'' The Supreme Court also decided whether foreign corporations are also within the jurisdiction of a state, ruling that a foreign corporation which sued in a state court in which it was not licensed to do business to recover possession of property wrongfully taken from it in another state was within the jurisdiction and could not be subjected to unequal burdens in the maintenance of the suit. When a state has admitted a foreign corporation to do business within its borders, that corporation is entitled to equal protection of the laws but not necessarily to identical treatment with domestic corporations. In Santa Clara County v. Southern Pacific Railroad (1886), the court reporter included a statement by Chief Justice Morrison Waite in the decision 's headnote: The court does not wish to hear argument on the question whether the provision in the Fourteenth Amendment to the Constitution, which forbids a State to deny to any person within its jurisdiction the equal protection of the laws, applies to these corporations. We are all of the opinion that it does. This dictum, which established that corporations enjoyed personhood under the Equal Protection Clause, was repeatedly reaffirmed by later courts. It remained the predominant view throughout the twentieth century, though it was challenged in dissents by justices such as Hugo Black and William O. Douglas. Between 1890 and 1910, Fourteenth Amendment cases involving corporations vastly outnumbered those involving the rights of blacks, 288 to 19. In the decades following the adoption of the Fourteenth Amendment, the Supreme Court overturned laws barring blacks from juries (Strauder v. West Virginia, 1880) or discriminating against Chinese Americans in the regulation of laundry businesses (Yick Wo v. Hopkins, 1886), as violations of the Equal Protection Clause. However, in Plessy v. 
Ferguson (1896), the Supreme Court held that the states could impose segregation so long as they provided similar facilities -- the formation of the "separate but equal '' doctrine. The Court went even further in restricting the Equal Protection Clause in Berea College v. Kentucky (1908), holding that the states could force private actors to discriminate by prohibiting colleges from having both black and white students. By the early 20th century, the Equal Protection Clause had been eclipsed to the point that Justice Oliver Wendell Holmes, Jr. dismissed it as "the usual last resort of constitutional arguments. '' The Court held to the "separate but equal '' doctrine for more than fifty years, despite numerous cases in which the Court itself had found that the segregated facilities provided by the states were almost never equal, until Brown v. Board of Education (1954) reached the Court. In Brown the Court ruled that even if segregated black and white schools were of equal quality in facilities and teachers, segregation was inherently harmful to black students and so was unconstitutional. Brown met with a campaign of resistance from white Southerners, and for decades the federal courts attempted to enforce Brown 's mandate against repeated attempts at circumvention. This resulted in the controversial desegregation busing decrees handed down by federal courts in various parts of the nation. In Parents Involved in Community Schools v. Seattle School District No. 1 (2007), the Court ruled that race could not be the determinative factor in determining to which public schools parents may transfer their children. In Plyler v. Doe (1982) the Supreme Court struck down a Texas statute denying free public education to illegal immigrants as a violation of the Equal Protection Clause of the Fourteenth Amendment because discrimination on the basis of illegal immigration status did not further a substantial state interest. 
The Court reasoned that illegal aliens and their children, though not citizens of the United States or Texas, are people "in any ordinary sense of the term '' and, therefore, are afforded Fourteenth Amendment protections. In Hernandez v. Texas (1954), the Court held that the Fourteenth Amendment protects those beyond the racial classes of white or "Negro '' and extends to other racial and ethnic groups, such as Mexican Americans in this case. In the half - century following Brown, the Court extended the reach of the Equal Protection Clause to other historically disadvantaged groups, such as women and illegitimate children, although it has applied a somewhat less stringent standard than it has applied to governmental discrimination on the basis of race (United States v. Virginia (1996); Levy v. Louisiana (1968)). The Supreme Court ruled in Regents of the University of California v. Bakke (1978) that affirmative action in the form of racial quotas in public university admissions was a violation of Title VI of the Civil Rights Act of 1964; however, race could be used as one of several factors without violating the Equal Protection Clause or Title VI. In Gratz v. Bollinger (2003) and Grutter v. Bollinger (2003), the Court considered two race - conscious admissions systems at the University of Michigan. The university claimed that its goal in its admissions systems was to achieve racial diversity. In Gratz, the Court struck down a points - based undergraduate admissions system that added points for minority status, finding that its rigidity violated the Equal Protection Clause; in Grutter, the Court upheld a race - conscious admissions process for the university 's law school that used race as one of many factors to determine admission. In Fisher v. University of Texas (2013), the Court ruled that before race can be used in a public university 's admission policy, there must be no workable race - neutral alternative. In Schuette v. 
Coalition to Defend Affirmative Action (2014), the Court upheld the constitutionality of a state constitutional prohibition on the state or local use of affirmative action. Reed v. Reed (1971), which struck down an Idaho probate law favoring men, was the first decision in which the Court ruled that arbitrary gender discrimination violated the Equal Protection Clause. In Craig v. Boren (1976), the Court ruled that statutory or administrative sex classifications had to be subjected to an intermediate standard of judicial review. Reed and Craig later served as precedents to strike down a number of state laws discriminating by gender. Since Wesberry v. Sanders (1964) and Reynolds v. Sims (1964), the Supreme Court has interpreted the Equal Protection Clause as requiring the states to apportion their congressional districts and state legislative seats according to "one man, one vote ''. The Court has also struck down redistricting plans in which race was a key consideration. In Shaw v. Reno (1993), the Court prohibited a North Carolina plan aimed at creating majority - black districts to balance historic underrepresentation in the state 's congressional delegations. The Equal Protection Clause served as the basis for the decision in Bush v. Gore (2000), in which the Court ruled that no constitutionally valid recount of Florida 's votes in the 2000 presidential election could be held within the needed deadline; the decision effectively secured Bush 's victory in the disputed election. In League of United Latin American Citizens v. Perry (2006), the Court ruled that House Majority Leader Tom DeLay 's Texas redistricting plan intentionally diluted the votes of Latinos and thus violated the Equal Protection Clause. Individual liberties guaranteed by the United States Constitution, other than the Thirteenth Amendment 's ban on slavery, protect not against actions by private persons or entities, but only against actions by government officials. 
Regarding the Fourteenth Amendment, the Supreme Court ruled in Shelley v. Kraemer (1948): "(T)he action inhibited by the first section of the Fourteenth Amendment is only such action as may fairly be said to be that of the States. That Amendment erects no shield against merely private conduct, however discriminatory or wrongful. '' The court added in Civil Rights Cases (1883): "It is State action of a particular character that is prohibited. Individual invasion of individual rights is not the subject matter of the amendment. It has a deeper and broader scope. It nullifies and makes void all State legislation, and State action of every kind, which impairs the privileges and immunities of citizens of the United States, or which injures them in life, liberty, or property without due process of law, or which denies to any of them the equal protection of the laws. '' Vindication of federal constitutional rights is limited to those situations where there is "state action '', meaning action by government officials who are exercising their governmental power. In Ex parte Virginia (1880), the Supreme Court found that the prohibitions of the Fourteenth Amendment "have reference to actions of the political body denominated by a State, by whatever instruments or in whatever modes that action may be taken. A State acts by its legislative, its executive, or its judicial authorities. It can act in no other way. The constitutional provision, therefore, must mean that no agency of the State, or of the officers or agents by whom its powers are exerted, shall deny to any person within its jurisdiction the equal protection of the laws. Whoever, by virtue of public position under a State government, deprives another of property, life, or liberty, without due process of law, or denies or takes away the equal protection of the laws, violates the constitutional inhibition; and as he acts in the name and for the State, and is clothed with the State 's power, his act is that of the State. 
'' There are however instances where people are the victims of civil - rights violations that occur in circumstances involving both government officials and private actors. In the 1960s, the United States Supreme Court adopted an expansive view of state action opening the door to wide - ranging civil - rights litigation against private actors when they act as state actors (i.e., acts done or otherwise "sanctioned in some way '' by the state). The Court found that the state action doctrine is equally applicable to denials of privileges or immunities, due process, and equal protection of the laws. The critical factor in determining the existence of state action is not governmental involvement with private persons or private corporations, but "the inquiry must be whether there is a sufficiently close nexus between the State and the challenged action of the regulated entity so that the action of the latter may be fairly treated as that of the State itself. '' "Only by sifting facts and weighing circumstances can the nonobvious involvement of the State in private conduct be attributed its true significance. '' The Supreme Court asserted that plaintiffs must establish not only that a private party "acted under color of the challenged statute, but also that its actions are properly attributable to the State. (...) '' "And the actions are to be attributable to the State apparently only if the State compelled the actions and not if the State merely established the process through statute or regulation under which the private party acted. 
'' The rules developed by the Supreme Court for business regulation are that (1) the "mere fact that a business is subject to state regulation does not by itself convert its action into that of the State for purposes of the Fourteenth Amendment, '' and (2) "a State normally can be held responsible for a private decision only when it has exercised coercive power or has provided such significant encouragement, either overt or covert, that the choice must be deemed to be that of the State. '' Under Article I, Section 2, Clause 3, the basis of representation of each state in the House of Representatives was determined by adding three - fifths of each state 's slave population to its free population. Because slavery (except as punishment for crime) had been abolished by the Thirteenth Amendment, the freed slaves would henceforth be given full weight for purposes of apportionment. This situation was a concern to the Republican leadership of Congress, who worried that it would increase the political power of the former slave states, even as they continued to deny freed slaves the right to vote. Two solutions were considered: On January 31, 1866, the House of Representatives voted in favor of a proposed constitutional amendment that would reduce a state 's representation in the House in proportion to which that state used "race or color '' as a basis to deny the right to vote in that state. The amendment failed in the Senate, partly because radical Republicans foresaw that states would be able to use ostensibly race - neutral criteria, such as educational and property qualifications, to disenfranchise the freed slaves without negative consequence. So the amendment was changed to penalize states that denied the vote to male citizens over twenty - one for any reason other than participation in crime. Later, the Fifteenth Amendment was adopted to guarantee the right to vote could not be denied based on race or color. 
The effect of Section 2 was twofold: The first reapportionment after the enactment of the Fourteenth Amendment occurred in 1873, based on the 1870 census. Congress appears to have attempted to enforce the provisions of Section 2, but was unable to identify enough disenfranchised voters to make a difference to any state 's representation. In the implementing statute, Congress added a provision stating that should any state, after the passage of this Act, deny or abridge the right of any of the male inhabitants of such State, being twenty - one years of age, and citizens of the United States, to vote at any election named in the amendments to the Constitution, article fourteen, section two, except for participation in rebellion or other crime, the number of Representatives apportioned in this act to such State shall be reduced in the proportion which the number of such male citizens shall have to the whole number of male citizens twenty - one years of age in such State. A nearly identical provision remains in federal law to this day. Despite this legislation, in subsequent reapportionments, no change has ever been made to any state 's Congressional representation on the basis of the Amendment. Bonfield, writing in 1960, suggested that "(t)he hot political nature of such proposals has doomed them to failure ''. Aided by this lack of enforcement, Southern States continued to use pretexts to prevent many blacks from voting until the passage of the Voting Rights Act of 1965. In the Fourth Circuit case of Saunders v. Wilkins (1945), Saunders claimed that Virginia should have its Congressional representation reduced because of its use of a poll tax and other voting restrictions. The plaintiff sued for the right to run for Congress at large in the state, rather than in one of its designated Congressional districts. The lawsuit was dismissed as a political question. 
Some have argued that Section 2 was implicitly repealed by the Fifteenth Amendment, but the Supreme Court acknowledged the provisions of Section 2 in some later decisions. In Minor v. Happersett (1875), the Supreme Court cited Section 2 as supporting its conclusion that the right to vote was not among the "privileges and immunities of citizenship '' protected by Section 1. In Richardson v. Ramirez (1974), the Court cited Section 2 as justifying the states disenfranchising felons. In Hunter v. Underwood (1985), a case involving disenfranchising black misdemeanants, the Supreme Court concluded that the Tenth Amendment can not save legislation prohibited by the subsequently enacted Fourteenth Amendment. More specifically the Court concluded that laws passed with a discriminatory purpose are not excepted from the operation of the Equal Protection Clause by the "other crime '' provision of Section 2. The Court held that Section 2 "was not designed to permit the purposeful racial discrimination (...) which otherwise violates (Section) 1 of the Fourteenth Amendment. '' Abolitionist leaders criticized the amendment 's failure to specifically prohibit the states from denying people the right to vote on the basis of race. Section 2 protects the right to vote only of adult males, not adult females, making it the only provision of the Constitution to explicitly discriminate on the basis of sex. Section 2 was condemned by women 's suffragists, such as Elizabeth Cady Stanton and Susan B. Anthony, who had long seen their cause as linked to that of black rights. The separation of black civil rights from women 's civil rights split the two movements for decades. Section 3 prohibits the election or appointment to any federal or state office of any person who had held any of certain offices and then engaged in insurrection, rebellion or treason. However, a two - thirds vote by each House of the Congress can override this limitation. 
In 1898, the Congress enacted a general removal of Section 3 's limitation. In 1975, the citizenship of Confederate general Robert E. Lee was restored by a joint congressional resolution, retroactive to June 13, 1865. In 1978, pursuant to Section 3, the Congress posthumously removed the service ban from Confederate president Jefferson Davis. Section 3 was used to prevent Socialist Party of America member Victor L. Berger, convicted of violating the Espionage Act for his anti-militarist views, from taking his seat in the House of Representatives in 1919 and 1920. Section 4 confirmed the legitimacy of all public debt appropriated by the Congress. It also confirmed that neither the United States nor any state would pay for the loss of slaves or debts that had been incurred by the Confederacy. For example, during the Civil War several British and French banks had lent large sums of money to the Confederacy to support its war against the Union. In Perry v. United States (1935), the Supreme Court ruled that under Section 4 voiding a United States bond "went beyond the congressional power. '' The debt - ceiling crises of 2011 and 2013 raised the question of what is the President 's authority under Section 4. Some, such as legal scholar Garrett Epps, fiscal expert Bruce Bartlett and Treasury Secretary Timothy Geithner, have argued that a debt ceiling may be unconstitutional and therefore void as long as it interferes with the duty of the government to pay interest on outstanding bonds and to make payments owed to pensioners (that is, Social Security and Railroad Retirement Act recipients). Legal analyst Jeffrey Rosen has argued that Section 4 gives the President unilateral authority to raise or ignore the national debt ceiling, and that if challenged the Supreme Court would likely rule in favor of expanded executive power or dismiss the case altogether for lack of standing. 
Erwin Chemerinsky, professor and dean at University of California, Irvine School of Law, has argued that not even in a "dire financial emergency '' could the President raise the debt ceiling as "there is no reasonable way to interpret the Constitution that (allows him to do so) ''. Jack Balkin, Knight Professor of Constitutional Law at Yale University, opined that like Congress the President is bound by the Fourteenth Amendment, for otherwise, he could violate any part of the amendment at will. Because the President must obey the Section 4 requirement not to put the validity of the public debt into question, Balkin argued that President Obama is obliged "to prioritize incoming revenues to pay the public debt: interest on government bonds and any other ' vested ' obligations. What falls into the latter category is not entirely clear, but a large number of other government obligations -- and certainly payments for future services -- would not count and would have to be sacrificed. This might include, for example, Social Security payments. '' Section 5, also known as the Enforcement Clause of the Fourteenth Amendment, enables Congress to pass laws enforcing the amendment 's other provisions. In the Civil Rights Cases (1883), the Supreme Court interpreted Section 5 narrowly, stating that "the legislation which Congress is authorized to adopt in this behalf is not general legislation upon the rights of the citizen, but corrective legislation ''. In other words, the amendment authorizes Congress to pass laws only to combat violations of the rights protected in other sections. In Katzenbach v. Morgan (1966), the Court upheld Section 4 (e) of the Voting Rights Act of 1965, which prohibits certain forms of literacy requirements as a condition to vote, as a valid exercise of Congressional power under Section 5 to enforce the Equal Protection Clause. 
The Court ruled that Section 5 enabled Congress to act both remedially and prophylactically to protect the rights guaranteed by the amendment. However, in City of Boerne v. Flores (1997), the Court narrowed Congress 's enforcement power, holding that Congress may not enact legislation under Section 5 that substantively defines or interprets Fourteenth Amendment rights. The Court ruled that legislation is valid under Section 5 only if there is a "congruence and proportionality '' between the injury to a person 's Fourteenth Amendment right and the means Congress adopted to prevent or remedy that injury. Reasoning from Section 2 's reference to male inhabitants, the Court in Minor v. Happersett (1875) concluded that suffrage was not an absolute right of citizenship: Why this if it was not in the power of the legislature to deny the right of suffrage to some male inhabitants? And if suffrage was necessarily one of the absolute rights of citizenship, why confine the operation of the limitation to male inhabitants? Women and children are, as we have seen, "persons. '' They are counted in the enumeration upon which the apportionment is to be made, but if they were necessarily voters because of their citizenship unless clearly excluded, why inflict the penalty for the exclusion of males alone? Clearly, no such form of words would have been selected to express the idea here indicated if suffrage was the absolute right of all citizens.
how many of the crow movies are there
Category: the crow films - wikipedia The following 4 pages are in this category, out of 4 total.
what were the original m and m colors
M&M 's - wikipedia M&M 's are "colorful button - shaped chocolates '', each of which has the letter "m '' printed in lower case on one side and a candy shell surrounding a filling which varies depending upon the variety of M&M 's. The original candy had a milk chocolate filling which, upon introducing other variations, was branded as the "plain '' variety. "Peanut '' M&M 's, which feature a peanut coated in milk chocolate, and finally a candy shell, were the first variation to be introduced, and they remain a regular variety. Numerous other variations have been introduced, some of which are regular widespread varieties (such as "peanut butter '', "almond '', "pretzel '', "crispy '', "dark chocolate '', and "caramel '') while others are limited in duration or geographic availability. M&M 's originated in the United States in 1941, and have been sold in over 100 countries since 2003. More than 400 million individual M&M 's are produced every day in the United States. They are produced in different colors, some of which have changed over the years. The candy - coated chocolate concept was inspired by a method used to allow soldiers to carry chocolate without having it melt. The company 's longest - lasting slogan reflects this: "Melts in your mouth, not in your hand. '' A traditional milk chocolate M&M weighs about 0.91 grams / 0.032 ounces and has about 4.7 kilocalories (kcal) of food energy (1.7 kcal from fat). Forrest Mars, Sr., son of the Mars Company founder, Frank C. Mars, copied the idea for the candy in the 1930s during the Spanish Civil War when he saw soldiers eating British - made Smarties, chocolate pellets with a colored shell of what confectioners call hard panning (essentially hardened sugar syrup) surrounding the outside, preventing the candies from melting. Mars received a patent for his own process on March 3, 1941. Production began in 1941 in a factory located at 285 Badger Avenue in Clinton Hill, Newark, New Jersey. When the company was founded it was M&M Limited. 
The two "Ms '' represent the names of Forrest E. Mars Sr., the founder of Newark Company, and Bruce Murrie, son of Hershey Chocolate 's president William F.R. Murrie, who had a 20 percent share in the product. The arrangement allowed the candies to be made with Hershey chocolate, as Hershey had control of the rationed chocolate at the time. The demand for the candies during World War II caused an increase in production and its factory moved to bigger quarters at 200 North 12th Street in Newark, New Jersey, where it remained until 1958 when it moved to a bigger factory at Hackettstown. During the war, the candies were exclusively sold to the military. In 1949, the brand released the tagline "Melts in your mouth, not in your hand. '' In 1950, a black "M '' was imprinted on the candies giving them a unique trademark. It was changed to white in 1954. In the early 1950s, the Midwest Research Institute (now MRIGlobal) in Kansas City, Missouri, worked on behalf of M&M 's to perfect a process whereby 3,300 pounds (1,500 kg) of chocolate centers could be coated every hour. Peanut M&M 's were introduced in 1954 but first appeared only in the color tan. In 1960, M&M 's added the yellow, red, and green colors. In 1976, orange was added to the M&M rainbow to replace red, which was discontinued due to the red dye scare over Red Dyes # 2 and # 4 having been evaluated to be carcinogenic in nature. In spite of the fact that M&M 's had used the less controversial Red Dye # 40, the public was wary of any food being dyed red. Red M&M 's were re-introduced in 1987. In the 1980s, M&M 's were introduced internationally to Australia, Canada, Europe, Hong Kong, Japan, Malaysia, and the United Kingdom. Although they were marketed and then withdrawn in the 1960s, almond - centered M&M 's were available again in 1988 in limited release, with appearances only during Christmas and Easter times; they became a standard part of the product line in 1992. 
Also in 1986, M&M 's launched Holidays Chocolate Candies for Easter and Christmas, with the Easter candies having bunny, chick, and egg symbols on pastel - colored shells, and the Christmas candies having pine tree, bell, and candle symbols on red and green shells, with the latter also having a special mint flavor. By 1993, the holiday symbols were replaced with the standard trademark "M ''. In 1991, Peanut Butter M&M 's were released. These candies have peanut butter inside the chocolate center and the same color scheme as the other varieties. Since at least 2013, the peanut butter M&M has been slightly smaller. In 1995, tan M&M 's were discontinued and replaced by blue. In 1996, Mars introduced "M&M 's Minis '', smaller candies usually sold in plastic tubes instead of bags. In 1999, Crispy M&M 's were released. They were slightly larger than the milk chocolate variety and also featured a crispy wafer center. They were discontinued in the United States in 2005 but remained available in Europe and Southeast Asia. In January 2015, they returned to production in the United States. In July 2001, Dulce de Leche M&M 's were introduced in five markets with large Hispanic populations: Los Angeles, California; San Diego, California; Miami, Florida; McAllen - Brownsville, Texas; and San Antonio, Texas. The flavor never became popular with the Hispanic community, who preferred existing M&M 's flavors, and it was discontinued in most areas by early 2003. In 2010, Pretzel M&M 's were released. They contain a crunchy, salty pretzel center inside the chocolate coating and are about the same size as the Peanut M&M 's, but their shape tends to be more spherical. In 2013, the M&M 's chocolate bar was re-released. It was originally released in 2004 and named M - Azing. In 2014, Mega M&M 's were re-introduced. Before then, the ' Mega M&M 's ' had been released in 2007 promoting the Shrek Movies, being dubbed "Ogre - Sized M&M 's ''. 
In 2015, Crispy M&M 's were re-introduced in the United States. They had remained available continuously in Europe and Australia. In 2016, the M&M cookie was re-introduced in the United States. Also in 2016, the M&M 's flavor vote was created, in which fans could vote for honey, coffee, or chili nut M&M 's to go with peanut M&M 's. Coffee Nut was announced as the winner by Veep 's Tony Hale. In April 2017, M&M 's chocolate blocks went on sale in Australia. Six varieties (milk chocolate, strawberry, crispy, hazelnut, crispy mint and almond) are available. Also in 2017, Caramel M&M 's were released in the United States. M&M 's varieties have included the following sizes and fillings. Note that some have only been made available for a limited time, such as white cheesecake for Easter, pumpkin spice or white candy corn for Halloween, and White Strawberry Shortcake for Valentine 's Day. Over the years, marketing has helped build and expand the M&M 's brand. Computer - animated graphics, personification of the candies as characters with cartoon - like storytelling, and various merchandising techniques, including the introduction of new flavors, colors and customizable merchandise, have helped to increase the brand 's recognition as a candy icon. In 1982, the Mars candy bar company rejected the inclusion of M&M 's in the new Steven Spielberg movie E.T.: The Extra-Terrestrial. Competitor Hershey, on the other hand, took a chance with its Reese 's Pieces, which are similar to M&M 's but contain a peanut butter filling, and with the film 's blockbuster success its candy sales dramatically increased, perhaps by as much as 300 %. In 1990, M&M 's exhibited at New York 's Erie County Fair a life - size fiberglass cow covered with 66,000 M&M candies -- each adhered by hand with the "m '' logo on each candy facing outward.
According to a website run by the cow 's designer, Michael Adams, the stunt earned M&M Mars $1 million in free publicity because it was reported on by Newsweek magazine, as well as the New York Post, UPI and WABC - TV, and Live with Regis. In 1995, the company ran the M&M 's Color Campaign, a contest in which participants were given the choice of selecting purple, blue, or pink as the color of a new variety of M&M 's. The announcement of the winning color (blue) was carried on most of the television networks ' news programs as well as the talk shows of David Letterman and Jay Leno. As part of the contest results, the company had the Empire State Building lighted in blue. Although the financial details of these deals were not disclosed, and neither was the campaign 's effect on sales, one marketing book estimated that the company "collected millions '' in free publicity and that the campaign "certainly '' resulted in an increase in the brand 's awareness. In 1998, M&M 's were styled as "The Official Candy of the New Millennium, '' as MM is the Roman numeral for 2000. That year also saw the release of rainbow M&M 's, which are multi-colored and filled with a variety of different fillings. In 2000, "Plain '' M&M 's (a name created in 1954 when "Peanut '' M&M 's were introduced) were renamed "Milk Chocolate '' M&M 's, and pictures of the candy pieces were added to the traditional brown and white packaging. In 1990, Mars Snackfood US signed up to be a sponsor for NASCAR in the Sprint Cup Series. Drivers for the M&M 's - sponsored car through the years have included Ernie Irvan (1999), Ken Schrader (2000 -- 02), Elliott Sadler (2003 -- 06), Ricky Rudd (2007), David Gilliland (2006 -- 07), Kyle Busch (2008 - current, won the 2015 Sprint Cup Series Championship), and Michael McDowell.
The introduction of the blue M&M to Australia was promoted by the Australian Football League 's Carlton Football Club, which wore sky - blue colored guernseys in one of its matches in 1997 instead of its traditional navy blue -- a color which the successful and fiercely traditional club had worn since the 1870s. In 2010, Mars Snackfood Australia described it as the most successful promotional campaign it had ever engaged in. In April 2005, M&M 's ran the "mPire '' promotion to tie in with the Star Wars: Episode III -- Revenge of the Sith movie release. M&M 's were offered in dark chocolate varieties (Regular and Peanut) for the first time after a string of Addams Family M&M 's commercials. In May 2004, M&M 's ran a Shrek 2 promotion to tie in with the movie 's release. M&M 's were offered "ogre - sized '' (65 % larger) in swamp / ogre colors. They were sold at many stores displayed in huge cardboard - cutout ogre displays. In the summer of 2005, Mars added "Mega M&M 's '' to the lineup. These candies, at 55 % larger than the traditional M&M 's, were a little smaller than the ogre - sized version. They were available in milk chocolate and peanut varieties. The colors for Mega M&M 's were changed to less - bright colors, ostensibly to appeal to older consumers: teal (replacing green), beige (replacing orange), maroon (replacing red), gold (replacing yellow), blue - gray (replacing blue), and brown. In July 2006, Dark Chocolate M&M 's reappeared in a purple package, followed in 2007 by Dark Chocolate Peanut M&M 's. Also in 2006, the company piloted White Chocolate M&M 's as a tie - in with their "Pirates of the Caribbean '' promotion. The company also offered eight new flavors of M&M 's via online sales, as well as at M&M 's World locations: "All That Razz ''; "Eat, Drink, & Be Cherry ''; "A Day at the Peach ''; "Orange - U-Glad ''; "Mint Condition ''; "AlmonDeeLicious ''; "Nut What You Think ''; and "Cookie Monster ''. 
Mars also released a "Crispy Mint '' variety in Australia that year. Also in 2006, M&M 's became the official chocolate of NASCAR. In 2007, M&M 's introduced a limited - edition raspberry flavor called "M&M 's Razzberry Chocolate Candies ''. Also in 2007, M&M 's produced a 50 - foot, smiling Lady Liberty M&M statue to kick off a campaign encouraging Americans to create their own M&M characters at mms.com. The website allows for people to log in and create their own character from scratch. They can choose features such as the color, shape, hair, and accessories. In 2008, two limited - edition varieties of the candy were introduced -- "Wildly Cherry '' M&M 's, and, as a marketing tie - in with the film Indiana Jones and the Kingdom of the Crystal Skull, "Mint Crisp '' M&M 's. M&M 's also introduced another new product called "M&M 's Premiums '' in 2008. They come in five flavors -- chocolate almond, mint chocolate, mocha, raspberry almond, and triple chocolate (milk, dark, and white chocolate), which are sold in small upright cartons with a plastic bag inside. M&M 's Premiums do not have a candy shell but are coated with carnauba wax and color. Dark Chocolate was added in 2009, replacing Mocha. During the summer of 2008, My M&M 's launched ' Faces, ' which allows consumers to print the faces of loved ones on M&M 's chocolate candies at mymms.com. In February 2009, M&M 's launched the "M&M 's Colour Break - Up '' promotion in Australia where M&M 's were sold in separate packs (one for each color): the packs included a code to win prizes. In Summer 2009, M&M 's launched a limited - edition "Strawberried Peanut Butter '' variant to tie in with the release of Transformers: Revenge of the Fallen. In addition, M&M 's launched a limited edition "Coconut M&M 's, '' which became a permanent item in 2010. In early 2010, M&M 's Bare All were released as part of a competition in Australia and New Zealand. 
M&M 's Bare All winning packs were ordinary M&M 's, but without colored shells. An official website was launched, along with television advertisements. In April 2010, M&M 's launched a new pretzel variety. In November 2011, Mars released M&M 's Cinnamon Milk Chocolate for Christmas. Around the time the pretzel M&M 's came out, the M&M 's wrapper designs in the U.S. were redone, replacing the old design used from 2004 to early 2010. In 2012, M&M 's released two new Dark Chocolate flavors: Raspberry and Mint. Also that year, M&M 's released a White Chocolate flavor for the Easter season. On May 30, 2012, M&M 's were launched in Macau, with Portuguese as the launch language. In 2012, Peanut M&M 's were produced in the UK in a limited edition "Red, White and Blues only '' pack, in connection with the country 's Diamond Jubilee and the 2012 Summer Olympics. The ' M ' remains white on the white candies. The commercial advertising this promotional package had Yellow donning various outfits of British stereotypes to try to get into the limited edition pack. Similarly, to promote the 2014 FIFA World Cup, Peanut M&M 's were produced in a pack that contained only green, yellow, and blue candies, dubbed "Brazilian M&M 's '' in reference to the colors of the flag of Brazil. "Brazilian M&M 's '' were re-released in 2016 to promote the 2016 Summer Olympics, but are now available in both Chocolate and Peanut. In 2013, M&M 's launched the "Better with M '' campaign, which included cause - related marketing. The campaign worked with Habitat for Humanity and encouraged fans to use a Facebook app to volunteer at the various sites where the homes were being built. The advertising campaign was one of the largest that Mars had ever executed. The 2013 "America Better With M '' initiative sought to provide money directly to Habitat for Humanity through offering limited versions of M&Ms in red, white and blue.
Related candy brands from Mars include Minstrels, Revels, Skittles, and Treets. M&M 's World specialty shops have been established in some locations, including Las Vegas, Orlando, New York, London, and Shanghai. Several M&Ms - themed video games have been created. The first was M&M 's: The Lost Formulas, released on September 28, 2000. Early black - and - white adverts for the candy in 1954 featured two talking, anthropomorphic M&M characters -- one plain and one peanut -- diving into a swimming pool full of chocolate. The characters ' first incarnation in CGI was a 1994 campaign, created by Blue Sky Studios, in which the characters asked celebrities which M&M 's candy color was their favorite. Concurrent with 1995 's blue M&M campaign, M&M 's introduced a second generation of computer - animated "spokescandies '' in their television commercials. This depiction and campaign were created by Will Vinton in 1995. Vinton had previously created the clay - animation California Raisins in 1986. In his CGI work, he depicted the M&M 's as more mature than most food mascots. These include the team of the cynical and sardonic Red (originally voiced by Jon Lovitz, thereafter Billy West), who is the mascot for milk chocolate, peanut butter, and crispy M&M 's, and the happy and gullible Yellow (originally voiced by John Goodman, thereafter J.K. Simmons), who is the mascot for peanut M&M 's (he was originally known as "Peanut '' when first introduced). Other mascots include the "cool one '', Blue (voiced by Phil Hartman until his death in 1998, thereafter Robb Pruitt), who is the mascot for almond M&M 's; the seductive Green (voiced by Cree Summer), who is the mascot for both dark chocolate mint and peanut butter M&M 's; and the slightly neurotic Orange (voiced by Eric Kirchberger), who was introduced when Crispy M&M 's were first released and returned when Pretzel M&M 's debuted in 2010.
Orange, upon his return, was joined by the second non-M&M mascot, Pretzel Guy (voiced by Maurice LaMarche), who "supports '' him and offers helpful advice, as Orange hates the idea of having a pretzel put inside his body. Other mascots that were introduced but are no longer used are Almond, the original green guy; Orange, a female peanut character; Chocolate Bar, the first non-M&M character, who always gets foiled or outdone by Red and Yellow by melting when M&M 's can't; and the Swarmees for M&M 's Minis candies, which are portrayed as destructive yet crafty troublemakers whom Red and Yellow are always trying unsuccessfully to contain. Female M&M 's mascots were introduced in 1995. Green was the milk chocolate mascot and Tan was the peanut. Marketing discontinued Tan when they introduced the then - new Blue mascot. Green was the only female M&M 's mascot from her introduction in 1995 until 2012, when M&M 's unveiled a new additional spokescandy, the businesslike Ms. Brown (voiced by Miss America 1984 Vanessa Williams), the "Chief Chocolate Officer. '' She made her debut in a Super Bowl XLVI advertisement, where several people at a party assume she is naked because her shell is the same color as her insides, which causes Red to remove his outer shell, thinking "it 's that kind of party '', and start dancing to the LMFAO song "Sexy And I Know It. '' The original colors of M&M 's candies were red, yellow, violet, green and orange. In 1976, the Food and Drug Administration released a study that linked red dye 2 in food coloring to cancer. Though Mars did not use this dye, they decided to pull the red M&M 's from the market to avoid possible misunderstandings. The red candies were reintroduced to the market ten years later. In early 1995, Mars ran a promotion in which consumers were invited to vote on whether blue, pink, or purple would replace the tan M&M 's. Blue was the winner with 54 % of the votes. It replaced tan in late 1995.
Consumers could vote by calling 1 - 800 - FUN - COLOR. Ads for the new blue color featured a plain and an almond blue M&M character. In one, Red and Yellow paint themselves blue and appear on stage with B.B. King singing the blues, but the filmmakers cut the scene because they were not the real blue M&M 's; another featured Red and Yellow holding their breath to look like the new blue M&M 's, with Steven Weber seeing the three M&M 's, Red, Yellow, and Blue; and one more featured Weber asking the blue M&M whether he had dived into the chocolate pool, which he had not. In 2002, Mars solicited votes in their first ever "M&M 's Global Color Vote '' to add a new color from three choices: aqua (turquoise), pink, and purple. Purple won and was featured for a limited time. To help the colors get votes, Ken Schrader and his MB2 Motorsports team, which was sponsored by M&M 's at the time, ran four paint schemes during the 2002 NASCAR Winston Cup Series season representing the promotion (one for aqua, one for pink, one for purple, and another with all three colors on the car). Specially marked packages of M&M 's were released in Japan; finding a bag of all purple M&M 's entitled the customer to a prize of 100 million yen (equivalent to approximately US $852,000). On January 1, 2004, at the stroke of midnight, Mars removed all of the colors of M&M 's and made them black - and - white, on both the candies and the packaging. The change coincided with a commercial parodying The Wizard of Oz, in which Dorothy is home in bed, looks out of the window, and sees what the colors of the four M&M 's were. The goal was to help the M&M 's find their colors in black - and - white packages of M&M 's, in this order: brown, orange, red, green, yellow, and blue. After all of the colors had been found, the colored packaging returned and began carrying the theme "Chocolate is better in color ''.
Since 2004, M&M 's have been available online in 17 colors, with personalized phrases on each candy on the opposite side from the "m ''. Released around Christmas, these custom - printed M&M 's were originally intended for holiday greetings, but are now available all year round. For the 2008 Valentine 's Day season, Mars introduced all - green bags of M&M 's. This was due to common urban folklore that says green M&M 's are an aphrodisiac. They were brought back for 2009 alongside the "Ms. Green Heats Up Valentine 's Day '' contest. In October 2011, Mars released M&M 's White Chocolate Candy Corn exclusively in the United States for Halloween. These candies come in three candy corn inspired colors: white, bright yellow, and bright orange. The following is a summary of the changes to the colors of the flagship (milk chocolate) flavor of M&M 's, the only filling manufactured continuously since the beginning of the brand. From 1941 until 1969, each package contained M&M 's in five different colors; when red M&M 's were reintroduced in 1987, they were added as a sixth color instead of replacing any of the existing colors. Red candies were eliminated in 1976 because of health concerns over the dye amaranth (FD&C Red # 2), which was a suspected carcinogen, and were replaced with orange - colored candies. This was done despite the fact that M&M 's did not contain the dye; the action was purely to satisfy worried consumers. Red candies were reintroduced ten years later, but they also kept the orange colored M&M 's. Paul Hethmon, then a student at University of Tennessee, started the campaign to bring back red M&M 's as a joke that would eventually become a worldwide phenomenon. In Europe, red M&M 's contain the red dye carmine (E120). Carmine, also known as cochineal, has had some campaigns launched against its use in food, because it is made from crushed insects. Notably, Starbucks in the UK had a campaign launched against it for using carmine.
My Sweet Lord - wikipedia "My Sweet Lord '' is a song by English musician and Beatle, George Harrison. It was released in November 1970 on his triple album All Things Must Pass. Also issued as a single, Harrison 's first as a solo artist, "My Sweet Lord '' topped charts worldwide and was the biggest - selling single of 1971 in the UK. In America and Britain, the song was the first number - one single by an ex-Beatle. Harrison originally gave the song to his fellow Apple Records artist Billy Preston to record; this version, which Harrison co-produced, appeared on Preston 's Encouraging Words album in September 1970. Harrison wrote "My Sweet Lord '' in praise of the Hindu god Krishna, while at the same time intending the lyrics to serve as a call to abandon religious sectarianism through his deliberate blending of the Hebrew word hallelujah with chants of "Hare Krishna '' and Vedic prayer. The recording features producer Phil Spector 's Wall of Sound treatment and heralded the arrival of Harrison 's much - admired slide guitar technique, which one biographer described as being "musically as distinctive a signature as the mark of Zorro ''. Preston, Ringo Starr, Eric Clapton, and the group Badfinger are among the other musicians appearing on the recording. Later in the 1970s, "My Sweet Lord '' was at the centre of a heavily publicised copyright infringement suit, due to its similarity to the Ronnie Mack song "He 's So Fine '', a 1963 hit for the New York girl group the Chiffons. In 1976, Harrison was found to have subconsciously plagiarised the earlier tune, a verdict that had repercussions throughout the music industry. He claimed to have used the out - of - copyright "Oh Happy Day '', a Christian hymn, as his inspiration for the song 's melody. Harrison performed "My Sweet Lord '' at the Concert for Bangladesh in August 1971, and it remains the most popular composition from his post-Beatles career. 
He reworked the song as "My Sweet Lord (2000) '' for inclusion as a bonus track on the 30th anniversary reissue of All Things Must Pass. Many artists have covered the song including Andy Williams, Peggy Lee, Edwin Starr, Johnny Mathis, Nina Simone, Julio Iglesias, Richie Havens, Megadeth, Boy George, Elton John, Jim James, Bonnie Bramlett and Elliott Smith. "My Sweet Lord '' is ranked 460th on Rolling Stone magazine 's list of "the 500 Greatest Songs of All Time ''. The song reached number 1 in Britain for a second time when re-released in January 2002, two months after Harrison 's death. George Harrison began writing "My Sweet Lord '' in December 1969, when he, Billy Preston and Eric Clapton were in Copenhagen, Denmark, as guest artists on Delaney & Bonnie 's European tour. By this time, Harrison had already written the gospel - influenced "Hear Me Lord '' and "Gopala Krishna '', and (with Preston) the African - American spiritual "Sing One for the Lord ''. He had also produced two religious - themed hit singles on the Beatles ' Apple record label: Preston 's "That 's the Way God Planned It '' and Radha Krishna Temple (London) 's "Hare Krishna Mantra ''. The latter was a musical adaptation of the 5000 - year - old Vaishnava Hindu mantra, performed by members of the International Society for Krishna Consciousness (ISKCON), colloquially known as "the Hare Krishna movement ''. Harrison now wanted to fuse the messages of the Christian and Gaudiya Vaishnava faiths into what musical biographer Simon Leng terms "gospel incantation with a Vedic chant ''. The Copenhagen stopover marked the end of the Delaney & Bonnie tour, with a three - night residency at the Falkoner Theatre on 10 -- 12 December. According to Harrison 's 1976 court testimony, "My Sweet Lord '' was conceived while the band members were attending a backstage press conference and he had ducked out to an upstairs room at the theatre. 
Harrison recalled vamping chords on guitar and alternating between sung phrases of "hallelujah '' and "Hare Krishna ''. He later took the idea to the others, and the chorus vocals were developed further. Band leader Delaney Bramlett 's more recent version of events is that the idea originated from Harrison asking him how to go about writing a genuine gospel song, and that Bramlett demonstrated by scat singing the words "Oh my Lord '' while wife Bonnie and singer Rita Coolidge added gospel "hallelujah '' s in reply. British music journalist John Harris has questioned the accuracy of Bramlett 's account, however, comparing it to a fisherman 's "It was this big '' - type bragging story. Using as his inspiration the Edwin Hawkins Singers ' rendition of an eighteenth - century Christian hymn, "Oh Happy Day '', Harrison continued working on the theme. He completed the song, with some help from Preston, once they had returned to London. The song 's lyrics reflect Harrison 's often - stated desire for a direct relationship with God, expressed in simple words that all believers could affirm, regardless of their religion. Author Ian Inglis observes a degree of "understandable '' impatience in the first verse 's line, "Really want to see you, Lord, but it takes so long, my Lord ''. By the end of the song 's second verse, Harrison declares a wish to "know '' God also and attempts to reconcile the impatience:

I really want to know you
Really want to go with you
Really want to show you, Lord, that it won't take long, my Lord

Following this verse, in response to the main vocal 's repetition of the song title, Harrison devised a choral line singing the Hebrew word of praise, "hallelujah '', common in the Christian and Jewish religions.
Later in the song, after an instrumental break, these voices return, now chanting the first twelve words of the Hare Krishna mantra, known more reverentially as the Maha mantra:

Hare Krishna, Hare Krishna
Krishna Krishna, Hare Hare
Hare Rama, Hare Rama

These Sanskrit words are the main mantra of the Hare Krishna faith, with which Harrison identified, although he did not belong to any spiritual organisation. In his 1980 autobiography, I, Me, Mine, Harrison explained that he intended repeating and alternating "hallelujah '' and "Hare Krishna '' to show that the two terms meant "quite the same thing '', as well as to have listeners chanting the mantra "before they knew what was going on! '' Following the Sanskrit lines, "hallelujah '' is sung twice more before the mantra repeats, along with an ancient Vedic prayer. According to Hindu tradition, this prayer is dedicated to a devotee 's spiritual teacher, or guru, and equates the teacher to the divine Trimurti -- Brahma, Vishnu and Shiva (or Maheshvara) -- and to the Godhead, Brahman.

Gurur Brahmā, gurur Viṣṇur
gurur devo Maheśvaraḥ
gurus sākṣāt, paraṃ Brahma
tasmai śrī gurave namaḥ.

Former Krishna devotee Joshua Greene translates the lines as follows: "I offer homage to my guru, who is as great as the creator Brahma, the maintainer Vishnu, the destroyer Shiva, and who is the very energy of God. '' The prayer is the third verse of the Guru Stotram, a fourteen - verse hymn in praise of Hindu spiritual teachers. Some Christian fundamentalist anti-rock activists objected that chanting "Hare Krishna '' in "My Sweet Lord '' was anti-Christian or satanic, while some born - again Christians adopted the song as an anthem. Several commentators cite the mantra and the simplicity of Harrison 's lyrics as central to the song 's universality.
The "lyrics are not directed at a specific manifestation of a single faith 's deity, '' Inglis writes, "but rather to the concept of one god whose essential nature is unaffected by particular interpretations and who pervades everything, is present everywhere, is all - knowing and all - powerful, and transcends time and space... All of us -- Christian, Hindu, Muslim, Jew, Buddhist -- can address our gods in the same way, using the same phrase (' my sweet Lord '). '' With the Beatles still together officially in December 1969, Harrison had no plans to make a solo album of his own and reportedly intended to offer "My Sweet Lord '' to Edwin Hawkins. Instead, following the Delaney & Bonnie tour, he decided to record it with Billy Preston, for whom Harrison was co-producing a second Apple album, Encouraging Words. Recording took place at Olympic Studios in London, in January 1970, with Preston as principal musician, supported by the guitarist, bass player and drummer from the Temptations ' backing band. The Edwin Hawkins Singers happened to be on tour in the UK as well, so Harrison invited them to participate; Hawkins ' gospel group also overdubbed vocals onto the Harrison -- Preston collaboration "Sing One for the Lord '' at this time. Preston 's version of "My Sweet Lord '' differs from Harrison 's later reading in that the "hallelujah '' refrain appears from the start of the song and, rather than the full mantra section, the words "Hare Krishna '' are sung only twice throughout the whole track. With the Vedic prayer likewise absent, Simon Leng views this original recording as a possible "definitive ' roots ' take ' '' of the song, thanks to its "pure gospel groove '' and Hawkins ' participation. In his review of Encouraging Words, Bruce Eder of AllMusic describes "My Sweet Lord '' and "All Things Must Pass '' (another Harrison composition originally given to Preston to record) as "stunning gospel numbers... that make the Harrison versions seem pallid ''. 
Preston 's "My Sweet Lord '' was a minor hit in Europe when issued as a single there in September 1970, but otherwise, Encouraging Words made little impression commercially. The album and single releases were delayed for at least two months in the United States, where "My Sweet Lord '' would climb to number 90 on the Billboard Hot 100 by the end of February 1971, helped by the enormous success of Harrison 's version. Five months after the Olympic session, with the Beatles having now broken up, "My Sweet Lord '' was one of 30 or more tracks that Harrison recorded for his All Things Must Pass triple album. It was a song he had been reluctant to record, for fear of committing himself publicly to such an overt religious message. "I was sticking my neck out on the chopping block because now I would have to live up to something, '' Harrison explained in I Me Mine, "but at the same time I thought ' Nobody 's saying it; I wish somebody else was doing it. ' '' With Phil Spector co-producing the sessions at Abbey Road Studios, Preston again played on the track, along with Clapton, Ringo Starr, Jim Gordon and all four members of Badfinger. The identity of the remaining musicians has traditionally been open to question, with drummer Alan White once claiming he played on the song, with Carl Radle on bass, Starr on tambourine and John Lennon among the rhythm guitarists. The common view, following research by Simon Leng, is that Harrison and Spector chose from a number of rhythm tracks before selecting the master take, which featured, among others, Klaus Voormann on bass and Gary Wright on a second keyboard; Bruce Spizer suggests that Peter Frampton may have added acoustic guitar after the main session. 
Harrison 's original vocal appears to have been acceptable, according to notes written by Spector in August, but the chorus vocals (all sung by Harrison and credited to "the George O'Hara - Smith Singers ''), his harmonised slide guitar parts, and John Barham 's orchestral arrangement were overdubbed during the next two months, partly at Trident Studios in central London. Leng describes the recording as a "painstakingly crafted tableau '' of sound, beginning with a bank of "chiming '' acoustic guitars and the "flourish '' of zither strings that introduces Harrison 's slide - guitar motif. At close to the two - minute mark, after the tension - building bridge, a subtle two - semitone shift in key (from E major to the rarely used key of F - sharp major, via a C# dominant seventh chord) signals the song 's release from its extended introduction. This higher register is then complemented by Harrison 's "increasingly impassioned '' vocal and the subsequent "timely reappearance '' of his twin slide guitars, before the backing vocals "deftly '' switch to the Sanskrit mantra and prayer. Leng also notes the Indian music aspects of the production, in the "swarmandal - like '' zithers, representing the sympathetic strings of a sitar, and the slide guitars ' evocation of sarangi, dilruba and other string instruments. In an interview for Martin Scorsese 's 2011 documentary on George Harrison, Spector recalls that he liked the results so much, he insisted that "My Sweet Lord '' be the lead single from the album. This later, rock version of the song was markedly different from the "Oh Happy Day '' - inspired gospel arrangement in musical and structural terms, aligning Harrison 's composition with pop music conventions, but also drawing out the similarities of its melody line with that of the Chiffons ' 1963 hit "He 's So Fine ''. 
Spizer suggests that this was due to Harrison being "so focused on the feel of his record '', while Record Collector editor Peter Doggett wrote in 2001 that, despite Harrison 's inspiration for "My Sweet Lord '' having come from "Oh Happy Day '', "in the hands of producer and arranger Phil Spector, it came out as a carbon copy of the Chiffons ' (song) ''. Chip Madinger and Mark Easter remark on the "sad '' fact that Spector, as "master of all that was ' girl - group ' during the early ' 60s '', failed to recognise the similarities. Before arriving in New York on 28 October to carry out mastering on All Things Must Pass, Harrison had announced that no single would be issued -- so as not to "detract from the impact '' of the triple album. Apple 's US executive, Allan Steckler, together with business manager Allen Klein and Spector all pushed for "My Sweet Lord '' to be released immediately, however, even though Billy Preston 's version was already scheduled for release as a single in America the following month. Film director Howard Worth recalls a preliminary finance meeting for the Raga documentary (for which Harrison would provide emergency funding through Apple Films) that began with the ex-Beatle asking him to listen to a selection of songs and pick his favourite, which was "My Sweet Lord ''. Harrison relented, and "My Sweet Lord '' was issued as the album 's lead single around the world, but not in Britain; the release date was 23 November 1970 in the United States. The mix of the song differed from that found on All Things Must Pass by featuring less echo and a slightly altered backing - vocal track. Both sides of the North American picture sleeve consisted of a Barry Feinstein photo of Harrison taken through a window at his recently purchased Friar Park home, with some of the estate 's trees reflected in the glass. 
In America, the single was released as a double A-side with "Is n't It a Pity '' under Apple catalogue number 2995, with both sides of the disc featuring a full Apple label. Public demand via constant airplay in Britain led to a belated UK release, on 15 January 1971. There, as Apple R 5884, the single was backed by "What Is Life '', a song that Apple soon released elsewhere internationally as the follow - up to "My Sweet Lord ''. Harrison 's version of "My Sweet Lord '' was an international number 1 hit by the end of 1970 and through the early months of 1971 -- the first solo single by a Beatle to reach the top, and the biggest seller by any of the four throughout the 1970s. Without the support of any concert appearances or promotional interviews by Harrison, the single 's commercial success was due to its impact on radio, where, Harrison biographer Gary Tillery writes, the song "rolled across the airwaves like a juggernaut, with commanding presence, much the way Dylan 's ' Like a Rolling Stone ' had arrived in the mid-sixties ''. Elton John recalls first hearing "My Sweet Lord '' in a taxi and names it as the last of the era 's great singles: "I thought, ' Oh my God, ' and I got chills. You know when a record starts on the radio, and it 's great, and you think, ' Oh, what is this, what is this, what is this? ' The only other record I ever felt that way about (afterwards) was ' Brown Sugar '... '' In his 40 - page Harrison tribute article for Rolling Stone in 2002, Mikal Gilmore credited "My Sweet Lord '' as being "as pervasive on radio and in youth consciousness as anything the Beatles had produced ''. The single was certified gold by the Recording Industry Association of America on 14 December 1970 for sales of over 1 million copies. It reached number 1 on the US Billboard Hot 100 on 26 December, remaining on top for four weeks, three of which coincided with All Things Must Pass 's seven - week reign atop the Billboard albums chart. 
In Britain, "My Sweet Lord '' entered the charts at number 7, before hitting number 1 on 30 January and staying there for five weeks. It was the biggest - selling single of 1971 in the UK and performed similarly well around the world, particularly in France and Germany, where it held the top spot for nine and ten weeks, respectively. In his 2001 appraisal of Harrison 's Apple recordings, for Record Collector, Doggett described Harrison as "arguably the most successful rock star on the planet '' over this period, adding: "' My Sweet Lord ' and All Things Must Pass topped charts all over the world, easily outstripping other solo Beatles projects later in the year, such as Ram and Imagine. '' The single 's worldwide sales amounted to 5 million copies by 1978, making it one of the best - selling singles of all time. By 2010, according to Inglis, "My Sweet Lord '' had sold over 10 million copies. The song returned to the number 1 position again in the UK when reissued in January 2002, two months after Harrison 's death from cancer at the age of 58. Peter Lavezzoli, author of The Dawn of Indian Music in the West, has written of Harrison 's first solo single: "' My Sweet Lord ' was everything that people wanted to hear in November 1970: shimmering harmonies, lustrous acoustic guitars, a solid Ringo Starr backbeat, and an exquisite (Harrison) guitar solo. '' Reviewing the single for Rolling Stone, Jon Landau called the track "sensational ''. In an era when songs by Radha Krishna Temple and adaptations of the Christian hymns "Oh Happy Day '' and "Amazing Grace '' were all worldwide hits, Ben Gerson of Rolling Stone observed that the substituting of Harrison 's "Hare Krishna '' refrain for the trivial "Doo - lang, doo - lang, doo - lang '' s of "He 's So Fine '' was "a sign of the times ''. John Lennon told a reporter, "Every time I put the radio on, it 's ' Oh my Lord ' -- I 'm beginning to think there must be a God. 
'' In his December 1970 album review for NME, Alan Smith bemoaned the apparent lack of a UK single release for "My Sweet Lord '' and noted that the song "seems to owe something '' to "He 's So Fine ''. To Gerson, it was an "obvious re-write '' of the Chiffons hit, and within two months US music publisher Bright Tunes had served a writ on Harrison citing unauthorised copyright infringement. In a January 1971 review for NME, Derek Johnson expressed surprise at Apple 's delay in releasing the single in the UK, before declaring: "In my opinion, this record -- finally and irrevocably -- establishes George as a talent equivalent to either Lennon or McCartney. '' More recently, AllMusic 's Richie Unterberger explains the international popularity of Harrison 's single: "' My Sweet Lord ' has a quasi-religious feel, but nevertheless has enough conventional pop appeal to reach mainstream listeners who may or may not care to dig into the spiritual lyrical message. '' Added to this was a slide guitar riff that Simon Leng describes as "among the best - known guitar passages in popular music ''. Ian Inglis highlights the combination of Harrison 's "evident lack of artifice '' and Spector 's "excellent production '', such that "My Sweet Lord '' can be heard "as a prayer, a love song, an anthem, a contemporary gospel track, or a piece of perfect pop ''. Due to the ensuing plagiarism suit, "My Sweet Lord '' became somewhat stigmatised by association, to the point where no mention of the song was complete without a reference to "He 's So Fine ''. "My Sweet Lord '' was ranked 460th on Rolling Stone magazine 's list of "the 500 Greatest Songs of All Time '' in 2004, yet the accompanying text only briefly mentioned the success of the single and Harrison 's "teardrop slide licks '' before concentrating on the controversial lawsuit. 
While acknowledging the common ground between the two songs, music critic David Fricke describes Harrison 's composition as "the honest child of black American sacred song ''. Writing around the time of All Things Must Pass 's 2001 reissue, again for Rolling Stone, Anthony DeCurtis described "My Sweet Lord '' as "capturing the sweet satisfactions of faith '', while to Mikal Gilmore, it is an "irresistible devotional ''. At the end of 1971, "My Sweet Lord '' topped the Melody Maker readers ' polls for both "Single of the Year '' and "World 's Single of the Year ''; in the US publication Record World, the song was also voted best single and Harrison was honoured as "Top Male Vocalist of 1971 ''. In June 1972, Harrison won two Ivor Novello songwriting awards for "My Sweet Lord ''. In 2010, AOL Radio listeners voted "My Sweet Lord '' the best song from George Harrison 's solo years. Mick Jagger and Keith Richards have both named it among their personal favourites of all Harrison 's songs, along with "While My Guitar Gently Weeps ''. According to the website Acclaimed Music, "My Sweet Lord '' has also appeared in the following critics ' best - song lists and books, among others: The 7,500 Most Important Songs of 1944 -- 2000 by author Bruce Pollock (2005), Dave Thompson 's 1000 Songs That Rock Your World (2011; ranked at number 247), Ultimate Classic Rock 's "Top 100 Classic Rock Songs '' (2013; number 56), the NME 's "100 Best Songs of the 1970s '' (2012; number 65), and the same magazine 's "500 Greatest Songs of All Time '' (2014; number 270). On 10 February 1971, Bright Tunes filed suit against Harrison and associated organisations (including Harrisongs, Apple Records and BMI), alleging copyright infringement of the late Ronnie Mack 's song "He 's So Fine ''. In I Me Mine, Harrison admits to having thought "Why did n't I realise? 
'' when others started pointing out the similarity between the two songs; by June that year, country singer Jody Miller had released a cover of "He 's So Fine '' incorporating Harrison 's "My Sweet Lord '' slide - guitar riffs, thus "really putting the screws in '' from his point of view. On Harrison 's behalf, manager Allen Klein entered into negotiations with Bright Tunes to resolve the issue, by offering to buy the financially ailing publisher 's entire catalogue, but no settlement could be reached before the company was forced into receivership. While comparing the two compositions, author and musicologist Dominic Pedler writes that both songs have a three - syllable title refrain ("My sweet Lord '', "He 's so fine '') followed by a 5 - 3 - 2 descent of the major scale in the tonic key (E major for "My Sweet Lord '' and G major for "He 's So Fine ''); respective tempos are similar: 121 and 145 beats per minute. In the respective B sections ("I really want to see you '' and "I dunno how I 'm gon na do it ''), there is a similar ascent through 5 - 6 - 8, but the Chiffons distinctively retain the G tonic for four bars and, on the repeat of the motif, uniquely go to an A-note 9th embellishment over the first syllable of "gon na ''. Harrison, on the other hand, introduces the more complex harmony of a relative minor (C # m), as well as the fundamental and distinctly original slide - guitar motif. While the case was on hold, Harrison and his former bandmates Lennon and Starr chose to sever ties with Klein at the end of March 1973 -- an acrimonious split that led to further lawsuits for the three ex-Beatles. Bright Tunes and Harrison later resumed their negotiations; his final offer of 40 per cent of "My Sweet Lord '' 's US composer 's and publisher 's royalties, along with a stipulation that he retain copyright for his song, was viewed as a "good one '' by Bright 's legal representation, yet the offer was rejected. 
It later transpired that Klein had renewed his efforts to purchase the ailing company, now solely for himself, and to that end was supplying Bright Tunes with insider details regarding "My Sweet Lord '' 's sales figures and copyright value. In the build - up to the case going to court, the Chiffons recorded a version of "My Sweet Lord '', with the aim of drawing attention to the lawsuit. Beatles author Alan Clayson has described the plagiarism suit as "the most notorious civil action of the decade '', the "extremity '' of the proceedings provoked by a combination of the commercial success of Harrison 's single and the intervention of "litigation - loving Mr Klein ''. Bright Tunes Music v. Harrisongs Music finally went to the United States district court on 23 February 1976, to hear evidence on the allegation of plagiarism. Harrison attended the proceedings in New York, with a guitar, and each side called musical experts to support its argument. After reconvening in September 1976, the court found that Harrison had "subconsciously '' copied the earlier tune, since he admitted to having been aware of the Chiffons ' recording. Judge Richard Owen said in his conclusion to the proceedings: "Did Harrison deliberately use the music of He 's So Fine? I do not believe he did so deliberately. Nevertheless, it is clear that My Sweet Lord is the very same song as He 's So Fine with different words, and Harrison had access to He 's So Fine. This is, under the law, infringement of copyright, and is no less so even though subconsciously accomplished. '' With liability established, the court then recommended an amount for the damages to be paid by Harrison and Apple to Bright Tunes, which Owen totalled at $1,599,987 -- amounting to three - quarters of the royalty revenue raised in North America from "My Sweet Lord '', as well as a significant proportion of that from the All Things Must Pass album. 
This figure has been considered over-harsh and unrealistic by some observers, since it both underplayed the unique elements of Harrison 's recording -- the universal spiritual message of its lyrics, the signature guitar hook, and its production -- and ignored the critical acclaim his album received in its own right. Elliot Huntley observes: "People do n't usually hear a single and then automatically go and buy an expensive boxed - set triple album on the off - chance. '' The award factored in the royalty revenue raised from "My Sweet Lord '' 's inclusion on the recent Best of George Harrison compilation, though at a more moderate percentage than for the 1970 album. The ruling set new legal precedents and was a personal blow for Harrison, who admitted he was too "paranoid '' to write anything new for some time afterwards. Early reaction in the music industry saw Little Richard claim breach of copyright over a track the Beatles had recorded in 1964 for the Beatles for Sale album, and Ringo Starr credit songwriter Clifford T. Ward as the inspiration for his Ringo 's Rotogravure song "Lady Gaye ''. In the UK, the corresponding damages suit, brought by Peter Maurice Music, was swiftly settled out of court in July 1977. During the drawn - out damages portion of the US suit, events played into Harrison 's hands when Klein 's ABKCO Industries finally purchased the copyright to "He 's So Fine '', and with it all litigation claims, after which Klein proceeded to negotiate sale of the song to Harrison. On 19 February 1981, the court decided that due to Klein 's duplicity in the case, Harrison would only have to pay ABKCO $587,000 instead of the $1.6 million award and he would also receive the rights to "He 's So Fine '' -- $587,000 being the amount Klein had paid Bright Tunes for the song in 1978. 
The court ruled that the former manager 's actions had been in breach of the fiduciary duty owed to Harrison, a duty that continued "even after the principal -- agent relationship ended ''. The litigation continued through to the early 1990s, however, as the finer points of the settlement were ironed out; in his 1993 essay on Bright Tunes v. Harrisongs, Joseph Self describes it as "without question, one of the longest running legal battles ever to be litigated in (the United States) ''. Matters would not ultimately be concluded until March 1998. Subsequent charges of plagiarism in the music industry have resulted in a policy of swift settlement and therefore limited damage to an artist 's credibility: the Rolling Stones ' "Anybody Seen My Baby? '', Oasis ' "Shakermaker '', "Whatever '' and "Step Out '', and the Verve 's "Bitter Sweet Symphony '' are all examples of songs whose writing credits were hastily altered to acknowledge composers of a potentially plagiarised work, with the minimum of litigation. Shortly before the ruling was handed down in September 1976, Harrison wrote and recorded a song inspired by the court case -- the upbeat "This Song '' -- which includes the line "This tune has nothing ' Bright ' about it ''. The 1960s soul hits "I Ca n't Help Myself (Sugar Pie Honey Bunch) '' and "Rescue Me '', as well as his own composition "You '', are all name - checked in the lyrics, as if to demonstrate the point that, as he later put it, "99 % of the popular music that can be heard is reminiscent of something or other. '' In a 1980 interview with Playboy magazine, John Lennon expressed his doubts about the notion of "subconscious '' plagiarism, saying: "He must have known, you know. He 's smarter than that... He could have changed a couple of bars in that song and nobody could ever have touched him, but he just let it go and paid the price. Maybe he thought God would just sort of let him off. 
'' Ringo Starr 's reaction was more charitable: "There 's no doubt that the tune is similar but how many songs have been written with other melodies in mind? George 's version is much heavier than The Chiffons -- he might have done it with the original in the back of his mind, but he 's just very unlucky that someone wanted to make it a test case in court. '' Speaking to his friend and I, Me, Mine editor Derek Taylor in 1979, Harrison said of the episode: "I do n't feel guilty or bad about it, in fact it saved many a heroin addict 's life. I know the motive behind writing the song in the first place and its effect far exceeded the legal hassle. '' Since its initial release on All Things Must Pass, "My Sweet Lord '' has appeared on the 1976 compilation The Best of George Harrison and 2009 's career - spanning Let It Roll: Songs by George Harrison. The original UK single (with "What Is Life '' as the B - side) was reissued on Christmas Eve 1976 in Britain -- a "provocative '' move by EMI, given the publicity the lawsuit had attracted that year for the song. The song appears in the 2017 Marvel Studios sequel film, Guardians of the Galaxy Vol. 2, and it is included on the movie 's soundtrack. On 26 December 1975, Harrison made a guest appearance on his friend Eric Idle 's BBC2 comedy show Rutland Weekend Television, sending up his serious public image, and seemingly about to perform "My Sweet Lord ''. As a running gag throughout the half - hour show, Harrison interrupts the sketches, trying to land an acting role as a pirate (and dressed accordingly), but gets turned down each time by RWT regulars Idle and Neil Innes, who simply want him to play the part of "George Harrison ''. 
He then reappears at the end in more normal attire, strumming the well - known introduction to "My Sweet Lord '' on an acoustic guitar, and backed by the house band; instead of continuing with the song, Harrison finally takes his chance to play "Pirate Bob '' by abruptly segueing into a sea shanty -- to the horror of the "greasy '' compère, played by Idle. The other musicians follow Harrison 's lead, after which a group of dancers appear on stage and the show 's closing credits roll. This performance is known as "The Pirate Song '', co-written by Harrison and Idle, and the recording is only available unofficially on bootleg compilations such as Pirate Songs. Observing the parallels with Harrison 's real - life reluctance to play the pop star, Simon Leng writes, "there was great resonance within these gags. '' In January 2001, Harrison included a new version of the song as a bonus track on the remastered All Things Must Pass album. "My Sweet Lord (2000) '' featured Harrison sharing vocals with Sam Brown, daughter of his friend Joe Brown, backed by mostly new instrumentation, including acoustic guitar by his son Dhani and tambourine by Ray Cooper. The track opens with a "snippet '' of sitar, to "emphasize its spiritual roots '', Leng suggests. On release, Harrison explained that his motivation for remaking the song was partly to "play a better slide guitar solo ''; he also cited the "spiritual response '' that the song had traditionally received, together with his interest in reworking the tune to avoid the contentious musical notes, as further reasons. Of the extended slide - guitar break on "My Sweet Lord (2000) '', Leng writes: "(Harrison) had never made so clear a musical statement that his signature bottleneck sound was as much his tool for self - expression as his vocal cords. 
'' Elliot Huntley opines that Harrison 's vocal was more "gospel inflected '' and perhaps even more sincere than on the original recording, "given his deteriorating health '' during the final year of his life. This version also appeared on the January 2002 posthumous release of the "My Sweet Lord '' single -- a three - song charity CD comprising the original 1970 -- 71 hit, the acoustic run - through of "Let It Down '' (with recent overdubs, another 2001 bonus track), and Harrison 's reworking of the title song. Proceeds from the single went to the Material World Charitable Foundation, set up by Harrison in April 1973. For some months after the single 's release, a portion of "My Sweet Lord (2000) '' played on Harrison 's official website, on a constant loop, over screen images of lotus petals scattering and then re-forming. The song also appears on the 2014 Apple Years 1968 -- 75 reissue of All Things Must Pass. In November 2011, a demo of "My Sweet Lord '', with Harrison backed by just Voormann and Starr, was included on the deluxe edition CD accompanying the British DVD release of Martin Scorsese 's George Harrison: Living in the Material World documentary. Described as an early "live take '' by compilation producer Giles Martin, and an "acoustic hosanna '' by David Fricke of Rolling Stone, it was recorded at the start of the All Things Must Pass sessions and was later released internationally on Early Takes: Volume 1 in May 2012. Harrison performed "My Sweet Lord '' at every one of his relatively few solo concerts, starting with the two Concert for Bangladesh shows at New York 's Madison Square Garden on 1 August 1971. The recording released on the subsequent live album was taken from the evening show and begins with Harrison 's spoken "Hare Krishna '' over his opening acoustic - guitar chords. 
Among the 24 backing musicians was a "Soul Choir '' featuring singers Claudia Lennear, Dolores Hall and Jo Green, but it was Harrison who sang the end - of - song Guru Stotram prayer in his role as lead vocalist, unlike on the studio recording (where it was sung by the backing chorus); the slide guitar parts were played by Eric Clapton and Jesse Ed Davis. During his 1974 North American tour, Harrison 's only one there as a solo artist, "My Sweet Lord '' was performed as the encore at each show. In contrast with the subtle shift from "hallelujah '' s to Sanskrit chants on his 1970 original, Harrison used the song to engage his audience in the practice of "chanting the holy names of the Lord '', or kirtan -- from "Om Christ! '' and Krishna, to Buddha and Allah -- with varying degrees of success. Backed by a band that again included Billy Preston, Harrison turned "My Sweet Lord '' into an "R&B - styled '' extended gospel - funk piece, closer in its arrangement to Preston 's Encouraging Words version and lasting up to ten minutes. The performance of the song at Tulsa 's Assembly Center on 21 November marked the only guest appearance of the tour when Leon Russell joined the band on stage. Harrison 's second and final solo tour took place in Japan in December 1991, with Clapton 's band. A live version of "My Sweet Lord '' recorded at the Tokyo Dome, on 14 December, was released the following year on the Live in Japan album.
what was the population of africa in 1900
List of countries by population in 1900 - wikipedia This is a list of countries by population in 1900. Estimate numbers are from the beginning of the year and exact population figures are for countries that held a census on various dates in the year 1900.
list of top 10 religion in the world
List of Religious populations - wikipedia This is a list of religious populations by number of adherents and countries. Adherents.com says "Sizes shown are approximate estimates, and are here mainly for the purpose of ordering the groups, not providing a definitive number ''. Countries with the greatest proportion of Christians from Christianity by country (as of 2010): Countries with the greatest proportion of Muslims from Islam by country (as of 2010) (figures excluding foreign workers in parentheses): Remarks: Saudi Arabia does not include other religious beliefs in its census; the figures for these other religious groups could be higher than reported in the nation. While conversion to Islam is among its most supported tenets, conversion from Islam to another religion is considered to be the sin of apostasy and could be subject to the penalty of death in the country. Countries with the greatest proportion of people without religion (including agnostics and atheists) from Irreligion by country (as of 2007): Remarks: Ranked by mean estimate, which is in brackets. Irreligious includes agnostic, atheist, secular believer, and people having no formal religious adherence. It does not necessarily mean that members of this group do n't belong to any religion. Some religions have harmonized with local cultures and can be seen as a cultural background rather than a formal religion. Additionally, the practice of officially associating a family or household with a religious institute while not formally practicing the affiliated religion is common in many countries. Thus, over half of this group is theistic and / or influenced by religious principles, but nonreligious / non-practicing and not true atheists or agnostics. See Spiritual but not religious. 
Countries with the greatest proportion of Hindus from Hinduism by country (as of 2010): Countries with the greatest proportion of Buddhists from Buddhism by country (as of 2010): As a spiritual practice, Taoism has made fewer inroads in the West than Buddhism and Hinduism. Despite the popularity of its great classics, the I Ching and the Tao Te Ching, the specific practices of Taoism have not been promulgated in America with much success; these religions are not ubiquitous worldwide in the way that the bigger world religions are, and they remain primarily ethnic religions. Nonetheless, Taoist ideas and symbols such as Taijitu have become popular throughout the world through Tai Chi Chuan, Qigong, and various martial arts. The Chinese traditional religion has 184,000 believers in Latin America, 250,000 believers in Europe, and 839,000 believers in North America as of 1998. All of the below come from the U.S. Department of State 2009 International Religious Freedom Report, based on the highest estimate of people identified as indigenous or followers of indigenous religions that have been well - defined. Due to the syncretic nature of these religions, the following numbers may not reflect the actual number of practitioners. Countries with the greatest proportion of Sikhs: The Sikh homeland is the Punjab state, in India, where today Sikhs make up approximately 61 % of the population. This is the only place where Sikhs are in the majority. Sikhs have emigrated to countries all over the world -- especially to English - speaking and East Asian nations. In doing so they have retained, to an unusually high degree, their distinctive cultural and religious identity. Sikhs are not ubiquitous worldwide in the way that adherents of larger world religions are, and Sikhism remains primarily an ethnic religion. But they can be found in many international cities and have become an especially strong religious presence in the United Kingdom and Canada. 
Note that all these estimates come from a single source. However, this source gives a relative indication of the size of the Spiritist communities within each country. The remaining tables list the countries with the greatest proportion of Jews (as of 2010) and of Bahá'ís (as of 2010, among countries with a national population ≥ 200,000), followed by the largest Christian (as of 2011), Hindu (as of 2010), Muslim (as of 2017), Buddhist, Sikh, Jewish (as of 2011) and Bahá'í (as of 2010, in countries with a national population ≥ 200,000) populations, and a breakdown of religions as of 2005.
when is the original cast leaving dear evan hansen
Dear Evan Hansen - wikipedia Dear Evan Hansen is a musical with music and lyrics by Pasek and Paul, and a book by Steven Levenson. The musical opened on Broadway at the Music Box Theatre in December 2016, after its world premiere at the Arena Stage in Washington, DC, in July 2015 and an Off - Broadway production from March to May 2016. The title character, Evan Hansen, is a high school senior with a social anxiety disorder who finds himself amid the turmoil that follows a classmate 's death. The musical has received critical acclaim, particularly for Ben Platt 's leading performance, the lyrics, and the book, and has served as a touchstone for discussion about premature storytelling and themes explored in musical theatre, particularly mental illness. At the 71st Tony Awards, it was nominated for nine awards, winning six, including Best Musical, Best Score and Best Actor in a Musical for Platt. The musical has its origins in an incident that took place during Pasek 's high school years. The musical "takes the notion of a teenager,... Evan Hansen, who invents an important role for himself in a tragedy that he did not earn. '' Evan Hansen, a teenager who struggles with severe social anxiety, writes a hopeful letter to himself as an assignment from his therapist before the first day of his senior year. His mother Heidi, a busy nurse 's aide who attends paralegal school at night, attempts to connect with Evan, but struggles to find common ground with him. She tells him to make new friends by asking people to sign the cast on his arm, which he had broken by falling out of a large tree over the summer. Across town, the wealthy Murphy family -- Cynthia, Larry, and their children Zoe and Connor -- sit down to breakfast. Zoe and Larry berate Connor for getting high before school, while Cynthia struggles with the fact that her family is falling apart. The two mothers wonder simultaneously how to connect with their sons ("Anybody Have A Map? ''). 
At school, Evan runs into Alana, an ambitious student obsessed with getting into a good college, and Jared, the son of a family friend and the closest thing Evan has to an actual friend. Both Alana and Jared notice his broken arm, but neither one takes Evan up on his offer to sign his cast. Evan has a physical altercation with Connor, prompting Zoe, Evan 's longtime crush, to apologize on her brother 's behalf. Evan wonders if this is his destiny -- to be ignored and an outcast for the rest of his life ("Waving Through A Window ''). Evan writes himself another letter, this time about how he 's given up on it being a good year and how he wonders if anyone would notice if he was n't there. He remarks that all his hope is now pinned on Zoe, even though he does n't know her. While printing out the letter in the school 's computer lab, he once again runs into Connor, who is more subdued than he was that morning. He offers to sign Evan 's cast, musing that maybe now they can both pretend they have friends. After signing the cast, he reads Evan 's letter and becomes furious at the mention of Zoe, thinking Evan intended for him to see the letter in order to hurt him. He storms out, taking the letter with him. Several days pass with no sign of Connor or Evan 's letter, leaving Evan in an intense state of anxiety over what Connor might have done with it. That day, he is called to the principal 's office, where Connor 's parents are waiting to meet him. They tell Evan that Connor had committed suicide a few days before, with Evan 's letter in his pocket. Believing it to be Connor 's suicide note, addressed to Evan, they ask Evan if he and Connor were close, as Connor had never mentioned having friends before. Evan attempts to explain but becomes overwhelmed, panicking. Not wanting to further their grief and trying to find a way out of the situation, Evan agrees to go to their house to talk about Connor. 
He confides all this in Jared, who advises him to just "nod and confirm ''. Instead, Evan begins to fabricate an intricate story of his and Connor 's friendship after seeing how distraught Cynthia is over the loss of her son. Evan claims that he and Connor kept up a secret email correspondence, and tells them a fictional version of the day he broke his arm in which Connor was with him the entire day ("For Forever ''). Later, Heidi reminds Evan that he needs to begin applying for college scholarships, but Evan is too distracted by his fabricated friendship with Connor to take her very seriously. She mentions hearing about Connor 's death, but Evan tells her not to worry, that he did n't know Connor. Heidi confronts Evan after seeing Connor 's name signed on his cast, but he lies by saying that someone else named Connor signed it. After realizing he needs evidence of his supposed "secret email account '', Evan enlists Jared 's help in creating fake, backdated email conversations ("Sincerely, Me ''). After Evan shows the Murphys Connor 's "emails '', Cynthia is ecstatic that her son had a friend and asks to see more of the emails, but Larry is more hurt that Connor took his family and his privileged life for granted. Zoe, who was never close to Connor, refuses to mourn him; she truly does not miss her brother because of his abusive behavior towards her ("Requiem ''). However, after Evan shows her the "suicide note '', Zoe notices that she is mentioned and asks Evan if Connor ever spoke about her. Evan, unable to tell her the truth, tells her all the reasons he loves her, but pretends that Connor said them ("If I Could Tell Her ''). Overcome with emotion, he impulsively kisses Zoe, but she throws him out. Evan notices that people are starting to forget about Connor, and wants to prevent what happened to him from happening to anyone else. 
Spurred on by his perceived spirit of Connor, Evan enlists Alana and Jared 's help in founding "The Connor Project '', an organization dedicated to keeping Connor 's memory alive and helping those like him. The three pitch the idea to the Murphys, who agree to support the project ("Disappear ''). At the official launch of the Connor Project, Evan gives an inspiring speech about his loneliness and friendship with Connor, which goes viral after it is posted online. Zoe, overcome by the impact her brother and Evan have had on people, kisses him ("You Will Be Found ''). Evan and Alana pitch a fund - raising idea on The Connor Project 's website. In memory of Connor, they want to raise $50,000 in three weeks to reopen the abandoned apple orchard that Evan and Connor supposedly spent time in. Evan, spurred on by his new relationship with Zoe, and his newfound family in the Murphys, begins to neglect his mother, Jared, and the Connor Project. Heidi tells Evan that she saw the video of his speech on Facebook, and asks him why he did n't tell her about the Connor Project or about his and Connor 's friendship. He angrily responds that he did n't have the time because she 's never around. He then rushes off to the Murphys ', telling her that he 's going to Jared 's. At the Murphys ', Evan bonds with Larry, and confides in him that his father left when he was young, has remarried, and no longer keeps in touch with him or Heidi ("To Break in a Glove ''). Sometime later, at Evan 's house, Evan makes an offhand comment to Zoe about how he and his mother do n't have much money and he needs the scholarships to pay for college. When he begins to mention Connor, Zoe tells him that she does n't want their relationship to be about Connor, but about the two of them ("Only Us ''). Evan gets in a fight with Jared who claims that Connor 's death was the best thing that ever happened to Evan -- he is no longer invisible, and has landed the girl of his dreams. 
Later on, Evan goes to the Murphys' only to discover that Zoe had invited Heidi for dinner. Heidi, who had no idea that Evan had spent time at the Murphys', is mortified when Larry and Cynthia offer to use the money that they had set aside for Connor's college fund to send Evan to school instead. After returning home, Heidi and Evan fight over his secrecy. When Heidi protests that the Murphys are not Evan's family, Evan confesses that he feels not only welcomed but accepted by the Murphys, in light of Heidi's absence and her expectations regarding his mental health. Heidi tearfully berates him for running off to his shiny new family. Meanwhile, Alana begins to find inconsistencies in the fake emails Evan "received" from Connor and suspects that the entire story has been a lie. Beginning to panic, Evan urges Jared to help him clear up the inaccuracies, but Jared refuses, citing Evan's own absence. Evan counters that Jared himself had acted as a friend to him when he needed one. Jared threatens to expose Evan, and Evan warns him that he could in turn reveal Jared's role. All three voices converge in Evan's conscience, compounding his guilt and doubt over his decisions ("Good For You"). Evan decides he has to come clean about what he's done. Connor's "spirit" attempts to talk him out of it, citing the happiness he has given the Murphys and the fate of Evan's relationship with Zoe, but Evan does not back down, angrily shouting that he needs the whole thing to be over. Connor is unconvinced and asks Evan how he broke his arm: did he fall by accident, or actually let go? Evan denies intent, but Connor tells him that if he tells the truth, all he has will be gone, and the only thing he will be left with is himself. He disappears, leaving Evan alone. Evan is distraught and goes to see the Murphys, who have become the targets of hateful comments from people who believe they were responsible for Connor's death.
He walks in on the three of them fighting about why Connor really killed himself and finally admits that he fabricated the entire thing, hopeful that he could forge a genuine bond with the Murphys out of the tragedy. As Zoe and her mother tearfully run out, Larry turns away from Evan in disgust. Alone once more, Evan absorbs his perceived brokenness as inescapable. ("Words Fail ''). Evan finds Heidi waiting for him at home, who saw the letter online, immediately knowing that it was one of Evan 's therapy assignments. She apologizes to Evan for not seeing how badly he had been hurting. Evan says that it was n't her fault, he lied to her about so much. He then vaguely admits that his fall had been a suicide attempt. Heidi sits him down and recalls the day that his father moved out: how she felt so small and alone, and did n't know how she was going to make it by herself. In the end, however, she realized that she was n't alone -- she had Evan and knew that the two of them could make it through anything as long as they were together. Tearfully, Heidi promises that she 'll always be there for him when he needs her ("So Big / So Small ''). A year later, Evan is still living at home and working at Pottery Barn to save enough money to go to college the next semester. He contacts Zoe, whom he has not seen since she found out the truth, asking if she will agree to meet him. She does but insists that they meet at the orchard that has been reopened in Connor 's memory. He apologizes for the pain he caused her family and admits that he has been reading Connor 's ten favorite books after finding a list in an old yearbook in an attempt to connect with who he really was. He also thanks her and her parents for keeping his secret and revealing that they never told anyone else that his friendship with Connor was a lie. She forgives him saying the whole ordeal brought her family closer together over the past year because "everyone needed it for something. 
'' Evan asks her why she insisted on meeting at the orchard, and she replies that she wanted to be sure he saw it, and the two share a gentle moment before they awkwardly part. Evan mentally writes himself one last letter reflecting on the impact he 's had on his community and questions what is to come next ("Finale ''). Evan Hansen -- A high school senior who struggles with social anxiety disorder. He is assigned by his therapist to write letters to himself about why each day will be good, which becomes the catalyst for the plot of the story. He also has never had any friends, and has had a crush on Zoe Murphy for a very long time. After Connor 's death, he begins to tell lies of him being friends with Connor to the Murphy family because they found Evan 's letter to himself folded up in Connor 's pocket; they thought Connor wrote it to Evan. Heidi Hansen -- Evan 's mother, a nurse 's aide who attends law school at night, often leaving Evan on his own as a result. She tries to connect with Evan, but struggles because she does n't personally understand what he goes through on a daily basis. Zoe Murphy -- Connor 's younger sister and Evan 's longtime crush. She was never close to Connor, hated him even, but wishes she had known him better and turns to Evan after he lies and says he was friends with Connor. She does n't get mad at Evan, but rather accepts him, after the truth is revealed. Connor Murphy -- A high school senior who is also a social outcast with no friends, just like Evan. He is a frequent drug user, and becomes verbally abusive to his family when he is high. However, he is protective of Zoe, even though he is not close with her. Cynthia Murphy -- Connor and Zoe 's stay - at - home mother. She is constantly trying to keep her fragile family from falling apart, but is often unsuccessful. She clings to the memory of Connor even though she was never close with him, and her relationship with Larry and Zoe suffers because of it. 
Larry Murphy -- Connor and Zoe's busy father. He works hard to give his family a relatively easy life, but he is emotionally distant from all three of them. He becomes close with Evan, who never had a strong father figure, and begins to see Evan as the son Connor never was. Alana Beck -- Evan's precocious and sometimes insufferable classmate. She is constantly looking for academic and extracurricular activities to boost her college résumé. She never knew Connor, but is greatly affected by his death and quickly joins Evan in founding the Connor Project in order to keep Connor's memory alive. Jared Kleinman -- Another of Evan's classmates; he is the closest thing Evan has ever had to a friend. The son of a family friend of the Hansens, he initially only talks to Evan so that his parents will pay for his car insurance. Evan enlists his help in crafting fake emails from Connor, and slowly becomes a true friend to him. An original Broadway cast album was released at midnight on February 3, 2017. The second song on the album, "Waving Through a Window", was released as a special early download for those who pre-ordered the album. The fifth song, "Requiem", was made available to stream for 24 hours on January 26, 2017, a week before the release of the cast recording. The song was released as a second pre-order bonus the next day. The recording of the Act 1 finale "You Will Be Found" was available for a first listen online on January 30, 2017. The cast album debuted at number 8 on the February 25 Billboard 200. The cast album became available in compact disc format on February 24, 2017. Dear Evan Hansen premiered at the Arena Stage in Washington, D.C., running from July 10 to August 23, 2015. Directed by Michael Greif, with orchestrations by Alex Lacamoire, the set was designed by David Korins and the projection design was by Peter Nigrini. The cast featured Ben Platt in the title role.
The musical opened Off - Broadway at the Second Stage Theatre on March 26, 2016, in previews, with the official opening on May 1. The cast featured Ben Platt, Laura Dreyfuss, Mike Faist, Rachel Bay Jones, Will Roland and Jennifer Laura Thompson repeating their roles from the Arena Stage production. New cast members were John Dossett and Kristolyn Lloyd. Michael Greif again directed, with choreography by Danny Mefford. The Off - Broadway engagement closed on May 29, 2016. The show premiered on Broadway on November 14, 2016, in previews, and officially opened on December 4. After originally announcing that performances would take place at the Belasco Theatre, in mid-September 2016, producers announced that the show would instead be performed at the Music Box Theatre. Michael Park, who originated the role of Larry in the Arena Stage production, returned for the Broadway production (replacing John Dossett who went on to the musical War Paint). All other cast members from the Second Stage production returned for the Broadway engagement. A U.S. Tour was announced, starting in Denver in October 2018. Derek Mong, in his review of the musical at the Arena Stage, wrote that the "inventive set design by David Korins... that transforms a small stage into a platform for the most intimate living room where a mother and son share a heart - to - heart to the physical abyss of internet cyberspace... book by Steven Levenson... lyrics and music by Benj Pasek and Justin Paul... heartfelt lyrics with universal appeal joined by the perfect, oftentimes acoustic, accompaniment that can change the mood from somber to celebratory to sinister in a single bar of music. '' Barbara Mackay in reviewing the Arena Stage production for TheatreMania wrote: "Levenson, Pasek, and Paul set themselves two high, untraditional bars in Evan Hansen: exploring a community 's grief and examining a lonely protagonist who desperately wants to connect with that community... Ben Platt is outstanding as Evan... 
Since the success of the musical depends entirely on whether Evan 's solitary nature appears funny or weird, Evan 's ability to laugh at himself and make the audience laugh is crucial. Platt is charming as he eternally twists his shirt tails and hangs his head... Although the themes of grief and loneliness are serious, the musical is anything but somber. It addresses challenging facts of life. But from start to finish, when Evan leaves his room and finds an authentic life outside it, Dear Evan Hansen contains far more joy than sadness. '' Susan Davidson, in her review of the Arena Stage production for CurtainUp, noted: "... it helps to suspend the disbelief that sullen, anti-social teenagers can change quickly. Surely that 's a process requiring time - released hormonal adjustments. It is hard to accept that a long - admired - from - afar girl can change Evan 's outlook on life so rapidly or that Connor 's teenage disequilibrium leads him to do what he does. Coming through loud and clear, however, is the fact that what starts as deceit can be blown totally out of proportion by the Internet where lies are disseminated with lightning speed leaving plenty of victims in their wake... The music is pleasant, not terribly original but good enough to get toes tapping. Benj Pasek and Justin Paul 's ballads stand out, particularly Heidi 's "So Big / So Small, '' Evan 's "Words Fail '' and Zoe and Evan 's young sweethearts duet "Only Us. '' '' Charles Isherwood, in his review of the Second Stage production for The New York Times, noted: "The songs, by Benj Pasek and Justin Paul (Dogfight, A Christmas Story), strike the same complex notes, with shapely, heartfelt lyrics that expose the tensions and conflicts that Connor 's death and Evan 's involvement cause in both families. The music, played by a small but excellent band on a platform upstage, is appealingly unstrident pop - rock, with generous doses of acoustic guitar, keyboards and strings. 
It 's the finest, most emotionally resonant score yet from this promising young songwriting team. '' Dear Evan Hansen is a recipient of the Edgerton Foundation New Play Award.
who said ishq aur jung me sab jayaz hai ("everything is fair in love and war")
Mohabbat Aur Jung - Wikipedia Mohabbat Aur Jung was a 1998 Hindi action drama film directed by Hameed Alam. The film was made on a budget of ₹ 1.5 crore (US $230,000) and turned out to be a flop. The central character is a boy who fights against the local gangsters selling drugs in his college. One day the principal is killed and the blame goes on him. The police inspector (Deepak Tijori) turns out to be the principal 's son who has vowed to avenge the life of his mother.
what happens when the result of a calculation exceeds the capacity of data type
Integer overflow - wikipedia In computer programming, an integer overflow occurs when an arithmetic operation attempts to create a numeric value that is outside of the range that can be represented with a given number of bits -- either larger than the maximum or lower than the minimum representable value. The most common result of an overflow is that the least significant representable bits of the result are stored; the result is said to wrap around the maximum (i.e. modulo a power of two). An overflow condition may give results leading to unintended behavior. In particular, if the possibility has not been anticipated, overflow can compromise a program's reliability and security. For some applications, such as timers and clocks, wrapping on overflow can be desirable. The C11 standard states that for unsigned integers modulo wrapping is the defined behavior and the term overflow never applies: "a computation involving unsigned operands can never overflow." On some processors like graphics processing units (GPUs) and digital signal processors (DSPs) which support saturation arithmetic, overflowed results would be "clamped", i.e. set to the minimum or the maximum value in the representable range, rather than wrapped around. The register width of a processor determines the range of values that can be represented. Typical binary register widths for unsigned integers include 8, 16, 32, and 64 bits. When an arithmetic operation produces a result larger than the maximum above for an N-bit integer, an overflow reduces the result modulo the N-th power of 2, retaining only the least significant bits of the result and effectively causing a wrap around.
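The modulo-2^N wrap-around described above can be sketched in a few lines of Python (a minimal illustration; the helper name is ours, not from any standard library):

```python
# Simulate N-bit unsigned wrap-around: an overflowing result keeps only
# its least significant N bits, i.e. it is reduced modulo 2**N.
def wrap_unsigned(value, bits):
    return value % (2 ** bits)

# 8-bit examples: the representable range is 0..255.
assert wrap_unsigned(255 + 1, 8) == 0    # one past the maximum wraps to 0
assert wrap_unsigned(300, 8) == 44       # 300 mod 256
assert wrap_unsigned(2 ** 16, 16) == 0   # same behaviour at 16 bits
```

Python integers never overflow on their own, which is exactly what makes them convenient for modeling the fixed-width behaviour of hardware registers.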
In particular, multiplying or adding two integers may result in a value that is unexpectedly small, and subtracting from a small integer may cause a wrap to a large positive value (for example, 8-bit integer addition 255 + 2 results in 1, which is 257 mod 2⁸, and similarly subtraction 0 − 1 results in 255, a two's complement representation of −1). Such wrap around may cause security problems -- if an overflowed value is used as the number of bytes to allocate for a buffer, the buffer will be allocated unexpectedly small, leading to a potential buffer overflow and arbitrary code execution. If the variable has a signed integer type, a program may make the assumption that a variable always contains a positive value. An integer overflow can cause the value to wrap and become negative, which violates the program's assumption and may lead to unexpected behavior (for example, 8-bit integer addition of 127 + 1 results in −128, the two's complement representation of 128). Most computers have two dedicated processor flags to check for overflow conditions. The carry flag is set when the result of an addition or subtraction, considering the operands and result as unsigned numbers, does not fit in the given number of bits. This indicates an overflow with a carry or borrow from the most significant bit. An immediately following add-with-carry or subtract-with-borrow operation would use the contents of this flag to modify a register or a memory location that contains the higher part of a multi-word value. The overflow flag is set when the result of an operation on signed numbers does not have the sign that one would predict from the signs of the operands, e.g. a negative result when adding two positive numbers. This indicates that an overflow has occurred and the signed result represented in two's complement form would not fit in the given number of bits.
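The 8-bit examples above can be reproduced with a small helper that reinterprets a wrapped bit pattern as a signed two's-complement value (an illustrative sketch; the function name is ours):

```python
# Reduce to the low N bits, then reinterpret the pattern as a signed
# two's-complement number: patterns with the high bit set are negative.
def to_signed(value, bits):
    value %= 2 ** bits
    if value >= 2 ** (bits - 1):
        value -= 2 ** bits
    return value

assert to_signed(255 + 2, 8) == 1      # unsigned wrap: 257 mod 256 = 1
assert to_signed(0 - 1, 8) == -1       # bit pattern 255 reads as -1
assert to_signed(127 + 1, 8) == -128   # signed overflow from the text
```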
For an unsigned type, when the ideal result of an operation is outside the type's representable range and the returned result is obtained by wrapping, then this event is commonly defined as an overflow. In contrast, the C11 standard defines that this event is not an overflow and states "a computation involving unsigned operands can never overflow." When the ideal result of an integer operation is outside the type's representable range and the returned result is obtained by clamping, then this event is commonly defined as a saturation. Usage varies as to whether a saturation is or is not an overflow. To eliminate ambiguity, the terms wrapping overflow and saturating overflow can be used. The term underflow is most commonly used for floating-point math and not for integer math, but many references to integer underflow can be found. When the term integer underflow is used, it means the ideal result was closer to minus infinity than the output type's representable value closest to minus infinity. When the term integer underflow is used, the definition of overflow may include all types of overflows, or it may only include cases where the ideal result was closer to positive infinity than the output type's representable value closest to positive infinity. When the ideal result of an operation is not an exact integer, the meaning of overflow can be ambiguous in edge cases. Consider the case where the ideal result has value 127.25 and the output type's maximum representable value is 127. If overflow is defined as the ideal value being outside the representable range of the output type, then this case would be classified as an overflow. For operations that have well-defined rounding behavior, overflow classification may need to be postponed until after rounding is applied. The C11 standard defines that conversions from floating point to integer must round toward zero.
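The contrast between wrapping and saturating ("clamping") drawn above can be shown side by side (a sketch, not tied to any particular DSP or GPU instruction set; both helper names are ours):

```python
# Wrapping keeps the low bits of the result; saturation instead pins
# out-of-range results to the nearest representable value.
def wrap(value, lo, hi):
    span = hi - lo + 1
    return (value - lo) % span + lo

def saturate(value, lo, hi):
    return max(lo, min(hi, value))

# 8-bit unsigned range 0..255: the same out-of-range input diverges.
assert wrap(300, 0, 255) == 44        # wrapping overflow
assert saturate(300, 0, 255) == 255   # saturating overflow, clamped high
assert saturate(-5, 0, 255) == 0      # clamped low
```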
If C is used to convert the floating-point value 127.25 to integer, then rounding should be applied first to give an ideal integer output of 127. Since the rounded integer is in the output's range, the C standard would not classify this conversion as an overflow. There are several methods of handling overflow, and programming languages implement various mitigation methods against an accidental overflow: Ada and Seed7 (and certain variants of functional languages) trigger an exception condition on overflow, while Python (since 2.4) seamlessly converts the internal representation of the number to match its growth, eventually representing it as long -- limited only by the available memory. A run-time overflow detection implementation, AddressSanitizer, is also available for C compilers. In languages with native support for arbitrary-precision arithmetic and type safety (such as Python or Common Lisp), numbers are promoted to a larger size automatically when overflows occur, or exceptions are thrown (conditions signaled) when a range constraint exists. Using such languages may thus be helpful to mitigate this issue. However, in some such languages, situations are still possible where an integer overflow can occur. An example is explicit optimization of a code path which is considered a bottleneck by the profiler. In the case of Common Lisp, this is possible by using an explicit declaration to type-annotate a variable to a machine-size word (fixnum) and lower the type safety level to zero for a particular code block. In Java 8, there are overloaded methods, such as Math.addExact(int, int), which throw an ArithmeticException in case of overflow. The computer emergency response team (CERT) developed the As-if Infinitely Ranged (AIR) integer model, a largely automated mechanism to eliminate integer overflow and truncation in C/C++ using run-time error handling.
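In the same spirit as Java 8's Math.addExact, a checked addition can be sketched in Python, where plain integers never wrap, so the mathematically exact result is always available for the range test (the function name and default width are ours, for illustration):

```python
# Raise instead of silently wrapping when the exact sum leaves the
# N-bit signed two's-complement range.
def add_exact(a, b, bits=32):
    lo, hi = -(2 ** (bits - 1)), 2 ** (bits - 1) - 1
    result = a + b   # exact: Python integers are arbitrary-precision
    if not lo <= result <= hi:
        raise OverflowError(f"{a} + {b} overflows the {bits}-bit signed range")
    return result

assert add_exact(2 ** 31 - 2, 1) == 2 ** 31 - 1   # still in range
```

This mirrors the AIR-model idea of behaving "as if" the type were infinitely ranged and trapping only when a stored result would be wrong.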
In computer graphics or signal processing, it is typical to work on data that ranges from 0 to 1 or from −1 to 1. An example of this is a grayscale image where 0 represents black, 1 represents white, and values in between represent varying shades of gray. One operation that one may want to support is brightening the image by multiplying every pixel by a constant. Saturated arithmetic allows one to blindly multiply every pixel by that constant without worrying about overflow, by sticking to a reasonable outcome: all pixels larger than 1 (i.e. "brighter than white") just become white, and all values "darker than black" just become black. Unanticipated arithmetic overflow is a fairly common cause of program errors. Such overflow bugs may be hard to discover and diagnose because they may manifest themselves only for very large input data sets, which are less likely to be used in validation tests. Taking the arithmetic mean of two numbers by adding them and dividing by two, as done in many search algorithms, causes error if the sum (although not the resulting mean) is too large to be represented, and hence overflows. An unhandled arithmetic overflow in the engine steering software was the primary cause of the crash of the 1996 maiden flight of the Ariane 5 rocket. The software had been considered bug-free since it had been used in many previous flights, but those used smaller rockets which generated lower acceleration than Ariane 5. On 30 April 2015, the Federal Aviation Authority announced it would order Boeing 787 operators to reset the aircraft's electrical system periodically, to avoid an integer overflow which could lead to loss of electrical power and ram air turbine deployment; Boeing deployed a software update in the fourth quarter. The European Aviation Safety Agency followed on 4 May 2015. The error happens after 2³¹ centiseconds (248.55134814815 days), indicating a 32-bit signed integer. Overflow bugs are evident in computer games.
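The Boeing 787 figure quoted above can be checked with one line of arithmetic:

```python
# 2**31 centiseconds converted to days: 100 centiseconds per second and
# 86,400 seconds per day. The ~248.55-day reset interval is the
# signature of a 32-bit signed counter running out of range.
days = 2 ** 31 / (100 * 86_400)
assert abs(days - 248.55134814815) < 1e-9
```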
In the arcade game Donkey Kong, it is impossible to advance past level 22 due to an integer overflow in its time/bonus calculation. The game takes the level number a user is on, multiplies it by 10 and adds 40. When they reach level 22, the time/bonus number is 260, which is too large for its 8-bit register (maximum value 255), so it wraps around, leaving only 4 as the time/bonus -- too short to finish the level. In Donkey Kong Jr. Math, when trying to calculate a number over 10000, it shows only the first 4 digits. Overflow is the cause of the famous "split screen" in Pac-Man and of "Nuclear Gandhi" in the Civilization series. It also caused the "Far Lands" in Minecraft, which existed from the Infdev development period to Beta 1.7.3; the bug was later fixed in Beta 1.8 but still exists in the Pocket Edition and Windows 10 Edition versions of Minecraft. Microsoft/IBM MACRO Assembler (MASM) Version 1.00, and likely all other programs built by the same Pascal compiler, had an integer overflow and signedness error in the stack setup code, which prevented them from running on newer DOS machines or emulators under some common configurations with more than 512 KB of memory. The program either hangs or displays an error message and exits to DOS. In August 2016, a casino machine at Resorts World Casino printed a prize ticket of $42,949,672.76 as a result of an overflow bug. The casino refused to pay this amount, calling it a malfunction, arguing in its defense that the machine clearly stated that the maximum payout was $10,000, so any prize higher than that had to be the result of a programming bug. The Iowa Supreme Court ruled in favor of the casino.
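The Donkey Kong kill screen described above is easy to reproduce from the formula in the text:

```python
# Bonus timer: level * 10 + 40, stored in an 8-bit register, so the
# value silently wraps modulo 256.
def bonus_timer(level):
    return (level * 10 + 40) % 256

assert bonus_timer(21) == 250   # still fits in 8 bits
assert bonus_timer(22) == 4     # 260 wraps to 4: far too little time
```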
what is the average monthly income in russia
List of European Countries by average wage - wikipedia This is a map and list of countries containing monthly (annual divided by 12 months) gross and net income (after taxes) average wages in Europe, in their local currency and in euros. The chart below reflects the average (mean) wage as reported by various data providers. The salary distribution is right-skewed, therefore more than 50% of people earn less than the average gross salary. These figures shrink after income tax is applied. In certain countries, actual incomes may exceed those listed in the table due to the existence of grey economies. In some countries, social security and contributions for pensions, public schools, and health are included in these taxes. On the net-salary map, the countries and territories are coloured by net average monthly salary: purple -- in excess of €3,000; blue -- in the range of €1,500 to €2,999; orange -- €500 to €1,499; red -- below €499. On the gross-salary map, the countries and territories are coloured by gross average monthly salary (taxable income): blue -- in excess of €4,000; pink -- in the range of €2,000 to €3,999; orange -- €1,000 to €1,999; red -- below €999. The following are transcontinental countries that have their main territories in Asia with small territories in Europe: Turkey. The figures are 2014 annual values (in national currency) for a family with two children with one average salary, including tax credits and allowances.
the animal that does not chew the cud is
Kosher animals - wikipedia Kosher animals are animals that comply with the regulations of kashrut and are considered kosher foods. These dietary laws ultimately derive from various passages in the Torah, with various modifications, additions and clarifications added to these rules by Halakha. Various other animal-related rules are contained in the 613 commandments. Leviticus 11:3-8 and Deuteronomy 14:4-8 both give the same general set of rules for identifying which land animals (Hebrew: בהמות Behemoth) are ritually clean. According to these, anything that "chews the cud" and has a completely split hoof is ritually clean, but those animals that only chew the cud or only have cloven hooves are unclean. Both documents explicitly list four animals as being ritually impure: the camel, the hyrax, the hare, and the pig. Camels are actually both even-toed ungulates and ruminants, although their feet aren't hooves at all, instead being two toes with a pad. Similarly, although the Bible portrays them as ruminants, the hyrax, hare, and coney are all coprophages; they do not ruminate and lack a rumen. These obvious discrepancies, and the question of whether there is a way to resolve them, have been investigated by various authors, most recently by Rabbi Natan Slifkin in a book entitled The Camel, the Hare, and the Hyrax. Unlike Leviticus 11:3-8, Deuteronomy 14:4-8 also explicitly names 10 animals considered ritually clean. The Deuteronomic passages mention no further land beasts as being clean or unclean, seemingly suggesting that the status of the remaining land beasts can be extrapolated from the given rules. By contrast, the Levitical rules later go on to add that all quadrupeds with paws should be considered ritually unclean, something not explicitly stated by the Deuteronomic passages; the only quadrupeds with paws are the carnivorans (dogs, wolves, cats, lions, hyenas, bears, etc.), and all carnivorans fall under this description.
The Leviticus passages thus cover all the large land animals that naturally live in Canaan, except for primates and equids (horses, zebras, etc.), which are not mentioned in Leviticus as being either ritually clean or unclean, despite their importance in warfare and society, and their mention elsewhere in Leviticus. In an attempt to help identify animals of ambiguous appearance, the Talmud, in a similar manner to Aristotle's earlier Historia Animalium, argued that animals without upper teeth would always chew the cud and have split hoofs (thus being ritually clean), and that no animal with upper teeth would do so; the Talmud makes an exception for the case of the camel, which, like the other ruminant even-toed ungulates, is apparently "without upper teeth" according to some citations, even though camel skulls clearly have both front and rear upper teeth. The Talmud also argues that the meat from the legs of clean animals can be torn lengthwise as well as across, unlike that of unclean animals, thus helping to identify the status of meat of uncertain origin. Biblical scholars believe that the classification of animals was created to explain pre-existing taboos. Beginning with Saadia Gaon, several Jewish commentators started to explain these taboos rationalistically; Saadia himself expresses an argument similar to that of totemism, that the unclean animals were declared so because they were worshipped by other cultures. Due to comparatively recent discoveries about the cultures adjacent to the Israelites, it has become possible to investigate whether such principles could underlie some of the food laws. Egyptian priests would only eat the meat of even-toed ungulates (swine, camelids, and ruminantians), and rhinoceros.
Like the Egyptian priests, Vedic India (and presumably the Persians also) allowed the meat of rhinoceros and ruminantians, although cattle were excluded from this, since they were seemingly taboo in Vedic India; in a particular parallel with the Israelite list, Vedic India explicitly forbade the consumption of camelids and domestic pigs (but not boar). However, unlike the biblical rules, Vedic India did allow the consumption of hare and porcupine, but Harran did not, and was even more similar to the Israelite regulations, allowing all ruminants, but not other land beasts, and expressly forbidding the meat of camels. In addition to meeting the restrictions as defined by the Torah, there is also the issue of masorah (tradition). In general, animals are eaten only if there is a masorah that has been passed down from generations ago that clearly indicates that these animals are acceptable. For instance, there was considerable debate as to the kosher status of zebu and bison among the rabbinical authorities when they first became known and available for consumption; the Orthodox Union permits bison, as can be attested to by the menus of some of the more upscale kosher restaurants in New York City. Leviticus 11: 9 - 12 and Deuteronomy 14: 9 - 10 both state that anything residing in "the waters '' (which Leviticus specifies as being the seas and rivers) is ritually clean if it has both fins and scales, in contrast to anything residing in the waters with neither fins nor scales. The latter class of animals is described as ritually impure by Deuteronomy, Leviticus describes them as an "abomination '' KJV Leviticus 11: 10. Abomination is also sometimes used to translate piggul and toebah. Although these biblical rules do not specify the status of animals in the waters with fins but no scales, or scales but no fins, it has traditionally been assumed that these animals are also excluded from the ranks of the ritually clean. 
These rules restrict the permissible seafood to stereotypical fish, prohibiting unusual forms such as the eel, lamprey, hagfish, and lancelet. In addition, these rules exclude non-fish marine creatures, such as crustaceans (lobster, crab, prawn, shrimp, barnacle, etc.), molluscs (squid, octopus, oyster, periwinkle, etc.), sea cucumbers, and jellyfish. Other creatures living in the sea and rivers that would be prohibited by the rules, but are not normally considered seafood, include the cetaceans (dolphin, whale, etc.), crocodilians (alligator, crocodile, etc.), sea turtles, sea snakes, and all amphibians. Sharks are sometimes regarded as being among the ritually unclean foods according to these regulations, as they appear to have a smooth skin. However, sharks do have scales: they are placoid scales, which are denser and appear smooth if rubbed in one direction, in contrast to leptoid, ganoid, and cosmoid scales. The sturgeon and related fish are also sometimes included among the ritually impure foods, as their surfaces are covered in scutes, which are bony armoured nodules; however, fish scutes are actually just hardened and enlarged scales. Scales has thus traditionally been interpreted along the lines of Nahmanides's proposal that qasqeseth (scales) must refer specifically to scales that can be detached, by hand or with a knife, without ripping the skin. In practice this excludes all but cycloid and ctenoid scales. A minor controversy arises from the fact that the appearance of the scales of sturgeon, swordfish, and catfish is heavily affected by the ageing process: their young satisfy Nahmanides's rule, but as adults they do not. Traditionally fins has been interpreted as referring to translucent fins. The Mishnah claims that all fish with scales will also have fins, but that the reverse is not always true.
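The fins-and-scales rule, as traditionally narrowed by Nahmanides's detachability criterion described above, amounts to a two-part test. The sketch below encodes that reading; the mapping of scale types to "detachable" is the traditional interpretation reported in the text, and the example fish are illustrative, not rulings.

```python
# Sketch of the traditional fins-and-scales test: a fish is clean only
# if it has fins AND scales of a type that can be detached without
# tearing the skin (per Nahmanides, in practice cycloid or ctenoid).

DETACHABLE_SCALE_TYPES = {"cycloid", "ctenoid"}
# Placoid (shark), ganoid (sturgeon), and cosmoid scales, and scaleless
# skin (None), do not qualify under this traditional reading.

def is_clean_fish(has_fins: bool, scale_type) -> bool:
    return has_fins and scale_type in DETACHABLE_SCALE_TYPES

print(is_clean_fish(True, "cycloid"))   # e.g. salmon -> True
print(is_clean_fish(True, "placoid"))   # shark -> False
print(is_clean_fish(True, "ganoid"))    # sturgeon -> False
print(is_clean_fish(True, None))        # scaleless (eel-like) -> False
```

Note that, as the article goes on to say, the sturgeon case remains contested: some Conservative authorities accept it while most Orthodox authorities, following the detachability reading modeled here, do not.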
For the latter case, the Talmud argues that ritually clean fish have a distinct spinal column and a flattish face, while ritually unclean fish lack spinal columns and have pointy heads, which would define the shark and sturgeon (and related fish) as ritually unclean. Nevertheless, Aaron Chorin, a prominent 19th-century rabbi and reformer, declared that the sturgeon was actually ritually pure, and hence permissible to eat. Many Conservative rabbis now view these particular fish as being kosher, but most Orthodox rabbis do not. The question for sturgeon is particularly significant, as most caviar consists of sturgeon eggs and therefore cannot be kosher if the sturgeon itself is not. Sturgeon-derived caviar is not eaten by some kosher-observant Jews because sturgeon possess ganoid scales instead of the usual ctenoid and cycloid scales. There is a kosher caviar substitute made from seaweed, and salmon roe is also kosher. Nahmanides believed that the restrictions against certain fish also addressed health concerns, arguing that fish with fins and scales (and hence ritually clean) typically live in shallower waters than those without fins or scales (i.e., those that were ritually impure), and consequently the latter were much colder and more humid, qualities he believed made their flesh toxic. The academic perception is that natural repugnance toward "weird-looking" fish is a significant factor in the origin of the restrictions. Vedic India (and presumably the Persians also) exhibited such repugnance, generally allowing fish but forbidding "weird-looking" fish and exclusively carnivorous fish; in Egypt, another significant and influential culture near to the Israelites, the priests avoided all fish completely. In regard to birds, no general rule is given; instead, Leviticus 11:13-19 and Deuteronomy 14:11-18 explicitly list the prohibited birds. In the Shulchan Aruch, three signs are given for kosher birds: a crop, an extra toe, and a gizzard that can be peeled.
Also, it must not be a bird of prey. The Masoretic Text lists the birds as: The list in Deuteronomy has an additional bird, the dayyah, which seems to be a combination of da'ah and ayyah, and may be a scribal error; the Talmud regards it as a duplication of ayyah. This and the other terms are vague and difficult to translate, but there are a few further descriptions of some of these birds elsewhere in the Bible: The Septuagint versions of the lists are more helpful, as in almost all cases the bird is clearly identifiable: Although the first ten of the birds identified by the Septuagint seem to fit the descriptions of the Masoretic Text, the ossifrage (Latin for "bone breaker") being a good example, the correspondence is less clear for most of the remaining birds; it is also obvious that the list in Leviticus, or the list in Deuteronomy, or both, are in a different order in the Septuagint compared to the Masoretic Text. Attempting to determine the correspondence is problematic; for example, the pelican may correspond to qa'at ("vomiting"), in reference to the pelican's characteristic behaviour, but it may also correspond to kos ("cup"), as a reference to the pelican's jaw pouch. An additional complexity arises from the fact that the porphyrion has not yet been identified, and classical Greek literature merely identifies a number of species that are not the porphyrion, including the peacock, grouse, and robin, and implies that the porphyrion is the cousin of the kingfisher. From these meager clarifications, the porphyrion can only be identified as anything from the lilac-breasted roller, Indian roller, or northern carmine bee-eater, to the flamingo. A likely candidate is the purple swamphen.
During the Middle Ages, classical descriptions of the hoopoe were mistaken for descriptions of the lapwing, on account of the lapwing's prominent crest and the hoopoe's rarity in England, resulting in lapwing being listed in certain Bible translations instead of hoopoe; similarly, the sea eagle has historically been confused with the osprey, and translations have often used the latter bird in place of the former. Because strouthos (ostrich) was also used in Greek for the sparrow, a few translations have placed the sparrow in the list. In Arabic, the Egyptian vulture is often referred to as rachami, and therefore a number of translations render racham as gier eagle, the old name for the Egyptian vulture. Variations arise when translations follow other ancient versions of the Bible, rather than the Septuagint, where they differ. Rather than vulture (gyps), the Vulgate has milvus, meaning red kite, which historically has been called the glede on account of its gliding flight; similarly, the Syriac Peshitta has owl rather than ibis. Other variations arise from attempting to base translations primarily on the Masoretic Text; these translations generally interpret some of the more ambiguous birds as being various kinds of vulture and owl. All of these variations mean that most translations arrive at a list of 20 birds from among the following: Despite being listed among the birds by the Bible, bats are not birds but mammals (the Hebrew Bible distinguishes animals into four general categories -- beasts of the land, flying animals, creatures which crawl upon the ground, and animals which dwell in water -- rather than following modern scientific classification). Most of the remaining animals on the list are either birds of prey or birds living on water, and the majority of the latter also eat fish or other seafood.
The Septuagint's version of the list comprehensively covers most of the birds of Canaan that fall into these categories. The conclusion of modern scholars is that, generally, ritually unclean birds were those clearly observed to eat other animals. Although it does regard all birds of prey as being forbidden, the Talmud is uncertain of there being a general rule, and instead gives detailed descriptions of the features that distinguish a bird as being ritually clean. The Talmud argues that clean birds would have craws, an easily separated double skin, and would eat food by placing it on the ground (rather than holding it on the ground) and tearing it with their bills before eating it; however, the Talmud also argues that only the birds in the biblical list are actually forbidden -- these distinguishing features were only for cases when there was uncertainty about a bird's identity. The earliest rationalistic explanations of the laws against eating certain birds focused on symbolic interpretations; the first indication of this view can be found in the 1st-century BC Letter of Aristeas, which argues that the prohibition is a lesson to teach justice, and is also about not injuring others. Such allegorical explanations were abandoned by most Jewish and Christian theologians after a few centuries, and later writers instead sought medical explanations for the rules; Nahmanides, for example, claimed that the black and thickened blood of birds of prey would cause psychological damage, making people much more inclined to cruelty. However, other cultures treated the meat of certain carnivorous birds as having medical benefits, with the Romans viewing owl meat as being able to ease the pain of insect bites. Conversely, modern scientific studies have discovered highly toxic birds, such as the Pitohui, which are neither birds of prey nor water birds, and which the biblical regulations therefore allow to be eaten.
Laws against eating any carnivorous birds also existed in Vedic India and Harran, and the Egyptian priests also refused to eat carnivorous birds. Due to the difficulty of identification, religious authorities have restricted consumption to specific birds for which Jews have passed down a tradition of permissibility from generation to generation. Birds for which there has been a tradition of their being kosher include: As a general principle, scavenging birds such as vultures and birds of prey such as hawks and eagles (which opportunistically eat carrion) are unclean. The turkey does not have a tradition, but because so many Orthodox Jews have come to eat it and it possesses the simanim (signs) required to render it a kosher bird, an exception is made; for all other birds a masorah is required. Songbirds, which are consumed as delicacies in many societies, may be kosher in theory, but are not eaten in kosher homes, as there is no tradition of eating them as such. Pigeons and doves are known to be kosher based on their permissible status as sacrificial offerings in the Temple of Jerusalem. The Orthodox Union of America considers that neither the peafowl nor the guineafowl is a kosher bird, since it has not obtained testimony from experts about the permissibility of either of these birds. In the case of swans, there is no clear tradition of eating them. Rabbi Chaim Loike is currently the Orthodox Union's specialist on kosher bird species. Unlike with land creatures and fish, the Torah doesn't give signs for determining kosher birds; instead it gives a list of unkosher birds. The Talmud also offers signs for determining whether a bird is kosher or not: if a bird kills other animals for its food, eats meat, or is a dangerous bird, it is not kosher. A predatory bird is unfit to eat; raptors such as eagles, hawks, and owls are not kosher, and vultures and other carrion-eating birds are not kosher either.
Crows and members of the crow family, such as jackdaws, magpies, and ravens, are not kosher. Storks, kingfishers, penguins, and other fish-eating birds are not kosher. Deuteronomy 14:19 specifies that all "flying creeping things" were to be considered ritually unclean, and Leviticus 11:20 goes further, describing all flying creeping things as filth (Hebrew: sheqets). Leviticus goes on to list four exceptions, which Deuteronomy does not. All these exceptions are described by the Levitical passages as "going upon all four legs" and as having "legs above their feet" for the purpose of leaping. The four creatures the Levitical rules list are named in the Masoretic Text using words of uncertain meaning: The Mishnah argues that the ritually clean locusts could be distinguished as they would all have four feet, jump with two of them, and have four wings of sufficient size to cover the entire locust's body. The Mishnah also goes on to state that any species of locust could only be considered clean if there was a reliable tradition that it was so. The only Jewish group that continues to preserve such a tradition is the Jews of Yemen, who use the term "kosher locust" to describe the specific species of locusts they believe to be kosher, all of which are native to the Arabian Peninsula. Due to the difficulties in establishing the validity of such traditions, later rabbinical authorities forbade contact with all types of locust to ensure that the ritually unclean locusts were avoided. Leviticus 11:42-43 specifies that whatever "goes on its belly, and whatever goes on all fours, or whatever has many feet, any swarming thing that swarms on the ground, you shall not eat, for they are detestable" (Hebrew: sheqets). Before stating this, it singles out eight particular "creeping things" as specifically being ritually unclean in Leviticus 11:29-30.
Like many of the other biblical lists of animals, the exact identity of the creatures in the list is uncertain; the medieval philosopher and rabbi Saadia Gaon, for example, gives a somewhat different explanation for each of the eight "creeping things." The Masoretic Text names them as follows: The Septuagint version of the list doesn't appear to directly parallel the Masoretic, and is thought to be listed in a different order. It lists the eight as:
what is the purpose of the majority of fha programs
FHA insured loan - Wikipedia An FHA insured loan is a US Federal Housing Administration mortgage insurance backed mortgage loan which is provided by an FHA-approved lender. FHA insured loans are a type of federal assistance and have historically allowed lower-income Americans to borrow money for the purchase of a home that they would not otherwise be able to afford. To obtain mortgage insurance from the Federal Housing Administration, an upfront mortgage insurance premium (UFMIP) equal to 1.75 percent of the base loan amount at closing is required, and is normally financed into the total loan amount by the lender and paid to FHA on the borrower's behalf. There is also a monthly mortgage insurance premium (MIP) which varies based on the amortization term and loan-to-value ratio. The program originated during the Great Depression of the 1930s, when the rates of foreclosures and defaults rose sharply, and the program was intended to provide lenders with sufficient insurance. Some FHA programs were subsidized by the government, but the goal was to make it self-supporting, based on insurance premiums paid by borrowers. Over time, private mortgage insurance (PMI) companies came into play, and now FHA primarily serves people who cannot afford a conventional down payment or otherwise do not qualify for PMI. The program has since been modified to accommodate the recession. The National Housing Act of 1934 created the Federal Housing Administration (FHA), which was established primarily to increase home construction, reduce unemployment, and operate various loan insurance programs. The FHA makes no loans, nor does it plan or build houses. As in the Veterans Administration's VA loan program, the applicant for the loan must make arrangements with a lending institution. This financial organization then may ask if the borrower wants FHA insurance on the loan or may insist that the borrower apply for it.
The federal government, through the Federal Housing Administration, investigates the applicant and, having decided that the risk is favorable, insures the lending institution against loss of principal in case the borrower fails to meet the terms and conditions of the mortgage. The borrower, who pays an insurance premium of one half of 1 percent on declining balances for the lender's protection, receives two benefits: a careful appraisal by an FHA inspector and a lower interest rate on the mortgage than the lender might have offered without the protection. African Americans and other racial minorities were largely denied access to FHA-backed loans, especially before 1950, and gained access only in a handful of suburban developments specifically built for all-black occupancy. Under the Eisenhower Administration, FHA tried to coax private developers to build more housing for minority buyers through its Voluntary Home Credit Mortgage Program; however, less housing was built under the program than expected, and FHA refused to deny insurance to developers who discriminated. The way in which FHA-backed loans were administered thus contributed to a widening homeownership and racial wealth gap, even as they helped to build the white middle-class family. Until the latter half of the 1960s, the Federal Housing Administration served mainly as an insuring agency for loans made by private lenders. However, in recent years this role has been expanded as the agency became the administrator of interest rate subsidy and rent supplement programs. Important subsidy programs such as the Civil Rights Act of 1968 were established by the United States Department of Housing and Urban Development. In 1974 the Housing and Community Development Act was passed. Its provisions significantly altered federal involvement in a wide range of housing and community development activities.
The new law made a variety of changes in FHA activities, although it did not involve (as had been proposed) a complete rewriting and consolidation of the National Housing Act. It did, however, include provisions relating to the lending and investment powers of federal savings and loan associations, the real estate lending authority of national banks, and the lending and depositary authority of federal credit unions. Further changes occurred in the 1977 Housing and Community Development Act, which raised ceilings on single-family loan amounts for savings and loan association lending, federal agency purchases, FHA insurance, and security for Federal Home Loan Bank advances. In 1980 the Housing and Community Development Act was passed; it permitted negotiated interest rates on certain FHA loans and created a new FHA rental subsidy program for middle-income families. On August 31, 2007, the FHA added a new refinancing program called FHA-Secure to help borrowers hurt by the 2007 subprime mortgage financial crisis. On March 6, 2008, the "FHA Forward" program was initiated. This was part of the stimulus package that President George W. Bush put in place to raise the loan limits for FHA. On April 1, 2012, the FHA enacted a new rule that requires their customers to settle with medical creditors in order to get a mortgage loan. This controversial change was rescinded and postponed until July 2012, but was later cancelled altogether pending clarification and additional guidance. By November 2012, the FHA was essentially bankrupt. The FHA does not make loans. Rather, it insures loans made by private lenders. The first step in obtaining an FHA loan is to contact several lenders and/or mortgage brokers and ask them if they are FHA-approved by the U.S. Department of Housing and Urban Development to originate FHA loans. As each lender sets its own rates and terms, comparison shopping is important in this market.
Second, the potential lender assesses the prospective home buyer for risk. The analysis of one's debt-to-income ratio enables the buyer to know what type of home can be afforded based on monthly income and expenses, and is one risk metric considered by the lender. Other factors, e.g. payment history on other debts, are considered and used to make decisions regarding eligibility and terms for a loan. FHA loans require a minimum FICO score of 580 to qualify for 3.5 percent down, or 500 for 10 percent down. Mortgage lenders can add their own rules, known as overlays, on top of these minimum standards. FHA-approved lenders use a program called Desktop Underwriter, also known as DU, for mortgage approval. DU considers the potential borrower's debt ratio, reserves, and credit score to make an automated credit decision. Some lenders also allow for manual underwriting if extenuating circumstances exist. The FHA makes provisions for home buyers who have recovered from "economic events". Via the Back To Work - Extenuating Circumstances program, the FHA reduces its standard, mandatory three-year application waiting period for buyers with a history of foreclosure, short sale or deed-in-lieu, and its two-year application waiting period after a Chapter 7 or Chapter 13 bankruptcy. For buyers who can show that the economic event was preceded by at least a twenty percent household income reduction which lasted for six months or more, and who can show a satisfactory credit history for the most recent 12 months, the FHA will allow an application, and will agree to insure the home loan. The Back To Work program ended September 30, 2016. Section 251 insures home purchase or refinancing loans with interest rates that may increase or decrease over time, which enables consumers to purchase or refinance their home at a lower initial interest rate.
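The minimum-score thresholds stated above (580 or higher qualifies for 3.5 percent down; 500-579 requires 10 percent down) can be written as a small lookup. This is an illustrative sketch of the baseline FHA standard only; real lenders layer their own overlays on top, and the function name is my own.

```python
# Sketch of the baseline FHA FICO / down-payment rule described above,
# before any lender overlays. Returns the minimum down payment as a
# percentage, or None if the score falls below the baseline floor.

def min_down_payment_pct(fico: int):
    if fico >= 580:
        return 3.5
    if fico >= 500:
        return 10.0
    return None  # below the baseline FHA minimum

print(min_down_payment_pct(640))  # 3.5
print(min_down_payment_pct(550))  # 10.0
print(min_down_payment_pct(480))  # None
```

In practice an overlay simply raises these floors, which is why the article advises comparison shopping: two FHA-approved lenders can quote different minimums against the same baseline.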
FHA's mortgage insurance programs help low- and moderate-income families become homeowners by lowering some of the costs of their mortgage loans. FHA mortgage insurance also encourages lenders to make loans to otherwise credit-worthy borrowers and projects that might not be able to meet conventional underwriting requirements, protecting the lender against loan default on mortgages for properties that meet certain minimum requirements, including manufactured homes, single and multifamily properties, and some health-related facilities. The basic FHA mortgage insurance program is Mortgage Insurance for One-to-Four-Family Homes (Section 203(b)). FHA allows first-time homebuyers to put down as little as 3.5% and receive up to 6% towards closing costs. However, some lenders won't allow a seller to contribute more than 3% toward allowable closing costs. If little or no credit exists for the applicants, the FHA will allow a qualified non-occupant co-borrower to co-sign for the loan without requiring that person to reside in the home with the first-time homebuyer. The co-signer does not have to be a blood relative. This is called a Non-Occupying Co-Borrower. FHA also allows gifts to be used for down payment from the following sources: FHA administers a number of programs, based on Section 203(b), that have special features. One of these programs, Section 251, insures adjustable rate mortgages (ARMs) which, particularly during periods when interest rates are low, enable borrowers to obtain mortgage financing that is more affordable by virtue of its lower initial interest rate. This interest rate is adjusted annually, based on market indices approved by FHA, and thus may increase or decrease over the term of the loan. In 2006 FHA received approval to allow hybrid ARMs, in which the interest is fixed for the first 3 or 5 years, and is then adjusted annually according to market conditions and indices.
The FHA Hybrid provides for an initial fixed interest rate for a period of three or five years, and then adjusts annually after the initial fixed period. The 3/1 and 5/1 FHA Hybrid products allow up to a 1% annual interest rate adjustment after the initial fixed interest rate period, and a 5% interest rate cap over the life of the loan. The new payment after an adjustment will be calculated on the current principal balance at the time of the adjustment. This ensures that the payment adjustment will be minimal even on a worst-case rate change. Down payment assistance and community redevelopment programs offer affordable housing opportunities to first-time homebuyers, low- and moderate-income individuals, and families who wish to achieve homeownership. Grant types include seller-funded programs, the (1) Grant America Program and others, as well as programs that are funded by the federal government, such as the American Dream Down Payment Initiative. Many down payment grant programs are run by state and local governments, often using mortgage revenue bond funds. On May 27, 2006, the Internal Revenue Service issued Revenue Ruling 2006-27, in which it ruled that certain non-profit seller-funded down payment assistance programs (DPA programs) were not operating as "charitable organizations". The ruling was based largely on the circular nature of the cash flows, in which the seller paid the charity a "fee" after closing. Many believe that the "grant" is really being rolled into the price of the home. According to the Government Accountability Office, there are higher default and foreclosure rates for these mortgages. On October 31, 2007, the Department of Housing and Urban Development adopted new regulations to ban so-called "seller-funded" down payment programs.
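The cap structure described above (rate fixed for the initial 3 or 5 years, then at most a 1-point move per year, bounded by a 5-point lifetime cap) determines a worst-case rate path that can be computed directly. The sketch below is illustrative; the starting rate is a hypothetical figure, not an actual FHA quote, and the function name is my own.

```python
# Worst-case rate path for a 3/1 or 5/1 hybrid ARM under the caps
# described above: flat during the fixed period, then rising at most
# `annual_cap` points per year, never exceeding the initial rate plus
# `lifetime_cap` points.

def worst_case_rates(initial_rate, fixed_years, total_years,
                     annual_cap=1.0, lifetime_cap=5.0):
    rates = [initial_rate] * fixed_years
    rate = initial_rate
    for _ in range(total_years - fixed_years):
        rate = min(rate + annual_cap, initial_rate + lifetime_cap)
        rates.append(rate)
    return rates

# A 5/1 hybrid starting at a hypothetical 4.0%: flat for five years,
# then up one point per year until the 9.0% lifetime ceiling.
print(worst_case_rates(4.0, fixed_years=5, total_years=12))
# [4.0, 4.0, 4.0, 4.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 9.0, 9.0]
```

The annual cap is what makes each individual payment adjustment small, as the article notes; the lifetime cap bounds the cumulative worst case.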
The new regulations state that all organizations providing down payment assistance reimbursed by the property seller "before, during, or after" that sale must cease providing grants on FHA loans by October 30, 2007, with the exception of the Nehemiah Corporation. Nehemiah is the beneficiary of a lawsuit settlement with the Department of Housing and Urban Development in April 1998. The terms of that settlement allowed Nehemiah to operate until April 1, 2008. Ameridream was granted an extension to the new regulations until February 29, 2008. Several similarly operated government grant programs were introduced in response to the IRS Revenue Ruling in May 2006. Their governmental status made them exempt from the IRS Ruling, but they were still affected by the HUD rule change. One such organization was the Grant America Program, which was conducted by the Penobscot Indian Nation and had been available to all homebuyers in all fifty states. The FHA employs a two-tiered mortgage insurance premium (MIP) schedule. FHA MIP rates were lowered on January 27, 2017. FHA MIP is not cancellable for mortgages originated after June 3, 2013.
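The upfront premium described at the start of the article (UFMIP of 1.75 percent of the base loan amount, normally financed into the total loan) is simple arithmetic worth making concrete. A minimal sketch, with an illustrative loan amount and a function name of my own:

```python
# Sketch of the UFMIP arithmetic stated earlier: the upfront premium
# is 1.75% of the base loan amount and is typically rolled into the
# financed total rather than paid in cash at closing.

UFMIP_RATE = 0.0175

def total_financed_amount(base_loan: float) -> float:
    """Base loan plus the financed upfront premium, rounded to cents."""
    return round(base_loan * (1 + UFMIP_RATE), 2)

print(total_financed_amount(200_000))  # 203500.0
```

Note that the monthly MIP is separate: it is a recurring charge that varies with amortization term and loan-to-value ratio, so it cannot be folded into a one-time figure like this.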
the democratic party is the oldest political party in the united states
History of the United States Democratic Party - wikipedia The Democratic Party of the United States is the oldest voter-based political party in the world, tracing its heritage back to the anti-Federalists of the 1790s. During the "Second Party System", from 1832 to the mid-1850s, under presidents Andrew Jackson, Martin Van Buren, and James K. Polk, the Democrats usually bested the opposition Whig Party by narrow margins. Both parties worked hard to build grassroots organizations and maximize the turnout of voters, which often reached 80 or 90 percent. Both parties used patronage extensively to finance their operations, which included emerging big-city political machines as well as national networks of newspapers. The Democratic Party was a proponent for slave-owners across the country, urban workers, and Caucasian immigrants. It was especially attractive to Irish immigrants, who increasingly controlled the party machinery in the cities. The party was much less attractive to businessmen, African Americans, Evangelical Protestants, and social reformers. The party advocated westward expansion, Manifest Destiny, greater equality among all white men, and opposition to the national banks. In 1861 the Civil War began between the mostly-Republican North and the mostly-Democratic South. From 1860 to 1932, in the era from the Civil War to the Great Depression, the opposing Republican Party, organized in the mid-1850s from the ruins of the Whig Party and some other smaller splinter groups, was dominant in presidential politics. The Democrats elected only two presidents to four terms of office in 72 years: Grover Cleveland (in 1884 and 1892) and Woodrow Wilson (in 1912 and 1916).
Over the same period, the Democrats proved more competitive with the Republicans in Congressional politics, enjoying House of Representatives majorities (as in the 65th Congress) in 15 of the 36 Congresses elected, although only in five of these did they form the majority in the United States Senate. The Party was split between the "Bourbon Democrats", representing Eastern business interests, and the agrarian elements comprising poor farmers in the South and West. The agrarian element, marching behind the slogan of "free silver" (i.e. in favor of inflation), captured the Party in 1896 and nominated the "Great Commoner", William Jennings Bryan, in 1896, 1900, and 1908; he lost every time. Both Bryan and Wilson were leaders of the Progressive Movement of the 1890s-1920s. Starting with the election of 32nd President Franklin D. Roosevelt in 1932, during the Great Depression, the Party dominated the "Fifth Party System" with its liberal and progressive policies, assembling the "New Deal" coalition to combat the emergency bank closings and the financial depression that had persisted since the Wall Street Crash of 1929, and later to confront the crises leading up to the Second World War (1939/1941-1945). The Democrats finally lost the White House and control of the executive branch only after Roosevelt's death in April 1945, near the end of the war, and after the postwar administration of Roosevelt's third Vice President, Harry S. Truman, a former Senator from Missouri, who served from 1945 to 1953 (elected on the 1944 ticket and winning the "stunner" of 1948). A Republican president was elected only in the early 1950s, when two-time nominee Adlai Stevenson, Governor of Illinois (and grandson of the 1890s Vice President of the same name), lost to the very popular war hero and commanding general of World War II, Dwight D.
Eisenhower (in 1952 and 1956). With two brief interruptions since the Great Depression and World War II eras, the Democrats, often with unusually large majorities, controlled the House of Representatives from 1930 until 1994, and the Senate for most of that same period, electing the Speaker of the House and the majority leaders and committee chairs of both chambers. Important Democratic progressive and liberal leaders included Presidents Harry S. Truman of Missouri (33rd, 1945-1953) and Lyndon B. Johnson of Texas (36th, 1963-1969), as well as the Kennedy brothers: 35th President John F. Kennedy of Massachusetts (1961-1963), Senator Robert F. Kennedy of New York, and Senator Edward M. ("Teddy") Kennedy of Massachusetts, who carried the flag for modern American political liberalism. Since the presidential election of 1976, Democrats have won five of the last ten presidential elections: in 1976 (with 39th President Jimmy Carter of Georgia, 1977-1981), in 1992 and 1996 (with 42nd President Bill Clinton of Arkansas, 1993-2001), and in 2008 and 2012 (with 44th President Barack Obama of Illinois, 2009-2017). Social scientists Theodore Caplow et al. argue, "the Democratic party, nationally, moved from left-center toward the center in the 1940s and 1950s, then moved further toward the right-center in the 1970s and 1980s." The modern Democratic Party emerged in the 1830s from former factions of the Democratic-Republican Party, which had largely collapsed by 1824. It was built by Martin Van Buren, who assembled a cadre of politicians in every state behind war hero Andrew Jackson of Tennessee.
The spirit of Jacksonian Democracy animated the party from the early 1830s to the 1850s, shaping the Second Party System, with the Whig Party the main opposition. After the disappearance of the Federalists after 1815, and the Era of Good Feelings (1816 -- 24), there was a hiatus of weakly organized personal factions until about 1828 -- 32, when the modern Democratic Party emerged along with its rival the Whigs. The new Democratic Party became a coalition of farmers, city - dwelling laborers, and Irish Catholics. Behind the party platforms, acceptance speeches of candidates, editorials, pamphlets and stump speeches, there was a widespread consensus of political values among Democrats. As Norton explains: The Democrats represented a wide range of views but shared a fundamental commitment to the Jeffersonian concept of an agrarian society. They viewed the central government as the enemy of individual liberty. The 1824 "corrupt bargain '' had strengthened their suspicion of Washington politics... Jacksonians feared the concentration of economic and political power. They believed that government intervention in the economy benefited special - interest groups and created corporate monopolies that favored the rich. They sought to restore the independence of the individual -- the artisan and the ordinary farmer -- by ending federal support of banks and corporations and restricting the use of paper currency, which they distrusted. Their definition of the proper role of government tended to be negative, and Jackson 's political power was largely expressed in negative acts. He exercised the veto more than all previous presidents combined. Jackson and his supporters also opposed reform as a movement. Reformers eager to turn their programs into legislation called for a more active government. But Democrats tended to oppose programs like educational reform and the establishment of a public education system.
They believed, for instance, that public schools restricted individual liberty by interfering with parental responsibility and undermined freedom of religion by replacing church schools. Nor did Jackson share reformers ' humanitarian concerns. He had no sympathy for American Indians, initiating the removal of the Cherokees along the Trail of Tears. The Party was weakest in New England, but strong everywhere else and won most national elections thanks to strength in New York, Pennsylvania, Virginia (by far the most populous states at the time), and the American frontier. Democrats opposed elites and aristocrats, the Bank of the United States, and the whiggish modernizing programs that would build up industry at the expense of the yeoman or independent small farmer. Historian Frank Towers has specified an important ideological divide: Democrats stood for the ' sovereignty of the people ' as expressed in popular demonstrations, constitutional conventions, and majority rule as a general principle of governing, whereas Whigs advocated the rule of law, written and unchanging constitutions, and protections for minority interests against majority tyranny. From 1828 to 1848, banking and tariffs were the central domestic policy issues. Democrats strongly favored, and Whigs opposed, expansion to new farm lands, as typified by their expulsion of eastern American Indians and acquisition of vast amounts of new land in the West after 1846. The party favored the War with Mexico and opposed anti-immigrant nativism. Both Democrats and Whigs were divided on the issue of slavery. In the 1830s, the Locofocos in New York City were radically democratic, anti-monopoly, and were proponents of hard money and free trade. Their chief spokesman was William Leggett. At this time labor unions were few; some were loosely affiliated with the party. Foreign policy was a major issue in the 1840s; war threatened with Mexico over Texas, and with Britain over Oregon.
Democrats strongly supported Manifest Destiny and most Whigs strongly opposed it. The 1844 election was a showdown, with the Democrat James K. Polk narrowly defeating Whig Henry Clay on the Texas issue. John Mack Faragher 's analysis of the political polarization between the parties is that: Most Democrats were wholehearted supporters of expansion, whereas many Whigs (especially in the North) were opposed. Whigs welcomed most of the changes wrought by industrialization but advocated strong government policies that would guide growth and development within the country 's existing boundaries; they feared (correctly) that expansion raised a contentious issue: the extension of slavery to the territories. On the other hand, many Democrats feared the industrialization the Whigs welcomed... For many Democrats, the answer to the nation 's social ills was to continue to follow Thomas Jefferson 's vision of establishing agriculture in the new territories in order to counterbalance industrialization. The Democratic National Committee (DNC) was created in 1848 at the convention that nominated General Lewis Cass, who lost to General Zachary Taylor of the Whigs. A major cause of the defeat was that the new Free Soil Party, which opposed slavery expansion, split the Democratic Party, particularly in New York, where the electoral votes went to Taylor. Democrats in Congress passed the Compromise of 1850, designed to put the slavery issue to rest while resolving disputes over the territories gained following the War with Mexico. In state after state, however, the Democrats gained small but permanent advantages over the Whig Party, which finally collapsed in 1852, fatally weakened by division on slavery and nativism. The fragmented opposition could not stop the election of Democrats Franklin Pierce in 1852 and James Buchanan in 1856. During 1858 -- 60, Senator Stephen A. Douglas confronted President Buchanan in a furious battle for control of the party.
Douglas finally won, but his nomination signaled defeat for the Southern wing of the party, and it walked out of the 1860 convention and nominated its own presidential ticket. Yonatan Eyal (2007) argues that the 1840s and 1850s were the heyday of a new faction of young Democrats called "Young America ''. Led by Stephen A. Douglas, James K. Polk, Franklin Pierce, and New York financier August Belmont, this faction, Eyal explains, broke with the agrarian and strict constructionist orthodoxies of the past and embraced commerce, technology, regulation, reform, and internationalism. The movement attracted a circle of outstanding writers, including William Cullen Bryant, George Bancroft, Herman Melville and Nathaniel Hawthorne. They sought independence from European standards of high culture and wanted to demonstrate the excellence and exceptionalism of America 's own literary tradition. In economic policy Young America saw the necessity of a modern infrastructure with railroads, canals, telegraphs, turnpikes, and harbors; they endorsed the "market revolution '' and promoted capitalism. They called for Congressional land grants to the states, which allowed Democrats to claim that internal improvements were locally rather than federally sponsored. Young America claimed that modernization would perpetuate the agrarian vision of Jeffersonian Democracy by allowing yeomen farmers to sell their products and therefore to prosper. They tied internal improvements to free trade, while accepting moderate tariffs as a necessary source of government revenue. They supported the Independent Treasury (the Jacksonian alternative to the Second Bank of the United States) not as a scheme to quash the special privilege of the Whiggish monied elite, but as a device to spread prosperity to all Americans. As sectional confrontations escalated during the 1850s, the Democratic Party 's split between North and South grew deeper.
The conflict was papered over at the 1852 and 1856 conventions by selecting men who had little involvement in sectionalism, but they made matters worse. Historian Roy F. Nichols explains that Franklin Pierce was not up to the challenges a Democratic president had to face. In 1854, over vehement opposition, the main Democratic leader in the Senate, Stephen Douglas of Illinois, pushed through the Kansas -- Nebraska Act. It established that settlers in Kansas Territory could vote on whether to allow slavery. Thousands of men moved in from North and South with the goal of voting slavery down or up, and their violence shook the nation. A major re-alignment took place among voters and politicians, with new issues, new parties, and new leaders. The Whig Party dissolved entirely. The crisis for the Democratic Party came in the late 1850s, as northern Democrats increasingly rejected national policies demanded by the southern Democrats. Southerners insisted that full equality for their region required the government to acknowledge the legitimacy of slavery outside the South. The southern demands included a fugitive slave law to recapture runaway slaves; opening Kansas to slavery; forcing a pro-slavery constitution on Kansas; acquiring Cuba (where slavery already existed); accepting the Dred Scott decision of the Supreme Court; and adopting a federal slave code to protect slavery in the territories. President Buchanan went along with these demands; Douglas refused. Douglas proved a much better politician than Buchanan, but the bitter battle lasted for years and permanently alienated the northern and southern wings. When the new Republican Party formed in 1854 on the basis of refusing to tolerate the expansion of slavery into the territories, many northern Democrats (especially Free Soilers from 1848) joined it. The Republicans by 1854 had a majority in most, but not all, of the northern states.
The new party had practically no support south of the Mason -- Dixon line. The formation of the new short - lived Know - Nothing Party allowed the Democrats to win the presidential election of 1856. Buchanan, a Northern "Doughface '' (his base of support was in the pro-slavery South), split the party on the issue of slavery in Kansas when he attempted to pass a Federal slave code as demanded by the South; most Democrats in the North rallied to Senator Douglas, who preached "Popular Sovereignty '' and believed that a Federal slave code would be undemocratic. The Democratic Party was unable to compete with the Republican Party, which controlled nearly all northern states by 1860, bringing a solid majority in the Electoral College. The Republicans claimed that the northern Democrats, including Doughfaces such as Pierce and Buchanan, and advocates of popular sovereignty such as Stephen A. Douglas and Lewis Cass, were all accomplices to the Slave Power. The Republicans argued that slaveholders, all of them Democrats, had seized control of the federal government and were blocking the progress of liberty. In 1860 the Democrats were unable to stop the election of Republican Abraham Lincoln, even as they feared his election would lead to civil war. The Democrats split over the choice of a successor to President Buchanan along Northern and Southern lines; factions of the party provided two separate candidacies for President in the election of 1860, in which the Republican Party gained ascendancy. Some Southern Democratic delegates followed the lead of the Fire - Eaters by walking out of the Democratic convention at Charleston 's Institute Hall in April 1860 and were later joined by those who, once again led by the Fire - Eaters, left the Baltimore Convention the following June when the convention rejected a resolution supporting extending slavery into territories whose voters did not want it. The Southern Democrats nominated the pro-slavery incumbent Vice-President, John C.
Breckinridge of Kentucky, for President and General Joseph Lane, former governor of Oregon, for Vice President. The Northern Democrats proceeded to nominate Douglas of Illinois for President and former Governor of Georgia Herschel Vespasian Johnson for Vice-President, while some southern Democrats joined the Constitutional Union Party, backing its nominees (who had both been prominent Whig leaders), former Senator John Bell of Tennessee for President and the politician Edward Everett of Massachusetts for Vice-President. This fracturing of the Democrats left them powerless. Republican Abraham Lincoln was elected the 16th President of the United States. Douglas campaigned across the country calling for unity and came in second in the popular vote, but carried only Missouri and New Jersey. Breckinridge carried 11 slave states, coming in second in the Electoral vote, but third in the popular vote. During the Civil War, Northern Democrats divided into two factions, the War Democrats, who supported the military policies of President Lincoln, and the Copperheads, who strongly opposed them. No party politics were allowed in the Confederacy, whose political leadership, mindful of the welter prevalent in antebellum American politics and with a pressing need for unity, largely viewed political parties as inimical to good governance and as being especially unwise in wartime. Consequently, the Democratic Party halted all operations during the life of the Confederacy, 1861 -- 65. Partisanship flourished in the North and strengthened the Lincoln Administration as Republicans automatically rallied behind it. After the attack on Fort Sumter, Douglas rallied northern Democrats behind the Union, but when Douglas died, the party lacked an outstanding figure in the North, and by 1862 an anti-war peace element was gaining strength. The most intense anti-war elements were the Copperheads. 
The Democratic Party did well in the 1862 congressional elections, but in 1864 it nominated General George McClellan, a War Democrat, on a peace platform, and lost badly because many War Democrats bolted to National Union candidate Abraham Lincoln. Many former Democrats became Republicans, especially soldiers such as generals Ulysses S. Grant and John A. Logan. In the 1866 elections, the Radical Republicans won two - thirds majorities in Congress and took control of national affairs. The large Republican majorities made Congressional Democrats helpless, though they unanimously opposed the Radicals ' Reconstruction policies. Realizing that the old issues were holding it back, the Democrats tried a "New Departure '' that downplayed the War and stressed such issues as corruption and white supremacy. Regardless, war hero Ulysses S. Grant led the Republicans to landslides in 1868 and 1872. The Democrats lost consecutive presidential elections from 1860 through 1880 (1876 was in dispute) and did not win the presidency until 1884. The party was weakened by its record of opposition to the war but nevertheless benefited from White Southerners ' resentment of Reconstruction and consequent hostility to the Republican Party. The nationwide depression of 1873 allowed the Democrats to retake control of the House in the 1874 Democratic landslide. The Redeemers gave the Democrats control of every Southern state (by the Compromise of 1877); the disenfranchisement of blacks took place 1880 -- 1900. From 1880 to 1960 the "Solid South '' voted Democratic in presidential elections (except 1928). After 1900, a victory in a Democratic primary was "tantamount to election '' because the Republican Party was so weak in the South. Although Republicans continued to control the White House until 1884, the Democrats remained competitive, especially in the mid-Atlantic and lower Midwest, and controlled the House of Representatives for most of that period. 
In the election of 1884, Grover Cleveland, the reforming Democratic Governor of New York, won the Presidency, a feat he repeated in 1892, having lost in the election of 1888. Cleveland was the leader of the Bourbon Democrats. They represented business interests, supported banking and railroad goals, promoted laissez - faire capitalism, opposed imperialism and U.S. overseas expansion, opposed the annexation of Hawaii, fought for the gold standard, and opposed Bimetallism. They strongly supported reform movements such as Civil Service Reform and opposed corruption of city bosses, leading the fight against the Tweed Ring. The leading Bourbons included Samuel J. Tilden, David Bennett Hill and William C. Whitney of New York, Arthur Pue Gorman of Maryland, Thomas F. Bayard of Delaware, Henry M. Mathews and William L. Wilson of West Virginia, John Griffin Carlisle of Kentucky, William F. Vilas of Wisconsin, J. Sterling Morton of Nebraska, John M. Palmer of Illinois, Horace Boies of Iowa, Lucius Quintus Cincinnatus Lamar of Mississippi, and railroad builder James J. Hill of Minnesota. A prominent intellectual was Woodrow Wilson. The Bourbons were in power when the Panic of 1893 hit, and they took the blame. A fierce struggle inside the party ensued, with catastrophic losses for both the Bourbon and agrarian factions in 1894, leading to the showdown in 1896. Just before the 1894 election, an advisor warned President Cleveland that heavy losses were coming. The warning was appropriate, for the Republicans won their biggest landslide in decades, taking full control of the House, while the Populists lost most of their support. However, Cleveland 's factional enemies gained control of the Democratic Party in state after state, including full control in Illinois and Michigan, and made major gains in Ohio, Indiana, Iowa and other states. Wisconsin and Massachusetts were two of the few states that remained under the control of Cleveland 's allies.
The opposition Democrats were close to controlling two thirds of the vote at the 1896 national convention, which they needed to nominate their own candidate. However they were not united and had no national leader, as Illinois governor John Peter Altgeld had been born in Germany and was ineligible to be nominated for president. Religious divisions were sharply drawn. Methodists, Congregationalists, Presbyterians, Scandinavian Lutherans and other pietists in the North were closely linked to the Republican Party. In sharp contrast, liturgical groups, especially the Catholics, Episcopalians, and German Lutherans, looked to the Democratic Party for protection from pietistic moralism, especially prohibition. Both parties cut across the class structure, with the Democrats gaining more support from the lower classes and Republicans more support from the upper classes. Cultural issues, especially prohibition and foreign language schools, became matters of contention because of the sharp religious divisions in the electorate. In the North, about 50 percent of voters were pietistic Protestants (Methodists, Scandinavian Lutherans, Presbyterians, Congregationalists, Disciples of Christ) who believed the government should be used to reduce social sins, such as drinking. Liturgical churches (Roman Catholics, German Lutherans, Episcopalians) comprised over a quarter of the vote and wanted the government to stay out of the morality business. Prohibition debates and referendums heated up politics in most states over a period of decades, and national prohibition was finally ratified in 1919 (and repealed in 1933), serving as a major issue between the wet Democrats and the dry Republicans. Grover Cleveland led the party faction of conservative, pro-business Bourbon Democrats, but as the depression of 1893 deepened, his enemies multiplied.
At the 1896 convention the silverite - agrarian faction repudiated the president, and nominated the crusading orator William Jennings Bryan on a platform of free coinage of silver. The idea was that minting silver coins would flood the economy with cash and end the depression. Cleveland supporters formed the National Democratic Party (Gold Democrats), which attracted politicians and intellectuals (including Woodrow Wilson and Frederick Jackson Turner) who refused to vote Republican. Bryan, an overnight sensation because of his "Cross of Gold '' speech, waged a new - style crusade against the supporters of the gold standard. Criss - crossing the Midwest and East by special train -- he was the first candidate since 1860 to go on the road -- he gave over 500 speeches to audiences in the millions. In St. Louis he gave 36 speeches to workingmen 's audiences across the city, all in one day. Most Democratic newspapers were hostile toward Bryan, but he seized control of the media by making the news every day, as he hurled thunderbolts against Eastern monied interests. The rural folk in the South and Midwest were ecstatic, showing an enthusiasm never before seen. Ethnic Democrats, especially Germans and Irish, however, were alarmed and frightened by Bryan. The middle classes, businessmen, newspaper editors, factory workers, railroad workers, and prosperous farmers generally rejected Bryan 's crusade. Republican William McKinley promised a return to prosperity based on the gold standard, support for industry, railroads and banks, and pluralism that would enable every group to move ahead. Although Bryan lost the election in a landslide, he did win the hearts and minds of a majority of Democrats, as shown by his renomination in 1900 and 1908; as late as 1924, the Democrats put his brother Charles W. Bryan on their national ticket. 
The victory of the Republican Party in the election of 1896 marked the start of the "Progressive Era, '' which lasted from 1896 to 1932, in which the Republican Party usually was dominant. The 1896 election marked a political realignment in which the Republican Party controlled the presidency for 28 of 36 years. The Republicans dominated most of the Northeast and Midwest, and half the West. Bryan, with a base in the South and Plains states, was strong enough to get the nomination in 1900 (losing to William McKinley) and 1908 (losing to William Howard Taft). Theodore Roosevelt dominated the first decade of the century, and to the annoyance of Democrats "stole '' the trust issue by crusading against trusts. Anti-Bryan conservatives controlled the convention in 1904, but faced a Theodore Roosevelt landslide. Bryan dropped his free silver and anti-imperialism rhetoric and supported mainstream progressive issues, such as the income tax, anti-trust, and direct election of Senators. Taking advantage of a deep split in the Republican Party, the Democrats took control of the House in 1910, and elected the intellectual reformer Woodrow Wilson in 1912 and 1916. Wilson successfully led Congress to a series of progressive laws, including a reduced tariff, stronger antitrust laws, new programs for farmers, hours - and - pay benefits for railroad workers, and the outlawing of child labor (which was reversed by the Supreme Court). Wilson tolerated the segregation of the federal Civil Service by Southern cabinet members. Furthermore, bipartisan constitutional amendments for prohibition and women 's suffrage were passed in his second term. In effect, Wilson laid to rest the issues of tariffs, money and antitrust that had dominated politics for 40 years. Wilson oversaw the U.S. role in World War I, and helped write the Versailles Treaty, which included the League of Nations. But in 1919 Wilson 's political skills faltered, and suddenly everything turned sour. 
The Senate rejected Versailles and the League, a nationwide wave of violent, unsuccessful strikes and race riots caused unrest, and Wilson 's health collapsed. The Democrats lost by a huge landslide in 1920, doing especially poorly in the cities, where the German - Americans deserted the ticket, and the Irish Catholics, who dominated the party apparatus, sat on their hands. Although they recovered considerable ground in the Congressional elections of 1922, the entire decade saw the Democrats as a helpless minority in Congress, and as a weak force in most northern states. At the 1924 Democratic National Convention, a resolution denouncing the Ku Klux Klan was introduced by forces allied with Al Smith and Oscar W. Underwood in order to embarrass the front - runner, William Gibbs McAdoo. After much debate, the resolution failed by a single vote. The KKK faded away soon after, but the deep split in the party over cultural issues, especially Prohibition, facilitated Republican landslides in 1920, 1924, and 1928. However, Al Smith did build a strong Catholic base in the big cities in 1928, and Franklin D. Roosevelt 's election as Governor of New York that year brought a new leader to center stage. The stock market crash of 1929 and the ensuing Great Depression set the stage for a more progressive government and Franklin D. Roosevelt won a landslide victory in the election of 1932, campaigning on a platform of "Relief, Recovery, and Reform ''; that is, relief of unemployment and rural distress, recovery of the economy back to normal, and long - term structural reforms to prevent a repetition of the Depression. This came to be termed "The New Deal '' after a phrase in Roosevelt 's acceptance speech. The Democrats also swept to large majorities in both houses of Congress, and among state governors. Roosevelt altered the nature of the party, away from laissez - faire capitalism, and towards an ideology of economic regulation and insurance against hardship. 
Two old words took on new meanings: "Liberal '' now meant a supporter of the New Deal; "conservative '' meant an opponent. Conservative Democrats were outraged; led by Al Smith, they formed the American Liberty League in 1934 and counterattacked. They failed, and either retired from politics or joined the Republican Party. A few of them, such as Dean Acheson, found their way back to the Democratic Party. The 1933 programs, called "the First New Deal '' by historians, represented a broad consensus. Roosevelt tried to reach out to business and labor, farmers and consumers, cities and countryside. By 1934, however, he was moving toward a more confrontational policy. After making gains in state governorships and in Congress in the 1934 elections, Roosevelt embarked on an ambitious legislative program that came to be called "The Second New Deal. '' It was characterized by building up labor unions, nationalizing welfare by the WPA, setting up Social Security, imposing more regulations on business (especially transportation and communications), and raising taxes on business profits. Roosevelt 's New Deal programs focused on job creation through public works projects as well as on social welfare programs such as Social Security. It also included sweeping reforms to the banking system, work regulation, transportation, communications, and stock markets, as well as attempts to regulate prices. His policies soon paid off by uniting a diverse coalition of Democratic voters called the New Deal coalition, which included labor unions, southerners, minorities (most significantly, Catholics and Jews), and liberals. This united voter base allowed Democrats to be elected to Congress and the presidency for much of the next 30 years. After a triumphant re-election in 1936, he announced plans to enlarge the Supreme Court, which tended to oppose his New Deal, by as many as six new members. A firestorm of opposition erupted, led by his own Vice President John Nance Garner.
Roosevelt was defeated by an alliance of Republicans and conservative Democrats, who formed a Conservative coalition that managed to block nearly all liberal legislation (only a minimum wage law got through). Annoyed by the conservative wing of his own party, Roosevelt made an attempt to rid himself of it; in 1938, he actively campaigned against five incumbent conservative Democratic senators; all five senators won re-election. Under Roosevelt, the Democratic Party became identified more closely with modern liberalism, which included the promotion of social welfare, labor unions, civil rights, and the regulation of business. The opponents, who stressed long - term growth and support for entrepreneurship and low taxes, now started calling themselves "conservatives. '' Harry Truman took over after Roosevelt 's death in 1945, and the rifts inside the party that Roosevelt had papered over began to emerge. Major components included the big city machines, the southern state and local parties, the far - left, and the "Liberal coalition '' or "Liberal - Labor Coalition '' comprising the AFL, CIO, and ideological groups such as the NAACP (representing Blacks), the American Jewish Congress (AJC), and the Americans for Democratic Action (ADA) (representing liberal intellectuals). By 1948 the unions had expelled nearly all the far - left and Communist elements. On the right the Republicans blasted Truman 's domestic policies. "Had Enough? '' was the winning slogan as Republicans recaptured Congress in 1946 for the first time since 1928. Many party leaders were ready to dump Truman in 1948, but after General Dwight D. Eisenhower rejected their invitation they lacked an alternative. Truman counterattacked, pushing J. Strom Thurmond and his Dixiecrats out, and taking advantage of the splits inside the Republican Party. He was reelected in a stunning surprise. 
However, all of Truman 's Fair Deal proposals, such as universal health care, were defeated by the Southern Democrats in Congress. His seizure of the steel industry was reversed by the Supreme Court. On the far left, former Vice President Henry A. Wallace denounced Truman as a war - monger for his anti-Soviet programs, the Truman Doctrine, Marshall Plan, and NATO. Wallace quit the party, and ran for president as an independent in 1948. He called for détente with the Soviet Union, but much of his campaign was controlled by Communists who had been expelled from the main unions. Wallace fared poorly and helped turn the anti-Communist vote toward Truman. By cooperating with internationalist Republicans, Truman succeeded in defeating isolationists on the right and supporters of softer lines on the Soviet Union on the left to establish a Cold War program that lasted until the fall of the Soviet Union in 1991. Wallace supporters and other Democrats who were farther left were pushed out of the party and the CIO in 1946 -- 48 by young anti-Communists like Hubert Humphrey, Walter Reuther, and Arthur Schlesinger, Jr. Hollywood emerged in the 1940s as an important new base in the party, led by movie - star politicians such as Ronald Reagan, who strongly supported Roosevelt and Truman at this time. In foreign policy, Europe was safe but troubles mounted in Asia. China fell to the Communists in 1949. Truman entered the Korean War without formal Congressional approval. When the war turned to a stalemate and he fired General Douglas MacArthur in 1951, Republicans blasted his policies in Asia. A series of petty scandals among friends and associates of Truman further tarnished his image, allowing the Republicans in 1952 to crusade against "Korea, Communism and Corruption. '' Truman dropped out of the presidential race early in 1952, leaving no obvious successor. The convention nominated Adlai Stevenson in 1952 and 1956, only to see him overwhelmed by two Eisenhower landslides.
In Congress the powerful duo of House Speaker Sam Rayburn and Senate Majority leader Lyndon B. Johnson held the party together, often by compromising with Eisenhower. In 1958 the party made dramatic gains in the midterms and seemed to have a permanent lock on Congress, thanks largely to organized labor. Indeed, Democrats had majorities in the House every election from 1930 to 1992 (except 1946 and 1952). Most southern Congressmen were conservative Democrats, however, and they usually worked with conservative Republicans. The result was a Conservative Coalition that blocked practically all liberal domestic legislation from 1937 to the 1970s, except for a brief spell 1964 -- 65, when Johnson neutralized its power. The counterbalance to the Conservative Coalition was the Democratic Study Group, which led the charge to liberalize the institutions of Congress and eventually pass a great deal of the Kennedy - Johnson program. The election of John F. Kennedy in 1960 over then - Vice President Richard M. Nixon re-energized the party. His youth, vigor and intelligence caught the popular imagination. New programs like the Peace Corps harnessed idealism. In terms of legislation, Kennedy was stalemated by the Conservative Coalition. Though Kennedy 's term in office lasted only about a thousand days, he tried to hold back Communist gains after the failed Bay of Pigs invasion in Cuba and the construction of the Berlin Wall, and sent 16,000 soldiers to Vietnam to advise the hard - pressed South Vietnamese army. He challenged America in the Space Race to land an American man on the moon by 1969. After the Cuban Missile Crisis he moved to de-escalate tensions with the Soviet Union. Kennedy also pushed for civil rights and racial integration, one example being Kennedy assigning federal marshals to protect the Freedom Riders in the south. His election did mark the coming of age of the Catholic component of the New Deal Coalition. 
After 1964 middle class Catholics started voting Republican in the same proportion as their Protestant neighbors. Except for the Chicago of Richard J. Daley, the last of the Democratic machines faded away. President Kennedy was assassinated on November 22, 1963, in Dallas, Texas. Then - Vice President Lyndon B. Johnson was sworn in as the new president. Johnson, heir to the New Deal ideals, broke the Conservative Coalition in Congress and passed a remarkable number of laws, known as the Great Society. Johnson succeeded in passing major civil rights laws that restarted racial integration in the south. At the same time, Johnson escalated the Vietnam War, leading to an inner conflict inside the Democratic Party that shattered the party in the elections of 1968. The Democratic Party platform of the 1960s was largely formed by the ideals of President Johnson 's "Great Society. '' The New Deal Coalition began to fracture as more Democratic leaders voiced support for civil rights, upsetting the party 's traditional base of Southern Democrats and Catholics in Northern cities. After Harry Truman 's platform gave strong support to civil rights and anti-segregation laws during the 1948 Democratic National Convention, many Southern Democratic delegates decided to split from the Party and formed the "Dixiecrats, '' led by South Carolina governor Strom Thurmond (who, as a Senator, would later join the Republican Party). However, few other Democrats left the party. On the other hand, African Americans, who had traditionally given strong support to the Republican Party since its inception as the "anti-slavery party, '' continued to shift to the Democratic Party, largely due to the economic opportunities offered by the New Deal relief programs, patronage offers, and the advocacy of and support for civil rights by such prominent Democrats as former First Lady Eleanor Roosevelt. Although Republican Dwight D. 
Eisenhower carried half the South in 1952 and 1956, and Senator Barry Goldwater also carried five Southern states in 1964, Democrat Jimmy Carter carried all of the South except Virginia in 1976, and there was no long - term realignment until Ronald Reagan 's sweeping victories in the South in 1980 and 1984. The party 's dramatic reversal on civil rights issues culminated when Democratic President Lyndon B. Johnson signed into law the Civil Rights Act of 1964. The Act passed both the House and the Senate with bipartisan majorities, though a larger share of Republicans than Democrats voted for it; most of the Democratic opposition came from southern Democrats. Meanwhile, the Republicans, led again by Richard Nixon, were beginning to implement their new economic policies, which aimed to resist federal encroachment on the states while appealing to conservatives and moderates in the rapidly growing cities and suburbs of the South. The year 1968 marked a major crisis for the party. In January, even though it was a military defeat for the Viet Cong, the Tet Offensive began to turn American public opinion against the Vietnam War. Senator Eugene McCarthy rallied intellectuals and anti-war students on college campuses and came within a few percentage points of defeating Johnson in the New Hampshire primary; Johnson was permanently weakened. Four days later Senator Robert Kennedy, brother of the late president, entered the race. Johnson stunned the nation on March 31 when he withdrew from the race; four weeks later his vice president, Hubert H. Humphrey, entered the race but did not run in any primary. Kennedy and McCarthy traded primary victories while Humphrey gathered the support of labor unions and the big - city bosses. Kennedy won the critical California primary on June 4, but he was assassinated that night. (Even as Kennedy won California, Humphrey had already amassed 1000 of the 1312 delegate votes needed for the nomination, while Kennedy had about 700). 
During the 1968 Democratic National Convention, while police and the National Guard violently confronted anti-war protesters on the streets and parks of Chicago, the Democrats nominated Humphrey. Meanwhile, Alabama 's Democratic governor George C. Wallace launched a third - party campaign and at one point was running second to the Republican candidate Richard Nixon. Nixon barely won, with the Democrats retaining control of Congress. The party was now so deeply split that it would not again win a majority of the popular vote for president until 1976. (Jimmy Carter won the popular vote in 1976 with 50.1 %.) The degree to which the Southern Democrats had abandoned the party became evident in the 1968 presidential election when the electoral votes of every former Confederate state except Texas went to either Republican Richard Nixon or independent Wallace. Humphrey 's electoral votes came mainly from the Northern states, marking a dramatic reversal from the 1948 election 20 years earlier, when the losing Republican electoral votes were concentrated in the same states. Following the 1968 debacle, the McGovern - Fraser Commission proposed, and the Party adopted, far - reaching changes in how national convention delegates were selected. More power over the presidential nominee selection accrued to the rank and file and presidential primaries became significantly more important. In 1972, the Democrats nominated Sen. George McGovern (SD) as the presidential candidate on a platform which advocated, among other things, immediate U.S. withdrawal from Vietnam (with his anti-war slogan "Come Home, America! '') and a guaranteed minimum income for all Americans. McGovern 's forces at the national convention ousted Mayor Richard J. Daley and the entire Chicago delegation, replacing them with insurgents led by Jesse Jackson. 
After it became known that McGovern 's running mate, Thomas Eagleton, had received electric shock therapy, McGovern said he supported Eagleton "1000 % '' but he was soon forced to drop him and find a new running mate. Numerous top names turned him down, but McGovern finally selected Sargent Shriver, a Kennedy in - law who was close to Mayor Daley. On July 14, 1972, McGovern appointed his campaign manager, Jean Westwood, as the first woman chair of the Democratic National Committee. McGovern was defeated in a landslide by incumbent Richard Nixon, winning only Massachusetts and Washington, D.C. The sordid Watergate scandal soon destroyed the Nixon presidency, giving the Democrats a flicker of hope. With Gerald Ford 's pardon of Nixon soon after his resignation in 1974, the Democrats used the "corruption '' issue to make major gains in the off - year elections. In 1976, mistrust of the administration, complicated by a combination of economic recession and inflation, sometimes called stagflation, led to Ford 's defeat by Jimmy Carter, a former Governor of Georgia. Carter won as a little - known outsider by promising honesty in Washington, a message that played well to voters as he swept the South and won narrowly. Carter had served as a naval officer, a farmer, a state senator, and a one - term governor. His only experience with federal politics was when he chaired the Democratic National Committee 's congressional and gubernatorial elections in 1974. Some of Carter 's major accomplishments consisted of the creation of a national energy policy and the consolidation of governmental agencies, resulting in two new cabinet departments, the United States Department of Energy and the United States Department of Education. 
Carter also successfully deregulated the trucking, airline, rail, finance, communications, and oil industries (thus backtracking on the New Deal approach to regulation of the economy), bolstered the social security system, and appointed record numbers of women and minorities to significant government and judicial posts. He also enacted strong legislation on environmental protection, through the expansion of the National Park Service in Alaska, creating 103 million acres (417,000 km2) of park land. In foreign affairs, Carter 's accomplishments consisted of the Camp David Accords, the Panama Canal Treaties, the establishment of full diplomatic relations with the People 's Republic of China, and the negotiation of the SALT II Treaty. In addition, he championed human rights throughout the world and used human rights as the center of his administration 's foreign policy. Even with all of these successes, Carter failed to implement a national health plan or to reform the tax system, as he had promised in his campaign. Inflation was also on the rise. Abroad, the Iranians held 52 Americans hostage for 444 days, and Carter 's diplomatic and military rescue attempts failed. The Soviet invasion of Afghanistan later that year further disenchanted some Americans with Carter. In 1980, Carter defeated Senator Ted Kennedy to gain renomination, but lost to Ronald Reagan in November. The Democrats lost 12 Senate seats, and for the first time since 1954, the Republicans controlled the Senate. The House, however, remained in Democratic hands. After his defeat, Carter negotiated the release of every American hostage held in Iran. They were lifted out of Iran minutes after Reagan was inaugurated, ending a 444 - day crisis. Democrats who supported many conservative policies were instrumental in the election of Republican President Ronald Reagan in 1980. 
The "Reagan Democrats '' were Democrats before the Reagan years, and afterward, but they voted for Ronald Reagan in 1980 and 1984 and for George H.W. Bush in 1988, producing their landslide victories. Reagan Democrats were mostly white ethnics in the Northeast and Midwest who were attracted to Reagan 's social conservatism on issues such as abortion, and to his strong foreign policy. They did not continue to vote Republican in 1992 or 1996, so the term fell into disuse except as a reference to the 1980s. The term is not used to describe White Southerners who became permanent Republicans in presidential elections. Stan Greenberg, a Democratic pollster, analyzed white ethnic voters -- largely unionized auto workers -- in suburban Macomb County, Michigan, just north of Detroit. The county voted 63 percent for Kennedy in 1960 and 66 percent for Reagan in 1984. He concluded that Reagan Democrats no longer saw Democrats as champions of their middle class aspirations, but instead saw it as a party working primarily for the benefit of others, especially African Americans, advocacy groups of the political left, and the very poor. The failure to hold the Reagan Democrats and the white South led to the final collapse of the New Deal coalition. Reagan carried 49 states against former Vice President and Minnesota Senator Walter Mondale, a New Deal stalwart, in 1984. In response to these landslide defeats, the Democratic Leadership Council (DLC) was created in 1985. It worked to move the party rightwards to the ideological center in order to recover some of the fundraising that had been lost to the Republicans due to corporate donors supporting Reagan. The goal was to retain left - of - center voters as well as moderates and conservatives on social issues, to become a catch all party with widespread appeal to most opponents of the Republicans. 
Despite this, Massachusetts Governor Michael Dukakis, running not as a New Dealer but as an efficiency expert in public administration, lost by a landslide in 1988 to Vice President George H.W. Bush. For nearly a century after Reconstruction, the white South identified with the Democratic Party. The Democrats ' lock on power was so strong the region was called the Solid South, although the Republicans controlled parts of the Appalachian Mountains and competed for statewide office in the border states. Before 1948, Southern Democrats believed that their party, with its respect for states ' rights and appreciation of traditional southern values, was the defender of the southern way of life. Southern Democrats warned against aggressive designs on the part of Northern liberals and Republicans and civil rights activists, whom they denounced as "outside agitators. '' The adoption of the strong civil rights plank by the 1948 convention and the integration of the armed forces by President Harry S. Truman 's Executive Order 9981, which provided for equal treatment and opportunity for African - American servicemen, drove a wedge between the northern and southern branches of the party. The party was sharply divided in the following election, as Southern Democrat Strom Thurmond ran as the candidate of the "States ' Rights Democratic Party ''. With the presidency of John F. Kennedy the Democratic Party began to embrace the Civil Rights Movement, and its lock on the South was irretrievably broken. Upon signing the Civil Rights Act of 1964, President Lyndon B. Johnson prophesied, "We have lost the South for a generation. '' Modernization had brought factories, national businesses, and larger, more cosmopolitan cities such as Atlanta, Dallas, Charlotte, and Houston to the South, as well as millions of migrants from the North and more opportunities for higher education. Meanwhile, the cotton and tobacco economy of the traditional rural South faded away, as former farmers commuted to factory jobs. 
As the South became more like the rest of the nation, it could not stand apart in terms of racial segregation. Integration and the Civil Rights Movement caused enormous controversy in the white South, with many attacking it as a violation of states ' rights. When segregation was outlawed by court order and by the Civil Rights acts of 1964 and 1965, a die - hard element resisted integration, led by Democratic governors Orval Faubus of Arkansas, Lester Maddox of Georgia, and especially George Wallace of Alabama. These populist governors appealed to a less - educated, blue - collar electorate that on economic grounds favored the Democratic Party and opposed desegregation. After 1965 most Southerners accepted integration (with the exception of public schools). Believing themselves betrayed by the Democratic Party, traditional White Southerners joined the new middle - class and the Northern transplants in moving toward the Republican Party. Meanwhile, newly enfranchised Black voters began supporting Democratic candidates at the 80 - 90 - percent levels, producing Democratic leaders such as Julian Bond and John Lewis of Georgia, and Barbara Jordan of Texas. Just as Martin Luther King had promised, integration had brought about a new day in Southern politics. The Republican Party 's southern strategy further alienated black voters from the party. In addition to its white middle - class base, Republicans attracted strong majorities among evangelical Christians, who prior to the 1980s were largely apolitical. Exit polls in the 2004 presidential election showed that George W. Bush led John Kerry by 70 -- 30 % among White Southerners, who comprised 71 % of the voters. Kerry had a 90 -- 9 lead among the 18 % of Southern voters who were black. One - third of the Southern voters said they were white Evangelicals; they voted for Bush by 80 -- 20. 
The Democrats included a strong element that came of age in opposition to the Vietnam War and remained hostile toward American military interventions. On August 1, 1990, Iraq, led by Saddam Hussein, invaded Kuwait. President Bush formed an international coalition and secured UN approval to expel Iraq from Kuwait. On January 12, 1991, Congress authorized the use of military force against Iraq by a narrow margin, with most Republicans in favor and most Democrats opposed. The vote in the House was 250 -- 183, and in the Senate 52 -- 47. In the Senate 42 Republicans and 10 Democrats voted yes to war, while 45 Democrats and two Republicans voted no. In the House 164 Republicans and 86 Democrats voted yes, and 179 Democrats, three Republicans and one Independent voted no. The Gulf War, a military operation known as "Desert Storm, '' was short and successful, but Hussein was allowed to remain in power. The Arab countries (and Japan) repaid all the American military costs. In the 1990s the Democratic Party revived itself, in part by moving to the right on economic policy. In 1992, for the first time in 12 years, the United States had a Democrat in the White House. During President Bill Clinton 's term, the federal budget was balanced for the first time since the Kennedy presidency, and Clinton presided over a robust American economy that saw incomes grow across the board. In 1994, the economy had the lowest combination of unemployment and inflation in 25 years. President Clinton also signed into law several gun control bills, including the Brady Bill, which imposed a five - day waiting period on handgun purchases; he also signed into law a ban on many types of semi-automatic firearms (which expired in 2004). His Family and Medical Leave Act, covering some 40 million Americans, offered workers up to 12 weeks of unpaid, job - guaranteed leave for childbirth or a personal or family illness. He deployed the U.S. 
military to Haiti to reinstate deposed president Jean - Bertrand Aristide, took a strong hand in Palestinian - Israeli peace negotiations, brokered a historic cease - fire in Northern Ireland, and negotiated the Dayton accords. In 1996, Clinton became the first Democratic president to be re-elected since Franklin D. Roosevelt. However, the Democrats lost their majority in both houses of Congress in 1994. Clinton vetoed two Republican - backed welfare reform bills before signing the third, the Personal Responsibility and Work Opportunity Act of 1996. The tort reform Private Securities Litigation Reform Act passed over his veto. Labor unions, which had been steadily losing membership since the 1960s, found they had also lost political clout inside the Democratic Party; Clinton enacted the North American Free Trade Agreement with Canada and Mexico over unions ' strong objections. In 1998, the Republican - led House of Representatives impeached Clinton on two charges; he was subsequently acquitted by the United States Senate in 1999. Under Clinton 's leadership, the United States participated in NATO 's Operation Allied Force against Yugoslavia that year. In the 1990s the Clinton Administration continued the free market, or neoliberal, reforms which began under the Reagan Administration. However, economist Sebastian Mallaby argues that the party increasingly adopted pro-business, pro - free - market principles after 1976, and historian Walter Scheidel likewise posits that both parties shifted toward free markets in the 1970s. As the DLC attempted to move the Democratic agenda to the right (to a more centrist position), prominent Democrats from both the centrist and conservative factions (such as Terry McAuliffe) assumed leadership of the party and its direction. Some liberals and progressives felt alienated by the Democratic Party, which they felt had become unconcerned with the interests of the common people and left - wing issues in general. 
Some Democrats challenged the validity of such critiques, citing the Democratic role in pushing for progressive reforms. During the 2000 presidential election, the Democrats chose Vice President Al Gore to be the party 's candidate for the presidency. Gore ran against George W. Bush, the Republican candidate and son of former President George H.W. Bush. The issues Gore championed included debt reduction, tax cuts, foreign policy, public education, global warming, judicial appointments, and affirmative action. Nevertheless, Gore 's affiliation with Clinton and the DLC caused critics to assert that Bush and Gore were too similar, especially on free trade, reductions in social welfare, and the death penalty. Green Party presidential candidate Ralph Nader in particular was very vocal in his criticisms. Gore won a popular plurality of over 540,000 votes over Bush, but lost in the Electoral College by four votes. Many Democrats blamed Nader 's third - party spoiler role for Gore 's defeat. They pointed to the states of New Hampshire (4 electoral votes) and Florida (25 electoral votes), where Nader 's total votes exceeded Bush 's margin of victory. In Florida, Nader received 97,000 votes; Bush defeated Gore by a mere 537. Controversy plagued the election, and Gore largely dropped from politics for years; by 2005, however, he was making speeches critical of Bush 's foreign policy. Despite Gore 's close defeat, the Democrats gained five seats in the Senate (including the election of Hillary Clinton in New York), turning a 55 -- 45 Republican edge into a 50 -- 50 split (with a Republican Vice President breaking ties). However, when Republican Senator Jim Jeffords of Vermont decided in 2001 to become an independent and vote with the Democratic Caucus, majority status shifted to the Democrats along with the seat, including control of the floor (by the Majority Leader) and all committee chairmanships. 
However, the Republicans regained their Senate majority with gains in 2002 and 2004, leaving the Democrats with only 44 seats, the fewest since the 1920s. In the aftermath of the September 11, 2001 attacks, the nation 's focus shifted to issues of national security. All but one Democrat (Representative Barbara Lee) voted with their Republican counterparts to authorize President Bush 's 2001 invasion of Afghanistan. House leader Richard Gephardt and Senate leader Thomas Daschle pushed Democrats to vote for the USA PATRIOT Act and the invasion of Iraq. The Democrats were split over entering Iraq in 2003 and increasingly expressed concerns about both the justification and progress of the War on Terrorism, as well as the domestic effects, including threats to civil rights and civil liberties, of the Patriot Act. Senator Russ Feingold was the only Senator to vote against the act. In the wake of the financial fraud scandal at the Enron Corporation and other corporations, Congressional Democrats pushed for a legal overhaul of business accounting with the intention of preventing further accounting fraud. This led to the bipartisan Sarbanes - Oxley Act in 2002. With job losses and bankruptcies across regions and industries increasing in 2001 and 2002, the Democrats generally campaigned on the issue of economic recovery. That did not work for them in 2002, as the Democrats lost a few seats in the U.S. House of Representatives. They also lost three seats in the Senate (Georgia, where Max Cleland was unseated; Minnesota, where Paul Wellstone died and his succeeding Democratic candidate lost the election; and Missouri, where Jean Carnahan was unseated). Democrats did gain governorships in New Mexico (Bill Richardson), Arizona (Janet Napolitano), Michigan (Jennifer Granholm) and Wyoming (Dave Freudenthal). 
However, Democrats lost governorships in South Carolina (Jim Hodges), Alabama (Don Siegelman) and, for the first time in more than a century, Georgia (Roy Barnes). The election led to another round of soul searching about the party 's narrowing base. Democrats suffered further losses in 2003, when a recall election unseated the unpopular Democratic governor of California, Gray Davis, and replaced him with Republican Arnold Schwarzenegger. By the end of 2003 the four most populous states -- California, Texas, New York and Florida -- all had Republican governors. The 2004 campaign started as early as December 2002, when Gore announced he would not run again in the 2004 election. Howard Dean, former Governor of Vermont, an opponent of the war and a critic of the Democratic establishment, was the front - runner heading into the Democratic primaries. Dean had immense grassroots support, especially from the left wing of the party. Massachusetts Senator John Kerry, a more centrist figure with heavy support from the Democratic Leadership Council, was nominated because he was seen as more "electable '' than Dean. As layoffs of American workers occurred in various industries due to outsourcing, some Democrats (including Dean and senatorial candidate Erskine Bowles of North Carolina) began to refine their positions on free trade, and some even questioned their past support for it. By 2004, the failure of George W. Bush 's administration to find weapons of mass destruction in Iraq, mounting combat casualties and fatalities in the ongoing Iraq War, and the lack of any end point for the War on Terror were frequently debated issues in the election. That year, Democrats generally campaigned on surmounting the jobless recovery, solving the Iraq crisis, and fighting terrorism more efficiently. In the end, Kerry lost both the popular vote (by 3 million out of over 120 million votes cast) and the Electoral College. 
Republicans also gained four seats in the Senate and three seats in the House of Representatives. Also, for the first time since 1952, the Democratic leader of the Senate lost re-election. In the end, there were 3,660 Democratic state legislators across the nation to the Republicans ' 3,557. Democrats gained governorships in Louisiana, New Hampshire and Montana. However, they lost the governorship of Missouri and a legislative majority in Georgia -- which had long been a Democratic stronghold. Senate pickups for the Democrats included Ken Salazar in Colorado and 2004 Democratic National Convention keynote speaker Barack Obama in Illinois. There were many reasons for the defeat. After the election most analysts concluded that Kerry was a poor campaigner. A group of Vietnam veterans opposed to Kerry, the Swift Boat Veterans for Truth, undercut Kerry 's use of his military past as a campaign strategy. Kerry was unable to reconcile his initial support of the Iraq War with his opposition to the war in 2004, or to manage the deep split in the Democratic Party between those who favored and those who opposed the war. Republicans ran thousands of television commercials to argue that Kerry had flip - flopped on Iraq. When Kerry 's home state of Massachusetts legalized same - sex marriage, the issue split liberal and conservative Democrats and independents (Kerry publicly stated throughout his campaign that he opposed same - sex marriage but favored civil unions). Republicans exploited the same - sex marriage issue by promoting ballot initiatives in 11 states that brought conservatives to the polls in large numbers; all 11 initiatives passed. Flaws in vote - counting systems may also have played a role in Kerry 's defeat (see 2004 United States election voting controversies). Senator Barbara Boxer of California and several Democratic U.S. 
Representatives (including John Conyers of Michigan) raised the issue of voting irregularities in Ohio when the 109th Congress first convened, but they were defeated 267 -- 31 by the House and 74 - 1 by the Senate. Other factors included a healthy job market, a rising stock market, strong home sales, and low unemployment. After the 2004 election, prominent Democrats began to rethink the party 's direction, and a variety of strategies for moving forward were voiced. Some Democrats proposed moving towards the right to regain seats in the House and Senate and possibly win the presidency in the election of 2008; others demanded that the party move more to the left and become a stronger opposition party. One topic of discussion was the party 's policies surrounding reproductive rights. Rethinking the party 's position on gun control became a matter of discussion, brought up by Howard Dean, Bill Richardson, Brian Schweitzer and other Democrats who had won governorships in states where Second Amendment rights were important to many voters. In What 's the Matter with Kansas?, commentator Thomas Frank wrote that the Democrats needed to return to campaigning on economic populism. These debates were reflected in the 2005 campaign for Chairman of the Democratic National Committee, which Howard Dean won over the objections of many party insiders. Dean sought to move the Democratic strategy away from the establishment, and bolster support for the party 's state organizations, even in red states (the Fifty - state strategy). When the 109th Congress convened, Harry Reid, the new Senate Minority Leader, tried to convince the Democratic Senators to vote more as a bloc on important issues; he forced the Republicans to abandon their push for privatization of Social Security. In 2005, the Democrats retained their governorships in Virginia and New Jersey, electing Tim Kaine and Jon Corzine, respectively. 
However, the party lost the mayoral race in New York City, a Democratic stronghold, for the fourth straight time. With scandals involving lobbyist Jack Abramoff, as well as Duke Cunningham, Tom DeLay, Mark Foley, and Bob Taft, the Democrats used the slogan "Culture of corruption '' against the Republicans during the 2006 campaign. Negative public opinion on the Iraq War, widespread dissatisfaction over the ballooning federal deficit, and the inept handling of the Hurricane Katrina disaster dragged down President Bush 's job approval ratings. As a result of the 2006 midterm elections, the Democratic Party became the majority party in the House of Representatives, and its caucus in the United States Senate constituted a majority when the 110th Congress convened in 2007. The Democrats had spent twelve successive years as the minority party in the House before the 2006 mid-term elections. The Democrats also went from controlling a minority of governorships to a majority. The number of seats held by party members likewise increased in various state legislatures, giving the Democrats control of a plurality of them nationwide. No Democratic incumbent was defeated, and no Democratic - held open seat was lost, in the U.S. Senate, the U.S. House, or any governorship. The Democratic Party 's electoral success was attributed by some to running conservative - leaning Democrats against at - risk Republican incumbents, while others claimed that running more populist and progressive candidates was the source of success. Exit polling suggested that corruption was a key issue for many voters. In the 2006 Democratic caucus leadership elections, Democrats chose Representative Steny Hoyer of Maryland for House Majority Leader and nominated Representative Nancy Pelosi of California for speaker. Senate Democrats chose Harry Reid of Nevada for United States Senate Majority Leader. 
Pelosi was elected as the first female House speaker at the commencement of the 110th Congress. The House soon passed the measures that comprised the Democrats ' 100 - Hour Plan. The 2008 Democratic presidential primaries left two candidates in close competition: Illinois Senator Barack Obama and New York Senator Hillary Clinton. Both had won more support within a major American political party than any previous African American or female candidate. Before official ratification at the 2008 Democratic National Convention, Obama emerged as the party 's presumptive nominee. With President George W. Bush of the Republican Party ineligible for a third term and Vice President Dick Cheney not pursuing his party 's nomination, Senator John McCain of Arizona emerged more quickly as the GOP nominee. Throughout most of the 2008 general election, polls showed a close race between Obama and McCain. However, Obama maintained a small but widening lead over McCain in the wake of the liquidity crisis of September 2008. On November 4, Obama defeated McCain by a significant margin in the Electoral College; the party also made further gains in the Senate and House, adding to its 2006 gains. On January 20, 2009, Obama was inaugurated as the 44th president of the United States in a ceremony attended by nearly 2 million people, the largest congregation of spectators ever to witness the inauguration of a new president. That same day in Washington, D.C., Republican House of Representatives leaders met in an "invitation only '' meeting for four hours to discuss the future of the Republican Party under the Obama administration. During the meeting, they agreed to obstruct and block President Obama on all legislation, pledging to bring Congress to a standstill regardless of how much it would hurt the American economy. 
One of the first acts by the Obama administration after assuming control was an order signed by Chief of Staff Rahm Emanuel that suspended all pending federal regulations proposed by outgoing President George W. Bush so that they could be reviewed. This was comparable to prior moves by the Bush administration upon assuming control from Bill Clinton, who in his final 20 days in office issued 12 executive orders. In his first week Obama also established a policy of producing a weekly Saturday morning video address available on Whitehouse.gov and YouTube, much like those released during his transition period. The practice has been likened to Franklin Delano Roosevelt's fireside chats and George W. Bush's weekly radio addresses. President Obama signed into law the following significant legislation during his first 100 days in the White House: the Lilly Ledbetter Fair Pay Act of 2009, the Children's Health Insurance Reauthorization Act of 2009, and the American Recovery and Reinvestment Act of 2009. Also during his first 100 days, the Obama administration reversed the following significant George W. Bush administration policies: it supported the UN declaration on sexual orientation and gender identity, relaxed enforcement of cannabis laws, and lifted the 7½-year ban on federal funding for embryonic stem cell research. Obama also issued Executive Order 13492, ordering the closure of the Guantanamo Bay detention camp, although it has remained open throughout his presidency. He also lifted some travel and money restrictions to Cuba, ended the Mexico City Policy, and signed an order requiring the Army Field Manual to be used as the guide for terror interrogations, banning torture and other coercive techniques such as waterboarding. Obama also announced stricter guidelines regarding lobbyists in an effort to raise the ethical standards of the White House. The new policy bans aides from attempting to influence the administration for at least two years if they leave his staff.
It also bans aides on staff from working on matters they have previously lobbied on, or from approaching agencies that they targeted while on staff. The policy also includes a ban on gift-giving. However, one day later he nominated William J. Lynn III, a lobbyist for defense contractor Raytheon, for the position of Deputy Secretary of Defense. Obama later nominated William Corr, an anti-tobacco lobbyist, for Deputy Secretary of Health and Human Services. The beginning of Barack Obama's presidency saw the emergence of the Tea Party movement, a conservative movement that began to heavily influence the Republican Party, shifting the GOP further to the right and making it more partisan in its ideology. On February 18, 2009, Obama announced that the U.S. military presence in Afghanistan would be bolstered by 17,000 new troops by summer. The announcement followed the recommendation of several experts, including Defense Secretary Robert Gates, that additional troops be deployed to the strife-torn South Asian country. On February 27, 2009, Obama addressed Marines at Camp Lejeune, North Carolina, and outlined an exit strategy for the Iraq War. Obama promised to withdraw all combat troops from Iraq by August 31, 2010, and a "transitional force" of up to 50,000 counterterrorism, advisory, training, and support personnel by the end of 2011. Obama signed two presidential memoranda concerning energy independence, ordering the Department of Transportation to establish higher fuel efficiency standards before 2011 models were released and allowing states to raise their emissions standards above the national standard. Due to the economic crisis, the President enacted a pay freeze for senior White House staff making more than $100,000 per year. The action affected approximately 120 staffers and added up to about $443,000 in savings for the United States government.
On March 10, 2009, Barack Obama, in a meeting with the New Democrat Coalition, told them that he was a "New Democrat" and a "pro-growth Democrat" who "supports free and fair trade" and was "very concerned about a return to protectionism." On May 26, 2009, President Barack Obama nominated Sonia Sotomayor for Associate Justice of the Supreme Court of the United States. Sotomayor was confirmed by the Senate, becoming the highest-ranking government official of Puerto Rican heritage ever. On July 1, 2010, President Obama signed into law the Comprehensive Iran Sanctions, Accountability, and Divestment Act of 2010. On July 7, 2009, Al Franken was sworn into the Senate, giving Senate Democrats the 60 votes needed to overcome a Senate filibuster. On October 28, 2009, Barack Obama signed the National Defense Authorization Act for Fiscal Year 2010, which included the Matthew Shepard and James Byrd, Jr. Hate Crimes Prevention Act, expanding federal hate crime laws to include sexual orientation, gender identity, and disability. On January 21, 2010, the Supreme Court ruled in a 5–4 decision in Citizens United v. Federal Election Commission that the First Amendment prohibits the government from restricting independent political expenditures by a nonprofit corporation. On February 4, 2010, Republican Scott Brown of Massachusetts was sworn into the Senate, ending Senate Democrats' 60-vote filibuster-proof majority. On March 23, 2010, President Obama signed into law the signature legislation of his presidency, the Patient Protection and Affordable Care Act; together with the Health Care and Education Reconciliation Act of 2010, it represents the most significant regulatory overhaul of the U.S. healthcare system since the passage of Medicare and Medicaid in 1965. On May 10, 2010, President Barack Obama nominated Elena Kagan for Associate Justice of the Supreme Court of the United States.
On July 21, 2010, President Obama signed into law the Dodd–Frank Wall Street Reform and Consumer Protection Act, and Elena Kagan was confirmed by the United States Senate on August 5, 2010 by a 63–37 vote. Kagan was sworn in by Chief Justice John Roberts on August 7, 2010. On August 19, 2010, the 4th Stryker Brigade, 2nd Infantry Division became the last American combat brigade to withdraw from Iraq. In a speech from the Oval Office on August 31, 2010, Obama declared: "the American combat mission in Iraq has ended. Operation Iraqi Freedom is over, and the Iraqi people now have lead responsibility for the security of their country." About 50,000 American troops remained in the country in an advisory capacity as part of "Operation New Dawn," which ran until the end of 2011. New Dawn was the final designated U.S. campaign of the war. The U.S. military continued to train and advise the Iraqi forces, as well as participate in combat alongside them. In the November 2, 2010 midterm elections, the Democratic Party suffered a net loss of six seats in the United States Senate and 63 seats in the United States House of Representatives. Control of the House of Representatives switched from the Democratic Party to the Republican Party. The Democrats lost a net of six state governorships and a net 680 seats in state legislatures. The Democrats lost control of seven state senates and 13 state houses. This was the worst performance by the Democratic Party in a national election since the 1946 elections. The Blue Dog Coalition's numbers in the House were reduced from 54 members in 2008 to 26 members in 2011; they accounted for half of the Democratic defeats in the election. This was the first United States national election in which Super PACs were used by Democrats and Republicans.
Many commentators attribute the electoral success of the Republican Party in 2010 to conservative Super PACs' campaign spending, the Tea Party movement, backlash against President Barack Obama, the failure to mobilize the Obama coalition to get out and vote, and the failure of President Obama to enact many of his progressive and liberal campaign promises. On December 1, 2009, Obama announced at the U.S. Military Academy at West Point that the U.S. would send 30,000 more troops to Afghanistan. Antiwar organizations in the U.S. responded quickly, and cities throughout the U.S. saw protests on December 2. Many protesters compared the decision to deploy more troops in Afghanistan to the expansion of the Vietnam War under the Johnson administration. During the lame-duck session of the 111th United States Congress, President Obama signed into law the following significant legislation: the Tax Relief, Unemployment Insurance Reauthorization, and Job Creation Act of 2010; the Don't Ask, Don't Tell Repeal Act of 2010; the James Zadroga 9/11 Health and Compensation Act of 2010; the Shark Conservation Act of 2010; and the FDA Food Safety Modernization Act. On December 18, 2010, the Arab Spring began. On December 22, 2010, the U.S. Senate gave its advice and consent to ratification of New START by a vote of 71 to 26 on the resolution of ratification. The 111th United States Congress has been considered one of the most productive Congresses in terms of legislation passed since the 89th Congress, during Lyndon Johnson's Great Society. On February 23, 2011, United States Attorney General Eric Holder announced that the United States federal government would no longer defend the Defense of Marriage Act in federal courts. In response to the First Libyan Civil War, Secretary of State Hillary Clinton joined with U.N.
Ambassador Susan Rice and Office of Multilateral and Human Rights Director Samantha Power in leading the hawkish diplomatic team within the Obama administration that helped convince President Obama to launch airstrikes against the Libyan government. On March 19, 2011, the United States began its military intervention in Libya. Domestic reaction in the United States to the 2011 military intervention in Libya was mixed within the Democratic Party. Opponents of the intervention within the Democratic Party included Rep. Dennis Kucinich, Sen. Jim Webb, Rep. Raul Grijalva, Rep. Mike Honda, Rep. Lynn Woolsey, and Rep. Barbara Lee. The Congressional Progressive Caucus (CPC), an organization of progressive Democrats, said that the United States should conclude its campaign against Libyan air defenses as soon as possible. Supporters of the intervention within the Democratic Party included former President Bill Clinton, Sen. Carl Levin, Sen. Dick Durbin, Sen. Jack Reed, Sen. John Kerry, Minority Leader of the House of Representatives Nancy Pelosi, Legal Adviser of the Department of State Harold Hongju Koh, and Ed Schultz. On April 5, 2011, Vice President Joe Biden announced that Debbie Wasserman Schultz was President Barack Obama's choice to succeed Tim Kaine as the 52nd Chair of the Democratic National Committee. On May 26, 2011, President Obama signed the PATRIOT Sunsets Extension Act of 2011, which was strongly criticized by some in the Democratic Party as a violation of civil liberties and a continuation of George W. Bush administration policies. House Democrats largely opposed the PATRIOT Sunsets Extension Act of 2011, while Senate Democrats were slightly in favor of it.
On October 21, 2011, President Obama signed into law three United States free trade agreements: the free trade agreement between the United States of America and the Republic of Korea, the Panama–United States Trade Promotion Agreement, and the United States–Colombia Free Trade Agreement. In the House of Representatives, Democratic representatives largely opposed these agreements, while Senate Democrats were split on them. This was a continuation of President Bill Clinton's policy of support for free trade agreements. When asked by David Gregory about his views on same-sex marriage on Meet the Press on May 5, 2012, Biden stated that he supported same-sex marriage. On May 9, 2012, a day after North Carolina voters approved Amendment 1, President Barack Obama became the first sitting United States president to come out in favor of same-sex marriage. The 2012 Democratic Party platform for Obama's reelection ran over 26,000 words and included his position on numerous national issues. On security issues, it pledges an "unshakable commitment to Israel's security" and says the party will try to prevent Iran from acquiring a nuclear weapon. It calls for a strong military, but argues that in the current fiscal environment, tough budgetary decisions must include defense spending. On controversial social issues it supports abortion rights and same-sex marriage, and says the party is "strongly committed to enacting comprehensive immigration reform." On the economic side, the platform calls for extending the tax cuts for families earning under $250,000 and promises not to raise their taxes. It praises the Patient Protection and Affordable Care Act ("Obamacare," though it does not use that term) and adamantly opposes "any efforts to privatize Medicare." On the rules of politics, it attacks the recent Supreme Court decision in Citizens United v. Federal Election Commission, which allows much greater political spending.
It demands "immediate action to curb the influence of lobbyists and special interests on our political institutions ''. Intense budget negotiations in the divided 112th Congress, wherein Democrats resolved to fight Republican demands for decreased spending and no tax hikes, threatened to shut down the government in April 2011, and later spurred fears that the United States would default on its debt. Continuing tight budgets were felt at the state level, where public - sector unions, a key Democratic constituency, battled Republican efforts to limit their collective bargaining powers in order to save money and reduce union power. This led to sustained protests by public - sector employees and walkouts by sympathetic Democratic legislators in states like Wisconsin and Ohio. The 2011 "Occupy movement, '' a campaign on the left for more accountable economic leadership, failed to have the impact on Democratic Party leadership and policy that the Tea Party movement had on the Republicans. Its leadership proved ineffective and the Occupy movement fizzled out. However echoes could be found in the presidential nomination campaign of Senator Bernie Sanders in 2015 - 16. Conservatives criticized the president for "passive '' responses to crises such as the 2009 Iranian protests and the 2011 Egyptian revolution. Additionally, liberal and Democratic activists objected to Obama 's decisions to send reinforcements to Afghanistan, resume military trials of terror suspects at Guantanamo Bay, and to help enforce a no - fly zone over Libya during that country 's civil war. But the demands of anti-war advocates were heeded when Obama followed through on a campaign promise to withdraw combat troops from Iraq. The 2012 election was characterized by very high spending, especially on negative television ads in about ten critical states. Despite a weak economic recovery and high unemployment, the Obama campaign successfully mobilized its coalition of youth, blacks, Hispanics and women. 
The campaign carried all the same states as in 2008 except two, Indiana and North Carolina. The election continued the pattern whereby Democrats won the most votes in every presidential election after 1988, except for 2004. Obama and the Democrats lost control of the Senate in the 2014 midterm elections, losing nine seats in that body and 13 seats in the Republican-controlled House. National polling from 2013 to the summer of 2015 showed Hillary Clinton with a commanding lead over all of her potential primary opponents. Her main challenger was Independent Vermont Senator Bernie Sanders, whose rallies grew larger and larger as he attracted overwhelming majorities among Democrats under age 40. The sharp divide between the two candidates was the establishment versus the political outsider, with Clinton the establishment candidate and Sanders the outsider. Clinton received endorsements from an overwhelming majority of officeholders. Clinton's core base during the primary consisted of women, African Americans, Latino Americans, LGBT voters, moderates, and older voters, while Sanders' core base included voters under age 40, men, and progressives. The ideological differences between the two candidates represented the ideological divide within the Democratic Party as a whole. Clinton, who cast herself as both a moderate and a progressive, is ideologically more of a centrist, representing the Bill Clinton and Barack Obama Third Way New Democrat wing of the Democratic Party. Bernie Sanders, who remained an Independent in the Senate throughout the primaries (despite running for president as a Democrat), is a self-described democratic socialist and is ideologically more of a progressive, representing the Elizabeth Warren populist wing of the Democratic Party.
During the primaries, Bernie Sanders attacked Hillary Clinton for her ties to Wall Street and her previous support of the Defense of Marriage Act, the Trans-Pacific Partnership, the North American Free Trade Agreement, the Keystone Pipeline, the 2011 military intervention in Libya, and the Iraq War, while Clinton attacked Sanders for voting against the Brady Handgun Violence Prevention Act, the Commodity Futures Modernization Act of 2000, the Protection of Lawful Commerce in Arms Act, and the Comprehensive Immigration Reform Act of 2007. Clinton generally moved left and adopted variations of some of Sanders' themes, such as trade and college tuition. Although she was favored in the polls, she lost the general election to Trump in November, despite winning the popular vote. On January 12, 2017, the National Democratic Redistricting Committee was launched, a 527 organization that focuses on redistricting reform and is affiliated with the Democratic Party. The chair, president, and vice president of the umbrella organization are the 82nd Attorney General Eric Holder, Elizabeth Pearson, and Alixandria "Ali" Lapp, respectively. President Barack Obama has said he would be involved with the committee. At the inauguration of Donald Trump, 67 Democratic members of the United States House of Representatives boycotted the inauguration. This was the largest boycott by members of the United States Congress since the second inauguration of Richard Nixon, when it was estimated that between 80 and 200 Democratic members of the United States Congress boycotted. On January 23, 2017, Justice Democrats, a political action committee, was created by Cenk Uygur of The Young Turks, Kyle Kulinski of Secular Talk, Saikat Chakrabarti, and Zack Exley (the latter two former leadership of the 2016 Bernie Sanders presidential campaign).
The organization, formed as a result of the 2016 United States presidential election, has a stated goal of reforming the Democratic Party by running "a unified campaign to replace every corporate-backed member of Congress and rebuild the (Democratic) party from scratch" starting in the 2018 Congressional midterms. On January 17, 2017, Third Way, a public policy think tank, launched New Blue, a $20 million campaign to study Democratic shortcomings in the 2016 United States elections and offer a new economic agenda to help Democrats reconnect with the voters who have abandoned the party. The money will be spent to conduct extensive research, reporting, and polling in Rust Belt states that once formed a Blue Wall but which voted for President Donald Trump in 2016. Many progressives have criticized this as a desperate measure by the so-called establishment wing of the party to retain leadership. The 2017 Democratic National Committee chairmanship election was characterized primarily as a contest between two candidates: United States Representative for Minnesota's 5th congressional district Keith Ellison and 26th United States Secretary of Labor Tom Perez. On February 25, 2017, Perez won the Democratic National Committee chairmanship and named Keith Ellison as Deputy Chair of the Democratic National Committee, a newly created position. The Obama administration had pushed for Tom Perez to run against Keith Ellison, and President Barack Obama personally called DNC members to vote for Perez. On May 15, 2017, Hillary Clinton launched Onward Together, a political action organization with a stated mission to "advance progressive values and work to build a brighter future for generations to come."
US politics: Campaign textbooks The national committees of major parties published a "campaign textbook" every presidential election from about 1856 to about 1932.
They were designed for speakers and contain statistics, speeches, summaries of legislation, and documents, with plenty of argumentation. Only large academic libraries have them, but some are online:
who sang the original i can't live without you
I Can't Live Without You - wikipedia "I Can't Live Without You" is a song by Liechtensteiner producer Al Walser. The song was nominated for Best Dance Recording at the 2013 Grammy Awards. It lost to "Bangarang" by Skrillex & Sirah. The song's nomination for a Grammy Award caused controversy. Philip Sherburne of Spin noted Walser's lack of notability compared to Avicii, Calvin Harris, Skrillex, and Swedish House Mafia, who were also nominated for the same award, writing that the song's "clunky rock / trance fusion and low-budget video make Rebecca Black's 'Friday' sound and look cutting-edge in comparison". Bill Freimuth, the vice president of the Recording Academy, which is responsible for the Grammy Awards, told MTV that Walser "was a very active marketer of his work, and got his music out to lots and lots of our voting membership, and they chose to vote for it." He noted that the ballot is audited by Deloitte and said "they found nothing really anomalous or wrong with the votes surrounding this nomination." Speculation surrounding the nomination included Walser's membership in the Recording Academy. Walser stated his position as a voter "was only helpful to the extent that I had access to the other voters, and they could see what I was doing on a regular basis". Billboard compared the incident to Linda Chorney, who gained a Grammy Award nomination the previous year through a similar tactic.
who played nicholas alamain on days of our lives
Lawrence Alamain - wikipedia Lawrence Alamain is a fictional character on NBC's daytime drama Days of Our Lives, portrayed by Michael Sabatino from September 11, 1990 to October 18, 1993, and in several guest appearances from October 2, 2009 to February 26, 2010, on November 4, 2010, and from May 13, 2011 to August 3, 2011. Not much is known about Lawrence's parents Leopold and Philomena, aside from the fact that they were part of an old and powerful European dynasty, part of the nobility of a fictional European country never really named onscreen but alternately referred to in the soap opera media as "Alamania" (which is incidentally very similar to an old Latin word for "Germany") and "Ubilam" ("Malibu" spelled backwards). The few times the country was shown onscreen, it bore a striking resemblance to southern California. The Alamains were also very active in business circles, and when Lawrence first appeared on the show, he was the head of a powerful international oil conglomerate. He later demonstrated an uncanny ability to retrieve highly secretive and classified information, such as the cure to biological warfare being used against Allied intelligence agents throughout the world, and the existence of business magnate and ex-mobster Victor Kiriakis's "John Black file," which also contained information about the government's and supervillain Stefano DiMera's search for some stolen Mayan codices of purportedly high scientific and technological value. Lawrence's mother Philomena was later retconned to be the sister of Stefano's common-law wife, Daphne. By the time the family was brought onto the show, Philomena had already died, and Leopold (portrayed by longtime character actor Avery Schreiber) was presented as a likeable, fun-loving eccentric, a far cry from his diabolical son Lawrence, before he was killed in an earthquake generated by one of Lawrence's more outlandish plots.
It was later revealed that Lawrence was once as kind-hearted as his father, but when his younger adopted brother Forrest was removed from the family canvas via a staged "drowning" event, his family medicated him to the point where he was unable to feel any guilt or remorse about any of his actions. Lawrence was born to Leopold and Philomena Alamain, and is the adoptive brother of John Black. Lawrence, the scion of the powerful European Alamain family, was introduced as the man Katerina von Leuschner (known to Salemites as Carly Manning) was supposed to wed in an arranged marriage. Carly's best friend, Jennifer Horton, masqueraded as Carly in order to invalidate the marriage in 1990, believing Lawrence had no idea who Carly was. Though Lawrence went through with the wedding, it was revealed that he knew all along Jennifer wasn't Katerina, as he (masquerading as a man named James) had an affair with Katerina (masquerading as Carly) when they were teenagers, conceiving a child, Nicholas Alamain, who was raised by Lawrence's aunt Vivian Alamain. Most of Lawrence's time on the show was marked by his triangle with Carly and her boyfriend, the heroic Bo Brady. Though Bo was a far better man than the often dastardly Lawrence, Carly eventually couldn't deny her love for Lawrence, and the two were married and left town in 1993 with their son, Nicholas. They were not seen again until 2009, when Carly killed Lawrence in self-defense for threatening the life of Melanie Jonas, a child she'd conceived in an affair with a colleague, Daniel Jonas. Though he died in 2009, Lawrence has appeared numerous times since then as a ghost to both Carly and his aunt Vivian.
who plays phyllis on young and the restless
Gina Tognoni - wikipedia Gina Tognoni /toʊnˈjoʊni/ (born November 28, 1973) is an American actress, best known for her work in American daytime soap operas. Her most notable performances include Kelly Cramer on One Life to Live and Dinah Marler on Guiding Light. She is currently starring as Phyllis Summers on The Young and the Restless. Tognoni is known for her roles on daytime television soap operas. In 1995, she made her soap debut as Kelly Cramer on ABC's One Life to Live. She appeared on the show until 2001. In April 2001, it was announced that Tognoni, after six years with the show, had decided not to renew her contract with the series. Tognoni returned to daytime television as Dinah Marler on CBS's Guiding Light in July 2004, and won her first Daytime Emmy Award for Outstanding Supporting Actress in a Drama Series in 2006 for her work on the show. In 2007, she was nominated again in the same category but lost to Genie Francis. She won her second Emmy in 2008, becoming only the second actress to be a repeat winner in the Supporting Actress category. In 2009, after CBS canceled Guiding Light, Tognoni returned to her role as Kelly Cramer on One Life to Live. It was also reported that Tognoni was in talks with several soaps for her post-Guiding Light gig. Following the April 2011 cancellation of One Life to Live and All My Children, it was announced that Tognoni and co-star Tom Degnan, who was portraying Joey Buchanan at the time, would exit the series before its January 2012 finale. In addition to her career on soaps, Tognoni had guest-starring roles in Law & Order: Special Victims Unit and The Sopranos, and co-starred in a number of television shows and independent films. On May 22, 2014, it was announced that Tognoni had been cast as Phyllis Summers on CBS's The Young and the Restless, replacing Michelle Stafford. Tognoni made her debut on August 11, 2014. Tognoni was born in St. Louis, Missouri.
She won Miss Rhode Island Teen USA in 1991 and Miss Rhode Island Teen All-American in 1993. Tognoni became engaged to Joseph Chiarello on November 21, 2007. She was introduced to him by her co-star Beth Ehlers. The two were married on May 16, 2009.
what does a map sensor do for a car
MAP sensor - wikipedia The manifold absolute pressure sensor (MAP sensor) is one of the sensors used in an internal combustion engine's electronic control system. Engines that use a MAP sensor are typically fuel injected. The manifold absolute pressure sensor provides instantaneous manifold pressure information to the engine's electronic control unit (ECU). The data is used to calculate air density and determine the engine's air mass flow rate, which in turn determines the required fuel metering for optimum combustion (see stoichiometry) and influences the advance or retard of ignition timing. A fuel-injected engine may alternatively use a mass airflow sensor (MAF sensor) to detect the intake airflow. A typical naturally aspirated engine configuration employs one or the other, whereas forced induction engines typically use both: a MAF sensor on the intake tract pre-turbo and a MAP sensor on the charge pipe leading to the throttle body. MAP sensor data can be converted to air mass data using the speed-density method. Engine speed (RPM) and air temperature are also necessary to complete the speed-density calculation. The MAP sensor can also be used in OBD II (on-board diagnostics) applications to test the EGR (exhaust gas recirculation) valve for functionality, an application typical in OBD II equipped General Motors engines. The following example assumes the same engine speed and air temperature. The engine requires the same mass of fuel in both conditions because the mass of air entering the cylinders is the same. If the throttle is opened all the way in condition 2, the manifold absolute pressure will increase from 50 kPa to nearly 100 kPa (14.5 psi, 29.53 inHg), about equal to the local barometer, which in condition 2 is sea level. The higher absolute pressure in the intake manifold increases the air's density, and in turn more fuel can be burned, resulting in higher output.
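The speed-density method described above can be sketched in a few lines of code. This is a minimal illustration only, not any manufacturer's actual ECU logic: the function name, the example engine figures, and the volumetric-efficiency value are all assumptions made for the example. It combines the ideal gas law (density from absolute pressure and temperature) with the fact that a four-stroke engine ingests roughly its displacement once every two revolutions.

```python
# Speed-density estimate of engine air mass flow from MAP sensor data.
# All names and numbers are illustrative assumptions, not ECU values.

R_AIR = 287.05  # specific gas constant for dry air, J/(kg*K)

def air_mass_flow(map_kpa, intake_temp_c, rpm, displacement_l, ve=0.85):
    """Estimate air mass flow (g/s) for a four-stroke engine.

    map_kpa: manifold absolute pressure in kPa (from the MAP sensor)
    intake_temp_c: intake air temperature in degrees Celsius
    rpm: engine speed in revolutions per minute
    displacement_l: engine displacement in litres
    ve: assumed volumetric efficiency (dimensionless)
    """
    pressure_pa = map_kpa * 1000.0
    temp_k = intake_temp_c + 273.15
    # Ideal gas law: air density in kg/m^3
    density = pressure_pa / (R_AIR * temp_k)
    # A four-stroke engine ingests its displacement once per two revolutions
    volume_flow_m3s = (displacement_l / 1000.0) * (rpm / 60.0) / 2.0 * ve
    return density * volume_flow_m3s * 1000.0  # kg/s -> g/s

# The two conditions from the text: 50 kPa part throttle vs. ~100 kPa
# wide-open throttle at sea level, same speed and temperature.
part_throttle = air_mass_flow(50, 25, 1800, 2.0)
full_throttle = air_mass_flow(100, 25, 1800, 2.0)
```

Because density is directly proportional to absolute pressure, doubling MAP at the same speed and temperature doubles the estimated air mass flow, which is exactly why the wide-open-throttle condition supports burning more fuel.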
Another example is varying RPM and engine load: where an engine may have 60 kPa of manifold pressure at 1800 RPM in an unloaded condition, introducing load with a further throttle opening will change the final manifold pressure to 100 kPa; the engine will still be at 1800 RPM, but its loading will require a different spark and fueling delivery. Engine vacuum is the difference between the pressure in the intake manifold and ambient atmospheric pressure. Engine vacuum is a "gauge" pressure, since gauges by nature measure a pressure difference, not an absolute pressure. The engine fundamentally responds to air mass, not vacuum, and absolute pressure is necessary to calculate mass. The mass of air entering the engine is directly proportional to the air density, which is proportional to the absolute pressure and inversely proportional to the absolute temperature. Note: carburetors are largely dependent on air volume flow and vacuum, and neither directly infers mass. Consequently, carburetors are precise, but not accurate, fuel metering devices. Carburetors were replaced by more accurate fuel metering methods, such as fuel injection in combination with an air mass flow sensor (MAF). With OBD II standards, vehicle manufacturers were required to test the exhaust gas recirculation (EGR) valve for functionality during driving. Some manufacturers use the MAP sensor to accomplish this. These vehicles have a MAF sensor as their primary load sensor, and the MAP sensor is then used for rationality checks and to test the EGR valve. During a deceleration of the vehicle, when there is low absolute pressure in the intake manifold (i.e., a high vacuum present in the intake manifold relative to the outside air), the powertrain control module (PCM) will open the EGR valve and then monitor the MAP sensor's values. If the EGR is functioning properly, the manifold absolute pressure will increase as exhaust gases enter. MAP sensors measure absolute pressure.
Boost sensors or gauges measure the amount of pressure above a set absolute pressure, usually 100 kPa. This is commonly referred to as gauge pressure. Boost pressure is relative to absolute pressure - as one increases or decreases, so does the other. It is a one - to - one relationship with an offset of - 100 kPa for boost pressure. Thus a MAP sensor will always read 100 kPa more than a boost sensor measuring the same conditions. A MAP sensor will never display a negative reading because it is measuring absolute pressure, where zero is the total absence of pressure. Vacuum is measured as a negative pressure relative to normal atmospheric pressure. Vacuum / boost sensors can display negative readings, indicating vacuum or suction (a condition of lower pressure than the surrounding atmosphere). In forced induction engines (supercharged or turbocharged), a negative boost reading indicates that the engine is drawing air faster than it is being supplied, creating suction. The suction is caused by throttling in spark ignition engines and is not present in diesel engines. This is often called vacuum pressure when referring to internal combustion engines. In short, most boost sensors will read 100 kPa less than a MAP sensor reads. One can convert boost to MAP by adding 100 kPa, and MAP to boost by subtracting 100 kPa.
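The 100 kPa offset between the two readings amounts to a pair of one - line conversions. A minimal sketch using the fixed 100 kPa reference stated in the text (a real boost gauge references local atmospheric pressure, which varies with altitude and weather):

```python
# Convert between absolute (MAP) and gauge (boost) pressure, assuming the
# fixed 100 kPa reference used in the text; real gauges track the local barometer.
REFERENCE_KPA = 100.0

def map_to_boost(map_kpa):
    """Absolute manifold pressure -> boost (gauge) pressure."""
    return map_kpa - REFERENCE_KPA

def boost_to_map(boost_kpa):
    """Boost (gauge) pressure -> absolute manifold pressure."""
    return boost_kpa + REFERENCE_KPA

print(map_to_boost(150.0))  # 50.0 kPa of boost (forced induction)
print(map_to_boost(70.0))   # -30.0, i.e. 30 kPa of vacuum at part throttle
print(boost_to_map(-30.0))  # 70.0 kPa absolute
```

Note that `map_to_boost` can go negative (vacuum), while a MAP value itself never can, matching the distinction drawn above.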
what year did they start putting radios in cars
Vehicle audio - wikipedia Vehicle audio is equipment installed in a car or other vehicle to provide in - car entertainment and information for the vehicle occupants. Until the 1950s it consisted of a simple AM radio. Additions since then have included FM radio (1952), 8 - track tape players, cassette players, CD players (1984), DVD players, Blu - ray players, navigation systems, Bluetooth telephone integration, and smartphone controllers like CarPlay and Android Auto. Once controlled from the dashboard with a few buttons, they can now be controlled by steering wheel controls and voice commands. Initially implemented for listening to music and radio, vehicle audio is now part of car telematics, telecommunication, in - vehicle security, handsfree calling, navigation, and remote diagnostics systems. It is also used to create fake engine noise. For the 2015 Ford Mustang EcoBoost, an "Active Noise Control '' system was developed that amplifies the engine sound through the car speakers. A similar system is used in the F - 150 pickup truck. Volkswagen uses a Soundaktor, a special speaker, to play sounds in cars such as the Golf GTi and Beetle Turbo. BMW plays a recorded sample of its motors through the car speakers, using different samples according to the engine 's load and power. In 1904, well before commercially viable technology for mobile radio was in place, American inventor and self - described "Father of Radio '' Lee de Forest demonstrated a car radio at the 1904 Louisiana Purchase Exposition in St. Louis. Around 1920, vacuum tube technology had matured to the point where the availability of radio receivers made radio broadcasting viable. A technical challenge was that the vacuum tubes in the radio receivers required 50 to 250 volts direct current, but car batteries ran at 6 V.
Voltage was stepped up with a vibrator that provided a pulsating DC, which could be converted to a higher voltage with a transformer, rectified, and filtered to create higher - voltage DC. In 1930, the American Galvin Manufacturing Corporation marketed a Motorola - branded radio receiver for $130. It was expensive: the contemporary Ford Model A cost $540. A Plymouth sedan, "wired for Philco Transitone radio without extra cost, '' is advertised in Ladies ' Home Journal in 1931. In 1932 in Germany the Blaupunkt AS 5 medium wave and longwave radio was marketed for 465 Reichsmark, about one third of the price of a small car. Because it took nearly 10 litres of space, it could not be located near the driver, and was operated via a steering wheel remote control. In 1933 Crossley Motors offered a factory - fitted car radio. By the late 1930s, push button AM radios were considered a standard feature. In 1946 there were an estimated 9 million AM car radios in use. An FM receiver was offered by Blaupunkt in 1952. In 1953, Becker introduced the AM / FM Becker Mexico with a Variometer tuner, basically a station - search or scan function. In April 1955, the Chrysler Corporation announced that it was offering a Mopar model 914HR branded Philco all - transistor car radio as a $150 option for its 1956 Chrysler and Imperial car models. Chrysler discontinued the all - transistor car radio option at the end of 1956 because it was too expensive, and replaced it with a cheaper hybrid (transistors and low voltage vacuum tubes) car radio for its new 1957 car models. In 1963 Becker introduced the Monte Carlo, a tubeless solid state radio with no vacuum tubes. In 1964 Philips launched the Compact Cassette, and in 1965 Ford and Motorola jointly introduced the 8 - track tape in - car tape player. In subsequent years cassettes supplanted the 8 - track, and improved with longer play times, better tape quality, auto - reverse, and Dolby noise reduction.
They were popular throughout the 1970s and ' 80s. While the CD had been on the market since 1982, it was in 1984 that Pioneer introduced the CDX - 1, the world 's first car CD player. It was known for its improved sound quality, instant track skipping, and the format 's increased durability over cassette tapes. Car CD changers, which allowed drivers and passengers to load up to 10 CDs at a time, started to gain popularity in the late 1980s and continued to do so throughout the 1990s. Stock and aftermarket compact disc players began appearing in the late 1980s, competing with the cassette. The first car with an OEM CD player was the 1987 Lincoln Town Car, and the last new cars in the American market to be factory - equipped with a cassette deck in the dashboard were the 2010 Lexus SC430 and the Ford Crown Victoria. From 1974 to 2005 the Autofahrer - Rundfunk - Informationssystem was used by the German ARD network. Developed jointly by the Institut für Rundfunktechnik and Blaupunkt, it indicated the presence of traffic announcements through manipulation of the 57 kHz subcarrier of the station 's FM signal. ARI was replaced by the Radio Data System. In the 2010s new ways to play music came into competition with the CD and FM radio, such as internet radio, satellite radio, USB and Bluetooth connections, and in - dash slots for memory cards. The automobile head unit became increasingly important as a housing for front and backup dashcams, navigation systems, and operating systems with multiple functions, such as Android Auto, CarPlay and MirrorLink. A stock system is the OEM application that the vehicle 's manufacturer specified to be installed when the car was built. Aftermarket components can also be used. Amplifiers increase the power level of audio signals. Some head units have built - in stereo amplifiers. Other car audio systems use a separate stand - alone amplifier.
Every amplifier has a rated power level, sometimes noted on the head unit with the built - in amplifier, or on the label of a stand - alone unit. Extremely loud sound systems in automobiles may violate the noise ordinance of some municipalities, some of which have outlawed them. In 2002 the U.S. Department of Justice issued a guide to police officers on how to deal with problems associated with loud audio systems in cars. (Image gallery: the 1955 Chrysler Mopar 914HR / Philco model C - 5690, the world 's first all - transistor car radio; a 1950s Philips car radio using both transistors and valves; the GM Delco transistorized "hybrid '' (vacuum tubes and transistors) radio, first offered as an option on the 1956 Chevrolet Corvette; a car stereo head unit in a dashboard; a 1942 Lincoln Continental Cabriolet radio; the dashboard of a VW Hebmüller with Telefunken radio (1949 / 50); a 1964 Mercedes - Benz W110 190c dashboard with original FM Blaupunkt "Frankfurt '' head unit; a Blaupunkt Köln radio in a 1958 Ford Taunus 17M P2 deLuxe; a 1990 Ford Sierra CLX radio - cassette head unit in a dashboard with cassette storage; a 1978 AMC Matador sedan factory AM - FM - stereo - 8 - track unit with an album by The Blues Brothers; a set of speaker drivers removed from a passenger vehicle; a car audio amplifier; two 10 - inch subwoofers in the trunk of a car.) As technology keeps evolving, head units are now paired with the climate control system and other essentials. They are now equipped with anti-theft systems for protection purposes.
who is the youngest person on the supreme court
Demographics of the Supreme Court of the United States - wikipedia The demographics of the Supreme Court of the United States encompass the gender, ethnicity, and religious, geographic, and economic backgrounds of the 113 people who have been appointed and confirmed as justices to the Supreme Court. Some of these characteristics have been raised as an issue since the Court was established in 1789. For its first 180 years, justices were almost always white male Protestants. Prior to the 20th century, a few Roman Catholics were appointed, but concerns about diversity of the Court were mainly in terms of geographic diversity, to represent all geographic regions of the country, as opposed to ethnic, religious, or gender diversity. The 20th century saw the first appointment of justices who were Jewish (Louis Brandeis, 1916), African - American (Thurgood Marshall, 1967), female (Sandra Day O'Connor, 1981), and Italian - American (Antonin Scalia, 1986). The 21st century saw the first appointment of a Hispanic justice (Sonia Sotomayor, 2009), if justice Benjamin Cardozo, who was a Sephardi Jew of Portuguese descent and appointed in 1932, is excluded. In spite of the interest in the Court 's demographics and the symbolism accompanying the inevitably political appointment process, and the views of some commentators that no demographic considerations should arise in the selection process, the gender, race, educational background or religious views of the justices have played little role in their jurisprudence. For example, the opinions of the two African - American justices have reflected radically different judicial philosophies; William Brennan and Antonin Scalia shared a Catholic faith and a Harvard Law School education, but shared little in the way of jurisprudential philosophies. The court 's first two female justices voted together no more often than with their male colleagues, and historian Thomas R.
Marshall writes that no particular "female perspective '' can be discerned from their opinions. For most of the existence of the Court, geographic diversity was a key concern of presidents in choosing justices to appoint. This was prompted in part by the early practice of Supreme Court justices also "riding circuit '' -- individually hearing cases in different regions of the country. In 1789, the United States was divided into judicial circuits, and from that time until 1891, Supreme Court justices also acted as judges within those individual circuits. George Washington was careful to make appointments "with no two justices serving at the same time hailing from the same state ''. Abraham Lincoln broke with this tradition during the Civil War, and "by the late 1880s presidents disregarded it with increasing frequency ''. Although the importance of regionalism declined, it still arose from time to time. For example, in appointing Benjamin Cardozo in 1932, President Hoover was as concerned about the controversy over having three New York justices on the Court as he was about having two Jewish justices. David M. O'Brien notes that "(f)rom the appointment of John Rutledge from South Carolina in 1789 until the retirement of Hugo Black (from Alabama) in 1971, with the exception of the Reconstruction decade of 1866 -- 1876, there was always a southerner on the bench. Until 1867, the sixth seat was reserved as the ' southern seat '. Until Cardozo 's appointment in 1932, the third seat was reserved for New Englanders. '' The westward expansion of the U.S. led to concerns that the western states should be represented on the Court as well, which purportedly prompted William Howard Taft to make his 1910 appointment of Willis Van Devanter of Wyoming. Geographic balance was sought in the 1970s, when Nixon attempted to employ a "Southern strategy '', hoping to secure support from Southern states by nominating judges from the region.
Nixon unsuccessfully nominated Southerners Clement Haynsworth of South Carolina and G. Harrold Carswell of Georgia, before finally succeeding with the nomination of Harry Blackmun of Minnesota. The issue of regional diversity was again raised with the 2010 retirement of John Paul Stevens, who had been appointed from the midwestern Seventh Circuit, leaving the Court with all but one Justice having been appointed from states on the East Coast. As of 2017, the Court has a majority from the Northeastern United States, with six justices coming from states to the north and east of Washington, D.C. including four justices born or raised in New York City. The remaining three justices come from Georgia, California and Colorado; the most recent justice from the Midwest being John Paul Stevens of Illinois who retired in 2010. Contemporary Justices may be associated with multiple states. Many nominees are appointed while serving in states or districts other than their hometown or home state. Chief Justice John Roberts, for example, was born in New York, but moved to Indiana at the age of five, where he grew up. After law school, Roberts worked in Washington, D.C. while living in Maryland. Thus, three states may claim his domicile. Despite the efforts to achieve geographic balance, only seven justices have ever hailed from states admitted after or during the Civil War. Nineteen states have never produced a Supreme Court Justice; in chronological order of admission to the Union these are: In contrast, some states have been over-represented, partly because there were fewer states from which early justices could be appointed. New York has produced fifteen justices, Ohio ten, Massachusetts nine, Virginia eight, six each from Pennsylvania and Tennessee, and five from Kentucky, Maryland, and New Jersey. A handful of justices were born outside the United States, mostly from among the earliest justices on the Court. 
These included James Wilson, born in Fife, Scotland; James Iredell, born in Lewes, England; and William Paterson, born in County Antrim, Ireland. Justice David Josiah Brewer was born farthest from the U.S., in Smyrna, in the Ottoman Empire (now İzmir, Turkey). George Sutherland was born in Buckinghamshire, England. The last foreign - born Justice, and the only one of these for whom English was a second language, was Felix Frankfurter, born in Vienna, Austria. The Constitution imposes no citizenship requirement on federal judges. All Supreme Court justices were white and of European heritage until the appointment of Thurgood Marshall, the first African American Justice, in 1967. Since then, only two other non-white Justices have been appointed: Marshall 's African - American successor, Clarence Thomas, in 1991, and Sonia Sotomayor in 2009. There have been six foreign - born justices in the Court 's history: James Wilson (1789 - 1798), born in Carskerdo, Fife, Scotland; James Iredell (1790 - 1799), born in Lewes, England; William Paterson (1793 - 1806), born in County Antrim, Ireland; David Brewer (1889 - 1910), born to American missionary parents in Smyrna, Ottoman Empire (now İzmir, Turkey); George Sutherland (1922 - 1939), born in Buckinghamshire, England; and Felix Frankfurter (1939 - 1962), born in Vienna, Austria. The vast majority of white justices have been of Northern European, Northwestern European, or Germanic Protestant descent. Up until the 1980s, only six justices of "central, eastern, or southern European derivation '' had been appointed, and even among these six justices, five of them "were of Germanic background, which includes Austrian, German - Bohemian, and Swiss origins (John Catron, Samuel F. Miller, Louis Brandeis, Felix Frankfurter, and Warren Burger) '' while only one justice was of non-Germanic, Southern European descent (Benjamin N. Cardozo, of Iberian descent).
Cardozo, appointed to the Court in 1932, was the first justice known to have non-Germanic or non-Anglo - Saxon ancestry and the first justice of Southern European descent. Both of Justice Cardozo 's parents descended from Sephardic Jews from the Iberian Peninsula who fled to Holland during the Spanish Inquisition, then to London, before arriving in New York prior to the American Revolution. Justice Antonin Scalia, who served from 1986 to 2016, and Justice Samuel Alito, who has served since 2006, are the first justices of Italian descent to be appointed to the Supreme Court. Justice Scalia 's father and both maternal grandparents as well as both of Justice Alito 's parents were born in Italy. Justice Ruth Bader Ginsburg was born to a father who immigrated from Russia at age 13 and a mother who was born four months after her parents immigrated from Poland. No African - American candidate was given serious consideration for appointment to the Supreme Court until the election of John F. Kennedy, who weighed the possibility of appointing William H. Hastie of the United States Court of Appeals for the Third Circuit. Hastie had been the first African - American elevated to a Court of Appeals when Harry S. Truman had so appointed him in 1949, and by the time of the Kennedy Administration, it was widely anticipated that Hastie might be appointed to the Supreme Court. That Kennedy gave serious consideration to making this appointment "represented the first time in American history that an African American was an actual contender for the high court ''. The first African American appointed to the Court was Thurgood Marshall, appointed by Lyndon B. Johnson in 1967. The second was Clarence Thomas, appointed by George H.W. Bush to succeed Marshall in 1991. Johnson appointed Marshall to the Supreme Court following the retirement of Justice Tom C. Clark, saying that this was "the right thing to do, the right time to do it, the right man and the right place.
'' Marshall was confirmed as an Associate Justice by a Senate vote of 69 -- 11 on August 31, 1967. Johnson confidently predicted to one biographer, Doris Kearns Goodwin, that a lot of black baby boys would be named "Thurgood '' in honor of this choice (in fact, Kearns 's research of birth records in New York and Boston indicates that Johnson 's prophecy did not come true). Bush initially wanted to nominate Thomas to replace William Brennan, who stepped down in 1990, but he then decided that Thomas had not yet had enough experience as a judge after only months on the federal bench. Bush therefore nominated New Hampshire Supreme Court judge David Souter (who is not African American) instead. The selection of Thomas to instead replace Marshall preserved the existing racial composition of the court. The words "Latino '' and "Hispanic '' are sometimes given distinct meanings, with "Latino '' referring to persons of Latin American descent, and "Hispanic '' referring to persons having an ancestry, language or culture traceable to Spain or to the Iberian Peninsula as a whole, as well as to persons of Latin American descent, and the term "Lusitanic '' usually refers to persons having an ancestry, language or culture traceable to Portugal specifically. Sonia Sotomayor -- nominated by President Barack Obama on May 26, 2009, and sworn in on August 8 -- is the first Supreme Court Justice of Latin American descent. Born in New York City of Puerto Rican parents, she has been known to refer to herself as a "Nuyorican ''. Sotomayor is also generally regarded as the first Hispanic justice, although some sources claim that this distinction belongs to former Justice Benjamin N. Cardozo. It has been claimed that "only since the George H.W. Bush administration have Hispanic candidates received serious consideration from presidents in the selection process '', and that Emilio M. 
Garza (considered for the vacancy eventually given to Clarence Thomas) was the first Hispanic judge for whom such an appointment was contemplated. Subsequently, Bill Clinton was reported by several sources to have considered José A. Cabranes for a Supreme Court nomination on both occasions when a Court vacancy opened during the Clinton presidency. The possibility of a Hispanic Justice returned during the George W. Bush Presidency, with various reports suggesting that Emilio Garza, Alberto Gonzales, and Consuelo M. Callahan were under consideration for the vacancy left by the retirement of Sandra Day O'Connor. O'Connor's seat eventually went to Samuel Alito, however. Speculation about a Hispanic nomination arose again after the election of Barack Obama. In 2009, Obama appointed Sonia Sotomayor, a woman of Puerto Rican descent, to be the first unequivocally Hispanic Justice. Both the National Association of Latino Elected and Appointed Officials and the Hispanic National Bar Association count Sotomayor as the first Hispanic justice. Some historians contend that Cardozo -- a Sephardic Jew believed to be of distant Portuguese descent -- should also be counted as the first Hispanic Justice. Schmidhauser wrote in 1979 that "(a) mong the large ethnic groupings of European origin which have never been represented upon the Supreme Court are the Italians, Southern Slavs, and Hispanic Americans. '' The National Hispanic Center for Advanced Studies and Policy Analysis wrote in 1982 that the Supreme Court "has never had an Hispanic Justice '', and the Hispanic American Almanac similarly reported in 1996 that "no Hispanic has yet sat on the U.S. Supreme Court ''. However, Segal and Spaeth state: "Though it is often claimed that no Hispanics have served on the Court, it is not clear why Benjamin Cardozo, a Sephardic Jew of Spanish heritage, should not count. 
'' They identify a number of other sources that present conflicting views as to Cardozo 's ethnicity, with one simply labeling him "Iberian. '' In 2007, the Dictionary of Latino Civil Rights History also listed Cardozo as "the first Hispanic named to the Supreme Court of the United States. '' The nomination of Sonia Sotomayor, widely described in media accounts as the first Hispanic nominee, drew more attention to the question of Cardozo 's ethnicity. Cardozo biographer Andrew Kaufman questioned the usage of the term "hispanic '' during Cardozo 's lifetime, commenting: "Well, I think he regarded himself as Sephardic Jew whose ancestors came from the Iberian Peninsula. '' However, "no one has ever firmly established that the family 's roots were, in fact, in Portugal ''. It has also been asserted that Cardozo himself "confessed in 1937 that his family preserved neither the Spanish language nor Iberian cultural traditions ''. By contrast, Cardozo made his own translations of authoritative legal works written in French and German. Many ethnic groups have never been represented on the Court. There has never been a Justice with any Asian, Native American, or Pacific Islander heritage, and no person having such a heritage was publicly considered for an appointment until the 21st century. Legal scholar Viet D. Dinh, of Vietnamese descent, was named as a potential George W. Bush nominee. During the presidency of Barack Obama, potential nominees have included Harold Hongju Koh, of Korean descent, and former Idaho attorney general Larry Echo Hawk, a member of the Pawnee tribe. Public opinion about ethnic diversity on the court "varies widely depending on the poll question 's wording ''. 
For example, in two polls taken in 1991, one resulted in half of respondents agreeing that it was "important that there always be at least one black person '' on the Court while the other had only 20 % agreeing with that sentiment, and with 77 % agreeing that "race should never be a factor in choosing Supreme Court justices ''. It is claimed that the Presidents who have appointed Justices to the Supreme Court in recent years have taken race and religion into account, causing it to be unrepresentative of the U.S. population in general. Of the 113 justices, 109 (96.5 %) have been men. All Supreme Court justices were males until 1981, when Ronald Reagan fulfilled his 1980 campaign promise to place a woman on the Court, which he did with the appointment of Sandra Day O'Connor. O'Connor was later joined on the Court by Ruth Bader Ginsburg, appointed by Bill Clinton in 1993. After O'Connor retired in 2006, Ginsburg would be joined by Sonia Sotomayor and Elena Kagan, who were successfully appointed to the Court in 2009 and 2010, respectively, by Barack Obama. The only other woman to be nominated to the Court was Harriet Miers, whose nomination by George W. Bush to succeed O'Connor was withdrawn under fire. Substantial public sentiment in support of appointment of a woman to the Supreme Court has been expressed since at least as early as 1930, when an editorial in the Christian Science Monitor encouraged Herbert Hoover to consider Ohio justice Florence E. Allen or assistant attorney general Mabel Walker Willebrandt. Franklin Delano Roosevelt later appointed Allen to the United States Court of Appeals for the Sixth Circuit -- making her "one of the highest ranking female jurists in the world at that time ''. However, neither Roosevelt nor his successors over the following two decades gave strong consideration to female candidates for the Court.
Harry Truman considered such an appointment, but was dissuaded by concerns raised by justices then serving that a woman on the Court "would inhibit their conference deliberations '', which were marked by informality. President Richard Nixon named Mildred Lillie, then serving on the Second District Court of Appeal of California, as a potential nominee to fill one of two vacancies on the Court in 1971. However, Lillie was quickly deemed unqualified by the American Bar Association, and no formal proceedings were ever set with respect to her potential nomination. Lewis Powell and William Rehnquist were then successfully nominated to fill those vacancies. In 1991, a poll found that 53 % of Americans felt it "important that there always be at least one woman '' on the Court. However, when O'Connor stepped down from the court, leaving Justice Ginsburg as the lone remaining woman, only one in seven persons polled found it "essential that a woman be nominated to replace '' O'Connor. All but a handful of Supreme Court justices have been married. Frank Murphy, Benjamin Cardozo, and James McReynolds were all lifelong bachelors. In addition, retired justice David Souter and current justice Elena Kagan have never been married. William O. Douglas was the first Justice to divorce while on the Court, and also had the most marriages of any Justice, with four. Justice John Paul Stevens divorced his first wife in 1979, marrying his second wife later that year. Sonia Sotomayor was the first female justice to be appointed as an unmarried woman, having divorced in 1983, long before her nomination in 2009. Several justices have become widowers while on the bench. The 1792 death of Elizabeth Rutledge, wife of Justice John Rutledge, contributed to the mental health problems that led to the rejection of his recess appointment. Roger B. Taney survived his wife, Anne, by twenty years. Oliver Wendell Holmes, Jr. 
resolutely continued working on the Court for several years after the death of his wife. William Rehnquist was a widower for the last fourteen years of his service on the Court, his wife Natalie having died on October 17, 1991 after suffering from ovarian cancer. With the death of Martin D. Ginsburg in June 2010, Ruth Bader Ginsburg became the first woman to be widowed while serving on the Court. With regards to sexual orientation, no Supreme Court justice has identified himself or herself as anything other than heterosexual, and no incontrovertible evidence of a justice having any other sexual orientation has ever been uncovered. However, the personal lives of several justices and nominees have attracted speculation. G. Harrold Carswell was unsuccessfully nominated by Richard Nixon in 1970, and was convicted in 1976 of battery for making an "unnatural and lascivious '' advance to a male police officer working undercover in a Florida men 's room. Some therefore claim him as the only gay or bisexual person nominated to the Court thus far. If so, it is unlikely that Nixon was aware of it; White House Counsel John Dean later wrote of Carswell that "(w)hile Richard Nixon was always looking for historical firsts, nominating a homosexual to the high court would not have been on his list ''. Speculation has been recorded about the sexual orientation of a few justices who were lifelong bachelors, but no unambiguous evidence exists that they were gay. Perhaps the greatest body of circumstantial evidence surrounds Frank Murphy, who was dogged by "(r)umors of homosexuality (...) all his adult life ''. For more than 40 years, Edward G. Kemp was Frank Murphy 's devoted, trusted companion. Like Murphy, Kemp was a lifelong bachelor. From college until Murphy 's death, the pair found creative ways to work and live together. (...) When Murphy appeared to have the better future in politics, Kemp stepped into a supportive, secondary role.
As well as Murphy 's close relationship with Kemp, Murphy 's biographer, historian Sidney Fine, found in Murphy 's personal papers a letter that "if the words mean what they say, refers to a homosexual encounter some years earlier between Murphy and the writer. '' However, the letter 's veracity can not be confirmed and a review of all the evidence led Fine to conclude that he "could not stick his neck out and say (Murphy) was gay ''. Speculation has also surrounded Benjamin Cardozo, whose lifelong celibacy has been read by some as suggesting repressed homosexuality or asexuality. The fact that he was unmarried and was personally tutored by the writer Horatio Alger (alleged to have had sexual relations with boys) led some of Cardozo 's biographers to insinuate that Cardozo was homosexual, but no real evidence exists to corroborate this possibility. Constitutional law scholar Jeffrey Rosen noted in a New York Times Book Review of Richard Polenberg 's book on Cardozo: Polenberg describes Cardozo 's lifelong devotion to his older sister Nell, with whom he lived in New York until her death in 1929. When asked why he had never married, Cardozo replied, quietly and sadly, "I never could give Nellie the second place in my life. '' Polenberg suggests that friends may have stressed Cardozo 's devotion to his sister to discourage rumors "that he was sexually dysfunctional, or had an unusually low sexual drive or was homosexual. '' But he produces no evidence to support any of these possibilities, except to note that friends, in describing Cardozo, used words like "beautiful '', "exquisite '', "sensitive '' or "delicate. '' Andrew Kaufman, author of Cardozo, a biography published in 2000, notes that "Although one can not be absolutely certain, it seems highly likely that Cardozo lived a celibate life ''. Judge Learned Hand is quoted in the book as saying about Cardozo: "He (had) no trace of homosexuality anyway ''.
More recently, when David Souter was nominated to the Court, "conservative groups expressed concern to the White House... that the president 's bachelor nominee might conceivably be a homosexual ''. Similar questions were raised regarding the sexual orientation of unmarried nominee Elena Kagan. However, no evidence was ever produced regarding Souter 's sexual orientation, and Kagan 's apparent heterosexuality was attested by colleagues familiar with her dating history. When the Supreme Court was established in 1789, the first members came from among the ranks of the Founding Fathers and were almost uniformly Protestant. Of the 113 justices who have been appointed to the court, 91 have been from various Protestant denominations, 12 have been Catholics (one other justice, Sherman Minton, converted to Catholicism after leaving the Court). Another, Neil Gorsuch, was raised in the Catholic Church but later attended an Episcopalian church, though without specifying the denomination to which he felt he belonged. Eight have been Jewish and one, David Davis, had no known religious affiliation. Three of the 17 chief justices have been Catholics, and one Jewish justice, Abe Fortas, was unsuccessfully nominated to be chief justice. The table below shows the religious affiliation of each of the justices sitting as of April 2017: Most Supreme Court justices have been Protestant Christians. These have included 33 Episcopalians, 18 Presbyterians, nine Unitarians, five Methodists, three Baptists, and lone representatives of various other denominations. William Rehnquist was the Court 's only Lutheran. Noah Swayne was a Quaker. Some 15 Protestant justices did not adhere to a particular denomination. Baptist denominations and other evangelical churches have been underrepresented on the Court, relative to the population of the United States. Conversely, mainline Protestant churches historically were overrepresented. 
Following the retirement of John Paul Stevens in June 2010, the Court had an entirely non-Protestant composition for the first time in its history. Although Neil Gorsuch, appointed in 2017, attends an Episcopalian church, he was raised Catholic and it is unclear if he considers himself a Catholic or a Protestant. The first Catholic justice, Roger B. Taney, was appointed chief justice in 1836 by Andrew Jackson. The second, Edward Douglass White, was appointed as an associate justice in 1894, but also went on to become chief justice. Joseph McKenna was appointed in 1898, placing two Catholics on the Court until White 's death in 1921. This period marked the beginning of an inconsistently observed "tradition '' of having a "Catholic seat '' on the court. Other Catholic justices included Pierce Butler (appointed 1923) and Frank Murphy (appointed 1940). Sherman Minton, appointed in 1949, was a Protestant during his time on the Court. To some, however, his wife 's Catholic faith implied a "Catholic seat ''. Minton joined his wife 's church in 1961, five years after he retired from the Court. Minton was succeeded by a Catholic, however, when President Eisenhower appointed William J. Brennan to that seat. Eisenhower sought a Catholic to appoint to the Court -- in part because there had been no Catholic justice since Murphy 's death in 1949, and in part because Eisenhower was directly lobbied by Cardinal Francis Spellman of the Archdiocese of New York to make such an appointment. Brennan was then the lone Catholic justice until the appointment of Antonin Scalia in 1986, and Anthony Kennedy in 1988. Like Sherman Minton, Clarence Thomas was not a Catholic at the time he was appointed to the Court. Thomas was raised Catholic and briefly attended Conception Seminary College, a Roman Catholic seminary, but had joined the Protestant denomination of his wife after their marriage. At some point in the late 1990s, Thomas returned to Catholicism. 
In 2005, John Roberts became the third Catholic Chief Justice and the fourth Catholic on the Court. Shortly thereafter, Samuel Alito became the fifth on the Court, and the eleventh in the history of the Court. Alito 's appointment gave the Court a Catholic majority for the first time in its history. Besides Thomas, at least one other Justice, James F. Byrnes, was raised as a Roman Catholic, but converted to a different branch of Christianity prior to serving on the Court. In contrast to historical patterns, the Court has gone from having a "Catholic seat '' to being what some have characterized as a "Catholic court. '' The reasons for that are subject to debate, and are a matter of intense public scrutiny. That the majority of the Court is now Catholic, and that the appointment of Catholics has become accepted, represents a historical ' sea change. ' It has fostered accusations that the court has become "a Catholic boys club '' (particularly as the Catholics chosen tend to be politically conservative) and calls for non-Catholics to be nominated. In May 2009, President Barack Obama nominated a Catholic woman, Sonia Sotomayor, to replace retiring Justice David Souter. Her confirmation raised the number of Catholics on the Court to six, compared to three non-Catholics (all Jewish). With Antonin Scalia 's death in February 2016, the number of Catholic Justices went back to five. Neil Gorsuch, appointed in 2017, was raised Catholic but attends an Episcopalian church; it is unclear if he considers himself a Catholic or a Protestant. All of the Catholic justices have been members of the Roman (or Latin) rite within the Catholic Church. In 1853, President Millard Fillmore offered to appoint Louisiana Senator Judah P. Benjamin to be the first Jewish justice, and the New York Times reported (on February 15, 1853) that "if the President nominates Benjamin, the Democrats are determined to confirm him ''. 
However, Benjamin declined the offer, and ultimately became Secretary of State for the Confederacy during the Civil War. The first Jewish nominee, Louis Brandeis, was appointed in 1916, after a tumultuous hearing process. The 1932 appointment of Benjamin Cardozo raised mild controversy for placing two Jewish justices on the Court at the same time, although the appointment was widely lauded based on Cardozo 's qualifications, and the Senate was unanimous in confirming Cardozo. Cardozo was succeeded by another Jewish justice, Felix Frankfurter, but Brandeis was succeeded by Protestant William O. Douglas. Negative reaction to the appointment of the early Jewish justices did not come exclusively from outside the Court. Justice James Clark McReynolds, a blatant anti-Semite, refused to speak to Brandeis for three years following the latter 's appointment and, when Brandeis retired in 1939, did not sign the customary dedicatory letter sent to Court members on their retirement. During Benjamin Cardozo 's swearing - in ceremony, McReynolds pointedly read a newspaper, muttering "another one '', and he did not attend that of Felix Frankfurter, exclaiming "My God, another Jew on the Court! '' Frankfurter was followed by Arthur Goldberg and Abe Fortas, each of whom filled what became known as the "Jewish Seat ''. After Fortas resigned in 1969, he was replaced by Protestant Harry Blackmun. No Jewish justices were nominated thereafter until Ronald Reagan nominated Douglas H. Ginsburg in 1987, to fill the vacancy created by the retirement of Lewis F. Powell; however, this nomination was withdrawn, and the Court remained without any Jewish justices until 1993, when Ruth Bader Ginsburg (unrelated to Douglas Ginsburg) was appointed to replace Byron White. Ginsburg was followed in relatively quick succession by the appointment of Stephen Breyer, also Jewish, in 1994 to replace Harry Blackmun. 
In 2010, the confirmation of President Barack Obama 's nomination of Elena Kagan to the Court ensured that three Jewish justices would serve simultaneously. Prior to this confirmation, conservative political commentator Pat Buchanan stated that, "If Kagan is confirmed, Jews, who represent less than 2 percent of the U.S. population, will have 33 percent of the Supreme Court seats ''. At the time of his remarks, Jewish justices had accounted for 6.4 percent of all justices in the history of the court. At the time of Breyer 's appointment in 1994, there were two Roman Catholic justices, Antonin Scalia and Anthony Kennedy, and two Jewish justices, Stephen Breyer and Ruth Bader Ginsburg. Clarence Thomas, who had been raised as a Roman Catholic but had attended an Episcopal church after his marriage, returned to Catholicism later in the 1990s. At this point, the four remaining Protestant justices -- Rehnquist, Stevens, O'Connor, and Souter -- remained a plurality on the Court, but for the first time in the history of the Court, Protestants were no longer an absolute majority. The first Catholic plurality on the Court occurred in 2005, when Chief Justice Rehnquist was succeeded in office by Chief Justice John Roberts, who became the fourth sitting Catholic justice. On January 31, 2006, Samuel Alito became the fifth sitting Catholic justice, and on August 6, 2009, Sonia Sotomayor became the sixth. By contrast, there has been only one Catholic U.S. President, John F. Kennedy (unrelated to Justice Kennedy), and one Catholic U.S. Vice President, Joe Biden, and there has never been a Jewish U.S. President or Vice President. At the beginning of 2010, Justice John Paul Stevens was the sole remaining Protestant on the Court. In April 2010, Justice Stevens announced his retirement, effective as of the Court 's 2010 summer recess. 
Upon Justice Stevens ' retirement, which formally began on June 28, 2010, the Court lacked a Protestant member, marking the first time in its history that it was composed exclusively of Jewish and Catholic justices. In January 2017, after seven years with no Protestant justices serving or nominated, President Donald Trump nominated Neil Gorsuch, an Episcopalian, to the Court. This development attracted some comment. Law school professor Jeffrey Rosen wrote that "it 's a fascinating truth that we 've allowed religion to drop out of consideration on the Supreme Court, and right now, we have a Supreme Court that religiously at least, by no means looks like America ''. A number of sizable religious groups, each less than 2 % of the U.S. population, have had no members appointed as justices. These include Orthodox Christians, Mormons, Pentecostals, Muslims, Hindus, Buddhists, and Sikhs. George Sutherland has been described as a "lapsed Mormon '' because he was raised in the LDS Church, his parents having immigrated to the United States during Sutherland 's infancy to join that church. Sutherland 's parents soon left the LDS Church and moved to Montana. Sutherland himself also disaffiliated from the faith, but remained in Utah and graduated from Brigham Young Academy in 1881, the only non-Mormon in his class. In 1975, Attorney General Edward H. Levi had listed Dallin H. Oaks, a Mormon who had clerked for Earl Warren and was then president of Brigham Young University, as a potential nominee for Gerald Ford. Ford "crossed Oaks 's name off the list early on, noting in the margin that a member of the LDS Church might bring a ' confirmation fight ' ''. No professing atheist has ever been appointed to the Court, although some justices have declined to engage in religious activity, or affiliate with a denomination. As an adult, Benjamin Cardozo no longer practiced his faith and identified himself as an agnostic, though he remained proud of his Jewish heritage. 
Unlike the offices of President, U.S. Representative, and U.S. Senator, there is no minimum age for Supreme Court justices set forth in the United States Constitution. However, justices tend to be appointed after having made significant achievements in law or politics, which excludes many young potential candidates from consideration. At the same time, justices appointed at too advanced an age will likely have short tenures on the Court. The youngest justice ever appointed was Joseph Story, 32 at the time of his appointment in 1812; the oldest was Charles Evans Hughes, who was 67 at the time of his appointment as Chief Justice in 1930. (Hughes had previously been appointed to the Court as an associate justice in 1910, at the age of 48, but had resigned in 1916 to run for president.) Story went on to serve for 33 years, while Hughes served 11 years after his second appointment. The oldest justice at the time of his initial appointment was Horace Lurton, who was 65 when appointed in 1909. Lurton died after only four years on the Court. The oldest sitting justice to be elevated to Chief Justice was Hughes ' successor, Harlan Fiske Stone, who was 68 at the time of his elevation in 1941. Stone died in 1946, only five years after his elevation. The oldest nominee to the court was South Carolina senator William Smith, nominated in 1837, then aged around 75 (it is known that he was born in 1762, but not the exact date). The Senate confirmed Smith 's nomination by a vote of 23 -- 18, but Smith declined to serve. Of the justices currently sitting, the youngest at the time of appointment was Clarence Thomas, who was 43 years old at the time of his confirmation in 1991. As of the beginning of the 2016 -- 17 term, Elena Kagan was the youngest justice sitting, at 56 years of age. The oldest person to have served on the Court was Oliver Wendell Holmes, Jr., who stepped down two months shy of his 91st birthday. 
John Paul Stevens, second only to Holmes, left the court in June 2010, two months after turning 90. The average age of the Court as a whole fluctuates over time with the departure of older justices and the appointment of younger people to fill their seats. The average age of the Court is 72 years, 2 months. Just prior to the death of Chief Justice Rehnquist in September 2005, the average age was 71. After Sonia Sotomayor was appointed in August 2009, the average age at which the then - sitting justices had been appointed was about 53 years. The longest period of time in which one group of justices has served together occurred from August 3, 1994, when Stephen Breyer was appointed to replace the retired Harry Blackmun, to September 3, 2005, the death of Rehnquist, totaling 11 years and 31 days. From 1789 until 1970, justices served an average of 14.9 years. Those who have stepped down since 1970 have served an average of 25.6 years. The average retirement age jumped from 68 pre-1970 to 79 for justices retiring post-1970. Between 1789 and 1970 there was a vacancy on the Court once every 1.91 years. In the 34 years following the two appointments in 1971, there was a vacancy on average only once every 3.75 years. The typical one - term president has thus had one appointment opportunity instead of two. Commentators have noted that advances in medical knowledge "have enormously increased the life expectancy of a mature person of an age likely to be considered for appointment to the Supreme Court ''. Combined with the reduction in responsibilities carried out by modern justices as compared to the early justices, this results in much longer potential terms of service. This has led to proposals such as imposing a mandatory retirement age for Supreme Court justices and predetermined term limits. 
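The vacancy-interval figures above imply rough counts of Court vacancies in each era. A quick back-of-the-envelope check (a sketch only: the interval lengths of 1.91 and 3.75 years are quoted from the text, while the vacancy counts derived from them are approximations, not official tallies):

```python
# Rough sanity check of the vacancy-interval figures quoted above.
# Interval lengths come from the text; the counts below are implied
# approximations, not official tallies.

span_early = 1970 - 1789        # 181 years between 1789 and 1970
interval_early = 1.91           # one vacancy every 1.91 years (from text)
vacancies_early = span_early / interval_early

span_late = 34                  # "the next 34 years" after the 1971 appointments
interval_late = 3.75            # one vacancy every 3.75 years (from text)
vacancies_late = span_late / interval_late

print(round(vacancies_early))   # ~95 vacancies between 1789 and 1970
print(round(vacancies_late))    # ~9 vacancies in the 34 years after 1971
```

Dividing a roughly four-year presidential term by the modern 3.75-year interval also shows why a one-term president now typically gets about one appointment rather than two.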
Although the Constitution imposes no educational background requirements for federal judges, the work of the Court involves complex questions of law -- ranging from constitutional law to administrative law to admiralty law -- and consequently, a legal education has become a de facto prerequisite to appointment to the Supreme Court. Every person who has been nominated to the Court has been an attorney. Before the advent of modern law schools in the United States, justices, like most attorneys of the time, completed their legal studies by "reading law '' (studying under and acting as an apprentice to more experienced attorneys) rather than attending a formal program. The first justice to be appointed who had attended an actual law school was Levi Woodbury, appointed to the Court in 1846. Woodbury had attended Tapping Reeve Law School in Litchfield, Connecticut, the most prestigious law school in the United States in that day, prior to his admission to the bar in 1812. However, Woodbury did not earn a law degree. Woodbury 's successor on the Court, Benjamin Robbins Curtis, who received his law degree from Harvard Law School in 1832 and was appointed to the Court in 1851, was the first justice to bear such a credential. Associate Justice James F. Byrnes, whose short tenure lasted from June 1941 to October 1942, was the last justice without a law degree to be appointed; Stanley Forman Reed, who served on the Court from 1938 to 1957, was the last sitting justice from such a background. In total, of the 113 justices appointed to the Court, 48 have had law degrees, an additional 18 attended some law school but did not receive a degree, and 47 received their legal education without any law school attendance. Two justices, Sherman Minton and Lewis F. Powell, Jr., earned a Master of Laws degree. 
The table below shows the college and law school from which each of the justices sitting as of April 2017 graduated. Not only have all justices been attorneys; nearly two thirds had previously been judges. As of 2017, eight of the nine sitting justices previously served as judges of the United States Courts of Appeals, while Justice Elena Kagan served as Solicitor General, the attorney responsible for representing the federal government in cases before the Court. Few justices have a background as criminal defense lawyers, and Thurgood Marshall is reportedly the last justice to have had a client in a death penalty case. Historically, justices have come from some tradition of public service; only George Shiras, Jr. had no such experience. Relatively few justices have been appointed from among members of Congress. Six were members of the United States Senate at the time of their appointment, while one was a sitting member of the House of Representatives. Six more had previously served in the Senate. Three have been sitting governors. Only one, William Howard Taft, had been President of the United States. The last justice to have held elected office was Sandra Day O'Connor, who was elected twice to the Arizona State Senate after being appointed there by the governor. The financial position of the typical Supreme Court justice has been described as "upper - middle to high social status: reared in nonrural but not necessarily urban environment, member of a civic - minded, politically active, economically comfortable family ''. Charles A. Beard, in his An Economic Interpretation of the Constitution of the United States, profiled those among the justices who were also drafters of the Constitution. James Wilson, Beard notes, "developed a lucrative practice at Carlisle '' before becoming "one of the directors of the Bank of North America on its incorporation in 1781 ''. A member of the Georgia Land Company, Wilson "held shares to the amount of at least one million acres ''. 
John Blair was "one of the most respectable men in Virginia, both on account of his Family as well as fortune ''. Another source notes that Blair "was a member of a prominent Virginia family. His father served on the Virginia Council and was for a time acting Royal governor. His granduncle, James Blair, was founder and first president of the College of William and Mary. '' John Rutledge was elected Governor of South Carolina at a time when the Constitution of that state set, as a qualification for the office, ownership of "a settled plantation or freehold... of the value of at least ten thousand pounds currency, clear of debt ''. Oliver Ellsworth "rose rapidly to wealth and power in the bar of his native state '' with "earnings... unrivalled in his own day and unexampled in the history of the colony '', developing "a fortune which for the times and the country was quite uncommonly large ''. Bushrod Washington was the nephew of George Washington, who was at the time of the younger Washington 's appointment the immediate past President of the United States and one of the wealthiest men in the country. "About three - fifths of those named to the Supreme Court personally knew the President who nominated them ''. There have been exceptions to the typical portrait of justices growing up middle class or wealthy. For example, the family of Sherman Minton went through a period of impoverishment during his childhood, resulting from the disability of his father due to a heat stroke. In 2008, seven of the nine sitting justices were millionaires, and the remaining two were close to that level of wealth. Historian Howard Zinn, in his 1980 book A People 's History of the United States, argues that the justices can not be neutral in matters between rich and poor, as they are almost always from the upper class. 
Chief Justice Roberts is the son of an executive with Bethlehem Steel; Justice Stevens was born into a wealthy Chicago family; and Justices Kennedy and Breyer both had fathers who were successful attorneys. Justices Alito and Scalia both had educated (and education - minded) parents: Scalia 's father was a highly educated college professor and Alito 's father was a high school teacher before becoming "a long - time employee of the New Jersey state legislature ''. Only Justices Thomas and Sotomayor have been regarded as coming from a lower - class background. One authority states that "Thomas grew up in poverty. The Pin Point community he lived in lacked a sewage system and paved roads. Its inhabitants dwelled in destitution and earned but a few cents each day performing manual labor ''. The depth of Thomas ' poverty has been disputed, however, with some pointing to "ample evidence to suggest that Thomas enjoyed, by and large, a middle - class upbringing ''. Beginning in 1979, the Ethics in Government Act of 1978 required federal officials, including the justices, to file annual disclosures of their income and assets. These disclosures, reported within broad ranges, provide a year - by - year snapshot into the wealth of the justices since 1979. In the first such set of disclosures, only two justices were revealed to be millionaires: Potter Stewart and Lewis F. Powell, with Chief Justice Warren Burger coming in third with about $600,000 in holdings. The least wealthy justice was Thurgood Marshall. The 1982 report disclosed that newly appointed Justice Sandra Day O'Connor was a millionaire, and the second - wealthiest justice on the Court (after Powell). The remaining justices listed assets in the range of tens of thousands to a few hundred thousand dollars, with the exception of Thurgood Marshall, who "reported no assets or investment income of more than $100 ''. 
The 1985 report had the justices in relatively the same positions, while the 1992 report had O'Connor as the wealthiest member of the Court, with Stevens being the only other millionaire, most other justices reporting assets averaging around a half million dollars, and the two newest justices, Clarence Thomas and David Souter, reporting assets of at least $65,000. (In 2011, however, it was revealed that Thomas had misstated his income going back to at least 1989.) The 2007 report was the first to reflect the holdings of John Roberts and Samuel Alito. Disclosures for that year indicated that Clarence Thomas and Anthony Kennedy were the only justices who were clearly not millionaires, although Thomas was reported to have signed a book deal worth over one million dollars; the other justices reported holdings within broad ranges. The financial disclosures indicate that many of the justices have substantial stock holdings. This, in turn, has affected the business of the Court, as these holdings have led justices to recuse themselves from cases, occasionally with substantial impact. For example, in 2008, the recusal of John Roberts in one case, and of Samuel Alito in another, resulted in each ending in a 4 -- 4 split, which does not create a binding precedent. The Court was unable to decide another case in 2008 because four of the nine justices had conflicts, three arising from stock ownership in affected companies.
meaning of tissues in human body in hindi
Tissue (biology) - wikipedia In biology, tissue is a cellular organizational level between cells and a complete organ. A tissue is an ensemble of similar cells and their extracellular matrix from the same origin that together carry out a specific function. Organs are then formed by the functional grouping together of multiple tissues. The English word is derived from the French tissu, meaning something that is woven, from the verb tisser, "to weave ''. The study of human and animal tissues is known as histology or, in connection with disease, histopathology. For plants, the discipline is called plant anatomy. The classical tools for studying tissues are the paraffin block in which tissue is embedded and then sectioned, the histological stain, and the optical microscope. In the last couple of decades, developments in electron microscopy, immunofluorescence, and the use of frozen tissue sections have enhanced the detail that can be observed in tissues. With these tools, the classical appearances of tissues can be examined in health and disease, enabling considerable refinement of medical diagnosis and prognosis. Animal tissues are grouped into four basic types: connective, muscle, nervous, and epithelial. Collections of tissues joined in structural units to serve a common function compose organs. While all animals can generally be considered to contain the four tissue types, the manifestation of these tissues can differ depending on the type of organism. For example, the origin of the cells comprising a particular tissue type may differ developmentally for different classifications of animals. The epithelium in all birds and animals is derived from the ectoderm and endoderm with a small contribution from the mesoderm, forming the endothelium, a specialized type of epithelium that composes the vasculature. 
By contrast, a true epithelial tissue is present only in a single layer of cells held together via occluding junctions called tight junctions, to create a selectively permeable barrier. This tissue covers all organismal surfaces that come in contact with the external environment, such as the skin, the airways, and the digestive tract. It serves functions of protection, secretion, and absorption, and is separated from other tissues below by a basal lamina. Connective tissues are fibrous tissues. They are made up of cells separated by non-living material, which is called an extracellular matrix. This matrix can be liquid or rigid. For example, blood contains plasma as its matrix, while bone 's matrix is rigid. Connective tissue gives shape to organs and holds them in place. Blood, bone, tendon, ligament, adipose and areolar tissues are examples of connective tissues. One method of classifying connective tissues is to divide them into three types: fibrous connective tissue, skeletal connective tissue, and fluid connective tissue. Muscle cells form the active contractile tissue of the body known as muscle tissue or muscular tissue. Muscle tissue functions to produce force and cause motion, either locomotion or movement within internal organs. Muscle tissue is separated into three distinct categories: visceral or smooth muscle, found in the inner linings of organs; skeletal muscle, typically attached to bones, which generates gross movement; and cardiac muscle, found in the heart, where it contracts to pump blood throughout an organism. Cells comprising the central nervous system and peripheral nervous system are classified as nervous (or neural) tissue. In the central nervous system, neural tissues form the brain and spinal cord. In the peripheral nervous system, neural tissues form the cranial nerves and spinal nerves, inclusive of the motor neurons. 
The epithelial tissues are formed by cells that cover the organ surfaces such as the surface of skin, the airways, the reproductive tract, and the inner lining of the digestive tract. The cells comprising an epithelial layer are linked via semi-permeable, tight junctions; hence, this tissue provides a barrier between the external environment and the organ it covers. In addition to this protective function, epithelial tissue may also be specialized to function in secretion, excretion and absorption, and it helps to protect organs from microorganisms, injury, and fluid loss. There are many kinds of epithelium, and nomenclature is somewhat variable. Most classification schemes combine a description of the cell - shape in the upper layer of the epithelium with a word denoting the number of layers: either simple (one layer of cells) or stratified (multiple layers of cells). However, other cellular features, such as cilia, may also be described in the classification system. Common kinds of epithelium include simple squamous, simple cuboidal, simple columnar, stratified squamous, and pseudostratified columnar epithelium. In plant anatomy, tissues are categorized broadly into three tissue systems: the epidermis, the ground tissue, and the vascular tissue. Plant tissues can also be divided into two types, meristematic and permanent tissue. Meristematic tissue consists of actively dividing cells, and leads to increase in length and thickness of the plant. The primary growth of a plant occurs only in certain, specific regions, such as in the tips of stems or roots. It is in these regions that meristematic tissue is present. Cells in these tissues are roughly spherical or polyhedral, to rectangular in shape, and have thin cell walls. 
New cells produced by meristem are initially those of meristem itself, but as the new cells grow and mature, their characteristics slowly change and they become differentiated as components of the region in which they occur. Based on their region of occurrence, meristematic tissues are classified as apical, lateral, and intercalary meristems. The cells of meristematic tissues are similar in structure and have a thin and elastic primary cell wall made up of cellulose. They are compactly arranged without inter-cellular spaces between them. Each cell contains a dense cytoplasm with very few vacuoles and a prominent, large nucleus. Normally the meristematic cells are oval, polygonal or rectangular in shape. The meristematic tissues that take up a specific role lose the ability to divide. This process of taking up a permanent shape, size and function is called cellular differentiation. Cells of meristematic tissue differentiate to form different types of permanent tissue. There are three types of permanent tissues: simple, complex, and special (secretory) tissues. A group of cells which are similar in origin, structure and function is called simple permanent tissue. Simple permanent tissues are of four types: parenchyma, collenchyma, sclerenchyma, and epidermis. Parenchyma (para - ' beside '; chyma - ' in filling, loose, unpacked ') is the bulk of a substance. In plants, it consists of relatively unspecialised living cells with thin cell walls that are usually loosely packed so that intercellular spaces are found between cells of this tissue. This tissue provides support to plants and also stores food. In some situations, a parenchyma contains chlorophyll and performs photosynthesis, in which case it is called a chlorenchyma. In aquatic plants, large air cavities are present in parenchyma to help the plant float on water. Such a parenchyma type is called aerenchyma. Collenchyma is a Greek word, in which "Collen '' means gum and "chyma '' means infusion. 
Like parenchyma, it is a living tissue of the primary plant body. Cells are thin - walled but possess thickening of cellulose, water and pectin substances (pectocellulose) at the corners where a number of cells join together. This tissue gives tensile strength to the plant, and the cells are compactly arranged with very little inter-cellular space. It occurs chiefly in the hypodermis of stems and leaves. It is absent in monocots and in roots. Collenchymatous tissue acts as a supporting tissue in stems of young plants. It provides mechanical support, elasticity, and tensile strength to the plant body. It helps in manufacturing sugar and storing it as starch. It is present in the margins of leaves and resists the tearing effect of the wind. Sclerenchyma is a Greek word, in which "Scleros '' means hard and "chyma '' means infusion. This tissue consists of thick - walled, dead cells (protoplasm is absent). These cells have hard and extremely thick secondary walls due to uniform distribution of lignin. Lignin deposition is so thick that the cell walls become strong, rigid and impermeable to water. Sclerenchyma cells have a narrow lumen and are long and narrow. The entire surface of the plant consists of a single layer of cells called the epidermis; because it covers the whole surface of the plant, it is also called surface tissue. Most of the epidermal cells are relatively flat. The outer and lateral walls of the cell are often thicker than the inner walls. The cells form a continuous sheet without inter-cellular spaces. It protects all parts of the plant. Complex tissue consists of more than one type of cell, and these cells work together as a unit. Complex tissues help in the transportation of organic material, water and minerals up and down the plant. That is why they are also known as conducting or vascular tissue. The common types of complex permanent tissue are xylem and phloem, which together form vascular bundles. 
Xylem serves as the chief conducting tissue of vascular plants. It is responsible for the conduction of water and mineral ions / salts. Xylem tissue is organized in a tube - like fashion along the main axes of stems and roots. It consists of a combination of parenchyma cells, fibers, vessels, tracheids, and ray cells. Longer tubes made up of individual cells are vessels (tracheae), and vessel members are open at each end. Internally, there may be bars of wall material extending across the open space. These cells are joined end to end to form long tubes. Vessel members and tracheids are dead at maturity. Tracheids have thick secondary cell walls and are tapered at the ends. They do not have end openings such as the vessels. The tracheid ends overlap with each other, with pairs of pits present. The pit pairs allow water to pass from cell to cell. Though most conduction in xylem tissue is vertical, lateral conduction along the diameter of a stem is facilitated via rays. Rays are horizontal rows of long - living parenchyma cells that arise out of the vascular cambium. In trees and other woody plants, rays radiate out from the center of stems and roots, and appear like spokes on a wheel in cross section. Rays, unlike vessel members and tracheids, are alive at functional maturity. Phloem is an equally important plant tissue, as it too is part of the ' plumbing system ' of a plant. Primarily, phloem carries dissolved food substances throughout the plant. This conduction system is composed of sieve - tube members and companion cells, which lack secondary walls. The parent cells of the vascular cambium produce both xylem and phloem. This usually also includes fibers, parenchyma and ray cells. Sieve tubes are formed from sieve - tube members laid end to end. The end walls, unlike those of vessel members in xylem, do not have openings. The end walls, however, are full of small pores where cytoplasm extends from cell to cell. 
These porous connections are called sieve plates. Although their cytoplasm is actively involved in the conduction of food materials, sieve - tube members do not have nuclei at maturity. It is the companion cells, nestled between the sieve - tube members, that function in some manner to bring about the conduction of food. Living sieve - tube members contain callose, a carbohydrate polymer, which forms the callus pad (callus), the colourless substance that covers the sieve plate. Callose stays in solution as long as the cell contents are under pressure. Phloem transports food and materials in plants upwards and downwards as required. Mineralized tissues are biological tissues that incorporate minerals into soft matrices. Such tissues may be found in both plants and animals, as well as algae. Typically these tissues form a protective shield against predation or provide structural support. The term "tissue '' was introduced in anatomy by Marie François Xavier Bichat in 1801. He argued that body functions would be better understood by taking the tissues, and not the organs, as the unit of study. Bichat distinguished 21 types of elementary tissues in the human body, a number later reduced by other authors.
what is the meaning of the song black and yellow by wiz khalifa
Black and Yellow - wikipedia "Black and Yellow '' is a song by American rapper Wiz Khalifa from his third studio album, Rolling Papers. It was released on September 14, 2010, as the lead single from the album. The song was written by Khalifa, along with Stargate, who produced it. It was released as a CD single in honor of Record Store Day. The song peaked at number one on the Billboard Hot 100, becoming Wiz Khalifa 's first number - one single in the US; he would top the chart again in 2015 with "See You Again ''. The song is about Khalifa 's car, a yellow Dodge Challenger Hemi with black stripes. He has stated that he got the car in those colors as a tribute to his hometown of Pittsburgh, Pennsylvania, whose official colors are black and gold, and its professional sports teams, most of whose colors are black and some variation of gold or yellow. The song itself does not mention Pittsburgh or sports, although the song 's music video made the connection to Pittsburgh explicit, showing various iconic locations in the city, as well as apparel associated with the football team the Pittsburgh Steelers and the baseball team the Pittsburgh Pirates. In the year after it was released, "Black and Yellow '' spawned dozens of remixes, parodies and remakes, both in the US and internationally, many of them made in tribute to a local sports team. At Super Bowl XLV in 2011, which featured the Steelers competing against the Green Bay Packers, the Steelers used "Black and Yellow '' as their fight song, while the Packers used a remix by Lil ' Wayne called "Green and Yellow '', marking the first time both teams at the Super Bowl had used the same song. On the issue dated October 2, 2010, "Black and Yellow '' debuted at No. 100 on the Billboard Hot 100. It then dropped out the following week and re-entered at No. 64 on the issue dated October 30, 2010. On its eighteenth charting week, the song rose to No. 1 on the issue dated February 19, 2011, selling 198,000 digital copies that week. 
"Black and Yellow '' has sold 4,144,922 digital copies since release. The music video was directed by Bill Paladino. It was filmed in Pittsburgh and features sights of the city, including the U.S. Steel Tower, BNY Mellon Center, PPG Place, William Penn Hotel, Citizens Bank Tower, Union Trust Building, One PNC Plaza, K&L Gates Center the Three Sisters and Smithfield Street bridges, Station Square, Shannon Hall of the Art Institute of Pittsburgh and the smoke stacks of the former U.S. Steel Homestead Works at The Waterfront adjacent to the city. The video also prominently features city icons such as the Terrible Towel, a rally towel for the Pittsburgh Steelers, and city sports in general, as well as Pittsburgh Pirates apparel. It was recorded in Chatsworth Avenue, Pittsburgh, PA. The song was featured in the multiple trailers for The Lego Batman Movie. On April 7, 2014, "Jimmy Kimmel Live! '' held a skit called "American Sign Language Rap Battle. '' Khalifa performed the song as three ASL - certified interpreters competed. The official remix, "Black & Yellow (G - Mix) '' features west - coast rapper Snoop Dogg, Juicy J and R&B singer T - Pain. Wiz Khalifa has a new verse on the track. The song leaked on December 12, 2010 but was officially released on December 16. A video for the remix was shot and was released on January 8, 2011. A remix by T - Pain, named "Black & Yellow (T - Mix) '', featured the verse that was later put on the official remix. Many remakes have been made, mostly in tribute to other sports teams, referring to their respective pair of colors. Some of the notable remakes include: In addition, a number of freestyle versions have been released, of artists rapping over "Black and Yellow '' 's instrumental track, without any overt reference to the song or to sports teams. 
These include versions by Crooked I (who included a chorus of "packing metal ''), Donnis, Young Jeezy, Tyga, Novi Novak, and Layzie Bone and Flesh - n - Bone of Bone Thugs - n - Harmony.
who is the actor that played it the clown
Tim Curry - wikipedia Timothy James Curry (born 19 April 1946) is an English actor, voice actor, comedian, and singer. He is known for his work in a diverse range of theatre, film, and television productions, often portraying villainous roles or character parts. Curry rose to prominence with his portrayal of Dr. Frank - N - Furter in The Rocky Horror Picture Show (1975), reprising the role he had originated in the 1973 London and 1974 Los Angeles stage productions of The Rocky Horror Show. His other stage work includes various roles in the original West End production of Hair, Wolfgang Amadeus Mozart in the 1980 Broadway production of Amadeus, the Pirate King in the 1982 West End production of The Pirates of Penzance, Alan Swann in the Broadway production of My Favourite Year and King Arthur in Broadway and West End productions of Spamalot from 2005 to 2007. Curry received further acclaim for his film and television roles, including as Rooster Hannigan in the film adaptation of Annie (1982), as Darkness in the fantasy film Legend (1985), as Wadsworth in the mystery comedy film Clue (1985), as Pennywise the Dancing Clown in the horror miniseries It (1990) and Long John Silver in Muppet Treasure Island (1996). Curry has also gained acclaim as a voice actor. His roles in animation include Captain Hook on the FOX series Peter Pan & the Pirates (1990 -- 1991), Hexxus in the fantasy film FernGully: The Last Rainforest (1992), Nigel Thornberry on the Nickelodeon series The Wild Thornberrys (1998 -- 2004) and Palpatine on Star Wars: The Clone Wars (2012 -- 2014). Curry was born in Grappenhall, Cheshire. His father, James Curry, a chaplain in the Royal Navy, died when Curry was 12. Curry 's mother, Patricia, a school secretary, died in June 1999 after living with cancer for two years. His older sister, Judith ("Judy ''), was a concert pianist who died of a brain tumour in 2001. 
Curry spent most of his childhood in Plymouth, Devon, but after his father 's death from pneumonia in 1958, his family moved to South London. Curry then went to boarding school, attending Kingswood School in Bath, Somerset. He developed into a talented boy soprano (treble). Deciding to concentrate on acting, Curry graduated from the University of Birmingham with a combined degree in English and Drama (BA Drama & Theatre Studies, 1968). Curry 's first full - time role was as part of the original London cast of the musical Hair in 1968, where he first met Richard O'Brien, who went on to write Curry 's next full - time role, that of Dr. Frank - N - Furter in The Rocky Horror Show (1973). Curry recalled his first encounter with the project: I 'd heard about the play because I lived on Paddington Street, off Baker Street, and there was an old gym a few doors away. I saw Richard O'Brien in the street, and he said he 'd just been to the gym to see if he could find a muscleman who could sing. I said, "Why do you need him to sing? '' (laughs) And he told me that his musical was going to be done, and I should talk to Jim Sharman. He gave me the script, and I thought, "Boy, if this works, it 's going to be a smash. '' Originally, Curry rehearsed the character with a German accent and peroxide blond hair, and later with an American accent. In March 2005, in an interview with Terry Gross of NPR 's Fresh Air, he explained that he decided to play Dr. Frank - N - Furter with an English accent after listening to an English woman say, "Do you have a house in town or a house in the country? '', deciding, "Yes, (Dr. Frank - N - Furter) should sound like the Queen. '' Curry originally thought the character was merely a laboratory doctor dressed in a white lab coat. 
However, at the suggestion of director Sharman, the character evolved into the diabolical mad scientist and transvestite with an upper - class Belgravia accent that carried over to The Rocky Horror Picture Show, a role that made Curry a household name and gave him a cult following. He continued to play the character in London, Los Angeles, and New York City until 1975. In an interview with NPR, Curry called Rocky Horror a "rite of passage '', and added that the film is "a guaranteed weekend party to which you can go with or without a date and probably find one if you do n't have one, and it 's also a chance for people to try on a few roles for size, you know? Figure out, help them maybe figure out their own sexuality. '' In 2016, Curry played The Criminologist in the television film remake of The Rocky Horror Picture Show. Shortly after the end of Rocky Horror 's run on Broadway, Curry returned to the stage with Tom Stoppard 's Travesties, which ran in London and New York from 1975 to 1976. Travesties was a Broadway hit, winning two Tony Awards (Best Performance by an Actor for John Wood and Best Play), as well as the New York Drama Critics Circle Award (Best Play), and Curry 's performance as the famous Dadaist Tristan Tzara received good reviews. In 1980, Curry formed part of the original cast of the Broadway show Amadeus, playing the title character, Wolfgang Amadeus Mozart. He was nominated for his first Tony Award (Best Performance by a Leading Actor in a Play) for this role but lost out to his co-star Ian McKellen, who played Antonio Salieri. In 1982, Curry took the part of the Pirate King in the Drury Lane production of Joe Papp 's version of The Pirates of Penzance opposite George Cole, earning enthusiastic reviews. In the mid-1980s, Curry performed in The Rivals and in several plays with the Royal National Theatre of Great Britain, including The Threepenny Opera, Dalliance and Love For Love. 
In 1988, Curry did the national tour of Me and My Girl in the lead role of Bill Snibson, a role originated on Broadway by Robert Lindsay and followed by Jim Dale. In 1989 - 90, Curry returned once again to the New York stage in The Art of Success. In 1993, Curry played Alan Swann in the Broadway musical version of My Favourite Year, earning him his second Tony Award nomination, this time for Best Performance by a Leading Actor in a Musical. In 2001, Curry appeared as Scrooge in the musical version of A Christmas Carol that played at Madison Square Garden. In 2004, Curry began his role of King Arthur in Spamalot in Chicago. The show successfully moved to Broadway in February 2005, selling more than $1 million worth of tickets in its first 24 hours, and brought him a third Tony nomination, again for Best Performance by a Leading Actor in a Musical. Curry reprised this role in London 's West End at the Palace Theatre, where Spamalot opened on 16 October 2006; his final performance came on 6 January 2007. He was nominated for a Laurence Olivier Award as Best Actor in a Musical for the role, and also won the Theatregoers ' Choice Award (getting 39 % of the votes cast by over 12,000 theatregoers) as Best Actor in a Musical. From May to August 2011, Curry was scheduled to portray the Player in a Trevor Nunn stage production of Tom Stoppard 's Rosencrantz and Guildenstern Are Dead at the Chichester Festival Theatre and then in London, but he withdrew from the production on 27 May, citing ill health. From 26 -- 29 April 2012, Curry appeared in Eric Idle 's play What About Dick? at the Orpheum Theatre in Los Angeles; he had originally appeared in the play back in 2007 when it was still a work in progress. Curry 's career in theatre was honoured on 7 June 2015 at the Actors Fund 's 19th annual Tony Awards Viewing Party, where he was awarded an Artistic Achievement Award. 
After The Rocky Horror Picture Show, Curry began to appear in many films in supporting roles, such as Robert Graves in the British horror film The Shout, Johnny LaGuardia in Times Square, Daniel Francis "Rooster '' Hannigan in Annie, a film based on the Broadway musical of the same name, and Jeremy Hancock in the political film The Ploughman 's Lunch. In 1985, Curry starred in the fantasy film Legend as the Lord of Darkness. Director Ridley Scott cast Curry in the film after watching him in Rocky Horror, thinking he was ideal to play the role of Darkness. It took five and a half hours to apply the makeup needed for Darkness onto Curry, and at the end of the day he would spend an hour in a bath in order to liquefy the soluble spirit gum. At one point, Curry got too impatient and claustrophobic and pulled the makeup off too quickly, tearing off his own skin in the process; Scott had to shoot around the actor for a week as a result. The same year, he appeared in the comedy mystery film Clue as Wadsworth the butler. After this, Curry began to be cast in more comedy roles throughout the late 1980s and ' 90s, such as Rev. Ray Porter in Pass the Ammo, Dr. Thornton Poole in Oscar, Mr. Hector in Home Alone 2: Lost in New York, Jigsaw in Loaded Weapon 1 and Long John Silver in Muppet Treasure Island. Although he featured mostly in comedies throughout the ' 90s, he did appear in some action films, such as the thriller The Hunt for Red October as Dr. Yevgeniy Petrov, the 1993 adaptation of The Three Musketeers as Cardinal Richelieu, the superhero film The Shadow as Farley Claymore, and the 1995 action adventure Congo as Herkermer Homolka. He also starred in the 1998 direct - to - video film Addams Family Reunion, playing the role of Gomez Addams. In the early 2000s, Curry was cast in the film adaptation of Charlie 's Angels in the role of Roger Corwin, and in the parody film Scary Movie 2 playing Professor Oldman. 
Curry then went on to play Thurman Rice, a supporting role in the biographical film Kinsey. In recent years, Curry has mostly performed in animated films; his most recent onscreen feature film role was in the British black comedy Burke & Hare, as Professor Alexander Monro. Curry started off his career with small roles in television series, such as Eugene in Napoleon and Love, and guest roles in Armchair Theatre and Play for Today. Curry also appeared in the "Dead Dog Records '' storyline of the television crime drama Wiseguy, as Winston Newquay. He also had recurring roles on the short - lived science fiction television series Earth 2 and the sitcom Rude Awakening, and guest starred on other series such as Roseanne, Tales from the Crypt (which earned him an Emmy Award nomination), The Tracey Ullman Show, Lexx, The Naked Truth, Monk, Will & Grace, Psych, Agatha Christie 's Poirot and Criminal Minds. Curry also performed in a large number of television films and miniseries, including Three Men in a Boat, the titular role in Will Shakespeare, the role of Bill Sikes in a television adaptation of Oliver Twist, a wizard in the Halloween television film adaptation of The Worst Witch, Titanic, Terry Pratchett 's The Colour of Magic, Alice, Return to Cranford and many more. Although Curry has appeared in numerous television series throughout his career, he has only had main roles in two: Over the Top, a sitcom that he also produced, and the revival series of Family Affair; both were cancelled after one season. One of Curry 's best - known television roles, and one of his best - known roles overall, is Pennywise the Clown in the 1990 horror miniseries Stephen King 's It. 
Aside from one Fangoria interview in 1990, Curry never publicly acknowledged his involvement in It until an interview with Moviefone in 2015, where he called the role of Pennywise "a wonderful part '' and gave his blessing to successor Will Poulter; Poulter was set to play the character in the reboot, although he ultimately dropped out. Bill Skarsgård replaced him, and while being interviewed at Fan Expo Canada, Curry gave his approval, saying that he liked Skarsgård very much. Curry voiced Taurus Bullba in Darkwing Duck for three episodes. He has also appeared in a large number of animated television series and films, starting with the performance of the Serpent in The Greatest Adventure: Stories from the Bible. Curry also portrayed Captain Hook in the Fox animated series Peter Pan & the Pirates, winning a Daytime Emmy Award for his performance. Another animated television role was in The Wild Thornberrys, where he played Nigel Thornberry. He had small roles in The Little Mermaid TV series and the 2014 Cartoon Network mini-series Over the Garden Wall, as Auntie Whispers. In 1988, Curry recorded the lead voice, as the castaway mouse Abelard Hassan DiChirico Flint, in Michael Sporn 's Emmy - nominated adaptation of William Steig 's novel for children, "Abel 's Island '', for Italtoons, now distributed by Random House. Curry was mainly known for antagonist roles in animated series, such as MAL in Captain Planet and the Planeteers, Skullmaster in Mighty Max, Dr Anton Sevarius in Gargoyles, George Herbert Walker ' King ' Chicken in Duckman, Lord Dragaunus in The Mighty Ducks, Henri Poupon and Charlene 's coat in Jim Henson 's Dinosaurs, Scarlet Fever and Nick O'Teen in Ozzy & Drix, Professor Finbar Calamitous in The Adventures of Jimmy Neutron: Boy Genius, Slagar the Cruel in Redwall, Doctor Morocco in Transformers: Rescue Bots, and G. Gordon Godfrey in Young Justice. He also became the voice of Palpatine in Star Wars: The Clone Wars upon the death of Ian Abercrombie. 
During the 1990s, Curry played the voice - only role of cyber-villain Kilokahn in DIC 's live - action series Superhuman Samurai Syber - Squad. Curry also appeared in a number of animated films, such as FernGully: The Last Rainforest, The Pebble and the Penguin, all three Rugrats films as side characters (excluding Rugrats Go Wild, where he reprised his role as Nigel Thornberry), Beauty and the Beast: The Enchanted Christmas, the US version of The First Snow of Winter (as Voley), Scooby - Doo! and the Witch 's Ghost, The Wild Thornberrys Movie, The Cat Returns, Valiant, Garfield: A Tail of Two Kitties, Fly Me to the Moon, and many more. Curry has also lent his voice to numerous video games, such as Gabriel Knight: Sins of the Fathers and Gabriel Knight 3: Blood of the Sacred, Blood of the Damned, in which he voiced the title character, Gabriel Knight, as well as Toonstruck, Sacrifice, Brütal Legend and Dragon Age: Origins. His audiobook work includes Lemony Snicket 's A Series of Unfortunate Events, Geraldine McCaughrean 's Peter Pan in Scarlet, Charles Dickens ' A Christmas Carol, Bram Stoker 's Dracula and the Abhorsen trilogy by Garth Nix. He also played Premier Anatoly Cherdenko in Command & Conquer: Red Alert 3. Aside from his performances on various soundtrack records, Curry has had some success as a solo musical artist. Curry received classical vocal training as a boy. He has mentioned that his musical influences included jazz vocalists such as Billie Holiday and Louis Armstrong, and that he idolised the Beatles and the Rolling Stones as a teenager. In 1978, A&M Records released Curry 's debut solo album Read My Lips. The album featured an eclectic range of songs (mostly covers) performed in diverse genres. Highlights of the album are a reggae version of the Beatles ' song "I Will '', a rendition of "Wake Nicodemus '' featuring the Pipes and Drums of the 48th Highlanders of Canada, and a bar - room ballad, "Alan '', composed by Canadian singer - songwriter Tony Kosinec. 
The following year, Curry released his second and most successful album Fearless. The LP was more rock - oriented than Read My Lips and mostly featured original songs rather than cover versions. The record included Curry 's only US charting songs: "I Do the Rock '' and "Paradise Garage ''. Curry 's third and final album, Simplicity, was released in 1981, again by A&M Records. This record, which did not sell as well as the previous offerings, combined both original songs and cover versions. The writing, production and musician roster for Curry 's solo albums included an impressive list of collaborators, including Bob Ezrin and David Sanborn. In 1989, A&M released The Best of Tim Curry on CD and cassette, featuring songs from his albums (including a live version of "Alan '') and a previously unreleased song, a live cover version of Bob Dylan 's "Simple Twist of Fate ''. Curry toured America with his band through the late 1970s and the first half of the 1980s. In 1990 he performed as the prosecutor in Roger Waters ' production of The Wall in Berlin. Although Curry 's first album was released in 1978, he had previously recorded a nine - track album for Lou Adler 's Ode Records in 1976. However, the album remained unreleased in its entirety until February 2010, when it was made available as a legal download entitled... From the Vaults (though four tracks from these sessions had been released on a 1990 Rocky Horror box set). The album, produced by Adler, included Curry 's rendition of the Supremes ' hit "Baby Love ''. Curry resides in Toluca Lake, California. He is agnostic. In June 2012, Curry suffered a major stroke. As a result of the stroke, he now uses a wheelchair.
who was known as the father of indian national congress
Indian National Congress - Wikipedia The Indian National Congress (INC, often called Congress) is a broad - based political party in India. Founded in 1885, it was the first modern nationalist movement to emerge in the British Empire in Asia and Africa. From the late 19th century, and especially after 1920, under the leadership of Mahatma Gandhi, Congress became the principal leader of the Indian independence movement, with over 15 million members and over 70 million participants. The Congress led India to independence from Great Britain, and powerfully influenced other anti-colonial nationalist movements in the British Empire. The Congress is a secular party whose social liberal platform is generally considered on the centre - left of Indian politics. The Congress ' social policy is based upon the Gandhian principle of Sarvodaya -- the lifting up of all sections of society -- which involves the improvement of the lives of economically underprivileged and socially marginalised people. The party primarily endorses social liberalism -- seeking to balance individual liberty and social justice -- and secularism -- asserting the right to be free from religious rule and teachings. After India 's independence in 1947, the Congress formed the government at the centre in most instances, as well as many regional state governments. Congress became India 's dominant political party; as of 2015, in the 15 general elections since independence, it has won an outright majority on six occasions and has led the ruling coalition a further four times, heading the central government for 49 years. There have been seven Congress Prime Ministers, the first being Jawaharlal Nehru (1947 -- 64) and the most recent Manmohan Singh (2004 -- 14). Although it did not fare well in the last general elections in India in 2014, it remains one of the two major nationwide political parties in India, along with the right - wing, Hindu nationalist Bharatiya Janata Party (BJP). 
In the 2014 general election, the Congress had its poorest post-independence general election performance, winning only 44 seats of the 543 - member Lok Sabha. From 2004 to 2014, the Congress - led United Progressive Alliance, a coalition of several regional parties, formed the Indian government and was headed by Prime Minister Manmohan Singh. As of July 2017, the party is in power in five states: Punjab, Himachal Pradesh, Karnataka, Meghalaya and Mizoram. The Congress has previously directly ruled Andhra Pradesh, Tamil Nadu, Gujarat, Madhya Pradesh, Rajasthan, Uttar Pradesh and Goa. The history of the Indian National Congress (INC) falls into two distinct eras. The Indian National Congress conducted its first session in Bombay from 28 -- 31 December 1885 at the initiative of retired civil service officer Allan Octavian Hume. In 1883, Hume had outlined his idea for a body representing Indian interests in an open letter to graduates of the University of Calcutta. Its aim was to obtain a greater share in government for educated Indians, and to create a platform for civic and political dialogue between them and the British Raj. Hume took the initiative, and in March 1885 a notice was issued convening the first meeting of the Indian National Union, to be held in Poona the following December. Due to a cholera outbreak there, it was moved to Bombay; Hume organised the first meeting in Bombay with the approval of the Viceroy, Lord Dufferin. Womesh Chandra Bonnerjee was the first president of the Congress; the first session was attended by 72 delegates. Representing each province of India, the delegates comprised 54 Hindus and two Muslims; the rest were of Parsi and Jain backgrounds. Notable representatives included Scottish ICS officer William Wedderburn, Dadabhai Naoroji, Pherozeshah Mehta of the Bombay Presidency Association, Ganesh Vasudeo Joshi of the Poona Sarvajanik Sabha, social reformer and newspaper editor Gopal Ganesh Agarkar, Justice K.T. Telang, N.G. 
Chandavarkar, Dinshaw Wacha, Behramji Malabari, journalist and activist Gooty Kesava Pillai, and P. Rangaiah Naidu of the Madras Mahajana Sabha. Within the next few years, the demands of the Congress became more radical in the face of constant opposition from the British government, and the party decided to advocate in favour of the independence movement because it would allow a new political system in which the Congress could be a major party. By 1905, a division had opened between the moderates led by Gokhale, who downplayed public agitation, and the new "extremists '' who advocated agitation and regarded the pursuit of social reform as a distraction from nationalism. Bal Gangadhar Tilak, who tried to mobilise Hindu Indians by appealing to an explicitly Hindu political identity displayed in the annual public Ganapati festivals he inaugurated in western India, was prominent among the extremists. The Congress included a number of prominent political figures. Dadabhai Naoroji, a member of the sister Indian National Association, was elected president of the party in 1886 and was the first Indian Member of Parliament in the British House of Commons (1892 -- 95). It also included Bal Gangadhar Tilak, Bipin Chandra Pal, Lala Lajpat Rai, Gopal Krishna Gokhale and Mohammed Ali Jinnah -- later leader of the Muslim League and instrumental in the creation of Pakistan. The Congress was transformed into a mass movement by Surendranath Banerjea during the partition of Bengal in 1905 and the resultant Swadeshi movement. Mahatma Gandhi returned from South Africa in 1915. With the help of the moderate group led by Gokhale, Gandhi became president of the Congress. After the First World War, the party became associated with Gandhi, who remained its unofficial spiritual leader and icon. He formed an alliance with the Khilafat Movement in 1920 to fight for the preservation of the Ottoman Caliphate, and for the rights of Indians, using civil disobedience or satyagraha as the tool for agitation. 
In 1922, after the deaths of policemen at Chauri Chaura, Gandhi suspended the agitation. In protest, a number of leaders, including Chittaranjan Das, Annie Besant, and Motilal Nehru, resigned to set up the Swaraj Party. The Khilafat movement collapsed and the Congress was split. The rise of Gandhi 's popularity and his satyagraha art of revolution led to support from Sardar Vallabhbhai Patel, Pandit Jawaharlal Nehru, Dr. Rajendra Prasad, Khan Mohammad Abbas Khan, Khan Abdul Ghaffar Khan, Chakravarti Rajgopalachari, Dr. Anugraha Narayan Sinha, Jayaprakash Narayan, Jivatram Kripalani, and Maulana Abul Kalam Azad. As a result of prevailing nationalism, Gandhi 's popularity, and policies aimed at eradicating caste differences, untouchability, poverty, and religious and ethnic divisions, the Congress became a forceful and dominant group. Although its members were predominantly Hindu, it had members from other religions, economic classes, and ethnic and linguistic groups. At the Congress ' 1929 Lahore session, under the presidency of Jawaharlal Nehru, Purna Swaraj (complete independence) was declared the party 's goal, with 26 January 1930 declared "Purna Swaraj Diwas '', Independence Day. The same year, Srinivasa Iyengar was expelled from the party for demanding full independence, not just home rule as demanded by Gandhi. After the passage of the Government of India Act 1935, provincial elections were held in India in the winter of 1936 -- 37 in eleven provinces: Madras, Central Provinces, Bihar, Orissa, United Provinces, Bombay Presidency, Assam, NWFP, Bengal, Punjab and Sindh. After contesting these elections, the Indian National Congress gained power in eight of them; the exceptions were Bengal, Punjab, and Sindh. The All - India Muslim League failed to form the government in any province. 
The Congress ministries resigned in October and November 1939 in protest against Viceroy Lord Linlithgow 's declaration that India was a belligerent in the Second World War without consulting the Indian people. In 1939, Subhas Chandra Bose, the elected president in both 1938 and 1939, resigned from the Congress over the selection of the working committee. The party was not the sole representative of the Indian polity; other parties included the Hindu Mahasabha and the Forward Bloc. The party was an umbrella organisation, sheltering radical socialists, traditionalists, and Hindu and Muslim conservatives. Gandhi expelled all the socialist groupings, including the Congress Socialist Party, the Krishak Praja Party, and the Swarajya Party, along with Subhas Chandra Bose, in 1939. Azad Hind, an Indian provisional government, had been established in Singapore in 1943 and was supported by Japan. In 1946, the British tried the Indian soldiers who had fought alongside the Japanese during World War II in the INA trials. In response, the Congress helped form the INA Defence Committee, which assembled a legal team to defend the case of the soldiers of the Azad Hind government. The team included several famous lawyers, including Bhulabhai Desai, Asaf Ali, and Jawaharlal Nehru. The same year, Congress members initially supported the sailors who led the Royal Indian Navy mutiny, but they withdrew support at a critical juncture and the mutiny failed. After Indian independence in 1947, the Indian National Congress became the dominant political party in the country. In 1952, in the first general election held after independence, the party swept to power in the national parliament and most state legislatures. It held power nationally until 1977, when it was defeated by the Janata coalition. It returned to power in 1980 and ruled until 1989, when it was once again defeated. 
The party formed the government in 1991 at the head of a coalition, as well as in 2004 and 2009, when it led the United Progressive Alliance. During this period the Congress remained centre-left in its social policies while steadily shifting from a socialist to a neoliberal economic outlook. The party's rivals at state level have been national parties, including the Bharatiya Janata Party (BJP) and the Communist Party of India (Marxist) (CPM), and various regional parties such as the Telugu Desam Party. A post-partition successor to the party survived as the Pakistan National Congress, a party which represented the rights of religious minorities in the state. The party's support was strongest in the Bengali-speaking province of East Pakistan. After the Bangladesh War of Independence, it became known as the Bangladeshi National Congress, but was dissolved in 1975 by the government. From 1951 until his death in 1964, Jawaharlal Nehru, who had risen to prominence under the tutelage of Mahatma Gandhi during the independence movement, was the Congress' paramount leader. The Congress gained power in landslide victories in the general elections of 1951-52, 1957, and 1962. During his tenure, Nehru implemented policies based on import substitution industrialisation, and advocated a mixed economy in which the government-controlled public sector co-existed with the private sector. He believed the establishment of basic and heavy industries was fundamental to the development and modernisation of the Indian economy. The Nehru government directed investment primarily into key public sector industries -- steel, iron, coal, and power -- promoting their development with subsidies and protectionist policies. Nehru embraced secularism, socialistic economic practices based on state-driven industrialisation, and a non-aligned and non-confrontational foreign policy that became typical of the modern Congress Party.
The policy of non-alignment during the Cold War meant Nehru received financial and technical support from both the Eastern and Western Blocs to build India 's industrial base from nothing. During his period in office, there were four known assassination attempts on Nehru. The first attempt on his life was during partition in 1947 while he was visiting the North - West Frontier Province in a car. The second was by a knife - wielding rickshaw - puller in Maharashtra in 1955. A third attempt happened in Bombay in 1956. The fourth was a failed bombing attempt on railway tracks in Maharashtra in 1961. Despite threats to his life, Nehru despised having excess security personnel around him and did not like his movements to disrupt traffic. In 1964, Nehru died because of an aortic dissection, raising questions about the party 's future. After his death, K. Kamaraj became the president of the All India Congress Committee. Kamaraj had also been involved in the Indian independence movement, and he introduced education to millions of the rural poor by providing free education along with a free midday meal, when he was chief minister of Tamil Nadu (1954 -- 63). As a member of "the syndicate '', a group within the Congress, he proposed the Kamaraj Plan that encouraged six Congress chief ministers and six senior cabinet ministers to resign to take up party work. Kamaraj was widely credited as the "kingmaker '' in Indian politics for bringing Lal Bahadur Shastri to power in 1964. No leader except Shastri had Nehru 's popular appeal. Shastri became a national hero following the victory in the Indo - Pakistani War of 1965. His slogan, "Jai Jawan Jai Kisan '' ("Hail the soldier, Hail the farmer ''), became very popular during the war. Shastri retained many members of Nehru 's Council of Ministers; T.T. Krishnamachari was retained as the Finance Minister of India, as was Defence Minister Yashwantrao Chavan. Shastri appointed Swaran Singh to succeed him as External Affairs Minister. 
Shastri appointed Indira Gandhi, Jawaharlal Nehru's daughter and former party president, Minister of Information and Broadcasting. Gulzarilal Nanda continued as the Minister of Home Affairs. As Prime Minister, Shastri continued Nehru's policy of non-alignment, but built closer relations with the Soviet Union. In the aftermath of the Sino-Indian War of 1962, and the formation of military ties between China and Pakistan, Shastri's government expanded the defence budget of India's armed forces. He also promoted the White Revolution -- a national campaign to increase the production and supply of milk -- by creating the National Dairy Development Board. The Madras anti-Hindi agitation of 1965 occurred during Shastri's tenure. On 11 January 1966, a day after signing the Tashkent Declaration, Shastri died in Tashkent, reportedly of a heart attack, but the circumstances of his death remain mysterious. After Shastri's death, the Congress elected Indira Gandhi as leader over Morarji Desai. Once again, politician K. Kamaraj was instrumental in achieving this result. In 1967, following a poor performance in the general election, Indira Gandhi started moving towards the political left. In 1969, she was in a dispute with senior party leaders on a number of issues, and the party president S. Nijalingappa expelled her from the Congress. Gandhi launched her own faction of the Congress, the Congress (R), retaining the support of most Congress MPs, while 65 sided with the original party. In the mid-term parliamentary elections held in 1971, the Gandhi-led Congress (R) won a landslide victory on a platform of progressive policies such as the elimination of poverty (Garibi Hatao). The policies of the Congress (R) under Gandhi before the 1971 elections included proposals to abolish the Privy Purse paid to former rulers of the princely states and the 1969 nationalisation of India's 14 largest banks. The New Congress Party's popular support began to wane in the mid-1970s.
From 1975, Gandhi's government grew increasingly authoritarian and unrest among the opposition grew. On 12 June 1975, the High Court of Allahabad declared Indira Gandhi's election to the Lok Sabha, the lower house of India's parliament, void on the grounds of electoral malpractice. However, Gandhi rejected calls to resign and announced plans to appeal to the Supreme Court. She moved to restore order by ordering the arrest of most of the opposition participating in the unrest. In response to increasing disorder and lawlessness, Gandhi's cabinet and government recommended that President Fakhruddin Ali Ahmed declare a State of Emergency, which he declared on 25 June 1975 based on the provisions of Article 352 of the Constitution. During the nineteen-month Emergency, widespread oppression and abuse of power by Gandhi's unelected younger son and political heir Sanjay Gandhi and his close associates occurred. This period of oppression ended on 23 January 1977, when Gandhi released all political prisoners and called fresh elections for the Lok Sabha to be held in March. The Emergency officially ended on 23 March 1977. In that month's parliamentary elections, the opposition Janata Party won a landslide victory over the Congress, winning 295 seats in the Lok Sabha against the Congress' 153. Gandhi lost her seat to her Janata opponent Raj Narain. On 2 January 1978, she and her followers seceded and formed a new opposition party, popularly called Congress (I) -- the I signifying Indira. During the next year, her new party attracted enough members of the legislature to become the official opposition. In November 1978, Gandhi regained a parliamentary seat. In January 1980, following a landslide victory for the Congress (I), she was again elected prime minister. The national election commission declared Congress (I) to be the real Indian National Congress for the 1984 general election, and the designation I was dropped.
During Gandhi 's new term as prime minister, her youngest son Sanjay died in an aeroplane crash in June 1980. This led her to encourage her elder son Rajiv, who was working as a pilot, to enter politics. Gradually, Indira Gandhi 's politics and outlook grew more authoritarian and autocratic, and she became the central figure of the Congress. As prime minister, she became known for her political ruthlessness and unprecedented centralisation of power. Gandhi 's term as prime minister also saw increasing turmoil in Punjab with demands for Sikh autonomy by Jarnail Singh Bhindranwale and his militant followers. In 1983, they headquartered themselves in the Golden Temple in Amritsar and started accumulating weapons. In June 1984, after several futile negotiations, Gandhi ordered the Indian Army to enter the Golden Temple to establish control over the temple complex and remove Bhindranwale and his armed followers. This event is known as Operation Blue Star. On 31 October 1984, two of Gandhi 's bodyguards, Satwant Singh and Beant Singh, shot her with their service weapons in the garden of the prime minister 's residence in response to her authorisation of Operation Blue Star. Gandhi was due to be interviewed by British actor Peter Ustinov, who was filming a documentary for Irish television. Her assassination prompted the 1984 anti-Sikh riots, during which more than 3,000 people were killed. In 1984, Indira Gandhi 's son Rajiv Gandhi became nominal head of the Congress and became prime minister upon her assassination. In December, he led the Congress to a landslide victory, where it secured 401 seats in the legislature. His administration took measures to reform the government bureaucracy and liberalise the country 's economy. Rajiv Gandhi 's attempts to discourage separatist movements in Punjab and Kashmir backfired. After his government became embroiled in several financial scandals, his leadership became increasingly ineffectual. 
Gandhi was regarded as a non-abrasive person who consulted other party members and refrained from hasty decisions. The Bofors scandal damaged his reputation as an honest politician, but he was posthumously cleared of bribery allegations in 2004. On 21 May 1991, Gandhi was killed by a bomb concealed in a basket of flowers carried by a woman associated with the Tamil Tigers. He was campaigning in Tamil Nadu for upcoming parliamentary elections. In 1998, an Indian court convicted 26 people in the conspiracy to assassinate Gandhi. The conspirators, who consisted of Tamil militants from Sri Lanka and their Indian allies, had sought revenge against Gandhi because the Indian troops he sent to Sri Lanka in 1987 to help enforce a peace accord there had fought with Tamil separatist guerrillas. Rajiv Gandhi was succeeded as party leader by P.V. Narasimha Rao, who was elected prime minister in June 1991. His rise to the prime ministership was politically significant because he was the first holder of the office from South India. His administration oversaw major economic change and several domestic incidents that affected India's national security. Rao, who held the Industries portfolio, was personally responsible for the dismantling of the Licence Raj, which came under the purview of the Ministry of Commerce and Industry. He is often called the "father of Indian economic reforms". Future prime ministers Atal Bihari Vajpayee and Manmohan Singh continued the economic reform policies begun by Rao's government. Rao accelerated the dismantling of the Licence Raj, reversing the socialist policies of previous governments. He employed Manmohan Singh as his finance minister to begin a historic economic change. With Rao's mandate, Singh launched India's globalisation reforms that involved implementing International Monetary Fund (IMF) policies to prevent India's impending economic collapse.
Rao was also referred to as Chanakya for his ability to push tough economic and political legislation through the parliament while he headed a minority government. By 1996, the party's image was suffering from allegations of corruption, and in elections that year the Congress was reduced to 140 seats, its lowest number in the Lok Sabha to that point. Rao later resigned as prime minister and, in September, as party president. He was succeeded as president by Sitaram Kesri, the party's first non-Brahmin leader. In the 1998 general election, the Congress won 141 seats in the Lok Sabha. To boost its popularity and improve its performance in the forthcoming election, Congress leaders urged Sonia Gandhi, Rajiv Gandhi's widow, to assume the leadership of the party. She had previously declined offers to become actively involved in party affairs, and had stayed away from politics. After her election as party leader, a section of the party that objected to the choice because of her Italian ethnicity broke away and formed the Nationalist Congress Party (NCP), led by Sharad Pawar. The breakaway faction commanded strong support in the state of Maharashtra and limited support elsewhere. The remainder continued to be known as the Indian National Congress. Sonia Gandhi's appointment initially failed to have an impact; in the snap polls called by the National Democratic Alliance (NDA) government in 1999, the Congress won 114 seats -- its lowest tally ever. The leadership structure was unaltered and the party campaigned strongly in the assembly elections that followed. At these elections the party was successful; at one point, the Congress ruled 15 states. In the 2004 general election, the Congress forged an alliance with several regional parties, including the NCP and the Dravida Munnetra Kazhagam.
The party's campaign emphasised social inclusion and the welfare of common people, contrasting with the NDA's "India Shining" campaign that sought to highlight the successes of the NDA government in making India into a "modern nation". The Congress-led United Progressive Alliance (UPA) won 222 seats in the new parliament, defeating the NDA by a substantial margin. With the support of the communist front, the Congress won a majority and formed the new government. Despite massive support from within the party, Gandhi declined the post of prime minister, choosing to appoint Manmohan Singh instead. She remained as party president and headed the National Advisory Council (NAC). During its first term in office, the UPA government passed several social reform bills. These included an employment guarantee bill, the Right to Information Act, and a right to education act. The NAC, as well as the Left Front that supported the government from the outside, were widely seen as the driving force behind such legislation. The Left Front withdrew its support of the government over disagreements about the U.S.-India Civil Nuclear Agreement. Despite the effective loss of 62 seats in parliament, the government survived the trust vote that followed. In the 2009 Lok Sabha elections, the Congress won 207 seats, the highest tally of any party since 1991. The UPA as a whole won 262 seats, enabling it to form the government for the second time. The social welfare policies of the first UPA government, and the perceived divisiveness of the BJP, are broadly credited for the victory. By the 2014 Lok Sabha elections, the party had lost much of its popular support, mainly because of several years of poor economic conditions in the country, and growing discontent over a series of corruption allegations involving government officials, including the 2G spectrum scam and the Indian coal allocation scam.
The Congress won only 44 seats, which was its worst-ever performance in a national election and brought into question whether it would continue to be identified as an officially recognised party. As of 2014, the election symbol of the Congress, as approved by the Election Commission of India, is an image of a right hand with its palm facing front and its fingers pressed together; this is usually shown in the centre of a tricolour flag. The hand symbol was first used by Indira Gandhi when she split from the Congress (R) faction following the 1977 elections and created the New Congress (I). The symbol of the original Congress during elections held between 1952 and 1971 was an image of two bullocks with a plough. The symbol of Indira's Congress (R) during the period 1971-77 was a cow with a suckling calf. The Congress is structured in a hierarchical manner, and the organisational structure created by Mohandas Gandhi's re-arrangement of the party between 1918 and 1920 has been largely retained. A president and the All India Congress Committee (AICC) are elected by delegates from state and district parties at an annual national conference. In every Indian state and union territory -- or pradesh -- there is a Pradesh Congress Committee (PCC), which is the state-level unit of the party, responsible for directing political campaigns at local and state levels and assisting the campaigns for parliamentary constituencies. Each PCC has a working committee of twenty members, most of whom are appointed by the party president, the leader of the state party, who is chosen by the prime minister. Those elected as members of the states' legislative assemblies form the Congress Legislature Parties in the various state assemblies; their chairperson is usually the party's nominee for the chief ministership. The party is also organised into various committees and sections; it publishes a daily newspaper, the National Herald.
Despite having a formal structure, the Congress under Indira Gandhi did not hold any organisational elections after 1972. The AICC is composed of delegates sent from the PCCs. The delegates elect Congress committees, including the Congress Working Committee, consisting of senior party leaders and office bearers. The AICC takes all important executive and political decisions. Since Indira Gandhi formed the Congress (I) in 1978, the President of the Indian National Congress has effectively been the party's national leader, head of the organisation, head of the Working Committee and all chief Congress committees, chief spokesman, and the Congress' choice for Prime Minister of India. Constitutionally, the president is elected by the PCCs and members of the AICC; however, this procedure has often been by-passed by the Working Committee, which has elected its own candidate. The Congress Parliamentary Party (CPP) consists of elected MPs in the Lok Sabha and Rajya Sabha. There is also a Congress Legislative Party (CLP) leader in each state. The CLP consists of all Congress Members of the Legislative Assembly (MLAs) in each state. In states where the Congress is single-handedly ruling the government, the CLP leader is the Chief Minister. Other directly affiliated groups include the National Students Union of India (NSUI), the Indian Youth Congress (the party's youth wing), the Indian National Trade Union Congress, the Mahila Congress (its women's division), and the Congress Seva Dal (its voluntary organisation). The Congress is a civic nationalist party that follows a form of nationalism that supports the values of freedom, tolerance, equality, and individual rights. Throughout much of the Cold War period, the Congress supported a foreign policy of non-alignment that called for India to form ties with both the Western and Eastern blocs, but to avoid formal alliances with either.
American support for Pakistan led the party to endorse a friendship treaty with the Soviet Union in 1971. In 2004, when the Congress-led United Progressive Alliance came to power, its chairperson Sonia Gandhi unexpectedly relinquished the premiership to Manmohan Singh. This Singh-led "UPA I" government executed several key pieces of legislation and projects, including the Rural Health Mission, the Unique Identification Authority, the Rural Employment Guarantee scheme, and the Right to Information Act. The Congress endorses a mixed economy in which the private sector and the state direct the economy, reflecting characteristics of both market economies and planned economies. The modern Congress advocates import substitution industrialisation -- the replacement of foreign imports with domestic products. The party also believes mixed economies are likely to protect the environment, standardise the welfare system, and maintain employment standards and competition. The Congress also believes the Indian economy should be liberalised to increase the pace of development. In 2005, Prime Minister Manmohan Singh introduced a value added tax, which replaced the sales tax, and continued the Golden Quadrilateral and the highway modernisation programme initiated by Vajpayee's government. In 2009, India achieved its highest GDP growth rate of 9%, becoming the second-fastest growing major economy in the world. In 2005, the Congress-led government started the National Rural Health Mission, which employed about 500,000 community health workers. It was praised by American economist Jeffrey Sachs. In 2006, it implemented a proposal to reserve 27% of seats in the All India Institute of Medical Sciences (AIIMS), the Indian Institutes of Technology (IITs), the Indian Institutes of Management (IIMs), and other central higher education institutions for Other Backward Classes, which led to the 2006 Indian anti-reservation protests.
The Singh government also continued the Sarva Shiksha Abhiyan programme, which includes the introduction and improvement of mid-day school meals and the opening of new schools throughout India, especially in rural areas, to fight illiteracy. During Manmohan Singh's prime-ministership, eight Institutes of Technology were opened in the states of Andhra Pradesh, Bihar, Gujarat, Orissa, Punjab, Madhya Pradesh, Rajasthan, and Himachal Pradesh. The Congress has strengthened anti-terrorism laws with amendments to the Unlawful Activities (Prevention) Act (UAPA). The National Investigation Agency (NIA) was created by the UPA government soon after the November 2008 Mumbai terror attacks, in response to the need for a central agency to combat terrorism. The Unique Identification Authority of India was established in February 2009 to implement the proposed Multipurpose National Identity Card, with the objective of increasing national security. The Congress has continued the foreign policy started by P. V. Narasimha Rao. This includes the peace process with Pakistan and the exchange of high-level visits by leaders from both countries. The party has tried to end the border dispute with the People's Republic of China through negotiations. Relations with Afghanistan have also been a concern of the party. During Afghan President Hamid Karzai's visit to New Delhi in August 2008, Manmohan Singh increased the aid package to Afghanistan for the development of schools, health clinics, infrastructure, and defence. India is now one of the largest aid donors to Afghanistan. When in power between 2004 and 2014, the Congress worked on India's relationship with the United States. Prime Minister Manmohan Singh visited the US in July 2005 to negotiate an Indo-US civilian nuclear agreement. US President George W.
Bush visited India in March 2006; during this visit, a nuclear agreement that would give India access to American nuclear fuel and technology in exchange for IAEA inspection of its civil nuclear reactors was proposed. After two years of negotiations, followed by approval from the IAEA, the Nuclear Suppliers Group, and the US Congress, the agreement was signed on 10 October 2008. The Congress' policy has been to cultivate friendly relations with Japan and with European Union countries including the United Kingdom, France, and Germany. Diplomatic relations with Iran have continued, and negotiations over the Iran-Pakistan-India gas pipeline have taken place. In April 2006, New Delhi hosted an India-Africa summit attended by the leaders of 15 African states. Congress' policy has also been to improve relations with other developing countries, particularly Brazil and South Africa. As of July 2017, the Congress is in power in the states of Punjab, Himachal Pradesh, Karnataka, Meghalaya, and Mizoram, where it has majority support. In Puducherry, it shares power with alliance partners. Previously, the Congress governed Andhra Pradesh, Tamil Nadu, Gujarat, Kerala, Madhya Pradesh, Rajasthan, Uttarakhand, and Manipur. A majority of non-Congress prime ministers of India are former Congress members.
who is running for senate in michigan 2018
United States Senate election in Michigan, 2018 - wikipedia The 2018 United States Senate election in Michigan will take place on November 6, 2018, in order to elect a Class 1 member of the United States Senate to represent the state of Michigan. Incumbent Democratic U.S. Senator Debbie Stabenow is running for re-election to a fourth term. The candidate filing deadline was April 24, 2018, and the primary election will be held on August 7, 2018.
the satsuma is named after a province in which country
Satsuma province - wikipedia Satsuma Province (薩摩 国, Satsuma - no Kuni) was an old province of Japan that is now the western half of Kagoshima Prefecture on the island of Kyūshū. Its abbreviation is Sasshū (薩州). Satsuma 's provincial capital was Satsumasendai. During the Sengoku period, Satsuma was a fief of the Shimazu daimyō, who ruled much of southern Kyūshū from their castle at Kagoshima city. They were the initial patrons of Satsuma ware, which was later widely exported to the West. In 1871, with the abolition of feudal domains and the establishment of prefectures after the Meiji Restoration, the provinces of Satsuma and Ōsumi were combined to eventually establish Kagoshima Prefecture. Satsuma was one of the main provinces that rose in opposition to the Tokugawa shogunate in the mid 19th century. Because of this, the oligarchy that came into power after the Meiji Restoration of 1868 had a strong representation from the Satsuma province, with leaders such as Ōkubo Toshimichi and Saigō Takamori taking up key government positions. Satsuma is well known for its production of sweet potatoes, known in Japan as 薩摩芋 (satsuma - imo or "Satsuma potato ''). On the other hand, Satsuma mandarins (known as mikan in Japan) do not specifically originate from Satsuma but were imported into the West through this province in the Meiji era.
in limit state of serviceability the partial safety factor for wind load is taken as
Limit state design - wikipedia Limit state design (LSD), also known as load and resistance factor design (LRFD), refers to a design method used in structural engineering. A limit state is a condition of a structure beyond which it no longer fulfills the relevant design criteria. The condition may refer to a degree of loading or other actions on the structure, while the criteria refer to structural integrity, fitness for use, durability, or other design requirements. A structure designed by LSD is proportioned to sustain all actions likely to occur during its design life, and to remain fit for use, with an appropriate level of reliability for each limit state. Building codes based on LSD implicitly define the appropriate levels of reliability by their prescriptions. The method of limit state design, developed in the USSR and based on research led by Professor N.S. Streletski, was introduced in USSR building regulations in 1955. Limit state design requires the structure to satisfy two principal criteria: the ultimate limit state (ULS) and the serviceability limit state (SLS). Any design process involves a number of assumptions. The loads to which a structure will be subjected must be estimated, sizes of members to check must be chosen, and design criteria must be selected. All engineering design criteria have a common goal: that of ensuring a safe structure and ensuring the functionality of the structure. A clear distinction is made between the ultimate state (US) and the ultimate limit state (ULS). The US is a physical situation that involves either excessive deformations leading to and approaching collapse of the component under consideration or the structure as a whole, as relevant, or deformations exceeding pre-agreed values. It involves considerable inelastic (plastic) behaviour of the structural scheme and residual deformations.
The ULS, by contrast, is not a physical situation but rather an agreed computational condition that must be fulfilled, among other additional criteria, in order to comply with the engineering demands for strength and stability under design loads. The ULS condition is computationally checked at a certain point along the behaviour function of the structural scheme, located at the upper part of its elastic zone, approximately 15% below the elastic limit. That means that the ULS is a purely elastic condition, located on the behaviour function far below the real ultimate point, which is located deep within the plastic zone. The rationale for choosing the ULS at the upper part of the elastic zone is that as long as the ULS design criteria are fulfilled, the structure will behave in the same way under repetitive loadings; as long as it does, the level of safety and reliability assumed as the basis for the design is properly maintained and justified (following the probabilistic safety approach). A structure is deemed to satisfy the ultimate limit state criterion if all factored bending, shear, and tensile or compressive stresses are below the factored resistances calculated for the section under consideration. The factored stresses referred to are found by applying magnification factors to the loads on the section. Reduction factors are applied to determine the various factored resistances of the section. The limit state criteria can also be set in terms of load rather than stress: using this approach, the structural element being analysed (i.e. a beam, a column, or another load-bearing element such as a wall) is shown to be safe when the "magnified" loads are less than the relevant "reduced" resistances. Complying with the design criteria of the ULS is considered the minimum requirement (among other additional demands) to provide proper structural safety. The serviceability limit states include: 1) the limit state of deflection; 2) the limit state of cracking;
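The load-versus-resistance form of the ULS criterion described above can be sketched as a small check, assuming it is expressed in terms of a single internal force (here a bending moment); the function name, factor values, and units are illustrative assumptions, not taken from any particular design code:

```python
def uls_ok(load_effects, load_factors, resistance, reduction_factor):
    """Ultimate limit state check in load terms: the sum of the
    "magnified" (factored) load effects must not exceed the
    "reduced" (factored) resistance of the section."""
    factored_demand = sum(f * s for f, s in zip(load_factors, load_effects))
    return factored_demand <= reduction_factor * resistance

# Hypothetical beam section: dead-load moment 100 kN*m (factor 1.4),
# live-load moment 50 kN*m (factor 1.6), nominal moment resistance
# 250 kN*m, resistance reduction factor 0.9.
print(uls_ok([100.0, 50.0], [1.4, 1.6], 250.0, 0.9))  # 220 <= 225, so True
```

The same comparison could equally be written in stress terms, as the text notes; only the quantities being factored change.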
3) the limit state of vibration. In addition to the ULS check mentioned above, a serviceability limit state (SLS) computational check must be performed. As with the ULS, the SLS is not a physical situation but rather a computational check. The aim is to prove that under the action of characteristic design loads (un-factored), and/or whilst applying certain (un-factored) magnitudes of imposed deformations, settlements, vibrations, or temperature gradients, etc., the structural behaviour complies with, and does not exceed, the SLS design criteria values specified in the relevant standard in force. These criteria involve various stress limits, deformation limits (deflections, rotations, and curvature), flexibility (or rigidity) limits, and dynamic behaviour limits, as well as crack control requirements (crack width) and other arrangements concerned with the durability of the structure, the level of everyday service and human comfort achieved, and its ability to fulfil its everyday functions. In view of non-structural issues, it might also involve limits applied to acoustics and heat transmission that might affect the structural design. To satisfy the serviceability limit state criterion, a structure must remain functional for its intended use subject to routine (read: everyday) loading, and as such the structure must not cause occupant discomfort under routine conditions. This calculation check is performed at a point located at the lower half of the elastic zone, where characteristic (un-factored) actions are applied and the structural behaviour is purely elastic. The load and resistance factors are determined using statistics and a pre-selected probability of failure. Variability in the quality of construction and consistency of the construction material are accounted for in the factors. Generally, a factor of unity (one) or less is applied to the resistances of the material, and a factor of unity or greater to the loads.
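A deflection-type SLS check of the kind described above can be sketched in the same spirit; note that the span/250 limit below is an assumed example value, since the actual allowable depends on whichever standard governs the design:

```python
def sls_deflection_ok(deflection_mm, span_mm, limit_ratio=250):
    """Serviceability limit state check for deflection: the deflection
    computed under characteristic (un-factored) loads must not exceed
    an allowable value, here taken as span / limit_ratio (assumed)."""
    allowable = span_mm / limit_ratio
    return deflection_mm <= allowable

# Hypothetical 6 m beam with a 20 mm computed mid-span deflection:
print(sls_deflection_ok(20.0, 6000.0))  # allowable is 24 mm, so True
```

Crack-width and vibration checks follow the same pattern: an un-factored response compared against a code-specified limit.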
Less commonly, a load factor may be less than unity because of a reduced probability of the loads acting in combination. These factors can differ significantly between materials, or even between differing grades of the same material. Wood and masonry typically have smaller factors than concrete, which in turn has smaller factors than steel. The factors applied to resistance also account for the degree of scientific confidence in the derivation of the values (i.e., smaller values are used when there isn't much research on the specific type of failure mode). Factors associated with loads are normally independent of the type of material involved, but can be influenced by the type of construction. In determining the specific magnitudes of the factors, more deterministic loads, such as dead loads (the weight of the structure and permanent attachments like walls, floor treatments, and ceiling finishes), are given lower factors (for example 1.4) than highly variable loads such as earthquake, wind, or live (occupancy) loads (1.6). Impact loads are typically given higher factors still (say 2.0) to account for both their unpredictable magnitudes and the dynamic nature of the loading, as opposed to the static nature of most models. While arguably not philosophically superior to permissible (allowable) stress design, limit state design has the potential to produce a more consistently designed structure, since each element is intended to have the same probability of failure. In practical terms this normally results in a more efficient structure, and as such it can be argued that LSD is superior from a practical engineering viewpoint. The following is the treatment of LSD found in the National Building Code of Canada: Limit state design has replaced the older concept of permissible stress design in most forms of civil engineering. A notable exception is transportation engineering.
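The idea of combining factored loads, including a combination where one factor drops below unity because the loads are unlikely to peak together, can be sketched as follows; the particular combination set is an illustrative assumption loosely modeled on the 1.4/1.6 magnitudes above, not a code-mandated list.

```python
# Sketch of comparing several factored load combinations and taking the
# governing (largest) effect. The combinations below are assumptions for
# illustration only.

def governing_effect(dead, live, wind):
    combinations = [
        1.4 * dead,                            # permanent load dominant
        1.2 * dead + 1.6 * live,               # live load dominant
        1.2 * dead + 0.5 * live + 1.6 * wind,  # wind dominant; live reduced
                                               # below unity (loads unlikely
                                               # to peak simultaneously)
    ]
    return max(combinations)

print(governing_effect(dead=10.0, live=8.0, wind=5.0))  # 1.2*10 + 1.6*8 = 24.8 governs
```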
Even so, new codes are currently being developed for both geotechnical and transportation engineering which are LSD-based. As a result, most modern buildings are designed in accordance with a code based on limit state theory. For example, in Europe, structures are designed to conform with the Eurocodes: steel structures are designed in accordance with EN 1993, and reinforced concrete structures to EN 1992. Australia, Canada, China, France, Indonesia, and New Zealand (among many others) utilise limit state theory in the development of their design codes. In the purest sense, it is now considered inappropriate to discuss safety factors when working with LSD, as there are concerns that this may lead to confusion. The United States has been particularly slow to adopt limit state design (known as Load and Resistance Factor Design in the US). Design codes and standards are issued by diverse organizations, some of which have adopted limit state design, and others have not. The ACI 318 Building Code Requirements for Structural Concrete uses limit state design. The ANSI/AISC 360 Specification for Structural Steel Buildings, the ANSI/AISI S-100 North American Specification for the Design of Cold-Formed Steel Structural Members, and The Aluminum Association's Aluminum Design Manual contain two methods of design side by side. In contrast, the ANSI/AWWA D100 Welded Carbon Steel Tanks for Water Storage and API 650 Welded Tanks for Oil Storage still use allowable stress design. In Europe, limit state design is enforced by the Eurocodes.
who has the most wins in college history
NCAA Division I FBS football win-loss records - Wikipedia The following data is current as of the end of the 2016 season, which ended after the 2017 College Football Playoff National Championship. The following list reflects the records according to the NCAA. Not all wins and losses in this list occurred at the highest level of play, but all are recognized by the NCAA. This list takes into account results modified later due to NCAA action, such as vacated victories and forfeits. This list includes the teams that are currently transitioning from FCS to FBS (Coastal Carolina). Percentages are figured to 3 decimal places. In the event of a tie, the team with the most wins is listed first. Ties count as one-half win and one-half loss.
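The winning-percentage convention above (ties counted as half a win, percentages to three decimal places) can be expressed as a short calculation; the function name and example record are assumptions for illustration.

```python
# Sketch of the win-percentage convention described above: a tie counts as
# one-half win and one-half loss, and the result is rounded to 3 decimals.

def win_pct(wins, losses, ties=0):
    games = wins + losses + ties
    return round((wins + 0.5 * ties) / games, 3)

print(win_pct(900, 300, 50))  # (900 + 25) / 1250 = 0.74
```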
explain why there is no value l for which lim
Limit of a function - Wikipedia Although the function (sin x) / x is not defined at zero, as x becomes closer and closer to zero, (sin x) / x becomes arbitrarily close to 1. In other words, the limit of (sin x) / x as x approaches zero equals 1. In mathematics, the limit of a function is a fundamental concept in calculus and analysis concerning the behavior of that function near a particular input. Formal definitions, first devised in the early 19th century, are given below. Informally, a function f assigns an output f (x) to every input x. We say the function has a limit L at an input p: this means f (x) gets closer and closer to L as x moves closer and closer to p. More specifically, when f is applied to any input sufficiently close to p, the output value is forced arbitrarily close to L. On the other hand, if some inputs very close to p are taken to outputs that stay a fixed distance apart, we say the limit does not exist. The notion of a limit has many applications in modern calculus. In particular, the many definitions of continuity employ the limit: roughly, a function is continuous if all of its limits agree with the values of the function. It also appears in the definition of the derivative: in the calculus of one variable, this is the limiting value of the slope of secant lines to the graph of a function. Although implicit in the development of calculus of the 17th and 18th centuries, the modern idea of the limit of a function goes back to Bolzano who, in 1817, introduced the basics of the epsilon-delta technique to define continuous functions. However, his work was not known during his lifetime (Felscher 2000). Cauchy discussed variable quantities, infinitesimals, and limits and defined continuity of y = f (x) by saying that an infinitesimal change in x necessarily produces an infinitesimal change in y in his 1821 book Cours d'analyse, while (Grabiner 1983) claims that he only gave a verbal definition.
Weierstrass first introduced the epsilon - delta definition of limit in the form it is usually written today. He also introduced the notations lim and lim (Burton 1997). The modern notation of placing the arrow below the limit symbol is due to Hardy in his book A Course of Pure Mathematics in 1908 (Miller 2004). Imagine a person walking over a landscape represented by the graph of y = f (x). Her horizontal position is measured by the value of x, much like the position given by a map of the land or by a global positioning system. Her altitude is given by the coordinate y. She is walking towards the horizontal position given by x = p. As she gets closer and closer to it, she notices that her altitude approaches L. If asked about the altitude of x = p, she would then answer L. What, then, does it mean to say that her altitude approaches L? It means that her altitude gets nearer and nearer to L except for a possible small error in accuracy. For example, suppose we set a particular accuracy goal for our traveler: she must get within ten meters of L. She reports back that indeed she can get within ten meters of L, since she notes that when she is within fifty horizontal meters of p, her altitude is always ten meters or less from L. The accuracy goal is then changed: can she get within one vertical meter? Yes. If she is anywhere within seven horizontal meters of p, then her altitude always remains within one meter from the target L. In summary, to say that the traveler 's altitude approaches L as her horizontal position approaches p means that for every target accuracy goal, however small it may be, there is some neighborhood of p whose altitude fulfills that accuracy goal. The initial informal statement can now be explicated: This explicit statement is quite close to the formal definition of the limit of a function with values in a topological space. To say that means that ƒ (x) can be made as close as desired to L by making x close enough, but not equal, to p. 
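The accuracy-goal game described above can be illustrated numerically for f (x) = sin (x) / x near p = 0, where the limit L is 1 (the example from the introduction). The helper name and sampling scheme are assumptions for the sketch; finitely many samples illustrate, but do not prove, the limit.

```python
import math

# Numerical illustration (not a proof) of the accuracy-goal game for
# f(x) = sin(x)/x near p = 0, where the limit L is 1: given a target
# accuracy eps, we test a sampled neighbourhood of radius delta.

def meets_goal(eps, delta, samples=1000):
    """Return True if |sin(x)/x - 1| < eps at sampled points 0 < |x| < delta."""
    for i in range(1, samples + 1):
        x = delta * i / (samples + 1)   # sample points in (0, delta)
        # sin(x)/x is an even function, so checking x > 0 also covers -x
        if abs(math.sin(x) / x - 1) >= eps:
            return False
    return True

print(meets_goal(eps=0.001, delta=0.05))  # a small enough delta meets the goal
print(meets_goal(eps=0.001, delta=1.0))   # too wide a neighbourhood fails
```

Shrinking eps and finding that a suitable delta still exists mirrors the ever-tighter accuracy goals set for the traveler in the analogy.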
The following definitions (known as (ε, δ)-definitions) are the generally accepted ones for the limit of a function in various contexts. Suppose f: R → R is defined on the real line and p, L ∈ R. It is said that the limit of f, as x approaches p, is L, written lim_{x → p} f (x) = L, if the following property holds: The value of the limit does not depend on the value of f (p), nor even on p being in the domain of f. A more general definition applies for functions defined on subsets of the real line. Let (a, b) be an open interval in R, and p a point of (a, b). Let f be a real-valued function defined on all of (a, b) except possibly at p itself. It is then said that the limit of f, as x approaches p, is L if, for every real ε > 0, there exists a real δ > 0 such that 0 < |x − p| < δ and x ∈ (a, b) implies |f (x) − L| < ε. Here again the limit does not depend on f (p) being well-defined. The letters ε and δ can be understood as "error" and "distance", and in fact Cauchy used ε as an abbreviation for "error" in some of his work (Grabiner 1983), though in his definition of continuity he used an infinitesimal α rather than either ε or δ (see Cours d'Analyse). In these terms, the error (ε) in the measurement of the value at the limit can be made as small as desired by reducing the distance (δ) to the limit point. As discussed below, this definition also works for functions in a more general context. The idea that δ and ε represent distances helps suggest these generalizations. Alternatively, x may approach p from above (right) or below (left), in which case the limits may be written as lim_{x → p⁺} f (x) = L or lim_{x → p⁻} f (x) = L, respectively. If these limits exist at p and are equal there, then this can be referred to as the limit of f (x) at p. If the one-sided limits exist at p but are unequal, there is no limit at p (the limit at p does not exist). If either one-sided limit does not exist at p, the limit at p does not exist. A formal definition is as follows.
The limit of f (x) as x approaches p from above is L if, for every ε > 0, there exists a δ > 0 such that |f (x) − L| < ε whenever 0 < x − p < δ. The limit of f (x) as x approaches p from below is L if, for every ε > 0, there exists a δ > 0 such that |f (x) − L| < ε whenever 0 < p − x < δ. If the limit does not exist, then the oscillation of f at p is non-zero. Apart from open intervals, limits can be defined for functions on arbitrary subsets of R, as follows. Let f be a real-valued function defined on a subset S of the real line. Let p be a limit point of S -- that is, p is the limit of some sequence of distinct elements of S. The limit of f, as x approaches p from values in S, is L if, for every ε > 0, there exists a δ > 0 such that 0 < |x − p| < δ and x ∈ S implies |f (x) − L| < ε. This limit is often written lim_{x → p, x ∈ S} f (x) = L. The condition that f be defined on S is that S be a subset of the domain of f. This generalization includes as special cases limits on an interval, as well as left-handed limits of real-valued functions (e.g., by taking S to be an open interval of the form (− ∞, a)) and right-handed limits (e.g., by taking S to be an open interval of the form (a, ∞)). The definition of limit given here does not depend on how (or whether) f is defined at p. Bartle (1967) refers to this as a deleted limit, because it excludes the value of f at p. The corresponding non-deleted limit does depend on the value of f at p, if p is in the domain of f: the definition is the same, except that the neighborhood |x − p| < δ now includes the point p, in contrast to the (deleted) neighborhood 0 < |x − p| < δ. This makes the definition of a non-deleted limit less general. One of the advantages of working with non-deleted limits is that they allow one to state the theorem about limits of compositions without any constraints on the functions (other than the existence of their non-deleted limits) (Hubbard (2015)).
Bartle (1967) notes that although by "limit" some authors do mean this non-deleted limit, deleted limits are the most popular. For example, Apostol (1974), Courant (1924), Hardy (1921), Rudin (1964), and Whittaker & Watson (1902) all mean the deleted version by "limit". The function has no limit at x₀ = 1 (the left-hand limit does not exist due to the oscillatory nature of the sine function, and the right-hand limit does not exist due to the asymptotic behaviour of the reciprocal function), but has a limit at every other x-coordinate. The function (the Dirichlet function) has no limit at any x-coordinate. The function has a limit at every non-zero x-coordinate (the limit equals 1 for negative x and equals 2 for positive x). The limit at x = 0 does not exist (the left-hand limit equals 1, whereas the right-hand limit equals 2). The functions and both have a limit at x = 0 and it equals 0. The function has a limit at any x-coordinate of the form π/2 + 2nπ, where n is any integer. Suppose M and N are subsets of metric spaces A and B, respectively, and f: M → N is defined between M and N, with x ∈ M, p a limit point of M, and L ∈ N. It is said that the limit of f as x approaches p is L, written lim_{x → p} f (x) = L, if the following property holds: Again, note that p need not be in the domain of f, nor does L need to be in the range of f, and even if f (p) is defined it need not be equal to L. An alternative definition using the concept of neighbourhood is as follows: lim_{x → p} f (x) = L if, for every neighbourhood V of L in B, there exists a neighbourhood U of p in A such that f (U ∩ M − {p}) ⊆ V. Suppose X, Y are topological spaces with Y a Hausdorff space. Let p be a limit point of Ω ⊆ X, and L ∈ Y.
For a function f: Ω → Y, it is said that the limit of f as x approaches p is L (i.e., f (x) → L as x → p), written lim_{x → p} f (x) = L, if the following property holds: This last part of the definition can also be phrased "there exists an open punctured neighbourhood U of p such that f (U ∩ Ω) ⊆ V". Note that the domain of f does not need to contain p. If it does, then the value of f at p is irrelevant to the definition of the limit. In particular, if the domain of f is X − {p} (or all of X), then the limit of f as x → p exists and is equal to L if, for all subsets Ω of X with limit point p, the limit of the restriction of f to Ω exists and is equal to L. Sometimes this criterion is used to establish the non-existence of the two-sided limit of a function on R by showing that the one-sided limits either fail to exist or do not agree. Such a view is fundamental in the field of general topology, where limits and continuity at a point are defined in terms of special families of subsets, called filters, or generalized sequences known as nets. Alternatively, the requirement that Y be a Hausdorff space can be relaxed to the assumption that Y be a general topological space, but then the limit of a function may not be unique. In particular, one can no longer talk about the limit of a function at a point, but rather a limit or the set of limits at a point. A function is continuous at a limit point p of its domain, with p in the domain, if and only if f (p) is the (or, in the general case, a) limit of f (x) as x tends to p. For f (x) a real function, the limit of f as x approaches infinity is L, denoted lim_{x → ∞} f (x) = L, which means that for all ε > 0 there exists c such that |f (x) − L| < ε whenever x > c. Or, symbolically: Similarly, the limit of f as x approaches negative infinity is L, denoted lim_{x → −∞} f (x) = L, which means that for all ε > 0 there exists c such that |f (x) − L| < ε whenever x < c.
Or, symbolically: For example: Limits can also have infinite values. When infinities are not considered legitimate values, which is standard (but see below), a formalist will insist upon various circumlocutions. For example, rather than saying that a limit is infinity, the proper thing is to say that the function "diverges" or "grows without bound". In particular, the following informal example of how to pronounce the notation is arguably inappropriate in the classroom (or any other formal setting). In any case, the limit of f as x approaches a is infinity, denoted lim_{x → a} f (x) = ∞, which means that for all ε > 0 there exists δ > 0 such that f (x) > ε whenever 0 < |x − a| < δ. These ideas can be combined in a natural way to produce definitions for different combinations, such as: For example: Limits involving infinity are connected with the concept of asymptotes. These notions of a limit attempt to provide a metric space interpretation to limits at infinity. However, note that these notions of a limit are consistent with the topological space definition of limit, provided the extended real line is given a suitable topology. In this case, R is a topological space and any function of the form f: X → Y with X, Y ⊆ R is subject to the topological definition of a limit. Note that with this topological definition, it is easy to define infinite limits at finite points, which have not been defined above in the metric sense. Many authors allow for the projectively extended real line to be used as a way to include infinite values, as well as the extended real line. With this notation, the extended real line is given as R ∪ {− ∞, + ∞} and the projectively extended real line is R ∪ {∞}, where a neighborhood of ∞ is a set of the form {x: |x| > c}. The advantage is that one only needs three definitions for limits (left, right, and central) to cover all the cases.
As presented above, for a completely rigorous account, we would need to consider 15 separate cases for each combination of infinities (five directions: − ∞, left, central, right, and + ∞; three bounds: − ∞, finite, or + ∞). There are also noteworthy pitfalls. For example, when working with the extended real line, x⁻¹ does not possess a central limit (which is normal): In contrast, when working with the projective real line, infinities (much like 0) are unsigned, so the central limit does exist in that context: In fact there is a plethora of conflicting formal systems in use. In certain applications of numerical differentiation and integration, it is, for example, convenient to have signed zeroes. A simple reason has to do with the converse of lim_{x → 0⁻} x⁻¹ = − ∞; namely, it is convenient for lim_{x → −∞} x⁻¹ = − 0 to be considered true. Such zeroes can be seen as an approximation to infinitesimals. There are three basic rules for evaluating limits at infinity for a rational function f (x) = p (x) / q (x), where p and q are polynomials: If the limit at infinity exists, it represents a horizontal asymptote at y = L. Polynomials do not have horizontal asymptotes; such asymptotes may however occur with rational functions. By noting that |x − p| represents a distance, the definition of a limit can be extended to functions of more than one variable. In the case of a function f: R² → R, lim_{(x, y) → (p, q)} f (x, y) = L if, for every ε > 0, there exists a δ > 0 such that |f (x, y) − L| < ε whenever 0 < |(x, y) − (p, q)| < δ, where |(x, y) − (p, q)| represents the Euclidean distance. This can be extended to any number of variables. Let f: X → Y be a mapping from a topological space X into a Hausdorff space Y, p ∈ X and L ∈ Y. If L is the limit (in the sense above) of f as x approaches p, then it is a sequential limit as well; however, the converse need not hold in general.
If in addition X is metrizable, then L is the sequential limit of f as x approaches p if and only if it is the limit (in the sense above) of f as x approaches p. For functions on the real line, one way to define the limit of a function is in terms of the limit of sequences. (This definition is usually attributed to Eduard Heine.) In this setting: lim_{x → a} f (x) = L if and only if, for all sequences x_n (with x_n not equal to a for all n) converging to a, the sequence f (x_n) converges to L. It was shown by Sierpiński in 1916 that proving the equivalence of this definition and the definition above requires, and is equivalent to, a weak form of the axiom of choice. Note that defining what it means for a sequence x_n to converge to a requires the epsilon-delta method. As with Weierstrass's definition, a more general Heine definition applies to functions defined on subsets of the real line. Let f be a real-valued function with the domain Dm (f). Let a be the limit of a sequence of elements of Dm (f). Then the limit (in this sense) of f is L as x approaches a if, for every sequence x_n ∈ Dm (f) \ {a} (so that for all n, x_n is not equal to a) that converges to a, the sequence f (x_n) converges to L. This is the same as the definition of a sequential limit in the preceding section, obtained by regarding the subset Dm (f) of R as a metric space with the induced metric. In non-standard calculus the limit of a function is defined by: lim_{x → a} f (x) = L if and only if, for all x ∈ R*, f* (x) − L is infinitesimal whenever x − a is infinitesimal.
Here R* denotes the hyperreal numbers and f* is the natural extension of f to the non-standard real numbers. Keisler proved that such a hyperreal definition of limit reduces the quantifier complexity by two quantifiers. On the other hand, Hrbacek writes that for the definitions to be valid for all hyperreal numbers they must implicitly be grounded in the ε-δ method, and claims that, from the pedagogical point of view, the hope that non-standard calculus could be done without ε-δ methods cannot be realized in full. Błaszczyk et al. detail the usefulness of microcontinuity in developing a transparent definition of uniform continuity, and characterize Hrbacek's criticism as a "dubious lament". At the 1908 International Congress of Mathematicians, F. Riesz introduced an alternative way of defining limits and continuity using a concept called "nearness". A point x is defined to be near a set A ⊆ R if for every r > 0 there is a point a ∈ A so that |x − a| < r. In this setting, lim_{x → a} f (x) = L if and only if, for all A ⊆ R, L is near f (A) whenever a is near A. Here f (A) is the set {f (x) : x ∈ A}. This definition can also be extended to metric and topological spaces. The notion of the limit of a function is very closely related to the concept of continuity. A function f is said to be continuous at c if it is both defined at c and its value at c equals the limit of f as x approaches c: (We have here assumed that c is a limit point of the domain of f.) If a function f is real-valued, then the limit of f at p is L if and only if both the right-handed limit and left-handed limit of f at p exist and are equal to L.
The function f is continuous at p if and only if the limit of f (x) as x approaches p exists and is equal to f (p). If f: M → N is a function between metric spaces M and N, then it is equivalent that f transforms every sequence in M which converges towards p into a sequence in N which converges towards f (p). If N is a normed vector space, then the limit operation is linear in the following sense: if the limit of f (x) as x approaches p is L and the limit of g (x) as x approaches p is P, then the limit of f (x) + g (x) as x approaches p is L + P. If a is a scalar from the base field, then the limit of af (x) as x approaches p is aL. If f is a real-valued (or complex-valued) function, then taking the limit is compatible with the algebraic operations, provided the limits on the right sides of the equations below exist (the last identity holds only if the denominator is non-zero). This fact is often called the algebraic limit theorem. In each case above, when the limits on the right do not exist, or, in the last case, when the limits in both the numerator and the denominator are zero, the limit on the left, called an indeterminate form, may nonetheless still exist -- this depends on the functions f and g. These rules are also valid for one-sided limits, for the case p = ± ∞, and also for infinite limits using the rules (see extended real number line). Note that there is no general rule for the case q / 0; it all depends on the way 0 is approached. Indeterminate forms -- for instance, 0 / 0, 0 × ∞, ∞ − ∞, and ∞ / ∞ -- are also not covered by these rules, but the corresponding limits can often be determined with L'Hôpital's rule or the squeeze theorem. In general, from knowing that lim_{x → a} g (x) = b and lim_{y → b} f (y) = c, it does not follow that lim_{x → a} f (g (x)) = c.
However, this "chain rule" does hold if one of the following additional conditions holds: As an example of this phenomenon, consider the following function, which violates both additional restrictions: Since the value at f (0) is a removable discontinuity, the naïve chain rule would suggest that the limit of f (f (x)) is 0. However, it is the case that and so For n a nonnegative integer and constants a_1, a_2, a_3, ..., a_n and b_1, b_2, b_3, ..., b_n, This can be proven by dividing both the numerator and denominator by x^n. If the numerator is a polynomial of higher degree, the limit does not exist. If the denominator is of higher degree, the limit is 0. This rule uses derivatives to find limits of indeterminate forms 0 / 0 or ± ∞ / ∞, and applies only to such cases. Other indeterminate forms may be manipulated into this form. Given two functions f (x) and g (x), defined over an open interval I containing the desired limit point c, if the required conditions hold, then: lim_{x → c} f (x) / g (x) = lim_{x → c} f ′ (x) / g ′ (x). Normally, the first condition is the most important one. For example: lim_{x → 0} sin (2x) / sin (3x) = lim_{x → 0} (2 cos (2x)) / (3 cos (3x)) = (2 · 1) / (3 · 1) = 2 / 3. Specifying an infinite bound on a summation or integral is a common shorthand for specifying a limit. A short way to write the limit lim_{n → ∞} Σ_{i = s}^{n} f (i) is Σ_{i = s}^{∞} f (i).
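The L'Hôpital example above can be spot-checked numerically; the sample points and helper name are arbitrary choices for this sketch, and a numeric check illustrates, rather than proves, the limit.

```python
import math

# Numeric spot-check (not a proof) of the L'Hopital example
# lim_{x -> 0} sin(2x) / sin(3x) = 2/3.

def ratio(x):
    return math.sin(2 * x) / math.sin(3 * x)

# As x shrinks toward 0, the ratio approaches 2/3.
for x in (0.1, 0.01, 0.001):
    print(x, ratio(x))
```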
An important example of limits of sums such as these is a series. A short way to write the limit lim_{x → ∞} ∫_a^x f (t) dt is ∫_a^∞ f (t) dt. A short way to write the limit lim_{x → −∞} ∫_x^b f (t) dt is ∫_{−∞}^b f (t) dt.
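The improper-integral shorthand above can be illustrated with a concrete function: ∫_0^b e^{−t} dt = 1 − e^{−b}, which tends to 1 as b grows. The midpoint-rule approximation and the choice of integrand are assumptions for the sketch, not part of the article.

```python
import math

# Illustration of an improper integral as a limit: the integral of e^{-t}
# over [0, b] equals 1 - e^{-b}, which tends to 1 as b -> infinity.

def integral_to(b, n=10000):
    """Midpoint-rule approximation of the integral of e^{-t} over [0, b]."""
    h = b / n
    return h * sum(math.exp(-(i + 0.5) * h) for i in range(n))

# The finite integrals approach the limit value 1 as the upper bound grows.
for b in (1, 5, 10, 20):
    print(b, integral_to(b))
```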
where is pretty little liars supposed to take place
Pretty Little Liars - Wikipedia Pretty Little Liars is an American teen drama mystery thriller television series developed by I. Marlene King, loosely based on the popular book series of the same name by Sara Shepard. The show premiered on June 8, 2010 on ABC Family and ended on June 27, 2017, for a total of seven seasons and 160 episodes. The series spawned two spin - offs: Ravenswood (2013 -- 2014) and the upcoming Pretty Little Liars: The Perfectionists. Set in the small suburban town of Rosewood, Pennsylvania (not far from Philadelphia), the series follows the lives of five girls: Spencer Hastings, Alison DiLaurentis, Aria Montgomery, Hanna Marin and Emily Fields, whose clique falls apart after the leader of the group, Alison, goes missing. One year later, the remaining estranged friends are reunited as they begin receiving messages from a mysterious villain named "A '' or "A.D. '', who threatens and tortures them for the mistakes they have made before and after Alison was alive. At first, they think it is Alison herself, but after her body is found, the girls realize that someone else is planning on ruining their lives. Originally developed as a television series by book packaging company Alloy Entertainment, the idea was described as "Desperate Housewives for teens. '' Alloy met with author Shepard, and gave her the property to develop into a book series. With Alloy and Warner Horizon Television interested in producing Pretty Little Liars as a television series for years, it was first planned for The WB in 2005 with a different writer until the network shut down in early 2006 and reestablished as The CW later that year. The first novel was published by HarperTeen in October 2006. In June 2008, Alloy noted that it was developing a Pretty Little Liars television pilot for ABC Family, with the novels adapted for television. After the pilot was shot in Vancouver, filming for the rest of the series moved to Los Angeles. 
The series was primarily filmed at the Warner Bros. studio and backlot in the city of Burbank, near Los Angeles. In June 2012, the series was selected by lottery for a California film and TV tax credit. ABC Family began casting for a Pretty Little Liars television pilot in October 2009. Lucy Hale was cast as Aria Montgomery in the project, followed by Troian Bellisario as Spencer Hastings and Ian Harding as Ezra Fitz in November 2009. In December 2009, The Futon Critic confirmed the casting of Ashley Benson as Hanna Marin and Shay Mitchell as Emily Fields, as well as the addition of Laura Leighton as Ashley Marin, Nia Peeples as Pam Fields, Roark Critchlow as Tom Marin, and Bianca Lawson as Maya. Mitchell had initially auditioned for the role of Spencer and then tried for Emily. The Hollywood Reporter also noted that Torrey DeVitto and Sasha Pieterse landed recurring roles in the pilot. The Alloy website later confirmed that Pieterse would be playing Alison DiLaurentis and DeVitto would be Melissa Hastings, also mentioning the casting of Janel Parrish as Mona Vanderwaal. On January 27, 2010, ABC Family picked up the series for 10 episodes, set to premiere in June 2010. In April 2010, the role of Aria 's father Byron was recast with Chad Lowe, and Holly Marie Combs was cast as Aria 's mother Ella. Jenna Marshall is played by Tammin Sursok. On January 7, 2011, Tilky Jones was cast as Logan Reed. On April 8, 2011, Annabeth Gish was cast for the role of Anne Sullivan, a therapist who tries to find out the characters ' secrets. On May 23, 2011, Andrea Parker signed up to appear as Jessica DiLaurentis, Alison 's mother. On June 29, 2011, it was announced that Natalie Hall would be replacing Natalie Floyd as Hanna 's soon - to - be stepsister, Kate. On January 30, 2012, The Hollywood Reporter reported that Tyler Blackburn had been promoted to series regular for the third season. 
In March 2012, Janel Parrish was also promoted from recurring to series regular for the third season. On March 16, 2016, TVLine reported that Andrea Parker was added as a series regular for the seventh season. Pretty Little Liars was called one of the most spectacular new shows of summer 2010 thanks to heavy promotion by ABC Family, including "spicy promos and hot posters ''. ABC Family encouraged fans to host a "Pretty Little Premiere Party '' for the show by sending the first respondents a fan kit, and selected applicants to become part of an interactive "Secret Keeper Game '' played with iPhones provided by the network. The show 's official Facebook and Twitter accounts also promoted special fan features, including a "Pretty Little Lie Detector ''. Los Angeles department store Kitson showcased the show in their shop window. A tie - in edition featuring the Season 1 poster and logo of the 2006 first novel in the Pretty Little Liars series was released on the date of the show 's premiere, as was the final book of the original book series, "Wanted ''. "Wanted '' later decidedly became the eighth book of the series, as Shepard later confirmed she would extend the series. A TV tie in of the second book "Flawless '' featuring an altered Season 3 poster was released on December 28, 2010. The theme song for Pretty Little Liars is "Secret '' by The Pierces, which was suggested by one of the show 's stars, Ashley Benson. The pilot episode featured music from artists such as The Fray, Ben 's Brother, MoZella, Orelia, and Colbie Caillat. The show has also featured music from Passion Pit, Lady Gaga, Pink, Florence and the Machine, Lykke Li, Selena Gomez & the Scene, McFly and Rachel Platten. Music from Katie Herzig can be heard multiple times throughout the show. A few songs being "Hey Na Na '' and "Where the Road Meets the Sun ''. 
Madi Diaz has been heard on the show with her song "Heavy Heart '', as well as Joy Williams with "Charmed Life '' and Foreign Slippers with "What Are You Waiting For? ''. The last episode of season one featured a song by Alexz Johnson that she originally recorded for the soundtrack of the Canadian hit TV show Instant Star. On June 14, 2011, "Jar of Hearts '' by Christina Perri was featured in the first episode of the second season. The song "Follow Suit '' by Trent Dabbs has also been featured on the show. In the episode "The Perfect Storm '', Lucy Hale sings a cover of the song "Who Knows Where the Time Goes? '' by British band Fairport Convention. The official soundtrack was released on February 15, 2011. Rosewood is a fictional town in the U.S. state of Pennsylvania. It is the principal setting of the series and the books. The Liars live in this town along with most of the other characters. It contains many principal locations: the police station, the Rosewood High School, the church, the Rosewood Shopping Center, Hollis College, the Rosewood Community Hospital, the Rosewood movie theater, the mausoleum, the dental office, the Ophthalmology Cabinet, and the Rosewood Court. Virtually all these locations are exteriors on the backlot of the Warner Bros. studio lot in Burbank. Interior scenes are filmed separately on nearby soundstages. For example, different sides of the same building on the backlot are used for the entrances of the Rosewood police station and high school. Pretty Little Liars premiered on June 8, 2010 in the United States, becoming ABC Family 's highest - rated series debut on record across the network 's target demographics. It ranked number one in key 12 -- 34 demos and teens, becoming the number - one scripted show in Women 18 -- 34 and Women 18 -- 49. 
The premiere was number two in the hour for total viewers, which generated 2.47 million unique viewers, and was ABC Family 's best delivery in the time slot since the premiere of The Secret Life of the American Teenager. The second episode retained 100 % of its premiere audience, drawing 2.48 million viewers despite the usual downward trend following a show 's premiere. It was the dominant number one of its time slot in Adults 18 -- 49, and the number one show in female teens. Subsequent episodes fluctuated between 2.09 and 2.74 million viewers. The August 10, 2010 "Summer Finale '' episode drew an impressive 3.07 million viewers. On June 28, 2010, ABC Family ordered 12 more episodes of the show, bringing its total first - season order to 22. On January 10, 2011, ABC Family picked the show up for a second season that premiered on June 14, 2011. During the summer of 2011, Pretty Little Liars was basic cable 's top scripted series in women aged 18 -- 34 and viewers 12 -- 34. The second half of season 2 aired on Mondays at 8 / 7c, beginning on January 2, 2012. On November 29, 2011, ABC Family renewed the show for a third season, which consisted of 24 episodes. On October 4, 2012, ABC Family announced that the show was renewed for a fourth season, again comprising 24 episodes. The second half of the third season began airing on January 8, 2013 and finished March 19, 2013. Pretty Little Liars returned for Season 4 on June 11, 2013. On March 25, 2013, it was again announced that Pretty Little Liars had been renewed for a fifth season scheduled for a 2014 air date, and that a new spin - off show entitled Ravenswood would begin airing after the season four annual Halloween special in October 2013. The second half of season four premiered on January 7, 2014. It was announced on June 10, 2014 that Pretty Little Liars was renewed for two seasons, making the show ABC Family 's longest - running original hit series. On August 29, 2016, I. Marlene King announced that Pretty Little Liars would be ending after the seventh season had aired. The second half of the seventh season will begin airing later than in previous seasons, in April instead of January. Pretty Little Liars opened with mixed reviews. Metacritic gave the pilot episode 52 out of 100, based upon 14 critical reviews. The New York Daily News gave the show a positive review, commenting that it "makes most popular vampire romances look anemic '', while concluding, "Pretty Little Liars could go in several directions, including mundane teen clichés. It 's got an equally good shot at making us care about these imperfect pretty girls. '' A writer on Terror Hook has stated that "' Pretty Little Liars ' gets off to a very promising start. Great production all around, the writing keeps the viewer on their toes, and the acting just reinforces it. The overall mystery of the show in the end is dark and unpredictable, even stepping into the slasher film realm. '' The New York Post gave the show three out of five stars, stating, "OK, so we 've established that there is no socially redeeming value in this series and that your kids should n't watch it if they are too young and impressionable. But if you can distract them enough to miss the first 15 minutes, the show is n't half - bad. Actually, it is half - good, if that makes sense. '' The Los Angeles Times wrote that the series is "one of those shows that manages to mildly, and perhaps unintentionally, spoof its genre while fully participating in it, and that 's not a bad thing at all. '' Entertainment Weekly had a less favorable review, giving the show a letter grade of "D − '', saying, "Imagine the pitch for Liars: It 's I Know What You Did Last Summer meets Gossip Girl, but like not so subtle. '' It went on to say that the plot "hits every racy teen entertainment mark so hard (everyone 's hair is so full -- of secrets!) 
that it feels like the only thing missing is a visit from the ghosts of Jennifer Love Hewitt and Freddie Prinze, Jr. '' The Hollywood Reporter compared the show to those on The CW and noted, "Sure, there 's a lot here that sustains more eye - rolling than interested stares, but a little patience might be warranted. '' Since the series premiere, Pretty Little Liars has remained popular. In 2016, a New York Times study of the 50 TV shows with the most Facebook Likes found that the show 's "popularity is tilted toward women more than any other show in the data -- over 94 percent of ' likes ' come from women ''. The series earned its highest - rated episode with 4.20 million total viewers, ranking among ABC Family 's top 5 telecasts in adult viewers 18 -- 34, total viewers and women viewers. The highest - rated episodes include season one 's finale, with 3.64 million, and season two 's premiere and finale episodes, each yielding nearly 3.7 million viewers. The show stands as the most watched series on ABC Family, maintaining a steady viewership of over 2.5 million and currently standing as the only show to yield an average of over 2 million viewers.
why is there no jack in the box in virginia
Jack in the Box - Wikipedia Jack in the Box is an American fast - food restaurant chain founded February 21, 1951, by Robert O. Peterson in San Diego, California, where it is headquartered. The chain has 2,200 locations, primarily serving the West Coast of the United States and selected large urban areas in the eastern portion of the US including Texas. Food items include a variety of hamburger and cheeseburger sandwiches along with selections of internationally themed foods such as tacos and egg rolls. The company also operates the Qdoba Mexican Grill chain. Robert Oscar Peterson already owned several successful restaurants when he opened Topsy 's Drive - In at 6270 El Cajon Boulevard in San Diego in 1941. Several more Topsy 's were opened and eventually renamed Oscar 's (after Peterson 's middle name). By the late 1940s, the Oscar 's locations had developed a circus - like décor featuring drawings of a starry - eyed clown. In 1947, Peterson obtained rights for the intercom ordering concept from George Manos who owned one location named Chatter box in Anchorage, Alaska, the first known location to use the intercom concept for drive up windows. In 1951, Peterson converted the El Cajon Boulevard location into Jack in the Box, a hamburger stand focused on drive - through service. While the drive - through concept was not new, Jack in the Box innovated a two - way intercom system, the first major chain to use an intercom and the first to focus on drive - through. The intercom allowed much faster service than a traditional drive - up window; while one customer was being served at the window, a second and even a third customer 's order could be taken and prepared. A giant clown projected from the roof, and a smaller clown head sat atop the intercom, where a sign said, "Pull forward, Jack will speak to you. '' The Jack in the Box restaurant was conceived as a "modern food machine, '' designed by La Jolla, California master architect Russell Forester. 
Quick service made the new location very popular, and soon all of Oscar 's locations were redesigned with intercoms and rechristened Jack in the Box restaurants. Peterson 's holding company, Foodmaker Company, was known by 1966 as Foodmaker, Inc. At this time, all Jack in the Box locations -- over 180, mainly in California and the Southwest -- were company - owned; location sites, food preparation, quality control, and the hiring and training of on - site managers and staff in each location were subject to rigorous screening and strict performance standards. In 1968, Peterson sold Foodmaker to Ralston Purina Company. In the 1970s, Foodmaker led the Jack in the Box chain toward its most prolific growth (television commercials in the early 1970s featured child actor Rodney Allen Rippy) and began to franchise locations. The chain began to increasingly resemble its larger competitors, particularly industry giant McDonald 's. Jack in the Box began to struggle in the latter part of the decade; its expansion into East Coast markets was cut back, then halted. By the end of the decade, Jack in the Box restaurants were sold in increasing numbers. Around 1980, Foodmaker dramatically altered Jack in the Box 's marketing strategy by literally blowing up the chain 's symbol, the jack in the box, in television commercials with the tagline, "The food is better at the Box ''. Jack in the Box announced that it would no longer compete for McDonald 's target customer base of families with young children. Instead, Foodmaker targeted older, more affluent "yuppie '' customers with a higher - quality, more upscale menu and a series of whimsical television commercials featuring Dan Gilvezan, who attempted to compare the new menu items to those of McDonald 's and other fast - food chains, to no avail; hence "There 's No Comparison '', their slogan at the time. 
Jack in the Box restaurants were remodeled and redecorated with decorator pastel colors and hanging plants; the logo, containing a clown 's head in a red box with the company name in red next to or below the box (signs in front of the restaurant displayed the clown 's head only), was modified, stacking the words in a red diagonal box while still retaining the clown 's head; by about 1981 or 1982, the clown 's head was removed from the logo, which would remain until 2009. Television advertising from about 1985 onward featured minimalistic music by a small chamber - like ensemble (specifically a distinctive seven - note plucked musical signature). The menu, previously focused on hamburgers led by the flagship Jumbo Jack, became much more diverse, including salads, chicken sandwiches, finger foods, and Seasoned Curly Fries (at least two new menu items were introduced per year), at a time when few fast - food operations offered more than standard hamburgers. Annual sales increased through the 1980s. Ralston Purina tried further to mature the restaurant 's image, renaming it "Monterey Jack 's '' in 1985, a disastrous move; the Jack in the Box name was restored in 1986. After 18 years, Ralston Purina decided in 1985 that Foodmaker was a non-core asset and sold it to management. By 1987, sales reached $655 million, the chain boasted 897 restaurants, and Foodmaker became a publicly traded company. JBX Grill was a line of fast casual restaurants introduced in 2004 by Jack in the Box Inc. They featured high - quality, cafe - style food, avoiding most of the cheaper fast - food items typically served at Jack in the Box. The architecture and decor maintained an upbeat, positive atmosphere, and the customer service was comparable to most dine - in restaurants. Two of the Jack in the Box restaurants in San Diego, California (where Jack in the Box is headquartered) were converted to JBX Grill restaurants to test the concept. 
(The locations in Hillcrest and Pacific Beach still retain many of the JBX elements, including an indoor / outdoor fireplace and modern architecture.) There were also restaurants in Bakersfield, California, Boise, Idaho, and Nampa, Idaho. However, the concept later proved unsuccessful, and the last stores were reconverted to Jack in the Box in 2006. Although best known for its hamburgers, Jack in the Box 's most popular product is its taco, which it has sold since the first restaurant in the 1950s. As of 2017, the company sells 554 million tacos a year, manufactured in three factories in Texas and Kansas, to "a legion of fans who swear by the greasy vessels even as they sometimes struggle to understand their appeal '', The Wall Street Journal reported. The newspaper quoted one fan who compared it to "' a wet envelope of cat food ' '' and observed that "there are two kinds of people: those who think they 're disgusting and those who agree they 're disgusting but are powerless to resist them ''. A Los Angeles restaurateur praised it, however, as "the most underrated taco of all time ''; celebrity fans include Chelsea Handler, Selena Gomez, and Chrissy Teigen. What makes the taco unusual is that it is created with the meat and hard taco shell in the Texas and Kansas facilities, then frozen for transport and storage. At the restaurant, it is deep - fried, then prepared with lettuce, cheese and mild taco sauce before serving. Besides tacos, other Americanized foods from ethnic cuisines that Jack in the Box offers include egg rolls, breakfast burritos, and poppers. New items come in on a rotation every three to four months, including the Philly cheesesteak and the deli - style pannidos (deli trio, ham & turkey, zesty turkey), which were replaced by Jack 's ciabatta burger and included the original ciabatta burger and the bacon ' n ' cheese ciabatta. 
Jack in the Box also carries seasonal items such as pumpkin pie shakes, Oreo mint shakes, and eggnog shakes during the Thanksgiving and Christmas holidays. In some locations, local delicacies are a regular part of the menu. Locations in Hawaii, for example, include the Paniolo Breakfast (Portuguese sausage, eggs, and rice platter) and teriyaki chicken and rice bowl. In the Southern United States, the company offers biscuits and sweet tea. In Imperial County, California, some locations sell date shakes, reflecting the crop 's ubiquity in the region 's farms. In the spring of 2007, Jack in the Box also introduced its sirloin burger and followed this up recently with the sirloin steak melt. Its more recent foray into the deli market was the less - popular Ultimate Club Sandwich which was initially removed in Arizona due to poor sales and has since been phased out at all locations. The Bonus Jack was first released in 1970 and has been reintroduced to Jack in the Box menus at times throughout the years. In November 2009, the company discontinued their popular ciabatta sandwiches / burgers. In 2012, Jack in the Box introduced a bacon milkshake as part of its "Marry Bacon '' campaign. In October 2016, "Brunchfast '' items were introduced. Those are Bacon & Egg Chicken Sandwich, Blood Orange Fruit Cooler, Brunch Burger, Cranberry Orange Muffins, Homestyle Potatoes, and Southwest Scrambler Plate. The restaurant rebounded in popularity in 1994 after a highly successful marketing campaign that featured the fictitious Jack in the Box chairman Jack character (formerly voiced by the campaign 's creator Dick Sittig), who has a ping pong ball - like head, a yellow clown cap, two blue eyes, a pointy black nose, and a linear red smile that changes with his emotions, and is dressed in a business suit. Jack was reintroduced specifically to signal the new direction the company was taking to refocus and regroup after the E. coli disaster. 
In the original spot that debuted in Fall 1994, Jack ("through the miracle of plastic surgery '', he says as he confidently strides into the office building) reclaims his rightful role as founder and CEO, and, apparently as revenge for being blown up in 1980, approaches the closed doors of the Jack in the Box boardroom (a fictionalized version, shown while the aforementioned minimalist theme music from the 1980s Jack in the Box commercials plays), activates a detonation device, and the boardroom explodes in a shower of smoke, wood, and paper. The spot ends with a close - up shot of a small white paper bag, presumably filled with Jack in the Box food, dropping forcefully onto a table; the bag is printed with the words "Jack 's Back '' in bold red print, then another bag drops down with the Jack in the Box logo from that period. Later ads feature the first bag showing the text of the food item or offer the commercial is promoting. A commercial was released where Jack goes to the house of a man who has records of calling Jack in the Box "Junk in the Box ''. When the man shoves Jack yelling "Beat it clown! '', Jack chases him outside, tackles him to the ground, and forces him to try Jack 's food and confess his deed. The commercial ends with Jack saying "I 'm sorry for the grass stains. '' "Really? '' "No ''. The commercials in the "Jack 's Back '' campaign (which has won several advertising industry awards) tend to be lightly humorous and often involve Jack making business decisions about the restaurant chain 's food products, or out in the field getting ideas for new menu items. While a series of ads claiming to ask when Burger King and McDonald 's will change their ways about making their hamburgers featured a phone number, it is unknown whether the caller would actually be connected to Jack himself. In addition, many commercials have advertised free car antenna balls with every meal, thus increasing brand awareness. 
Often different types of antenna balls were available during a holiday or major event or themed toward a sports team local to the restaurant. The antenna balls have since been discontinued due to the demise of the mast - type car antenna. During the height of the now - defunct XFL, one of the continuing ad series involved a fictitious professional American football team owned by Jack. The team, called the Carnivores, played against teams such as the Tofu Eaters and the Vegans. In 1997, a successful advertising campaign was launched using a fictional musical group called the Spicy Crispy Girls (a take off of the Spice Girls, a British pop music girl group - at the time one of the most popular groups in the world), in comedic national television commercials. The commercials were used to promote the new Jack in the Box Spicy Crispy Chicken Sandwich (now known as Jack 's Spicy Chicken), with the girls dancing in "the Jack groove. '' The Spicy Crispy Girls concept was used as a model for another successful advertising campaign called the ' Meaty Cheesy Boys ' to promote the Ultimate Cheeseburger (see below). At the 1998 Association of Independent Commercial Producers (AICP) Show, one of the Spicy Crispy Girls commercials won the top award for humor. The Meaty Cheesy Boys, a mock boy band to promote the Ultimate Cheeseburger, were created during an ad campaign featuring an out - of - control advertising executive previously fired by Jack. The same ad exec featured in a spot where a medical doctor made exaggerated claims of the benefits of fast food that it would cure baldness, help trim extra pounds, and remove wrinkles. Jack asks the ad exec incredulously, "Where did you find this guy? '' The ad exec responds proudly, "Tobacco company. '' In 2000, an ad involved a man washed up on a remote island with only a Jack in the Box antenna ball as company. 
Later that year, director Robert Zemeckis, who claimed the agency had appropriated elements of his Oscar - nominated film Cast Away for the ad, had his lawsuit against the ad agency thrown out. In April 2006, Jack in the Box launched an ad campaign called Bread is Back, taking a stab at the low - carbohydrate diets of recent years. In 2006, Jack in the Box played on the perception of its tacos as a stoner favorite, creating a commercial featuring a typical stoner who is indecisive about ordering. When faced with a decision, the Jack in the Box figurine in his car tells him to "stick to the classics '' and order 30 tacos, implying that he has the "munchies ''. This ad later stirred up controversy among a San Diego teen group, who claimed that the ad was irresponsible in showing a teenager who was under the influence of drugs. To protest, they presented the company with 2000 postcards protesting the ad, despite the fact that it had not aired since the beginning of the previous month. This commercial was redone in 2009 to feature the new logo and the new campaign. Another ad touting the chain 's milkshakes aired circa 2003 and was shot in the stilted style of a 1970s - era anti-drug spot, urging kids to "say no to fake shakes '' and featured "Larry The Crime Donkey, '' a parody of McGruff the Crime Dog. In 2007, Jack in the Box began a commercial campaign for their new 100 % sirloin beef hamburgers, implying that they were of higher quality than the Angus beef used by Carl 's Jr., Hardee 's, Wendy 's, and Burger King. That May, CKE Restaurants, Inc., the parent company of Carl 's Jr. and Hardee 's, filed a lawsuit against Jack in the Box, Inc. CKE claimed, among other things, that the commercials tried to give the impression that Carl 's Jr. / Hardee 's Angus beef hamburgers contained cow anuses by having an actor swirl his finger in the air in a circle while saying "Angus '' in one commercial and having other people in the second commercial laugh when the word "Angus '' was mentioned. 
They also attacked Jack in the Box 's claim that sirloin, a cut found on all cattle, was of higher quality than Angus beef, which is a breed of cattle. During Super Bowl XLIII on February 1, 2009, a commercial depicted Jack in a full body cast after getting hit by a bus. In October 2009, Jack in the Box debuted a popular commercial to market their "Teriyaki Bowl '' meals. The commercial features employees getting "bowl cut '' haircuts. At the end of the commercial, Jack reveals that his "bowl cut '' is a wig, to the dismay of the employees. One variation of the logo (dating back to 1978) has a miniature clown hat with three dots in the upper left - hand corner; the clown head was removed in 1980. In the 1970s, the clown head was in a red box all by itself, with the company name either below or next to the box; signs in front of the restaurants had the clown head only. The ' clown head ' can be seen on several YouTube videos depicting Jack in the Box commercials from the 1970s and 1980s. Most Jack in the Box locations opened before late 2008 had this logo, although the company is slowly replacing them with the newer logo, along with general updating of the locations ' decor. Some locations continue to use this logo as their "Open / Closed '' sign. In 1981, horse meat labeled as beef was discovered at a Foodmaker plant that supplied hamburger and taco meat to Jack in the Box. The meat was originally from Profreeze of Australia, and during their checks on location, the food inspectors discovered other shipments destined for the United States which included kangaroo meat. In 1993, Jack in the Box suffered a major corporate crisis involving E. coli O157: H7 bacteria. Four children died of hemolytic uremic syndrome and 600 others were reported sick after eating undercooked patties contaminated with fecal material containing the bacteria at a location in Tacoma, Washington and other parts of the Pacific Northwest. 
The chain was faced with several lawsuits, each of which was quickly settled (but left the chain nearly bankrupt and losing customers). At the time, Washington state law required that hamburgers be cooked to an internal temperature of at least 155 ° F (68 ° C), the temperature necessary to kill E. coli bacteria, although the FDA requirement at that time was only 140 ° F (60 ° C), which was the temperature to which Jack in the Box cooked its patties. After the incident, Jack in the Box mandated that in all nationwide locations, their hamburgers be cooked to at least 155 ° F (68 ° C). Additionally, all meat products produced in the United States are required to comply with HACCP (Hazard Analysis and Critical Control Points) regulations. Every company that produces meat products is required to have a HACCP plan that is followed continuously. Jack in the Box also worked with food safety experts from manufacturing companies and created a comprehensive program to test for bacteria in every food product. In 2005, Jack in the Box announced plans for nationwide expansion by 2010. In support of this objective, the chain began airing ads in states several hundred miles from the nearest location. The expansion strategy at that time was targeted at Colorado, Delaware, Florida and Texas. In 2007, the first new Colorado store opened in Golden, Colorado, marking an end to Jack in the Box 's 11 - year - long absence from the state. In Albuquerque, New Mexico, several locations opened in June 2009. Jack in the Box restaurants had last appeared in the Albuquerque market approximately two decades earlier. In September 2010, it was announced that 40 under - performing company - owned Jack in the Box restaurants located mostly in Texas and the Southeast would close. In March 2011, Jack in the Box launched the Munchie Mobile in San Diego, a food truck serving Jack 's burgers and fries. In June 2012, Jack in the Box launched their second food truck in the southeast region of the United States. 
Another truck was launched for the Northern Texas area in April 2013. In January 2012, Jack in the Box opened its first of three locations in the Indianapolis area. A few months later, the first Ohio location opened in September 2012 in West Chester. On December 16, 2004, the company restated three years of results due to an accounting change that prompted the company to cut first - quarter and 2005 earnings expectations.
where is same kind of different as me being filmed
Same Kind of Different as Me (film) - wikipedia Same Kind of Different as Me is a 2017 American Christian drama film directed by Michael Carney, in his feature directorial debut, and written by Ron Hall, Alexander Foard and Michael Carney. It is based on the 2006 book of the same name by Ron Hall, Denver Moore and Lynn Vincent. The film stars Greg Kinnear, Renée Zellweger, Djimon Hounsou, Olivia Holt, Jon Voight, and Stephanie Leigh Schlund. The film was released on October 20, 2017, by Pure Flix Entertainment. After Ron Hall (Greg Kinnear), a selfish but successful art dealer, admits to cheating on his wife Deborah (Renée Zellweger), she forces him to volunteer at a homeless shelter. There Denver Moore (Djimon Hounsou), a homeless ex-convict, helps him to change his life. On October 20, 2014, Renée Zellweger joined the cast to play Deborah. On October 28, 2014, Greg Kinnear, Djimon Hounsou and Jon Voight joined the cast. On November 7, 2014, Olivia Holt joined the cast to play Regan Hall. Principal photography began on October 27, 2014, and ended on December 19, 2014. The film was originally scheduled to be released on April 29, 2016, but was pushed back to February 3, 2017. On December 30, 2016, the film was pushed back to October 20, 2017, and was acquired by Pure Flix Entertainment. The film was released on DVD and Blu - ray on February 20, 2018. On review aggregator Rotten Tomatoes, the film has an approval rating of 38 % based on eight reviews and an average rating of 5.6 / 10. On Metacritic, which assigns a normalized rating, the film has a weighted average score of 49 out of 100 based on six critics, indicating "mixed or average reviews ''.
how many days does it take for a bee to hatch
Honey bee life cycle - wikipedia The honey bee life cycle, here referring exclusively to the domesticated Western honey bee, depends greatly on their social structure. Unlike a bumble bee colony or a paper wasp colony, the life of a honey bee colony is perennial. The three types of honey bees in a hive are: queens (egg - producers), workers (non-reproducing females), and drones (males whose main duty is to find and mate with a queen). Honey bees hatch from eggs in three to four days. They are then fed by worker bees and develop through several stages in the cells. Cells are capped by worker bees when the larva pupates. Queens and drones are larger than workers, so require larger cells to develop. A colony may typically consist of tens of thousands of individuals. While some colonies live in hives provided by humans, so - called "wild '' colonies (although all honey bees remain wild, even when cultivated and managed by humans) typically prefer a nest site that is clean, dry, protected from the weather, about 20 liters in volume with a 4 - to 6 - cm entrance about 3 m above the ground, and preferably facing south or south - east (in the Northern Hemisphere) or north or north - east (in the Southern Hemisphere). Development from egg to emerging bee varies among queens, workers, and drones. Queens emerge from their cells in 15 -- 16 days, workers in 21 days, and drones in 24 days. Only one queen is usually present in a hive. New virgin queens develop in enlarged cells through differential feeding of royal jelly by workers. When the existing queen ages or dies or the colony becomes very large, a new queen is raised by the worker bees. When the hive is too large, the old queen will take half the hive and half the reserves with her in a swarm. This occurs a few days prior to the new queen hatching. If several queens emerge they will begin piping (a high buzzing noise) signaling their location for the other virgin queens to come fight. 
Once one has eliminated the others, she will go around the hive chewing the sides of any other queen cells and stinging and killing the pupae. The queen takes one or several nuptial flights. When the queen is ready to mate, the drones leave the hive with her and mate in turns, dying after doing so. After mating, the queen begins laying eggs. A fertile queen is able to lay fertilized or unfertilized eggs. Each unfertilized egg contains a unique combination of 50 % of the queen 's genes and develops into a haploid drone. The fertilized eggs develop into either diploid workers or virgin queens if fed royal jelly. The average lifespan of a queen is three to four years; drones usually die upon mating or are expelled from the hive before the winter; and workers may live for a few weeks in the summer and several months in areas with an extended winter.
who is the oldest person in the world
List of the verified oldest people - wikipedia This is a list of the 100 verified oldest people, arranged in descending order of each individual 's age in years and days. A year typically refers to a calendar year, the time between two dates of the same name. However, years can be of different lengths due to the presence or absence of a leap day within the year, or to the conversion of dates from one calendar to another. The oldest person ever whose age has been verified is Jeanne Calment (1875 -- 1997) of France, who died at the age of 122 years, 164 days. There are five living people on this list, all of whom are women; the oldest of them is Kane Tanaka of Japan, aged 115 years, 223 days. Since all the people on this list have lived past the age of 110, all of them have been, or still are, supercentenarians. In the list, † denotes age at death or, if living, age as of today, 13 August 2018. a ^ Manfredini was born in Italy. b ^ Mortensen was born in Denmark. c ^ Gaudette was born in the United States. d ^ Holtz was born in the German Empire; her birthplace is now in Poland. e ^ Gerena was born in Puerto Rico. f ^ Ray was born in Canada. g ^ Steinberg was born in Kishinev, then part of the Russian Empire. It is now the capital of Moldova. h ^ Pizzinato - Papo was born in Ala, which was then part of Austria - Hungary; it is now located in Italy. i ^ Clawson was born in the United Kingdom. j ^ Domingues was born in Cape Verde, at that time a colony of Portugal. k ^ Benkner was born in Germany. l ^ Primout was born in French Algeria; it is now Algeria. m ^ Velasco was born in Mexico. n ^ Cock was born in the former British colony Victoria, which is now part of Australia.
hicks law states that reaction time will increase logarithmically as the
Hick 's law - wikipedia Hick 's law, or the Hick -- Hyman law, named after British and American psychologists William Edmund Hick and Ray Hyman, describes the time it takes for a person to make a decision as a result of the possible choices he or she has: increasing the number of choices will increase the decision time logarithmically. The Hick -- Hyman law assesses cognitive information capacity in choice reaction experiments. The amount of time taken to process a certain amount of bits in the Hick -- Hyman law is known as the rate of gain of information. Hick 's law is sometimes cited to justify menu design decisions. For example, to find a given word (e.g. the name of a command) in a randomly ordered word list (e.g. a menu), scanning of each word in the list is required, consuming linear time, so Hick 's law does not apply. However, if the list is alphabetical and the user knows the name of the command, he or she may be able to use a subdividing strategy that works in logarithmic time. In 1868, Franciscus Donders reported the relationship between having multiple stimuli and choice reaction time. In 1885, J. Merkel discovered the response time is longer when a stimulus belongs to a larger set of stimuli. Psychologists began to see similarities between this phenomenon and information theory. Hick first began experimenting with this theory in 1951. His first experiment involved 10 lamps with corresponding Morse code keys. The lamps would light at random every five seconds. The choice reaction time was recorded with the number of choices ranging from 2 -- 10 lamps. Hick performed a second experiment using the same task, while keeping the number of alternatives at 10. The participant performed the task the first two times with the instruction to perform the task as accurately as possible. For the last task, the participant was asked to perform the task as quickly as possible. 
While Hick stated that the relationship between reaction time and the number of choices was logarithmic, Hyman wanted to better understand the relationship between reaction time and the mean number of choices. In Hyman's experiment, eight different lights were arranged in a 6x6 matrix. Each light was given a name, and the participant was timed on how long it took to say the name of a light after it was lit. Further experiments varied the number of each type of light. Hyman was responsible for determining a linear relation between reaction time and the information transmitted. Given n equally probable choices, the average reaction time T required to choose among the choices is approximately T = b log2(n + 1), where b is a constant that can be determined empirically by fitting a line to measured data. The logarithm expresses the depth of the "choice tree" hierarchy; log2 indicates that a binary search was performed. The addition of 1 to n takes into account the "uncertainty about whether to respond or not, as well as about which response to make". In the case of choices with unequal probabilities, the law can be generalized as T = b H, where H is strongly related to the information-theoretic entropy of the decision, defined as H = sum over i of p_i log2(1/p_i + 1), where p_i refers to the probability of the ith alternative. Hick's law is similar in form to Fitts's law. Hick's law has a logarithmic form because people subdivide the total collection of choices into categories, eliminating about half of the remaining choices at each step, rather than considering each and every choice one by one, which would require linear time. E. Roth (1964) demonstrated a correlation between IQ and information processing speed, which is the reciprocal of the slope b of the function T = b log2(n + 1), where n is the number of choices.
The time it takes to come to a decision thus grows with the logarithm of the number of choices. Stimulus -- response compatibility is also known to affect choice reaction time under the Hick -- Hyman law. This means that the response should be similar to the stimulus itself (such as turning a steering wheel to turn the wheels of the car): the action the user performs is similar to the feedback the driver receives from the car. Studies suggest that the search for a word within a randomly ordered list -- in which the reaction time increases linearly with the number of items -- does not allow for the generalization of the law, since, in other conditions, the reaction time may not be linearly related to the logarithm of the number of elements and may show other deviations from the basic pattern. Exceptions to Hick's law have been identified in studies of verbal response to familiar stimuli, where there is no relationship or only a subtle increase in reaction time with an increased number of elements, and in saccade responses, where there is either no relationship or a decrease in saccadic time as the number of elements increases, an effect antagonistic to that postulated by Hick's law. The generalization of Hick's law was also tested in studies on the predictability of transitions associated with the reaction time of elements appearing in a structured sequence. This process was first described as being in accordance with Hick's law, but more recently it was shown that the relationship between predictability and reaction time is sigmoid rather than linear, associated with different modes of action.
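The relations above are easy to experiment with numerically. The following Python sketch is not from the article; the constant b and all values are illustrative (empirically, b is fitted to measured reaction-time data). It computes decision times for both the equal-probability form and the generalized entropy form:

```python
import math

def hick_time(n, b=0.15):
    # Equal-probability form: T = b * log2(n + 1).
    # b is an illustrative constant, normally fitted to measured data.
    return b * math.log2(n + 1)

def hick_hyman_time(probs, b=0.15):
    # Generalized form for unequal probabilities: T = b * H, where
    # H = sum(p_i * log2(1/p_i + 1)) over the alternatives.
    h = sum(p * math.log2(1 / p + 1) for p in probs if p > 0)
    return b * h

# With equal probabilities p_i = 1/n the two forms coincide, since
# 1/p_i + 1 = n + 1; doubling the choices adds a roughly constant increment.
print(hick_time(4))                 # T for 4 equally likely choices
print(hick_hyman_time([0.25] * 4))  # same value via the entropy form
```

Note how the logarithm mirrors the subdividing (binary-search-like) strategy the article describes: going from 4 to 8 choices adds about the same increment as going from 8 to 16.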
what is an avenging angel in the bible
Destroying angel (Bible) - wikipedia The Destroying Angel or Angel of Death in the Hebrew Bible is an entity sent out by Yahweh on several occasions to kill enemies of the Israelites. In 2 Samuel 24:15, it kills the inhabitants of Jerusalem. In I Chronicles 21:15, the same "angel of the Lord" is seen by David to stand "between the earth and the heaven, with a drawn sword in his hand stretched out against Jerusalem." Later, the angel of the Lord kills 185,000 men of Sennacherib's Assyrian army, thereby saving Hezekiah's Jerusalem, in II Kings 19:35. The angel (malak) is referred to under various terms, including Mashḥit (pronounced mash-heet(h) or mash-kheet(h); also Mashchit(h), מַשְׁחִית, and Ha-Mashchit(h) / Ha-Mashḥit, הַמַשְׁחִית), "destroying angel" (מַלְאָך הַמַשְׁחִית, malak ha-mashḥit; plural מַשְׁחִיתִים, mashchitim / mashchithim / mashḥitim, "spoilers, ravagers"), and Angel of the Lord. The related term "destroyer" (מְמִיתִים, memitim, "executioners", "slayers") is found in Job 33:22, and the plural "messengers of death" in Proverbs 16:14. Mashchith was also used as an alternate name for one of the seven compartments of Gehenna.
describe each of the four steps used in the process of creating a computer program
Computer program - wikipedia A computer program is a collection of instructions that performs a specific task when executed by a computer. A computer requires programs to function. A computer program is usually written by a computer programmer in a programming language. From the program in its human-readable form of source code, a compiler can derive machine code -- a form consisting of instructions that the computer can directly execute. Alternatively, a computer program may be executed with the aid of an interpreter. A part of a computer program that performs a well-defined task is known as an algorithm. A collection of computer programs, libraries, and related data is referred to as software. Computer programs may be categorized along functional lines, such as application software and system software. The earliest programmable machines preceded the invention of the digital computer. In 1801, Joseph-Marie Jacquard devised a loom that would weave a pattern by following a series of perforated cards. Patterns could be woven and repeated by arranging the cards. In 1837, Charles Babbage was inspired by Jacquard's loom to attempt to build the Analytical Engine. The names of the components of the calculating device were borrowed from the textile industry, in which yarn was brought from the store to be milled. The device would have had a "store" -- memory to hold 1,000 numbers of 40 decimal digits each. Numbers from the "store" would then have been transferred to the "mill" (analogous to the CPU of a modern machine) for processing. It was programmed using two sets of perforated cards -- one to direct the operation and the other for the input variables. However, after more than 17,000 pounds of the British government's money had been spent, the thousands of cogged wheels and gears never fully worked together. During a nine-month period in 1842 -- 43, Ada Lovelace translated the memoir of the Italian mathematician Luigi Menabrea.
The memoir covered the Analytical Engine. The translation contained Note G, which completely detailed a method for calculating Bernoulli numbers using the Analytical Engine. This note is recognized by some historians as the world's first written computer program. In 1936, Alan Turing introduced the Universal Turing machine -- a theoretical device that can model every computation that can be performed on a Turing complete computing machine. It is a finite-state machine that has an infinitely long read/write tape. The machine can move the tape back and forth, changing its contents as it performs an algorithm. The machine starts in the initial state, goes through a sequence of steps, and halts when it encounters the halt state. This machine is considered by some to be the origin of the stored-program computer -- used by John von Neumann (1946) for the "Electronic Computing Instrument" that now bears the von Neumann architecture name. The Z3 computer, invented by Konrad Zuse (1941) in Germany, was a digital and programmable computer. A digital computer uses electricity as the calculating component. The Z3 contained 2,400 relays to create the circuits, which provided a binary, floating-point, nine-instruction computer. The Z3 was programmed through a specially designed keyboard and punched tape. The Electronic Numerical Integrator And Computer (ENIAC; Fall 1945) was a Turing complete, general-purpose computer that used 17,468 vacuum tubes to create the circuits. At its core, it was a series of Pascalines wired together. Its 40 units weighed 30 tons, occupied 1,800 square feet (167 m²), and consumed $650 per hour (in 1940s currency) in electricity when idle. It had 20 base-10 accumulators. Programming the ENIAC took up to two months. Three function tables were on wheels and needed to be rolled to fixed function panels. Function tables were connected to function panels using heavy black cables. Each function table had 728 rotating knobs.
Programming the ENIAC also involved setting some of the 3,000 switches. Debugging a program took a week. The ENIAC featured parallel operations. Different sets of accumulators could simultaneously work on different algorithms. It used punched card machines for input and output, and it was controlled with a clock signal. It ran for eight years, calculating hydrogen bomb parameters, predicting weather patterns, and producing firing tables to aim artillery guns. The Manchester Small - Scale Experimental Machine (June 1948) was a stored - program computer. Programming transitioned away from moving cables and setting dials; instead, a computer program was stored in memory as numbers. Only three bits of memory were available to store each instruction, so it was limited to eight instructions. 32 switches were available for programming. Computers manufactured until the 1970s had front - panel switches for programming. The computer program was written on paper for reference. An instruction was represented by a configuration of on / off settings. After setting the configuration, an execute button was pressed. This process was then repeated. Computer programs also were manually input via paper tape or punched cards. After the medium was loaded, the starting address was set via switches and the execute button pressed. In 1961, the Burroughs B5000 was built specifically to be programmed in the ALGOL 60 language. The hardware featured circuits to ease the compile phase. In 1964, the IBM System / 360 was a line of six computers each having the same instruction set architecture. The Model 30 was the smallest and least expensive. Customers could upgrade and retain the same application software. Each System / 360 model featured multiprogramming. With operating system support, multiple programs could be in memory at once. When one was waiting for input / output, another could compute. Each model also could emulate other computers. 
Customers could upgrade to the System/360 and retain their IBM 7094 or IBM 1401 application software. Computer programming is the process of writing or editing source code. Editing source code involves testing, analyzing, refining, and sometimes coordinating with other programmers on a jointly developed program. A person who practices this skill is referred to as a computer programmer, software developer, or sometimes coder. The sometimes lengthy process of computer programming is usually referred to as software development. The term software engineering is becoming popular as the process is seen as an engineering discipline. Computer programs can be categorized by the programming language paradigm used to produce them. Two of the main paradigms are imperative and declarative. Imperative programming languages specify a sequential algorithm using declarations, expressions, and statements. One criticism of imperative languages is the side effect of an assignment statement on a class of variables called non-local variables. Declarative programming languages describe what computation should be performed and not how to compute it. Declarative programs omit the control flow and are considered sets of instructions. Two broad categories of declarative languages are functional languages and logical languages. The principle behind functional languages (like Haskell) is to not allow side effects, which makes it easier to reason about programs like mathematical functions. The principle behind logical languages (like Prolog) is to define the problem to be solved -- the goal -- and leave the detailed solution to the Prolog system itself. The goal is defined by providing a list of subgoals. Then each subgoal is defined by further providing a list of its subgoals, etc. If a path of subgoals fails to find a solution, then that subgoal is backtracked and another path is systematically attempted.
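The imperative/declarative split can be shown in a few lines. As a sketch (using Python rather than the Haskell or Prolog the article names), the same sum is written first as a sequence of statements mutating state, then as a description of what to compute:

```python
from functools import reduce

def total_imperative(numbers):
    # Imperative: explicit control flow, and an assignment statement
    # with a side effect on the variable `total` at each step.
    total = 0
    for n in numbers:
        total += n
    return total

def total_functional(numbers):
    # Declarative/functional: describe the computation as a fold over
    # the input, with no visible mutable state.
    return reduce(lambda acc, n: acc + n, numbers, 0)

print(total_imperative([1, 2, 3]))  # 6
print(total_functional([1, 2, 3]))  # 6
```

The functional version makes it easier to reason about the program as a mathematical function, which is the point the article makes about languages like Haskell.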
A computer program in the form of a human-readable computer programming language is called source code. Source code may be converted into an executable image by a compiler or executed immediately with the aid of an interpreter. Compilers are used to translate source code from a programming language into either object code or machine code. Object code needs further processing to become machine code, and machine code consists of the central processing unit's native instructions, ready for execution. Compiled computer programs are commonly referred to as executables, binary images, or simply as binaries -- a reference to the binary file format used to store the executable code. Interpreters are used to execute source code from a programming language line by line. The interpreter decodes each statement and performs its behavior. One advantage of interpreters is that they can easily be extended to an interactive session. The programmer is presented with a prompt, and individual lines of code are typed in and performed immediately. The main disadvantage of interpreters is that computer programs run slower than when compiled. Interpreting code is slower because the interpreter must decode each statement and then perform it. However, software development may be faster using an interpreter because testing is immediate when the compiling step is omitted. Another disadvantage of interpreters is that an interpreter must be present on the executing computer. By contrast, compiled computer programs need no compiler present during execution. Just-in-time compilers pre-compile computer programs just before execution. For example, the HotSpot Java virtual machine contains a just-in-time compiler that selectively compiles Java bytecode into machine code -- but only code that HotSpot predicts is likely to be used many times. Either compiled or interpreted programs might be executed in a batch process without human interaction.
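Python's built-in compile and exec functions can sketch the two execution models just described; this is an illustration of the distinction, not a description of how any production compiler works. The whole-source path translates once and then runs the translated form, while the line-by-line path decodes and performs each statement as it arrives:

```python
# "Compiler" path: translate the whole source once into a code object
# (Python bytecode), then execute the translated result.
source = "result = sum(range(10))"
code_object = compile(source, "<example>", "exec")
namespace = {}
exec(code_object, namespace)
print(namespace["result"])  # 45

# "Interpreter" path: decode and perform each statement in turn,
# the way an interactive session at a prompt does.
session = {}
for line in ["x = 2", "y = x * 3", "z = x + y"]:
    exec(line, session)
print(session["z"])  # 8
```

The second loop also shows why interpreters extend naturally to interactive sessions: each typed line is performed immediately against the accumulated session state.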
Scripting languages are often used to create batch processes. One common scripting language is Unix shell, and its executing environment is called the command - line interface. No properties of a programming language require it to be exclusively compiled or exclusively interpreted. The categorization usually reflects the most popular method of language execution. For example, Java is thought of as an interpreted language and C a compiled language, despite the existence of Java compilers and C interpreters. Typically, computer programs are stored in non-volatile memory until requested either directly or indirectly to be executed by the computer user. Upon such a request, the program is loaded into random - access memory, by a computer program called an operating system, where it can be accessed directly by the central processor. The central processor then executes ("runs '') the program, instruction by instruction, until termination. A program in execution is called a process. Termination is either by normal self - termination or by error -- software or hardware error. Many operating systems support multitasking which enables many computer programs to appear to run simultaneously on one computer. Operating systems may run multiple programs through process scheduling -- a software mechanism to switch the CPU among processes often so users can interact with each program while it runs. Within hardware, modern day multiprocessor computers or computers with multicore processors may run multiple programs. Multiple lines of the same computer program may be simultaneously executed using threads. Multithreading processors are optimized to execute multiple threads efficiently. A computer program in execution is normally treated as being different from the data the program operates on. However, in some cases, this distinction is blurred when a computer program modifies itself. The modified computer program is subsequently executed as part of the same program. 
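The blurred line between program and data can be seen in any language that can execute text at run time. As a minimal sketch (using Python's dynamic exec, an assumption for illustration), a program builds a function from a string, then rewrites that string and re-executes it, so the modified program runs as part of the same program:

```python
# The program's "source" is ordinary data: a string.
template = "def double(x):\n    return x * 2\n"
namespace = {}
exec(template, namespace)       # the data becomes executable code
print(namespace["double"](21))  # 42

# Modify the program by editing its own source text and executing it
# again; the redefinition replaces the original in the same namespace.
modified = template.replace("x * 2", "x * 3")
exec(modified, namespace)
print(namespace["double"](14))  # 42
```

This is only an analogue of true self-modifying machine code, but it captures the idea that the executed program can be treated as data and altered mid-run.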
Self-modifying code is possible for programs written in machine code, assembly language, Lisp, C, COBOL, PL/1, and Prolog. Computer programs may be categorized along functional lines. The main functional categories are application software and system software. System software includes the operating system, which couples computer hardware with application software. The purpose of the operating system is to provide an environment in which application software executes in a convenient and efficient manner. In addition to the operating system, system software includes embedded programs, boot programs, and microprograms. Application software designed for end users has a user interface. Application software not designed for the end user includes middleware, which couples one application with another. Application software also includes utility programs. The distinction between system software and application software is under debate. There are many types of application software. Utility programs are application programs designed to aid system administrators and computer programmers. An operating system is a computer program that acts as an intermediary between a user of a computer and the computer hardware. In the 1950s, the programmer, who was also the operator, would write a program and run it. After the program finished executing, the output may have been printed, or it may have been punched onto paper tape or cards for later processing. More often than not the program did not work. The programmer then looked at the console lights and fiddled with the console switches. If less fortunate, the programmer had a memory printout made for further study. In the 1960s, programmers reduced the amount of wasted time by automating the operator's job. A program called an operating system was kept in the computer at all times. Originally, operating systems were programmed in assembly; however, modern operating systems are typically written in C.
A stored-program computer requires an initial computer program stored in its read-only memory to boot. The boot process identifies and initializes all aspects of the system, from processor registers to device controllers to memory contents. Following the initialization process, this initial computer program loads the operating system and sets the program counter to begin normal operations. Independent of the host computer, a hardware device might have embedded firmware to control its operation. Firmware is used when the computer program is rarely or never expected to change, or when the program must not be lost when the power is off. Microcode programs control some central processing units and some other hardware. This code moves data between the registers, buses, arithmetic logic units, and other functional units in the CPU. Unlike conventional programs, microcode is not usually written by, or even visible to, the end users of systems; it is usually provided by the manufacturer and is considered internal to the device.
what movie is owen wilson talking about in midnight in paris
Midnight in Paris - Wikipedia Midnight in Paris is a 2011 fantasy comedy film written and directed by Woody Allen. Set in Paris, the film follows Gil Pender, a screenwriter, who is forced to confront the shortcomings of his relationship with his materialistic fiancée and their divergent goals, which become increasingly exaggerated as he travels back in time each night at midnight. The movie explores themes of nostalgia and modernism. Produced by the Spanish group Mediapro and Allen 's US - based Gravier Productions, the film stars Owen Wilson, Rachel McAdams, Kathy Bates, Adrien Brody, Carla Bruni, Marion Cotillard and Michael Sheen. It premiered at the 2011 Cannes Film Festival and was released in the United States on May 20, 2011. The film opened to critical acclaim and has commonly been cited as one of Allen 's best films in recent years. In 2012, the film won both the Academy Award for Best Original Screenplay and the Golden Globe Awards for Best Screenplay; and was nominated for three other Academy Awards: Best Picture, Best Director and Best Art Direction. In 2010, Gil Pender, a successful but creatively unfulfilled Hollywood screenwriter, and his fiancée Inez, are in Paris vacationing with Inez 's wealthy, conservative parents. Gil is struggling to finish his first novel, centered on a man who works in a nostalgia shop. Inez dismisses his ambition as a romantic daydream, and encourages him to stick with lucrative screenwriting. Gil is considering moving to Paris (which he notes, much to the dismay of his fiancée, is at its most beautiful in the rain). Inez is intent on living in Malibu. By chance, they are joined by Inez 's friend Paul, who is described as both pedantic and a pseudo-intellectual, and his wife Carol. Paul speaks with great authority but questionable accuracy on the history and artworks of Paris. Paul contradicts a tour guide at the Musée Rodin, and insists that his knowledge of Rodin 's relationships is more accurate than that of the guide. 
Inez admires him; Gil finds him insufferable. Gil gets drunk one night when Inez has gone off dancing with Paul and his wife, and becomes lost in the back streets of Paris. At midnight, a 1920s Peugeot Type 176 draws up beside him, and the passengers, dressed in 1920s clothing, urge him to join them. They go to a party for Jean Cocteau, where he encounters Cole Porter, as well as Zelda and Scott Fitzgerald. Gil realizes (but does not draw attention to the fact) that he has been transported back to the 1920s, an era he idolizes. Scott and Zelda, along with Cole and his wife, Linda Lee Porter, go to another bar, Chez Bricktop, where he sees Josephine Baker. At a third bar, Gil meets Juan Belmonte and Ernest Hemingway. Hemingway offers to show Gil's novel to Gertrude Stein, and Gil goes to fetch his manuscript from his hotel. However, as soon as he leaves, he finds he has returned to 2010 and that the bar where the 1920s literati were drinking is now a laundromat. Gil attempts to bring Inez to the past with him the following night, but she becomes impatient and peevishly returns to the hotel. Just after she leaves, the clock strikes midnight and the same car arrives, this time with Hemingway inside. He takes Gil to meet Stein, who agrees to read his novel and introduces him to Pablo Picasso and his lover Adriana. Adriana and Gil are instantly attracted to each other. Stein reads the novel's first line aloud, and Adriana says that she is hooked by these few lines and has always had a longing for the past, especially the Belle Époque of the late 1800s. Gil spends each of the next few nights in the past. His late-night wanderings annoy Inez and arouse the suspicion of her father, who hires a private detective to follow him. Gil spends more and more time with Adriana, who leaves Picasso for a brief dalliance with Hemingway. Gil realizes he is falling in love with her, leaving him in conflict.
He confides his predicament to Salvador Dalí, Man Ray and Luis Buñuel, but being surrealists they see nothing strange about his claim to have come from the future, finding it to be perfectly normal. They discuss the impossibility of Gil 's relationship with Adriana, and each of the artists envisages a different masterpiece inspired by such an unusual romance. Later on Gil suggests a movie plot to Buñuel - that of Buñuel 's own 1962 film The Exterminating Angel - and leaves while Buñuel continues to question the plot idea. In present - day Paris, while Inez shops for expensive furniture, Gil meets Gabrielle, an antique dealer and fellow admirer of the Lost Generation. He buys a Cole Porter gramophone record, and later finds Adriana 's diary from the 1920s at a book stall by the Seine, which reveals that she was in love with him. Reading that she dreamed of receiving a gift of earrings from him and then making love to him, Gil attempts to take a pair of Inez 's earrings to give to Adriana, but is thwarted by Inez 's early return from a trip. Gil buys earrings for Adriana. Returning to the past, he finds her at a party and tells Adriana, "I sense there are some complicated feelings you have for me. '' He takes her for a walk, they kiss, and he gives her the earrings. While she 's putting them on, a horse - drawn carriage comes down the street, and a richly - dressed couple inside the carriage invite Gil and Adriana for a ride. The carriage transports the passengers to the Belle Époque, an era Adriana considers Paris 's Golden Age. Gil and Adriana go first to Maxim 's Paris, then to the Moulin Rouge where they meet Henri de Toulouse - Lautrec, Paul Gauguin, and Edgar Degas. Gil asks what they thought the best era was, and the three determine that the greatest era was the Renaissance. 
The enthralled Adriana is offered a job designing ballet costumes, and proposes to Gil that they stay, but Gil, upon observing that different people long for different "golden ages '', has an epiphany, and realizes that despite the allure of nostalgia, any time can eventually become a dull "present '', so it 's best to embrace your actual present. Adriana however, elects to stay in the 1890s, and they part. Gil rewrites the first two chapters of his novel and retrieves his draft from Stein, who praises his progress as a writer and tells him that Hemingway likes it, but questions why the main character has not realized that his fiancée (based on Inez) is having an affair with a pedantic character (based on Paul). Gil returns to 2010 and confronts Inez. She admits to having slept with Paul, but dismisses it as a meaningless fling. Gil breaks up with her and decides to move to Paris. Amid Inez 's pique, Gil calmly leaves, after which Inez 's father tells her and her mother that he had Gil followed, though the detective has mysteriously disappeared. It is revealed that the detective found himself in the Versailles of Louis XVI and Marie Antoinette, and is last seen fleeing from the palace guards. Walking by the Seine at midnight, Gil bumps into Gabrielle and, after it starts to rain, he offers to walk her home and they learn that they share the love of Paris in the rain. The cast includes (in credits order): This is the second time McAdams and Wilson co-starred as a couple; they did so before in 2005 's Wedding Crashers. In comparing the two roles, McAdams describes the one in Midnight in Paris as being far more antagonistic than the role in Wedding Crashers. Allen had high praises for her performance and that of co-star Marion Cotillard. Cotillard was cast as Wilson 's other love interest, the charismatic Adriana. Carla Bruni, singer - songwriter and wife of former French president Nicolas Sarkozy, was recruited by Allen for a role as a museum guide. 
There were false reports that Allen re-filmed Bruni 's scenes with Léa Seydoux, but Seydoux rebuffed these rumors revealing she had an entirely separate role in the film. Allen also shot down reports that a scene with Bruni required over 30 takes: "I am appalled. I read these things and I could not believe my eyes... These are not exaggerations, but inventions from scratch. There is absolutely no truth. '' He continued to describe Bruni as "very professional '' and insisted he was pleased with her scenes, stating that "every frame will appear in the film. '' Allen employed a reverse approach in writing the screenplay for this film, by building the film 's plot around a conceived movie title, ' Midnight in Paris '. The time - travel portions of Allen 's storyline are evocative of the Paris of the 1920s described in Ernest Hemingway 's memoir A Moveable Feast, with Allen 's characters interacting with the likes of Hemingway, Gertrude Stein, and F. Scott and Zelda Fitzgerald, and uses the phrase "a moveable feast '' in two instances, with a copy of the book appearing in one scene. Allen originally wrote the character Gil as an east coast intellectual, but he rethought it when he and casting director Juliet Taylor began considering Owen Wilson for the role. "I thought Owen would be charming and funny but my fear was that he was not so eastern at all in his persona, '' says Allen. Allen realized that making Gil a Californian would actually make the character richer, so he rewrote the part and submitted it to Wilson, who readily agreed to do it. Allen describes him as "a natural actor ''. The set - up has certain plot points in common with the British sitcom Goodnight Sweetheart. Principal photography began in Paris in July 2010. Allen states that the fundamental aesthetic for the camera work was to give the film a warm ambiance. 
He describes that he likes it (the cinematography) "intensely red, intensely warm, because if you go to a restaurant and you're there with your wife or your girlfriend, and it's got red-flecked wallpaper and turn-of-the-century lights, you both look beautiful. Whereas if you're in a seafood restaurant and the lights are up, everybody looks terrible. So it looks nice. It's very flattering and very lovely." To achieve this he and his cinematographer, Darius Khondji, used primarily warm colors in the film's photography, filmed in flatter weather, and employed limited camera movements, in attempts to draw little attention to the camera work itself. This is the first Woody Allen film to go through a digital intermediate, instead of being color timed in the traditional photochemical way. According to Allen, its use here is a test to see if he likes it enough to use on his future films. Allen's directorial style placed more emphasis on the romantic and realistic elements of the film than the fantasy elements. He states that he "was interested only in this romantic tale, and anything that contributed to it that was fairytale was right for me. I didn't want to get into it. I only wanted to get into what bore down on his (Owen Wilson's) relationship with Marion." The film opens with a 3½-minute postcard-view montage of Paris, showing some of the iconic tourist sites. Kenneth Turan of the Los Angeles Times describes the montage as a stylistic approach that lasts longer than necessary to simply establish location. According to Turan, "Allen is saying: Pay attention -- this is a special place, a place where magic can happen." Midnight in Paris is the first Woody Allen film shot entirely on location in Paris, though both Love and Death (1975) and Everyone Says I Love You (1996) were partially filmed there.
Filming locations include Giverny, John XXIII Square (near Notre Dame), Montmartre, Deyrolle, the Palace of Versailles, the Opéra, Pont Alexandre III, the Sacré - Cœur, the Île de la Cité itself, and streets near the Panthéon. The film is co-produced by Allen 's Gravier Productions and the Catalan company Mediapro and was picked up by Sony Pictures Classics for distribution. It is the fourth film the two companies have co-produced, the others being Sweet and Lowdown, Whatever Works and You Will Meet a Tall Dark Stranger. In promoting the film, Allen was willing to do only a limited amount of publicity at its Cannes Film Festival debut in May. Wilson was already committed to promoting Pixar 's Cars 2, which opened in late June, several weeks after Allen 's film arrived in cinemas. Due to these challenges and the relatively small ($10 million) budget for promotion, Sony Classics had to perform careful media buying and press relations to promote the film. The film 's poster is a reference to Vincent van Gogh 's 1889 painting The Starry Night. The film made its debut at the 2011 Cannes Film Festival on Wednesday May 11, when it opened the festival as a first - ever screening for both professionals and the public; it was released nationwide in France that same day, Wednesday being the traditional day of change in French cinemas. It went on limited release in six theaters in the United States on May 20 and took $599,003 in the first weekend, spreading to 944 cinemas three weeks later, when it went on wide release. Midnight in Paris achieved the highest gross of any of Allen 's films in North America, before adjusting for inflation. The film earned $56.3 million in North America, overtaking his previous best, Hannah and Her Sisters, at $40 million. As of 2016, Midnight in Paris is the highest - grossing film directed by Woody Allen, with $151 million worldwide on a $17 million budget. Midnight in Paris received critical acclaim. 
On Rotten Tomatoes the film has an approval rating of 93 %, based on 208 reviews, with an average rating of 7.8 / 10. The site 's critical consensus reads, "It may not boast the depth of his classic films, but the sweetly sentimental Midnight in Paris is funny and charming enough to satisfy Woody Allen fans. '' The film has received Allen 's best reviews and score on the site since 1994 's Bullets Over Broadway. On Metacritic, the film has a score of 81 out of 100, based on 40 reviews, indicating "universal acclaim ''. The film received generally positive reviews after its premiere at the 64th Cannes Film Festival. Todd McCarthy from The Hollywood Reporter praised Darius Khondji 's cinematography and claimed the film "has the concision and snappy pace of Allen 's best work ''. A.O. Scott of The New York Times commented on Owen Wilson 's success at playing the Woody Allen persona. He states that the film is marvelously romantic and credibly blends "whimsy and wisdom ''. He praised Khondji 's cinematography, the supporting cast and remarked that it is a memorable film and that "Mr. Allen has often said that he does not want or expect his own work to survive, but as modest and lighthearted as Midnight in Paris is, it suggests otherwise: Not an ambition toward immortality so much as a willingness to leave something behind -- a bit of memorabilia, or art, if you like that word better -- that catches the attention and solicits the admiration of lonely wanderers in some future time. '' Roger Ebert gave the film 3 1⁄2 stars out of 4. He ended his review thus: "This is Woody Allen 's 41st film. He writes his films himself, and directs them with wit and grace. I consider him a treasure of the cinema. Some people take him for granted, although Midnight in Paris reportedly charmed even the jaded veterans of the Cannes press screenings. There is nothing to dislike about it. Either you connect with it or not.
I 'm wearying of movies that are for "everybody '' -- which means, nobody in particular. Midnight in Paris is for me, in particular, and that 's just fine with moi. '' Richard Roeper, an American film critic, gave Midnight in Paris an "A '', referring to it as a "wonderful film '' and "one of the best romantic comedies in recent years ''. He commented that the actors are uniformly brilliant and praised the film 's use of witty one - liners. In The Huffington Post, Rob Kirkpatrick said the film represented a return to form for the director ("it 's as if Woody has rediscovered Woody '') and called Midnight in Paris "a surprising film that casts a spell over us and reminds us of the magical properties of cinema, and especially of Woody Allen 's cinema. '' Midnight in Paris has been compared to Allen 's The Purple Rose of Cairo (1985), in that the functioning of the magical realism therein is never explained. David Edelstein of New York magazine commended that approach, stating that it eliminates "the sci - fi wheels and pulleys that tend to suck up so much screen time in time - travel movies. '' He goes on to applaud the film, stating that "this supernatural comedy is n't just Allen 's best film in more than a decade; it 's the only one that manages to rise above its tidy parable structure and be easy, graceful, and glancingly funny, as if buoyed by its befuddled hero 's enchantment. '' Peter Johnson of PopCitizen felt that the film 's nature as a "period piece '' was far superior to its comedic components, which he found lacking. "While the period settings of Midnight in Paris are almost worth seeing the film... it hardly qualifies as a moral compass to those lost in a nostalgic revelry, '' he asserts. Joe Morgenstern of The Wall Street Journal acknowledged the cast and the look of the film and, despite noting some familiar elements in the film 's conflict, praised Allen 's work on the film.
He wrote, "For the filmmaker who brought these intertwined universes into being, the film represents new energy in a remarkable career. ''. Peter Bradshaw of The Guardian, giving the film 3 out of 5 stars, described it as "an amiable amuse - bouche '' and "sporadically entertaining, light, shallow, self - plagiarising. '' He goes on to add that it 's "a romantic fantasy adventure to be compared with the vastly superior ideas of his comparative youth, such as the 1985 movie The Purple Rose of Cairo. '' In October 2013, the film was voted by the Guardian readers as the ninth best film directed by Woody Allen. More scathing is Richard Corliss of Time, who describes the film as "pure Woody Allen. Which is not to say great or even good Woody, but a distillation of the filmmaker 's passions and crotchets, and of his tendency to pass draconian judgment on characters the audience is not supposed to like... his Midnight strikes not sublime chimes but the clangor of snap judgments and frayed fantasy. '' Quentin Tarantino named Midnight in Paris as his favorite film of 2011. The film was well received in France. The website Allocine (Hello Cinema) gave it 4.2 out of 5 stars based on a sample of twenty reviews. Ten of the reviews gave it a full five stars, including Le Figaro, which praised the film 's evocation of its themes and said "one leaves the screening with a smile on one 's lips ''. The William Faulkner estate later filed a lawsuit against Sony Pictures Classics for the film 's bit of dialogue, "The past is not dead. Actually, it 's not even past, '' a paraphrasing of an often - quoted line from Faulkner 's 1950 book Requiem for a Nun ("The past is never dead. It 's not even past. ''), claiming that the paraphrasing was an unlicensed use of the estate. Faulkner is directly credited in the dialogue when Gil claims to have met the writer at a dinner party (though Faulkner is never physically portrayed in the film). 
Julie Ahrens of the Fair Use Project at Stanford University 's Center for Internet and Society was quoted as saying in response to the charge, "The idea that one person can control the use of those particular words seems ridiculous to me. Any kind of literary allusion is ordinarily celebrated. This seems to squarely fall in that tradition. '' Sony 's response stated that they consider the action "a frivolous lawsuit ''. In July 2013, a federal judge in Mississippi dismissed the lawsuit on fair use grounds. The soundtrack was released on December 9, 2011, and the film was released on Blu - ray and DVD on December 20, 2011.
how many episodes are in season one of black mirror
List of Black Mirror episodes - wikipedia Black Mirror is a British television series created by Charlie Brooker. The series is produced by Zeppotron for Endemol. Regarding the programme 's content and structure, Brooker noted, "each episode has a different cast, a different setting, and even a different reality. But they 're all about the way we live now; and the way we might be living in 10 minutes ' time if we 're clumsy. '' In September 2015, Netflix commissioned a third series of 12 episodes, which was later divided into two separate series, the third and fourth, each comprising six episodes. The third series was released on Netflix worldwide on 21 October 2016. As of 21 October 2016, 13 episodes of Black Mirror have been released, including one special, concluding the third series.
principled leadership is also known as what type of leadership
Leadership - wikipedia Leadership is both a research area and a practical skill encompassing the ability of an individual or organization to "lead '' or guide other individuals, teams, or entire organizations. The literature debates various viewpoints: contrasting Eastern and Western approaches to leadership, and also (within the West) US vs. European approaches. US academic environments define leadership as "a process of social influence in which a person can enlist the aid and support of others in the accomplishment of a common task ''. Leadership seen from a European and non-academic perspective encompasses a view of a leader who can be moved not only by communitarian goals but also by the search for personal power. Leadership can derive from a combination of several factors, as the European researcher Daniele Trevisani highlights: "Leadership is a holistic spectrum that can arise from: (1) higher levels of physical power, need to display power and control others, force superiority, ability to generate fear, or group - member 's need for a powerful group protector (Primal Leadership), (2) superior mental energies, superior motivational forces, perceivable in communication and behaviors, lack of fear, courage, determination (Psychoenergetic Leadership), (3) higher abilities in managing the overall picture (Macro-Leadership), (4) higher abilities in specialized tasks (Micro-Leadership), (5) higher ability in managing the execution of a task (Project Leadership), and (6) higher level of values, wisdom, and spirituality (Spiritual Leadership), where any Leader derives its Leadership Style from a unique mix of one or more of the former factors ''. Studies of leadership have produced theories involving traits, situational interaction, function, behavior, power, vision and values, charisma, and intelligence, among others. Sanskrit literature identifies ten types of leaders. 
Defining characteristics of the ten types of leaders are explained with examples from history and mythology. Aristocratic thinkers have postulated that leadership depends on one 's "blue blood '' or genes. Monarchy takes an extreme view of the same idea, and may prop up its assertions against the claims of mere aristocrats by invoking divine sanction (see the divine right of kings). Contrariwise, more democratically inclined theorists have pointed to examples of meritocratic leaders, such as the Napoleonic marshals profiting from careers open to talent. In the autocratic / paternalistic strain of thought, traditionalists recall the role of leadership of the Roman pater familias. Feminist thinking, on the other hand, may object to such models as patriarchal and posit against them emotionally attuned, responsive, and consensual empathetic guidance, which is sometimes associated with matriarchies. Comparable to the Roman tradition, the views of Confucianism on "right living '' relate very much to the ideal of the (male) scholar - leader and his benevolent rule, buttressed by a tradition of filial piety. Leadership is a matter of intelligence, trustworthiness, humaneness, courage, and discipline... Reliance on intelligence alone results in rebelliousness. Exercise of humaneness alone results in weakness. Fixation on trust results in folly. Dependence on the strength of courage results in violence. Excessive discipline and sternness in command result in cruelty. When one has all five virtues together, each appropriate to its function, then one can be a leader. -- Sun Tzu Machiavelli 's The Prince, written in the early 16th century, provided a manual for rulers ("princes '' or "tyrants '' in Machiavelli 's terminology) to gain and keep power. In the 19th century the elaboration of anarchist thought called the whole concept of leadership into question. (Note that the Oxford English Dictionary traces the word "leadership '' in English only as far back as the 19th century.) 
One response to this denial of élitism came with Leninism, which demanded an élite group of disciplined cadres to act as the vanguard of a socialist revolution, bringing into existence the dictatorship of the proletariat. Other historical views of leadership have addressed the seeming contrasts between secular and religious leadership. The doctrines of Caesaro - papism have recurred and had their detractors over several centuries. Christian thinking on leadership has often emphasized stewardship of divinely provided resources -- human and material -- and their deployment in accordance with a Divine plan. Compare servant leadership. For a more general take on leadership in politics, compare the concept of the statesperson. The search for the characteristics or traits of leaders has continued for centuries. Philosophical writings from Plato 's Republic to Plutarch 's Lives have explored the question "What qualities distinguish an individual as a leader? '' Underlying this search was the early recognition of the importance of leadership and the assumption that leadership is rooted in the characteristics that certain individuals possess. This idea that leadership is based on individual attributes is known as the "trait theory of leadership ''. A number of works in the 19th century -- when the traditional authority of monarchs, lords and bishops had begun to wane -- explored the trait theory at length: note especially the writings of Thomas Carlyle and of Francis Galton, whose works have prompted decades of research. In Heroes and Hero Worship (1841), Carlyle identified the talents, skills, and physical characteristics of men who rose to power. Galton 's Hereditary Genius (1869) examined leadership qualities in the families of powerful men. After showing that the numbers of eminent relatives dropped off when his focus moved from first - degree to second - degree relatives, Galton concluded that leadership was inherited. In other words, leaders were born, not developed. 
Both of these notable works lent great initial support for the notion that leadership is rooted in characteristics of a leader. Cecil Rhodes (1853 -- 1902) believed that public - spirited leadership could be nurtured by identifying young people with "moral force of character and instincts to lead '', and educating them in contexts (such as the collegiate environment of the University of Oxford) which further developed such characteristics. International networks of such leaders could help to promote international understanding and help "render war impossible ''. This vision of leadership underlay the creation of the Rhodes Scholarships, which have helped to shape notions of leadership since their creation in 1903. In the late 1940s and early 1950s, however, a series of qualitative reviews of these studies (e.g., Bird, 1940; Stogdill, 1948; Mann, 1959) prompted researchers to take a drastically different view of the driving forces behind leadership. In reviewing the extant literature, Stogdill and Mann found that while some traits were common across a number of studies, the overall evidence suggested that persons who are leaders in one situation may not necessarily be leaders in other situations. Subsequently, leadership was no longer characterized as an enduring individual trait, as situational approaches (see alternative leadership theories below) posited that individuals can be effective in certain situations, but not others. The focus then shifted away from traits of leaders to an investigation of the leader behaviors that were effective. This approach dominated much of the leadership theory and research for the next few decades. New methods and measurements were developed after these influential reviews that would ultimately reestablish trait theory as a viable approach to the study of leadership.
For example, improvements in researchers ' use of the round robin research design methodology allowed researchers to see that individuals can and do emerge as leaders across a variety of situations and tasks. Additionally, during the 1980s statistical advances allowed researchers to conduct meta - analyses, in which they could quantitatively analyze and summarize the findings from a wide array of studies. This advent allowed trait theorists to create a comprehensive picture of previous leadership research rather than rely on the qualitative reviews of the past. Equipped with these new methods, leadership researchers produced findings that reestablished individual traits as meaningful predictors of leadership. While the trait theory of leadership has certainly regained popularity, its reemergence has not been accompanied by a corresponding increase in sophisticated conceptual frameworks. Specifically, Zaccaro (2007) noted that trait theories still suffer from significant conceptual limitations. Considering the criticisms of the trait theory outlined above, several researchers have begun to adopt a different perspective on leader individual differences -- the leader attribute pattern approach. In contrast to the traditional approach, the leader attribute pattern approach is based on theorists ' arguments that the influence of individual characteristics on outcomes is best understood by considering the person as an integrated totality rather than a summation of individual variables. In other words, the leader attribute pattern approach argues that integrated constellations or combinations of individual differences may explain substantial variance in both leader emergence and leader effectiveness beyond that explained by single attributes, or by additive combinations of multiple attributes... In response to the early criticisms of the trait approach, theorists began to research leadership as a set of behaviors, evaluating the behavior of successful leaders, determining a behavior taxonomy, and identifying broad leadership styles.
David McClelland, for example, posited that leadership takes a strong personality with a well - developed positive ego. To lead, self - confidence and high self - esteem are useful, perhaps even essential. In 1939, Kurt Lewin, Ronald Lippitt, and Ralph White carried out the seminal work on the influence of leadership styles on performance. The researchers evaluated the performance of groups of eleven - year - old boys under different types of work climate. In each, the leader exercised his influence regarding the type of group decision making, praise and criticism (feedback), and the management of the group tasks (project management) according to three styles: authoritarian, democratic, and laissez - faire. The managerial grid model is also based on a behavioral theory. The model was developed by Robert Blake and Jane Mouton in 1964 and suggests five different leadership styles, based on the leaders ' concern for people and their concern for goal achievement. B.F. Skinner is the father of behavior modification and developed the concept of positive reinforcement. Positive reinforcement occurs when a positive stimulus is presented in response to a behavior, increasing the likelihood of that behavior in the future. The following is an example of how positive reinforcement can be used in a business setting. Assume praise is a positive reinforcer for a particular employee. This employee does not show up to work on time every day. The manager of this employee decides to praise the employee for showing up on time every day the employee actually shows up to work on time. As a result, the employee comes to work on time more often because the employee likes to be praised. In this example, praise (the stimulus) is a positive reinforcer for this employee because the employee arrives at work on time (the behavior) more frequently after being praised for showing up to work on time.
The use of positive reinforcement is a successful and growing technique used by leaders to motivate and attain desired behaviors from subordinates. Organizations such as Frito - Lay, 3M, Goodrich, Michigan Bell, and Emery Air Freight have all used reinforcement to increase productivity. Empirical research covering the last 20 years suggests that the use of reinforcement techniques yields a 17 percent increase in performance. Additionally, many reinforcement techniques such as the use of praise are inexpensive, providing higher performance for lower costs. Situational theory also appeared as a reaction to the trait theory of leadership. Social scientists argued that history was more than the result of intervention of great men, as Carlyle suggested. Herbert Spencer (1884) (and Karl Marx) said that the times produce the person and not the other way around. This theory assumes that different situations call for different characteristics; according to this group of theories, no single optimal psychographic profile of a leader exists. According to the theory, "what an individual actually does when acting as a leader is in large part dependent upon characteristics of the situation in which he functions. '' Some theorists started to synthesize the trait and situational approaches. Building upon the research of Lewin et al., academics began to normalize the descriptive models of leadership climates, defining three leadership styles and identifying which situations each style works better in. The authoritarian leadership style, for example, is approved in periods of crisis but fails to win the "hearts and minds '' of followers in day - to - day management; the democratic leadership style is more adequate in situations that require consensus building; finally, the laissez - faire leadership style is appreciated for the degree of freedom it provides, but as the leaders do not "take charge '', they can be perceived as a failure in protracted or thorny organizational problems.
Thus, theorists defined the style of leadership as contingent to the situation, which is sometimes classified as contingency theory. Four contingency leadership theories appear more prominently in recent years: Fiedler contingency model, Vroom - Yetton decision model, the path - goal theory, and the Hersey - Blanchard situational theory. The Fiedler contingency model bases the leader 's effectiveness on what Fred Fiedler called situational contingency. This results from the interaction of leadership style and situational favorability (later called situational control). The theory defined two types of leader: those who tend to accomplish the task by developing good relationships with the group (relationship - oriented), and those who have as their prime concern carrying out the task itself (task - oriented). According to Fiedler, there is no ideal leader. Both task - oriented and relationship - oriented leaders can be effective if their leadership orientation fits the situation. When there is a good leader - member relation, a highly structured task, and high leader position power, the situation is considered a "favorable situation ''. Fiedler found that task - oriented leaders are more effective in extremely favorable or unfavorable situations, whereas relationship - oriented leaders perform best in situations with intermediate favorability. Victor Vroom, in collaboration with Phillip Yetton (1973) and later with Arthur Jago (1988), developed a taxonomy for describing leadership situations, which was used in a normative decision model where leadership styles were connected to situational variables, defining which approach was more suitable to which situation. This approach was novel because it supported the idea that the same manager could rely on different group decision making approaches depending on the attributes of each situation. This model was later referred to as situational contingency theory. 
The path - goal theory of leadership was developed by Robert House (1971) and was based on the expectancy theory of Victor Vroom. According to House, the essence of the theory is "the meta proposition that leaders, to be effective, engage in behaviors that complement subordinates ' environments and abilities in a manner that compensates for deficiencies and is instrumental to subordinate satisfaction and individual and work unit performance ''. The theory identifies four leader behaviors, achievement - oriented, directive, participative, and supportive, that are contingent to the environment factors and follower characteristics. In contrast to the Fiedler contingency model, the path - goal model states that the four leadership behaviors are fluid, and that leaders can adopt any of the four depending on what the situation demands. The path - goal model can be classified both as a contingency theory, as it depends on the circumstances, and as a transactional leadership theory, as the theory emphasizes the reciprocity behavior between the leader and the followers. The Situational Leadership ® Model proposed by Hersey suggests four leadership styles and four levels of follower development. For effectiveness, the model posits that the leadership style must match the appropriate level of follower development. In this model, leadership behavior becomes a function not only of the characteristics of the leader, but of the characteristics of followers as well. Functional leadership theory (Hackman & Walton, 1986; McGrath, 1962; Adair, 1988; Kouzes & Posner, 1995) is a particularly useful theory for addressing specific leader behaviors expected to contribute to organizational or unit effectiveness.
This theory argues that the leader 's main job is to see that whatever is necessary to meet group needs is taken care of; thus, a leader can be said to have done their job well when they have contributed to group effectiveness and cohesion (Fleishman et al., 1991; Hackman & Wageman, 2005; Hackman & Walton, 1986). While functional leadership theory has most often been applied to team leadership (Zaccaro, Rittman, & Marks, 2001), it has also been effectively applied to broader organizational leadership (Zaccaro, 2001). In summarizing literature on functional leadership (see Kozlowski et al. (1996), Zaccaro et al. (2001), Hackman and Walton (1986), Hackman & Wageman (2005), Morgeson (2005)), Klein, Zeigert, Knight, and Xiao (2006) observed five broad functions a leader performs when promoting an organization 's effectiveness. These functions include environmental monitoring, organizing subordinate activities, teaching and coaching subordinates, motivating others, and intervening actively in the group 's work. A variety of leadership behaviors are expected to facilitate these functions. In initial work identifying leader behavior, Fleishman (1953) observed that subordinates perceived their supervisors ' behavior in terms of two broad categories referred to as consideration and initiating structure. Consideration includes behavior involved in fostering effective relationships. Examples of such behavior would include showing concern for a subordinate or acting in a supportive manner towards others. Initiating structure involves the actions of the leader focused specifically on task accomplishment. This could include role clarification, setting performance standards, and holding subordinates accountable to those standards. The Integrated Psychological theory of leadership is an attempt to integrate the strengths of the older theories (i.e.
traits, behavioral / styles, situational and functional) while addressing their limitations, largely by introducing a new element -- the need for leaders to develop their leadership presence, attitude toward others and behavioral flexibility by practicing psychological mastery. It also offers a foundation for leaders wanting to apply the philosophies of servant leadership and authentic leadership. Integrated Psychological theory began to attract attention after the publication of James Scouller 's Three Levels of Leadership model (2011). Scouller argued that the older theories offer only limited assistance in developing a person 's ability to lead effectively, and pointed out limitations in each of these approaches. Scouller proposed the Three Levels of Leadership model, which was later categorized as an "Integrated Psychological '' theory on the Businessballs education website. In essence, his model aims to summarize what leaders have to do, not only to bring leadership to their group or organization, but also to develop themselves technically and psychologically as leaders. The three levels in his model are Public, Private and Personal leadership. Scouller argued that self - mastery is the key to growing one 's leadership presence, building trusting relationships with followers and dissolving one 's limiting beliefs and habits, thereby enabling behavioral flexibility as circumstances change, while staying connected to one 's core values (that is, while remaining authentic). To support leaders ' development, he introduced a new model of the human psyche and outlined the principles and techniques of self - mastery, which include the practice of mindfulness meditation. Bernard Bass and colleagues developed the idea of two different types of leadership: transactional, which involves exchange of labor for rewards, and transformational, which is based on concern for employees, intellectual stimulation, and providing a group vision.
The transactional leader (Burns, 1978) is given power to perform certain tasks and reward or punish for the team 's performance. It gives the opportunity to the manager to lead the group, and the group agrees to follow his lead to accomplish a predetermined goal in exchange for something else. Power is given to the leader to evaluate, correct, and train subordinates when productivity is not up to the desired level, and reward effectiveness when the expected outcome is reached. Leader -- member exchange (LMX) theory addresses a specific aspect of the leadership process; it evolved from an earlier theory called the vertical dyad linkage (VDL) model. Both of these models focus on the interaction between leaders and individual followers. Similar to the transactional approach, this interaction is viewed as a fair exchange whereby the leader provides certain benefits such as task guidance, advice, support, and / or significant rewards and the followers reciprocate by giving the leader respect, cooperation, commitment to the task and good performance. However, LMX recognizes that leaders and individual followers will vary in the type of exchange that develops between them. LMX theorizes that the type of exchanges between the leader and specific followers can lead to the creation of in - groups and out - groups. In - group members are said to have high - quality exchanges with the leader, while out - group members have low - quality exchanges with the leader. In - group members are perceived by the leader as being more experienced, competent, and willing to assume responsibility than other followers. The leader begins to rely on these individuals to help with especially challenging tasks. If the follower responds well, the leader rewards him / her with extra coaching, favorable job assignments, and developmental experiences.
If the follower shows high commitment and effort, followed by additional rewards, both parties develop mutual trust, influence, and support of one another. Research shows that in - group members usually receive higher performance evaluations from the leader, higher satisfaction, and faster promotions than out - group members. In - group members are also likely to build stronger bonds with their leaders by sharing the same social backgrounds and interests. Out - group members often receive less time and more distant exchanges than their in - group counterparts. With out - group members, leaders expect no more than adequate job performance, good attendance, reasonable respect, and adherence to the job description in exchange for a fair wage and standard benefits. The leader spends less time with out - group members, they have fewer developmental experiences, and the leader tends to emphasize his / her formal authority to obtain compliance with leader requests. Research shows that out - group members are less satisfied with their job and organization, receive lower performance evaluations from the leader, see their leader as less fair, and are more likely to file grievances or leave the organization. Leadership can be perceived as a particularly emotion - laden process, with emotions entwined with the social influence process. In an organization, the leader 's mood has some effects on his / her group; these effects can be described at three levels. In research about client service, it was found that expressions of positive mood by the leader improve the performance of the group, although in other sectors there were other findings. Beyond the leader 's mood, his / her behavior is a source of employee positive and negative emotions at work. The leader creates situations and events that lead to emotional response. Certain leader behaviors displayed during interactions with their employees are the sources of these affective events. Leaders shape workplace affective events.
Examples include giving feedback, allocating tasks, and distributing resources. Since employee behavior and productivity are directly affected by their emotional states, it is imperative to consider employee emotional responses to organizational leaders. Emotional intelligence, the ability to understand and manage moods and emotions in the self and others, contributes to effective leadership within organizations. The neo-emergent leadership theory (from the Oxford school of leadership) sees leadership as created through the emergence of information by the leader or other stakeholders, not through the true actions of the leader himself. In other words, the reproduction of information or stories forms the basis of the perception of leadership by the majority. It is well known that the naval hero Lord Nelson often wrote his own versions of battles he was involved in, so that when he arrived home in England he would receive a true hero 's welcome. In modern society, the press, blogs and other sources report their own views of leaders, which may be based on reality, but may also be based on a political command, a payment, or an inherent interest of the author, media, or leader. Therefore, one can argue that the perception of all leaders is created and in fact does not reflect their true leadership qualities at all. Many personality characteristics have been found to be reliably associated with leadership emergence. The list includes, but is not limited to, the following (organized alphabetically): assertiveness, authenticity, Big Five personality factors, birth order, character strengths, dominance, emotional intelligence, gender identity, intelligence, narcissism, self - efficacy for leadership, self - monitoring and social motivation. The relationship between assertiveness and leadership emergence is curvilinear; individuals who are either low in assertiveness or very high in assertiveness are less likely to be identified as leaders.
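The curvilinear (inverted - U) relationship between assertiveness and leadership emergence can be made concrete with a small simulation. This is a hypothetical sketch only, not drawn from any study cited here: the peak location, the coefficients, and all variable names are assumptions chosen purely to illustrate the shape of the claim.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model: the log-odds of being identified as a leader
# fall off quadratically as assertiveness moves away from a moderate
# optimum. The peak (5.0) and the coefficients are illustrative
# assumptions, not empirical estimates.
assertiveness = rng.uniform(0, 10, 500)            # 0 = very low, 10 = very high
peak = 5.0                                         # assumed optimum (moderate assertiveness)
logit = 1.5 - 0.4 * (assertiveness - peak) ** 2    # quadratic penalty away from the peak
p_leader = 1 / (1 + np.exp(-logit))                # probability of leadership emergence

# Average emergence probability near the middle vs. at either extreme.
mid = p_leader[np.abs(assertiveness - peak) < 1].mean()
low = p_leader[assertiveness < 2].mean()
high = p_leader[assertiveness > 8].mean()
assert mid > low and mid > high                    # inverted-U: extremes lose out
```

Under these assumed numbers, individuals near moderate assertiveness have a high probability of emerging as leaders, while both the very low and very high extremes have a sharply reduced probability, which is exactly what "curvilinear" means in this context.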
Individuals who are more aware of their personality qualities, including their values and beliefs, and are less biased when processing self - relevant information, are more likely to be accepted as leaders. See Authentic Leadership. Those who emerge as leaders tend to be more (in order of strength of relationship with leadership emergence): extroverted, conscientious, emotionally stable, and open to experience, although these tendencies are stronger in laboratory studies of leaderless groups. Agreeableness, the last factor of the Big Five personality traits, does not seem to play any meaningful role in leadership emergence. Those born first in their families and only children are hypothesized to be more driven to seek leadership and control in social settings. Middle - born children tend to accept follower roles in groups, and later - borns are thought to be rebellious and creative. Those seeking leadership positions in a military organization had elevated scores on a number of indicators of strength of character, including honesty, hope, bravery, industry, and teamwork. Individuals with dominant personalities -- they describe themselves as high in the desire to control their environment and influence other people, and are likely to express their opinions in a forceful way -- are more likely to act as leaders in small - group situations. Individuals with high emotional intelligence have an increased ability to understand and relate to people. They have skills in communicating and decoding emotions and they deal with others wisely and effectively. Such people communicate their ideas in more robust ways, are better able to read the politics of a situation, are less likely to lose control of their emotions, are less likely to be inappropriately angry or critical, and in consequence are more likely to emerge as leaders. Masculine individuals are more likely to emerge as leaders than are feminine individuals.
This trend is expected to change in the modern era: in more developed countries, women have begun to rise to leadership positions in society as they have gained rights equal to men 's. Individuals with higher intelligence exhibit superior judgement, higher verbal skills (both written and oral), quicker learning and acquisition of knowledge, and are more likely to emerge as leaders. The correlation between IQ and leadership emergence has been found to be between .25 and .30. However, groups generally prefer leaders who do not exceed the intelligence of the average member by a wide margin, as they fear that high intelligence may translate into differences in communication, trust, interests and values. Individuals who take on leadership roles in turbulent situations, such as groups facing a threat or ones in which status is determined by intense competition among rivals within the group, tend to be narcissistic: arrogant, self - absorbed, hostile, and very self - confident. Confidence in one 's ability to lead is associated with increases in willingness to accept a leadership role and success in that role. High self - monitors are more likely to emerge as the leader of a group than are low self - monitors, since they are more concerned with status enhancement and are more likely to adapt their actions to fit the demands of the situation. Individuals who are both success - oriented and affiliation - oriented, as assessed by projective measures, are more active in group problem - solving settings and are more likely to be elected to positions of leadership in such groups. A leadership style is a leader 's style of providing direction, implementing plans, and motivating people. It is the result of the philosophy, personality, and experience of the leader. Rhetoric specialists have also developed models for understanding leadership (Robert Hariman, Political Style; Philippe - Joseph Salazar, L'Hyperpolitique. Technologies politiques De La Domination).
Different situations call for different leadership styles. In an emergency, when there is little time to converge on an agreement and where a designated authority has significantly more experience or expertise than the rest of the team, an autocratic leadership style may be most effective; however, in a highly motivated and aligned team with a homogeneous level of expertise, a more democratic or laissez - faire style may be more effective. The style adopted should be the one that most effectively achieves the objectives of the group while balancing the interests of its individual members. A field in which leadership style has gained strong attention is that of military science, which has recently expressed a holistic and integrated view of leadership, including how a leader 's physical presence determines how others perceive that leader. The factors of physical presence are military bearing, physical fitness, confidence, and resilience. The leader 's intellectual capacity helps to conceptualize solutions and acquire knowledge to do the job. A leader 's conceptual abilities apply agility, judgment, innovation, interpersonal tact, and domain knowledge. Domain knowledge for leaders encompasses tactical and technical knowledge as well as cultural and geopolitical awareness. Under the autocratic leadership style, all decision - making powers are centralized in the leader, as with dictators. Autocratic leaders do not entertain any suggestions or initiatives from subordinates. Autocratic management can be successful because it provides strong motivation to the manager. It permits quick decision - making, as only one person decides for the whole group and keeps each decision to him / herself until he / she feels it needs to be shared with the rest of the group. The democratic leadership style consists of the leader sharing the decision - making abilities with group members by promoting the interests of the group members and by practicing social equality.
This has also been called shared leadership. In laissez - faire or free - rein leadership, decision - making is passed on to the subordinates. The subordinates are given full right and power to make decisions, to establish goals, and to work out the problems or hurdles. Task - oriented leadership is a style in which the leader is focused on the tasks that need to be performed in order to meet a certain production goal. Task - oriented leaders are generally more concerned with producing a step - by - step solution for a given problem or goal, strictly making sure deadlines are met, and reaching target outcomes. Relationship - oriented leadership is a contrasting style in which the leader is more focused on the relationships amongst the group and is generally more concerned with the overall well - being and satisfaction of group members. Relationship - oriented leaders emphasize communication within the group, show trust and confidence in group members, and show appreciation for work done. Task - oriented leaders are typically less concerned with the idea of catering to group members, and more concerned with acquiring a certain solution to meet a production goal. For this reason, they typically are able to make sure that deadlines are met, yet their group members ' well - being may suffer. Relationship - oriented leaders are focused on developing the team and the relationships in it. The positives of having this kind of environment are that team members are more motivated and have support. However, the emphasis on relations as opposed to getting a job done might make productivity suffer. Another factor that covaries with leadership style is whether the person is male or female. When men and women come together in groups, they tend to adopt different leadership styles. Men generally assume an agentic leadership style. They are task - oriented, active, decision - focused, independent and goal - oriented.
Women, on the other hand, are generally more communal when they assume a leadership position; they strive to be helpful towards others, warm in relation to others, understanding, and mindful of others ' feelings. In general, when women are asked to describe themselves to others in newly formed groups, they emphasize their open, fair, responsible, and pleasant communal qualities. They give advice, offer assurances, and manage conflicts in an attempt to maintain positive relationships among group members. Women connect more positively to group members by smiling, maintaining eye contact, and responding tactfully to others ' comments. Men, conversely, describe themselves as influential, powerful and proficient at the task that needs to be done. They tend to place more focus on initiating structure within the group, setting standards and objectives, identifying roles, defining responsibilities and standard operating procedures, proposing solutions to problems, monitoring compliance with procedures, and finally, emphasizing the need for productivity and efficiency in the work that needs to be done. As leaders, men are primarily task - oriented, but women tend to be both task - and relationship - oriented. However, it is important to note that these sex differences are only tendencies, and do not manifest themselves within men and women across all groups and situations. In the past, some researchers have argued that the actual influence of leaders on organizational outcomes is overrated and romanticized as a result of biased attributions about leaders (Meindl & Ehrlich, 1987). Despite these assertions, however, it is largely recognized and accepted by practitioners and researchers that leadership is important, and research supports the notion that leaders do contribute to key organizational outcomes (Day & Lord, 1988; Kaiser, Hogan, & Craig, 2008). To facilitate successful performance it is important to understand and accurately measure leadership performance.
Job performance generally refers to behavior that is expected to contribute to organizational success (Campbell, 1990). Campbell identified a number of specific types of performance dimensions; leadership was one of the dimensions that he identified. There is no consistent, overall definition of leadership performance (Yukl, 2006). Many distinct conceptualizations are often lumped together under the umbrella of leadership performance, including outcomes such as leader effectiveness, leader advancement, and leader emergence (Kaiser et al., 2008). For instance, leadership performance may be used to refer to the career success of the individual leader, performance of the group or organization, or even leader emergence. Each of these measures can be considered conceptually distinct. While these aspects may be related, they are different outcomes and their inclusion should depend on the applied or research focus. A toxic leader is someone who has responsibility over a group of people or an organization, and who abuses the leader -- follower relationship by leaving the group or organization in a worse - off condition than when he / she joined it. Most theories in the 20th century argued that great leaders were born, not made. Current studies have indicated that leadership is much more complex and can not be boiled down to a few key traits of an individual. Years of observation and study have indicated that one such trait or a set of traits does not make an extraordinary leader. What scholars have been able to arrive at is that leadership traits of an individual do not change from situation to situation; such traits include intelligence, assertiveness, or physical attractiveness. However, each key trait may be applied to situations differently, depending on the circumstances. The following summarizes the main leadership traits found in research by Jon P. Howell, business professor at New Mexico State University and author of the book Snapshots of Great Leadership. 
Determination and drive include traits such as initiative, energy, assertiveness, perseverance and sometimes dominance. People with these traits often tend to wholeheartedly pursue their goals, work long hours, are ambitious, and often are very competitive with others. Cognitive capacity includes intelligence, analytical and verbal ability, behavioral flexibility, and good judgment. Individuals with these traits are able to formulate solutions to difficult problems, work well under stress or deadlines, adapt to changing situations, and create well - thought - out plans for the future. Howell provides examples of Steve Jobs and Abraham Lincoln as encompassing the traits of determination and drive as well as possessing cognitive capacity, demonstrated by their ability to adapt to their continuously changing environments. Self - confidence encompasses the traits of high self - esteem, assertiveness, emotional stability, and self - assurance. Individuals who are self - confident do not doubt themselves or their abilities and decisions; they also have the ability to project this self - confidence onto others, building their trust and commitment. Integrity is demonstrated in individuals who are truthful, trustworthy, principled, consistent, dependable, loyal, and not deceptive. Leaders with integrity often share these values with their followers, as this trait is mainly an ethics issue. It is often said that these leaders keep their word and are honest and open with their cohorts. Sociability describes individuals who are friendly, extroverted, tactful, flexible, and interpersonally competent. Such a trait enables leaders to be accepted well by the public, use diplomatic measures to solve issues, as well as hold the ability to adapt their social persona to the situation at hand. According to Howell, Mother Teresa is an exceptional example who embodies integrity, assertiveness, and social abilities in her diplomatic dealings with the leaders of the world. 
Few great leaders encompass all of the traits listed above, but many have the ability to apply a number of them to succeed as front - runners of their organization or situation. One of the more recent definitions of leadership comes from Werner Erhard, Michael C. Jensen, Steve Zaffron, and Kari Granger who describe leadership as "an exercise in language that results in the realization of a future that was n't going to happen anyway, which future fulfills (or contributes to fulfilling) the concerns of the relevant parties... ''. This definition ensures that leadership is talking about the future and includes the fundamental concerns of the relevant parties. This differs from relating to the relevant parties as "followers '' and calling up an image of a single leader with others following. Rather, a future that fulfills on the fundamental concerns of the relevant parties indicates the future that was n't going to happen is not the "idea of the leader '', but rather is what emerges from digging deep to find the underlying concerns of those who are impacted by the leadership. An organization that is established as an instrument or means for achieving defined objectives has been referred to as a formal organization. Its design specifies how goals are subdivided and reflected in subdivisions of the organization. Divisions, departments, sections, positions, jobs, and tasks make up this work structure. Thus, the formal organization is expected to behave impersonally in regard to relationships with clients or with its members. According to Weber 's definition, entry and subsequent advancement is by merit or seniority. Employees receive a salary and enjoy a degree of tenure that safeguards them from the arbitrary influence of superiors or of powerful clients. The higher one 's position in the hierarchy, the greater one 's presumed expertise in adjudicating problems that may arise in the course of the work carried out at lower levels of the organization. 
It is this bureaucratic structure that forms the basis for the appointment of heads or chiefs of administrative subdivisions in the organization and endows them with the authority attached to their position. In contrast to the appointed head or chief of an administrative unit, a leader emerges within the context of the informal organization that underlies the formal structure. The informal organization expresses the personal objectives and goals of the individual membership. Their objectives and goals may or may not coincide with those of the formal organization. The informal organization represents an extension of the social structures that generally characterize human life -- the spontaneous emergence of groups and organizations as ends in themselves. In prehistoric times, humanity was preoccupied with personal security, maintenance, protection, and survival. Now humanity spends a major portion of waking hours working for organizations. The need to identify with a community that provides security, protection, maintenance, and a feeling of belonging has continued unchanged from prehistoric times. This need is met by the informal organization and its emergent, or unofficial, leaders. Leaders emerge from within the structure of the informal organization. Their personal qualities, the demands of the situation, or a combination of these and other factors attract followers who accept their leadership within one or several overlay structures. Instead of the authority of position held by an appointed head or chief, the emergent leader wields influence or power. Influence is the ability of a person to gain co-operation from others by means of persuasion or control over rewards. Power is a stronger form of influence because it reflects a person 's ability to enforce action through the control of a means of punishment. A leader is a person who influences a group of people towards a specific result. It is not dependent on title or formal authority. 
(Elevos, paraphrased from Leaders, Bennis, and Leadership Presence, Halpern & Lubar.) Ogbonnia (2007) defines an effective leader "as an individual with the capacity to consistently succeed in a given condition and be viewed as meeting the expectations of an organization or society. '' Leaders are recognized by their capacity for caring for others, clear communication, and a commitment to persist. An individual who is appointed to a managerial position has the right to command and enforce obedience by virtue of the authority of their position. However, she or he must possess adequate personal attributes to match this authority, because authority is only potentially available to him / her. In the absence of sufficient personal competence, a manager may be confronted by an emergent leader who can challenge her / his role in the organization and reduce it to that of a figurehead. However, only authority of position has the backing of formal sanctions. It follows that whoever wields personal influence and power can legitimize this only by gaining a formal position in the hierarchy, with commensurate authority. Leadership can be defined as one 's ability to get others to willingly follow. Every organization needs leaders at every level. Over the years the philosophical terminology of "management '' and "leadership '' have, in the organizational context, been used both as synonyms and with clearly differentiated meanings. Debate is fairly common about whether the use of these terms should be restricted, and generally reflects an awareness of the distinction made by Burns (1978) between "transactional '' leadership (characterized by e.g. emphasis on procedures, contingent reward, management by exception) and "transformational '' leadership (characterized by e.g. charisma, personal relationships, creativity). In contrast to individual leadership, some organizations have adopted group leadership. 
In this so - called shared leadership, more than one person provides direction to the group as a whole. It is furthermore characterized by shared responsibility, cooperation and mutual influence among the team members. Some organizations have taken this approach in hopes of increasing creativity, reducing costs, or downsizing. Others may see the traditional leadership of a boss as costing too much in team performance. In some situations, the team members best able to handle any given phase of the project become the temporary leaders. Additionally, as each team member has the opportunity to experience the elevated level of empowerment, it energizes staff and feeds the cycle of success. Leaders who demonstrate persistence, tenacity, determination, and synergistic communication skills will bring out the same qualities in their groups. Good leaders use their own inner mentors to energize their team and organizations and lead a team to achieve success. According to the National School Boards Association (USA), such group leaderships or leadership teams have specific characteristics in common with well - functioning teams. Self - leadership is a process that occurs within an individual, rather than an external act. It is an expression of who we are as people. Mark van Vugt and Anjana Ahuja in Naturally Selected: The Evolutionary Science of Leadership present evidence of leadership in nonhuman animals, from ants and bees to baboons and chimpanzees. They suggest that leadership has a long evolutionary history and that the same mechanisms underpinning leadership in humans can be found in other social species, too. Richard Wrangham and Dale Peterson, in Demonic Males: Apes and the Origins of Human Violence, present evidence that only humans and chimpanzees, among all the animals living on Earth, share a similar tendency for a cluster of behaviors: violence, territoriality, and competition for uniting behind the one chief male of the land.
This position is contentious. Many animals beyond apes are territorial, compete, exhibit violence, and have a social structure controlled by a dominant male (lions, wolves, etc.), suggesting Wrangham and Peterson 's evidence is not empirical. However, we must examine other species as well, including elephants (which are matriarchal and follow an alpha female), meerkats (who are likewise matriarchal), and many others. By comparison, bonobos, the second - closest species - relatives of humans, do not unite behind the chief male of the land. The bonobos show deference to an alpha or top - ranking female that, with the support of her coalition of other females, can prove as strong as the strongest male. Thus, if leadership amounts to getting the greatest number of followers, then among the bonobos, a female almost always exerts the strongest and most effective leadership. However, not all scientists agree on the allegedly peaceful nature of the bonobo or its reputation as a "hippie chimp ''. Leadership, although largely talked about, has been described as one of the least understood concepts across all cultures and civilizations. Over the years, many researchers have stressed the prevalence of this misunderstanding, stating that the existence of several flawed assumptions, or myths, concerning leadership often interferes with individuals ' conception of what leadership is all about (Gardner, 1965; Bennis, 1975). According to some, leadership is determined by distinctive dispositional characteristics present at birth (e.g., extraversion; intelligence; ingenuity). However, according to Forsyth (2009) there is evidence to show that leadership also develops through hard work and careful observation. Thus, effective leadership can result from nature (i.e., innate talents) as well as nurture (i.e., acquired skills). 
Although leadership is certainly a form of power, it is not demarcated by power over people -- rather, it is a power with people that exists as a reciprocal relationship between a leader and his / her followers (Forsyth, 2009). Despite popular belief, the use of manipulation, coercion, and domination to influence others is not a requirement for leadership. In actuality, individuals who seek group consent and strive to act in the best interests of others can also become effective leaders (e.g., class president; court judge). The validity of the assertion that groups flourish when guided by effective leaders can be illustrated using several examples. For instance, according to Baumeister et al. (1988), the bystander effect (failure to respond or offer assistance) that tends to develop within groups faced with an emergency is significantly reduced in groups guided by a leader. Moreover, it has been documented that group performance, creativity, and efficiency all tend to climb in businesses with designated managers or CEOs. However, the difference leaders make is not always positive in nature. Leaders sometimes focus on fulfilling their own agendas at the expense of others, including his / her own followers (e.g., Pol Pot; Josef Stalin). Leaders who focus on personal gain by employing stringent and manipulative leadership styles often make a difference, but usually do so through negative means. In Western cultures it is generally assumed that group leaders make all the difference when it comes to group influence and overall goal - attainment. Although common, this romanticized view of leadership (i.e., the tendency to overestimate the degree of control leaders have over their groups and their groups ' outcomes) ignores the existence of many other factors that influence group dynamics. 
For example, group cohesion, communication patterns among members, individual personality traits, group context, the nature or orientation of the work, as well as behavioral norms and established standards influence group functionality in varying capacities. For this reason, it is unwarranted to assume that all leaders are in complete control of their groups ' achievements. Despite preconceived notions, not all groups need have a designated leader. Groups that are primarily composed of women, are limited in size, are free from stressful decision - making, or only exist for a short period of time (e.g., student work groups; pub quiz / trivia teams) often undergo a diffusion of responsibility, where leadership tasks and roles are shared amongst members (Schmid Mast, 2002; Berdahl & Anderson, 2007; Guastello, 2007). Although research has indicated that group members ' dependence on group leaders can lead to reduced self - reliance and overall group strength, most people actually prefer to be led than to be without a leader (Berkowitz, 1953). This "need for a leader '' becomes especially strong in troubled groups that are experiencing some sort of conflict. Group members tend to be more contented and productive when they have a leader to guide them. Although individuals filling leadership roles can be a direct source of resentment for followers, most people appreciate the contributions that leaders make to their groups and consequently welcome the guidance of a leader (Stewart & Manz, 1995). In most cases, these teams are tasked to operate in remote and changeable environments with limited support or backup (action environments). Leadership of people in these environments requires a different set of skills to that of front line management. These leaders must effectively operate remotely and negotiate the needs of the individual, team, and task within a changeable environment. This has been termed action oriented leadership. 
Some examples of demonstrations of action oriented leadership include extinguishing a rural fire, locating a missing person, leading a team on an outdoor expedition, or rescuing a person from a potentially hazardous environment. Other examples include modern technology deployments of small / medium - sized IT teams into client plant sites. Leadership of these teams requires hands - on experience and a lead - by - example attitude to empower team members to make well thought out and concise decisions independent of executive management and / or home base decision makers. Zachary Hansen was an early adopter of Scrum / Kanban branch development methodologies during the mid - 1990s to alleviate the dependency that field teams had on trunk - based development. This method of just - in - time, action - oriented development and deployment allowed remote plant sites to deploy up - to - date software patches frequently and without dependency on core team deployment schedules, satisfying the client 's need to rapidly patch production environment bugs as needed. Carlyle 's 1840 "Great Man theory '', which emphasized the role of leading individuals, met opposition in the 19th and 20th centuries. Karl Popper noted in 1945 that leaders can mislead and make mistakes -- he warned against deferring to "great men ''. Noam Chomsky and others have subjected the concept of leadership to critical thinking and have provided an analysis that asserts that people abrogate their responsibility to think and will actions for themselves. While the conventional view of leadership may satisfy people who "want to be told what to do '', these critics say that one should question why they are being subjected to a will or intellect other than their own if the leader is not a subject - matter expert (SME).
Concepts such as autogestion, employeeship, and common civic virtue, etc., challenge the fundamentally anti-democratic nature of the leadership principle by stressing individual responsibility and / or group authority in the workplace and elsewhere and by focusing on the skills and attitudes that a person needs in general rather than separating out "leadership '' as the basis of a special class of individuals. Similarly, various historical calamities (such as World War II) can be attributed to a misplaced reliance on the principle of leadership as exhibited in dictatorship. The idea of leaderism paints leadership and its excesses in a negative light. Executives are energetic, outgoing, and competitive. They can be visionary, hard - working, and decisive. However, managers need to be aware of unsuccessful executives who once showed management potential but who are later dismissed or retired early. They typically fail because of personality factors rather than job performance.
name of the home minister of india 2018
Minister of Home Affairs (India) - Wikipedia The Minister of Home Affairs (or simply, the Home Minister) is the head of the Ministry of Home Affairs of the Government of India. One of the senior-most offices in the Union Cabinet, the chief responsibility of the Home Minister is the maintenance of India 's internal security; the country 's large police force comes under his ambit. Occasionally, he / she is assisted by the Minister of State of Home Affairs and the lower - ranked Deputy Minister of Home Affairs. Ever since the time of independent India 's first Home Minister, Sardar Vallabhbhai Patel, the office has been seen as second in seniority only to the Prime Minister in the Union Cabinet. Like Patel, several Home Ministers have since held the additional portfolio of Deputy Prime Minister. As of February 2018, three Home Ministers have gone on to become Prime Minister: Lal Bahadur Shastri, Charan Singh and P.V. Narasimha Rao. Since 26 May 2014, the Home Minister of India has been Rajnath Singh of the Bharatiya Janata Party, taking over the reins from Sushilkumar Shinde.
where does the epstein barr virus come from
Epstein -- Barr virus - wikipedia The Epstein -- Barr virus (EBV), also called human herpesvirus 4 (HHV - 4), is one of eight known human herpesvirus types in the herpes family, and is one of the most common viruses in humans. It is best known as the cause of infectious mononucleosis ("glandular fever ''). It is also associated with particular forms of cancer, such as Hodgkin 's lymphoma, Burkitt 's lymphoma, gastric cancer, nasopharyngeal carcinoma, and conditions associated with human immunodeficiency virus (HIV), such as hairy leukoplakia and central nervous system lymphomas. There is evidence that infection with EBV is associated with a higher risk of certain autoimmune diseases, especially dermatomyositis, systemic lupus erythematosus, rheumatoid arthritis, Sjögren 's syndrome, and multiple sclerosis. Some 200,000 cancer cases per year are thought to be attributable to EBV. Infection with EBV occurs by the oral transfer of saliva and genital secretions. Most people become infected with EBV and gain adaptive immunity. In the United States, about half of all five - year - old children and about 90 percent of adults have evidence of previous infection. Infants become susceptible to EBV as soon as maternal antibody protection disappears. Many children become infected with EBV, and these infections usually cause no symptoms or are indistinguishable from the other mild, brief illnesses of childhood. In the United States and other developed countries, many people are not infected with EBV in their childhood years. When infection with EBV occurs during adolescence, it causes infectious mononucleosis 35 to 50 percent of the time. EBV infects B cells of the immune system and epithelial cells. Once EBV 's initial lytic infection is brought under control, EBV latency persists in the individual 's B cells for the rest of the individual 's life. 
As previously stated, children who contract EBV exhibit few symptoms or may even appear asymptomatic, but when EBV is contracted in adolescence or adulthood it may cause fatigue, fever, inflamed throat, swollen lymph nodes in the neck, enlarged spleen, swollen liver, or rash. The virus is approximately 122 -- 180 nm in diameter and is composed of a double helix of DNA which contains about 172,000 base pairs and 85 genes. The DNA is surrounded by a protein nucleocapsid. This nucleocapsid is surrounded by a tegument made of protein, which in turn is surrounded by an envelope containing both lipids and surface projections of glycoproteins which are essential to infection of the host cell. The term viral tropism refers to which cell types EBV infects. EBV can infect different cell types, including B cells and epithelial cells. The viral three - part glycoprotein complexes of gHgL gp42 mediate B cell membrane fusion, whereas the two - part complexes of gHgL mediate epithelial cell membrane fusion. EBV that are made in the B cells have low numbers of gHgLgp42 complexes, because these three - part complexes interact with Human - leukocyte - antigen class II molecules present in B cells in the endoplasmic reticulum and are degraded. In contrast, EBV from epithelial cells are rich in the three - part complexes because these cells do not normally contain HLA class II molecules. As a consequence, EBV made from B cells are more infectious to epithelial cells, and EBV made from epithelial cells are more infectious to B cells. Viruses lacking the gp42 portion are able to bind to human B cells but unable to infect. EBV can infect both B cells and epithelial cells. The mechanisms for entering these two cells are different. To enter B cells, viral glycoprotein gp350 binds to cellular receptor CD21 (also known as CR2). Then, viral glycoprotein gp42 interacts with cellular MHC class II molecules. This triggers fusion of the viral envelope with the cell membrane, allowing EBV to enter the B cell. 
Human CD35, also known as complement receptor 1 (CR1), is an additional attachment factor for gp350 / 220, and can provide a route for entry of EBV into CD21 - negative cells, including immature B - cells. EBV infection downregulates expression of CD35. To enter epithelial cells, viral protein BMRF - 2 interacts with cellular β1 integrins. Then, viral protein gH / gL interacts with cellular αvβ6 / αvβ8 integrins. This triggers fusion of the viral envelope with the epithelial cell membrane, allowing EBV to enter the epithelial cell. Unlike B cell entry, epithelial cell entry is actually impeded by viral glycoprotein gp42. Once EBV enters the cell, the viral capsid dissolves and the viral genome is transported to the cell nucleus. The lytic cycle, or productive infection, results in the production of infectious virions. EBV can undergo lytic replication in both B cells and epithelial cells. In B cells, lytic replication normally only takes place after reactivation from latency. In epithelial cells, lytic replication often directly follows viral entry. For lytic replication to occur, the viral genome must be linear. The latent EBV genome is circular, so it must linearize in the process of lytic reactivation. During lytic replication, viral DNA polymerase is responsible for copying the viral genome. This contrasts with latency, in which host cell DNA polymerase copies the viral genome. Lytic gene products are produced in three consecutive stages: immediate - early, early, and late. Immediate - early lytic gene products act as transactivators, enhancing the expression of later lytic genes. Immediate - early lytic gene products include BZLF1 (also known as Zta, EB1, associated with its product gene ZEBRA) and BRLF1 (associated with its product gene Rta). Early lytic gene products have many more functions, such as replication, metabolism, and blockade of antigen processing. Early lytic gene products include BNLF2. 
Finally, late lytic gene products tend to be proteins with structural roles, like VCA, which forms the viral capsid. Other late lytic gene products, such as BCRF1, help EBV evade the immune system. EGCG, a polyphenol in green tea, has been shown in one study to inhibit EBV spontaneous lytic infection at the DNA, gene transcription and protein levels in a time - and dose - dependent manner, suppressing the expression of the EBV lytic genes Zta, Rta, and the early antigen complex EA - D (induced by Rta); the highly stable EBNA - 1 gene, found across all stages of EBV infection, is unaffected. Specific inhibitors of these pathways suggest that the Ras / MEK / MAPK pathway contributes to EBV lytic infection through BZLF1 and the PI3 - K pathway through BRLF1, the latter completely abrogating the ability of a BRLF1 adenovirus vector to induce the lytic form of EBV infection. Additionally, the activation of some genes but not others is being studied to determine just how to induce immune destruction of latently infected B - cells by use of either TPA or sodium butyrate. Unlike lytic replication, latency does not result in production of virions. Instead, the EBV genome circular DNA resides in the cell nucleus as an episome and is copied by cellular DNA polymerase. In latency, only a portion of EBV 's genes are expressed. Latent EBV expresses its genes in one of three patterns, known as latency programs. EBV can latently persist within B cells and epithelial cells, but different latency programs are possible in the two types of cell. EBV can exhibit one of three latency programs: Latency I, Latency II, or Latency III. Each latency program leads to the production of a limited, distinct set of viral proteins and viral RNAs. It is also postulated that a program exists in which all viral protein expression is shut off (Latency 0). Within B cells, all three latency programs are possible. EBV latency within B cells usually progresses from Latency III to Latency II to Latency I. 
Each stage of latency uniquely influences B cell behavior. Upon infecting a resting naive B cell, EBV enters Latency III. The set of proteins and RNAs produced in Latency III transforms the B cell into a proliferating blast (also known as B cell activation). Later, the virus restricts its gene expression and enters Latency II. The more limited set of proteins and RNAs produced in Latency II induces the B cell to differentiate into a memory B cell. Finally, EBV restricts gene expression even further and enters Latency I. Expression of EBNA - 1 allows the EBV genome to replicate when the memory B cell divides. Within epithelial cells, only Latency II is possible. In primary infection, EBV replicates in oro - pharyngeal epithelial cells and establishes Latency III, II, and I infections in B - lymphocytes. EBV latent infection of B - lymphocytes is necessary for virus persistence, subsequent replication in epithelial cells, and release of infectious virus into saliva. EBV Latency III and II infections of B - lymphocytes, Latency II infection of oral epithelial cells, and Latency II infection of NK - or T - cell can result in malignancies, marked by uniform EBV genome presence and gene expression. Latent EBV in B cells can be reactivated to switch to lytic replication. This is known to happen in vivo, but what triggers it is not known precisely. In vitro, latent EBV in B cells can be reactivated by stimulating the B cell receptor, so reactivation in vivo probably takes place when latently infected B cells respond to unrelated infections. In vitro, latent EBV in B cells can also be reactivated by treating the cells with sodium butyrate or TPA. When EBV infects B cells in vitro, lymphoblastoid cell lines eventually emerge that are capable of indefinite growth. The growth transformation of these cell lines is the consequence of viral protein expression. EBNA - 2, EBNA - 3C and LMP - 1 are essential for transformation, whereas EBNA - LP and the EBERs are not. 
It is postulated that following natural infection with EBV, the virus executes some or all of its repertoire of gene expression programs to establish a persistent infection. Given the initial absence of host immunity, the lytic cycle produces large amounts of virus to infect other (presumably) B - lymphocytes within the host. The latent programs reprogram and subvert infected B - lymphocytes to proliferate and bring infected cells to the sites at which the virus presumably persists. Eventually, when host immunity develops, the virus persists by turning off most (or possibly all) of its genes, only occasionally reactivating to produce fresh virions. A balance is eventually struck between occasional viral reactivation and host immune surveillance removing cells that activate viral gene expression. The site of persistence of EBV may be bone marrow. EBV - positive patients who have had their own bone marrow replaced with bone marrow from an EBV - negative donor are found to be EBV - negative after transplantation. All EBV nuclear proteins are produced by alternative splicing of a transcript starting at either the Cp or Wp promoters at the left end of the genome (in the conventional nomenclature). The genes are ordered EBNA - LP / EBNA - 2 / EBNA - 3A / EBNA - 3B / EBNA - 3C / EBNA - 1 within the genome. The initiation codon of the EBNA - LP coding region is created by an alternate splice of the nuclear protein transcript. In the absence of this initiation codon, EBNA - 2 / EBNA - 3A / EBNA - 3B / EBNA - 3C / EBNA - 1 will be expressed depending on which of these genes is alternatively spliced into the transcript. EBV can be divided into two major types, EBV type 1 and EBV type 2. These two subtypes have different EBNA - 3 genes. As a result, the two subtypes differ in their transforming capabilities and reactivation ability. Type 1 is dominant throughout most of the world, but the two types are equally prevalent in Africa. 
One can distinguish EBV type 1 from EBV type 2 by cutting the viral genome with a restriction enzyme and comparing the resulting digestion patterns by gel electrophoresis. EBV has been implicated in several diseases, including infectious mononucleosis, Burkitt 's lymphoma, Hodgkin 's lymphoma, stomach cancer, nasopharyngeal carcinoma, multiple sclerosis, and lymphomatoid granulomatosis. Additional diseases that have been linked to EBV include Gianotti -- Crosti syndrome, erythema multiforme, acute genital ulcers, and oral hairy leukoplakia. Hypersensitivity to mosquito bites has been associated with EBV infection. The Epstein -- Barr virus has been implicated in disorders related to alpha - synuclein aggregation (e.g. Parkinson 's disease, dementia with Lewy bodies, and multiple system atrophy). The Epstein -- Barr virus was named after Michael Anthony Epstein (born 18 May 1921), now a professor emeritus at the University of Bristol, and Yvonne Barr (1932 -- 2016), a 1966 Ph.D. graduate from the University of London, who together discovered and, in 1964, published on the existence of the virus. In 1961, Epstein, a pathologist and expert electron microscopist, attended a lecture on "The Commonest Children 's Cancer in Tropical Africa -- A Hitherto Unrecognised Syndrome. '' This lecture, by Denis Parsons Burkitt, a surgeon practicing in Uganda, was the description of the "endemic variant '' (pediatric form) of the disease that bears his name. In 1963, a specimen was sent from Uganda to Middlesex Hospital to be cultured. Virus particles were identified in the cultured cells, and the results were published in The Lancet in 1964 by Epstein, Bert Achong, and Barr. Cell lines were sent to Werner and Gertrude Henle at the Children 's Hospital of Philadelphia, who developed serological markers. In 1967, a technician in their laboratory developed mononucleosis and they were able to compare a stored serum sample, showing that antibodies to the virus developed. 
In 1968, they discovered that EBV can directly immortalize B cells after infection, mimicking some forms of EBV - related infections, and confirmed the link between the virus and infectious mononucleosis. As a relatively complex virus, EBV is not yet fully understood. Laboratories around the world continue to study the virus and develop new ways to treat the diseases it causes. One popular way of studying EBV in vitro is to use bacterial artificial chromosomes. Epstein -- Barr virus can be maintained and manipulated in the laboratory in continual latency (a property shared with Kaposi 's sarcoma - associated herpesvirus, another of the eight human herpesviruses). Although many viruses are assumed to have this property during infection of their natural hosts, there is not an easily managed system for studying this part of the viral lifecycle. Genomic studies of EBV have been able to explore lytic reactivation and regulation of the latent viral episome.
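As an illustrative aside, the restriction - digest typing described above (cutting the genome at an enzyme 's recognition sites and comparing the fragment - length patterns) can be sketched in a few lines of Python. The sequences and the EcoRI - style cut logic below are toy placeholders, not real EBV data, and the simplification of cutting at the start of each site is an assumption for clarity, not how a real enzyme cleaves within its site.

```python
def digest(genome: str, site: str) -> list[int]:
    """Return fragment lengths after cutting at each occurrence of a recognition site.

    Simplified model: the cut is placed at the start of each site, so the
    site sequence stays with the following fragment.
    """
    fragments = []
    start = 0
    pos = genome.find(site)
    while pos != -1:
        fragments.append(pos - start)  # fragment up to this cut
        start = pos
        pos = genome.find(site, pos + 1)
    fragments.append(len(genome) - start)  # trailing fragment
    return fragments


# Toy "type 1" and "type 2" genomes differing in one region
# (analogous to the differing EBNA - 3 genes of the two EBV subtypes):
type1 = "AAGAATTCAAAAGAATTCAA"
type2 = "AAGAATTCAAAAAAAAAAAA"
site = "GAATTC"  # EcoRI recognition sequence

print(digest(type1, site))  # [2, 10, 8] -- three bands on a gel
print(digest(type2, site))  # [2, 18]    -- two bands: patterns distinguish the types
```

The distinct fragment-length lists stand in for the distinct banding patterns seen on a gel; in practice a library such as Biopython 's restriction module would be used with real enzymes and genomes.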
which law and what part specifically passed by congress in 1789 was declared unconstitutional
Judiciary Act of 1789 - wikipedia The Judiciary Act of 1789 (ch. 20, 1 Stat. 73) was a United States federal statute adopted on September 24, 1789, in the first session of the First United States Congress. It established the federal judiciary of the United States. Article III, Section 1 of the Constitution prescribed that the "judicial power of the United States, shall be vested in one supreme Court, and such inferior Courts '' as Congress saw fit to establish. It made no provision for the composition or procedures of any of the courts, leaving this to Congress to decide. The existence of a separate federal judiciary had been controversial during the debates over the ratification of the Constitution. Anti-Federalists had denounced the judicial power as a potential instrument of national tyranny. Indeed, of the ten amendments that eventually became the Bill of Rights, five (the fourth through the eighth) dealt primarily with judicial proceedings. Even after ratification, some opponents of a strong judiciary urged that the federal court system be limited to a Supreme Court and perhaps local admiralty judges. The Congress, however, decided to establish a system of federal trial courts with broader jurisdiction, thereby creating an arm for enforcement of national laws within each state. Senator Richard Henry Lee (Anti-Administration, Virginia) reported the judiciary bill out of committee on June 12, 1789; Oliver Ellsworth of Connecticut was its chief author. The bill passed the Senate 14 -- 6 on July 17, 1789, and the House of Representatives then debated the bill in July and August 1789. The House passed an amended bill 37 -- 16 on September 17, 1789. The Senate struck four of the House amendments and approved the remaining provisions on September 19, 1789. The House passed the Senate 's final version of the bill on September 21, 1789. President George Washington signed the Act into law on September 24, 1789. 
The Act set the number of Supreme Court justices at six: one Chief Justice and five Associate Justices. The Supreme Court was given exclusive original jurisdiction over all civil actions between states, or between a state and the United States, as well as over all suits and proceedings brought against ambassadors and other diplomatic personnel; and original, but not exclusive, jurisdiction over all other cases in which a state was a party and any cases brought by an ambassador. The Court was given appellate jurisdiction over decisions of the federal circuit courts as well as decisions by state courts holding invalid any statute or treaty of the United States; or holding valid any state law or practice that was challenged as being inconsistent with the federal constitution, treaties, or laws; or rejecting any claim made by a party under a provision of the federal constitution, treaties, or laws. SECTION 1. Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled, That the supreme court of the United States shall consist of a chief justice and five associate justices, any four of whom shall be a quorum, and shall hold annually at the seat of government two sessions, the one commencing the first Monday of February, and the other the first Monday of August. -- Judiciary Act of 1789 The Act also created 13 judicial districts within the 11 states that had then ratified the Constitution (North Carolina and Rhode Island were added as judicial districts in 1790, and other states as they were admitted to the Union). Each state comprised one district, except for Virginia and Massachusetts, each of which comprised two. Massachusetts was divided into the District of Maine (which was then part of Massachusetts) and the District of Massachusetts (which covered modern - day Massachusetts). 
Virginia was divided into the District of Kentucky (which was then part of Virginia) and the District of Virginia (which covered modern - day West Virginia and Virginia). This Act established a circuit court and district court in each judicial district (except in Maine and Kentucky, where the district courts exercised much of the jurisdiction of the circuit courts). The circuit courts, which comprised a district judge and (initially) two Supreme Court justices "riding circuit, '' had original jurisdiction over serious crimes and civil cases of at least $500 involving diversity jurisdiction or the United States as plaintiff in common law and equity. The circuit courts also had appellate jurisdiction over the district courts. The single - judge district courts had jurisdiction primarily over admiralty cases, petty crimes, and suits by the United States for at least $100. Notably, the federal trial courts had not yet received original federal question jurisdiction. Congress authorized all people to either represent themselves or to be represented by another person. The Act did not prohibit paying a representative to appear in court. Congress authorized persons who were sued by citizens of another state, in the courts of the plaintiff 's home state, to remove the lawsuit to the federal circuit court. The power of removal, and the Supreme Court 's power to review state court decisions where federal law was at issue, established that the federal judicial power would be superior to that of the states. The Act created the Office of Attorney General, whose primary responsibility was to represent the United States before the Supreme Court. The Act also created a United States Attorney and a United States Marshal for each judicial district. The Judiciary Act of 1789 included the Alien Tort Statute, now codified as 28 U.S.C. 
§ 1350, which provides jurisdiction in the district courts over lawsuits by aliens for torts in violation of the law of nations or treaties of the United States. Immediately after signing the Judiciary Act into law, President Washington submitted his nominations to fill the offices created by the Act. Among the nominees were John Jay for Chief Justice of the United States; John Rutledge, William Cushing, Robert H. Harrison, James Wilson, and John Blair Jr. as Associate Justices; Edmund Randolph for Attorney General; and myriad district judges, United States Attorneys, and United States Marshals for Connecticut, Delaware, Georgia, Kentucky, Maryland, Maine, Massachusetts, New Hampshire, New Jersey, New York, Pennsylvania, South Carolina, and Virginia. All six of Washington 's Supreme Court nominees were confirmed by the Senate. Harrison, however, declined to serve. In his place, Washington later nominated James Iredell, who joined the Court in 1790, thereby bringing the Court to its "full strength '' complement of six members. The first six persons to serve on the United States Supreme Court (ordered by seniority) were: John Jay (Chief Justice), and John Rutledge, William Cushing, James Wilson, John Blair, and James Iredell (Associate Justices). A clause granting the Supreme Court the power to issue writs of mandamus under its original jurisdiction was declared unconstitutional by Marbury v. Madison, one of the seminal cases in American law. The Supreme Court held that Section 13 of the Judiciary Act was unconstitutional because it purported to enlarge the original jurisdiction of the Supreme Court beyond that permitted by the Constitution. In Marbury, the Supreme Court ruled that Congress can not pass laws that are contrary to the Constitution, and that it is the role of the judicial system to interpret what the Constitution permits. 
Thus, the Judiciary Act of 1789 was the first act of Congress to be partially invalidated by the Supreme Court.
who bought the ruby slippers from wizard of oz
Ruby slippers - wikipedia The ruby slippers are the magic pair of shoes worn by Dorothy Gale as played by Judy Garland in the classic 1939 MGM musical movie The Wizard of Oz. Because of their iconic stature, the ruby slippers are now considered among the most treasured and valuable items of film memorabilia. As is customary for important props, a number of pairs were made for the film, though the exact number is unknown. Five pairs are known to have survived; one pair was stolen in August 2005 and has never been recovered. In L. Frank Baum 's original novel, The Wonderful Wizard of Oz (1900), on which the film is based, Dorothy wears Silver Shoes. However, the color of the shoes was changed to red in order to take full advantage of the new Technicolor film process being used in big - budget Hollywood films of that era. Film screenwriter Noel Langley is credited with the idea. In the MGM film, an adolescent farm girl named Dorothy (played by Judy Garland), her dog Toto, and their farmhouse are swept away from Kansas by a tornado and taken to the magical Land of Oz. The house falls on and kills the Wicked Witch of the East, freeing the Munchkins from her tyranny. Glinda the Good Witch of the North arrives via magic bubble and shows Dorothy the dead woman 's two feet visibly sticking out from under the house wearing the ruby slippers. When the Wicked Witch of the West comes to claim her dead sister 's shoes, Glinda magically transfers them to Dorothy 's feet. Glinda tells Dorothy to keep tight inside of them and never take them off, as the slippers must be very powerful or the Wicked Witch would not want them so badly. Throughout the rest of the film, the Wicked Witch schemes to obtain the shoes. When she captures Dorothy, she tries to take the slippers, but receives a painful shock. The Wicked Witch then realizes that the slippers will only come off if the wearer is dead, so she decides to kill Dorothy. 
Before she does, however, Dorothy accidentally splashes her with a bucket of water, causing her to melt away. At the end, it is revealed that Dorothy can return home by simply closing her eyes, clicking the heels of the slippers together three times and repeating the phrase, "There 's no place like home. '' The slippers were designed by Gilbert Adrian, MGM 's chief costume designer. Initially, two pairs were made in different styles. The so - called "Arabian test pair '' was "a wildly jeweled, Arabian motif, with curling toes and heels. '' This pair was used in costume tests, but was rejected as unsuitable for Dorothy 's Kansas farmgirl image. The second design was approved, with one modification. The red bugle beads used to simulate rubies proved too heavy, so they were mostly replaced with sequins, about 2,300 for each shoe. It is believed that at least six or seven pairs of the final design were made. According to producer Mervyn LeRoy, "We must have had five or ten pairs of those shoes ''. The wardrobe woman who worked on the film claimed "six identical pairs '' had been made. Four pairs used in the movie have been accounted for. Rhys Thomas speculates that they were likely made by Joe Napoli of the Western Costume Company, and not all at once, but as the need arose. Garland requested one pair a half - size larger, as her feet would become slightly swollen in the afternoon from the rigors of morning rehearsals and filming. According to Rhys Thomas in his Los Angeles Times article, "all the ruby slippers are between Size 5 and 6, varying between B and D widths. '' The four surviving pairs were made from white silk pumps from the Innes Shoe Company in Los Angeles. At the time, many movie studios used plain white silk shoes because they were inexpensive and easy to dye. It is likely that most of the shoes worn by female characters in The Wizard of Oz were plain Innes shoes with varying heel heights, dyed to match each costume. 
There is an embossed gold or silver stamp or an embroidered cloth label bearing the name of the company inside each right shoe. To create the ruby slippers, the shoes were dyed red, then burgundy sequined organza overlays were attached to each shoe 's upper and heel. The film 's early three - strip Technicolor process required the sequins to be darker than most red sequins found today; bright red sequins would have appeared orange on screen. Two weeks before the start of shooting, Adrian added butterfly - shaped red strap leather bows. Each of the Art Deco - inspired bows had three large, rectangular, red - glass jewels with dark red bugle beads, outlined in red glass rhinestones in silver settings. The stones and beads were sewn to the bows, then to the organza - covered shoe. Three pairs of the surviving slippers had orange felt glued to their soles to deaden the sound of Garland dancing on the Yellow Brick Road. It is theorized that Garland wore one primary pair during shooting. This may be the pair known as "the People 's Shoes '', on public display at the Smithsonian Institution. Another pair, the close - up or insert shoes, is in the best shape of all, appears to be better made, has no orange felt on the soles and has "# 7 Judy Garland '' written in the lining. According to the Library of Congress, "it is widely believed that they were used primarily for close - ups and possibly the climactic scene where Dorothy taps her heels together. '' Circular scuff marks on the soles support the theory that they were the ones Garland had on when she clicked her heels together. The lack of felt indicates these were likely also the shoes taken from the feet of the dead Wicked Witch of the East (since the soles are visible in the film), hence their nickname: the "Witch 's Shoes ''. The last known pair was, some believe, made for Bobbie Koshay, Garland 's stunt double. 
This is most likely the size 6B pair (owned first by Roberta Bauman, then Anthony Landini, and currently by David Elkouby) whose lining says "Double '' instead of "Judy Garland ''. However, some believe this pair may have been the second pair created, therefore explaining the "Double '' in the lining, but still worn by Garland and Koshay. Several pairs of Garland 's own shoes are size 6½. Also, Garland can be seen wearing this pair in photos taken after the film 's primary shooting was finished in 1939. In one film sequence, Garland is not wearing the ruby slippers. As trees pelt the Scarecrow with apples, Garland can be briefly glimpsed wearing a black shoe on her right foot. For many years, movie studios were careless with old props, costumes, scripts, and other materials, unaware of their increasing value as memorabilia. Often, workers would just keep props as souvenirs without permission, aware that their employers did not particularly care. One of the more notorious of these was costumer Kent Warner, who amassed a large private collection and supplemented his income with sales. It was he who found the slippers in February or March 1970 while helping to set up a mammoth auction of MGM props and wardrobe. They had been stored and forgotten in the basement of MGM 's wardrobe department. One pair became the centerpiece of the auction. Warner kept the best pair for himself, size 5B, and apparently sold the rest. The slippers in the MGM auction (size 5C) were bought for $15,000 by a lawyer acting for an unidentified client. This is believed to be the pair on permanent exhibition in the Popular Culture wing of the National Museum of American History at the Smithsonian Institution in Washington, D.C., though the donor insisted on anonymity. Dr. 
Brent Glass, the director of the museum, appeared on the January 23, 2008, broadcast of The Oprah Winfrey Show with the slippers and informed Oprah Winfrey that "they were worn by Judy Garland during her dance routines on the Yellow Brick Road because there 's felt on the bottom of these slippers ''. However, according to Rhys Thomas, all but one pair had orange felt on the soles. The Smithsonian pair is rapidly deteriorating with age, and the museum is raising money to fund preservation research. Another pair was originally owned by a Tennessee woman named Roberta Bauman, who got them by placing second in a National Four Star Club "Name the Best Movies of 1939 '' contest. In 1988, auction house Christie 's sold them for $150,000 plus $15,000 buyer 's premium to Anthony Landini. Landini worked with the Disney Company to start showing them at the Disney / MGM Studios ' Florida Theme Park in the queue for the Great Movie Ride, whose facade and queue area are themed after Grauman 's Chinese Theater in Los Angeles. They were visible at the ride 's debut in 1989. Landini auctioned his pair of slippers, again at Christie 's East, on May 24, 2000, for $666,000 (including the buyer 's premium). They were sold to David Elkouby and his partners, who own memorabilia shops in Hollywood. Elkouby and Co. has yet to display the shoes. The pair Warner kept, the "Witch 's Shoes '', was in the best condition. Warner sold the shoes in 1981 to an unknown buyer through Christie 's East for $12,000. Two weeks after Landini bought his slippers, this pair resurfaced and was offered privately through Christie 's to the under - bidder of the Bauman shoes, Philip Samuels of St. Louis, Missouri. Samuels bought them for the same price that Landini had paid, $165,000. He has used his shoes for fundraising for children 's charities, as well as lending them to the Smithsonian when their slippers are cleaned, repaired or (previously) on tour. 
Auction house Profiles in History announced that this pair would be the highlight of its December 15 -- 17, 2011 Icons of Hollywood auction. In an interview, Joe Maddalena, head of Profiles in History, estimated that they would go for two to three million dollars. They were offered with a starting reserve price of two million dollars on December 16, 2011, but did not sell. Actor Leonardo DiCaprio and other benefactors, including director Steven Spielberg, made it possible for the Academy of Motion Picture Arts and Sciences to acquire the pair for an undisclosed price in February 2012 for their forthcoming museum. Kent Warner sold one pair, size 5 1⁄2 B, to Michael Shaw in 1970. These were stolen from an exhibit at the Judy Garland Museum in Grand Rapids, Minnesota, on the night of August 27 -- 28, 2005. In 2015, the Associated Press reported that an anonymous donor had offered a $1 million reward for information about the stolen slippers. The very elaborate curled - toe "Arabian '' pair was owned by actress and memorabilia preservationist Debbie Reynolds. Reynolds acknowledged she got them from Kent Warner. Reynolds ' slippers were sold for $510,000 (not including the buyer 's premium) as part of the June 2011 auction of part of the actress 's collection. The ruby slippers play an integral role in the 1985 Walt Disney Pictures film Return to Oz, for which Disney had to obtain rights from MGM to use reproductions in the film. Unlike the originals, the hand - made British French - heeled shoes for Return to Oz were covered in hundreds of dark red crystals. The stones were soaked in sulfuric acid to remove the silver backing, and two types of glue were used to affix them to the shoes (a spray glue and an optical glue). No matter what was done, the stones kept falling off during filming. Stagehands were specifically hired to sweep up loose "rubies '' that would fall off the slippers after a scene was shot. 
Being little girls, actresses Fairuza Balk, who played Dorothy, and Emma Ridley, who played Princess Ozma, simply could not keep from playing, skipping and tapping their heels, so eventually they were required to take off the slippers between takes. Effects were later added in post - production to give the slippers their magical glow. Simple red grosgrain ribbon with additional stones was used for the bows. Seven pairs were made for the filming: two pairs in size three for Ridley, three pairs (size unknown) for Balk, and two in men 's size 11 for the Nome King, played by actor Nicol Williamson. In 1985, Walt Disney Productions gave away a pair of slippers to promote the film. They were won by a British family, who sold them to prominent Oz collector Willard Carroll in a 2001 eBay auction. The Western Costume Company in Hollywood claims to have made Garland 's original slippers. While it is likely that Western would have been contracted to make some of The Wizard of Oz 's many costumes, no records of the original slippers exist to either validate or disprove their claim. In 1989, to commemorate the movie 's 50th anniversary, Western produced the only authorized reproductions. Hand - lasted on Judy Garland 's original foot mold and completely sequined and jeweled, the reproduction slippers were nearly identical to the originals. Western planned a limited edition of 500 pairs at $5000 each, but halted the project after selling only 16 pairs. One of these pairs fetched $35,000 (including buyer 's premium) at a November 25, 2013, auction. An imitation pair of ruby slippers appeared in the 2002 movie The Master of Disguise. Another pair appeared in an Oz sequence in the cult comedy Kentucky Fried Movie. Reproductions were also featured in Night at the Museum: Battle of the Smithsonian. In this film, Kahmunrah tosses them away after discovering that the rubies are fake. 
In honor of the fiftieth anniversary of The Wizard of Oz, the Harry Winston jewellery company created a size - four pair of slippers using "about 25 carats of diamonds and 1,500 carats of rubies ''. Valued at $3 million, they are reportedly the most expensive pair of shoes in the world. During the fall 2008 Fashion Week in New York City, the Swarovski company held a charity contest to commemorate the seventieth anniversary of the film, with nineteen designers redesigning the ruby slippers, including Gwen Stefani, Diane von Fürstenberg, and Moschino. The "Arabian '' design was displayed with the designer entries. In the 1990 -- 1991 animated TV series The Wizard of Oz (produced by DiC Animation City), the ruby slippers ' powers are significantly enhanced. Not only do they retain their movie - inspired ability to repel the Wicked Witch of the West 's touch, as well as the capability to teleport their user (and an unspecified number of companions) to any location desired, but they also demonstrate numerous other attributes and capabilities. In this series, Dorothy remains inexperienced and unfamiliar with the shoes ' magic, and as such calls upon their power only as a last resort, often resulting in a deus ex machina scenario. The Cowardly Lion and Truckle, the Wicked Witch of the West 's chief Flying Monkey, also get to wear them briefly. In the Charmed season 5 episode "Happily Ever After '', Piper, after going to the Fairytale Castle to vanquish the Wicked Witch, returns home using the ruby slippers. The slippers briefly appear in the season 4 episode "Fractured '' of Warehouse 13 in the Dark Vault, seemingly having a life of their own, accompanied by a witch 's cackle and a few notes of "Over the Rainbow ''. They are supposedly an "Artifact '' -- a potentially dangerous and malicious object that grants the wearer special powers -- since many of the show 's artifacts are based on works of fact and fiction. 
The season 9 episode "Slumber Party '' of the series Supernatural features Dorothy and the Wicked Witch. Dorothy, here portrayed as a hard - as - nails fighter, realizes the shoes are the only thing that can kill the seemingly invincible witch. At one point, she admits she never really wore the iconic shoes, having considered it "tacky '' to wear the shoes of a dead witch. Near the end of the episode, Charlie Bradbury uses the shoes to kill the Wicked Witch and foil her plot to bring her armies to Earth and take over the world. According to the revisionist version of the Oz history chronicled in Gregory Maguire 's novel Wicked: The Life and Times of the Wicked Witch of the West, the slippers were given to Nessarose, the future Wicked Witch of the East, by her father. They were constructed with handmade glass beads and reflected many different colors in the lighting, giving them an almost chameleon effect. After being enchanted by Elphaba 's old best friend and roommate Glinda (the Good Witch of the North), they become items of power that allow the armless and handicapped Nessarose to magically stand and walk independently and without any additional support. In the musical adaptation, Wicked, it is Elphaba, the Wicked Witch of the West, who enchants the shoes, giving crippled Nessarose the ability to walk without a wheelchair. The Ruby Slippers of Oz (Tale Weaver Publishing, 1989) by Rhys Thomas is a history of the famous shoes and Kent Warner 's part in it. The progressive band Electric Light Orchestra used a frame from the 1939 film on the cover of their fourth studio album Eldorado, released in 1974. The cover was laid out by Sharon Osbourne (then known as Sharon Arden) and the picture was printed in reverse: the shoes point left in the film. In World of Warcraft, the Ruby Slippers are a pair of level 70 epic cloth shoes dropped by the Wizard of Oz - themed "opera event '' in the Karazhan raid instance. 
The shoes function similarly to the hearthstone that all characters start out with, allowing them to teleport from their current location to the inn where the hearthstone is set. The caption under the statistic lines, much like in the movie, is "There 's no place like home. '' The slippers are part of the twelve "Foundation Elements '' in the toys - to - life video game Lego Dimensions.
who wins season 4 of rupaul's drag race
RuPaul 's Drag Race (season 4) - wikipedia The fourth season of RuPaul 's Drag Race began airing on January 30, 2012, with cast members announced November 13, 2011. The winner of season four headlined Logo 's Drag Race Tour featuring Absolut Vodka, won a one - of - a-kind trip, a lifetime supply of NYX Cosmetics, and a cash prize of $100,000. As in the previous season, Santino Rice and Billy B (Billy Brasfield), a celebrity makeup artist and star of the HGTV mini-series Hometown Renovation, alternated in the same seat at the judges ' table, with Brasfield filling in for Rice when needed. Both judges appeared side - by - side in the audience during the "Reunited '' episode. The theme song playing during the runway every episode was "Glamazon '' and the song played during the credits was "The Beginning '', both from RuPaul 's album Glamazon. The winner of the fourth season of RuPaul 's Drag Race was Sharon Needles, with Chad Michaels and Phi Phi O'Hara being the runners - up, making it the first time in the show 's history that there were two runners - up. Chad Michaels and Latrice Royale competed on the first season of All Stars. Latrice placed 7th / 8th overall with season 3 contestant Manila Luzon. Chad won the competition. Phi Phi O'Hara competed on the second season of All Stars. She placed 7th overall. The first episode of season 4 was released prior to its January 30 air date to fans of the RuPaul 's Drag Race Facebook page. The sneak peek video was released on January 27, but its content was cut short before the runway walk and judging. Thirteen new queens begin their quests for the title of "America 's Next Drag Superstar, '' but first must survive drag zombies and the end of the world. 
In this women 's wrestling challenge, an homage to G.L.O.W., the queens are separated into three teams headed by the mini-challenge winners, and are asked to create a wrestling storyline and to choreograph a match, to be performed in front of the judges and a live audience. In addition, each team has to divide its members into two roles: the nice girls, or Faces, and the bad girls, or Heels. Following the wrestling matches, the girls are asked to walk the runway in their best "girly - girl '' couture. The queens make commercials for RuPaul 's albums Glamazon and Champion. The queens are separated into two teams, led by Phi Phi O'Hara (Team Champion) and Kenya Michaels (Team Glamazon). The queens design ship - shaped floats for a Pride parade runway extravaganza. One color from the 8 - stripe rainbow flag was randomly assigned to each contestant by Willam, who won the wet T - shirt contest mini-challenge. The contenders ' challenge was to produce a magazine cover. The magazines were shared out among the queens by Latrice Royale, who won the mini-challenge. The top five girls return to the work room, and RuPaul brings them their next mini challenge: create fashionable footwear out of clear platform heels using an Absolut cocktail as inspiration. Phi Phi wins the mini challenge, and Ru then presents them with their main challenge. The girls will be campaigning for the Drag Queen Presidency, and must put together a presentation for a round table political debate. For their runway, the queens needed to dress their best for the president 's inaugural ball. It is revealed that Kenya has returned to the competition, and she wins the mini-challenge, which was to drag out a stuffed teddy bear. For their main challenge, the queens are paired up with dads for a maternity runway / makeover and striptease. For the mini challenge, the queens each made a puppet of one of their peers and staged a "bitch fest ''. 
Chad won the challenge, and as a reward she was allowed to pair each queen with a dog to use as inspiration for the main challenge. The queens compete in their final challenge: starring in RuPaul 's music video for "Glamazon ''. In a twist, all three finalists had to perform the lip - sync song. Afterwards, RuPaul announced that the winner of the race would be revealed on "RuPaul 's Drag Race: Reunited '' the next week. For the first time, a live audience of fans, mainly from the Los Angeles area, sits in as RuPaul and the contestants return for the annual reunion, where RuPaul reveals the winner of the season. The finale episode was taped at the El Portal Theatre in North Hollywood on Wednesday, April 25, 2012, where three different outcomes were filmed in an attempt to keep the winner from being revealed before the episode aired, as had happened in previous seasons. Latrice Royale is crowned "Miss Congeniality ''. It is also revealed that the reason Willam was disqualified was that he was caught having conjugal visits with his husband, who had tracked Willam to the hotel where the contestants were sequestered during production. This is in breach of the clause barring contestants from engaging in unauthorized outside contact. In January 2012, Logo released the second running of Fantasy Drag Race, an online fan contest inspired by fantasy football where viewers assemble a team of three season four Drag Race contestants. Players receive and lose points based on their team 's performance on the show, and can earn additional points by redeeming codes and performing tasks given out when episodes of the show first air. The highest scoring players receive Drag Race and NYX Cosmetics products, and one player wins a trip for two to the first stop on Logo 's Drag Race Tour. Already having a generous social media presence, Logo expanded its efforts across Facebook, Twitter, Tumblr, GetGlue, and Foursquare in preparation for the premiere of season four. 
Both RuPaul and contestants tweet live while the show airs, and LogoTalk! chat parties (featuring judges, contestants from previous seasons, and contestants from season four) occur on the official Logo website while participants watch new episodes. Season four specifically marks an increased interest from Logo in Tumblr, where the network publishes animated GIFs, contestant trading cards, and images that incorporate internet memes. Dan Sacher, VP of digital for VH1 and Logo, has stated that their online marketing efforts are part of helping the small network expand their fan base across as many outlets as possible. The premiere episode of season four averaged a 0.6 rating in the 18 - 49 demographic, totaling 481,000 viewers, and ranked as the highest - rated premiere in Logo 's network history. Untucked totalled 254,000 viewers, marking the companion show 's most watched debut. During the evening of the premiere, the show registered eight US trending topics on Twitter (including Jiggly Caliente, Sharon Needles, Phi Phi O'Hara, and Latrice Royale) and reached a 7th place ranking on Trendrr. Leading up to the first episode, the show 's Facebook page saw an 89 % increase (earning over half a million fans). The season finale scored a 0.7 rating in the 18 - 49 demographic and drew 601,000 viewers total, while the reunion episode became the highest rated episode of the season. Season four 's "RuPaul 's Drag Race: Reunited '' was also the highest - rated reunion in the franchise 's history, seeing a 33 % increase in the 18 - 49 demographic compared to season three. The reunion registered five trending topics on Twitter (including Sharon Needles, Phi Phi, Willam, and a new portmanteau Willam introduced to the show: "RuPaulogize ''), and ranked 4th among non-sports cable programs for the night on Trendrr. During season 4, the show 's Twitter following increased by 77 %, and the Facebook page accrued a 36 % increase in likes. 
TV.com also declared it was the best reality show on television.
how do you say bless you in hindi
Responses to sneezing - wikipedia In English - speaking countries, the common verbal response to another person 's sneeze is "bless you '', or, less commonly in the United States and Canada, "Gesundheit '', the German word for health (and the response to sneezing in German - speaking countries). There are several proposed bless - you origins for use in the context of sneezing. In non-English - speaking cultures, words referencing good health or a long life are often used instead of "bless you, '' though some also use references to God. In some Asian cultures such as Korean and Japanese cultures, the practice of responding to another person 's sneeze does not exist. Examples from other languages include:
Assyrian: "Shemad Alaha '' -- "in God 's name ''
Arabic: فرج (Faraj), صحة (Sahha) -- "Relief! '', "Health! ''
In India: Jibah Jibah (জীবঃ জীবঃ) -- "May you live long ''
Chinese: more rarely, 多 保重 (duōbǎozhòng) or 多 喝 点 水 (duō he dian shui) -- "Take care '', "Drink more water ''
French: old - fashioned responses are à tes / vos amours after the second sneeze and qu'elles durent toujours after the third; more archaically, one can say Que Dieu te / vous bénisse.
German: besides "Gesundheit '' ("Health! ''), responses include Helf Gott! / Helfgott! / Helf dir Gott! ("May God help you! ''; Southern Germany / Austria / Transylvanian - Saxon; archaic, mostly used by more or less religious elderly); Großwachsen! ("You shall grow tall! ''; Transylvanian - Saxon, from Romanian "Să creşti mare! '', used solely for children, usually after the usual "Gesundheit '' for the first and / or second sneeze); and Zum Wohl! ("To your well - being! ''; Southern Germany / Austria).
Navajo: Háíshį́į́ naa ntsékees or Háíshį́į́ naa yáłti '
Romanian: usually "Noroc '' comes first, then "Sănătate '', and, as a third option for children, Să crești mare! ("May you grow up! '')
Russian: someone might say правду говорю (pravdu govor'u), "I 'm telling the truth '', if they sneeze while talking.
Another response, used mostly with children, is Pis maco ("go away, kitten ''), as the sound of sneezing often resembles a cat 's cough.
who plays dwayne's mom on a different world
List of a Different World characters - wikipedia A Different World is a spin - off from the American television sitcom The Cosby Show. It aired on NBC for six seasons, from 1987 to 1993. A native of Brooklyn, Denise Huxtable is the daughter of Hillman alumni Cliff and Clair. She enrolled in Hillman and was roommates with Maggie Lauten and Jaleesa Vinson during her sophomore year. Denise was a poor student who often procrastinated and struggled to manage her time and money. She disliked Whitley Gilbert and endured Dwayne Wayne 's crush on her. She left Hillman at the end of her sophomore year to travel to Africa. There, she met and married Lt. Martin Kendall of the U.S. Navy and became the stepmother of Olivia Kendall. She reappears only once, in season 3, to give Dwayne closure on his crush. Note: Denise originated as a regular character on The Cosby Show, on which she was featured on a regular basis during seasons 1 -- 3 and 6 -- 7, and on a recurring basis during seasons 4 and 5. She left "A Different World '' after season 1. Dwayne Cleofis Wayne is a mathematics major at Hillman. A native of Brooklyn, he achieved a perfect score on the math portion of the SAT. He is best known for his flip - up eyeglasses / shades and making unsuccessful advances on numerous women throughout his freshman year. He had a crush on Denise and unsuccessfully ran for the title of "Miss Hillman '' at her urging to highlight the sexism of the pageant. His best friend / roommate is Ronald Johnson. He dated Suzanne Taylor, although the two broke up because she was not ready to get serious. Although he dated several women, he was most involved in an on - again - off - again relationship with Whitley Gilbert across the series. After working at a summer internship in Japan, he began a relationship with Kinu Owens. However, he broke up with her once he realized he was still in love with Whitley. He graduated from Hillman as valedictorian of the Class of 1991, engaged to Whitley. 
He became a mathematics professor at Hillman, but his engagement to Whitley was broken off when he almost cheated on her. However, during her relationship with Byron Douglas III, the two slept together and he interrupted Whitley and Byron 's wedding ceremony to declare his love for her. Whitley left Byron at the altar for Dwayne and the two quickly married. They honeymooned in Los Angeles, which coincided with the 1992 riots following the verdict in the Rodney King trial. At the end of the series, Whitley became pregnant with their first child and he designed a new video game for Kinishewa with Ron. The couple decided to move to Japan for his work. In a reunion special, it was revealed that while at Kinishewa, Dwayne invented the flip cell phone, inspired by his flip glasses. Dwayne would have been in an interracial relationship with Maggie Lauten had actress Marisa Tomei not left the show after season one. Whitley Gilbert - Wayne is an Art History and French major at Hillman. A native of Richmond (Virginia), she is the daughter of Hillman alumni Mercer and Marion Gilbert. She began the series with a snobbish, prissy attitude and was disliked by many of the characters, although following Denise 's departure after season 1 the show was retooled to feature her in the lead. As a consequence, Whitley 's character was mellowed out and she was paired romantically with Dwayne, with whom she had an on - again - off - again relationship. Whitley originally went to college to land a husband, although she quickly realized she enjoyed art history and stayed at Hillman for a fifth year to take business courses. After graduation, she was engaged to Dwayne and had a part - time job at E.H. Wright Industries as an assistant art buyer and as a dorm director at Hillman. After learning that Dwayne went on a date with another woman, Whitley broke off their engagement, although the two reunited when Whitley cheated on her fiance Byron Douglas III with Dwayne. 
Still, she was ready to marry Byron until Dwayne interrupted her wedding and declared his love for her. The two eloped and spent their honeymoon in Los Angeles, which coincided with the 1992 riots following the verdict in the Rodney King trial. After being laid off at E.H. Wright Industries, Whitley was employed in a series of odd jobs. At the end of the series, she moves with Dwayne to Japan and is pregnant with the couple 's first child. Maggie Lauten (Marisa Tomei): A "military brat '' and journalism major, she is one of the few white students at predominantly African American Hillman. She transferred to Hillman at the start of the sophomore year and was roommates with Denise Huxtable and Jaleesa Vinson. Maggie was a sweet - natured, although occasionally ditzy, girl. Had Maggie stayed on the show after season one, Debbie Allen (the show 's producer from seasons 2 - 6) would have given her a black boyfriend, and there would have been an episode in which Dwayne brings her home for Thanksgiving dinner and his parents disapprove of the interracial relationship. Because Marisa Tomei left the show after the first season, the story never happened. 
Jaleesa Vinson - Taylor (Dawnn Lewis): native of Camden (New Jersey), sister of Danielle and Yvonne Vinson, ex-wife of Lamar Collins, enrolled at Hillman at age 25, a business management major, roommate of Denise and Maggie during sophomore year, worked part - time at the Hillman library, assistant dorm director of Gilbert Hall, vacationed in Greece with Maggie during the summer of 1988, roommate of Freddie during junior and senior years, worked a summer installing cable television, entered into serious relationship with Walter, co-dorm director of Gilbert Hall, engaged to Walter, couple halts wedding at the altar and mutually separates, graduated (Class of 1990), accepted an entry - level corporate position, off - campus roommate of Whitley, married Colonel Bradford Taylor (in a surprise elopement), stepmother of Suzanne and Terrence, started a temporary employment agency, gave birth to daughter Imani, disappeared after season five. Stevie Rallen (Loretta Devine): mother of J.T., graduate student dorm director of Gilbert Hall, replaced by Lettie Bostic. Ronald Marlon Johnson, Jr. (Darryl M. Bell): native of Detroit (Michigan), son of Ron Johnson Sr. and brother of Rachel Johnson, roommate and best friend of Dwayne, involved in serious relationship with Millie, member of the ROTC, pledged Kappa Lambda Nu Fraternity and successfully "crossed over '', dated numerous women during most of his college career, worked summers as a salesman for his father 's automobile dealership, falsely implied that Dwayne 's campaign for student body president was endorsed by former presidential candidate Jesse Jackson, member of the Hillman ROTC, managed / performed in the band X-Pression, graduated after nine semesters (January 1992), victim of a bias incident at Virginia A&M University (on the weekend of Martin Luther King, Jr. 
Day 1992), band breaks up, employed as spokesman for a phone sex hotline, employed as a car salesman (independent of his father), unemployed, criticized Kimberly 's interracial relationship with Matthew (but later admitted he was jealous), entered into serious relationship with Kimberly (after pursuing her for months), antagonist of Shazza, cheated on Kimberly with Freddie (while Freddie was involved with Shazza), broke up with Kimberly, entered into serious relationship with Freddie, manager and co-owner of "The Place Where The Blues Will Be Played '' with Mr. Gaines, provided the concept which inspired Dwayne 's new video game for Kinishewa, chosen to be godfather of Dwayne and Whitley 's unborn child. Walter Oakes (Sinbad): graduate student, football / baseball / basketball / track coach, dorm director of all - male residence hall, involved in serious relationship with Jaleesa, co-dorm director of Gilbert Hall, engaged to Jaleesa, couple halts wedding at the altar and mutually separates, moves to Philadelphia to manage community center. Leticia "Lettie '' Bostic (Mary Alice): enrolled in Hillman, dropped out of Hillman a few credits short of her degree, moved to Paris, met Pablo Picasso, spied for the Allies during World War II, rejected marriage proposal from South African freedom fighter Marcus Mpepo, replaced Stevie as dorm director of Gilbert Hall decades later in 1988, chastises Kimberly for engaging in unprotected sex, disappeared after season two. 
Winifred "Freddie '' Brooks (Cree Summer): native of New Mexico, daughter of Joni Brooks, cousin of Matthew, roommate of Jaleesa during freshman and sophomore years, has unrequited feelings for Dwayne, was a student activist throughout her undergraduate career, dated Garth Parks (who almost rapes her), dated Ernest Bennett, roommate of Kimberly during junior and senior years, lost her virginity in a one - time sexual encounter with Ron, entered into serious relationship with Shazza Zulu, graduated (Class of 1992), enrolled in Hillman Law School, co-dorm director of Height Hall, cheated on Shazza with Ron (while Ron was involved with Kimberly), broke up with Shazza on Thanksgiving Day 1992, entered into serious relationship with Ron, earned law review membership and completed first year of law school. Kimberly Reese (Charnele Brown): native of Columbus (Ohio), daughter of Clinton Reese, roommate (and best friend) of Whitley during freshman and sophomore years, a biology major, employed by Mr. Gaines at The Pit part - time throughout her undergraduate career, involved in serious relationship with Robert (had a pregnancy "false alarm ''), rejected a much - needed scholarship because of the sponsoring corporation 's investments in apartheid - controlled South Africa, employed at funeral home part - time, roommate of Freddie during junior and senior years, involved in serious relationship with Matthew, pledged Alpha Delta Rho Sorority and successfully "crossed over '', performed in the band X-Pression, graduated (Class of 1992), entered into serious relationship with Ron (after being pursued for months), enrolled in Hillman Medical School, co-dorm director of Height Hall, broke up with Ron, entered into serious relationship with fellow medical student Spencer Boyer, completed first year of medical school, engaged to Spencer (after turning down numerous proposals from him). Colonel Bradford Taylor (Glynn Turman): served in the U.S. 
Army during the Vietnam War, retired with the rank of colonel, ex-husband of Johanna, father of Suzanne and Terrence, professor of mathematics at Hillman (nicknamed "Dr. War '' because of his reputation as a demanding professor), commander of the Hillman ROTC, became Dwayne 's primary mentor, was distraught when his former student Zelmer Collier was deployed to the Persian Gulf just before the start of Operation Desert Storm, rejected membership in an all - white country club after being criticized by Terrence, married Jaleesa Vinson (in a surprise elopement), brother - in - law of Danielle Vinson, became a father for the third time when Jaleesa gave birth to daughter Imani. Vernon Gaines (Lou Myers): met Lena Horne during World War II; husband of Velma and father of Darnell; uncle of Faith, Hope, Charity and Henrietta; owner and manager of The Pit; employer of Byron Douglas III; employer and father figure of Kimberly; co-owner of apartment building with Velma; frequent critic of his ne'er - do - well son; temporary employer of Whitley; employer of both Lena and Charmaine; co-owner of "The Place Where The Blues Will Be Played '' with Ron; reunited with Lena Horne in 1993. Gina Deveaux (Ajai Sanders): family emigrated to the U.S. from Martinique, pledged Alpha Delta Rho Sorority and successfully "crossed over '', dated Dion (who physically abused her), pressed charges against Dion and broke up with him, roommate of Lena and Charmaine at the start of junior year, involved in incident that led Charmaine to falsely accuse Terrell of sexual harassment, placed on academic probation, moved off - campus into house (with Lena, Charmaine, Terrell and Dorian), rejected Dion again after he violated probation and contacted her, completed junior year and still enrolled at Hillman. 
Lena James (Jada Pinkett): native of Baltimore (Maryland), daughter of Grover James, originally an engineering major but later journalism major, named after singer Lena Horne, ended high school relationship with Piccolo, employed by Mr. Gaines at The Pit part - time, developed a brief crush on Dwayne, roommate of Gina and Charmaine at the start of sophomore year, entered into serious (yet celibate) relationship with Dorian, rejected Piccolo 's attempt to reconcile, moved off - campus into house (with Gina, Charmaine, Terrell and Dorian), completed sophomore year and still enrolled at Hillman. Charmaine Brown (Karen Malina White): native of Brooklyn (New York), best friend of Claire Huxtable 's distant cousin Pam Tucker, began dating Lance Rodman in high school, visited Hillman with Lance during her senior year of high school, roommate of Gina and Lena at the start of her freshman year at Hillman, amazed and annoyed others with her rapid pattern of speech, employed by Mr. Gaines at The Pit part - time, mistakenly accused Terrell of sexual harassment, relationship with Lance ended when he broke up with her by telephone, failed French midterm after she and Terrell were caught cheating, harassed by local residents (along with Terrell), moved off - campus into house (with Gina, Lena, Terrell and Dorian), completed freshman year and still enrolled at Hillman. (Note: Charmaine originated as a recurring character on The Cosby Show. She was featured during seasons 7 and 8 of that series.)
when's the last time the patriots played on thanksgiving
NFL on Thanksgiving Day - wikipedia The National Football League (NFL) on Thanksgiving Day is a traditional series of games played during the Thanksgiving holiday in the United States. It has been a regular occurrence since the league 's inception in 1920. Currently, three NFL games are played every Thanksgiving. The first two are hosted by the Detroit Lions and the Dallas Cowboys; a third game, with no fixed opponents, has been played annually since 2006. The concept of American football games being played on Thanksgiving Day dates back to 1876, shortly after the game had been invented, as it was a day that most people had off from work. In that year, the college football teams at Yale and Princeton began an annual tradition of playing each other on Thanksgiving Day. The University of Michigan also made it a tradition to play annual Thanksgiving games, holding 19 such games from 1885 to 1905. The Thanksgiving Day games between Michigan and the Chicago Maroons in the 1890s have been cited as "The Beginning of Thanksgiving Day Football. '' In some areas, high - school teams play on Thanksgiving, usually to wrap up the regular season. By the time football had become a professional event, playing on Thanksgiving had already become an institution. Records of pro football being played on Thanksgiving date back to as early as the 1890s, with the first pro -- am team, the Allegheny Athletic Association of Pittsburgh, Pennsylvania. In 1902, the "National '' Football League, a Major League Baseball - backed organization based entirely in Pennsylvania and unrelated to the current NFL, attempted to settle its championship over Thanksgiving weekend; after the game ended in a tie, eventually all three teams in the league claimed to have won the title. Members of the Ohio League, during its early years, usually placed their marquee matchups on Thanksgiving Day.
For instance, in 1905 and 1906 the Latrobe Athletic Association and Canton Bulldogs, considered at the time to be two of the best teams in professional football (along with the Massillon Tigers), played on Thanksgiving. A rigging scandal with the Tigers leading up to the 1906 game led to severe drops in attendance for the Bulldogs and ultimately led to their suspension of operations. During the 1910s, the Ohio League stopped holding Thanksgiving games because many of its players coached high school teams and were unavailable. This was not the case in other regional circuits: in 1919, the New York Pro Football League featured a Thanksgiving matchup between the Buffalo Prospects and the Rochester Jeffersons. The game ended in a scoreless tie, leading to a rematch the next Sunday for the league championship. Several other NFL teams played regularly on Thanksgiving in the first eighteen years of the league, including the Chicago Bears and Chicago Cardinals (1922 -- 33; the Bears played the Lions from 1934 to 1938 while the Cardinals switched to the Green Bay Packers for 1934 and 1935), Frankford Yellow Jackets, Pottsville Maroons, Buffalo All - Americans, Canton Bulldogs (even after the team moved to Cleveland they played the 1924 Thanksgiving game in Canton), and the New York Giants (1929 -- 38, who always played a crosstown rival). The first owner of the Lions, George A. Richards, started the tradition of the Thanksgiving Day game as a gimmick to get people to go to Lions football games, and to continue a tradition begun by the city 's previous NFL teams. What differentiated the Lions ' efforts from other teams that played on the holiday was that Richards owned radio station WJR, a major affiliate of the NBC Blue Network; he was able to negotiate an agreement with NBC to carry his Thanksgiving games live across the network. 
During the Franksgiving controversy in 1939 and 1940, the only two teams to play the game were the Pittsburgh Steelers and Philadelphia Eagles, as both teams were in the same state (Pennsylvania). (At the time, then - president Franklin Roosevelt wanted to move the holiday for economic reasons and many states were resistant to the move; half the states recognized the move and the other half did not. This complicated scheduling for Thanksgiving games. Incidentally, the two teams were also exploring the possibility of a merger at the time.) Because of the looming World War II and the resulting shorter seasons, the NFL did not schedule any Thanksgiving games in 1941, nor did it schedule any in the subsequent years until the war ended in 1945. When the Thanksgiving games resumed in 1945, only the Lions ' annual home game would remain on the Thanksgiving holiday. In 1951, the Packers began a thirteen - season run as the perpetual opponent to the Lions each year through 1963. The All - America Football Conference and American Football League, both of which would later be absorbed into the NFL, also held Thanksgiving contests, although neither of those leagues had permanent hosts. Likewise, the AFL of 1926 also played two Thanksgiving games in its lone season of existence, while the AFL of 1936 hosted one in its first season, which featured the Cleveland Rams, a future NFL team, and the 1940 -- 41 incarnation of the American Football League played two games in 1940 on the earlier "Franksgiving '' date. In 1966, the Dallas Cowboys, who had been founded six years earlier, adopted the practice of hosting Thanksgiving games. It is widely rumored that the Cowboys sought a guarantee that they would regularly host Thanksgiving games as a condition of their very first one (since games on days other than Sunday were uncommon at the time and thus high attendance was not a certainty). 
This is only partly true; Dallas had in fact decided on its own to host games on Thanksgiving because there was nothing else to do or watch on that day. In 1975 and 1977, at the behest of then - Commissioner Pete Rozelle, the St. Louis Cardinals replaced Dallas as a host team (Dallas then hosted St. Louis in 1976). Although the Cardinals, at the time known as the "Cardiac Cards '' due to their propensity for winning very close games, were a modest success at the time, they were nowhere near as popular nationwide as the Cowboys, who were regular Super Bowl contenders during this era. This, combined with St. Louis 's consistently weak attendance, a series of ugly Cardinals losses in the three - game stretch, and opposition from the Kirkwood -- Webster Groves Turkey Day Game (a local high school football contest) led to Dallas resuming regular hosting duties in 1978; it was then, after Rozelle asked Dallas to resume hosting Thanksgiving games, that the Cowboys requested (and received) an agreement guaranteeing the Cowboys a spot on Thanksgiving Day forever. Notwithstanding the aforementioned St. Louis - hosted games in 1975 and 1977, the two "traditional '' Thanksgiving Day pro football games since the 1970 AFL -- NFL merger have been in Detroit and Dallas. Because of TV network commitments in place through the 2013 season, to make sure that both the AFC - carrying network (NBC from the 1970 merger to 1997, and CBS since 1998) and the NFC - carrying network (CBS from the 1970 merger to 1993, and Fox since 1994) got at least one game each, one of these games was between NFC opponents, and one featured AFC - NFC opponents. Thus, the AFC could showcase only one team on Thanksgiving, and the AFC team was always the visiting team. Since 2006, a third NFL game on Thanksgiving has been played at night. It originally aired on the NFL Network as part of its Thursday Night Football package until 2011; NBC began carrying the night game in 2012.
The Thanksgiving night game has no fixed opponents or conferences, enabling the league to freely choose whatever marquee match - up to feature on that night. The 2012 changes also allowed both Dallas and Detroit to offer NFC games in the future (one would be played at night), and allowed CBS to offer a game with two AFC teams. In 2014, the NFL added the cross-flex rule, allowing CBS to televise NFC away games, and Fox to broadcast AFC away games, under select circumstances on Sunday afternoons; however, this did not cover the Thanksgiving contests. CBS also signed a separate contract to carry Thursday Night Football from the 2014 season onward, which allowed that network to carry games from either conference on Thursdays; from that year through 2016, CBS carried all - NFC contests every year on Thanksgiving, and in 2014 and 2015, no AFC teams played in any of the Thanksgiving games. The NFL 's flexible scheduling rule currently does not apply for Thanksgiving games; however, the NFL in theory could in the future apply the rule to change start times and networks for the three games. Since 2001, teams playing on Thanksgiving have worn throwback uniforms on numerous occasions. In some years (namely 2002), it extended to nearly all games of the weekend, and in some cases also involved classic field logos at the respective stadiums. From 2001 to 2004, and again in 2008, 2010, and 2017, the Detroit Lions wore throwback uniforms based on their very early years. From 2001 to 2003, Dallas chose to represent the 1990s Cowboys dynasty by wearing the navy "Double - Star '' jersey not seen since 1995. In 2004, the team wore uniforms not seen since 1963. In 2009, to celebrate the 50th anniversary of the AFL, both Dallas and Oakland played in an "AFL Legacy Game. 
'' In 2013, the Cowboys intended to wear their 1960s throwbacks, but chose not to do so after the NFL adopted a new policy requiring players and teams to utilize only one helmet a season to address the league 's new concussion protocol; rather than sport an incomplete throwback look, the Cowboys instead wore their standard blue jerseys at home for the first time since 1963. In 2015, the Cowboys resurrected their 1994 white "Double - Star '' jerseys only this time wore them with white pants as part of the league 's "Color Rush '', a trial run of specially - designed, monochromatic jerseys to be worn during Thursday games. It has remained a tradition for Dallas and Detroit to host the afternoon games dating back several decades. However, in recent years, other teams have expressed interest in hosting Thanksgiving games. Lamar Hunt, the former owner of the Chiefs (who had hosted Thanksgiving games from 1967 -- 69 as an AFL team prior to the merger), lobbied heavily in favor of his team hosting a game on the holiday. When the NFL adopted a third, prime time game, the Chiefs were selected as the first team to host such a contest, but the team was not made a permanent host, and Hunt 's death shortly after the 2006 contest ended the lobbying on behalf of that team. The host issue came to a head in 2008, focusing particularly on the winless Lions. Going into the game, Detroit had lost their last four Thanksgiving games, and opinions amongst the media had suggested removing Detroit and replacing them with a more attractive matchup. The team also required an extension to prevent a local television blackout. The Lions were routed by Tennessee 47 -- 10, en route to the team 's 0 -- 16 season. NFL commissioner Roger Goodell confirmed that the Lions would stay on Thanksgiving for the 2009 season, but kept the issue open to revisit in the future. 
Conversely, the Dallas Cowboys, who typically represent a larger television draw, have had far fewer public calls to be replaced on Thanksgiving. One issue that has been debated is a perceived unfair advantage of playing at home on Thanksgiving. The advantage is given in the form of an extra day of practice for the home team while the road team has to travel to the game site. This is true for most Thursday games, but with the night games, the visitor can travel to the game site after practice and hold the final walk - through the following morning. With the introduction of the prime time game, which effectively allows all teams in the league an opportunity to play on Thanksgiving, along with the introduction of year - long Thursday Night Football ensuring all teams have one Thursday game during the regular season (thus negating any on - field advantages or disadvantages to being selected for Thanksgiving), the calls for Detroit and Dallas to be removed have subsided. (Winning teams are denoted by boldface type; tie games are italicized.) Of current NFL franchises. This includes American Football League (AFL) games; however, it does not include All - America Football Conference (AAFC) games. The last currently active franchise to have never played on Thanksgiving through 2017 is the Jacksonville Jaguars, who joined the league in 1995. An idiosyncrasy in the NFL 's current scheduling formula, which has been in effect since 2002 and revised in 2010, effectively prevents teams from the AFC North from playing the Lions or Cowboys on Thanksgiving, as the formula has the AFC North playing in Dallas or Detroit in years when the other team is slated to play the AFC game on Thanksgiving. These teams can, under the 2014 television contracts, play only in the third (night) game; should cross-flex be expanded to Thursdays or Fox win the Thursday Night contract in 2018 (either scenario would allow both Fox and CBS to carry AFC games), this idiosyncrasy would be eliminated.
The Los Angeles Rams have the longest active appearance drought of any team, with their last appearance coming in 1975. Among current NFL markets, Cleveland has had the longest wait to have a team from its city play on Thanksgiving; the Browns last appeared in 1989, several years before being suspended in the Cleveland Browns relocation controversy, and have not appeared in the game since rejoining the league as an expansion team. Since 2010, the league has made efforts to end the longest droughts. New Orleans, Cincinnati, Baltimore, Houston, and Carolina all played their first Thanksgiving games during this time frame. San Francisco likewise played their first Thanksgiving game since 1972 in 2011, and the Los Angeles Chargers, who last played on the holiday in 1969 (while the team was still an AFL franchise in San Diego) before actually joining the league, appeared for the first time as an NFL member in 2017. * All - America Football Conference team. Since 1989, informal and sometimes lighthearted Man of the Match awards have been issued by the networks broadcasting the respective games. Running back Emmitt Smith holds the record for most Thanksgiving MVPs with five (1990, 1992, 1994, 1996 and 2002). Voting on the respective awards is typically done informally by the announcing crew themselves, and criteria are loose. Noteworthy statistical accomplishments weigh heavily, and "group '' awards are common. The announcement of the winner (s), and the presentation of the award is normally done immediately following the game, during post-game network coverage. In 1989, John Madden of CBS awarded the first "Turkey Leg Award '', for the game 's most valuable player. Pursuant to its name, it was an actual cooked turkey leg, and players typically took a celebratory bite out of the leg for the cameras during post-game interviews. Reggie White of the Eagles was the first recipient. 
The gesture was seen mostly as a humorous gimmick relating to Madden 's famous multi-legged turkey, cooked and delivered by local restaurant owner Joe Pat Fieseler of Harvey 's Barbecue (located less than a mile from Texas Stadium). Since then, however, the award has gained subtle notoriety. Madden brought the award to FOX in 1994, and it continued through 2001. Because of the loose and informal nature of the award, at times it has been awarded to multiple players. On one occasion in 1994, it was given to players of both teams. When John Madden left FOX after 2001, the network introduced a new award starting in 2002, named the "Galloping Gobbler. '' It was represented by a small figurine of a silver turkey wearing a football helmet striking a Heisman - like pose. Much like Cleatus and Digger, the original Galloping Gobbler trophy reflected Fox 's irreverent mascots, and went through several iterations. Unimpressed by its tackiness, 2002 winner Emmitt Smith (who holds the record for most Thanksgiving MVP awards and had won the Turkey Leg Award four previous times) famously threw his in a trash can. In 2007, the kitschy statuette was replaced with a bronze - colored statue of a nondescript turkey holding a football. In 2011, the trophies were discarded altogether and replaced by an attractive plaque. Unlike the aforementioned "Turkey Leg Award '', the "Galloping Gobbler '' is normally awarded to only one player annually; however, in 2016, co-winners were honored. For 2017, the Galloping Gobbler was permanently retired, and replaced with the "Game Ball, '' a stylish, ornate football - shaped trophy, reminiscent of the tradition where game - used balls are typically awarded to players of the game. No one at Fox seemed to notice that the first ball awarded had the stripe markings of a college ball.
When the NFL returned to CBS in 1998, the network introduced its own award, the "All - Iron Award '', which is, suitably enough, a small silver iron, a reference to Phil Simms ' All - Iron team for toughness. The All - Iron winner also receives a skillet of blackberry cobbler made by Simms ' mother. Through 2006, the trophy was only awarded to one player annually. Occasionally, it has been issued as a "group award '' in addition to a single player award. In 2008, Simms stated it was "too close to call '' and named four players to the trophy; he then gave the award to several people every year until 2013, after which he reverted to a single MVP in 2014. Simms was removed from the broadcast booth for the 2017 season and no player of the game was chosen that year. During the time when NFL Network held the broadcast rights to the prime time game, from 2007 to 2011, it gave out the "Pudding Pie Award '' for MVPs. The award was an actual pie. In 2009, NFL Network gave Brandon Marshall a pumpkin pie rather than the chocolate pudding pie of the previous two years. NBC, which carried Thanksgiving afternoon games through 1997, did not issue an MVP award during that time. NBC began broadcasting the Thanksgiving prime time game in 2012, at which point the MVP award was added. The award is currently called the Sunday Night Football on Thanksgiving Night Player of the Game, and is typically awarded to multiple players on the winning team. From 2012 to 2015, the NBC award was referred to as the "Madden Thanksgiving Player - of - the - Game '', honoring John Madden (who announced NBC games from 2006 to 2008). In the first few years, the award specifically went to players on both offense and defense, but in recent years, defensive players have not necessarily been recognized. The winning players are presented with ceremonial game balls and, as a gesture to Madden, a cooked turkey leg.
DuMont was the first network to televise Thanksgiving games in 1953; CBS took over in 1956, and in 1965, the first ever color television broadcast of an NFL game was the Thanksgiving match between the Lions and the Baltimore Colts. Since 2012, all three broadcast networks with NFL rights have carried one game apiece. The first two games are split between CBS and Fox. These games are rotated annually, with CBS getting the 12: 30 p.m. (EST) "early '' game, and Fox getting the 4: 25 p.m. "late '' game in even - numbered years, while Fox likewise gets the "early '' game and CBS the "late '' game in odd - numbered years. The third game, with a prime time 8: 30 p.m. start, is carried by NBC. The NFL may invoke the flexible scheduling rule in the future to reassign games if the night game has less importance than the Dallas or Detroit game. Westwood One holds national radio broadcast rights to all three games, with Compass Media Networks sharing rights to the Cowboys contest. (Under league rules, only radio stations that carry at least 12 Cowboys games in a season are allowed to carry the Compass broadcast.) The participating teams also air the games on their local flagship stations and regional radio networks. The Cowboys Thanksgiving game has regularly been the most watched NFL regular season telecast each year, with the Lions Thanksgiving game usually in the top five.
actor who plays finn in the force awakens
John Boyega - wikipedia John Adedayo B. Adegboyega (born 17 March 1992), known professionally as John Boyega, is an English actor best known for playing Finn in the 2015 film Star Wars: The Force Awakens, the seventh film of the Star Wars series, and its 2017 sequel Star Wars: The Last Jedi. Boyega rose to prominence in his native United Kingdom for his role as Moses in the 2011 sci - fi comedy film Attack the Block. Boyega 's other credits include historical drama film Detroit (2017), four episodes of the television series 24: Live Another Day and the drama Imperial Dreams (2014). Boyega received the BAFTA Rising Star Award in 2016. Boyega was born in London, England on 17 March 1992 to British Nigerian parents, Abigail (née Aboderin), and Samson Adegboyega, a Pentecostal minister. His first role was a leopard in a play at his primary school. Boyega was a pupil at Oliver Goldsmith Primary School. While acting in a play there at the age of nine, he was noticed by Teresa Early, the artistic director of Theatre Peckham, a learning theatre for young people who live in south London. After obtaining financial assistance from a hardship fund, he joined the theatre, spending his time there outside school hours between the ages of nine and 14. Boyega 's father, a preacher, had wanted Boyega to become a preacher too, but was supportive of his son 's theatrical interests. In 2003, Boyega started his secondary education at Westminster City School, where he took part in various school productions. Between 2008 and 2010, he attended South Thames College at the college 's Wandsworth campus to study for a National Diploma in Performing Arts. His activities at the college included playing the title role in the college 's production of Othello. He enrolled at the University of Greenwich to study BA Film Studies & Media Writing, but dropped out to focus on acting. 
Boyega trained at the Identity School of Acting in Hackney, and appeared in Six Parties at the National Theatre and Category B at the Tricycle Theatre prior to being offered a role in the 2011 film Attack the Block. In September 2011, HBO announced that Boyega had been cast in the boxing drama pilot Da Brick, loosely based on Mike Tyson 's life. Boyega was expected to play Donnie, who is released from a juvenile detention centre on his 18th birthday and begins to examine what it means to be a man. The pilot was written by John Ridley, but was not picked up by HBO. Also in 2011, he acted in the film Junkhearts in which he portrayed Jamal, a drug dealer who finds some guns and tries to sell them. Boyega was chosen by Fionnuala Halligan of Screen International as one of the "UK Stars of Tomorrow 2011 '' and appeared alongside two other actors on the front cover of that magazine in its July 2011 edition. In March 2012, Boyega was cast in the film adaptation of Chimamanda Ngozi Adichie 's book Half of a Yellow Sun. On 29 April 2014, it was confirmed that Boyega had been cast as a lead character in Star Wars: The Force Awakens. It was later revealed Boyega would play Finn, a stormtrooper for the First Order, who leaves the military power after witnessing their cruelty in his first combat mission before joining the fight against them. The film was released on 18 December 2015. Both the film and Boyega 's performance received acclaim from both audiences and critics. In January 2016, Boyega formed his own production company, Upperroom Entertainment Limited. In June 2016, Boyega announced that his company would be co-producing the sequel to the 2013 movie Pacific Rim titled Pacific Rim: Uprising alongside Legendary Entertainment. Boyega will also play the lead role in the project. In 2017, Boyega starred in Detroit, Kathryn Bigelow 's film about the 1967 Detroit riots. The same year, he reprised his role as Finn in Star Wars: The Last Jedi.
who is singing national anthem at super bowl 2017
List of national anthem performers at the Super Bowl - wikipedia This article is a list of national anthem performers at the Super Bowl. The U.S. national anthem ("The Star - Spangled Banner '') has been performed at all but one Super Bowl since its first year in 1967; Vikki Carr sang "America the Beautiful '' in place of the anthem at Super Bowl XI in 1977. Since Super Bowl XVI in 1982, famous singers or music groups have performed the anthem at the vast majority of Super Bowl games. Beginning with Super Bowl XLIII in 2009, "America the Beautiful '' is sung before the national anthem every year. Some early Super Bowls featured marching bands performing the anthem, and the recitation of the Pledge of Allegiance. Acts that have performed three times: Acts that have performed two times: Singers that performed in or near their hometown metropolitan area: The performance by Whitney Houston at Super Bowl XXV in 1991, during the Gulf War, had been for many years regarded as one of the best renditions ever. It was released as a single a few weeks later, appeared on the album Whitney: The Greatest Hits, and was re-released as a single in 2001 shortly after the September 11 attacks. The 1992 performance marked the first time American Sign Language was used alongside the lead singer. Faith Hill performed the anthem at Super Bowl XXXIV in 2000. It became popular on country radio. Following the September 11 attacks, her version entered the country singles chart at number 35, despite not being released as an official single, and reentered the same chart at number 49 in July 2002. At Super Bowl XLVIII in 2014, in an emotional and groundbreaking performance, soprano Renée Fleming became the first opera singer to perform the national anthem, scoring the highest ratings for a Fox Network program in its history, and the second - highest ratings for any television program in history. Just days after Super Bowl XXV, a report surfaced that Whitney Houston lip synced her performance.
It was confirmed that she was actually singing into a dead microphone, but the performance heard in the stadium and on television was prerecorded. Since 1993, the NFL has required performers to supply a backup track. This came after Garth Brooks walked out of the stadium prior to his XXVII performance. Only 45 minutes before kickoff, he refused to take the stage, due to a dispute with NBC. Brooks requested that the network premiere his new music video "We Shall Be Free '' during the pregame. The network chose not to air the video, due to content some felt was disturbing imagery. Brooks had also refused to pre-record the anthem, which meant the league had nothing to play if he left. Television producers spotted Jon Bon Jovi in the grandstands, and were prepared to use him as a replacement. After last - minute negotiations, NBC agreed to air a clip of the video during the broadcast of the game, and Brooks was coaxed back into the stadium and sang. Following the "wardrobe malfunction '' controversy during Super Bowl XXXVIII in 2004, all scheduled performers for Super Bowl XXXIX were chosen under heavy scrutiny. Game organizers decided not to use a popular music vocalist. The combined choirs of the U.S. Military Academy, the Naval Academy, Air Force Academy, Coast Guard Academy, and the U.S. Army Herald Trumpets were invited to perform. This was the first time since the second inauguration of President Richard Nixon in 1973 that all four service academies sang together. Two days after Super Bowl XLIII, it was revealed that Jennifer Hudson also had lip synced. At the beginning of Super Bowl XLV, Christina Aguilera sang the lyrics incorrectly. Instead of singing "O'er the ramparts we watched, were so gallantly streaming '', the pop star sang "What so proudly we watched at the twilight 's last gleaming ''. According to the New York Times, she also changed "gleaming '' to "reaming ''. 
The following Super Bowls featured other patriotic performances besides the national anthem. Since 2009, "America the Beautiful '' is sung before the national anthem.
paddy field is an example of terrestrial ecosystem
Paddy field - wikipedia A paddy field is a flooded parcel of arable land used for growing semiaquatic rice. Paddy cultivation should not be confused with cultivation of deepwater rice, which is grown in flooded conditions with water more than 50 cm (20 in) deep for at least a month. Genetic evidence shows that all forms of paddy rice, both indica and japonica, spring from a domestication of the wild rice Oryza rufipogon that first occurred 8,200 -- 13,500 years ago south of the Yangtze River in present - day China. However, the domesticated indica subspecies currently appears to be a product of the introgression of favorable alleles from japonica at a later date, so that there are possibly several events of cultivation and domestication. Paddy fields are the typical feature of rice farming in east, south and southeast Asia. Fields can be built into steep hillsides as terraces and adjacent to depressed or steeply sloped features such as rivers or marshes. They can require a great deal of labor and materials to create, and need large quantities of water for irrigation. Oxen and water buffalo, adapted for life in wetlands, are important working animals used extensively in paddy field farming. During the 20th century, paddy - field farming became the dominant form of growing rice. Hill tribes of Thailand still cultivate dry - soil varieties called upland rice. Paddy field farming is practiced in Asia, namely in Cambodia, Bangladesh, China, Taiwan, India, Indonesia, Iran, Japan, North Korea, South Korea, Malaysia, Myanmar, Nepal, Pakistan, Philippines, Sri Lanka, Thailand, Vietnam, and Laos; in Europe, in Northern Italy, the Camargue in France, and in Spain, particularly in the Albufera de València wetlands in the Valencian Community, the Ebro Delta in Catalonia and the Guadalquivir wetlands in Andalusia; as well as along the eastern coast of Brazil, the Artibonite Valley in Haiti, and Sacramento Valley in California, among other places.
Paddy fields are a major source of atmospheric methane and have been estimated to contribute in the range of 50 to 100 million tonnes of the gas per annum. Studies have shown that this can be significantly reduced, while also boosting crop yield, by draining the paddies to allow the soil to aerate, interrupting methane production. Studies have also shown variability in assessments of methane emissions using local, regional and global factors, and have called for better inventories based on micro-level data. The word "paddy '' is derived from the Malay word padi, meaning rice plant. Archaeologists generally accept that wet - field cultivation originated in China. The earliest paddy field found dates to 4330 BC, based on carbon dating of grains of rice and soil organic matter found at the Chaodun site in Kunshan County. At Caoxieshan, a site of the Neolithic Majiabang culture, archaeologists excavated paddy fields. Some archaeologists claim that Caoxieshan may date to 4000 -- 3000 BC. There is archaeological evidence that unhusked rice was stored for the military and for burial with the deceased from the Neolithic period to the Han Dynasty in China. There are ten archaeologically excavated rice paddy fields in Korea. The two oldest are the Okhyun and Yaumdong sites, found in Ulsan, dating to the early Mumun pottery period. Paddy field farming goes back thousands of years in Korea. A pit - house at the Daecheon - ni site yielded carbonized rice grains and radiocarbon dates, indicating that rice cultivation in dry - fields may have begun as early as the Middle Jeulmun pottery period (c. 3500 -- 2000 BC) in the Korean Peninsula. Ancient paddy fields have been carefully unearthed in Korea by institutes such as Kyungnam University Museum (KUM) of Masan. They excavated paddy field features at the Geumcheon - ni Site near Miryang, South Gyeongsang Province.
The paddy field feature was found next to a pit - house that is dated to the latter part of the Early Mumun pottery period (c. 1100 -- 850 BC). KUM has conducted excavations that have revealed similarly dated paddy field features at Yaeum - dong and Okhyeon, in modern - day Ulsan. The earliest Mumun features were usually located in low - lying narrow gullies that were naturally swampy and fed by the local stream system. Some Mumun paddy fields in flat areas were made of a series of squares and rectangles, separated by bunds approximately 10 cm in height, while terraced paddy fields consisted of long irregular shapes that followed natural contours of the land at various levels. Mumun Period rice farmers used all of the elements that are present in today 's paddy fields, such as terracing, bunds, canals, and small reservoirs. We can grasp some paddy - field farming techniques of the Middle Mumun (c. 850 -- 550 BC) from the well - preserved wooden tools excavated from archaeological rice fields at the Majeon - ni Site. However, iron tools for paddy - field farming were not introduced until sometime after 200 BC. The spatial scale of paddy fields increased, with the regular use of iron tools, in the Three Kingdoms of Korea Period (c. AD 300 / 400 - 668). The first paddy fields in Japan date to the Early Yayoi period (300 BC -- 250 AD). The Early Yayoi has been re-dated, and it appears that wet - field agriculture developed at about the same time as in the Korean peninsula. Evidence of wild rice on the island of Sulawesi dates from 3000 BCE. Historic evidence for the earliest cultivation, however, comes from eighth century stone inscriptions from the central island of Java, which show that kings levied taxes in rice. In ancient Java, during the Medang Mataram period, many inscriptions are related to the establishment of sima lands. This signifies the formation and expansion of Javanese agricultural villages in the region during this period. 
Such villages were established either by opening forest or by converting a ladang (dry rice cultivation) to sawah (wet rice cultivation). A sima was arable wet - rice agricultural land with rice surpluses available for taxation, officially recognised through royal edict. Most of these sima lands were ruled by regional rakai or samget (landed gentry). The rakai who ruled the land were granted royal permission to collect taxes, part of which had to be paid regularly to the king 's court. In some instances, sima inscriptions state that the land had become tax - free, on the condition that the rice surpluses collected from it be used to construct or maintain a religious building. Images of rice cultivation, a rice barn, and mice infesting a rice field are evident in the Karmawibhangga bas - reliefs of Borobudur. Divisions of labour between men, women, and animals that are still in place in Indonesian rice cultivation were carved into relief friezes on the ninth century Prambanan temples in Central Java: a water buffalo attached to a plough; women planting seedlings and pounding grain; and a man carrying sheaves of rice on each end of a pole across his shoulders (pikulan). In the sixteenth century, Europeans visiting the Indonesian islands saw rice as a new prestige food served to the aristocracy during ceremonies and feasts. Rice production in Indonesian history is linked to the development of iron tools and the domestication of the wild Asian water buffalo for cultivation of fields and for manure as fertiliser. Rice production requires exposure to the sun. Once covered in dense forest, much of the Indonesian landscape has been gradually cleared for permanent fields and settlements as rice cultivation developed over the last fifteen hundred years. 
In the Philippines, the use of rice paddies can be traced to prehistoric times, as evidenced in the names of towns such as Pila, Laguna, whose name derives from the straight mounds of dirt that form the boundaries of the rice paddy, or "Pilapil ''. Wet rice cultivation in Vietnam dates back to the Neolithic Hoa Binh culture and Bac Son culture. Although China 's agricultural output is the largest in the world, only about 15 % of its total land area can be cultivated. About 75 % of the cultivated area is used for food crops. Rice is China 's most important crop, raised on about 25 % of the cultivated area. Most rice is grown south of the Huai River, in the Yangtze valley, the Zhu Jiang delta, and in Yunnan, Guizhou, and Sichuan provinces. Rice appears to have been used by the Early Neolithic populations of Lijiacun and Yunchanyan in China. Evidence of possible rice cultivation from ca. 11,500 BP has been found; however, it is still questioned whether the rice was indeed being cultivated, or instead being gathered as wild rice. Bruce Smith, an archaeologist at the Smithsonian Institution in Washington, D.C., who has written on the origins of agriculture, says that evidence has been mounting that the Yangtze was probably the site of the earliest rice cultivation. In 1998, Crawford & Shen reported that the earliest of 14 AMS or radiocarbon dates on rice from at least nine Early to Middle Neolithic sites is no older than 7000 BC, that rice from the Hemudu and Luojiajiao sites indicates that rice domestication likely began before 5000 BC, but that most sites in China from which rice remains have been recovered are younger than 5000 BC. During the Spring and Autumn period (722 -- 481 BC), two revolutionary improvements in farming technology took place. One was the use of cast iron tools and beasts of burden to pull plows, and the other was the large - scale harnessing of rivers and development of water conservation projects. 
Sunshu Ao of the 6th century BC and Ximen Bao of the 5th century BC are two of the earliest hydraulic engineers from China, and their works were focused upon improving irrigation systems. These developments were widely spread during the ensuing Warring States period (403 -- 221 BC), culminating in the enormous Du Jiang Yan Irrigation System engineered by Li Bing by 256 BC for the State of Qin in ancient Sichuan. During the Eastern Jin (317 -- 420) and the Northern and Southern Dynasties (420 -- 589), land - use became more intensive and efficient, rice was grown twice a year, and cattle began to be used for plowing and fertilization. Around 750, 75 % of China 's population lived north of the Yangtze River, but by 1250, 75 % lived south of it. Such large - scale internal migration was possible due to the introduction of quick - ripening strains of rice from Vietnam suitable for multi-cropping. Localities in China famous for their spectacular rice paddies are Yuanyang County, Yunnan, and Longsheng County, Guangxi. India has the largest paddy output in the world and is also the fourth largest exporter of rice in the world. In India, West Bengal is the largest rice producing state. Paddy fields are a common sight throughout India, both in the northern Gangetic plains and the southern peninsular plateaus. Paddy is cultivated at least twice a year in most parts of India, the two seasons being known as Rabi and Kharif respectively. The former cultivation is dependent on irrigation, while the latter depends on the monsoon. Paddy cultivation plays a major role in the socio - cultural life of rural India. Many festivals, such as Onam in Kerala, Bihu in Assam, Makara Sankranthi in Andhra Pradesh, Thai Pongal in Tamil Nadu, Makar Sankranti in Karnataka, and Nabanna in West Bengal, celebrate the harvest of paddy. The Kaveri delta region of Thanjavur is historically known as the rice bowl of Tamil Nadu, and Kuttanadu is called the rice bowl of Kerala. 
Prime Javanese paddy yields roughly 6 metric tons of unmilled rice (2.5 metric tons of milled rice) per hectare. When irrigation is available, rice farmers typically plant Green Revolution rice varieties allowing three growing seasons per year. Since fertilizer and pesticide are relatively expensive inputs, farmers typically plant seeds in a very small plot. Three weeks following germination, the 15 - 20 centimetre (6 -- 8 in) stalks are picked and replanted at greater separation, in a backbreaking manual procedure. Rice harvesting in Central Java is often performed not by owners or sharecroppers of paddy, but rather by itinerant middlemen, whose small firms specialize in harvesting, transport, milling, and distribution to markets. The fertile volcanic soil of much of the Indonesian archipelago -- and particularly the islands of Java and Bali -- has made rice a central dietary staple. Steep terrain on Bali resulted in an intricate cooperative system, locally called subak, to manage water storage and drainage for rice terraces. Rice is grown in northern Italy, especially in the valley of the river Po. The paddy fields are irrigated by fast - flowing streams descending from the Alps. The acidic soil conditions common in Japan due to volcanic eruptions have made the paddy field the most productive farming method. Paddy fields are represented by the kanji 田 (commonly read as ta), which has had a strong influence on Japanese culture. In fact, the character 田, which originally meant ' field ' in general, is used in Japan exclusively to convey the meaning ' rice paddy field '. One of the oldest samples of writing in Japan, widely credited to be the kanji 田, was found on pottery at the archaeological site of Matsutaka in Mie Prefecture and dates to the late 2nd century. Ta (田) is used as a part of many place names as well as in many family names. Most of these places are somehow related to the paddy field, and in many cases are based on the history of a particular location. 
For example, where a river runs through a village, the place east of the river may be called Higashida (東田), literally "east paddy field ''. A place with a newly irrigated paddy field, especially one created during or after the Edo period, may be called Nitta or Shinden (both 新 田), "new paddy field ''. In some places, lakes and marshes were likened to a paddy field and were named with ta, like Hakkōda (八甲田). Today, many family names have ta as a component, a practice which can be largely attributed to a government edict in the early Meiji Period that required all citizens to have a family name. Many chose a name based on some geographical feature associated with their residence or occupation, and as nearly three fourths of the population were farmers, many made family names using ta. Some common examples are Tanaka (田中), literally meaning "in the paddy field ''; Nakata (中田), "middle paddy field ''; Kawada (川田), "river paddy field ''; and Furuta (古田), "old paddy field ''. In recent years rice consumption in Japan has fallen, and many rice farmers are increasingly elderly. The government has subsidized rice production since the 1970s and favors protectionist policies regarding cheaper imported rice. Arable land in the small alluvial flats of most rural river valleys in South Korea is dedicated to paddy - field farming. Farmers assess paddy fields for any necessary repairs in February. Fields may be rebuilt, and bund breaches are repaired. This work is carried out until mid-March, when warmer spring weather allows the farmer to buy or grow rice seedlings. The seedlings are transplanted (usually by rice transplanter) from indoors into freshly flooded paddy fields in May. Farmers tend and weed their paddy fields through the summer until around the time of Chuseok, a traditional holiday held on 15 August of the Lunar Calendar (circa mid-September by the Solar Calendar). The harvest begins in October. 
Coordinating the harvest can be challenging because many Korean farmers have small paddy fields in a number of locations around their villages, and modern harvesting machines are sometimes shared between extended family members. Farmers usually dry the harvested grains in the sun before bringing them to market. The Chinese (or Sino - Korean) character for ' field ', jeon (Hangul: 전; Hanja: 田), is found in some place names, especially small farming townships and villages. However, the specific Korean term for ' paddy ' is a purely Korean word, "non '' (Hangul: 논). In Madagascar, the average annual consumption of rice is 130 kg per person, one of the highest in the world. According to a 1999 UPDRS / FAO study, the majority of rice is grown under irrigation (1,054,381 ha), with performance determined by the variety grown and by control of water quality... "Tavy '' is the traditional cultivation of upland rice on burned clearings of natural rain forest (135,966 ha). Criticized as a cause of deforestation, "Tavy '' is still widely practiced by farmers in Madagascar, who find in it a good compromise between climate risks, availability of labour and food security. "Tanety '' means hill; by extension, "tanety '' also refers to upland rice cultivation, carried out on grassy slopes that have been deforested for charcoal production (139,337 ha). Madagascar 's many rice varieties include: "Vary lava '', a translucent long and large grain rice, considered a luxury rice; "Vary Makalioka '', a translucent long and thin grain rice; "Vary Rojofotsy '', a half - long grain rice; and "Vary mena '', or red rice, which is exclusive to Madagascar. Paddy fields are found in most regions of Peninsular Malaysia. The most scenic paddy fields are located in northern Malaysia, in Kedah, Perlis and Penang, covering much of these states. 
Paddy fields also can be found on Malaysia 's eastern coast region, mainly in Kelantan and Terengganu, and also in Selangor, especially in the districts of Kuala Selangor and Sabak Bernam. Before Malaysia became heavily reliant on its industrial output, people were mainly involved in agriculture, especially in the production of rice. For that reason, people usually built their houses next to paddy fields. The very spicy chili pepper that is often eaten in Malaysia, the bird 's eye chili, is locally called cili padi, literally "paddy chili ''. Rice is grown primarily in three areas -- the Irrawaddy Delta, the area along the Kaladan River and its delta, and the central plains around Mandalay -- though there has been an increase in rice farming in Shan State and Kachin State in recent years. Until the late 1960s, Myanmar was the main exporter of rice. Termed the rice basket of South East Asia, much of the rice grown in Myanmar does not rely on fertilizers and pesticides; thus, although "organic '' in a sense, it has been unable to keep pace with population growth and with other rice economies that utilize fertilizers. Rice is now grown in all three seasons of Myanmar, though primarily in the monsoon season -- from June to October. Rice grown in the delta areas relies heavily on river water and sedimented minerals from the northern mountains, whilst rice grown in the central regions requires irrigation from the Irrawaddy River. The fields are tilled when the first rains arrive -- traditionally measured at 40 days after Thingyan, the Burmese New Year -- around the beginning of June. In modern times, tractors are used, but traditionally, buffaloes were employed. The rice plants are planted in nurseries and then transplanted by hand into the prepared fields. The rice is then harvested in late November -- "when the rice bends with age ''. Most of the rice planting and harvesting is done by hand. The rice is then threshed and stored, ready for the mills. 
In Nepal, rice (Nepali: धान, Dhaan) is grown in the Terai and hilly regions. It is mainly grown during the summer monsoon in Nepal. Paddy fields are a common sight in the Philippines. Several vast paddy fields exist in the provinces of Ifugao, Nueva Ecija, Isabela, Cagayan, Bulacan, Quezon, and other provinces. Nueva Ecija is considered the main rice - growing province of the Philippines, and its municipality of Bongabon is the leading producer of onions in Southeast Asia. It is currently the 9th richest province in the country. The Banaue Rice Terraces, located in northern Luzon and built by the Ifugaos 2,000 years ago, are a prominent example of paddy fields in the country. Streams and springs found in the mountains were tapped and channeled into irrigation canals that run downhill through the rice terraces. Other notable Philippine paddy fields are the Batad Rice Terraces, the Bangaan Rice Terraces, the Mayoyao Rice Terraces and the Hapao Rice Terraces. Located at Barangay Batad in Banaue, the Batad Rice Terraces are shaped like an amphitheatre, and can be reached by a 12 - kilometer ride from Banaue Hotel and a 2 - hour hike uphill through mountain trails. The Bangaan Rice Terraces portray the typical Ifugao community, where the livelihood activities are within the village and its surroundings. The Bangaan Rice Terraces are accessible by a one - hour ride from Poblacion, Banaue, then a 20 - minute trek down to the village. They can be viewed best from the road to Mayoyao. The Mayoyao Rice Terraces are located at Mayoyao, 44 kilometers from Poblacion, Banaue. The town of Mayoyao lies in the midst of these rice terraces. All dikes are tiered with flat stones. The Hapao Rice Terraces can be reached within 55 kilometers from the capital town of Lagawe. Other Ifugao stone - walled rice terraces are located in the municipality of Hungduan. Agriculture in Sri Lanka mainly depends on rice production. 
Sri Lanka sometimes exports rice to its neighboring countries. Around 1.5 million hectares of land was cultivated for paddy in Sri Lanka in the 2008 / 2009 maha: 64 % during the dry season and 35 % during the wet season. Around 879,000 farming families are engaged in paddy cultivation in Sri Lanka; they make up 20 % of the country 's population and 32 % of its employment. Rice production in Thailand represents a significant portion of the Thai economy. It uses over half of the farmable land area and labor force in Thailand. Thailand has a strong tradition of rice production. It has the fifth - largest amount of land under rice cultivation in the world and is the world 's largest exporter of rice. Thailand has plans to further increase its land available for rice production, with a goal of adding 500,000 hectares to its already 9.2 million hectares of rice - growing areas. The Thai Ministry of Agriculture expected rice production to yield around 30 million tons of rice for 2008. The most produced strain of rice in Thailand is jasmine rice, which has a significantly lower yield rate than other types of rice, but normally fetches more than double the price of other strains on the global market. Rice fields in Vietnam (ruộng or cánh đồng in Vietnamese) are the predominant land use in the valley of the Red River and the Mekong Delta. In the Red River Delta of northern Vietnam, control of seasonal riverine flooding is achieved by an extensive network of dykes which over the centuries has grown to total some 3,000 km. In the Mekong Delta of southern Vietnam, there is an interlacing drainage and irrigation canal system that has become the symbol of this area; it also serves as a network of transportation routes, allowing farmers to bring their produce to market. In Northwestern Vietnam, Thai people built their "valley culture '' based on the cultivation of glutinous rice planted in upland fields, requiring terracing of the slopes. 
The primary festival related to the agrarian cycle is "lễ hạ điền '' (literally "descent into the fields ''), held at the start of the planting season in hopes of a bountiful harvest. Traditionally, the event was officiated with much pomp. The monarch carried out the ritual plowing of the first furrow, while local dignitaries and farmers followed suit. Thổ địa (deities of the earth), thành hoàng làng (the village patron spirit), Thần Nông (god of agriculture), and thần lúa (god of rice plants) were all venerated with prayers and offerings. In colloquial Vietnamese, wealth is frequently associated with the vastness of an individual 's land holdings. Paddy fields so large that "storks fly with their wings out - stretched '' ("đồng lúa thẳng cánh cò bay '') are a common metaphor. Wind - blown undulating rice plants across a paddy field in literary Vietnamese are termed figuratively "waves of rice plants '' ("sóng lúa '').
who was the first elected woman president in the world
List of elected and appointed female heads of state and government - wikipedia This is a list of women who have been elected or appointed head of state or government of their respective countries since the mid-20th century. The list includes female presidents who are head of state and may also be head of government, as well as female heads of government who are not concurrently head of state, such as prime ministers. The list does not include female monarchs who are head of state. To date, the country with the most female heads of state is San Marino (16, with three of them serving two non-consecutive terms), followed by Switzerland with seven. Among countries with only a single person in the position, Haiti has had the most female heads of state or government, with four. Included are women who have been appointed representatives of heads of state, such as female governors - general and French representatives of Andorra. As governors - general are appointed representatives of the monarch of the Commonwealth realms (currently Elizabeth II) and the French representatives of Andorra are appointed representatives of the French Co-Prince of Andorra (currently Emmanuel Macron), they act as heads of state and carry out on a regular basis the functions and duties associated with such a role in the Commonwealth realms (excluding the United Kingdom, which has no governor - general, as the monarch of the Commonwealth realms primarily resides there) and Andorra, respectively. Also included in the list are eleven women who have held an office styled either as Prime Minister or State Counsellor during periods when Guyana (since 1980), Sri Lanka (since 1978), Namibia (since 1990), South Korea (since 1987), Peru (since 1993) and Myanmar (since 2016) had an executive presidency and the prime minister (or a similar position) was not legally and constitutionally the head of government, but rather a deputy to the president, who was the combined head of state and head of government.
where does the most metabolic activity in the cell occur
Cytosol - wikipedia The cytosol or cytoplasmic matrix is the liquid found inside cells. It constitutes most of the intracellular fluid (ICF). It is separated into compartments by membranes. For example, the mitochondrial matrix separates the mitochondrion into many compartments. In the eukaryotic cell, the cytosol is within the cell membrane and is part of the cytoplasm, which also comprises the mitochondria, plastids, and other organelles (but not their internal fluids and structures); the cell nucleus is separate. The cytosol is thus a liquid matrix around the organelles. In prokaryotes, most of the chemical reactions of metabolism take place in the cytosol, while a few take place in membranes or in the periplasmic space. In eukaryotes, while many metabolic pathways still occur in the cytosol, others are contained within organelles. The cytosol is a complex mixture of substances dissolved in water. Although water forms the large majority of the cytosol, its structure and properties within cells are not well understood. The concentrations of ions such as sodium and potassium are different in the cytosol than in the extracellular fluid; these differences in ion levels are important in processes such as osmoregulation, cell signaling, and the generation of action potentials in excitable cells such as endocrine, nerve and muscle cells. The cytosol also contains large amounts of macromolecules, which can alter how molecules behave, through macromolecular crowding. Although it was once thought to be a simple solution of molecules, the cytosol has multiple levels of organization. These include concentration gradients of small molecules such as calcium, large complexes of enzymes that act together to carry out metabolic pathways, and protein complexes such as proteasomes and carboxysomes that enclose and separate parts of the cytosol. The term "cytosol '' was first introduced in 1965 by H.A. 
Lardy, and initially referred to the liquid that was produced by breaking cells apart and pelleting all the insoluble components by ultracentrifugation. Such a soluble cell extract is not identical to the soluble part of the cell cytoplasm and is usually called a cytoplasmic fraction. The term cytosol is now used to refer to the liquid phase of the cytoplasm in an intact cell. This excludes any part of the cytoplasm that is contained within organelles. Due to the possibility of confusion between the use of the word "cytosol '' to refer to both extracts of cells and the soluble part of the cytoplasm in intact cells, the phrase "aqueous cytoplasm '' has been used to describe the liquid contents of the cytoplasm of living cells. Prior to this, other terms were used for the cell fluid, not always synonymously, as its nature was not very clear (see protoplasm). The proportion of cell volume that is cytosol varies: for example, while this compartment forms the bulk of cell structure in bacteria, in plant cells the main compartment is the large central vacuole. The cytosol consists mostly of water, dissolved ions, small molecules, and large water - soluble molecules (such as proteins). The majority of these non-protein molecules have a molecular mass of less than 300 Da. This mixture of small molecules is extraordinarily complex, as the variety of molecules that are involved in metabolism (the metabolites) is immense. For example, up to 200,000 different small molecules might be made in plants, although not all these will be present in the same species, or in a single cell. Estimates of the number of metabolites in single cells such as E. coli and baker 's yeast predict that under 1,000 are made. Most of the cytosol is water, which makes up about 70 % of the total volume of a typical cell. The pH of the intracellular fluid is about 7.4; human cytosolic pH ranges between 7.0 and 7.4, and is usually higher when a cell is growing. 
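As a quick illustration of what the quoted pH range means in concentration terms, the hydrogen - ion concentration can be computed from the definition pH = -log10([H+]). This is a generic sketch (the function name and printed values are my own, not from the article):

```python
# pH is defined as -log10([H+]), so [H+] = 10**(-pH), in moles per litre.
def hydrogen_ion_concentration(ph):
    """Hydrogen-ion concentration (mol/L) for a given pH."""
    return 10.0 ** (-ph)

# Cytosolic pH range quoted above: 7.0 to 7.4.
c_at_7_0 = hydrogen_ion_concentration(7.0)  # 1.0e-7 mol/L
c_at_7_4 = hydrogen_ion_concentration(7.4)  # about 4.0e-8 mol/L
print(f"pH 7.0 -> {c_at_7_0:.2e} M, pH 7.4 -> {c_at_7_4:.2e} M")
print(f"Fold difference: {c_at_7_0 / c_at_7_4:.2f}")
```

Because the scale is logarithmic, the quoted 0.4 - unit range corresponds to roughly a 2.5 - fold change in hydrogen - ion concentration.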
The viscosity of cytoplasm is roughly the same as pure water, although diffusion of small molecules through this liquid is about fourfold slower than in pure water, due mostly to collisions with the large numbers of macromolecules in the cytosol. Studies in the brine shrimp have examined how water affects cell functions; these showed that a 20 % reduction in the amount of water in a cell inhibits metabolism, with metabolism decreasing progressively as the cell dries out, and all metabolic activity halting when the water level reaches 70 % below normal. Although water is vital for life, the structure of this water in the cytosol is not well understood, mostly because methods such as nuclear magnetic resonance spectroscopy only give information on the average structure of water, and cannot measure local variations at the microscopic scale. Even the structure of pure water is poorly understood, due to the ability of water to form structures such as water clusters through hydrogen bonds. The classic view of water in cells is that about 5 % of this water is strongly bound by solutes or macromolecules as water of solvation, while the majority has the same structure as pure water. This water of solvation is not active in osmosis and may have different solvent properties, so that some dissolved molecules are excluded, while others become concentrated. However, others argue that the effects of the high concentrations of macromolecules in cells extend throughout the cytosol and that water in cells behaves very differently from the water in dilute solutions. These ideas include the proposal that cells contain zones of low - and high - density water, which could have widespread effects on the structures and functions of the other parts of the cell. 
However, the use of advanced nuclear magnetic resonance methods to directly measure the mobility of water in living cells contradicts this idea, as it suggests that 85 % of cell water acts like pure water, while the remainder is less mobile and probably bound to macromolecules. The concentrations of the other ions in cytosol are quite different from those in extracellular fluid, and the cytosol also contains much higher amounts of charged macromolecules such as proteins and nucleic acids than the extracellular fluid. In contrast to extracellular fluid, cytosol has a high concentration of potassium ions and a low concentration of sodium ions. This difference in ion concentrations is critical for osmoregulation, since if the ion levels were the same inside a cell as outside, water would enter constantly by osmosis, because the levels of macromolecules inside cells are higher than their levels outside. Instead, sodium ions are expelled and potassium ions taken up by the Na + / K + - ATPase; potassium ions then flow down their concentration gradient through potassium - selective ion channels, and this loss of positive charge creates a negative membrane potential. To balance this potential difference, negative chloride ions also exit the cell, through selective chloride channels. The loss of sodium and chloride ions compensates for the osmotic effect of the higher concentration of organic molecules inside the cell. Cells can deal with even larger osmotic changes by accumulating osmoprotectants such as betaines or trehalose in their cytosol. Some of these molecules can allow cells to survive being completely dried out and allow an organism to enter a state of suspended animation called cryptobiosis. In this state the cytosol and osmoprotectants become a glass - like solid that helps stabilize proteins and cell membranes from the damaging effects of desiccation. 
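The negative membrane potential described above, created by potassium flowing out through selective channels, can be estimated with the Nernst equation, E = (RT / zF) ln([K+]out / [K+]in). The sketch below uses typical textbook concentrations for a mammalian cell (about 140 mM potassium inside, 5 mM outside); those numbers are assumptions, not figures from this article:

```python
import math

R = 8.314    # gas constant, J/(mol*K)
F = 96485.0  # Faraday constant, C/mol

def nernst_potential(c_out, c_in, z=1, temp_k=310.0):
    """Equilibrium (Nernst) potential in volts for an ion of valence z."""
    return (R * temp_k) / (z * F) * math.log(c_out / c_in)

# Assumed typical mammalian values: [K+] inside ~140 mM, outside ~5 mM.
e_k = nernst_potential(5.0, 140.0)
print(f"K+ equilibrium potential: {e_k * 1000:.0f} mV")  # about -89 mV
```

The strongly negative result reflects the mechanism in the text: potassium leaving the cell down its concentration gradient leaves the interior negatively charged.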
The low concentration of calcium in the cytosol allows calcium ions to function as a second messenger in calcium signaling. Here, a signal such as a hormone or an action potential opens calcium channels so that calcium floods into the cytosol. This sudden increase in cytosolic calcium activates other signalling molecules, such as calmodulin and protein kinase C. Other ions such as chloride and potassium may also have signaling functions in the cytosol, but these are not well understood. Protein molecules that do not bind to cell membranes or the cytoskeleton are dissolved in the cytosol. The amount of protein in cells is extremely high, and approaches 200 mg / ml, occupying about 20 - 30 % of the volume of the cytosol. However, measuring precisely how much protein is dissolved in cytosol in intact cells is difficult, since some proteins appear to be weakly associated with membranes or organelles in whole cells and are released into solution upon cell lysis. Indeed, in experiments where the plasma membrane of cells was carefully disrupted using saponin, without damaging the other cell membranes, only about one quarter of cell protein was released. These cells were also able to synthesize proteins if given ATP and amino acids, implying that many of the enzymes in cytosol are bound to the cytoskeleton. However, the idea that the majority of the proteins in cells are tightly bound in a network called the microtrabecular lattice is now seen as unlikely. In prokaryotes the cytosol contains the cell 's genome, within a structure known as a nucleoid. This is an irregular mass of DNA and associated proteins that control the transcription and replication of the bacterial chromosome and plasmids. In eukaryotes the genome is held within the cell nucleus, which is separated from the cytosol by nuclear pores that block the free diffusion of any molecule larger than about 10 nanometres in diameter. 
This high concentration of macromolecules in cytosol causes an effect called macromolecular crowding, which is when the effective concentration of other macromolecules is increased, since they have less volume to move in. This crowding effect can produce large changes in both the rates and the position of chemical equilibrium of reactions in the cytosol. It is particularly important in its ability to alter dissociation constants by favoring the association of macromolecules, such as when multiple proteins come together to form protein complexes, or when DNA - binding proteins bind to their targets in the genome. Although the components of the cytosol are not separated into regions by cell membranes, these components do not always mix randomly, and several levels of organization can localize specific molecules to defined sites within the cytosol. Although small molecules diffuse rapidly in the cytosol, concentration gradients can still be produced within this compartment. A well - studied example is the "calcium sparks '' that are produced for a short period in the region around an open calcium channel. These are about 2 micrometres in diameter and last for only a few milliseconds, although several sparks can merge to form larger gradients, called "calcium waves ''. Concentration gradients of other small molecules, such as oxygen and adenosine triphosphate, may be produced in cells around clusters of mitochondria, although these are less well understood. Proteins can associate to form protein complexes; these often contain a set of proteins with similar functions, such as enzymes that carry out several steps in the same metabolic pathway. This organization can allow substrate channeling, which is when the product of one enzyme is passed directly to the next enzyme in a pathway without being released into solution. 
Channeling can make a pathway more rapid and efficient than it would be if the enzymes were randomly distributed in the cytosol, and can also prevent the release of unstable reaction intermediates. Although a wide variety of metabolic pathways involve enzymes that are tightly bound to each other, others may involve more loosely associated complexes that are very difficult to study outside the cell. Consequently, the importance of these complexes for metabolism in general remains unclear. Some protein complexes contain a large central cavity that is isolated from the remainder of the cytosol. One example of such an enclosed compartment is the proteasome. Here, a set of subunits form a hollow barrel containing proteases that degrade cytosolic proteins. Since these would be damaging if they mixed freely with the remainder of the cytosol, the barrel is capped by a set of regulatory proteins that recognize proteins with a signal directing them for degradation (a ubiquitin tag) and feed them into the proteolytic cavity. Another large class of protein compartments is the bacterial microcompartments, which are made of a protein shell that encapsulates various enzymes. These compartments are typically about 100-200 nanometres across and made of interlocking proteins. A well-understood example is the carboxysome, which contains enzymes involved in carbon fixation such as RuBisCO. Although the cytoskeleton is not part of the cytosol, the presence of this network of filaments restricts the diffusion of large particles in the cell. For example, in several studies tracer particles larger than about 25 nanometres (about the size of a ribosome) were excluded from parts of the cytosol around the edges of the cell and next to the nucleus. These "excluding compartments" may contain a much denser meshwork of actin fibres than the remainder of the cytosol. 
These microdomains could influence the distribution of large structures such as ribosomes and organelles within the cytosol by excluding them from some areas and concentrating them in others. The cytosol has no single function and is instead the site of multiple cell processes. Examples of these processes include signal transduction from the cell membrane to sites within the cell, such as the cell nucleus, or organelles. This compartment is also the site of many of the processes of cytokinesis, after the breakdown of the nuclear membrane in mitosis. Another major function of cytosol is to transport metabolites from their site of production to where they are used. This is relatively simple for water - soluble molecules, such as amino acids, which can diffuse rapidly through the cytosol. However, hydrophobic molecules, such as fatty acids or sterols, can be transported through the cytosol by specific binding proteins, which shuttle these molecules between cell membranes. Molecules taken into the cell by endocytosis or on their way to be secreted can also be transported through the cytosol inside vesicles, which are small spheres of lipids that are moved along the cytoskeleton by motor proteins. The cytosol is the site of most metabolism in prokaryotes, and a large proportion of the metabolism of eukaryotes. For instance, in mammals about half of the proteins in the cell are localized to the cytosol. The most complete data are available in yeast, where metabolic reconstructions indicate that the majority of both metabolic processes and metabolites occur in the cytosol. Major metabolic pathways that occur in the cytosol in animals are protein biosynthesis, the pentose phosphate pathway, glycolysis and gluconeogenesis. The localization of pathways can be different in other organisms, for instance fatty acid synthesis occurs in chloroplasts in plants and in apicoplasts in apicomplexa.
what happened between caleb and ashley on heartland
Heartland (Canadian TV series) - wikipedia Heartland is a Canadian family drama television series which debuted on CBC on October 14, 2007. The series is based on the Heartland book series by Lauren Brooke. Heartland follows sisters Amy and Lou Fleming, their grandfather Jack Bartlett, and Ty Borden, through the highs and lows of life at the ranch. As of the episode shown on March 29, 2015, Heartland surpassed Street Legal as the longest-running one-hour scripted drama in the history of Canadian television. It celebrated its tenth anniversary in 2016, and Season 10 began airing October 2, 2016. It was announced on March 22, 2017 that Heartland was renewed for Season 11. The season premiered on September 24, 2017. Much of the series is filmed on location in and around High River, Alberta, with additional filming in studio and on location in nearby Calgary. A June 2013 flood in High River swamped the standing set for Maggie's Diner. Entertainment One has released the first seven seasons of Heartland on DVD in Region 1 (Canada only). The standalone TV movie A Heartland Christmas was released on DVD in Canada on November 1, 2011 and in the USA on October 29, 2013. Season 8 was released in Canada on October 6, 2015. In Region 2, 4Digital Media has released the first 6 seasons on DVD in the UK. In Region 4, Season 1 and Season 2 Parts 1 and 2 have been released by Roadshow Home Video. Seasons 1-9 are available on Netflix and Hulu; US season 10 will be available in 2018. In its series premiere, Heartland beat out Global's comedy Da Kink in My Hair with 513,000 viewers in a battle of two new series. After four episodes, Heartland had an average viewership of 464,000. In its first-season finale, Heartland attracted 625,000 viewers. The third-season premiere brought in over 1 million viewers, a new record for the show. The 100th episode "After All We've Been Through" was watched by 945,000 viewers.
who won the war between india and pakistan in 1971
Indo-Pakistani War of 1971 - wikipedia (Infobox summary: Indian losses, 2,500-3,843 killed; Pakistani losses per neutral claims, 9,000 killed, 25,000 wounded and 97,368 captured, plus 2 destroyers, 1 minesweeper, 1 submarine, 3 patrol vessels and 7 gunboats. In the timeline, § indicates events in the internal resistance movement linked to the Indo-Pakistani War, and ‡ indicates events in the Indo-Pakistani War linked to the internal resistance movement in Bangladesh.) The Indo-Pakistani War of 1971 was a military confrontation between India and Pakistan that occurred during the liberation war in East Pakistan, from 3 December 1971 to the fall of Dacca (Dhaka) on 16 December 1971. The war began with Pakistani preemptive aerial strikes on 11 Indian air stations, which led to the commencement of hostilities with Pakistan and Indian entry into the war of independence in East Pakistan on the side of Bengali nationalist forces. Lasting just 13 days, it is one of the shortest wars in history. During the war, the Indian and Pakistani militaries clashed simultaneously on the eastern and western fronts; the war ended after the Eastern Command of the Pakistan military signed the Instrument of Surrender on 16 December 1971 in Dhaka, marking the formation of East Pakistan into the new nation of Bangladesh. Officially, East Pakistan had earlier declared its secession from Pakistan on 26 March 1971. Approximately 90,000 to 93,000 Pakistani servicemen were taken prisoner by the Indian Army, including 79,676 to 81,000 uniformed personnel of the Pakistan Armed Forces, among them some Bengali soldiers who had remained loyal to Pakistan. The remaining 10,324 to 12,500 prisoners were civilians, either family members of the military personnel or collaborators (razakars). It is estimated that between 300,000 and 3,000,000 civilians were killed in Bangladesh. 
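The prisoner figures quoted above can be cross-checked with a few lines of arithmetic. This is only a sketch restating the numbers from the text; the small mismatch at the top of the ranges simply reflects that these are independent rough estimates.

```python
# Consistency check of the POW figures quoted in the text.
total = (90_000, 93_000)      # total prisoners taken by the Indian Army (low, high)
uniformed = (79_676, 81_000)  # uniformed personnel of the Pakistan Armed Forces
civilians = (10_324, 12_500)  # family members and collaborators (razakars)

# The low ends add up exactly: 79,676 + 10,324 = 90,000.
assert uniformed[0] + civilians[0] == total[0]

# The high ends overshoot slightly (81,000 + 12,500 = 93,500 vs 93,000).
print(uniformed[1] + civilians[1] - total[1])  # → 500
```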
As a result of the conflict, a further eight to ten million people fled the country to seek refuge in India. During the 1971 Bangladesh war for independence, members of the Pakistani military and supporting Islamist militias called the Razakars raped between 200,000 and 400,000 Bangladeshi women and girls in a systematic campaign of genocidal rape. The Indo-Pakistani conflict was sparked by the armed liberation struggle in East Pakistan between the dominant Bengalis and the multi-ethnic West Pakistanis over the right to govern and the constitution. The political tensions between East Bengal and West Pakistan had their origins in the creation of Pakistan as a result of the partition of India by the United Kingdom in 1947; the popular language movement in 1950; mass riots in East Bengal in 1964; and the mass protests in 1969. These led to the resignation of President Ayub Khan, who invited army chief General Yahya Khan to take over the central government. The geographical distance between the eastern and western wings of Pakistan was vast; East Pakistan lay over 1,000 miles (1,600 km) away, which greatly hampered any attempt to integrate the Bengali and the Pakistani cultures. To overcome the Bengali domination and prevent formation of the central government in Islamabad, the controversial One Unit program established the two wings of East and West Pakistan. West Pakistanis' opposition to these efforts made it difficult to effectively govern both wings. In 1969, President Yahya Khan announced the first general elections and disestablished the status of West Pakistan as a single province in 1970, in order to restore it to its original heterogeneous status comprising four provinces, as defined at the time of the establishment of Pakistan in 1947. In addition, there were also religious and racial tensions between Bengalis and the multi-ethnic West Pakistanis, as Bengalis looked different from the dominant West Pakistanis. 
The general elections, held in 1970, resulted in East Pakistan's Awami League gaining 167 out of 169 seats for the East Pakistan Legislative Assembly, and a near-absolute majority in the 313-seat National Assembly, while the vote in West Pakistan was mostly won by the socialist Pakistan Peoples Party. The Awami League leader Sheikh Mujibur Rahman stressed his political position by presenting his Six Points and endorsing the Bengalis' right to govern. The League's election success caused many West Pakistanis to fear that it would allow the Bengalis to draft the constitution based on the six points and liberalism. To resolve the crisis, the Ahsan-Yaqub Mission was formed to provide recommendations, and its findings were met with favourable reviews from the Awami League, the Pakistan Peoples Party, and the Pakistan Muslim League, as well as from President Yahya Khan. However, the mission was not supported by elements in the National Security Council and was subsequently vetoed. Zulfikar Ali Bhutto, the chairman of the Pakistan Peoples Party, endorsed the veto and subsequently refused to yield the premiership of Pakistan to Sheikh Mujibur Rahman. The Awami League called for general strikes in the country. President Yahya Khan postponed the inauguration of the National Assembly, causing shattering disillusionment among the Awami League and its supporters throughout East Pakistan. In reaction, Sheikh Mujibur Rahman called for general strikes that eventually shut down the government, and dissidents in the East began targeting the ethnic Bihari community, which had supported West Pakistan. In early March 1971, approximately 300 Biharis were slaughtered in riots by Bengali mobs in Chittagong alone. The Government of Pakistan used the "Bihari massacre" to justify its deployment of the military in East Pakistan on 25 March, when it initiated its military crackdown. 
President Yahya Khan called on the military, which was overwhelmingly led by West Pakistanis, to suppress dissent in the East, after accepting the resignation of Lieutenant-General Yaqub Ali Khan, the chief of staff of the East Pakistani military. Mass arrests of dissidents began and, after several days of strikes and non-cooperation, the Pakistani military, led by Lieutenant-General Tikka Khan, cracked down on Dhaka on the night of 25 March 1971. The government outlawed the Awami League, which forced many of its members and sympathisers into refuge in Eastern India. Mujib was arrested on the night of 25/26 March 1971 at about 1:30 am (as per Radio Pakistan's news on 29 March 1971) and taken to West Pakistan. Operation Searchlight, followed by Operation Barisal, attempted to kill the intellectual elite of the East. On 26 March 1971, Major Ziaur Rahman of the Pakistan Army declared the independence of Bangladesh on behalf of Sheikh Mujibur Rahman. In April, the exiled Awami League leaders formed a government-in-exile in Baidyanathtala of Meherpur. The East Pakistan Rifles and Bengali officers in Pakistan's army, navy, and marines defected to the rebellion after taking refuge in different parts of India. The Bangladesh Force, namely the Mukti Bahini, consisting of the Niyomito Bahini (Regular Force) and Oniyomito Bahini (Guerilla Force), was formed under the retired colonel Mohammad Ataul Gani Osmani. After the resignations of Admiral S.M. Ahsan and Lieutenant-General Yaqub Ali Khan, media correspondents began airing reports of the Pakistani military's widespread genocide against their Bengali citizens, particularly aimed at the minority Bengali Hindu population, which led to approximately 10 million people seeking refuge in the neighbouring states of Eastern India. 
The Indian government opened the East Pakistan-India border to allow the Bengali refugees to find safe shelter; the governments of West Bengal, Bihar, Assam, Meghalaya and Tripura established refugee camps along the border. The resulting flood of impoverished East Pakistani refugees strained India's already overburdened economy. The Indian government repeatedly appealed to the international community for assistance, but failed to elicit any response, despite the External Affairs minister Swaran Singh meeting the foreign ministers of other countries. Prime Minister Indira Gandhi on 27 March 1971 expressed full support of her government for the independence struggle of the people of East Pakistan, and concluded that, instead of taking in millions of refugees, it would be more economical to go to war against Pakistan. On 28 April 1971, the Gandhi cabinet ordered the Chief of the Army Staff, General Sam Manekshaw, to "Go into East Pakistan". East Pakistani military officers who had defected and elements of the Indian Research and Analysis Wing (RAW) immediately started using the Indian refugee camps for the recruitment and training of Mukti Bahini guerrillas to be deployed against Pakistan. In 1971, a strong wave of Indian-supported Bangladeshi nationalism emerged in the East. Violence and systematic targeted killings of unarmed multi-ethnic Pakistanis living in the East began. Vehicle bombings of government secretariats became a normal narrative in news reports, and high-profile assassinations of Bengali politicians loyal to Pakistan became common in the East. According to Jussi Hanhimäki, a Finnish historian of terrorism, the Bengali terrorism in the East is a somewhat "forgotten episode of the annals of terrorism." The Hamoodur Rahman Commission endorsed the claims of Bengali terrorism when it critically noted that the ill-treatment of families of multi-ethnic Pakistanis led to Pakistani military soldiers reacting violently to restore the writ of the government. 
The news media's mood in Pakistan had also turned increasingly jingoistic and militaristic against East Pakistan and India as the Pakistani news media reported on the complexity of the situation in the East, though the reactions from Pakistan's news media pundits were mixed. By the end of September 1971, a propaganda campaign, possibly orchestrated by elements within the Government of Pakistan, resulted in stickers endorsing Crush India becoming a standard feature on the rear windows of vehicles in Rawalpindi, Islamabad and Lahore; this soon spread to the rest of West Pakistan. By October, other stickers proclaimed Hang the Traitor in an apparent reference to Sheikh Mujibur Rahman. By the first week of December, the conservative print media outlets in the country had published jihad-related materials to boost recruitment in the military. By the end of April 1971, Prime Minister Indira Gandhi had asked the Indian Army chief General Sam Manekshaw if he was ready to go to war with Pakistan. According to Manekshaw's own personal account, he refused, citing the onset of the monsoon season in East Pakistan and the fact that the army's tanks were being refitted. He offered his resignation, which Gandhi declined. He then said he could guarantee victory if she would allow him to prepare for the conflict on his terms, and set a date for it; Gandhi accepted his conditions. In reality, Gandhi was well aware of the difficulties of a hasty military action, but she needed the military's views on record to satisfy her hawkish colleagues and public opinion, which were critical of India's restraint. By November 1971, an Indo-Pakistani war seemed inevitable. The Soviet Union reportedly warned Pakistan against the war, which it termed a "suicidal course for Pakistan's unity." Despite this warning, in November 1971, thousands of people led by conservative Pakistani politicians marched in Lahore and across Pakistan, calling for Pakistan to Crush India. 
India responded by starting a massive buildup of the Indian Army on the western borders; the army waited until December, when the drier ground in the East would make for easier operations and the Himalayan passes would be closed by snow, preventing any Chinese intervention. On 23 November, President Yahya Khan declared a state of emergency in all of Pakistan and told his people to prepare for war. On the evening of 3 December, at about 5:40 pm, the Pakistan Air Force (PAF) launched surprise pre-emptive strikes on eleven airfields in north-western India, including Agra, which was 300 miles (480 km) from the border. At the time of this attack, the Taj Mahal had been camouflaged with a forest of twigs and leaves and draped with burlap, because its marble glowed like a white beacon in the moonlight. These preemptive strikes, known as Operation Chengiz Khan, were inspired by the success of the Israeli Operation Focus in the Arab-Israeli Six-Day War. Unlike the Israeli attack on Arab airbases in 1967, which involved a large number of Israeli planes, Pakistan flew no more than 50 planes against India. In an address to the nation on radio that same evening, Prime Minister Gandhi held that the air strikes were a declaration of war against India, and the Indian Air Force (IAF) responded with initial air strikes that very night. These expanded to massive retaliatory air strikes the next morning. This air action marked the official start of the Indo-Pakistani War of 1971; Prime Minister Gandhi ordered the immediate mobilisation of troops and launched a full-scale invasion of Pakistan. This involved Indian forces in massive coordinated air, sea and land assaults on Pakistan from all fronts. The main Indian objective on the Eastern front was to capture Dacca, and on the Western front it was to prevent Pakistan from entering Indian soil. There was no Indian intention of conducting any major offensive into Pakistan to dismember it into different states. 
Unlike in the 1965 war, the Navy NHQ staffers and commanders of the Pakistan Navy knew very well that the Navy was ill-prepared for a naval conflict with India. The Pakistan Navy was in no condition to fight an offensive war in the deep sea against the Indian Navy, nor was it in a condition to mount a serious defence against the Indian Navy's seaborne encroachment. In the western theatre of the war, the Indian Navy's Western Naval Command, under Vice Admiral S.N. Kohli, successfully launched a surprise attack on Karachi port on the night of 4/5 December 1971 under the codename Trident. The naval attack, involving Soviet-built Osa missile boats, sank the Pakistan Navy's destroyer PNS Khyber and minesweeper PNS Muhafiz, while PNS Shah Jahan was also badly damaged. Pakistani naval sources reported that about 720 Pakistani sailors were killed or wounded, and Pakistan lost reserve fuel and many commercial ships, thus crippling the Pakistan Navy's further involvement in the conflict. In retaliation, the Pakistan Navy submarines Hangor, Mangro, and Shushuk began operations to seek out major Indian warships. On 9 December 1971, Hangor reportedly sank INS Khukri, inflicting 194 Indian casualties; this was the first submarine kill since World War II. The sinking of INS Khukri was followed by another Indian attack on Karachi port on the night of 8/9 December 1971 under the codename Python. A squadron of the Indian Navy's Osa missile boats approached the Karachi port and launched a series of Soviet-acquired Styx missiles, resulting in further destruction of reserve fuel tanks and the sinking of three Pakistani merchant ships, as well as foreign ships docked in Karachi. 
The Pakistan Air Force did not attack the Indian Navy ships, and confusion persisted the next day when civilian pilots of Pakistan International, acting as reconnaissance war pilots, misidentified PNS Zulfiqar, and the air force attacked its own warship, inflicting major damage and killing several officers on board. In the eastern theatre of the war, the Indian Eastern Naval Command, under Vice Admiral Nilakanta Krishnan, completely isolated East Pakistan with a naval blockade in the Bay of Bengal, trapping the Eastern Pakistan Navy and eight foreign merchant ships in their ports. From 4 December onwards, the aircraft carrier INS Vikrant was deployed, and its Sea Hawk fighter-bombers attacked many coastal towns in East Pakistan, including Chittagong and Cox's Bazar. Pakistan countered the threat by sending the submarine PNS Ghazi, which sank en route under mysterious circumstances off Visakhapatnam's coast. Due to the high number of defections, the Navy deployed the Pakistan Marines, led by Rear Admiral Leslie Mungavin, to conduct riverine operations against the Indian Army, but they too suffered major losses, mainly due to their lack of understanding of expeditionary warfare and the wet terrain of East Pakistan. The damage inflicted on the Pakistan Navy stood at 7 gunboats, 1 minesweeper, 1 submarine, 2 destroyers, 3 patrol craft belonging to the coast guard, and 18 cargo, supply and communication vessels, with large-scale damage inflicted on the naval base and docks in the coastal town of Karachi. Three merchant navy ships -- Anwar Baksh, Pasni and Madhumathi -- and ten smaller vessels were captured. Around 1,900 personnel were lost, while 1,413 servicemen were captured by Indian forces in Dacca. According to one Pakistani scholar, Tariq Ali, Pakistan lost half its navy in the war. After the sneak attack, the PAF adopted a defensive stance in response to the Indian retaliation. 
As the war progressed, the IAF continued to battle the PAF over conflict zones, but the number of sorties flown by the PAF decreased day by day. The IAF flew 4,000 sorties while the PAF offered little in retaliation, partly because of the paucity of non-Bengali technical personnel. This lack of retaliation has also been attributed to a deliberate decision by the PAF's Air HQ to cut its losses, as it had already incurred huge losses in the liberation war in the East. The PAF avoided making contact with the Indian Navy after the latter raided the port of Karachi twice, but the PAF did retaliate by bombing Okha harbour, destroying the fuel tanks used by the boats that had attacked. In the East, No. 14 Squadron Tail Choppers, under Squadron Leader PQ Mehdi (who was taken as a POW), was destroyed, putting the Dhaka air defence out of commission and resulting in Indian air superiority in the East. At the end of the war, PAF pilots made daring and successful escapes from East Pakistan to neighbouring Burma; many PAF personnel had already left the East for Burma on their own before Dacca was overrun by the Indian military in December 1971. As the Indian Army tightened its grip on East Pakistan, the Indian Air Force continued its attacks against Pakistan as the campaign developed into a series of daylight anti-airfield, anti-radar, and close-support attacks by fighter jets, with night attacks against airfields and strategic targets by B-57s and C-130s of Pakistan and Canberras and An-12s of India. The PAF deployed its F-6s mainly on defensive combat air patrol missions over its own bases, but without air superiority it was unable to conduct effective offensive operations. The IAF's raids damaged one USAF and one UN aircraft in Dacca, while the RCAF's DHC-4 Caribou was also destroyed in Islamabad, along with the USAF's Beech U-8 owned by the US military's liaison chief, Brigadier-General Chuck Yeager. 
Sporadic raids by the IAF continued against PAF forward air bases in Pakistan until the end of the war, and interdiction and close-support operations were maintained. One of the most successful air raids by India into West Pakistan happened on 8 December 1971, when Indian Hunter aircraft from the Pathankot-based 20 Squadron attacked the Pakistani base in Murid and destroyed 5 F-86 aircraft on the ground. This was confirmed by Pakistan's military historian, Air Commodore M Kaiser Tufail, in his book 'In The Ring and On Its Feet - Pakistan Air Force in the 1971 Indo-Pak War'. The PAF played a more limited role in the operations and was reinforced by F-104s from Jordan, Mirages from an unidentified Middle Eastern ally, and F-86s from Saudi Arabia. Their arrival helped camouflage the extent of PAF losses, and Libyan F-5s were reportedly deployed to Sargodha AFB, perhaps as a potential training unit to prepare Pakistani pilots for an influx of more F-5s from Saudi Arabia. The IAF was able to conduct a wide range of missions -- troop support, air combat, deep penetration strikes, para-dropping behind enemy lines, feints to draw enemy fighters away from the actual target, bombing, and reconnaissance. The PAF, which was solely focused on air combat, was blown out of the subcontinent's skies within the first week of the war. Those PAF aircraft that survived took refuge at Iranian air bases or in concrete bunkers, refusing to offer a fight. India flew 1,978 sorties in the East and about 4,000 in Pakistan, while the PAF flew about 30 and 2,840 on the respective fronts. More than 80 percent of IAF sorties were close-support and interdiction, and about 45 IAF aircraft were lost. Pakistan lost 75 aircraft, not including any F-6s, Mirage IIIs, or the six Jordanian F-104s which failed to return to their donors. 
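Put in per-sortie terms, the sortie and loss figures above imply a markedly lower loss rate for the IAF than for the PAF. This sketch only restates the approximate numbers quoted in the text, so the computed rates are rough.

```python
# Per-sortie loss rates implied by the figures quoted in the text.
iaf_sorties = 1_978 + 4_000   # sorties flown in the East and in Pakistan
paf_sorties = 30 + 2_840      # PAF sorties on the respective fronts
iaf_losses = 45               # approximate IAF aircraft lost
paf_losses = 75               # approximate PAF aircraft lost

iaf_rate = iaf_losses / iaf_sorties
paf_rate = paf_losses / paf_sorties
print(f"IAF: {iaf_rate:.2%} lost per sortie")  # prints "IAF: 0.75% lost per sortie"
print(f"PAF: {paf_rate:.2%} lost per sortie")  # prints "PAF: 2.61% lost per sortie"
```

So although India lost fewer aircraft in absolute terms, the gap is even larger per sortie: the PAF's implied loss rate is more than three times the IAF's.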
The imbalance in air losses was explained by the IAF's considerably higher sortie rate and its emphasis on ground-attack missions. Before the start of the war, the Indian Army was well organised on both fronts and enjoyed significant numerical superiority over the Pakistan Army. The Indian Army's extraordinary war performance on both fronts restored the prestige, confidence, and dignity that it had lost during the war with China in 1962. When the conflict started, the war immediately took a decisive turn in favour of India and their Bengali rebel allies, militarily and diplomatically. On both fronts, Pakistan launched several ground offensives, but the Indian Army held its ground and initiated well-coordinated ground operations on both fronts. Major ground attacks were concentrated on the western border by the Pakistan Army, fighting together with the Pakistan Marines on the southern border, but the Indian Army was successful in penetrating into Pakistani soil. It eventually made some quick initial gains, including the capture of around 5,795 square miles (15,010 km²) of Pakistani territory; this land gained by India in the Azad Kashmir, Punjab and Sindh sectors was later ceded in the Simla Agreement of 1972, as a gesture of goodwill. Casualties inflicted on the Pakistan Army's I Corps, II Corps, and the Pakistan Marines' Punjab detachment were very high, and many soldiers and marines perished due to the lack of operational planning and coordination within the marine-army formations against the Indian Army's Southern and Western Commands. By the time the war came to an end, the army soldiers and marines were highly demoralised -- both emotionally and psychologically -- on the western front, and had no will to put up a defensive fight against the approaching Indian Army soldiers. 
The War Enquiry Commission later exposed the fact that the Pakistan Army and Pakistan Marines needed better arms and training for marines, soldiers and officers at every level of command. On 23 November 1971, the Indian Army conventionally penetrated the eastern fronts and crossed East Pakistan's borders to join their Bengali nationalist allies. Contrary to the 1965 war, which had emphasised set-piece battles and slow advances, this time the strategy adopted was a swift, three-pronged assault of nine infantry divisions with attached armoured units and close air support that rapidly converged on Dacca, the capital of East Pakistan. Lieutenant General Jagjit Singh Aurora, the GOC-in-C of the Indian Army's Eastern Command, led the full Indian thrust into East Pakistan. As the Indian Eastern Command attacked the Pakistan Eastern Command, the Indian Air Force rapidly destroyed the small air contingent in East Pakistan and put the Dacca airfield out of commission. In the meantime, the Indian Navy effectively blockaded East Pakistan. The Indian campaign's "blitzkrieg" techniques exploited weaknesses in the Pakistani positions and bypassed opposition; this resulted in a swift victory. Faced with insurmountable losses, the Pakistani military capitulated in less than a fortnight, and psychological panic spread in the Eastern Command's military leadership. Subsequently, the Indian Army encircled Dacca and, on 16 December 1971, issued an ultimatum to surrender within a "30-minute" time window. Upon hearing the ultimatum, the East Pakistan government collapsed when Lt-Gen. A.A.K. Niazi (commander of the Eastern Command) and his deputy, V-Adm. M.S. Khan, surrendered without offering any resistance. On 16 December 1971, Pakistan ultimately called for a unilateral ceasefire and surrendered its entire four-tier military to the Indian Army -- hence ending the Indo-Pakistani war of 1971. 
On the ground, Pakistan suffered the most, with 8,000 killed and 25,000 wounded, while India had 3,000 dead and 12,000 wounded. The loss of armoured vehicles was similarly imbalanced, and this finally represented a major defeat for Pakistan. Officially, the Instrument of Surrender of the Pakistan Eastern Command stationed in East Pakistan was signed between Lieutenant General Jagjit Singh Aurora, the GOC-in-C of the Indian Eastern Command, and Lieutenant-General A.A.K. Niazi, the Commander of the Pakistan Eastern Command, at the Ramna Race Course in Dacca at 16:31 hrs IST on 16 December 1971. As the surrender was accepted silently by Lieutenant-General Aurora, the surrounding crowds on the race course started shouting anti-Pakistan slogans, and there were reports of abuse aimed at the surrendering commanders of the Pakistani military. Hostilities officially ended at 14:30 GMT on 17 December, after the fall of Dacca on 15 December, and India claimed large gains of territory in Pakistan (although pre-war boundaries were recognised after the war). The war confirmed the independence of Bangladesh. Following the surrender, the Indian Army took approximately 90,000 Pakistani servicemen and their Bengali supporters as POWs, making it the largest surrender since World War II. Initial counts recorded that approximately 79,676 war prisoners were uniformed personnel, and the overwhelming majority of the war prisoners were officers; most were from the Army and Navy, while relatively small numbers were from the Air Force and Marines, with others in larger number serving in the paramilitary. The remaining prisoners were civilians who were either family members of the military personnel or collaborators (razakars). The Hamoodur Rahman Commission and the POW Investigation Commission reports instituted by Pakistan list the Pakistani POWs as given in the table below. Apart from soldiers, it was estimated that 15,000 Bengali civilians were also made prisoners of war. 
The Soviet Union sympathised with the East Pakistanis, and supported the Indian Army's and Mukti Bahini's incursion against Pakistan during the war, recognising that the secession of East Pakistan as independent Bangladesh would weaken the position of its rivals, the United States and China. The Soviet Union gave assurances to India that if a confrontation with the United States or China developed, it would take counter-measures. This assurance was enshrined in the Indo-Soviet Treaty of Friendship and Cooperation signed in August 1971. However, the Indo-Soviet treaty did not mean a total commitment to every Indian position, even though the Soviet Union had accepted the Indian position during the conflict, according to author Robert Jackson. The Soviet Union maintained a sympathetic posture toward Pakistan until mid-October 1971, when it pressed Pakistan to come up with a political settlement and affirmed its continuation of industrial aid to Pakistan. By November 1971, the Soviet ambassador to Pakistan, Alexei Rodionov, delivered a secret message (the Rodionov message) that ultimately warned Pakistan that "it will be embarking on a suicidal course if it escalates tensions in the subcontinent." The United States stood with Pakistan, supporting it morally, politically, economically and materially, as U.S. President Richard Nixon and his Secretary of State Henry Kissinger declined even rhetorical intervention in what they regarded as a large civil war. The U.S. establishment was under the impression that it needed Pakistan to help stop Soviet influence in South Asia, exercised through an informal alliance with India. During the Cold War, Pakistan was a close formal ally of the United States and also had close relations with the People's Republic of China, with whom Nixon had been negotiating a rapprochement and which he intended to visit in February 1972.
Nixon feared that an Indian invasion of Pakistan would mean total Soviet domination of the region, and that it would seriously undermine the global position of the United States and the regional position of America 's new tactical ally, China. Nixon encouraged Jordan and Iran to send military supplies to Pakistan, while also encouraging China to increase its arms supplies to Pakistan, but all supplies were very limited. The Nixon administration also ignored reports it received of the "genocidal '' activities of the Pakistani military in East Pakistan, most notably the Blood telegram, and this prompted widespread criticism and condemnation - both by the United States Congress and in the international press. Then U.S. Ambassador to the United Nations, George Bush, Sr, introduced a resolution in the UN Security Council calling for a cease - fire and the withdrawal of armed forces by India and Pakistan. However, it was vetoed by the Soviet Union, and the following days witnessed the use of great pressure on the Soviets from the Nixon - Kissinger duo to get India to withdraw, but to no avail. When Pakistan 's defeat in the eastern sector seemed certain, Nixon deployed Task Force 74 - led by the aircraft carrier USS Enterprise - into the Bay of Bengal. Enterprise and its escort ships arrived on station on 11 December 1971. According to a Russian documentary, the United Kingdom also deployed a carrier battle group led by the aircraft carrier HMS Eagle to the Bay, on her final deployment. On 6 and 13 December, the Soviet Navy dispatched two groups of cruisers and destroyers from Vladivostok; they trailed US Task Force 74 into the Indian Ocean from 18 December 1971 until 7 January 1972. The Soviets also had a nuclear submarine to help ward off the threat posed by the USS Enterprise task force in the Indian Ocean. 
As the war progressed, it became apparent to the United States that India was going to invade and disintegrate Pakistan in a matter of weeks; therefore President Nixon spoke with Soviet General Secretary Leonid Brezhnev on a hotline on 10 December, reportedly urging Brezhnev "in the strongest possible terms to restrain India with which... you (Brezhnev) have great influence and for whose actions you must share responsibility.'' After the war, the United States accepted the new balance of power and recognised India as a dominant player in South Asia; the US immediately engaged in strengthening bilateral relations between the two countries in the succeeding years. The Soviet Union, while sympathetic to Pakistan's loss, decided to engage with Pakistan, sending an invitation through Rodionov to Z.A. Bhutto, who paid a state visit to the Soviet Union in 1972 to strengthen bilateral relations, which continued over the years. During the course of the war, China harshly criticised India for its involvement in the East Pakistan crisis, and accused India of having imperialistic designs in South Asia. Before the war started, Chinese leaders and officials had long been advising the Pakistan government to reach a peaceful political settlement with the East Pakistani leaders, as China feared that India was secretly supporting, infiltrating, and arming the Bengali rebels against the East Pakistani government. China was also critical of the Government of East Pakistan, led by its Governor Lieutenant-General Tikka Khan, which used ruthless measures to deal with the Bengali opposition, and did not endorse the Pakistani position on that issue. When the war started, China reproached India for its direct involvement and infiltration in East Pakistan. It disagreed with Pakistani President Yahya Khan's consideration of military options, and criticised East Pakistani Awami League politicians' ties with India.
China reacted with great alarm when the prospect of an Indian invasion of Pakistan, and of the integration of Pakistan-administered Kashmir into the Indian side of Kashmir, became imminent. US President Nixon encouraged China to mobilise its armed forces along its border with India to discourage the Indian assault, but the Chinese did not respond, since the Indian Army's Northern Command was well prepared to guard the Line of Actual Control and was already engaging and making advances against the Pakistan Army's X Corps along the Line of Control. China did not welcome the break-up of Pakistan's unity by the East Pakistani politicians, and effectively vetoed the membership of Bangladesh when it applied to the United Nations in 1972. China objected to admitting Bangladesh on the grounds that two UN resolutions concerning Bangladesh, requiring the repatriation of Pakistani POWs and civilians, had not yet been implemented. Furthermore, China was among the last countries to recognise the independence of Bangladesh, refusing to do so until 31 August 1975. To this day, its relations with Bangladesh are shaped by the Pakistan factor. During the course of the conflict, Iran also stood with Pakistan politically and diplomatically. It was concerned with the imminent break-up of Pakistan which, it feared, would have caused the state to fragment into small pieces, ultimately resulting in Iran's encirclement by rivals. After the war, however, Iran began cementing ties with India based on mutual security co-operation. At the beginning of the conflict, Iran had helped Pakistan by sheltering PAF fighter jets and providing them with free fuel, in an attempt to keep Pakistan's territorial integrity intact.
When Pakistan called for a unilateral ceasefire and the surrender was announced, the Shah of Iran hastily responded by having the Iranian military prepare contingency plans to invade Pakistan and annex its Balochistan province into the Iranian side of Balochistan, by any means necessary, before anybody else did. The war stripped Pakistan of more than half of its population and, with nearly one-third of its army in captivity, clearly established India's military and political dominance of the subcontinent. India successfully led a diplomatic campaign to isolate Pakistan and skillfully manipulated Pakistan's supporting countries to limit the extent of their support. In addition, Prime Minister Indira Gandhi's state visits to the United Kingdom and France further helped break the ice with the United States, and blocked any pro-Pakistan resolution in the United Nations. There was also a meeting between Prime Minister Gandhi and President Nixon in November 1971, where she rejected the US advice against intervening in the conflict. The victory also defined India's much broader role in foreign politics, as many countries in the world, including the United States, had come to realise that the balance of power had shifted to India as a major player in the region. In the wake of changing geopolitical realities, India sought to establish closer relations with regional countries such as Iran, which was a traditional ally of Pakistan. The United States itself accepted the new balance of power, and when India conducted a surprise nuclear test in 1974, the US notified India that it had no "interest in actions designed to achieve new balance of power. '' In spite of the magnitude of the victory, India was surprisingly restrained in its reaction.
Mostly, Indian leaders seemed pleased by the relative ease with which they had accomplished their goals: the establishment of Bangladesh and the prospect of an early return to their homeland of the 10 million Bengali refugees who were the cause of the war. In announcing the Pakistani surrender, Prime Minister Indira Gandhi declared in the Indian Parliament: "Dacca is now the free capital of a free country. We hail the people of Bangladesh in their hour of triumph. All nations who value the human spirit will recognise it as a significant milestone in man's quest for liberty.'' Colonel John Gill of the National Defense University, US, remarks that, while India achieved a military victory, it was not able to reap the political fruits it might have hoped for in Bangladesh. After a brief 'honeymoon' phase between India and Bangladesh, their relationship began to sour. India's relations with Bangladesh have remained frequently problematic and tense. Whilst India enjoys excellent relations with Bangladesh during Awami League tenures, relations deteriorate when the Bangladesh Nationalist Party is in power. A 2014 Pew Research Center opinion poll in Bangladesh found that India was perceived as the greatest threat to Bangladesh, the top choice (27%) of Bangladeshis. Even so, 70% of Bangladeshis held a positive view of India, while 50% held a positive view of Pakistan. For Pakistan, the war was a complete and humiliating defeat, a psychological setback that came from defeat at the hands of rival India. Pakistan lost half its population and a significant portion of its economy, and suffered setbacks to its geopolitical role in South Asia. In the post-war era, Pakistan struggled to absorb the lessons of military intervention in the democratic system, and the impact of the Pakistani military's failure was grave and long-lasting.
From the geopolitical point of view, the war broke up the unity of Pakistan, until then the largest Muslim country in the world, through a politico-economic and military collapse that resulted from direct foreign intervention in 1971. Pakistani policy-making institutions further feared that the Two-nation theory had been disproved by the war, that Muslim nationalism had proved insufficient to keep Bengalis a part of Pakistan. The Pakistani people were not mentally prepared to accept the magnitude of this kind of defeat, as the state electronic media had been projecting imaginary victories; the privately owned electronic news media in East Pakistan, however, had reported the complexity of the situation. When the ceasefire that came with the surrender of East Pakistan was finally announced, the people could not come to terms with the magnitude of the defeat; spontaneous demonstrations and massive protests erupted on the streets of major metropolitan cities in Pakistan. According to Pakistani historians, the trauma was extremely severe, and the cost of the war for Pakistan in monetary terms and in human resources was very high. Demoralised and unable to control the situation, the Yahya administration fell when President Yahya Khan turned over his presidency to Zulfiqar Ali Bhutto, who was sworn in on 20 December 1971 as President with control of the military. The loss of East Pakistan shattered the prestige of the Pakistani military. Pakistan lost half its navy, a quarter of its air force, and a third of its army. The war also exposed the shortcomings of Pakistan's declared strategic doctrine that the "defence of East Pakistan lay in West Pakistan''. Hussain Haqqani, in his book Pakistan: Between Mosque and Military, notes that the army had failed to fulfill its promises of fighting to the last man: the eastern command had laid down arms after losing only 1,300 men in battle.
In West Pakistan 1,200 military deaths had accompanied lackluster military performance. In his book The 1971 Indo - Pak War: A Soldier 's Narrative, Pakistan Army 's Major General Hakeem Arshad Qureshi, a veteran of this conflict, noted: We must accept the fact that, as a people, we had also contributed to the bifurcation of our own country. It was not a Niazi, or a Yahya, even a Mujib, or a Bhutto, or their key assistants, who alone were the cause of our break - up, but a corrupted system and a flawed social order that our own apathy had allowed to remain in place for years. At the most critical moment in our history we failed to check the limitless ambitions of individuals with dubious antecedents and to thwart their selfish and irresponsible behaviour. It was our collective ' conduct ' that had provided the enemy an opportunity to dismember us. The Indian Army Chief in 1971, Field Marshal Sam Manekshaw, had the highest respect for the fighting capability of the Pakistan Army, and he did not accept the theory that they did not fight the war with enough vigour and zeal. In a BBC interview, he said: The Pakistan Army in East Pakistan fought very gallantly. But they had no chance. They were a thousand miles away from their base. I had eight or nine months to make my preparations. I had got a superiority of almost 15 to 1... However, independent defence sources stated that the Indian superiority was less than 2 to 1. The United States Air Force 's Brigadier - General Chuck Yeager, the World War II veteran and US flying ace who witnessed the war in 1971, is of the view that Pakistan did not lose the war, as India did not annex it. 
After the war, the Pakistan Army's generals in the East held each other responsible for the atrocities committed, but most of the blame was laid on Lieutenant-General Tikka Khan, who earned notoriety for his actions as governor of the East; he was called the "Butcher of Bengal'' because of the widespread atrocities committed within the areas of his responsibility. Unlike his contemporary Yaqub, who was a pacifist and knew well the limits of force, Tikka was a "soldier known for his eager use of force'' to settle his differences. Testifying at the hearings of the War Enquiry Commission, Lieutenant-General A.A.K. Niazi reportedly commented on Tikka's actions: "On the night between 25/26 March 1971, (General) Tikka struck. Peaceful night was turned into a time of wailing, crying and burning. (General) Tikka let loose everything at his disposal as if raiding an enemy, not dealing with his own misguided and misled people. The military action was a display of stark cruelty more merciless than the massacres at Bukhara and Baghdad by Chengiz Khan and Halaku Khan... (General) Tikka... resorted to the killing of civilians and a scorched earth policy. His orders to his troops were: 'I want the land, not the people...'.'' Major-General Rao Farman reportedly had written in his table diary: "Green land of East Pakistan will be painted red. It was painted red by Bengali blood.'' However, Farman forcefully denied writing that comment, and laid all responsibility on Tikka while testifying at the War Enquiry Commission in 1974. Major reforms were carried out by successive governments in Pakistan after the war in the light of the many insightful recommendations made in the Hamoodur Rahman Commission's report. To address economic disparity, the NFC system was established to distribute taxation revenue equally among the four provinces, and the large-scale nationalization of industries and a nationwide census were carried out in 1972.
The Constitution promulgated in 1973 reflected this equal balance and a compromise between Islamism and humanism, and guaranteed equal human rights to all. The military was heavily reconstructed and reorganised, with President Bhutto appointing chiefs of staff in each service in place of the former commanders-in-chief (C-in-Cs), and making instruction on human rights compulsory in the military syllabus of each branch. Major investments were directed towards modernising the navy. The military's chain of command was centralised in the Joint Staff HQ, led by an appointed Chairman of the Joint Chiefs of Staff Committee, to coordinate combined and well-integrated military efforts to safeguard the nation's defence and unity. In addition, Pakistan sought a diversified foreign policy, as Pakistani geostrategists had been shocked that both China and the United States provided only limited support during the course of the war, with the US displaying an inability to supply the weapons that Pakistan needed most. On 20 January 1972, Pakistan under Bhutto launched the clandestine development of nuclear weapons, with a view to "never to allow another foreign invasion of Pakistan.'' This crash program reached parity in 1977, when the first weapon design was successfully achieved. As a result of the war, East Pakistan became the independent country of Bangladesh, the world's fourth most populous Muslim state, on 16 December 1971. Pakistan itself secured the release of Sheikh Mujibur Rahman from the Headquarter Prison and allowed him to return to Dacca. On 19 January 1972, Mujib was inaugurated as the first President of Bangladesh, later becoming the Prime Minister of Bangladesh in 1974.
Around 14 December 1971, with defeat imminent, media reports indicated that Pakistan Army soldiers, the local East Pakistan Police they controlled, razakars and the Shanti Committee carried out systematic killings of professionals such as physicians, teachers, and other intellectuals, as part of a pogrom against the Bengali Hindu minority, who constituted the majority of the urban educated intellectuals. Young men, especially students, who were seen as possible rebels and recruiters, were also targeted by the stationed military, but the extent of casualties in East Pakistan is not known; the issue is controversial and the authors who have written books on the pogrom contradict one another, while the Pakistani government itself denied the charges of its involvement in 2015. R.J. Rummel cites estimates ranging from one to three million people killed. Other estimates place the death toll lower, at 300,000. Bangladesh government figures state that Pakistani forces aided by collaborators killed three million people, raped 200,000 women and displaced millions of others. According to the authors Kenton Worcester, Sally Bermanzohn, and Mark Ungar, Bengalis themselves killed about 150,000 non-Bengalis living in the East. There had been reports of Bengali insurgents indiscriminately killing non-Bengalis throughout the East; however, neither side provided substantial proof for its claims, and Bangladeshi and Pakistani figures contradict each other over this issue. Bihari representatives in June 1971 claimed a higher figure of 500,000 killed by Bengalis. In 2010, the Awami League government decided to set up a tribunal to prosecute the people involved in alleged war crimes and those who collaborated with Pakistan. According to the government, the defendants would be charged with crimes against humanity, genocide, murder, rape and arson. According to John H.
Gill, there was widespread polarisation between pro-Pakistan Bengalis and pro-liberation Bengalis during the war, and those internal battles are still playing out in the domestic politics of modern-day Bangladesh. To this day, the issue of the atrocities and pogroms committed is an influential factor in the foreign relations between Pakistan and Bangladesh. In the aftermath of the war, the Pakistani Government constituted the War Enquiry Commission, headed by Chief Justice Hamoodur Rahman, who was an ethnic Bengali, and composed of senior justices of the Supreme Court of Pakistan. The War Enquiry Commission was mandated to carry out thorough investigations into the intelligence, strategic, political and military failures that caused the defeat in the war. The Commission also looked into Pakistan's political and military involvement in East Pakistan over the period 1947-71. The First War Report, submitted in July 1972, was highly critical of the political misconduct of politicians and of military interference in national politics. Written from a moral and philosophical perspective, the First Report was lengthy and provided accounts considered too unpalatable to be released to the public. Initially there were 12 copies; all were destroyed except one, which was kept and marked "Top Secret'' to prevent a backlash against the demoralised military. In 1976, the Supplementary Report was submitted, a comprehensive report compiled together with the First Report; this report was also marked as classified. In 2000, excerpts of the Supplementary Report were leaked to a political correspondent of Pakistan's Dawn, which published them together with India Today. The First Report is still marked as classified, while the Supplementary Report's excerpts were suppressed by the news correspondents.
The War Report's supplementary section was published by the Pakistan Government, but the report was not officially handed over to Bangladesh despite its requests. The War Report exposed many military failures, from the strategic to the tactical and intelligence levels, while it confirmed the looting, rapes and unnecessary killings by the Pakistan military and their local agents. It laid the blame squarely on Pakistan Army generals, accusing them of debauchery, smuggling, war crimes and neglect of duty. The War Commission had recommended a public trial of Pakistan Army generals on the charges that they had been responsible for the situation in the first place and that they had succumbed without a fight, but no action was ever taken against those responsible, except the dismissal of the chiefs of the Pakistan Army, Pakistan Air Force and Pakistan Navy, and the decommissioning of the Pakistan Marines. The War Commission, however, rejected the charge that 200,000 Bengali girls were raped by the Pakistan Army, remarking, "It is clear that the figures mentioned by the Dacca authorities are altogether fantastic and fanciful,'' and cited the evidence of a British abortion team that had carried out the termination of "only a hundred or more pregnancies''. The Commission also claimed that "approximately 26,000 persons (were) killed during the action by the Pakistan military''. Bina D'Costa states that the War Commission was aware of the military's brutality in East Pakistan, but "chose to downplay the scale of the atrocities committed.'' The second commission, known as the Indo-Pakistani War of 1971 Prisoners of War Investigation and conducted solely by the Pakistani government, was to determine the number of Pakistani military personnel who surrendered, including the number of civilian POWs. The official number of surrendered military personnel was released by the Government of Pakistan soon after the war was over.
On 2 July 1972, the Indo-Pakistani summit was held in Simla, Himachal Pradesh, India, where the Simla Agreement was reached and signed by President Zulfikar Ali Bhutto and Prime Minister Indira Gandhi. The treaty provided assurance to Bangladesh that Pakistan recognised its sovereignty, in exchange for the return of the Pakistani POWs. In a mere five months, India systematically released more than 90,000 war prisoners, with Lieutenant-General A.A.K. Niazi being the last war prisoner to be handed over to Pakistan. The treaty also gave back more than 13,000 km2 of land that the Indian Army had seized in Pakistan during the war, though India retained a few strategic areas, including Turtuk, Dhothang, Tyakshi (earlier called Tiaqsi) and Chalunka of Chorbat Valley, amounting to more than 804 km2. Indian hardliners, however, felt that the treaty had been too lenient to President Bhutto, who had pleaded for leniency, arguing that the fragile stability in Pakistan would crumble if the accord was perceived as overly harsh by Pakistanis and that he would be accused of losing Kashmir in addition to the loss of East Pakistan. As a result, Prime Minister Gandhi was criticised by one section in India for believing Bhutto's "sweet talk and false vows'', while another section claimed the agreement was a success for not letting it fall into the "Versailles Syndrome'' trap. In 1973, India and Pakistan reached another compromise when both countries signed a trilateral agreement with Bangladesh that brought the war prisoners, and the non-Bengali and Pakistan-loyal Bengali bureaucrats and civil servants, to Pakistan. The Delhi Agreement witnessed the largest mass population transfer since the Partition of India in 1947. In 2009, the idea of establishing the International Crimes Tribunal began to gain public support.
The tribunal was formally established in 2010 to investigate and prosecute suspects for the genocide committed in 1971 by the Pakistan Army and their local collaborators, Razakars, Al-Badr and Al-Shams, during the Bangladesh Liberation War. After the war, 41 battle honours and 4 theatre honours were awarded to units of the Indian Army. For bravery, a number of soldiers and officers on both sides were awarded the highest gallantry award of their respective countries: the Indian Param Vir Chakra, the Bangladeshi Bir Sreshtho and the Pakistani Nishan-e-Haider. On 25 July 2011, Bangladesh Swadhinata Sammanona, the Bangladesh Freedom Honour, was posthumously conferred on former Indian Prime Minister Indira Gandhi. On 28 March 2012, President of Bangladesh Zillur Rahman and Prime Minister Sheikh Hasina conferred the Bangladesh Liberation War Honour and the Friends of Liberation War Honour on 75 individuals, six organisations, the Mitra Bahini and the people of India at a special ceremony at the Bangabandhu International Conference Centre, Dhaka. These included eight heads of state or government: former Nepalese President Ram Baran Yadav, the third King of Bhutan Jigme Dorji Wangchuck, former Soviet leaders Leonid Ilyich Brezhnev and Nikolai Viktorovich Podgorny, former Soviet Prime Minister Alexei Nikolaevich Kosygin, former Yugoslav President Marshal Josip Broz Tito, former UK Prime Minister Sir Edward Richard George Heath and former Nepalese Prime Minister Bishweshwar Prasad Koirala. The organisations included the BBC, Akashbani (All India Radio), the International Committee of the Red Cross, the United Nations High Commissioner for Refugees, Oxfam and the Kolkata University Shahayak Samiti. The list of foreign friends of Bangladesh has since been extended to 568 people.
It includes 257 Indians, 88 Americans, 41 Pakistanis, 39 Britons, 9 Russians, 18 Nepalese, 16 French and 18 Japanese.
where are the enzymes that break down maltose and peptides produced
Digestive enzyme - wikipedia Digestive enzymes are enzymes that break down polymeric macromolecules into their smaller building blocks, in order to facilitate their absorption by the body. Digestive enzymes are found in the digestive tracts of animals (including humans) and in the traps of carnivorous plants, where they aid in the digestion of food, as well as inside cells, especially in their lysosomes, where they function to maintain cellular survival. Digestive enzymes of diverse specificities are found in the saliva secreted by the salivary glands, in the secretions of cells lining the stomach, in the pancreatic juice secreted by pancreatic exocrine cells, and in the secretions of cells lining the small and large intestines. Digestive enzymes are classified based on their target substrates: In the human digestive system, the main sites of digestion are the oral cavity, the stomach, and the small intestine. Digestive enzymes are secreted by different exocrine glands including: Complex food substances that are taken in by animals and humans must be broken down into simple, soluble, and diffusible substances before they can be absorbed. In the oral cavity, salivary glands secrete an array of enzymes and substances that aid in digestion and also in disinfection. They include the following: Of note is the diversity of the salivary glands. There are two types of salivary glands: The enzymes that are secreted in the stomach are called gastric enzymes. The stomach plays a major role in digestion, both in a mechanical sense by mixing and crushing the food, and also in an enzymatic sense, by digesting it. The following are enzymes, hormones or compounds produced by the stomach and their respective functions: Of note is the division of function between the cells covering the stomach. There are four types of cells in the stomach: Secretion by the previous cells is controlled by the enteric nervous system.
Distention in the stomach or innervation by the vagus nerve (via the parasympathetic division of the autonomic nervous system) activates the ENS, in turn leading to the release of acetylcholine. Once present, acetylcholine activates G cells and parietal cells. The pancreas is both an endocrine and an exocrine gland: it produces endocrine hormones released into the circulatory system (such as insulin and glucagon) to control glucose metabolism, and it also secretes digestive (exocrine) pancreatic juice, which passes via the pancreatic duct into the duodenum. The digestive or exocrine function of the pancreas is as significant to the maintenance of health as its endocrine function. Two of the cell populations in the pancreatic parenchyma produce its digestive enzymes: Pancreatic juice, composed of the secretions of both ductal and acinar cells, is made up of the following digestive enzymes: The pancreas's exocrine function owes part of its precision to bio-feedback mechanisms controlling the secretion of its juice. The following significant pancreatic bio-feedback mechanisms are essential to the maintenance of the balance and production of pancreatic juice: The following enzymes/hormones are produced in the duodenum: Throughout the lining of the small intestine there are numerous brush border enzymes whose function is to further break down the chyme released from the stomach into absorbable particles. These enzymes are absorbed whilst peristalsis occurs. Some of these enzymes include:
pictures of where the wild things are characters
Where the Wild Things Are (film) - wikipedia Where the Wild Things Are is a 2009 fantasy drama film directed by Spike Jonze. Written by Jonze and Dave Eggers, it is adapted from Maurice Sendak's 1963 children's book of the same name. It combines live-action, performers in costumes, animatronics, and computer-generated imagery (CGI). The film stars Max Records and features the voices of James Gandolfini, Paul Dano, Lauren Ambrose, Forest Whitaker, Catherine O'Hara, and Chris Cooper. The film centers on a lonely eight-year-old boy named Max who sails away to an island inhabited by creatures known as the "Wild Things,'' who declare Max their king. In the early 1980s, Disney considered adapting the book as a blend of traditionally animated characters and computer-generated environments, but development did not go past a test film made to see how the hybrid animation would look. In 2001, Universal Studios acquired rights to adapt the book and initially attempted to develop a computer-animated version with Disney animator Eric Goldberg, but the CGI concept was replaced with a live-action one in 2003, and Goldberg was replaced by Spike Jonze. The film was co-produced by actor Tom Hanks through his production company Playtone and made with an estimated budget of $100 million. Where the Wild Things Are was a joint production between Australia, Germany, and the United States, and was filmed principally in Melbourne. The film was released on October 16, 2009, in the United States, on December 3 in Australia, and on December 17 in Germany. It was met with mostly positive reviews and appeared on many year-end top ten lists, but it underperformed commercially, grossing $100.1 million against its $100 million budget. The film was released on DVD and Blu-ray on March 2, 2010. Max, a lonely eight-year-old boy with an active imagination whose parents are divorced, is wearing a wolf costume and chasing his dog.
His older sister, Claire, does nothing when her friends crush Max 's snow fort with him inside during a snowball fight. Out of frustration, Max messes up her bedroom and destroys a frame he made for her. At school, Max 's teacher teaches him and his classmates about the eventual death of the sun. Later his mother, Connie, invites her boyfriend Adrian to dinner. Max becomes upset with his mother for not coming to the fort he made in his room. He wears his wolf costume, acts like an animal, and demands to be fed. When his mother gets upset, he throws a tantrum and bites her on the shoulder. She yells at him and he runs away, scared by what transpired. At the edge of a pond, Max finds a small boat that he boards. The pond soon becomes an ocean. Max, still in his wolf suit, reaches an island. He stumbles upon a group of seven large, monstrous creatures. One of them, Carol, is in the middle of a destructive tantrum caused by the departure of a female Wild Thing named K.W. As Carol wreaks havoc Max tries joining in on the mayhem, but finds himself facing the suspicious anger of the Wild Things. When they contemplate eating him, Max convinces them that he is a king with magical powers capable of bringing harmony to the group. They crown him as their new king. Shortly after, K.W. returns and Max declares a wild rumpus, in which the Wild Things smash trees and tackle each other. The Wild Things introduce themselves as Carol, Ira, Judith, Alexander, Douglas, the Bull, and K.W. Soon, they pile on one another before going to sleep, with Max at the center. Carol takes Max on a tour of the island, showing him a model he built depicting what he wishes the island looked like. Inspired by this, Max orders the construction of an enormous fort, with Carol in charge of construction. When K.W. brings her two owl friends Bob and Terry to the fort, a disagreement ensues, as Carol feels they are outsiders. 
To release their frustrations, Max divides the tribe into "good guys '' and "bad guys '' for a dirt clod fight, but Alexander is hurt during the game. After an argument between K.W. and Carol, K.W. leaves once again. Max finds Alexander alone in the fort. Alexander reveals that he suspected that Max is not a king with magical powers, but warns him to never let Carol know. Before dawn, Carol throws another tantrum -- this time, about the fort, K.W. 's absence, and the eventual death of the sun, which Max talked with Carol about earlier. When Carol gets angry with Max for not doing a good job as king, Douglas tries explaining that he is "just a boy, pretending to be a wolf, pretending to be a king '', exposing the truth to the rest of the Wild Things. Carol becomes enraged and rips off Douglas 's right arm, though only sand pours from the wound. Carol chases Max into the forest and attempts to eat him. Max is saved by K.W., who hides him in her stomach. Max listens as Carol and K.W. argue over Carol 's behavior. Max finds the crushed remains of Carol 's model island and leaves a token of affection for Carol to find. Max finds Carol and tells him he is going home because he is not a king. The other Wild Things escort Max to his boat. Carol runs to join them after finding Max 's token and arrives in time to see him off. He starts to howl and Max howls back, then all the other Wild Things join in. Carol looks at K.W. and she smiles kindly at him. Returning home, Max is embraced by his mother, who gives him a bowl of soup, a piece of cake and a glass of milk and sits with him as he eats. He watches as she falls asleep. Where the Wild Things Are started its development life in the early 1980s, originally to be an animated feature by Disney that would have blended traditionally animated characters with computer - generated settings.
Animators Glen Keane and John Lasseter (who later moved on to Pixar) had completed a test film to see how the animation hybridizing would work out, but the project proceeded no further. Universal Studios acquired rights to the book 's adaptation in 2001 and initially attempted to develop a computer - animated adaptation with Disney animator Eric Goldberg, but in 2003 the CGI concept was replaced with a live - action one, and Goldberg was replaced with Spike Jonze. After years of interest from various producers, Sendak favored Spike Jonze as director, noting he was "young, interesting and had a spark that none of the others had ''. The film was originally set for release from Universal, and a teaser of the film was attached to the studio 's 2000 adaptation of How the Grinch Stole Christmas. Disagreements between Universal and Sendak over Jonze 's approach to the story led to a turnaround arrangement where the film 's production was transferred to Warner Bros. In 2005, Jonze and Dave Eggers completed a 111 - page screenplay, expanding the original ten - sentence story. On July 8, 2006, production began open auditions for the role of Max. The process took months, but, eventually, Max Records was cast. Academy Award - winning make - up effects supervisor Howard Berger (The Chronicles of Narnia) turned down offers to work on the film four times. Although the book inspired him as a child to work in special effects, he felt filming it was a "horrible idea. '' Jim Henson 's Creature Shop provided the animatronic suits for the Wild Things. Filming began in April 2006 at Docklands Studios Melbourne in Melbourne, Australia. Jonze kept in close consultation with Sendak throughout the process, and the author approved creature designs created by Jim Henson 's Creature Shop. To make the set a more comfortable environment for Max Records, Jonze encouraged the crew members to bring their children to the set.
Some of them can be seen in the film 's classroom scene. Michelle Williams was originally cast as the female Wild Thing K.W., only to leave the project after her voice "did n't match the original vision of how the Wild Thing should sound ''. She was replaced by Lauren Ambrose, and filming continued. In 2008, test footage was leaked onto the internet, leading to mixed reactions. Jonze responded, "That was a very early test with the sole purpose of just getting some footage to Ben, our VFX supervisor, to see if our VFX plan for the faces would work. '' Following early fan outcry over the leaked video and rumored "scared children '' in test audiences, Warner Bros. announced a year - long delay. On February 20, 2008, speculation emerged that Warner Bros. was considering reshooting the entire film. Then - WB president Alan F. Horn responded, "We 've given him more money and, even more importantly, more time for him to work on the film. We 'd like to find a common ground that represents Spike 's vision but still offers a film that really delivers for a broad - based audience. No one wants to turn this into a bland, sanitized studio movie. This is a very special piece of material and we 're just trying to get it right. '' Producer Gary Goetzman followed, "We support Spike 's vision. We 're helping him make the vision he wants to make. '' At the end of 2008, Jonze got together with Framestore in London to complete his movie and work with them to bring to life the performances through their animation and visual effects team. Over the course of the next six months, Jonze spent time with the animators on the floor of the studio as they worked together to realise his intention for the performances that had started many years before with the voices, continued with the suit performances in Australia, and were completed in London 's Soho. For the film 's trailer, Arcade Fire provided a re-recorded version of the track "Wake Up '' from their album Funeral.
The new version is not featured in the actual film or the soundtrack and has never been made available to the public. During the film, various songs can be heard, such as "Hideaway '', "Rumpus '', "Worried Shoes '' and "All is Love '' by Karen O and the Kids. The studio decided not to position the film as a children 's movie and spent 70 % of the advertising budget on broad - based and adult - driven promotion. The film was released in North America in both conventional and IMAX theaters on October 16, 2009. The film earned about $32.7 million in its opening weekend in theaters. It grossed $77.2 million during its theatrical run in the U.S. and Canada, plus $22.8 million internationally. Overall, the studio took a loss as the final budget of the movie was estimated to be around $100 million. Internationally, the film was released in Australia on December 4, 2009; in Ireland and the UK on December 11, 2009; and in Germany on December 17, 2009. It was released in Russia on February 4, 2010. Reception to the film was generally positive. The film holds a 73 % "Fresh '' rating on review website Rotten Tomatoes from 253 reviews with an average score of 6.9 / 10. The site 's critical consensus reads: "Some may find its dark tone and slender narrative off - putting, but Spike Jonze 's heartfelt adaptation of the classic children 's book is as beautiful as it is uncompromising. '' Review aggregation website Metacritic gave the film an average score of 71 out of 100 based on 37 reviews. Lisa Schwarzbaum of Entertainment Weekly gave the film an A declaring "This is one of the year 's best. '' Manohla Dargis of the New York Times wrote that Spike Jonze 's "filmmaking exceeds anything he 's done '' before, while also noting the imaginative visuals and otherworldly feel, along with the fantastic creature effects on the "Wild Things ''.
Peter Travers of Rolling Stone gave the film four stars saying, "For all the money spent, the film 's success is best measured by its simplicity and the purity of its innovation. '' Roger Ebert awarded the film three stars out of four. Some critics have noted the movie 's dark adaptation for children, such as David Denby from The New Yorker saying, "I have a vision of eight - year - olds leaving the movie in bewilderment. Why are the creatures so unhappy? '' Stephanie Zacharek of Salon.com criticized the film 's visual aspect, "Even the look of the picture becomes tiresome after a while -- it starts to seem depressive and shaggy and tired. '' She also stated that "The movie is so loaded with adult ideas about childhood -- as opposed to things that might delight or engage an actual child. '' The Globe and Mail 's Liam Lacey branded the production a "self - consciously sad film. '' Critic A.O. Scott named the film the best of 2009 and placed it at number five on his list of top ten movies of the decade. Warner Bros. submitted the film for consideration for the 2009 award season. There were fears, expressed by production company Warner Bros., that the film was not family friendly and may frighten children; however these fears were not shared by either Jonze or Sendak, and Jonze refused to compromise. Maurice Sendak said after having seen a completed cut of the film, "I 've never seen a movie that looked or felt like this. And it 's (Spike Jonze 's) personal ' this. ' And he 's not afraid of himself. He 's a real artist that lets it come through in the work. So he 's touched me. He 's touched me very much. '' After seeing the finished product, a Warner Bros. executive stated of Jonze, "He 's a perfectionist and just kept working on it, but now we know that at the end of the day he nailed it. '' Film classification agencies have tended to assign "parental guidance '' ratings rather than general or family ratings. 
The MPAA in the United States assessed a PG rating "for mild thematic elements, some adventure action, and brief language ''. A PG rating was also declared in the United Kingdom by the BBFC, citing "mild threat and brief violence ''. In Canada, the film also received a PG rating in Ontario with an alert for frightening scenes, while Quebec awarded a General rating. British Columbia also assessed the film with a G rating with a proviso that it "may frighten young children ''. In Ireland, the film was classified PG, citing "mild '' violence. Similarly, in South Africa, the film received a PG rating with a consumer content Violence indicator, noting there were "moments of mildish menace and poignant themes. '' Australia also applied a PG rating to the film and noted "mild violence and scary scenes ''. The movie 's release generated conflicting views over whether it is harmful to expose children to frightening scenes. Jonze indicated that his goal was "to make a movie about childhood '' rather than to create a children 's movie. Dan Fellman, Warner Brothers ' head of movie distribution, noted that the film 's promotion was not directed towards children, advising parents to exercise their own discretion. In an interview with Newsweek, Sendak stated that parents who deemed the film 's content to be too disturbing for children should "go to hell. That 's a question I will not tolerate '' and he further noted "I saw the most horrendous movies that were unfit for child 's eyes. So what? I managed to survive. '' The film was released as a Blu - ray / DVD / Digital copy combo pack and on DVD on March 2, 2010. The home media release was accompanied by a Canadian - produced live - action / animated short film adaptation of another Sendak work, Higglety Pigglety Pop! or There Must Be More to Life, produced especially for the Blu - ray edition.
A video game based on the film was released on October 13, 2009, for the PlayStation 3, Xbox 360, Wii, and Nintendo DS. The first three versions were developed by Griptonite Games, and the Nintendo DS version by WayForward. All were published by Warner Bros. Games. To coincide with the film 's release, Girl Skateboards (which Jonze co-owns) came out with seven pro-model skateboards with the Wild Things as the board graphics. Lakai also redesigned most of its pro-model and stock shoes in different colors, adding pictures of the Wild Things on the side of some, with Where the Wild Things Are printed on the side of others. UGG Australia also designed limited edition Where The Wild Things Are boots. A series of collectible vinyl dolls of the Wild Things and Max was released from the Japanese company MediCom Toys. Other releases include an eight - inch articulated figure of Max in wolf costume and smaller scale sets of the characters released under their Kubrick figure banner. McSweeney 's published The Wild Things by Dave Eggers, a full - length novel based on the film adaptation.
where was how to lose a guy in ten days filmed
How to Lose a Guy in 10 Days - Wikipedia How to Lose a Guy in 10 Days is a 2003 romantic comedy film directed by Donald Petrie, starring Kate Hudson and Matthew McConaughey. It is based on a short cartoon book of the same name by Michele Alexander and Jeannie Long. Andie Anderson (Kate Hudson) is a writer for a women 's magazine called Composure as the "How to... '' girl. She is bored and wishes she could write more about important things such as politics, economics, religion, and poverty, topics she actually cares about. After Andie 's best friend Michelle (Kathryn Hahn) experiences yet another break - up, Andie is inspired to write a new article titled "How to Lose a Guy in 10 Days ''; she will start dating a guy and eventually drive him away using only the "classic mistakes women make '' in relationships. At the same time, advertising executive Benjamin "Ben '' Barry (Matthew McConaughey) is working on a pitch for a new diamond advertising campaign. When his boss questions Ben 's knowledge about romance, Ben bets he could make any woman fall in love with him if he wanted to. His boss accepts the bet and confirms that if Ben can make a woman fall in love with him before the upcoming company ball, just 10 days away, he will allow Ben to head the advertising for the new diamond company. Ben 's rival co-workers, Judy Spears (Michael Michele) and Judy Green (Shalom Harlow), who were at Composure magazine earlier in the day and are aware of Andie 's new assignment, set Ben up to have him pick Andie as the girl to test his theory on. Ben and Andie meet and soon begin their quests, neither revealing their true intentions. Andie works hard to drive Ben insane and make him break up with her in order to complete her article, but Ben continues to stick around in hopes of making her fall in love with him.
Andie gets Ben knocked out in a movie theater by talking aloud while watching a chick flick, rapidly moves her things into his apartment, acts overly possessive and sensitive and clingy, ruins his boys ' poker night for him and his friends, and takes him to a Celine Dion concert when he was under the assumption he was going to see a New York Knicks basketball game. Ben stays with her despite everything, and after coming very close to breaking up they attend couples counseling, led by Andie 's friend Michelle. They agree, as a solution to their "problems '', to visit Ben 's family in Staten Island for the weekend. While holidaying together, Ben and Andie begin to form a genuine bond, and upon arriving home Ben even refers to Andie as his girlfriend. Andie then tries to explain to boss Lana (Bebe Neuwirth) that she can not continue writing and publishing this article as she has "really got to know this guy '', but Lana remains insistent upon it. Around the same time, Andie and Ben go to the company ball together where Ben 's boss, Phillip (Robert Klein), meets Andie and tells Ben that he "met her, she loves you, you win ''. Seeing Ben 's good news, Judy and Judy are instantly envious and set about to ruin it for their co-worker. They tell his close colleagues, Tony (Adam Goldberg) and Thayer (Thomas Lennon), that Andie knew about the bet all along and was playing along to help Ben win. Tony and Thayer then rush to Andie 's side and beg her to keep quiet, when they do not realize she is still blissfully unaware of the bet. Almost simultaneously, Lana, who is unaware of Ben 's role in Andie 's "How To '' article, reveals Andie 's true intentions to Ben. Upon learning of Ben 's bet, Andie attempts to humiliate Ben in front of everyone at the party, and the pair argue on stage. They go their separate ways before Ben is shown Andie 's article and encouraged to read it. 
She explains in it how she "lost the one person she ever fell for '', and when he hears she quit her job at Composure and is on her way to Washington, D.C. for an interview, he chases her taxi and stops her. After he accuses her of running away, they reveal their true feelings for each other and the film ends with Ben instructing the taxi driver to return Andie 's belongings to her home, and then they kiss. Gwyneth Paltrow was originally going to star as Andie Anderson but later pulled out before pre-production began, and Kate Hudson replaced her. The yellow gown Kate Hudson wore in the movie was designed by celebrity designer Dina Bar - El. The necklace she wears with the yellow gown is called, in the film, the "Isadora Diamond '', named after Isadora Duncan. The 80 - carat yellow diamond in the necklace was designed by Harry Winston and is worth $6 million. How to Lose a Guy in 10 Days received mixed reviews from critics. Rotten Tomatoes gave the film a rating of 42 %, based on 161 reviews, with an average rating of 5 / 10. The site 's critical consensus reads, "Matthew McConaughey and Kate Hudson are charming together, but they ca n't overcome How to Lose a Guy in 10 Days ' silly premise and predictable script. '' Metacritic gave the film a score of 45 out of 100, based on 31 critics, indicating "mixed or average reviews ''. The film was released on February 7, 2003, and earned $23,774,850 in its first weekend. Its final gross was $105,813,373 in the US and $71,558,068 overseas.
will there be a season 18 of hell's kitchen
Hell 's Kitchen (U.S. season 18) - wikipedia Hell 's Kitchen: Rookies vs Veterans is the eighteenth season of the American competitive reality television series Hell 's Kitchen and premiered on September 28, 2018, on Fox. Gordon Ramsay returns as host and head chef, and Season 10 winner Christina Wilson and British MasterChef judge James "Jocky '' Petrie return as the red and blue sous chefs, respectively, alongside maitre d' Marino Monferrato. Season 18 features eight new contestants battling eight returning veterans. For the first time, the winner of Season 18 will receive a position as an executive chef at Gordon Ramsay Hell 's Kitchen Restaurant at Caesars Palace in Las Vegas, Nevada. This will be the first season since Season 1 to not begin with men on the blue team and women on the red team (in fact, by the end of the fourth episode, the opposite would be true due to a dramatic team reshuffle). It is the third of the last four seasons to feature a chef eliminated during service. The signature dish challenge results also reflected the competition rankings, as only one chef from each team (Scott and Jen) scored as low as 2, and those two became the first chefs eliminated from the competition. The intro for Season 18 was recycled from the previous season except for the title card, where it instead shows the rookies on the left side and the veterans on the right side. Sixteen chefs competed in season 18. Sixteen chefs arrive at Hell 's Kitchen, and are greeted by Marino and the Sous Chefs, who serve them butternut squash risotto for lunch. After lunch, Ramsay revealed himself and announced the grand prize for that season. After half of the chefs explain their credentials, the other half reveal themselves to actually be returning chefs from previous seasons, much to the rookies ' shock.
It was then that Ramsay revealed that for the first time, he would be pitting rookies versus veterans, with the rookies cooking in the red kitchen under Sous Chef Christina, and the veterans cooking in the blue kitchen under Sous Chef Jocky. Team challenge / signature dish: For this year, the rookies received the exact ingredients they wanted, while the veterans did not; instead, Ramsay told the veterans that they would be cooking exact copies of the rookies ' signature dishes. After the veterans paired up with the rookies on their dishes, both teams had 45 minutes to cook, using the 1 - 5 scoring system in place since Season 13. On the shrimp & grits round, both T and Motto scored three points, while on the fish stew round, Kanae scored three while Kevin received a perfect five. Roe scored three points over Scott 's two on the black bass round, while Chris and Bret each received four points on the duck breast. On the scallops with parsnip round, both Ariel and Jose scored four points, and on the snapper round, both Scotley and Trev scored three points. On the grilled pork round, while Jen scored two points, Mia scored a perfect five. Finally, on the scallops and cucumber gazpacho round, both Gizzy and Heather scored perfect 5 's, and the challenge ended in a tie at 29 points, the second such occurrence in the signature dish challenge since season 8. This was also the first time under the new scoring format that nobody 's signature dish received a 1. For the tie breaker, Ramsay called up Mia and Kevin to bring their dishes up again, and after a second look, he gave Mia 's dish the edge, and the rookies won their first challenge. Reward / punishment: The rookies celebrated their first challenge win with dinner at Charcoal in Venice Beach with Ramsay and Chef Josiah Citrin. The veterans prepped for opening night by deshelling the sunflower seeds and detailing the mushrooms.
Jen earned her teammates ' ire for costing them the challenge, making duxelle without cleaning the mushrooms, and giving attitude to Sous Chef Jocky. She and Bret started to get on each other 's nerves over their conflicting personalities, but she started to warm up to him when he revealed that his tattoos were in memory of his deceased parents. Challenge -- Part 2: Continuing from the previous episode, Ramsay revealed that they would be saying goodbye to the lobster risotto on the menu, which had been a staple dish since the beginning of the series. After, Ramsay revealed that he would be giving each of the chefs an opportunity to put their own risotto dish on the menu that season, with the winner also receiving a Punishment Pass, a one - use advantage that allows a chef to skip a punishment and join the winning team on their reward. The chefs had 45 minutes to cook their risottos, and Ramsay had the Sous Chefs taste their dishes to pick the top four of both teams. After, the rookies were represented by Chris, Gizzy, Mia, and Scott, while the veterans were represented by T, Roe, Ariel, and Bret. In the end, Bret 's tomato risotto with grilled shrimp and asparagus beat out Mia 's dish, and he won the challenge, although his emotional celebration annoyed the rookies. Before service: Before service began, the Sous Chefs called both teams down to discuss the new menu items, but neither Christina nor Jocky was happy when the rookies arrived late and separated. While prepping for service, Trev was caught improperly marking the Wellingtons, much to Jocky 's irritation, as Christina told the rookies to use the veterans ' arrogance as their advantage. Service: Don McLean, Annie Wersching, Kyle Schmid, and Courtney Sixx were seen at opening night, and a shrimp and pasta appetizer was served tableside by Roe and Scotley.
The veterans started out rough, with T serving a risotto that tasted more of cheese than tomato, but recovered, and while Trev served mushy scallops on a small frying pan, he recovered as well. On entrees, Bret accidentally sliced his lamb when it was not his place to do so, and Jen was late on serving the sauce for the pork. Despite those problems, the veterans finished service on a high note. In the red kitchen, Scott only fired one serving of shrimp for two risottos ordered, and was caught cooking more shrimp for the tableside despite it being Scotley 's responsibility. The problems continued on entrees as Jose served raw halibut, Chris was caught writing down cooking times instead of helping Mia out on meat, and a communication breakdown occurred when nobody could recite an order correctly. After Mia served raw lamb, Ramsay finally had enough of the rookies ' fragmented performance and threw them out of the kitchen, before asking them to name two people for elimination. Elimination: The rookies agreed to nominate Scott and Chris as they were the ones who were the most lost that night, but Ramsay decided to call down Jose as well. During their pleas, Chris revealed that he had been in a major accident that destroyed 90 % of his face and damaged his memory, but Ramsay told Chris that he needed to see him fight back. Scott ended up being eliminated for his lack of focus during service, as well as failing to uphold his impressive resume. Challenge: The chefs were tasked with cooking lunch for the Marine Corps. The menu included entrees that represented the air, land, and sea: chicken parmesan, New York strip and frites, and fish and chips. For the rookies, Mia did not take well to Scotley 's micromanagement and as a result of his misdirection, a paper towel caught fire on the fish station. Gizzy was lectured for waiting for water to boil before putting in pasta.
The veterans started off slow as Jen overdressed salad, leading Heather, who was supposed to be on French fries, to jump on the station much to Jen 's annoyance. Kevin rushed Heather on fries after she forgot to delegate her station, causing an order to be sent back from the blue dining room for being undercooked. Ramsay berated the veterans for not communicating and giving up after Roe served soggy fish, but they rallied behind Kevin 's leadership. Both teams reached their last ticket at the same time, but Ariel sent cold steak, allowing the rookies time to complete their ticket and win their second challenge in a row. Reward / punishment: The rookies were rewarded with a trip to Paramount Ranch, where they would star in their own Western movie called "Hell 's Riders '', alongside Marino. Ramsay offered Bret the chance to join them using his punishment pass from the risotto challenge, and Bret accepted, irritating his teammates, who were tasked with preparing calamari for a tableside appetizer. Jen walked out of prep and broke down in tears after getting in an argument with Ariel and T over how to prepare beurre blanc sauce (ironically, they thought Jen 's from the last prep session was better). Before service began, Ramsay and the sous chefs showed everyone the Hell 's Riders trailer. Service: Aly Michalka was in attendance despite not being mentioned in this service. Mia and Ariel served grilled calamari tableside. Both teams had relatively little trouble with appetizers, except for Motto cooking a risotto off the stove and making it too crunchy, and Trev taking too long to assemble cold appetizers with Ramsay chiding him for taking a sip of water before he served the appetizers. On entrees, Jen served bland mashed potatoes for the first table, argued about it with Heather and Sous Chef Jocky when they tasted for her, and when Ramsay came in to check on the team, aggressively asked him what he was about to yell at her for. 
Scotley provided inconsistent times on his beef wellingtons, which caused Gizzy to overcook halibut, and Ramsay replaced her on the fish station with Jose. Mia entered the kitchen from tableside, helping Kanae catch up on garnish and Jose repeat an order from Sous Chef Christina. Automatic elimination: When Jen brought up too few leeks for two orders of duck and got overwhelmed by her teammates asking her for times, she accused Ramsay of having thrown away some of the garnish to sabotage her. Ramsay angrily dropped the bin on the floor and led the entire blue team into the storeroom while Jen continued to yell at him for disrespecting her, until he was left with no choice but to order Jen to take off her jacket and leave Hell 's Kitchen through the front door. This marks the 9th time a contestant was eliminated during service. After this, the blue team got its entrees out more quickly and neither kitchen made another error. After service, Ramsay announced to both teams that he sent Jen home because he would not tolerate insults against his reputation. Then for the second time in two seasons, both teams were declared winners after an automatic disqualification, in lieu of the nomination process. Unlike most chefs who get eliminated during service, Jen still got her coat hung and picture burned at the end of the episode. Challenge: The sous chefs handed winter jackets to all the contestants. When they went outside, Ramsay had set up an artificial snow slope. He tasked each team to take four runs down the slope in sleds. The blue team finished first, and they were given a 10 - second headstart to grab ingredients for the winter soup challenge, where each contestant had 45 minutes to prepare a soup. Ramsay brought in Traci Des Jardins and Brian Boitano as guest judges, and each judge would rate each soup from 1 - 3 stars. The veterans got off to a strong start as Trev scored 6 stars, Heather scored 8, Ariel scored 7, T scored 8, and Kevin scored 6. 
But Roe and Bret scored the minimum 3 stars due to insufficient seasoning and canned tomatoes, respectively. This left the veterans with a score of 41. For the rookies, Mia scored the maximum 9 stars, Motto scored 6, Gizzy scored a perfect 9, Chris scored the minimum 3, Jose scored 7, and Scotley scored 8 for a total of 42, automatically giving the rookies the win before Kanae 's soup was tasted. Kanae managed to score a perfect 9 as well to make the final score 51 - 41, and it was the rookies ' third challenge win in a row. Reward / punishment: The rookies were rewarded with a day and night of pampering at L'Horizon Resort in Palm Springs and dinner at Sopa, while the veterans had to shovel out the snow and prepare both kitchens for service. Bret took his poor performance very personally, drawing his teammates ' ire. Trev struggled to poach eggs during prep, earning Sous - Chef Jocky 's criticism once again. Service: Hayley Orrantia was in attendance, while Cheryl Hines and Rachel Harris dined at the blue chef 's table and Morgan Spurlock at the red. Gizzy and Bret served tableside clam chowder, but both struggled. The former was criticized for entering the kitchen while attempting to help Mia on flatbreads, while the latter worked slowly and held up his team. Scotley and Mia argued on appetizers, prompting Sous - Chef Christina to intervene. On entrees, Chris forgot about a pork chop for Morgan Spurlock 's table that was put into the oven (by Ramsay) before service opened, and Ramsay found it burned completely black. Kevin served cold New York strip, while Motto overcooked steak but recovered. On the next ticket, Kevin overcooked and undercooked lamb chops while rejecting Trev 's assistance, leading Ramsay to march the entire Blue Team into the pantry. When Kevin served the lamb raw again, Ramsay stepped in to cook it himself. Even though nobody was ejected, Ramsay declared both teams losers due to their careless errors and regression from the previous week. 
Nominations: The blue team nominated Kevin and Trev, while the red team nominated Chris and Gizzy. After questioning the nominees, Ramsay first asked Kevin and Gizzy for their jackets, only to send Kevin to the red team and Gizzy to the blue team. Then he asked for Trev's jacket, only to send him to the red team as well. It seemed as though Chris would be eliminated, but Ramsay merely sent him back in line and warned him it was his last chance. Finally, Ramsay sent Mia and Kanae to the blue team and Bret to the red team, and announced that he was ending the rookies vs. veterans experiment in favor of the traditional battle of the sexes, only in opposite-color jackets (the men are the red team and the women are the blue team). This is the largest team reassignment (six people) in the show's history.
who asserted that high population is not a problem in the 1980s
Paul R. Ehrlich
Paul Ralph Ehrlich (born May 29, 1932) is an American biologist, best known for his warnings about the consequences of population growth and limited resources. He is the Bing Professor of Population Studies in the Department of Biology at Stanford University and president of Stanford's Center for Conservation Biology. Ehrlich became well known for his controversial 1968 book The Population Bomb, which asserted that the world's human population would soon increase to the point where mass starvation ensued. Among the solutions he suggested in that book was population control, to be used, in his opinion, if voluntary methods failed. Ehrlich has been criticized for his opinions; for example, Ronald Bailey termed Ehrlich an "irrepressible doomster". However, Carl Haub observed that Ehrlich's warnings had encouraged governments to change their policies to avert disaster. Ehrlich has acknowledged that some of what he predicted has not occurred, but maintains that his predictions about disease and climate change were essentially correct, and holds to his opinion that overpopulation is a major problem. Ehrlich was born in Philadelphia, Pennsylvania, the son of William Ehrlich and Ruth (Rosenberg) Ehrlich. His father was a shirt salesman, his mother a Greek and Latin scholar. Ehrlich earned a bachelor's degree in zoology from the University of Pennsylvania in 1953, an M.A. from the University of Kansas in 1955, and a Ph.D. from the University of Kansas in 1957, supervised by the prominent bee researcher Charles Duncan Michener. During his studies he participated in surveys of insects in the areas of the Bering Sea and the Canadian Arctic, and then, with a National Institutes of Health fellowship, investigated the genetics and behavior of parasitic mites. In 1959 he joined the faculty at Stanford University, and was promoted to professor of biology in 1966.
By training he is an entomologist specializing in Lepidoptera (butterflies); he published a major paper about the evolution of plants and insects. He was appointed to the Bing Professorship in 1977. He is president of the Center for Conservation Biology at Stanford University. He is a fellow of the American Association for the Advancement of Science, the United States National Academy of Sciences, the American Academy of Arts and Sciences, and the American Philosophical Society. A lecture that Ehrlich gave on the topic of overpopulation at the Commonwealth Club of California was broadcast by radio in April 1967. The success of the lecture brought further publicity, and a suggestion from David Brower, the executive director of the environmentalist Sierra Club, and Ian Ballantine of Ballantine Books that he write a book on the topic. Ehrlich and his wife, Anne Ehrlich, collaborated on the book, The Population Bomb, but the publisher insisted that a single author be credited. Although Ehrlich was not the first to warn about population issues -- concern had been widespread during the 1950s and 1960s -- his charismatic and media-savvy methods helped publicize the topic. The original edition of The Population Bomb began with this statement: "The battle to feed all of humanity is over. In the 1970s hundreds of millions of people will starve to death in spite of any crash programs embarked upon now. At this late date nothing can prevent a substantial increase in the world death rate..." Ehrlich argued that the human population was too great, and that while the extent of disaster could be mitigated, humanity could not prevent severe famines, the spread of disease, social unrest, and other negative consequences of overpopulation. By the end of the 1970s, this prediction had proved incorrect. However, he continued to argue that societies must take strong action to decrease population growth in order to mitigate future disasters, both ecological and social.
In the book Ehrlich presented a number of "scenarios" detailing possible future events, some of which have been cited as examples of errors in the years since. Of these scenarios, Ehrlich has said that although "we clearly stated that they were not predictions and that 'we can be sure that none of them will come true as stated' (p. 72) -- their failure to occur is often cited as a failure of prediction. In honesty, the scenarios were way off, especially in their timing (we underestimated the resilience of the world system). But they did deal with future issues that people in 1968 should have been thinking about." Ehrlich further states that he still endorses the main thesis of the book, and that its message is as apt now as it was in 1968. Ehrlich's opinions have evolved over time, and he has proposed different solutions to the problem of overpopulation. In The Population Bomb he wrote, "We must have population control at home, hopefully through a system of incentives and penalties, but by compulsion if voluntary methods fail. We must use our political power to push other countries into programs which combine agricultural development and population control." Voluntary measures he has endorsed include the easiest possible availability of birth control and abortion. In 1967 he expressed his belief that aid should be given only to those countries that were not considered "hopeless" in terms of feeding their own populations. In their sequel to The Population Bomb, the Ehrlichs wrote about how the world's growing population dwarfs the Earth's capacity to sustain current living standards. The book calls for action to confront population growth and the ensuing crisis: When is an area overpopulated? When its population can't be maintained without rapidly depleting nonrenewable resources (or converting renewable resources into nonrenewable ones) and without degrading the capacity of the environment to support the population.
In short, if the long-term carrying capacity of an area is clearly being degraded by its current human occupants, that area is overpopulated. In this work, the Ehrlichs discuss the "optimal size" for the human population, given current technological realities. They refer to establishing "social policies to influence fertility rates." During a 2004 interview, Ehrlich answered questions about the predictions he made in The Population Bomb. He acknowledged that some of what he had published had not occurred, but reaffirmed his basic opinion that overpopulation is a major problem. He noted that "Fifty-eight academies of science said that same thing in 1994, as did the world scientists' warning to humanity in the same year. My view has become depressingly mainline!" Ehrlich also stated that 600 million people were very hungry, billions were undernourished, and that his predictions about disease and climate change were essentially correct. Retrospectively, Ehrlich believes that The Population Bomb was "way too optimistic". In a 2008 discussion hosted by the website Salon, Ehrlich became more critical of the United States specifically, claiming that it should control its population and consumption as an example to the rest of the world. He has disavowed some of what he said in The Population Bomb. He still thinks that governments should discourage people from having more than two children, suggesting, for example, a greater tax rate for larger families. In 2011, as the world's population passed the seven billion mark, Ehrlich argued that the next two billion people on Earth would cause more damage than the previous two billion, because humanity is increasingly having to resort to more marginal and environmentally damaging resources.
As of 2013, Ehrlich continues to perform policy research concerning population and resource issues, with an emphasis on endangered species, cultural evolution, environmental ethics, and the preservation of genetic resources. Along with Gretchen Daily, he has performed work in countryside biogeography -- that is, the study of making human-disturbed areas hospitable to biodiversity. His research group at Stanford University extensively studies natural populations of the Bay checkerspot butterfly (Euphydryas editha bayensis). The population-related disasters Ehrlich predicted have largely failed to materialize, with population growth rates slowing and new food production technologies increasing the food supply faster than the population. Ehrlich still endorses his general thesis that the human population is too large, posing a direct threat to human survival and the environment of the planet. Critics have disputed Ehrlich's main thesis about overpopulation and its effects on the environment and human society, his solutions, and some of the specific predictions he has made since the late 1960s. One criticism concerns Ehrlich's allegedly alarmist and sensational statements and inaccurate "predictions". Ronald Bailey of Reason magazine has termed him an "irrepressible doomster... who, as far as I can tell, has never been right in any of his forecasts of imminent catastrophe." On the first Earth Day in 1970, he warned that "[i]n ten years all important animal life in the sea will be extinct. Large areas of coastline will have to be evacuated because of the stench of dead fish." In a 1971 speech, he predicted that "By the year 2000 the United Kingdom will be simply a small group of impoverished islands, inhabited by some 70 million hungry people." "If I were a gambler," Professor Ehrlich concluded before boarding an airplane, "I would take even money that England will not exist in the year 2000.
" When this scenario did not occur, he responded: "When you predict the future, you get things wrong. How wrong is another question. I would have lost if I had taken the bet. However, if you look closely at England, what can I tell you? They're having all kinds of problems, just like everybody else." Ehrlich wrote in The Population Bomb that "India couldn't possibly feed two hundred million more people by 1980." Carl Haub of the Population Reference Bureau has replied that it was precisely this alarmist rhetoric that prevented the catastrophes of which Ehrlich warned: "It makes no sense that Ehrlich is now criticized as being alarmist because his dire warnings did not, in the main, come true. But it was because of such warnings from Ehrlich and others that countries took action to avoid potential disaster." During the 1960s and 70s, when Ehrlich made his most alarming warnings, there was a widespread belief among experts that population growth presented an extremely serious threat to the future of human civilization, although differences existed regarding the severity of the situation and how to respond to it. Dan Gardner argues that Ehrlich has been insufficiently forthright in acknowledging the errors he made, while being intellectually dishonest or evasive in taking credit for things he claims he got "right". For example, he rarely acknowledges the mistakes he made in predicting material shortages, massive death tolls from starvation (as many as one billion in the publication Age of Affluence), or disastrous effects on specific countries. Meanwhile, he is happy to claim credit for "predicting" the rise of AIDS and global warming. However, in the case of disease, Ehrlich had predicted the increase of a disease based on overcrowding, or the weakened immune systems of starving people, so it is "a stretch to see this as forecasting the emergence of AIDS in the 1980s.
" Similarly, global warming was one of the scenarios that Ehrlich described, so claiming credit for it while disavowing responsibility for failed scenarios is a double standard. Gardner believes that Ehrlich is displaying classic signs of cognitive dissonance, and that his failure to acknowledge obvious errors of his own judgment renders his current thinking suspect. Barry Commoner has criticized Ehrlich's 1970 statement that "When you reach a point where you realize further efforts will be futile, you may as well look after yourself and your friends and enjoy what little time you have left. That point for me is 1972." Gardner has criticized Ehrlich for endorsing the strategies proposed by William and Paul Paddock in their book Famine 1975!. They had proposed a system of "triage" that would end food aid to "hopeless" countries such as India and Egypt. In The Population Bomb, Ehrlich suggests that "there is no rational choice except to adopt some form of the Paddocks' strategy as far as food distribution is concerned." Had this strategy been implemented for countries such as India and Egypt, which were reliant on food aid at that time, they would almost certainly have suffered famines. Instead, both Egypt and India have greatly increased their food production and now feed much larger populations without reliance on food aid. Another group of critics, generally of the political left, argues that Ehrlich emphasizes overpopulation too much as a problem in itself, instead of the distribution of resources. Barry Commoner argued that Ehrlich emphasized overpopulation too much as the source of environmental problems, and that his proposed solutions were politically unacceptable because of the coercion they implied, and because they would cost poor people disproportionately. He argued that technological, and above all social, development would result in a natural decrease of both population growth and environmental damage.
Ehrlich denies any type of racism, and has argued that if his policy ideas were implemented properly they would not be repressive. Julian Simon, a cornucopian economist, argued that overpopulation is not a problem in itself, and that humanity will adapt to changing conditions. Simon argued that eventually human creativity will improve living standards, and that most resources are replaceable. Simon stated that over hundreds of years, the prices of virtually all commodities have decreased significantly and persistently. Ehrlich termed Simon the proponent of a "space-age cargo cult" of economists convinced that human creativity and ingenuity would create substitutes for scarce resources, and reasserted the idea that population growth was outstripping the earth's supplies of food, fresh water, and minerals. This exchange resulted in the Simon-Ehrlich wager, a bet made with Simon in 1980 about the trend of resource prices over a ten-year period. Ehrlich was allowed to choose ten commodities that he predicted would become scarce and thus increase in price. Ehrlich chose mostly metals, and lost the bet, as their average price decreased by about 30% over the next 10 years. Simon and Ehrlich could not agree on the terms of a second bet. Ehrlich has argued that humanity has simply deferred the disaster through the use of more intensive agricultural techniques, such as those introduced during the so-called Green Revolution. Ehrlich claims that increasing populations and affluence are increasingly stressing the global environment, due to such factors as loss of biodiversity, overfishing, global warming, urbanization, chemical pollution, and competition for raw materials. He maintains that, given growing global incomes, reducing consumption and human population is critical to protecting the environment and maintaining living standards, and that current rates of growth are still too great for a sustainable future.
Ehrlich was one of the initiators of the group Zero Population Growth (renamed Population Connection) in 1968, along with Richard Bowers and Charles Remington. He and his wife Anne were part of the board of advisers of the Federation for American Immigration Reform until 2003. He is currently a patron of Population Matters (formerly known as the Optimum Population Trust). Ehrlich has spoken at conferences in Israel on the issue of desertification. He has argued that "True Zionists should have small families". Ehrlich has been married to Anne H. Ehrlich (born Anne Fitzhugh Howland, 1933) since 1954; he and Anne are parents to one child, Lisa Marie. Ehrlich has said he has had a vasectomy.
who established music education in the boston schools as part of every child's education
Music education in the United States
Music education in the United States can be traced through historical documentation to the colonial era. Among the Native Americans prior to European and African settlement, music education was entirely oral. The earliest systematic music education in the country was centered on the training of singers for Protestant church services, to lead the congregation in psalm-singing. In the 18th century, the first singing schools in the country were founded, and a number of legendary traveling singing masters traveled New England, teaching in barns, schoolhouses, and other informal locations; these masters included Francis Hopkinson and William Billings. By the end of the century, more formal singing schools in cities like Savannah, Philadelphia, and Boston became social singing societies. Public education in the United States first offered music as part of the curriculum in Boston in the 1830s, and it spread through the help of singing teacher Lowell Mason, after he successfully advocated it to the Boston School Committee in 1838. The committee ultimately decided to include music as a curricular subject because it was of a moral, physical, and intellectual nature. Music was considered moral because it played such a part in religion, as well as because it had been documented to produce "happiness, contentment, cheerfulness, and tranquility." It was of a physical nature because singing was exercise for the lungs. The committee justified music's intellectual nature by stating that it had been studied as a part of the quadrivium in the Middle Ages, and that it "contributes to memory, comparison, attention, and intellectual faculties" (pp. 13-14). Another advocate of music in public education was the Swiss educational reformer Johann Heinrich Pestalozzi.
Pestalozzi believed that nature was the ultimate and original source of knowledge, so his educational theories placed a high value on sensory, kinesthetic, and active learning. He felt students should start with simple concepts in all subjects and move later to more complex ideas. Pestalozzi's method was one of the earliest that could be considered "student-centered learning", and his ideas of discipline and a student-teacher rapport based on love and trust were markedly different from the common practice of corporal punishment at the time. The first music educator to use Pestalozzian ideas in teaching music was Hans Negeli, a colleague of Pestalozzi in Switzerland. His Pestalozzian approach to music was brought to the United States, translated, and popularized by William Channing Woodbridge, Elam Ives, and Lowell Mason. This approach prized active and sensory learning, taught sounds before signs, separated music into melody, rhythm, and expression, and moved from the simple to the complex within each element. Many of these ideas are found in later established music teaching methods, such as Orff-Schulwerk, Kodaly, and Dalcroze Eurhythmics. Music education, primarily vocal, remained most common in women's schools, though many private academies also existed, offering boys and girls instruction in orchestral instruments like the violin, viola, cello, and piano. In the mid-19th century, educator Luther Whiting Mason established music education in the schools of Cincinnati and became a prominent author of textbooks. His National Music Course, first published in 1870, was a widely adopted standard part of many American curricula. Music education continued to expand across the country, and gained respect as an essential part of educational development. There was a music section in the National Education Association by the 1890s.
After the Civil War, pragmatism and the scientific aspects of education -- sequential skill building, accurate evaluation, examinations, and systematic teaching methods -- were popular. Music educators' responses showed that music could be studied scientifically through the use of different methodologies, systematic textbooks and graded music series, and instructional material for teachers. The scientific and more pragmatic goals of education in the nineteenth century had a profound effect on the development of music education in the schools. Two main methodologies used to teach music were the "rote" method and the "note" method. The rote method followed many Pestalozzian ideologies. Songs were taught first, and musical ideas were presented later, one at a time, carefully and systematically. Singing by rote was the basis for this methodology. Lowell Mason authored what is believed to be the first music series using the rote method; his The Song Garden from 1864 set the stage for other rote methodologies of the late nineteenth century. Luther Whiting Mason was a prominent name in rote methodology. Luther Mason was employed in Cincinnati schools but later moved to Boston to be the "superintendent of music in the primary schools" (p. 195). Luther Mason wrote school textbooks using rote methodology. The National Music Course, published by Ginn in 1870, had seven books presented in a sequential approach of using rote songs to teach music reading (p. 196). Luther Mason included very detailed lesson plans for the classroom teacher, since at the time music was taught by the classroom teacher but overseen by a music specialist. The series was designed for fifteen minutes of music instruction each day given by the classroom teacher and overseen by a music educator once per week. The book series was so popular that Luther Mason was invited to apply his methods in Japan and Germany.
The rote method was less favored than the "note" method later in the nineteenth century; however, the debate continues today. Examples of the note method are Joseph Bird's 1861 Vocal Music Reader and Benjamin Jepson's three-book series using "note" methodology. The Elementary Music Reader was published in 1871 by the Barnes Company, one year after Luther Mason's The National Music Course. Benjamin Jepson was a military man turned music teacher in New Haven after an injury in the war. His music textbooks had exercises and songs presented systematically for the goal of music reading and sight-singing. Jepson later published two revisions of his series under the names The Standard Music Reader in 1888 and The New Standard Music Reader in 1904. Following in Jepson's footsteps of note methodology were Hosea Edson Holt and John Wheeler Tufts, who wrote The Normal Music Course, published in 1883 (twelve years after Jepson's The Elementary Music Reader). This series had five books, all geared toward sight-singing and reading music. A few other note-methodology textbooks were published, showing the seriousness of music reading as a scientific and pragmatic study. These included The Graded School Singer by Blackman and Whittemore (1873), and the Cincinnati schools' The Young Singer (1860), The Young Singer's Manual (1866), and the Cincinnati Music Reader (1893) (p. 207). The culmination of the scientific method and note-methodology advances was presented in Thomas Tapper and Frederick Ripley's The Natural Music Course, published in 1895. The book focused on a no-nonsense systematic approach to music literacy to develop beauty in singing. It emphasized the systematic and pragmatic delivery of materials. Music education in the United States took a big turn with the creation of the Music Supervisors National Conference.
The first meeting of the Music Supervisors Conference was held on April 10-12, 1907, in Keokuk, Iowa, at Westminster Presbyterian Church. A music educator by the name of Philip C. Hayden made the first meeting possible by sending invitations and announcing the meeting in the Music School Monthly, of which Hayden was the founder and editor. The gathering was primarily meant for educators to come observe new teaching techniques in rhythm and to observe Hayden's music students. During the three-day convention, music demonstrations took place, provided by Hayden and his students. Informal discussions on current topics in music education also took place during the convention. Future conventions and clinics would be based on this model. Throughout the convention, many educators discussed the importance of having a more permanent organization dedicated to music supervisors and teaching techniques. On the last day of the convention, a forum of sixty-nine music supervisors voted to have another convention and became charter members of an organization that would, for the time being, remain nameless. The Music Supervisors National Conference was officially established during the third meeting of the organization, in Cincinnati, Ohio, in 1910, with the adoption of a constitution and bylaws. As the role of the music supervisor changed into more of an administrative position, the conference began to focus primarily on the teaching methods used in the classroom. At the 1934 Chicago meeting, members decided to change the name to the Music Educators National Conference. Since the inception of the Music Supervisors National Conference, the organization has worked diligently to make sure that every student has access to music instruction in the public school system provided by a qualified music teacher. Vocal instruction dominated public schooling at all levels; instrumental education was handled largely through private enterprise until the early 20th century.
Inspired by the band music of Frederick Innes, John Philip Sousa, and others, many schools offered orchestral instrument bands. This accelerated following World War I, when many soldiers returned with knowledge of and interest in the band music they had learned as soldiers. The first formal school for music educators was founded in 1884, in Potsdam, New York, by Julia Ettie Crane, but Oberlin Conservatory in Ohio became, in the 1920s, the first school to offer a four-year degree in music education. Bands during the mid-to-late nineteenth century were an integral part of every community. These bands would march in parades, provide free concerts, support soldiers, and play for those in hospitals. During the golden age of bands, which lasted from 1865 to 1917, there were more than 10,000 brass bands in existence across the country. Band membership consisted mostly of men, but photographs from the period indicate that there were also all-women bands. The female bands continued their popularity into the twentieth century and influenced the evolution of the high school band from entirely male to one that integrated women into the programs. High schools often housed the standard male band, but also often included a female band. There were also female bands created to support industry. With the beginning of World War II, female bands expanded into women's military bands. These bands were created to entertain female troops, sell war bonds, and perform at concerts, graduations, dances, parades, and hospitals. The WAC (Women's Army Corps) bands were used to entertain injured soldiers returning from the war. One Ohio music educator, Joan A. Lamb, made great contributions to the world of military bands. Lamb was a public music teacher until enrolling in the Women's Army Auxiliary Corps, later known as the WAC. In her basic training, she was asked to become a candidate for officer, but she insisted she had joined to play in the WAC band.
She was even sent to a psychiatrist because of her persistence in the effort to become a member of the band. Finally, she located the WAC band director, auditioned for him with her oboe, and was immediately reassigned to the band. Lamb held several music positions within the band, which was a first for women. She graduated from Army Music School, directed the 400th WAC Band, started an African American WAC band, and performed in the Armed Forces Radio Orchestra. After her service in the WAC, she returned to her life as a music teacher, educator, and administrator in Los Angeles public schools, where she served for 30 years. African American women also wanted to serve in a WAC band, and the 404th WAC Band was created; the WAC was the only branch of the military that allowed African American female bands. The Coast Guard also created its own version of the female band. The SPAR Band played for the troops and performed many of the same activities as the WAC bands. After the war, the SPAR Band was disbanded and the Coast Guard again became all male. After the war, the female SPAR band members became teachers, performers, and parents. Some female members of the WAC and SPAR military bands used the G.I. Bill to go to college, while others continued performing professionally after the war. During the war, women's swing bands also became popular, replacing their male counterparts serving in the war effort. These swing bands were highly successful and were well received by a public longing for a diversion from the war effort. The Ingenues was an all-girl jazz band popular in the late 1920s through the 1930s; the band would often perform in vaudeville and variety theaters. Anna Mae Winburn was an African American leader of an all-girl jazz band, the International Sweethearts of Rhythm. Frances Klein was another famous female instrumentalist of the '30s and '40s who played in Kermit Dart's All-Girl Band, under the direction of Irene Vermillion.
The women's military bands and post-war all-girl bands opened the door to more female participation in instrumental music. Through the efforts of these pioneering women, perspectives changed as to women's purpose and reliability. Suddenly, female bands were found to be as entertaining as male bands, and women found a place and purpose in the entertainment world. Women who had been thought capable only of being mothers were leaders in the music world. Women's equality began to emerge during this time in history, and music was a primary avenue for the movement. These women of the past created the environment that the women of today now enjoy: one of fairness and the ability to seek success in all career areas. Women of today attend college and may become professionals in the work force. Women like Joan Lamb conquered what had in the past seemed like a forbidden world. The entrance of the United States into World War I (1917-1918) prompted the Wilson administration to promote a "patriotic mind-set". Community singing of patriotic songs such as "America" and "The Star-Spangled Banner" was a popular outlet for citizens as a way to promote "strong community efforts of all kinds" and also helped immigrants learn English. In addition to community singing, concert bands and marching bands were used to promote patriotism for the "maintenance of civilian morale". College and university marching bands were also culturally influential during and after World War I, especially in the Midwest of the United States. Following the war, members of these collegiate bands were looking for ways to "develop good will, fellowship and understanding... and recognize the value of dedicated leadership." Two leading band service organizations were established to fit that calling: the Kappa Kappa Psi fraternity (ΚΚΨ) and the Tau Beta Sigma sorority (ΤΒΣ).
Kappa Kappa Psi was the first of those organizations, established on November 27, 1919, at Oklahoma A&M College. The strong sense of patriotism from World War I was wearing off in the U.S., and the band members of the university wanted to continue to advocate for band music. Ten collegiate band members, including band leader William A. Scroggs, were selected by the director of the ensemble, Bohumil "Boh" Makovsky. The fraternity quickly became national with the addition of the University of Washington and Montana State College in 1920. Since 1919, Kappa Kappa Psi members have advocated for, supported, and served over 200 higher education institutions, with around 5,000 active members each year. Famous fraternal brothers include John Philip Sousa, Karl King, and William Revelli. Kappa Kappa Psi today has five distinctive purposes: (1) promoting the existence and welfare of secondary school bands and cultivating respect for their endeavors; (2) honoring outstanding band members through fraternal membership; (3) stimulating campus leadership and respect through positive conduct; (4) fostering a positive bond among collegiate bands and a high level of performance achievement; and (5) providing a positive social experience to all involved in college bands or other musical organizations. Collegiate bands in the 1920s were the domain of young men; women were rarely involved, if at all. However, as the progressive movement developed in the U.S., bands became less militaristic and accepted more women in the 1930s. Even so, some higher education bands held out much longer as men-only ensembles: Michigan State University did not admit women into the Spartan Marching Band until 1972. Tau Beta Sigma sorority was established on March 26, 1946, twenty-seven years after Kappa Kappa Psi.
The charter institution was Oklahoma State University (renamed from Oklahoma A&M), after the organization decided it would be easier to start in Oklahoma rather than Texas. A major circumstance behind creating a band service organization for women was simply that more women were now involved with university bands in the United States. Wava Banes, along with other women classmates at Texas Tech University, approached their director, D.O. Wiley, about forming a "group of bandswomen" in 1937. The local organization was formalized as Tau Beta Sigma and structured after the Kappa Kappa Psi fraternity. Tau Beta Sigma petitioned Kappa Kappa Psi to become a chapter under the national fraternity in 1943; however, a complete constitution rewrite and difficulties associated with the U.S. entry into World War II led Kappa Kappa Psi to suggest that Tau Beta Sigma form its own organization, just as the fraternity had done in 1919. Due to complications with Texas corporation laws in 1945, A. Frank Martin of Kappa Kappa Psi suggested using a similar women's group at Oklahoma State University (OSU) as the founding chapter for Tau Beta Sigma. One month after OSU's charter was granted, the women traveled back to Texas to install the founding women of Tau Beta Sigma. Kappa Kappa Psi convened in 1947 and officially accepted Tau Beta Sigma as its sister sorority. Both organizations grew in the post-World War II era. Society was still evolving and changing, and women's rights came front and center in 1972 with the passage of Title IX. This law "requires gender equity for boys and girls in every educational program that receives federal funding." Because both groups operated within federally funded educational programs, women were allowed to join Kappa Kappa Psi and men were allowed to join Tau Beta Sigma. Kappa Kappa Psi and Tau Beta Sigma are still active today and strive to promote the bands with which they are associated, as their founders did over half a century ago.
Music is generally optional in high school in the United States, and may or may not be required in middle or junior high school. Concert bands and marching bands are commonly offered, as are choirs and theatrical performances that often include music. In 2015, a study of 2010 data was released showing that, according to the U.S. Department of Education, 40 percent of high schools don't require coursework in the arts for graduation. More than 8,000 public schools in the US were without music programs as of 2010, and across the country 1.3 million elementary school students don't have access to a music class. According to the US Department of Education, the core academic subjects studied in schools are currently English, reading or language arts, mathematics, science, foreign languages, civics and government, economics, arts, history, and geography. In order to teach a core subject in the United States, one must be a Highly Qualified Teacher (HQT), meaning one must have a bachelor's degree from a four-year institution, be licensed in the state in which one wishes to teach, and be fully competent in one's subject area; however, the specific competency requirements vary from state to state. Historically, music and fine arts had not been part of the core curriculum in U.S. schools. In July 2015, however, the United States Senate passed a bipartisan revision naming music and art core subjects in the curriculum under the Every Child Achieves Act. The core subjects that were added are "technology, engineering, computer science, music, and physical education." This was a reaction against the No Child Left Behind Act, which many United States education advocates felt had narrowed the subjects incorporated into the core curriculum. The No Child Left Behind Act was originally entitled the Elementary and Secondary Education Act of 1965, but was renamed No Child Left Behind in 2002.
The National Association for Music Education, or NAfME, is an "organization of American music educators dedicated to advancing and preserving music education as part of the core curriculum of schools in the United States". Founded in 1907, NAfME comprises more than 60,000 people who advocate for the benefits of music and arts education for students at the local, state, and national levels. NAfME published an enthusiastic press release about the bipartisan Senate revision and the impact it believes the measure will have on the United States core curriculum. Music is implemented as an academic subject in schools around the world, in places such as Greece, Germany, Slovenia, Spain, India, and parts of Africa; this is not a comprehensive list, as music is considered a cultural necessity in many countries worldwide. Although NAfME addresses the plan for implementing music and arts as core subjects at the national level, the fulfillment of this revision of the Every Child Achieves Act varies from state to state. As of 2014, forty-one states have an arts education requirement at all levels, but only seventeen of these states have programs with deliberate assessment policies. Twenty-seven of the fifty United States consider the arts a core subject. Georgia and Arkansas have very specific outlines for music, while Alaska, Colorado, Hawaii, Michigan, and D.C. have no arts instruction requirements for any level of schooling. The National Coalition for Core Arts Standards (NCCAS) has a series of performance assessments entitled Model Cornerstone Assessments, or MCAs, which have been tried out in high schools as a pilot program in recent months.
In addition to this, NAfME also has nine National Music Education standards, which include: "singing, alone and with others, a varied repertoire of music; performing on instruments, alone and with others, a varied repertoire of music; improvising melodies, variations, and accompaniments; composing and arranging music within specified guidelines; reading and notating music; listening to, analyzing, and describing music; evaluating music and music performances; understanding relationships between music, the other arts, and disciplines outside the arts; and understanding music in relation to history and culture." Many policymakers and advocates for music and the arts as part of the core curriculum believe that holding students in music and arts programs to high standards will give them the creative outlook now seemingly required in the workforce. Since music has traditionally been viewed as a subject outside academia and has been incorporated into schools as a secondary subject, or often as an elective, there is limited research on the classroom benefits of music as a core subject. Many researchers have explored the benefits both of listening to music passively and of pursuing music actively, as with learning an instrument. The benefits of music in the classroom and its effects on brain development, academic performance, and practical life skills have been examined in research by Jenny Nam Yoon. She concluded that both hemispheres of the brain are stimulated when music is played and that the corpus callosum, the bridge that connects the two hemispheres, is larger in musicians' brains. The effects of strictly listening to music have long been explored under the name of the "Mozart effect," which is known to cause a "small increase in spatial-temporal reasoning".
As seen with the Mozart effect, listening to music has been shown to affect the brain and mood, as well as spatial-temporal reasoning, but does not have any long-term benefits. A 1981 study at Mission Viejo High School found that music students had a higher GPA than students who did not participate in music (3.59 vs. 2.91). Other studies have identified music as an enrichment activity that increases self-confidence, discipline, and social cohesion, in addition to its academic benefits. Although the US Senate has endorsed music as a core subject, many, including Senator (and former Secretary of Education) Lamar Alexander, argue that music and the arts are extracurricular. Many also believe that amid federal budget cuts, music and the arts would be the first to go because they are not part of the traditional foundation of the core curriculum in America. The benefits of music as a core subject, and its impact on the education system through the arithmetic, language, concentration, and other skills involved, still have to be assessed before conclusions can be drawn about the concrete, measurable impact music and the arts have on children in the United States public school system. Elementary schools in the United States typically offer music classes several times a week, with classes ranging from thirty to forty-five minutes in length. Beginning in about fourth grade, performance opportunities are often provided in the form of choirs or orchestral (especially wind) bands. There are several developed teaching methods intended for use in elementary schools. The Kodály Method was developed in Hungary by Zoltán Kodály. The Orff-Schulwerk Method was developed by Carl Orff, the German composer who wrote Carmina Burana. The Dalcroze-Eurhythmics Method was developed in Switzerland by Émile Jaques-Dalcroze, who was teaching at the Geneva Conservatory at the time. All three methods place an emphasis on activity and learning by doing.
The Kodály Method is known best for its use of solfège syllables and corresponding hand signals. The Orff-Schulwerk Method is most famous for its use of varying sizes of xylophones and glockenspiels, known as "Orff instruments." The Dalcroze-Eurhythmics Method's most visible characteristic is its use of movement to music, ideally live music. In a report by the Organisation for Economic Co-operation and Development (OECD), it is explained that students typically spend 7,751 hours in school across the primary and lower secondary levels. Of these 7,751 hours, the schools surveyed for the report spent eleven percent of the year on the arts at the elementary level and only eight percent at the lower secondary level. The report also explains that the arts classes combined receive almost the same amount of time as physical education classes, between eight and nine percent at both levels of schooling. ("How") The United States Department of Education conducted a survey of 1,201 secondary schools during the 2009-2010 school year, Parsad explains. This survey reported that the majority of secondary teachers teaching the different arts classes were specialists in their field (Parsad). In a secondary investigation, the United States Department of Education concluded that 94 percent of elementary schools offered instruction in music and 83 percent offered instruction in the visual arts during the same time period. Drama and dance classes were offered at only three and four percent, respectively, of the elementary schools involved in the study, but more than half of the schools incorporated dance into other subjects (Parsad). Art and music classes were offered to elementary students at least once per week, Parsad explains. Throughout the history of music education, many music educators have adopted and implemented technology in the classroom. Alice Keith and D.C.
Boyle were said to be the first music educators in the United States to use the radio for teaching music. Keith wrote Listening in on the Masters, a broadcast music appreciation course. Another advocate who promoted the use of technology was Marguerite V. Hood, who was born on March 14, 1903, in Drayton, North Dakota. Hood graduated from high school early, at the age of sixteen. She then attended Jamestown College in Jamestown, North Dakota, where she graduated with degrees in romance languages and music and minors in history and English. Hood's professional career led her to Montana in 1923, where she pursued teaching, writing, and public speaking. In 1930, Hood became Montana's second state music supervisor. During Hood's teaching career, the radio was used as an educational tool. Montana received poor radio reception because of interference from the mountains, so Hood created local radio station broadcasts, beginning the music education radio broadcast project Montana School of the Air in 1937. These broadcasts aired weekly on four radio stations. Educational radio broadcasts received positive reviews because they allowed students to be reached all over the United States during the unstable years of the Great Depression; radios were cost-effective and could reach isolated, rural schools. The National Broadcasting Company (NBC) played a major role in music education, broadcasting musical examples for students enrolled in public school. Walter Damrosch directed the radio program Music Appreciation Hour; for this broadcast, teachers were able to obtain the musical selections in advance, along with student notebooks and teacher instruction manuals. Two other popular music broadcast programs were Alice in Orchestralia and the Standard Symphony Hour. These broadcasts were an innovative teaching strategy that promoted and encouraged active listening and music appreciation.
Prior to the radio, listening to music was limited to live performances and to the teacher's ability to play music, specifically on the piano. The impact of the radio in the 1930s can be compared to that of more recent technology, such as iPods, compact discs, and computers. Technology has become much more widely available in the classroom since the 1930s, and today children have access to many musical devices and options that were not available then. In essence, technology has been used in music classrooms throughout the United States with the intent of improving the quality of music education for students.
"How Much Time do Primary and Lower Secondary Students Spend in the Classroom?" OECD.org. April 2014. Web. 20 Nov. 2014. <http://www.oecd.org/education/skills-beyond-school/EDIF%202014--N22%20%28eng%29.pdf>
Parsad, Basmat, and Jared Coppersmith. "Arts Education." U.S. Department of Education. 2014. Web. 20 Nov. 2014. <http://nces.ed.gov/pubs2012/2012014rev.pdf>
who is the saint whose body never decomposed
Incorruptibility - wikipedia Incorruptibility is a Roman Catholic and Eastern Orthodox belief that divine intervention allows some human bodies (specifically saints and beati) to avoid the normal process of decomposition after death as a sign of their holiness. Bodies that undergo little or no decomposition, or delayed decomposition, are sometimes referred to as incorrupt or incorruptible. Incorruptibility is thought to occur even in the presence of factors which normally hasten decomposition, as in the cases of saints Catherine of Genoa, Julie Billiart, and Francis Xavier. In Roman Catholicism, if a body is judged incorruptible after death, this is generally seen as a sign that the individual is a saint. Canon law allows inspection of the body so that relics can be taken and sent to Rome. The relics must be sealed with wax, and the body must be replaced after inspection. These ritual inspections are performed very rarely and can only be carried out by a bishop in accordance with canon law. A pontifical commission can authorize inspection of the relics and demand a written report. After solemn inspection of the relics, it can be decided that the body be presented in an open reliquary and displayed for veneration. Catholic law allows saints to be buried under the altar, so that Mass can be celebrated above the corpse. The relics of Saint Bernadette were inspected multiple times, and reports by the church tribunal confirmed that the body was preserved. The opening of the reliquary was attended by multiple canons, the mayor, and the bishop in 1919, and repeated in 1925. Not every saint, however, is expected to have an incorruptible corpse. Although incorruptibility is recognized as supernatural, it is no longer counted as a miracle in the recognition of a saint. Embalmed bodies are not recognized as incorruptibles.
For example, although the body of Pope John XXIII remained in a remarkably intact state after its exhumation, Church officials remarked that the body had been embalmed and that, additionally, there was a lack of oxygen in his sealed triple coffin. Incorruptibility is seen as distinct from the good preservation of a body, or from mummification. Incorruptible bodies are often said to have the odour of sanctity, exuding a sweet or floral, pleasant aroma. In the Eastern Orthodox Church, a distinction is made between natural mummification and what is believed to be supernatural incorruptibility. While incorruptibility is not generally deemed a prerequisite for sainthood, there are a great number of Eastern Orthodox saints whose bodies have been found to be incorrupt and are greatly venerated among the faithful. The saints and other Christian holy men and women whose bodies are said to be or to have been incorrupt have been catalogued in The Incorruptibles: A Study of the Incorruption of the Bodies of Various Catholic Saints and Beati, a 1977 book by Joan Carroll Cruz. During marble excavations on the Appian Way in the spring of 1485, workers found three marble coffins. In one, twelve feet underground, was the corpse of a young woman, said to have looked as if it had been buried that day, despite being about 1,500 years old. The corpse attracted crowds of more than 20,000 spectators in the first few days, many of whom believed it to be that of Tullia, daughter of Cicero, whose epitaph was on one of the tombs. The body of Blessed Mary of the Divine Heart Droste zu Vischering, found to be incorrupt by the Catholic Church. The body of Saint Bernadette of Lourdes with wax face and hand coverings, declared to appear incorrupt by a committee in 1909 (subsequent exhumations indicated corruption) (b. January 7, 1844 -- d. April 16, 1879). The body of Saint John Mary Vianney wearing a wax mask, found to be incorrupt by the Catholic Church (b. 8 May 1786 -- d. 4 August 1859).
The body of Saint Padre Pio of Pietrelcina wearing a silicone mask, found to be incorrupt by the Catholic Church (b. 25 May 1887 -- d. 23 September 1968). The body of Saint Alphonse Mary of Liguori, found to be incorrupt by the Catholic Church (b. 27 September 1696 -- d. 1 August 1787). The body of Saint Joaquina de Vedruna, found to be incorrupt by the Catholic Church (b. April 16, 1783 -- d. August 28, 1854). The body of Saint Zita, found to be incorrupt by the Catholic Church (b. c. 1218 -- d. 27 April 1272). The body of Saint Catherine Labouré, found to be incorrupt by the Catholic Church (b. May 2, 1806 -- d. December 31, 1876). The body of Venerable Mary of Jesus of Ágreda, found to be incorrupt by the Catholic Church (b. April 2, 1602 -- d. May 24, 1665). The body of Saint Rita of Cascia, found to be incorrupt by the Catholic Church (b. 1381 -- d. May 22, 1457). The body of Saint Luigi Orione, found to be incorrupt by the Catholic Church (b. June 23, 1872 -- d. March 12, 1940). The body of Saint Virginia Centurione, found to be incorrupt by the Catholic Church (b. April 2, 1587 -- d. December 15, 1651).
how many species of ducks are there in north america
Duck - wikipedia Duck is the common name for a large number of species in the waterfowl family Anatidae, which also includes swans and geese. Ducks are divided among several subfamilies in the family Anatidae; they do not represent a monophyletic group (the group of all descendants of a single common ancestral species) but a form taxon, since swans and geese are not considered ducks. Ducks are mostly aquatic birds, mostly smaller than the swans and geese, and may be found in both fresh water and sea water. Ducks are sometimes confused with several types of unrelated water birds with similar forms, such as loons or divers, grebes, gallinules, and coots. The word duck comes from Old English *dūce "diver", a derivative of the verb *dūcan "to duck, bend down low as if to get under something, or dive", because of the way many species in the dabbling duck group feed by upending; compare with Dutch duiken and German tauchen "to dive". This word replaced Old English ened / ænid "duck", possibly to avoid confusion with other Old English words of similar form, like ende "end". Other Germanic languages still have similar words for "duck": for example, Dutch eend, German Ente, and Norwegian and. The word ened / ænid was inherited from Proto-Indo-European; compare Latin anas "duck", Lithuanian ántis "duck", Ancient Greek nēssa / nētta (νῆσσα, νῆττα) "duck", and Sanskrit ātí "water bird", among others. A duckling is a young duck in downy plumage, or a baby duck; but in the food trade a young domestic duck which has just reached adult size and bulk, and whose meat is still fully tender, is sometimes labelled as a duckling. A male duck is called a drake and the female is called a duck, or in ornithology a hen. The overall body plan of ducks is elongated and broad, and the ducks are also relatively long-necked, albeit not as long-necked as the geese and swans.
The body shape of diving ducks varies somewhat from this in being more rounded. The bill is usually broad and contains serrated lamellae, which are particularly well defined in the filter-feeding species. In the case of some fishing species the bill is long and strongly serrated. The scaled legs are strong and well developed, and generally set far back on the body, more so in the highly aquatic species. The wings are very strong and are generally short and pointed, and the flight of ducks requires fast continuous strokes, requiring in turn strong wing muscles. Three species of steamer duck are almost flightless, however. Many species of duck are temporarily flightless while moulting; they seek out protected habitat with good food supplies during this period. This moult typically precedes migration. The drakes of northern species often have extravagant plumage, but that is moulted in summer to give a more female-like appearance, the "eclipse" plumage. Southern resident species typically show less sexual dimorphism, although there are exceptions like the paradise shelduck of New Zealand, which is both strikingly sexually dimorphic and a species in which the female's plumage is brighter than that of the male. The plumage of juvenile birds generally resembles that of the female. Over the course of evolution, female ducks have evolved a corkscrew-shaped vagina to prevent forced copulation. Ducks eat a variety of foods such as grasses, aquatic plants, fish, insects, small amphibians, worms, and small molluscs. Dabbling ducks feed on the surface of water or on land, or as deep as they can reach by up-ending without completely submerging. Along the edge of the beak, there is a comb-like structure called a pecten. This strains the water squirting from the side of the beak and traps any food. The pecten is also used to preen feathers and to hold slippery food items. Diving ducks and sea ducks forage deep underwater.
To be able to submerge more easily, the diving ducks are heavier than dabbling ducks, and therefore have more difficulty taking off to fly. A few specialized species such as the mergansers are adapted to catch and swallow large fish. The others have the characteristic wide flat beak adapted to dredging-type jobs such as pulling up waterweed, pulling worms and small molluscs out of mud, and searching for insect larvae, and to bulk jobs such as dredging out, holding, turning head first, and swallowing a squirming frog. To avoid injury when digging into sediment, the duck has no cere; instead, the nostrils open through hard horn. The Guardian (a British newspaper) published an article advising that ducks should not be fed bread because it damages the health of the ducks and pollutes waterways. Ducks are generally monogamous, although these bonds usually last only a single year. Larger species and the more sedentary species (like fast river specialists) tend to have pair-bonds that last numerous years. Most duck species breed once a year, choosing to do so in favourable conditions (spring / summer or wet seasons). Ducks also tend to make a nest before breeding and, after hatching, lead their ducklings to water. Mother ducks are very caring and protective of their young, but may abandon some of their ducklings if they are physically stuck in an area they cannot get out of (such as a nest in an enclosed courtyard) or are not prospering due to genetic defects or sickness brought about by hypothermia, starvation, or disease. Ducklings can also be orphaned by inconsistent late hatching, where a few eggs hatch after the mother has abandoned the nest and led her ducklings to water. Most domestic ducks neglect their eggs and ducklings, and their eggs must be hatched under a broody hen or artificially.
Female mallard ducks make the classic "quack" sound, while males make a similar but raspier sound that is sometimes written as "breeeeze"; despite widespread misconceptions, however, most species of duck do not "quack". In general, ducks make a wide range of calls, ranging from whistles and cooing to yodels and grunts. For example, the scaup -- which are diving ducks -- make a noise like "scaup" (hence their name). Calls may be loud displaying calls or quieter contact calls. A common urban legend claims that duck quacks do not echo; however, this has been proven to be false. This myth was first debunked by the Acoustics Research Centre at the University of Salford in 2003 as part of the British Association's Festival of Science. It was also debunked in one of the earlier episodes of the popular Discovery Channel television show MythBusters. The ducks have a cosmopolitan distribution. A number of species manage to live on sub-Antarctic islands like South Georgia and the Auckland Islands. Numerous ducks have managed to establish themselves on oceanic islands such as Hawaii, New Zealand, and Kerguelen, although many of these species and populations are threatened or have become extinct. Some duck species, mainly those breeding in the temperate and Arctic Northern Hemisphere, are migratory; those in the tropics, however, are generally not. Some ducks, particularly in Australia where rainfall is patchy and erratic, are nomadic, seeking out the temporary lakes and pools that form after localised heavy rain. Worldwide, ducks have many predators. Ducklings are particularly vulnerable, since their inability to fly makes them easy prey not only for predatory birds but also for large fish like pike, crocodilians, predatory testudines such as the alligator snapping turtle, and other aquatic hunters, including fish-eating birds such as herons.
Ducks' nests are raided by land-based predators, and brooding females may be caught unaware on the nest by mammals such as foxes, or by large birds such as hawks or owls. Adult ducks are fast fliers, but may be caught on the water by large aquatic predators, including big fish such as the North American muskie and the European pike. In flight, ducks are safe from all but a few predators, such as humans and the peregrine falcon, which regularly uses its speed and strength to catch ducks. Ducks have many economic uses, being farmed for their meat, eggs, and feathers (particularly their down). They are also kept and bred by aviculturists and often displayed in zoos. Almost all the varieties of domestic duck are descended from the mallard (Anas platyrhynchos), apart from the Muscovy duck (Cairina moschata). The call duck is another example of a domestic duck breed. Its name comes from its original use established by hunters: as a decoy to attract wild mallards from the sky into traps set for them on the ground. The call duck is the world's smallest domestic duck breed, weighing less than 1 kg (2.2 lb). In many areas, wild ducks of various species (including ducks farmed and released into the wild) are hunted for food or sport by shooting, or formerly with the aid of decoys. Because an idle floating duck or a duck squatting on land cannot react quickly enough to fly or move away, "a sitting duck" has come to mean "an easy target". These ducks may be contaminated by pollutants such as PCBs. In 2002, psychologist Richard Wiseman and colleagues at the University of Hertfordshire, UK, finished a year-long LaughLab experiment, concluding that of all animals, ducks attract the most humor and silliness; he said, "If you're going to tell a joke involving an animal, make it a duck." The word "duck" may have become an inherently funny word in many languages, possibly because ducks are seen as silly in their looks or behavior.
Of the many ducks in fiction, many are cartoon characters, such as Walt Disney's Donald Duck and Warner Bros.' Daffy Duck. Howard the Duck started as a comic book character in 1973 and was made into a movie in 1986. The 1992 Disney film The Mighty Ducks, starring Emilio Estevez, chose the duck as the mascot for the fictional youth hockey team who are the protagonists of the movie, based on the duck being described as a fierce fighter. This led to the duck becoming the nickname and mascot for the eventual National Hockey League professional team, the Anaheim Ducks. The duck is also the nickname of the University of Oregon sports teams as well as the Long Island Ducks minor league baseball team.
latitude and longitude of southern tip of india
List of extreme points of India - wikipedia The extreme points of India include the coordinates that are farther north, south, east, or west than any other location in India, as well as the highest and the lowest altitudes in the country. The northernmost point claimed by India is in territory disputed between India and Pakistan. With the exception of Kanyakumari (Cape Comorin), the southernmost location of mainland India, all other extreme locations are uninhabited. Indira Point, in the Andaman and Nicobar Islands, is sometimes considered the southernmost tip of India as a whole, since Kanyakumari marks only the southernmost point of the mainland. The latitude and longitude are expressed in decimal degree notation, in which a positive latitude value refers to the northern hemisphere and a negative value refers to the southern hemisphere. Similarly, a positive longitude value refers to the eastern hemisphere, and a negative value refers to the western hemisphere. The coordinates used in this article are sourced from Google Earth, which makes use of the WGS84 geodetic reference system. Additionally, a negative altitude value refers to land below sea level. The northernmost point that India claims lies in the territory now administered by China as part of Xinjiang but once claimed by Hunza, and therefore claimed by India as part of the disputed state of Jammu and Kashmir. The northernmost point administered by India lies in Jammu & Kashmir. India's claim to the whole of Kashmir is disputed by Pakistan and China, with the territory currently partitioned into Pakistan's territory of Gilgit-Baltistan, the Chinese region of Aksai Chin, and the Indian-administered state of Jammu and Kashmir. This list provides the northernmost point as claimed by India, the northernmost disputed point that is administered by India, and the northernmost undisputed point in India. The same distinctions apply to the highest elevated regions. India's easternmost state is Arunachal Pradesh.
Part of the state is claimed by China as "South Tibet '', though it is administered by India. The easternmost point of Indian - administered territory is located in this disputed region. Consequently, this list mentions both the disputed and undisputed easternmost points in India.
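The sign convention described above (north/east positive, south/west negative) can be sketched with a short Python helper that converts degree-minute-second coordinates to signed decimal degrees. The Kanyakumari figures below are approximate and used only for illustration:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert a degrees/minutes/seconds coordinate to decimal degrees.

    Per the convention above, southern latitudes ("S") and western
    longitudes ("W") receive a negative sign.
    """
    value = degrees + minutes / 60 + seconds / 3600
    if hemisphere in ("S", "W"):
        value = -value
    return round(value, 6)

# Approximate coordinates of Kanyakumari (Cape Comorin), for illustration:
lat = dms_to_decimal(8, 4, 41, "N")    # ≈ 8.078056
lon = dms_to_decimal(77, 33, 0, "E")   # ≈ 77.55
```

Note that the sign lives in the hemisphere letter, not the numbers, which is why the decimal form needs no separate N/S or E/W marker.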
when was saturday night fever released in uk
Saturday Night Fever - wikipedia Saturday Night Fever is a 1977 American musical drama film directed by John Badham. It stars John Travolta as Tony Manero, a working - class young man who spends his weekends dancing and drinking at a local Brooklyn discothèque; Karen Gorney as Stephanie Mangano, his dance partner and eventual confidante; and Donna Pescow as Annette, Tony 's former dance partner and would - be girlfriend. While in the disco, Tony is the champion dancer. His circle of friends and weekend dancing help him to cope with the harsh realities of his life: a dead - end job, clashes with his unsupportive and squabbling parents, racial tensions in the local community, and his general restlessness. The story is based upon a 1976 New York magazine article by British writer Nik Cohn, "Tribal Rites of the New Saturday Night ''; in the mid-1990s, Cohn acknowledged that he fabricated the article. A newcomer to the United States and a stranger to the disco lifestyle, Cohn was unable to make any sense of the subculture he had been assigned to write about; instead, the character who became Tony Manero was based on an English mod acquaintance of Cohn. A huge commercial success, the film significantly helped to popularize disco music around the world and made Travolta, already well known from his role on TV 's Welcome Back, Kotter, a household name. The Saturday Night Fever soundtrack, featuring disco songs by the Bee Gees, is one of the best - selling soundtracks of all time. The film showcased aspects of the music, the dancing, and the subculture surrounding the disco era: symphony - orchestrated melodies; haute couture styles of clothing; pre-AIDS sexual promiscuity; and graceful choreography. The sequel Staying Alive (1983) also starred John Travolta and was directed by Sylvester Stallone, but was received less positively.
In 2010, Saturday Night Fever was deemed "culturally, historically, or aesthetically significant '' by the Library of Congress and selected for preservation in the National Film Registry. Anthony "Tony '' Manero is a 19 - year - old Italian American man from the Bay Ridge neighborhood of Brooklyn, New York. He lives with his parents, and works at a dead - end job in a small hardware store. To escape his day - to - day life, every Saturday night Tony goes to 2001 Odyssey, a local disco club. Tony has four close friends: Joey, Double J, Gus, and Bobby C. A fringe member of his group of friends is Annette, a neighborhood girl who longs for a physical relationship with Tony. Tony and his friends ritually stop on the Verrazano -- Narrows Bridge to clown around. The bridge has special significance for Tony as a symbol of escape to a better life on the other side -- in more suburban Staten Island. Tony agrees to be Annette 's partner in an upcoming dance contest, but her happiness is short - lived when Tony is mesmerized by another woman at the club, Stephanie Mangano, whose dancing skills exceed Annette 's. Although Stephanie rejects Tony 's advances, she eventually agrees to be his partner in the dance competition, provided that their partnership remains professional. Tony 's older brother, Frank Jr., who was the pride of the family since he was ordained a Roman Catholic priest, brings despair to their parents when he tells them that he has left the priesthood. Tony shares a warm relationship with Frank Jr., but feels pleased that he is no longer the black sheep of the family. While on his way home from the grocery store, Gus is attacked by a gang and hospitalized. He tells Tony and his friends that his attackers were the Barracudas. Meanwhile, Bobby C. has been trying to get out of his relationship with his devout Catholic girlfriend, Pauline, who is pregnant with his child.
Facing pressure from his family and others to marry her, Bobby asks former priest Frank Jr., if the Pope would grant him dispensation for an abortion. When Frank tells him such a thing would be highly unlikely, Bobby 's feelings of desperation increase. Bobby lets Tony borrow his car to help move Stephanie from Bay Ridge to Manhattan, and tries to extract a promise from Tony to call him later that night. Eventually, the group gets their revenge on the Barracudas, and crash Bobby C 's car into their hangout. Tony, Double J, and Joey get out of the car to fight, but Bobby C. takes off when a gang member tries to attack him in the car. When the group visit Gus in the hospital, they are angry when he tells them that he may have identified the wrong gang. Later, Tony and Stephanie dance at the competition and end up winning first prize. However, Tony believes that a Puerto Rican couple performed better, and that the judges ' decision was racially motivated. He gives the Puerto Rican couple the trophy, and leaves with Stephanie. Once outside in a car, she denigrates their relationship and he tries to rape her. She resists and runs from him. Tony 's friends come to the car along with an intoxicated Annette. Joey says she has agreed to have sex with everyone. Tony tries to lead her away, but is subdued by Double J and Joey, and sullenly leaves with the group in the car. Double J and Joey rape Annette. Bobby C. pulls the car over on the Verrazano - Narrows Bridge for their usual cable - climbing antics. Instead of abstaining as usual, Bobby performs stunts more recklessly than the rest of the gang. Realizing that he is acting recklessly, Tony tries to get him to come down. Bobby 's strong sense of despair, the situation with Pauline, and Tony 's broken promise to call him earlier that day all lead to a suicidal tirade about Tony 's lack of caring before Bobby slips and falls to his death in the water below. 
Disgusted and disillusioned by his friends, his family, and his life, Tony spends the rest of the night riding the subway into Manhattan. Morning has dawned by the time he appears at Stephanie 's apartment. He apologizes for his bad behavior, telling her that he plans to relocate from Brooklyn to Manhattan to try to start a new life. Tony and Stephanie salvage their relationship and agree to be friends. Donna Pescow was initially considered "too pretty '' for the role of Annette. She addressed this by putting on 40 pounds (18 kilograms) and relearning her native Brooklyn accent, which she had lost while studying drama at the American Academy of Dramatic Arts. After production ended, she quickly lost the weight she had gained for the role. John Travolta 's mother Helen and sister Ann both appeared in minor roles in the beginning of the film. Travolta 's sister is the pizzeria waitress who serves him the pizza slices (and delivers the first dialogue), and his mother plays the woman to whom he sells the can of paint (after being late). John G. Avildsen was signed to direct but was fired three weeks prior to principal photography over a script dispute with producer Robert Stigwood. Despite this, Travolta 's character has a poster in his room for Rocky, a film directed by Avildsen. According to the DVD commentary for Saturday Night Fever, the producers intended to use the song "Lowdown '' by Boz Scaggs in the rehearsal scene between Tony and Annette in the dance studio, and choreographed their dance moves to the song. However, representatives for Scaggs ' label, Columbia Records, refused to grant legal clearance for it, as they wanted to pursue another disco movie project, which never materialized. Composer David Shire, who scored the film, in turn had to write a song to match the dance steps demonstrated in the scene, eliminating the need for future legal hassles. However, this track does not appear on the movie 's soundtrack.
The song "K - Jee '' was used during the dance contest with the Puerto Rican couple that competed against Tony and Stephanie. Some VHS cassettes used a more traditional Latin - style song instead. The DVD restores the original recording. The album has been added to the National Recording Registry in the Library of Congress. Two theatrical versions of the film were released: the original R - rated version and an edited PG - rated version. (The PG - rated re-issue was in 1979; the middle - ground PG - 13 rating was not created until 1984.) The R - rated version released in 1977 represented the movie 's first run, and totaled 118 minutes. After the success of the first run, in 1979, the film 's content was re-edited into a toned down, PG - rated version and re-released during a second run, not only to attract a wider audience, but also to capitalize on attracting the target audience of the teenagers who were not old enough to see the film by themselves, but who made the soundtrack album to the film a monster hit. The R - rated version 's profanity, nudity, fight sequence, and a multiple rape scene in a car, were all de-emphasized or removed from the PG version. Producer Robert Stigwood said in an A&E Documentary of "The Inside Story: Saturday Night Fever '', about the PG version: The PG - rated version was 112 minutes. Numerous profanity - filled scenes were replaced with alternate takes of the same scenes, substituting milder language initially intended for the network television cut. When the film premiered on network television, a new milder version was created to conform with network broadcast standards. To maintain runtime, a few deleted scenes were restored (including Tony dancing with Doreen to "Disco Duck '', Tony running his finger along the cables of the Verrazano -- Narrows Bridge, and Tony 's father getting his job back). The last two deleted scenes were included in the 2017 director 's cut. 
In 1980, Paramount Pictures paired up the PG - rated version of the film as a double feature along with its other John Travolta blockbuster, Grease. When Saturday Night Fever premiered on HBO in 1980, they aired both versions of the film: the PG version during the day, and the R version during the evening (HBO had a programming rule of only showing R - rated films during the evening. This was before switching to a 24 - hour - a-day operation, while still under their old broadcast standards concerning R - rated films). The premiere of the R - rated edition occurred at midnight on January 1, 1980. In 2017 the director 's cut (122 minutes) premiered at the TCM Festival at Grauman 's Chinese Theatre in Hollywood. Fathom Events will host special screenings of this version in 2017. This version was released on Blu - Ray & DVD in May. Both theatrical versions were released on VHS; the PG - rated version never had a home video release on LaserDisc. The R - rated special - edition DVD release includes most of the deleted scenes present on the PG version. The DVD release also includes a director 's commentary and "Behind the Music '' highlights. In the late 1990s, VH1, TBS, and TNT began showing the original R - rated version with a TV - 14 rating. The nudity was removed / censored, and the stronger profanity was either edited or (on recent airings) silenced. But this TV edit included some of the innuendos from the original film that were edited or removed from the PG version. Turner Classic Movies has aired the film in both versions (the R - rated version is commonly seen on their normal lineup, while the PG version has appeared on TCM 's "Funday Night at the Movies '' and "Essentials Jr. '' program blocks.)
The network television version (which premiered on November 16, 1980 on ABC) was basically a slightly shortened form of the PG - rated version, but contained several minutes of out - takes normally excised from both theatrical releases to make up for lost / cut material. It is among the longest cuts of the film. On May 5, 2009, Paramount released Saturday Night Fever on Blu - ray Disc in 1.78: 1 aspect ratio. This release retains the R - rated version of the film along with many special features new to home media. The 4K director 's cut (122 minutes) was released on Blu - Ray on May 2, 2017. This disc includes both the director 's cut and the original theatrical version, as well as the bulk of the bonus features from the prior release. Saturday Night Fever received positive reviews and is regarded by many critics as one of the best films of 1977. On Rotten Tomatoes the film has an approval rating of 86 % based on 44 reviews, with an average rating of 7.5 / 10. The site 's critical consensus reads, "Boasting a smart, poignant story, a classic soundtrack, and a starmaking performance from John Travolta, Saturday Night Fever ranks among the finest dramas of the 1970s. '' At Metacritic the film has a score of 77 out of 100, based on 7 critics, indicating "generally favorable reviews ''. It was added to The New York Times "Guide to the Best 1,000 Movies Ever Made '', which was published in 2004. In 2010, the film was selected for preservation in the United States National Film Registry by the Library of Congress as being "culturally, historically, or aesthetically significant ''. Film critic Gene Siskel, who would later list this as his favorite movie, praised the film: "One minute into Saturday Night Fever you know this picture is onto something, that it knows what it 's talking about. '' He also praised John Travolta 's energetic performance: "Travolta on the dance floor is like a peacock on amphetamines. He struts like crazy. 
'' Siskel even bought Travolta 's famous white suit from the film at a charity auction. Film critic Pauline Kael wrote a gushing review of the film in The New Yorker: "The way Saturday Night Fever has been directed and shot, we feel the languorous pull of the discotheque, and the gaudiness is transformed. These are among the most hypnotically beautiful pop dance scenes ever filmed... Travolta gets so far inside the role he seems incapable of a false note; even the Brooklyn accent sounds unerring... At its best, though, Saturday Night Fever gets at something deeply romantic: the need to move, to dance, and the need to be who you 'd like to be. Nirvana is the dance; when the music stops, you return to being ordinary. '' In 2008, the director Pablo Larraín made a film, Tony Manero, about a Chilean dancer obsessed by the main character in Saturday Night Fever who tries to win a Tony Manero look - alike contest. On April 17, 2012, Fox aired episode 16 of the series Glee, "Saturday Night Glee - ver '', which pays tribute to the film and features various songs from its soundtrack (especially the songs performed by the Bee Gees), covered by the series ' cast. The Red Hot Chili Peppers ' 2016 music video for their song "Go Robot '' is heavily inspired by the film and recreates the opening scene and classic characters from the film, who are portrayed by each band member.
who wrote yesterday is history tomorrow is a mystery
Talk: alice morse earle - Wikipedia She had a really smart and famous quote: "The clock is running. Make the most of today. Time waits for no man. Yesterday is history. Tomorrow is a mystery. Today is a gift. That 's why it is called the present. '' from the book: ' Sun Dials and Roses of Yesterday '. I think it will be reasonable to add this quote somewhere in the page. Or in WikiQuote with link. Thanks. Censored (talk) 00: 47, 3 July 2010 (UTC) She did not say that. The book "Sun Dials and Roses of Yesterday '' is available online, and there is no mention of tomorrow being a mystery or today being a gift or a present. There is a delightful section on sun - dial mottoes, but that quote is not among them. On - TechDan (talk) 13: 19, 24 June 2016 (UTC) On - TechDan
only vice president of india work with three president
List of Vice-Presidents of India - wikipedia The Vice-President of India is the second highest constitutional office in the Indian Government after the President. In accordance with Article 65 of the Indian Constitution, the Vice-President discharges the functions of the President when a contingency arises due to the resignation, removal, death or the inability of the President to discharge his functions. He is also the ex officio chairperson of the Rajya Sabha, the upper house of the Parliament of India. The Vice-President is elected by an electoral college consisting of all the members of both houses of the Parliament, in accordance with the system of proportional representation by means of the single transferable vote, via a secret ballot conducted by the Election Commission of India. Once elected, the Vice-President continues in office for a five - year term, but can remain in office after the expiry of the term until a successor assumes office. The Vice-President can be removed by a resolution passed by an effective majority in the Rajya Sabha. There have been 13 vice-presidents since the inception of the post in 1950. The first Vice-President of India took the oath of office at Rashtrapati Bhavan on 13 May 1952. On 11 August 2017, Venkaiah Naidu was sworn in as the 13th Vice-President of India. The complete list of Vice-Presidents of India includes the persons sworn into the office as Vice-President of India following the adoption of the Constitution of India in 1950, some of whom later became President.
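Since the vice-presidency is a single seat, the single-transferable-vote count described above reduces to an instant-runoff tally. The following is a minimal Python sketch of that idea only, not the Election Commission's actual counting rules (which involve formal quotas and tie-breaking procedures):

```python
from collections import Counter

def instant_runoff(ballots):
    """Single-winner single transferable vote (instant runoff).

    Each ballot ranks candidates by preference. The candidate with the
    fewest first preferences is eliminated and those ballots transfer to
    their next surviving choice, until someone holds a majority.
    """
    ballots = [list(b) for b in ballots]  # work on copies
    while True:
        tally = Counter(b[0] for b in ballots if b)
        total = sum(tally.values())
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > total or len(tally) == 1:
            return leader
        loser = min(tally, key=tally.get)  # simplistic: ignores formal tie-breaking
        ballots = [[c for c in b if c != loser] for b in ballots]

# Five ranked ballots: no one has a majority of first preferences,
# so the weakest candidate is eliminated and their ballots transfer.
winner = instant_runoff([["A", "B"], ["A", "B"], ["B", "A"], ["C", "B"], ["C", "B"]])
```

In this example "B" is eliminated first (one first preference), that ballot transfers to "A", and "A" then holds a 3-of-5 majority.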
who wrote the book indian war of independence 1957
The Indian War of Independence (book) - Wikipedia The Indian War of Independence is an Indian nationalist history of the 1857 revolt by Vinayak Damodar Savarkar that was first published in 1909. The book, initially written in Marathi, was penned by Savarkar in response to celebrations in Britain of the 50th anniversary of the 1857 Indian uprising, drawing on records from the India Office archives. The project received support from Indian nationalists in Britain, including Madame Cama, V.V.S. Iyer and M.P.T. Acharya, as well as Indian students who dared not show their support or sympathy for India House openly. Published during Savarkar 's stay in London at the India House, the book was influenced by histories of the French Revolution and the American Revolution, and it sought both to bring the Indian movement to public attention in Britain and to inspire nationalist revolution in India. The book, which describes the 1857 revolt as a unified and national uprising of India as a nation against British authority, was seen at the time as highly inflammatory, and the Marathi edition was banned in British India even before its publication. Publication of the English translation faltered after British printers and publishing houses were warned by the Home Office of its highly seditious content, while the British Foreign Office pressured the French government to prevent its publication from Paris. It was ultimately printed in the Netherlands in 1909, with the British government not tracing it until too late. The copies were printed with false dust wrappers purporting to be copies of The Pickwick Papers and other literary classics, and large quantities were shipped to India, where the book quickly became a bible of political extremists. It was excluded from the catalogue of the British Library to prevent Indian students from accessing it. In India, the book remained banned until the end of the Raj forty years later.
The Indian War of Independence is considered an influential work in Indian history and nationalist writing, and also one of Savarkar 's most influential works in developing and framing ideas of masculine Hinduism. While some earlier and modern histories draw conclusions similar to Savarkar 's, others, including R.C. Majumdar, disagreed with the book 's conclusions about the national and unified character of the mutiny.
what canadian network is a million little things on
2018 -- 19 Canadian network television schedule - wikipedia The 2018 -- 19 network television schedule for the five major English commercial broadcast networks in Canada covers primetime hours from September 2018 through August 2019. The schedule is followed by a list per network of returning series, new series, and series canceled after the 2017 -- 18 television season, covering Canadian, American and other series. CBC Television was first to announce its fall schedule on May 24, 2018, followed by Global on June 4, Citytv on June 5, and CTV and CTV Two on June 7, 2018. As in the past, the commercial networks ' announcements come shortly after the networks have had a chance to buy Canadian rights to new American series. CTV Two and Global are not included on Saturday as they normally only schedule encore programming in primetime on Saturdays.
who were the ghazis and what role did they play in building the ottoman empire
Ghazi (warrior) - wikipedia Ghazi (غازي, ġāzī) is an Arabic term originally referring to an individual who participates in ghazw (غزو, ġazw), meaning military expeditions or raiding; after the emergence of Islam, it took on new connotations of religious warfare. The related word ghazwa (غزوة ġazwah) is a singulative form meaning a battle or military expedition, often one led by the Islamic prophet Muhammad. In English language literature, the word often appears as razzia, a borrowing through French from Maghrebi Arabic. In the context of the wars between Russia and the Muslim peoples of the Caucasus, starting as early as Sheikh Mansur 's resistance to Russian expansion in the late 18th century, the word usually appears in the form gazavat (газават). In pre-Islamic Bedouin culture, ghazw (a) was a form of limited warfare verging on brigandage that avoided head - on confrontations and instead emphasized raiding and looting, usually of livestock (see cattle raiding). The Umayyad - period Bedouin poet al - Kutami wrote the oft - quoted verses: "Our business is to make raids on the enemy, on our neighbor and our own brother, in the event we find none to raid but a brother. '' (Semi-institutionalized raiding of livestock herds was not unique to the Bedouins; Soviet anthropologists adopted the Kazakh word barymta to describe similar practices of nomads in the Eurasian steppes.) William Montgomery Watt hypothesized that Muhammad found it useful to divert this continuous internecine warfare toward his enemies, making it the basis of his war strategy; according to Watt, the celebrated battle of Badr started as one such razzia. As a form of warfare, the razzia was then mimicked by the Christian states of Iberia in their relations with the taifa states; rough synonyms and similar tactics are the Iberian cavalgada and the Anglo - French chevauchée.
The word razzia is used in French colonial context particularly for raids to plunder and capture slaves from among the people of Western and Central Africa, also known as rezzou when practiced by the Tuareg. The word was adopted from ġaziya of Algerian Arabic vernacular and later became a figurative name for any act of pillage, with its verb form razzier. Ghazi (Arabic: غازي ‎, ġāzī) is an Arabic word, the active participle of the verb ġazā, meaning ' to carry out a military expedition or raid '; the same verb can also mean ' to strive for ' and Ghazi can thus share a similar meaning to Mujahid or "one who struggles ''. The verbal noun of ġazā is ġazw or ġazawān, with the meaning ' raiding '. A derived singulative in ġazwah refers to a single battle or raid. The term ghāzī dates to at least the Samanid period, when such fighters appeared as mercenaries and frontier fighters in Khorasan and Transoxiana. Later, up to 20,000 of them took part in the Indian campaigns of Mahmud of Ghazni. Ghāzī warriors depended upon plunder for their livelihood, and were prone to brigandage and sedition in times of peace. The corporations into which they organized themselves attracted adventurers, zealots and religious and political dissidents of all ethnicities. In time, though, soldiers of Turkic ethnicity predominated, mirroring the acquisition of Mamluks, Turkic slaves in the Mamluk retinues and guard corps of the caliphs and emirs and in the ranks of the ghazi corporation, some of whom would ultimately rise to military and later political dominance in various Muslim states. In the west, Turkic ghāzīs made continual incursions along the Byzantine frontier zone, finding in the akritai (akritoi) their Greek counterparts. After the Battle of Manzikert these incursions intensified, and the region 's people would see the ghāzī corporations coalesce into semi-chivalric fraternities, with the white cap and the club as their emblems.
The height of the organizations would come during the Mongol conquest when many of them fled from Persia and Turkistan into Anatolia. As organizations, the ghazi corporations were fluid, reflecting their popular character, and individual ghāzī warriors would jump between them depending upon the prestige and success of a particular emir, rather like the mercenary bands around western condottiere. It was from these Anatolian territories conquered during the ghazw that the Ottoman Empire emerged, and in its legendary traditions it is said that its founder, Osman I, came forward as a ghāzī thanks to the inspiration of Shaikh Ede Bali. In later periods of Islamic history the honorific title of ghāzī was assumed by those Muslim rulers who showed conspicuous success in extending the domains of Islam, and eventually the honorific became exclusive to them, much as the Roman title imperator became the exclusive property of the supreme ruler of the Roman state and his family. The Ottomans were probably the first to adopt this practice, and in any case the institution of ghazw reaches back to the beginnings of their state: The first nine Ottoman chiefs all used Ghazi as part of their full throne name (as with many other titles, the nomination was added even though it did not fit the office), and often afterwards. However, it never became a formal title within the ruler 's formal style, unlike Sultan ul - Mujahidin, used by Sultan Murad Khan II Khoja - Ghazi, 6th Sovereign of the House of Osman (1421 -- 1451), styled ' Abu'l Hayrat, Sultan ul - Mujahidin, Khan of Khans, Grand Sultan of Anatolia and Rumelia, and of the Cities of Adrianople and Philippolis. 
Because of the political legitimacy that would accrue to those bearing this title, Muslim rulers vied amongst themselves for preeminence in the ghāziya, with the Ottoman Sultans generally acknowledged as excelling all others in this feat: Ghazi was also used as a title of honor in the Ottoman Empire, generally translated as the Victorious, for military officers of high rank, who distinguished themselves in the field against non-Moslem enemies; thus it was conferred on Osman Pasha after his famous defence of Plevna in Bulgaria and on Mustafa Kemal Paşa (later known as Mustafa Kemal Atatürk) for leading the defense against the Gallipoli campaign. Some Muslim rulers (in Afghanistan) personally used the subsidiary style Padshah - i - Ghazi. Ghazwah, which literally means "campaigns '', is typically used by biographers to refer to all the Prophet 's journeys from Medina, whether to make peace treaties and preach Islam to the tribes, to go on ʽumrah, to pursue enemies who attacked Medina, or to engage in the nine battles. Muhammad participated in 27 Ghazwa. The first Ghazwa he participated in was the Invasion of Waddan in August 623, he ordered his followers to attack a Quraysh caravan. When performed within the context of Islamic warfare, the ghazw 's function was to weaken the enemy 's defenses in preparation for his eventual conquest and subjugation. Because the typical ghazw raiding party often did not have the size or strength to seize military or territorial objectives, this usually meant sudden attacks on weakly defended targets (e.g. villages) with the intent of demoralizing the enemy and destroying material which could support their military forces. Though Islam 's rules of warfare offered protection to non-combatants such as women, monastics and peasants in that they could not be slain, their property could still be looted or destroyed, and they themselves could be abducted and enslaved (Cambridge History of Islam, p. 
269): A good source on the conduct of the traditional ghazw raid are the medieval Islamic jurists, whose discussions as to which conduct is allowed and which is forbidden in the course of warfare reveal some of the practices of this institution. One such source is Averroes ' Bidāyat al - Mujtahid wa - Nihāyat al - Muqtasid (translated in Peters, Jihad in Classical and Modern Islam: A Reader, Chapter 4). In the 19th century, Muslim fighters in North Caucasus who were resisting the Russian military operations declared a gazawat (understood as holy war) against the Russian Orthodox invasion. Although uncertain, it is believed that Dagestani Islamic scholar Muhammad Yaragskii was the ideologist of this holy war. In 1825, a congress of ulema in the village of Yarag declared gazawat against the Russians. Its first leader was Ghazi Muhammad; after his death, Imam Shamil would eventually continue it. During the Second Chechen War, Chechnya announced gazawat against Russia. After the terrorist attacks on Paris in November 2015, the Islamic State group is said to have referred to its actions as "ghazwa ''. Probably the most famous use of the term "ghazwa '' is in the phrase ' Manhattan Raid ', used by Al - Qaeda to refer to the September 11th attacks.
difference between kentucky fried chicken and southern fried chicken
Fried chicken - wikipedia Fried chicken (also referred to as Southern fried chicken for the variant in the United States) is a dish consisting of chicken pieces, usually from broiler chickens, which have been floured or battered and then pan-fried, deep fried, or pressure fried. The breading adds a crisp coating or crust to the exterior of the chicken. What separates fried chicken from other fried forms of chicken is that generally the chicken is cut at the joints, and the bones and skin are left intact. Crisp well - seasoned skin, rendered of excess fat, is a hallmark of well made fried chicken. The first dish known to have been deep fried was fritters, which were popular in the Middle Ages. However, it was the Scottish who were the first Europeans to deep fry their chicken in fat (though without seasoning). Meanwhile, a number of West African peoples had traditions of seasoned fried chicken (battered and cooked in palm oil). Scottish frying techniques and West African seasoning techniques were combined by enslaved Africans and African - Americans in the American South. Prior to the Second World War, fried chicken was often very expensive and was only enjoyed on special occasions. Since the late 20th century, however, fried chicken has been mass - produced and the price of the dish has fallen significantly. When being cooked, fried chicken is often divided into smaller pieces. The chicken is then generally covered in a batter, often consisting of ingredients such as eggs or milk, and a thickener such as flour. This is used to create a crust on the exterior of the meat. In addition, seasoning is often added at this stage. Once the chicken is ready to be cooked, it is placed in a deep fryer, frying pan or pressure cooker (depending on the method used) and fried in lard or a type of oil. The dish has inspired a large number of spin - off recipes that are common around the world.
One example is Korean fried chicken, a dish commonly served as fast food in Korea and known for being crispier than typical American fried chicken. There is also a racial stereotype surrounding fried chicken and African - American people, mostly because it was popular among slaves in the era of the American Civil War. The Roman cookbook of Apicius (4th century) has a recipe for deep - fried chicken called Pullum Frontonianum. The American English expression "fried chicken '' is first recorded in the 1830s, and frequently appears in American cookbooks of the 1860s and 1870s. The origin of fried chicken in the southern states of America has been traced to precedents in either Scottish or West African cuisine: Scottish fried chicken was cooked in fat (though unseasoned), while West African fried chicken was seasoned (but battered and cooked in palm oil). Scottish frying techniques and African seasoning techniques were used in the American South by African slaves. Fried chicken provided some means of independent economy for enslaved and segregated African - American women, who became noted sellers of poultry (live or cooked) as early as the 1730s. Because of this and the expensive nature of the ingredients, it was, despite popular belief, a rare dish in the African - American community reserved (as in Africa) for special occasions. After the development of larger and faster - growing hogs (due to crosses between European and Asian breeds) in the 18th and 19th century in the United States, backyard and small - scale hog production provided an inexpensive means of converting waste food, crop waste, and garbage into calories (in a relatively small space and in a relatively short period of time). Many of those calories came in the form of fat and rendered lard. Lard was used for almost all cooking and was a fundamental component in many common homestead foods (many that today are still regarded as holiday and comfort foods) like biscuits and pies.
The economic and caloric necessity of consuming lard and other saved fats may have led to the popularity of fried foods, not only in the US, but worldwide. In the 19th century cast iron became widely available for use in cooking. The combination of flour, lard, a chicken and a heavy pan placed over a relatively controllable flame became the beginning of today 's fried chicken. When it was introduced to the American South, fried chicken became a common staple. Later, as the slave trade led to Africans being brought to work on southern plantations, the slaves who became cooks incorporated seasonings and spices that were absent in traditional Scottish cuisine, enriching the flavor. Since most slaves were unable to raise expensive meats, but generally allowed to keep chickens, frying chicken on special occasions continued in the African American communities of the South. It endured the fall of slavery and gradually passed into common use as a general Southern dish. Since fried chicken traveled well in hot weather before refrigeration was commonplace, it gained further favor in the periods of American history when segregation closed off most restaurants to the black population. Fried chicken continues to be among this region 's top choices for "Sunday dinner ''. Holidays such as Independence Day and other gatherings often feature this dish. Since the American Civil War, traditional slave foods like fried chicken, watermelon, and chitterlings have suffered a strong association with African - American stereotypes and blackface minstrelsy. This was commercialized for the first half of the 20th century by restaurants like Sambo 's and Coon Chicken Inn, which selected exaggerated depictions of blacks as mascots, implying quality by their association with the stereotype. Although also being acknowledged positively as "soul food '' today, the affinity that African - American culture has for fried chicken has been considered a delicate, often pejorative issue. 
While the perception of fried chicken as an ethnic dish has been fading for several decades, with the ubiquity of fried chicken dishes in the US, it persists as a racial stereotype. Before the industrialization of chicken production, and the creation of broiler breeds of chicken, only young spring chickens (pullets or cockerels) would be suitable for the higher heat and relatively fast cooking time of frying, making fried chicken a luxury of spring and summer. Older, tougher birds require longer cooking times at lower temperatures. To compensate for this, tougher birds are sometimes simmered till tender, allowed to cool and dry, and then fried. (This method is common in Australia.) Another method is to pan-fry the chicken pieces and then simmer them in liquid, usually a gravy made in the pan in which they were cooked. This process (of flouring, frying and simmering in gravy) is known as "smothering '' and can be used for other tough cuts of meat, such as Swiss steak. Smothered chicken is still consumed today, though with the exception of people who raise their own chickens, or who seek out stewing hens, it is primarily made using commercial broiler chickens. Fried chicken has been described as being "crunchy '' and "juicy '', as well as "crispy ''. In addition, the dish has also been called "spicy '' and "salty ''. Occasionally, fried chicken is also topped with a chili powder such as paprika, or with hot sauce, to give it a spicy taste. This is especially common in fast food restaurants and chains such as KFC. The dish is traditionally served with mashed potato, gravy, coleslaw and biscuits. The dish is renowned for being greasy and unhealthy, especially when it comes from fast food outlets. Some of those who enjoy the dish reportedly limit themselves to eating it only a certain number of times a year, to keep their fat intake reasonably low. 
Out of the various parts of the animal used in fried chicken, the wings generally tend to contain the most fat, with almost 40 grams (0.088 lb) of fat for every 100 grams (0.22 lb). However, the average whole fried chicken contains only around 12 % fat, or 12 grams (0.026 lb) per every 100 grams (0.22 lb). In addition, 100 grams (0.22 lb) of fried chicken generally contains around 240 calories of energy. One of the main causes of the large amounts of fat found in fried chicken is the oil used to cook it. Generally, chickens are not fried whole; instead, the chicken is divided into its constituent pieces. The two white meat sections, the breast and the wing, come from the front of the chicken, while the dark meat sections, the thigh and leg or "drumstick '', come from the rear. These pieces are usually subdivided into the wings, the breasts (the wishbone is often cut out first in home cooking), the legs, and the thighs. The ribs are sometimes left on the breast, but commercially they and the back are usually discarded. To prepare the chicken pieces for frying, they may be coated in a batter of flour and liquid (and seasonings) mixed together. The batter can contain ingredients like eggs, milk, and leavening. Alternatively, they may be dredged in flour or a similar dry substance, to coat the meat and to develop a crust. Seasonings such as salt, pepper, cayenne pepper, paprika, garlic powder, onion powder, or ranch dressing mix can be mixed in with the flour. Either process may be preceded by marination or by dipping in milk or buttermilk. As the pieces of chicken cook, some of the moisture that exudes from the chicken is absorbed by the coating of flour and browns along with the flour, creating a flavorful crust. 
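The fat and calorie figures quoted above scale linearly with portion size, which can be sketched in a few lines of Python. The per-100 g values come from the text; the portion size and the function names are our own, purely illustrative choices.

```python
# Per-100 g figures quoted in the text; portion sizes below are hypothetical.
FAT_PER_100G = {"wing": 40.0, "whole_average": 12.0}  # grams of fat per 100 g
CALORIES_PER_100G = 240  # approximate kcal per 100 g of fried chicken

def fat_grams(piece: str, grams: float) -> float:
    """Estimated fat in a portion, scaled from the per-100 g figure."""
    return FAT_PER_100G[piece] * grams / 100.0

def calories(grams: float) -> float:
    """Estimated energy of a portion in kcal."""
    return CALORIES_PER_100G * grams / 100.0

# A hypothetical 150 g serving of average fried chicken:
print(fat_grams("whole_average", 150))  # 18.0 g of fat
print(calories(150))                    # 360.0 kcal
```

The same proportional reasoning gives the 12 % fat figure: 12 g of fat per 100 g of chicken.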
According to Nathan Bailey 's 1736 cookbook, Dictionarium Domesticum, the chicken can be covered in a marinade consisting of the juice of two large fresh lemons, malt vinegar, bay leaves, salt, pepper, ground cloves, and green onions. It must then rest in the marinade for three hours before being dipped in a batter of all - purpose flour, white wine, three egg yolks and salt, and then slowly submerged in a deep pot of either oil, lard, or clarified butter over an open fire. It can then be topped with fresh, dried parsley dipped in the same frying oil. Traditionally, lard is used to fry the chicken, but corn oil, peanut oil, canola oil, or vegetable oil are also frequently used (although clarified butter may also be used, as in colonial times). The flavor of olive oil is generally considered too strong for traditional fried chicken, and its low smoke point makes it unsuitable for the purpose. There are three main techniques for frying chickens: pan frying, deep frying and broasting. Pan frying (or shallow frying) requires a frying pan of sturdy construction and a source of fat that does not fully immerse the chicken. The chicken pieces are prepared as above, then fried. Generally the fat is heated to a temperature hot enough to seal (without browning, at this point) the outside of the chicken pieces. Once the pieces have been added to the hot fat and sealed, the temperature is reduced. There is debate as to how often to turn the chicken pieces, with one camp arguing for frequent turning and even browning, and the other camp pushing for letting the pieces render skin side down and turning them only when absolutely necessary. Once the chicken pieces are close to being done, the temperature is raised and the pieces are browned to the desired color (some cooks add small amounts of butter at this point to enhance browning). 
The moisture from the chicken that sticks and browns on the bottom of the pan becomes the fond required to make gravy. Deep frying requires a deep fryer or other device in which the chicken pieces can be completely submerged in hot fat. The food is placed fully in the oil and cooked at a high, constant temperature. The pieces are prepared as described above. The fat is heated in the deep fryer to the desired temperature. The pieces are added to the fat and a constant temperature is maintained throughout the cooking process. Broasting uses a pressure cooker to accelerate the process. The moisture inside the chicken becomes steam and increases the pressure in the cooker, lowering the cooking temperature needed. The steam also cooks the chicken through, but still allows the pieces to be moist and tender while maintaining a crisp coating. Fat is heated in a pressure cooker. Chicken pieces are prepared as described above and then placed in the hot fat. The lid is placed on the pressure cooker, and the chicken pieces are thus fried under pressure. The derivative phrases "country fried '' and "chicken fried '' often refer to other foods prepared in the manner of fried chicken. Usually, this means a boneless, tenderized piece of meat that has been floured or battered and cooked in any of the methods described above, or simply chicken that is cooked outdoors. Chicken fried steak and "country fried '' boneless chicken breast are two common examples. Throughout the world, different seasonings and spices are used to augment the flavor of fried chicken. Because of the versatility of fried chicken, it is not uncommon to flavor the chicken 's crisp exterior with a variety of spices ranging from spicy to savory. Depending on regional market ubiquity, local spice variations may be labeled as distinct from traditional Southern U.S. flavors, or may appear on menus without notation. 
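The three frying techniques described above differ mainly in equipment, fat immersion, and the use of pressure. A minimal sketch of that comparison, with the field values paraphrased from the text (this is a descriptive summary, not an authoritative taxonomy):

```python
# Summary of the three frying techniques described in the text.
FRYING_METHODS = {
    "pan frying": {
        "equipment": "sturdy frying pan",
        "immersion": "partial",   # the fat does not fully cover the chicken
        "pressurized": False,
        "note": "seal at high heat, reduce, then brown at the end",
    },
    "deep frying": {
        "equipment": "deep fryer",
        "immersion": "full",      # pieces completely submerged in hot fat
        "pressurized": False,
        "note": "constant temperature throughout cooking",
    },
    "broasting": {
        "equipment": "pressure cooker",
        "immersion": "full",
        "pressurized": True,      # steam pressure lowers the needed temperature
        "note": "moist interior with a crisp coating",
    },
}

def methods_with_full_immersion():
    """Techniques in which the chicken is completely submerged in fat."""
    return [name for name, m in FRYING_METHODS.items() if m["immersion"] == "full"]

print(methods_with_full_immersion())  # ['deep frying', 'broasting']
```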
With access to chickens suitable for frying broadening on a global scale with the advent of industrialized poultry farming, many localities have added their own mark on fried chicken, tweaking recipes to suit local preferences. In the United States, fried chicken has stereotypically been associated with African - Americans. The reasons for this are various. Chicken dishes were popular among slaves before the Civil War, as chickens were generally the only animals slaves were allowed to raise on their own. Minstrel shows and the film The Birth of a Nation led to a prevalent stereotype associating African - Americans with fried chicken. On two occasions the golfer Tiger Woods has been the target of remarks regarding fried chicken. The first occurred in 1997, when golfer Fuzzy Zoeller said that Woods should avoid choosing fried chicken for the Masters champions ' dinner the following year; the second when golfer Sergio García was asked in a press conference in 2013 whether he would invite Woods to dinner during the U.S. Open to settle their ongoing feud. García, a Spaniard who was unaware of the existence of this stereotype in American society, committed a gaffe, saying: "We will have him round every night... We will serve fried chicken '', which Woods said was "wrong, hurtful and clearly inappropriate ''. Both Zoeller and García subsequently apologized to Woods. In 2009, a KFC in Beijing renamed its restaurant "Obama Fried Chicken '' in reference to recently inaugurated President Barack Obama. Despite controversy at the time, the owner refused to change the name back, and the restaurant continues to operate under this name. At a dinner during Black History Month, an NBC chef, Leslie Calhoun, served fried chicken. The drummer of the Roots, Questlove, was angered by this and thought it both offensive and ignorant. In 2012, Burger King withdrew a commercial which featured Mary J. Blige singing about a crispy chicken wrap. 
This was due to the racial stereotypes surrounding fried chicken.
how many instruments in a full symphony orchestra
Orchestra - Wikipedia An orchestra (/ ˈɔːrkɪstrə /; Italian: (orˈkɛstra)) is a large instrumental ensemble typical of classical music, which mixes instruments from different families, including bowed string instruments such as violin, viola, cello and double bass, as well as brass, woodwinds, and percussion instruments, each grouped in sections. Other instruments such as the piano and celesta may sometimes appear in a fifth keyboard section or may stand alone, as may the concert harp and, for performances of some modern compositions, electronic instruments. A full - size orchestra may sometimes be called a symphony orchestra or philharmonic orchestra. The actual number of musicians employed in a given performance may vary from seventy to over one hundred musicians, depending on the work being played and the size of the venue. The term chamber orchestra (and sometimes concert orchestra) usually refers to smaller - sized ensembles of about fifty musicians or fewer. Orchestras that specialize in the Baroque music of, for example, Johann Sebastian Bach and George Frideric Handel, or Classical repertoire, such as that of Haydn and Mozart, tend to be smaller than orchestras performing a Romantic music repertoire, such as the symphonies of Johannes Brahms. The typical orchestra grew in size throughout the 18th and 19th centuries, reaching a peak with the large orchestras (of as many as 120 players) called for in the works of Richard Wagner, and later, Gustav Mahler. Orchestras are usually led by a conductor who directs the performance with movements of the hands and arms, often made easier for the musicians to see by use of a conductor 's baton. The conductor unifies the orchestra, sets the tempo and shapes the sound of the ensemble. The conductor also prepares the orchestra by leading rehearsals before the public concert, in which the conductor provides instructions to the musicians on their interpretation of the music being performed. 
The leader of the first violin section, commonly called the concertmaster, also plays an important role in leading the musicians. In the Baroque music era (1600 -- 1750), orchestras were often led by the concertmaster or by a chord - playing musician performing the basso continuo parts on a harpsichord or pipe organ, a tradition that some 20th century and 21st century early music ensembles continue. Orchestras play a wide range of repertoire, including symphonies, opera and ballet overtures, concertos for solo instruments, and as pit ensembles for operas, ballets and some types of musical theater (e.g., Gilbert and Sullivan operettas). Amateur orchestras include those made up of students from an elementary school or a high school, youth orchestras, and community orchestras; the latter two typically being made up of amateur musicians from a particular city or region. The term orchestra derives from the Greek ὀρχήστρα (orchestra), the name for the area in front of a stage in ancient Greek theatre reserved for the Greek chorus. The typical symphony orchestra consists of four groups of related musical instruments called the woodwinds, brass, percussion, and strings (violin, viola, cello and double bass). Other instruments such as the piano and celesta may sometimes be grouped into a fifth section such as a keyboard section or may stand alone, as may the concert harp and electric and electronic instruments. The orchestra, depending on the size, contains almost all of the standard instruments in each group. In the history of the orchestra, its instrumentation has been expanded over time, often agreed to have been standardized by the classical period and Ludwig van Beethoven 's influence on the classical model. In the 20th and 21st century, new repertory demands expanded the instrumentation of the orchestra, resulting in a flexible use of the classical - model instruments and newly developed electric and electronic instruments in various combinations. 
The terms symphony orchestra and philharmonic orchestra may be used to distinguish different ensembles from the same locality, such as the London Symphony Orchestra and the London Philharmonic Orchestra. A symphony orchestra will usually have over eighty musicians on its roster, in some cases over a hundred, but the actual number of musicians employed in a particular performance may vary according to the work being played and the size of the venue. Chamber orchestra usually refers to smaller - sized ensembles; a major chamber orchestra might employ as many as fifty musicians; some are much smaller than that. The term concert orchestra may also be used, as in the BBC Concert Orchestra and the RTÉ Concert Orchestra. The so - called "standard complement '' of doubled winds and brass in the orchestra from the first half of the 19th century is generally attributed to the forces called for by Beethoven. The composer 's instrumentation almost always included paired flutes, oboes, clarinets, bassoons, horns and trumpets. The exceptions to this are his Symphony No. 4, Violin Concerto, and Piano Concerto No. 4, which each specify a single flute. Beethoven carefully calculated the expansion of this particular timbral "palette '' in Symphonies 3, 5, 6, and 9 for an innovative effect. The third horn in the "Eroica '' Symphony arrives to provide not only some harmonic flexibility, but also the effect of "choral '' brass in the Trio movement. Piccolo, contrabassoon, and trombones add to the triumphal finale of his Symphony No. 5. A piccolo and a pair of trombones help deliver the effect of storm and sunshine in the Sixth, also known as the Pastoral Symphony. 
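The ensemble-size ranges given above (about fifty musicians or fewer for a chamber orchestra, usually over eighty on a symphony orchestra's roster) can be sketched as a simple classifier. The cut-off values come from the text; the function name and the label for the in-between range are our own, since the text gives no standard name for ensembles of sixty or seventy players.

```python
def classify_ensemble(musicians: int) -> str:
    """Rough ensemble label from roster size, using the ranges in the text.

    The cut-offs are approximations, not formal definitions: chamber
    orchestras run to about fifty players, while a symphony orchestra
    usually has over eighty on its roster.
    """
    if musicians <= 50:
        return "chamber orchestra"
    elif musicians > 80:
        return "symphony orchestra"
    else:
        return "intermediate ensemble"  # the text gives no standard name here

print(classify_ensemble(40))   # chamber orchestra
print(classify_ensemble(100))  # symphony orchestra
```

As the text notes, the number actually employed for a given concert varies with the work and the venue, so a roster-size label is only a rough guide.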
The Ninth asks for a second pair of horns, for reasons similar to the "Eroica '' (four horns have since become standard); Beethoven 's use of piccolo, contrabassoon, trombones, and untuned percussion -- plus chorus and vocal soloists -- in his finale is his earliest suggestion that the timbral boundaries of the symphony might be expanded. For several decades after his death, symphonic instrumentation was faithful to Beethoven 's well - established model, with few exceptions. Apart from the core orchestral complement, various other instruments are called for occasionally. These include the flugelhorn and cornet. Saxophones and classical guitars, for example, appear in some 19th - through 21st - century scores. While appearing only as featured solo instruments in some works, for example Maurice Ravel 's orchestration of Modest Mussorgsky 's Pictures at an Exhibition and Sergei Rachmaninoff 's Symphonic Dances, the saxophone is included in other works, such as Ravel 's Boléro, Sergei Prokofiev 's Romeo and Juliet Suites 1 and 2, Vaughan Williams ' Symphonies No. 6 and 9 and William Walton 's Belshazzar 's Feast, and many other works as a member of the orchestral ensemble. The euphonium is featured in a few late Romantic and 20th - century works, usually playing parts marked "tenor tuba '', including Gustav Holst 's The Planets, and Richard Strauss 's Ein Heldenleben. The Wagner tuba, a modified member of the horn family, appears in Richard Wagner 's cycle Der Ring des Nibelungen and several other works by Strauss, Béla Bartók, and others; it has a prominent role in Anton Bruckner 's Symphony No. 7 in E Major. Cornets appear in Pyotr Ilyich Tchaikovsky 's ballet Swan Lake, Claude Debussy 's La Mer, and several orchestral works by Hector Berlioz. 
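The "standard complement '' of doubled winds and brass attributed to Beethoven above can be tallied directly: six doubled sections (flutes, oboes, clarinets, bassoons, horns, trumpets) give twelve players. The section list comes from the text; strings and percussion are omitted because the text gives no fixed numbers for them, and the function name is our own.

```python
# Paired winds and brass in Beethoven's usual scoring, per the text.
STANDARD_PAIRS = ["flute", "oboe", "clarinet", "bassoon", "horn", "trumpet"]

def paired_player_count(sections=STANDARD_PAIRS, players_per_section=2):
    """Total wind/brass players, assuming every listed section is doubled."""
    return len(sections) * players_per_section

print(paired_player_count())  # 12 players across six doubled sections
```

The same helper shows how exceptions change the count: scores with a single flute (as in the Fourth Symphony) simply drop one player from the tally.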
Unless these instruments are played by members "doubling '' on another instrument (for example, a trombone player changing to euphonium or a bassoon player switching to contrabassoon for a certain passage), orchestras typically hire freelance musicians to augment their regular ensemble. The 20th - century orchestra was far more flexible than its predecessors. In Beethoven 's and Felix Mendelssohn 's time, the orchestra was composed of a fairly standard core of instruments, which was very rarely modified by composers. As the Romantic period progressed, composers such as Berlioz and Mahler modified the accepted instrumentation; some composers used multiple harps and sound effects such as the wind machine. During the 20th century, the modern orchestra was generally standardized with the modern instrumentation listed below. Nevertheless, by the mid - to late 20th century, with the development of contemporary classical music, instrumentation could practically be hand - picked by the composer (e.g., to add electric instruments such as electric guitar, electronic instruments such as synthesizers, non-Western instruments, or other instruments not traditionally used in orchestra). With this history in mind, the orchestra can be analyzed in five periods: the Baroque era, the Classical music period, the early / mid-Romantic era, the late - Romantic / early 20th century era, and the 21st century era. The first is a Baroque orchestra (i.e., J.S. 
Bach, Handel, Vivaldi), which generally had a smaller number of performers, and in which one or more chord - playing instruments, the basso continuo group (e.g., harpsichord or pipe organ and assorted bass instruments to perform the bassline), played an important role; the second is a typical classical period orchestra (e.g., early Beethoven along with Mozart and Haydn), which used a smaller group of performers than a Romantic music orchestra and a fairly standardized instrumentation; the third is typical of the early / mid-Romantic era (e.g., Schubert, Berlioz, Schumann); the fourth is a late - Romantic / early 20th century orchestra (e.g., Wagner, Brahms, Mahler, Stravinsky); and the fifth is the common complement of a 2010 - era modern orchestra (e.g., Adams, Barber, Aaron Copland, Glass, Penderecki). Among the instrument groups and within each group of instruments, there is a generally accepted hierarchy. Every instrumental group (or section) has a principal who is generally responsible for leading the group and playing orchestral solos. The violins are divided into two groups, first violin and second violin, with the second violins playing in lower registers than the first violins, playing an accompaniment part, or harmonizing the melody played by the first violins. The principal first violin is called the concertmaster (or "leader '' in the UK) and is considered not only the leader of the string section, but the second - in - command of the entire orchestra, behind only the conductor. The concertmaster leads the pre-concert tuning and handles musical aspects of orchestra management, such as determining the bowings for the violins or for all of the string section. The concertmaster usually sits to the conductor 's left, closest to the audience. There is also a principal second violin, a principal viola, a principal cello and a principal bass. 
The principal trombone is considered the leader of the low brass section, while the principal trumpet is generally considered the leader of the entire brass section. While the oboe often provides the tuning note for the orchestra (due to a 300 - year - old convention), no single principal leads the woodwind section, though in woodwind ensembles the flute is often the leader. Instead, each principal confers with the others as equals in the case of musical differences of opinion. Most sections also have an assistant principal (or co-principal or associate principal), or in the case of the first violins, an assistant concertmaster, who often plays a tutti part in addition to replacing the principal in his or her absence. A section string player plays in unison with the rest of the section, except in the case of divided (divisi) parts, where upper and lower parts in the music are often assigned to "outside '' (nearer the audience) and "inside '' seated players. Where a solo part is called for in a string section, the section leader invariably plays that part. The section leader (or principal) of a string section is also responsible for determining the bowings, often based on the bowings set out by the concertmaster. In some cases, the principal of a string section may use a slightly different bowing than the concertmaster, to accommodate the requirements of playing their instrument (e.g., the double - bass section). Principals of a string section will also lead entrances for their section, typically by lifting the bow before the entrance, to ensure the section plays together. Tutti wind and brass players generally play a unique but non-solo part. Section percussionists play parts assigned to them by the principal percussionist. In modern times, the musicians are usually directed by a conductor, although early orchestras did not have one, giving this role instead to the concertmaster or the harpsichordist playing the continuo. 
Some modern orchestras also do without conductors, particularly smaller orchestras and those specializing in historically accurate (so - called "period '') performances of baroque and earlier music. The most frequently performed repertoire for a symphony orchestra is Western classical music or opera. However, orchestras are sometimes used in popular music (e.g., to accompany a rock or pop band in a concert), extensively in film music, and increasingly often in video game music. Orchestras are also used in the symphonic metal genre. The term "orchestra '' can also be applied to a jazz ensemble, for example in the performance of big - band music. In the 2000s, all tenured members of a professional orchestra normally audition for positions in the ensemble. Performers typically play one or more solo pieces of the auditionee 's choice, such as a movement of a concerto, a solo Bach movement, and a variety of excerpts from the orchestral literature that are advertised in the audition poster (so the auditionees can prepare). The excerpts are typically the most technically challenging parts and solos from the orchestral literature. Orchestral auditions are typically held in front of a panel that includes the conductor, the concertmaster, the principal player of the section for which the auditionee is applying and possibly other principal players and regular orchestra members. The most promising candidates from the first round of auditions are invited to return for a second or third round of auditions, which allows the conductor and the panel to compare the best candidates. Performers may be asked to sight - read orchestral music. The final stage of the audition process in some orchestras is a test week, in which the performer plays with the orchestra for a week or two, which allows the conductor and principal players to see if the individual can function well in an actual rehearsal and performance setting. There is a range of different employment arrangements. 
The most sought - after positions are permanent, tenured positions in the orchestra. Orchestras also hire musicians on contracts, ranging in length from a single concert to a full season or more. Contract performers may be hired for individual concerts when the orchestra is doing an exceptionally large late - Romantic era orchestral work, or to substitute for a permanent member who is sick. A professional musician who is hired to perform for a single concert is sometimes called a "sub ''. Some contract musicians may be hired to replace permanent members for the period that the permanent member is on parental leave or disability leave. Historically, major professional orchestras have been mostly or entirely composed of male musicians. The first women hired as members of professional orchestras were harpists. The Vienna Philharmonic, for example, did not accept women to permanent membership until 1997, far later than comparable orchestras (the other orchestras ranked among the world 's top five by Gramophone in 2008). The last major orchestra to appoint a woman to a permanent position was the Berlin Philharmonic. In February 1996, the Vienna Philharmonic 's principal flute, Dieter Flury, told Westdeutscher Rundfunk that accepting women would be "gambling with the emotional unity (emotionelle Geschlossenheit) that this organism currently has ''. In April 1996, the orchestra 's press secretary wrote that "compensating for the expected leaves of absence '' of maternity leave would be a problem. In 1997, the Vienna Philharmonic was "facing protests during a (US) tour '' by the National Organization for Women and the International Alliance for Women in Music. Finally, "after being held up to increasing ridicule even in socially conservative Austria, members of the orchestra gathered (on 28 February 1997) in an extraordinary meeting on the eve of their departure and agreed to admit a woman, Anna Lelkes, as harpist. 
'' As of 2013, the orchestra has six female members; one of them, violinist Albena Danailova, became one of the orchestra 's concertmasters in 2008, the first woman to hold that position. In 2012, women made up 6 % of the orchestra 's membership. VPO president Clemens Hellsberg said the VPO now uses completely screened blind auditions. In 2013, an article in Mother Jones stated that while "(m) any prestigious orchestras have significant female membership -- women outnumber men in the New York Philharmonic 's violin section -- and several renowned ensembles, including the National Symphony Orchestra, the Detroit Symphony, and the Minnesota Symphony, are led by women violinists '', the double bass, brass, and percussion sections of major orchestras "... are still predominantly male. '' A 2014 BBC article stated that the "... introduction of ' blind ' auditions, where a prospective instrumentalist performs behind a screen so that the judging panel can exercise no gender or racial prejudice, has seen the gender balance of traditionally male - dominated symphony orchestras gradually shift. '' There are also a variety of amateur orchestras. Orchestras play a wide range of repertoire, ranging from 17th - century dance suites and 18th - century divertimentos to 20th - century film scores and 21st - century symphonies. Orchestras have become synonymous with the symphony, an extended musical composition in Western classical music that typically contains multiple movements which provide contrasting keys and tempos. Symphonies are notated in a musical score, which contains all the instrument parts. The conductor uses the score to study the symphony before rehearsals and decide on their interpretation (e.g., tempos, articulation, phrasing, etc.), and to follow the music during rehearsals and concerts, while leading the ensemble. Orchestral musicians play from parts containing just the notated music for their instrument. 
A small number of symphonies also contain vocal parts (e.g., Beethoven 's Ninth Symphony). Orchestras also perform overtures, a term originally applied to the instrumental introduction to an opera. During the early Romantic era, composers such as Beethoven and Mendelssohn began to use the term to refer to independent, self - existing instrumental, programmatic works that presaged genres such as the symphonic poem, a form devised by Franz Liszt in several works that began as dramatic overtures. These were "at first undoubtedly intended to be played at the head of a programme ''. In the 1850s the concert overture began to be supplanted by the symphonic poem. Orchestras also play with instrumental soloists in concertos. During concertos, the orchestra plays an accompaniment role to the soloist (e.g., a solo violinist or pianist) and, at times, introduces musical themes or interludes while the soloist is not playing. Orchestras also play during operas, ballets, some musical theatre works and some choral works (both sacred works such as Masses and secular works). In operas and ballets, the orchestra accompanies the singers and dancers, respectively, and plays overtures and interludes where the melodies played by the orchestra take centre stage. In the Baroque era, orchestras performed in a range of venues, including at the fine houses of aristocrats, in opera halls and in churches. Some wealthy aristocrats had an orchestra in residence at their estate, to entertain them and their guests with performances. During the Classical era, as composers increasingly sought out financial support from the general public, orchestra concerts were increasingly held in public concert halls, where music lovers could buy tickets to hear the orchestra. Of course, aristocratic patronage of orchestras continued during the Classical era, but this went on alongside public concerts. In the 20th and 21st century, orchestras found a new patron: governments. 
Many orchestras in North America and Europe receive part of their funding from national, regional - level governments (e.g., state governments in the U.S.) or city governments. These government subsidies make up part of orchestra revenue, along with ticket sales, charitable donations (if the orchestra is registered as a charity) and other fundraising activities. With the invention of successive technologies, including sound recording, radio broadcasting, television broadcasting and Internet - based streaming and downloading of concert videos, orchestras have been able to find new revenue sources. One of the "great unmentionable (topics) of orchestral playing '' is "faking '', the process by which an orchestral musician gives the "... impression of playing every note as written '', typically for a very challenging passage that is very high or very fast, while not actually playing the notes that are in the printed music part. An article in The Strad states that all orchestral musicians, even those in the top orchestras, occasionally "fake '' certain passages. One reason that musicians "fake '' is because there are not enough rehearsals. Another factor is the extreme challenges in 20th century and 21st century contemporary pieces; professionals interviewed by the magazine said "faking '' was "... necessary in anything from ten to almost ninety per cent of some modern works ''. Professional players who were interviewed agreed that faking may be acceptable when a part is not written well for the instrument, but faking "just because you have n't practised '' the music is not acceptable. The invention of the piston and rotary valve by Heinrich Stölzel and Friedrich Blühmel, both Silesians, in 1815, was the first in a series of innovations which impacted the orchestra, including the development of modern keywork for the flute by Theobald Boehm and the innovations of Adolphe Sax in the woodwinds, notably the invention of the saxophone. 
These advances would lead Hector Berlioz to write a landmark book on instrumentation, which was the first systematic treatise on the use of instrumental sound as an expressive element of music. The next major expansion of symphonic practice came from Richard Wagner 's Bayreuth orchestra, founded to accompany his musical dramas. Wagner 's works for the stage were scored with unprecedented scope and complexity: indeed, his score to Das Rheingold calls for six harps. Thus, Wagner envisioned an ever - more - demanding role for the conductor of the theatre orchestra, as he elaborated in his influential work On Conducting. This brought about a revolution in orchestral composition, and set the style for orchestral performance for the next eighty years. Wagner 's theories re-examined the importance of tempo, dynamics, bowing of string instruments and the role of principals in the orchestra. Conductors who studied his methods would go on to be influential themselves. As the early 20th century dawned, symphony orchestras were larger, better funded, and better trained than ever before; consequently, composers could compose larger and more ambitious works. The influence of Gustav Mahler was particularly innovational; in his later symphonies, such as the mammoth Symphony No. 8, Mahler pushes the furthest boundaries of orchestral size, employing huge forces. By the late Romantic era, orchestras could support the most enormous forms of symphonic expression, with huge string sections, massive brass sections and an expanded range of percussion instruments. With the recording era beginning, the standards of performance were pushed to a new level, because a recorded symphony could be listened to closely and even minor errors in intonation or ensemble, which might not be noticeable in a live performance, could be heard by critics. 
As recording technologies improved over the 20th and 21st centuries, eventually small errors in a recording could be "fixed '' by audio editing or overdubbing. Some older conductors and composers could remember a time when simply "getting through '' the music as best as possible was the standard. Combined with the wider audience made possible by recording, this led to a renewed focus on particular star conductors and on a high standard of orchestral execution. With the advent of the early music movement, smaller orchestras where players worked on execution of works in styles derived from the study of older treatises on playing became common. These include the Orchestra of the Age of Enlightenment, the London Classical Players under the direction of Sir Roger Norrington and the Academy of Ancient Music under Christopher Hogwood, among others. In the United States, the late 20th century saw a crisis of funding and support for orchestras. The size and cost of a symphony orchestra, compared to the size of the base of supporters, became an issue that struck at the core of the institution. Few orchestras could fill auditoriums, and the time - honored season - subscription system became increasingly anachronistic, as more and more listeners would buy tickets on an ad hoc basis for individual events. Orchestral endowments and -- more centrally to the daily operation of American orchestras -- orchestral donors have seen investment portfolios shrink or produce lower yields, reducing the ability of donors to contribute; further, there has been a trend toward donors finding other social causes more compelling. Also, while government funding is less central to American than European orchestras, cuts in such funding are still significant for American ensembles. Finally, the drastic falling - off of revenues from recording, tied to no small extent to changes in the recording industry itself, began a period of change that has yet to reach its conclusion. U.S. 
orchestras that have gone into Chapter 11 bankruptcy include the Philadelphia Orchestra (in April 2011), and the Louisville Orchestra, in December 2010; orchestras that have gone into Chapter 7 bankruptcy and have ceased operations include the Northwest Chamber Orchestra in 2006, the Honolulu Symphony in March 2011, the New Mexico Symphony Orchestra in April 2011, and the Syracuse Symphony in June 2011. The Festival of Orchestras in Orlando, Florida ceased operations at the end of March 2011. One source of financial difficulties that received notice and criticism was high salaries for music directors of US orchestras, which led several high - profile conductors to take pay cuts in recent years. Music administrators such as Michael Tilson Thomas and Esa - Pekka Salonen argued that new music, new means of presenting it, and a renewed relationship with the community could revitalize the symphony orchestra. The American critic Greg Sandow has argued in detail that orchestras must revise their approach to music, performance, the concert experience, marketing, public relations, community involvement, and presentation to bring them in line with the expectations of 21st - century audiences immersed in popular culture. It is not uncommon for contemporary composers to use unconventional instruments, including various synthesizers, to achieve desired effects. Many, however, find that a more conventional orchestral configuration provides better possibilities for color and depth. Composers like John Adams often employ Romantic - size orchestras, as in Adams ' opera Nixon in China; Philip Glass and others may be more free, yet still identify size - boundaries. Glass in particular has recently turned to conventional orchestras in works like the Concerto for Cello and Orchestra and the Violin Concerto No. 2. Along with a decrease in funding, some U.S. orchestras have reduced their overall personnel, as well as the number of players appearing in performances. 
The reduced numbers in performance are usually confined to the string section, since the numbers here have traditionally been flexible (as multiple players typically play from the same part). Conducting is the art of directing a musical performance, such as an orchestral or choral concert. The primary duties of the conductor are to set the tempo, ensure correct entries by various members of the ensemble, and to "shape '' the phrasing where appropriate. To convey their ideas and interpretation, a conductor communicates with their musicians primarily through hand gestures, typically though not invariably with the aid of a baton, and may use other gestures or signals, such as eye contact with relevant performers. A conductor 's directions will almost invariably be supplemented or reinforced by verbal instructions or suggestions to their musicians in rehearsal prior to a performance. The conductor typically stands on a raised podium with a large music stand for the full score, which contains the musical notation for all the instruments and voices. Since the mid-18th century, most conductors have not played an instrument when conducting, although in earlier periods of classical music history, leading an ensemble while playing an instrument was common. In Baroque music from the 1600s to the 1750s, the group would typically be led by the harpsichordist or first violinist (see concertmaster), an approach that in modern times has been revived by several music directors for music from this period. Conducting while playing a piano or synthesizer may also be done with musical theatre pit orchestras. Communication is typically non-verbal during a performance (this is strictly the case in art music, but in jazz big bands or large pop ensembles, there may be occasional spoken instructions, such as a "count in ''). However, in rehearsals, frequent interruptions allow the conductor to give verbal directions as to how the music should be played or sung. 
Conductors act as guides to the orchestras or choirs they conduct. They choose the works to be performed and study their scores, to which they may make certain adjustments (e.g., regarding tempo, articulation, phrasing, repetitions of sections, and so on), work out their interpretation, and relay their vision to the performers. They may also attend to organizational matters, such as scheduling rehearsals, planning a concert season, hearing auditions and selecting members, and promoting their ensemble in the media. Orchestras, choirs, concert bands and other sizable musical ensembles such as big bands are usually led by conductors. In the Baroque music era (1600 -- 1750), most orchestras were led by one of the musicians, typically the principal first violin, called the concertmaster. The concertmaster would lead the tempo of pieces by lifting his or her bow in a rhythmic manner. Leadership might also be provided by one of the chord - playing instrumentalists playing the basso continuo part, which was the core of most Baroque instrumental ensemble pieces. Typically, this would be a harpsichord player, a pipe organist or a lutenist or theorbo player. A keyboard player could lead the ensemble with his or her head, or by taking one of the hands off the keyboard to lead a more difficult tempo change. A lutenist or theorbo player could lead by lifting the instrument neck up and down to indicate the tempo of a piece, or to lead a ritard during a cadence or ending. In some works which combined choirs and instrumental ensembles, two leaders were sometimes used: a concertmaster to lead the instrumentalists and a chord - playing performer to lead the singers. During the Classical music period (ca. 1720 -- 1800), the practice of using chordal instruments to play basso continuo was gradually phased out, and it disappeared completely by 1800. 
Instead, ensembles began to use conductors to lead the orchestra 's tempos and playing style, while the concertmaster played an additional leadership role for the musicians, especially the string players, who imitate the bowstroke and playing style of the concertmaster, to the degree that is feasible for the different stringed instruments. In 1922, the idea of a conductor-less orchestra was revived in post-revolutionary Soviet Union. The symphony orchestra Persimfans was formed without a conductor, because the founders believed that the ensemble should be modeled on the ideal Marxist state, in which all people are equal. As such, its members felt that there was no need to be led by the dictatorial baton of a conductor; instead they were led by a committee, which determined tempos and playing styles. Although it was a partial success within the Soviet Union, the principal difficulty with the concept was in changing tempo during performances, because even if the committee had issued a decree about where a tempo change should take place, there was no leader in the ensemble to guide this tempo change. The orchestra survived for ten years before Stalin 's cultural politics disbanded it by taking away its funding. In Western nations, some ensembles, such as the Orpheus Chamber Orchestra, based in New York City, have had more success with conductorless orchestras, although decisions are likely to be deferred to some sense of leadership within the ensemble (for example, the principal wind and string players, notably the concertmaster). Others have returned to the tradition of a principal player, usually a violinist, being the artistic director and running rehearsal and leading concerts. Examples include the Australian Chamber Orchestra, Amsterdam Sinfonietta & Candida Thompson and the New Century Chamber Orchestra. 
As well, as part of the early music movement, some 20th and 21st century orchestras have revived the Baroque practice of having no conductor on the podium for Baroque pieces, using the concertmaster or a chord - playing basso continuo performer (e.g., harpsichord or organ) to lead the group. Some orchestral works specify that an offstage trumpet should be used or that other instruments from the orchestra should be positioned off - stage or behind the stage, to create a haunted, mystical effect. To ensure that the offstage instrumentalist (s) play in time, sometimes a sub-conductor will be stationed offstage with a clear view of the principal conductor. Examples include the ending of "Neptune '' from Gustav Holst 's The Planets. The principal conductor leads the large orchestra, and the sub-conductor relays the principal conductor 's tempo and gestures to the offstage musician (or musicians). One of the challenges with using two conductors is that the second conductor may get out of synchronization with the main conductor, or may mis - convey (or misunderstand) the principal conductor 's gestures, which can lead to the offstage instruments being out of time. In the late 20th century and early 21st century, some orchestras use a video camera pointed at the principal conductor and a closed - circuit TV set in front of the offstage performer (s), instead of using two conductors. The techniques of polystylism and polytempo music have led a few 20th and 21st century composers to write music where multiple orchestras or ensembles perform simultaneously. These trends have brought about the phenomenon of polyconductor music, wherein separate sub-conductors conduct each group of musicians. Usually, one principal conductor conducts the sub-conductors, thereby shaping the overall performance. 
Percy Grainger 's "The Warriors '' includes three conductors: the primary conductor of the orchestra, a secondary conductor directing an off - stage brass ensemble, and a tertiary conductor directing percussion and harp. Another example in late 20th - century orchestral music is Karlheinz Stockhausen 's Gruppen, for three orchestras, which are placed around the audience. This way, the "sound masses '' could be spatialized, as in an electroacoustic work. Gruppen was premiered in Cologne in 1958, conducted by Stockhausen, Bruno Maderna and Pierre Boulez. It has been performed by Simon Rattle, John Carewe and Daniel Harding.
hawkwind warrior on the edge of time songs
Warrior on the Edge of Time - wikipedia Warrior on the Edge of Time is Hawkwind 's fifth studio album. It reached number 13 on the U.K. album charts and was their third and last album to make the U.S. Billboard chart, where it peaked at number 150. Many of the lyrics are by Michael Moorcock and the album is loosely based on the concept of Moorcock 's ' Eternal Champion '. Reviews have been mixed, with Melody Maker panning the album and particularly criticizing the vocal work while the All Music Guide has praised the album for features such as the songwriting. Throughout 1974, Hawkwind heavily toured the UK, Europe and North America with their set being composed predominantly from that year 's Hall of the Mountain Grill album. Unusually for them, no new material had been introduced with the exception of some Michael Moorcock poems based on his Elric fictional character, which appeared on the 1974 live album The 1999 Party. In December through to February, the group embarked upon a series of UK dates known as "A Dead Singer '' tour after the Moorcock story published in the accompanying tour programme, with support from Dr Feelgood (Wilko Johnson: "Us and Hawkwind were a great bill. We had just been signed by United Artists, Hawkwind 's label. UA wanted to give us a little experience in the larger venues. That was where I first met and made friends with Lemmy, who turned out to be a good pal. ''). As the band owed one final single to United Artists to conclude their recording contract, during a mid-tour break they entered Olympic Studios on 5 and 6 January where they recorded Brock 's "Kings of Speed '' which featured lyrics written by Moorcock originally intended for inclusion on his New Worlds Fair album, Lemmy 's "Motorhead '' and House 's "Spiral Galaxy ''. The first two were selected for the A and B side respectively, and the single was released on 7 March. 
On resuming their UK tour, Brock expressed disillusionment with the band 's popularity commenting that "it 's getting to be like a war '', preferring his life with his wife Sylvie and their two children on their ten - acre Devon farm, trading under an alias in a community which knew nothing of his association with rock music. He revealed the growing disharmony within the band, "you would n't believe some of the scenes that go on backstage. All the fucking rows, people losing their temper. '' He was particularly critical of Turner on both a musical level ("Some nights I 've unplugged my guitar and marched across the stage to sort Nik out. He keeps playing the saxophone when I 'm singing and I 've told him a thousand times not to do that '') and personal level ("Nik 's really gullible, you know. He knows so many people and they always used to take him for a ride. It 's so easy because he 's not very sussed out ''). He was also critical of Lemmy listing a catalogue of on - stage problems with him, and he "lives that (Hells Angels) fantasy. It 's what he 'd like to be, but he ca n't '', but he 's "quite a good front man, though ''. Of the forthcoming Eternal Champion project, Brock revealed that he wanted Arthur Brown for the title role, and it would be "a complete fantasy trip on every level... and if we did it, that would be the end (of Hawkwind) ''. The next contract the group signed was a North American deal with Atlantic Records subsidiary Atco Records. With a scheduled North American tour for April and May, "Atlantic... needed an album to co-incide with our visit ''. For the only time in the 1970s, the group were due to record without having prepared new material in a live environment, which led to concern that "we 're going to be really pushed just to get an album together ''. The band entered Rockfield studios in March, King explaining "we laid all the backing tracks down in about three and a half days. 
Then, after we had a couple of days off, we went down to Olympic and added bits here and there, dubbed over vocals and mixed it all. That took about three days, and it was finished. '' The band "gave (the songs) their debut on two British gigs at Yeovil and Dunstable (12 and 13 April) '', then headed to North America for a tour at the end of April into May, during which Paul Rudolph replaced Lemmy. The album was released by ATCO on 9 May and licensed to United Artists for a UK release. The group promoted the album with tours in Germany and France in June, the UK in July and August including headlining the Reading Festival and appearing at Watchfield Free Festival. "Assault and Battery '' lyrics quote from Henry Wadsworth Longfellow 's poem "Psalm of Life ''. The song is a popular live number, being performed occasionally over the years, and has appeared on numerous live albums, sometimes under the title "Lives of Great Men ''. It was included as part of the live show for The Chronicle of the Black Sword concept, appearing on the album Live Chronicles. "The Golden Void '' segues from "Assault and Battery '', and the two songs are often performed live as a pair as on the albums Palace Springs (1991) and Canterbury Fayre 2001. The song is a popular live number, being performed occasionally over the years, and has appeared on numerous live albums, sometimes under the title "Void of Golden Light '', as on 1994 's The Business Trip. "The Wizard Blew His Horn '', "Standing at the Edge '' and "Warriors '' are Michael Moorcock poems based on his Eternal Champion literary figure. The poems are recited to atmospheric soundscapes provided by Simon House, and the percussionists Simon King and Alan Powell. The band had been performing them on stage during 1974, versions appearing on The 1999 Party live album. 
"Opa - Loka '' is an instrumental that features a motorik rhythm and is strongly influenced by the music of Neu!, the title possibly being a reference to Opa - locka, Florida. It was performed live, but when Robert Calvert joined the band at the beginning of 1976, he would recite the poem "Vikings on Mars '' over the top of it, the song evolving into "Uncle Sam 's on Mars '' on the 1979 album PXR5. "The Demented Man '' is a Brock acoustic number. (Also listed as "The Demented King ''.) The lyrics of "Magnu '' are based upon Percy Shelley 's poem "Hymn of Apollo ''. The song is a popular live number, being performed occasionally over the years, with versions on the albums Choose Your Masques: Collectors Series Volume 2 (1982), The Friday Rock Show Sessions (1986) and Canterbury Fayre 2001. "Spiral Galaxy 28948 '' is a Simon House instrumental, the title being his date of birth (28 September 1948). It was performed live in 1975 after the release of the album, and again during 2001 when House had temporarily rejoined the band, a version appearing on the album Canterbury Fayre 2001. The original album sleeve unfolds into a large shield shape, revealing that the silhouetted Warrior is standing at the edge of an apparently bottomless chasm. The landscape on the other side of the chasm is a mirror image, with another setting sun; a closer inspection of the entire image reveals a helmeted face. The reverse of the cover depicts a bronze shield bearing the 8 - rayed emblem of Chaos, as described in Moorcock 's books. Allan Jones in Melody Maker (10 May 1975) was critical in his review of the album despite it being "probably Hawkwind 's most professional record '' due to the advance in their "technical proficiency '', specifically the contributions of Simon House. 
The compositions are in the "standard Hawkwind traditions of sweeping synthesiser passages contrasting ethereal space with the violence of monotonous bass and rhythm guitar '', and of the poems he says "If Moorcock feels qualified to describe any of these pieces as poetry, then that 's his problem '' and that they are delivered "with all the emotion of Davros being exterminated by renegade Daleks ''. Geoff Barton in Sounds assessed it as "includ (ing) most of their traditional characteristics (leaden guitar, ritualistic chanting, wailing moogs, SF lyrics) but in a much more mature and varied setting '', and that "Simon House 's influence is strongly felt '' making it "rather fuller, more interesting than usual ''. Michael Moorcock: "Warrior On The Edge Of Time was a concept of mine. What Dave tends to do is he says ' Do us a concept ' or ' I 've got this rough concept, can you work it out? ' I do it, then Dave has a different idea and the whole thing shifts away, so that 's the way it works. It 's a perfectly good way of working -- it tends to give Dave a bit of a start or whatever. I was doing a lot of my ' Eternal Champion ' stuff on stage, so it seemed automatic to do that because there were so many numbers I could fit into that. I was only in the studio about an hour to do the stuff I did, and it was one of those weird things I did n't get the session fee either. '' Lemmy: "The album was a fuck - up from start to finish. That ' Opa - Loka ' was a lot of fucking rubbish. I was n't even on that. That was the drummer 's thing, that track... We were kind of complacent anyway. If you have a hit album, you 're complacent, and if you have two you really are in trouble. With them, they had four, ' cos they had In Search of Space before me... There 's great stuff on all them albums. ' The Golden Void ' was a beautiful track, but by then I was well out of favour. '' Dave Brock: "There was some good stuff on that album. I think we peaked then, in 1974 / 75. 
'' Simon King: "I suppose I 'm two - thirds happy with this one. For me that 's not bad as I was only half happy with the last one! Warriors is a different musical thing because it 's Simon House 's first real contribution: on Mountain Grill he was too new to be able to have that much influence, and now, of course we 've got Alan as a second drummer, which has meant a lot of changes. '' In a 2011 interview Nigel Reeve, custodian of Hawkwind 's United Artists Records archive at EMI, explained that Warrior on the Edge of Time had originally been released on a separate contract with United Artists, and its rights were no longer held by EMI, hence it was omitted from EMI 's remastering and release of Hawkwind 's United Artists catalogue in 1996. The album was released in the UK on CD for the first time in 1992 on the Dojo label, mastered from vinyl. A second version was released in 1993 on the Canadian label Griffin Music, mastered from a first - generation copy of the original master. This master was the Atco tape used for the 1975 North American vinyl release and included the single mix of "Kings of Speed ''. The Atco master used by Griffin was originally created at Olympic Studios and did n't have any fades on the tracks. A set of accompanying notes written by Dave Brock in 1975 were used to recreate the original fades when Griffin created their digital master. No EQ was used when the Griffin digital master was created. The transfer was done to match the original vinyl as closely as possible. In May 2013, Cherry Red reissued the album, along with a new stereo and 5.1 mix by Steven Wilson, on the Atomhenge label managed by Esoteric Recordings. It was also confirmed that the original master tapes were used.
what law covers all conservation issues in new zealand
Conservation in New Zealand - wikipedia Conservation in New Zealand has a history associated with both Māori and Europeans. Both groups of people caused a loss of species and both altered their behaviour to a degree after realising their effect on indigenous flora and fauna. New Zealand has fourteen national parks, thirty - one marine reserves and many other protected areas for the conservation of biodiversity. The introduction of many invasive species is threatening the indigenous biodiversity, since the geographical isolation of New Zealand led to the evolution of plants and animals that did not have traits to protect against predation. New Zealand has a high proportion of endemic species, so pest control is generally regarded as a high priority. The New Zealand Department of Conservation administers approximately 30 % of New Zealand 's land, along with less than 1 % of the country 's marine environment, for conservation and recreational purposes. It has published lists, under the New Zealand Threat Classification System, of flora and fauna that are at risk or declining, which are included in national and regional plans. The Conservation Act 1987 is New Zealand 's principal legislation concerning the conservation of indigenous biodiversity. The Act established the Department of Conservation, Fish and Game, and complements the National Parks Act 1980 and the Reserves Act 1977. The black robin was saved from the brink of extinction by a conservation effort led by Don Merton of the New Zealand Wildlife Service. However, all black robins that survive today are descended from a single female, so the species has little genetic diversity. The two subspecies of saddleback had each been reduced to a small population on a single island: Hen Island for the North Island saddleback, and Big South Cape Island, off Stewart Island / Rakiura, for the South Island saddleback. 
After a programme of translocation to other predator - free island reserves, the population of the South Island saddleback has increased from 36 birds to over 1,200 birds on 15 islands. The North Island subspecies has increased from 500 birds to over 6,000 birds on 12 islands. This has taken both subspecies from critically endangered on the IUCN Red List to near threatened for the South Island saddleback and least concern for the North Island saddleback. The Brown teal recovery program has successfully improved the population status from endangered to near threatened on the IUCN Red List. Most of the current 11.9 million hectares of agricultural land has been cleared, representing around 44 % of the total land area of New Zealand. Initial attempts to decrease the scale of further deforestation, such as the Forestry Rights Registration Act 1983, which created ' forestry rights ', are argued to have been only moderately successful. However, they created world - class structures of data collection and property rights that made way first for a 1993 amendment to the 1949 Forests Act and later for the Climate Change Response Act 2002. New Zealand 's patterns of greenhouse gas emissions are similar to those of Scandinavian countries, in that land use, land use change and forestry are amongst the most significant contributors. Forestry came to be seen as the main tool in meeting New Zealand 's Kyoto Protocol targets. Accordingly, REDD programmes (reducing emissions from deforestation and forest degradation) were implemented, whereby reforestation and deforestation were tied to carbon emission credits traded under an emissions trading scheme (ETS), and commercial carbon - sink forests were planted. Perhaps due to the government 's initial control over REDD and the trade in carbon credits, there was initially an increase in deforestation, and it was not until private forestry owners gained access to the trading scheme and to carbon credits that the scheme started to produce reductions in deforestation. 
During the relatively short occupation of New Zealand by humans, a large number of species have been made extinct due to predation by introduced species, hunting, and the loss of habitat. Many extant species are under threat because of past and ongoing human activities. One example is the Cromwell chafer beetle (Prodontria lewisi), which is on the IUCN Red List of critically endangered species. A reserve was created in 1983 to protect its habitat. More recent examples are the Hector 's and Maui 's dolphins, which are under threat from the fishing industry. The use of 1080 poison (sodium fluoroacetate) is a contentious issue. 1080 is used with carrots and cereal pellets to control the common brushtail possum, an introduced animal pest. As well as government funding for conservation efforts, money also comes from numerous NGOs and private individuals. The Nature Heritage Fund and the Community Conservation Fund are both government - funded. Conservation organisations began to form from the 19th century. Scenery Preservation Societies formed in some of the Provinces. An early conservation lobby group was the Royal Forest and Bird Protection Society of New Zealand, which is now the foremost environmental organisation involved in conservation advocacy in New Zealand. In recent years, numerous conservation, landcare and activist groups have formed.
how many warehouses does amazon have in the united states
List of Amazon locations - wikipedia This is a list of locations in which American corporation Amazon does business. Amazon 's global headquarters are in 14 buildings in Seattle 's South Lake Union neighborhood, developed primarily by Vulcan, Inc. from 2008 onward. The first 11 buildings were acquired from Vulcan in 2012 at a cost of $1.16 billion. The company was previously headquartered in rented space within the Pacific Medical Center, located in the city 's Beacon Hill neighborhood, from 1998 to 2011. Amazon is currently building a new three - tower complex in Seattle 's Denny Triangle neighborhood to serve as its new headquarters. The plan, designed by NBBJ and named "Rufus 2.0 '' after a dog who was part of the company in its early days, was approved by the city of Seattle in 2012 and construction began the year after. The first of the towers, nicknamed Doppler, opened on December 14, 2015. The European headquarters are in Luxembourg 's capital, Luxembourg City. While much of Amazon 's software development occurs in Seattle, the company employs software developers in centers across the globe. Some of these sites are run by an Amazon subsidiary called A2Z Development. Fulfillment centers are located in the following cities, often named after an International Air Transport Association airport code. Amazon Fulfillment centers can also provide warehousing and order - fulfillment for third - party sellers, for an extra fee. Third - party sellers can use Fulfillment by Amazon (FBA) to sell on other platforms as well, such as eBay or their own websites. Warehouses are large and each has hundreds of employees, sometimes thousands. Employees are responsible for five basic tasks: unpacking and inspecting incoming goods; placing goods in storage and recording their location; picking goods from their computer recorded locations to make up an individual shipment; sorting and packing orders; and shipping. 
A computer that records the location of goods and maps out routes for pickers plays a key role: employees carry hand - held computers which communicate with the central computer and monitor their rate of progress. A picker may walk 10 or more miles a day. In the newer fulfillment centers, items are stored on pods and brought to pickers by robots (Kiva Systems). In the United Kingdom, initial staffing was provided by Randstad Holding and other temporary employment agencies. In the United States, many workers are hired as Amazon employees and granted shares of stock, while others are offered temporary seasonal positions. "When we have permanent positions available, we look to the top performing temporary associates to fill them, '' said an Amazon spokesperson. Development of a high level of automation is anticipated in the future following Amazon 's 2012 acquisition of Kiva Systems, a warehouse automation company. These US distribution centers have been closed: SDC Seattle Distribution Center, located in Georgetown, just south of downtown Seattle; Red Rock, Nevada; Chambersburg, Pennsylvania; Munster, Indiana; and McDonough, Georgia. From 2000 until February 2001, there was an Amazon customer service center based in The Hague, Netherlands.
who wrote the song when two worlds collide
When Two Worlds Collide - wikipedia When Two Worlds Collide is an album by Jerry Lee Lewis, released on Elektra Records in 1980. When Two Worlds Collide was Lewis 's second album after leaving Mercury Records and peaked at number 32 on the Billboard country albums chart. The title track was released as a single, making it to number 11, while the Jerry Chestnut song "Honky Tonk Stuff '' reached number 28. Lewis had previously recorded "Who Will Buy the Wine '' with Sam Phillips at Sun Records. The period leading up to the recording had been a difficult one for Lewis. In July 1979, his father died of cancer and, two months later, he was arrested for possession of pills prescribed by Dr. George C. Nichopoulos, the infamous "Dr. Nick '' who had also prescribed pills to Elvis Presley. (In the 2014 authorized biography Jerry Lee Lewis: His Own Story, Lewis would call Dr. Nick "a good man, a remarkable man. '') The IRS was also after his assets and he was in poor health from a lifetime of excess. He went on an ill - advised British tour, appearing on The Old Grey Whistle Test, and then came home to participate in a television special with Mickey Gilley called A Family Affair, looking gaunt as the cousins played side by side at the piano.
does old bay seasoning have salt in it
Old Bay seasoning - wikipedia Old Bay Seasoning is a blend of herbs and spices that is marketed in the United States by McCormick & Company, and produced in Maryland. It is produced in the Chesapeake Bay area, where it was developed by German immigrant Gustav Brunn in 1939, and where the seasoning is very popular to this day. At that time, crabs were so plentiful that bars in Baltimore, Maryland, offered them free, and salty seasonings like Old Bay were created to encourage patrons to purchase more beverages. Old Bay is just one of many crab seasonings created during that era, yet it is one of only a few that survived. Notable others are J.O. Spice and Baltimore Spice. According to the ingredients list, the seasoning mix includes celery salt, black pepper, crushed red pepper flakes, and paprika. Other spices are used, but are not specified. It is regionally popular, specifically in Maryland, the Mid-Atlantic States, the Southern States, and parts of New England and the Gulf Coast. Due to the strong presence of the United States Navy in Maryland and Virginia, it is a common fixture in galleys aboard navy ships. Otherwise, it can be made at home, with instructions on its manufacture readily available. Old Bay Seasoning is named after the Old Bay Line, a passenger ship line that plied the waters of the Chesapeake Bay from Baltimore to Norfolk, Virginia, in the early 1900s. Gustav Brunn 's company became the Old Bay Company in 1939, the year he fled Nazi Germany, producing crab seasonings in the unique yellow can container until the company was purchased by McCormick & Co in 1990. McCormick continued to offer Old Bay in the classic yellow can. According to Gustav Brunn, he had worked for McCormick for a week before starting his own spice business. He claimed that he was fired when McCormick learned that he was Jewish. 
McCormick has a number of other products under the Old Bay banner, including seasoning packets for crab cakes, salmon patties and tuna, tartar sauce, cocktail sauce, and seafood batter mix. They also make other seasoning blends that mix Old Bay seasoning with garlic, lemon, brown sugar, herbs and blackened seasonings. McCormick has offered a lower - sodium version of Old Bay Seasoning. The seasoning is chiefly used to season crab and shrimp. It is also used in various clam chowder and oyster stew recipes. The seasoning is also used as a topping on popcorn, salads, eggs, fried chicken, french fries, tater tots, corn on the cob, boiled peanuts, dips, chipped beef, baked potatoes, potato salad, and potato chips. Several movie theaters in the Chesapeake region offer it in the condiment section. Potato chip manufacturer Utz created the original "Crab Chip '' based on an analogous spice mix. The popular potato chip variety was later copied and marketed by Herr 's (however, Herr 's uses the Old Bay seasoning and it is sold as "Herr 's Old Bay Chips ''). Early in its history, the Subway sandwich shop used Old Bay when mixing their seafood and crab salad. Many local Subway shops in the Northeastern states still have Old Bay for use on sandwiches. Old Bay is also occasionally used around the Chesapeake Bay region as an ingredient in Bloody Marys, and as far south as The Breakers Hotel in Palm Beach, Florida. Old Bay Seasoning is available at every Boardwalk Burgers and Fries restaurant. In 2014, the Maryland - based brewery Flying Dog created an Old Bay - inspired summer ale named Dead Rise to celebrate the seasoning 's 75th anniversary.
what's the population of chiang mai thailand
Chiang Mai - wikipedia Chiang Mai (/ ˈtʃjɑːŋˈmaɪ /, from Thai: เชียงใหม่ (tɕhiəŋ màj), Lanna: ᨩ᩠ᨿᨦᩉ᩠ᨾᩲ᩵ (t͡ɕīaŋ. màj)), sometimes written as "Chiengmai '' or "Chiangmai '', is the largest city in northern Thailand. It is the capital of Chiang Mai Province and was a former capital of the kingdom of Lan Na (1296 -- 1768), which later became the Kingdom of Chiang Mai, a tributary state of Siam from 1774 to 1899, and finally the seat of princely rulers until 1939. It is 700 km (435 mi) north of Bangkok and is situated amongst the highest mountains in the country. The city sits astride the Ping River, a major tributary of the Chao Phraya River. Chiang Mai means "New City '' and was so named because it became the new capital of Lan Na when it was founded in 1296, succeeding Chiang Rai, the former capital founded in 1262. In May 2006 Chiang Mai was the site of the Chiang Mai Initiative, concluded between the Association of Southeast Asian Nations and the "ASEAN + 3 '' countries (China, Japan, and South Korea). Chiang Mai was one of three Thai cities contending for Thailand 's bid to host the World Expo 2020; the others were Chonburi and Ayutthaya. Ayutthaya, however, was the city ultimately chosen by the Thai Parliament to register for the international competition. In early December 2017, Chiang Mai was awarded the UNESCO title of Creative City. However, its 2015 application for UNESCO Heritage City status is still under consideration, with the UNESCO committee 's main concern being the ongoing conflicts between local business interests and the native - born residents ' passion for preserving their traditional way of life and cultural environment. Chiang Mai was one of two tourist destinations in Thailand on TripAdvisor 's 2014 list of "25 Best Destinations in the World '', where it stands at number 24. Chiang Mai 's historic importance is derived from its close proximity to the Ping River and major trading routes. 
While officially the city (thesaban nakhon, "city municipality '') of Chiang Mai only covers most parts of the Mueang Chiang Mai District with a population of 160,000, the city 's sprawl extends into several neighboring districts. The Chiang Mai metropolitan area has a population of nearly one million people, more than half the total of Chiang Mai Province. The city is subdivided into four khwaeng (electoral wards): Nakhon Ping, Srivijaya, Mengrai, and Kawila. The first three are on the west bank of the Ping River, and Kawila is on the east bank. Nakhon Ping District includes the northern part of the city. Srivijaya, Mengrai, and Kawila consist of the western, southern, and eastern parts, respectively. The city center -- within the city walls -- is mostly within Srivijaya ward. The Ping River, one of the main tributaries of the Chao Phraya River, originates at Doi Thuai, in the mountains of the Daen Lao Range in Chiang Dao District. The river, the largest in the region, runs from north to south, forming a river basin east of Chiang Mai. Mae Ping River also served as the route of trade and communication between Chiang Mai and its controlled states in Lanna, as well as the outside world. Mangrai founded Chiang Mai in 1294 or 1296 on the site of an older city of the Lawa people called Wiang Nopburi. Gordon Young, in his 1962 book The Hill tribes of Northern Thailand, mentions how a Wa chieftain in British Burma told him that the Wa, a people who are closely related to the Lawa, once lived in the Chiang Mai valley in "sizeable cities ''. Chiang Mai succeeded Chiang Rai as the capital of Lan Na. Pha Yu enlarged and fortified the city, and built Wat Phra Singh in honor of his father Kham Fu. The ruler was known as the chao. 
The city was surrounded by a moat and a defensive wall since the nearby Taungoo Dynasty of the Bamar people was a constant threat, as were the armies of the Mongol Empire, which only decades earlier had conquered most of Yunnan, China, and in 1292 overran the bordering Dai kingdom of Chiang Hung. With the decline of Lan Na, the city lost importance and was occupied by the Taungoo in 1556. Chiang Mai formally became part of the Thonburi Kingdom in 1775 by an agreement with Chao Kavila, after the Thonburi king Taksin helped drive out the Taungoo Bamar. Because of Taungoo counterattacks, Chiang Mai was abandoned between 1776 and 1791. Lampang then served as the capital of what remained of Lan Na. Chiang Mai then slowly grew in cultural, trading, and economic importance to its current status as the unofficial capital of Northern Thailand, second in importance only to Bangkok. The modern municipality dates to a sanitary district (sukhaphiban) that was created in 1915. It was upgraded to a municipality (thesaban) on 29 March 1935, as published in the Royal Gazette, Book No. 52 section 80. First covering just 17.5 km² (7 sq mi), the city was enlarged to 40.2 km² (16 sq mi) on 5 April 1983. The city emblem shows the stupa at Wat Phra That Doi Suthep in its center. Below it are clouds representing the moderate climate in the mountains of northern Thailand. There is a nāga, the mythical snake said to be the source of the Ping River, and rice stalks, which refer to the fertility of the land. Chiang Mai has a tropical savanna climate (Köppen Aw), tempered by the low latitude and moderate elevation, with warm to hot weather year - round, though nighttime conditions during the dry season can be cool and much lower than daytime highs. The maximum temperature ever recorded was 42.4 ° C (108.3 ° F) in May 2005. A continuing environmental issue in Chiang Mai is the incidence of air pollution that primarily occurs every year towards the end of the dry season between February and April. 
In 1996, speaking at the Fourth International Network for Environmental Compliance and Enforcement conference -- held in Chiang Mai that year -- Governor Virachai Naewboonien invited guest speaker Dr. Jakapan Wongburanawatt, Dean of the Social Science Faculty of Chiang Mai University, to discuss air pollution efforts in the region. Dr. Wongburanawatt stated that, in 1994, an increasing number of city residents attended hospitals suffering from respiratory problems associated with the city 's air pollution. During the February -- March period, air quality in Chiang Mai often remains below recommended standards, with fine - particle dust levels reaching twice the standard limits. According to the Bangkok Post, corporations in the agricultural sector, not farmers, are the biggest contributors to smoke pollution. The main source of the fires is forested area being cleared to make room for new crops. The new crops to be planted after the smoke clears are not rice and vegetables to feed locals. A single crop is responsible: corn. The haze problem began in 2007 and has been traced at the local level and at the macro-market level to the growth of the animal feed business. "The true source of the haze... sits in the boardrooms of corporations eager to expand production and profits. A chart of Thailand 's growth in world corn markets can be overlaid on a chart of the number of fires. It is no longer acceptable to scapegoat hill tribes and slash - and - burn agriculture for the severe health and economic damage caused by this annual pollution. '' These data have been ignored by the government. The end is not in sight, as the number of fires has increased every year for a decade, and data show more pollution in late February 2016 than in late February 2015. The northern centre of the Meteorological Department has reported that low - pressure areas from China trap forest fire smoke in the mountains along the Thai - Myanmar border. 
Research conducted between 2005 and 2009 showed that average PM10 rates in Chiang Mai during February and March were considerably above the country 's safety level of 120 μg / m³, peaking at 383 μg / m³ on 14 March 2007. According to the World Health Organization (WHO), the acceptable level is 50 μg / m³. To address the increasing amount of greenhouse gas emissions from the transport sector in Chiang Mai, the city government has advocated the use of non-motorised transport (NMT). In addition to its potential to reduce greenhouse gas emissions, the NMT initiative addresses other issues such as traffic congestion, air quality, income generation for the poor, and the long - term viability of the tourism industry. It has been said that smoke pollution has made March "the worst month to visit Chiang Mai ''. Chiang Mai has over 300 Buddhist temples ("wat '' in Thai), including Wat Chedi Luang and Wat Phra That Doi Suthep. Chiang Mai hosts many Thai festivals. The inhabitants speak Northern Thai, also known as Lanna or Kham Mueang. The script used to write this language, called the Tai Tham alphabet, is studied only by scholars, and the language is commonly written with the standard Thai alphabet. English is used in hotels and travel - related businesses. Khan tok is a century - old Lan Na Thai tradition in Chiang Mai. It is an elaborate dinner or lunch offered by a host to guests at various ceremonies or parties, such as weddings, housewarmings, celebrations, novice ordinations, or funerals. It can also be held in connection with celebrations for specific buildings in a Thai temple and during Buddhist festivals such as Khao Pansa, Og Pansa, Loi Krathong, and Thai New Year (Songkran). 
Chiang Mai has several universities, including Chiang Mai University, Chiang Mai Rajabhat University, Rajamangala University of Technology Lanna, Payap University, Far Eastern University, and Maejo University, as well as numerous technical and teacher colleges. Chiang Mai University was the first government university established outside of Bangkok. Payap University was the first private institution in Thailand to be granted university status. A number of bus stations link the city to central, southeast, and northern Thailand. The central Chang Puak Terminal (north of Chiang Puak Gate) provides local services within Chiang Mai Province. The Chiang Mai Arcade bus terminal northeast of the city centre (which can be reached with a songthaew or tuk - tuk ride) provides services to over 20 other destinations in Thailand including Bangkok, Pattaya, Hua Hin, and Phuket. There are several services a day from Chiang Mai Arcade terminal to Mo Chit Station in Bangkok (a 10 - to 12 - hour journey). The state railway operates 10 trains a day to Chiang Mai Station from Bangkok. Most journeys run overnight and take approximately 12 -- 15 hours. Most trains offer first - class (private cabins) and second - class (seats fold out to make sleeping berths) service. Chiang Mai is the northern terminus of the Thai railway system. Chiang Mai International Airport receives up to 28 flights a day from Bangkok (flight time about 1 hour 10 minutes) and also serves as a local hub for services to other northern cities such as Chiang Rai, Phrae, and Mae Hong Son. International services also connect Chiang Mai with other regional centers, including cities in other Asian countries. The locally preferred form of transport is personal motorbike and, increasingly, private car. Local public transport is via tuk - tuk, songthaew, or rickshaws. As population density continues to grow, greater pressure is placed upon the city 's transportation system. 
During peak hours, the road traffic is often badly congested. City officials as well as researchers and experts have been trying to find feasible solutions to tackle the city 's traffic problems. Most of them agree that factors such as a lack of public transport, an increasing number of motor vehicles, inefficient land use planning, and urban sprawl have led to these problems. The latest development is that the Mass Rapid Transit Authority of Thailand (MRTA) has approved a draft decree on the light railway transit system project in Chiang Mai. If the draft is approved by the Thai cabinet, construction could begin in 2020 and be completed by 2023. It is believed that such a system would mitigate Chiang Mai 's traffic problems to a large degree. In February 2017, the Digital Economy Promotion Agency (DEPA) (under Thailand 's Digital Economy and Society Ministry) announced that 36.5 million baht would be invested into developing Chiang Mai into an innovation - driven "smart city ''. Chiang Mai was the second city in Thailand, after Phuket and along with Khon Kaen, to be developed using the "smart city '' model. The model aims to capture and populate multiple levels of information (including building, social, environmental, governmental, and economic data) from sources like sensors, real - time traffic information, and social forums for access by managers, governments, and citizens using mobile apps, tablets, and dashboards. The "Smart City '' outlook (integrating Information and Communications Technology (ICT) with the Internet of Things (IOT)) is viewed as critical both for secondary cities with burgeoning urban populations like Chiang Mai, and as part of Thailand 's move to be the digital hub of ASEAN. The role of private sector investment, together with public sector partnership, is key to promoting digital entrepreneurship. 
Prosoft Comtech, a Thai software company, has spent 300 million baht to build its own "Oon IT Valley '' on a 90 rai plot of land as a community for tech start - ups, Internet of Things technology, software programmers and business process outsourcing services. It aims both to increase the size of Chiang Mai 's digital workforce and to attract foreign digital talent to Chiang Mai. In January 2018, it was announced that Chiang Mai would be launching "Mobike In '', a bike - sharing app that would see the introduction of some 500 smart bikes on the streets. The smart bikes would be available for use by both locals and tourists. It is reported that, as a start, the bikes would be placed at convenient locations including the Three Kings monument, Tha Pae Gate and Suan Buak Haad park, as well as in the old town. The "Mobike In '' project is sponsored by Advanced Info Service (Thailand 's largest mobile phone operator), in collaboration with the Tourism Authority of Thailand (Chiang Mai Office), together with local universities and the public and private sectors. The project aims to promote non-motorised transportation and support eco-tourism. Speaking at the launch at the Lanna Folklife Museum, Deputy Governor Puttipong Sirimart stated that the introduction of such "smart transportation '' was a positive move in Chiang Mai 's transformation into a "Smart City '' (part of the "Thailand 4.0 '' vision). Phongsak Ariyajitphaisal, DEPA 's Chiang Mai branch manager, stated that one of the areas its smart city initiative would be promoting was "smart agriculture ''. Eighty percent of Chiang Mai Province 's population are farmers, mostly small - scale, and increasing productivity through the use of ICT has the potential to improve the local economy and living standards. DEPA has also provided funding to Chiang Mai 's Maejo University to develop wireless sensor systems for better farmland irrigation techniques, to reduce the use of water sprinklers and increase productivity. 
The university is also developing agricultural drones that can assist in spraying fertilizers and pesticides in agricultural areas, which, if successful, will result in labour cost and time savings. The drones also aim to detect and monitor fires and smoke pollution. Under the 2011 IBM "Smarter Cities Challenge '', IBM experts recommended smarter food initiatives focused on creating insight into agriculture data for farmers, including price modelling, farmer - focused weather forecasting tools, an e-portal to help farmers align crop production with demand, as well as branding of Chiang Mai produce. Longer - term recommendations included implementing traceability, enabling the tracking of produce from farm to consumer, smarter irrigation, as well as flood control and early warning systems. As part of the smart city project supported by IBM, Chiang Mai is also looking to use technology to boost its medical tourism hub ambitions. In 2011, IBM launched its Smarter Cities Challenge, a three - year, 100 - city, 1.6 billion baht (US $50 million) program where teams of experts study and make detailed recommendations to address important local urban issues. Chiang Mai won the grant (of about US $400,000) in 2011. The IBM team focused on smarter healthcare initiatives, aimed at making Chiang Mai and the University Medical Clinic a medical hub, as well as improving the efficiency of hospitals for improved service delivery. For example, healthcare providers could use real - time location tracking of patients and hospital assets to increase efficiency and build an internationally recognised service identity. Electronic medical record technology can also be adopted to standardise information exchanges to link all medical service providers, even including traditional medicine and spas. Similar ideas include linking patient databases and healthcare asset information. 
In partnership with the Faculty of Medicine at Chiang Mai University, the team of experts aims to enhance the quality of medical care available to the community, both urban and rural, as well as develop Chiang Mai into a centre for medical tourism with the infrastructure for supporting international visitors seeking long - term medical care. As the largest city in northern Thailand, Chiang Mai already receives some long - stay healthcare visitors, largely Japanese. Its main advantage over Bangkok is a lower cost of living. Quality services at low prices are a major selling point in mainstream healthcare, dental and ophthalmologic care, as well as Thai traditional medicine. Its local university is also developing specializations in robotic surgery and geriatric medicine to accommodate a future aging population. DEPA also reported that it has developed a mobile app that uses augmented reality technology to showcase various historical attractions in Chiang Mai, in line with the government 's policy to promote Chiang Mai as a world heritage city. According to Thailand 's Tourist Authority, in 2013 Chiang Mai had 14.1 million visitors: 4.6 million foreigners and 9.5 million Thais. In 2016, tourist arrivals were expected to grow by approximately 10 percent to 9.1 million, with Chinese tourists increasing by seven percent to 750,000 and international arrivals by 10 percent to 2.6 million. Tourism in Chiang Mai has been growing annually by 15 percent per year since 2011, mostly due to Chinese tourists, who account for 30 percent of international arrivals. Chiang Mai is estimated to have 32,000 -- 40,000 hotel rooms and Chiang Mai International Airport (CNX) is Thailand 's fourth largest airport, after Suvarnabhumi (BKK) and Don Mueang (DMK) in Bangkok, and Phuket (HKT). The Thailand Convention and Exhibition Bureau (TCEB) aims to market Chiang Mai as a global MICE city as part of a five - year plan. 
The TCEB forecasts revenue from MICE to rise by 10 percent to 4.24 billion baht in 2013 and the number of MICE travellers to rise by five percent to 72,424. The influx of tourists has put a strain on the city 's natural resources. Faced with rampant unplanned development, air and water pollution, waste management problems, and traffic congestion, the city has launched a non-motorised transport (NMT) system. The initiative, developed by a partnership of experts and with support from the Climate & Development Knowledge Network, aims to reduce greenhouse gas emissions and create employment opportunities for the urban poor. The climate compatible development strategy has gained support from policy - makers and citizens alike as a result of its many benefits. Chiang Mai has agreements with four sister cities.
who is credited with having published the first research article in social psychology
Social psychology - wikipedia Social psychology is the scientific study of how people 's thoughts, feelings, and behaviors are influenced by the actual, imagined, or implied presence of others. In this definition, scientific refers to the empirical investigation using the scientific method. The terms thoughts, feelings, and behaviors refer to psychological variables that can be measured in humans. The statement that others ' presence may be imagined or implied suggests that humans are malleable to social influences even when alone, such as when watching television or following internalized cultural norms. Social psychologists typically explain human behavior as a result of the interaction of mental states and social situations. Social psychologists examine factors that cause behaviors to unfold in a given way in the presence of others. They study conditions under which certain behavior, actions, and feelings occur. Social psychology is concerned with the way these feelings, thoughts, beliefs, intentions, and goals are cognitively constructed and how these mental representations, in turn, influence our interactions with others. Social psychology traditionally bridged the gap between psychology and sociology. During the years immediately following World War II there was frequent collaboration between psychologists and sociologists. The two disciplines, however, have become increasingly specialized and isolated from each other in recent years, with sociologists focusing on "macro variables '' (e.g., social structure) to a much greater extent than psychologists. Nevertheless, sociological approaches to psychology remain an important counterpart to psychological research in this area. In addition to the split between psychology and sociology, there has been a somewhat less pronounced difference in emphasis between American social psychologists and European social psychologists. 
As a generalization, American researchers traditionally have focused more on the individual, whereas Europeans have paid more attention to group level phenomena (see group dynamics). Although there were some older writings about social psychology, such as those by Islamic philosopher Al - Farabi (Alpharabius), the discipline of social psychology, in its modern - day form, began in the United States at the beginning of the 20th century. By that time, though, the discipline had already developed a significant foundation. Following the 18th century, those in the emerging field of social psychology were concerned with developing concrete explanations for different aspects of human nature. They attempted to discover concrete cause and effect relationships that explained the social interactions in the world around them. In order to do so, they believed that the scientific method, an empirically based scientific measure, could be applied to human behavior. The first published study in this area was an experiment in 1898 by Norman Triplett, on the phenomenon of social facilitation. During the 1930s, many Gestalt psychologists, most notably Kurt Lewin, fled to the United States from Nazi Germany. They were instrumental in developing the field as something separate from the behavioral and psychoanalytic schools that were dominant during that time, and social psychology has always maintained the legacy of their interests in perception and cognition. Attitudes and small group phenomena were the most commonly studied topics in this era. During World War II, social psychologists studied persuasion and propaganda for the U.S. military. After the war, researchers became interested in a variety of social problems, including gender issues and racial prejudice. The most notable, revealing, and contentious of these were the Stanley Milgram shock experiments on obedience to authority. 
In the 1960s, there was growing interest in new topics, such as cognitive dissonance, bystander intervention, and aggression. By the 1970s, however, social psychology in America had reached a crisis. There was heated debate over the ethics of laboratory experimentation, whether or not attitudes really predicted behavior, and how much science could be done in a cultural context. This was also the time when a radical situationist approach challenged the relevance of self and personality in psychology. Throughout the 1980s and 1990s social psychology reached a more mature level, particularly in theory and method. Careful ethical standards now regulate research, and pluralistic and multicultural perspectives have emerged. Modern researchers are interested in many phenomena, but attribution, social cognition, and the self - concept are perhaps the greatest areas of growth in recent years. Social psychologists have also maintained their applied interests, with contributions in the social psychology of health, education, law, and the workplace. In social psychology, attitudes are defined as learned, global evaluations of a person, object, place, or issue that influence thought and action. Put more simply, attitudes are basic expressions of approval or disapproval, favorability or unfavorability, or, as Bem put it, likes and dislikes. Examples would include liking chocolate ice cream or endorsing the values of a particular political party. Social psychologists have studied attitude formation, the structure of attitudes, attitude change, the function of attitudes, and the relationship between attitudes and behavior. Because people are influenced by the situation, general attitudes are not always good predictors of specific behavior. For example, for a variety of reasons, a person may value the environment but not recycle a can on a particular day.
In recent times, research on attitudes has examined the distinction between traditional, self - reported attitude measures and "implicit '' or unconscious attitudes. For example, experiments using the Implicit Association Test have found that people often demonstrate implicit bias against other races, even when their explicit responses reveal egalitarian attitudes. One study found that explicit attitudes correlate with verbal behavior in interracial interactions, whereas implicit attitudes correlate with nonverbal behavior. One hypothesis on how attitudes are formed, first advanced by Abraham Tesser in 1983, is that strong likes and dislikes are ingrained in our genetic make - up. Tesser speculates that individuals are disposed to hold certain strong attitudes as a result of inborn physical, sensory, and cognitive skills, temperament, and personality traits. Whatever disposition nature elects to give us, our most treasured attitudes are often formed as a result of exposure to attitude objects; our history of rewards and punishments; the attitudes that our parents, friends, and enemies express; the social and cultural context in which we live; and other types of experiences we have. Attitudes are formed through the basic process of learning. Numerous studies have shown that people can form strong positive and negative attitudes toward neutral objects that are in some way linked to emotionally charged stimuli. Attitudes are also involved in several other areas of the discipline, such as conformity, interpersonal attraction, social perception, and prejudice. The topic of persuasion has received a great deal of attention in recent years. Persuasion is an active method of influence that attempts to guide people toward the adoption of an attitude, idea, or behavior by rational or emotive means. Persuasion relies on "appeals '' rather than strong pressure or coercion.
Numerous variables have been found to influence the persuasion process; these are normally presented in five major categories: who said what to whom and how. Dual - process theories of persuasion (such as the elaboration likelihood model) maintain that the persuasive process is mediated by two separate routes: central and peripheral. The central route of persuasion is more fact - based and results in longer lasting change, but requires motivation to process. The peripheral route is more superficial and results in shorter lasting change, but does not require as much motivation to process. An example of a peripheral route of persuasion might be a politician using a flag lapel pin, smiling, and wearing a crisp, clean shirt. Notice that this does not require much motivation to process, but the resulting change should not last as long as persuasion based on the central route. If that politician were to outline exactly what they believed, along with their previous voting record, this would be using the central route, and would result in longer lasting change, but would require a good deal of motivation to process. Social cognition is a growing area of social psychology that studies how people perceive, think about, and remember information about others. Much research rests on the assertion that people think about (other) people differently from non-social targets. This assertion is supported by the social cognitive deficits exhibited by people with Williams syndrome and autism. Person perception is the study of how people form impressions of others. The study of how people form beliefs about each other while interacting is known as interpersonal perception. A major research topic in social cognition is attribution. Attributions are the explanations we make for people 's behavior, either our own behavior or the behavior of others. We can ascribe the locus of a behavior to either internal or external factors.
An internal, or dispositional, attribution assigns behavior to causes related to inner traits such as personality, disposition, character or ability. An external, or situational, attribution involves situational elements, such as the weather. A second dimension of attribution ascribes the cause of behavior to either stable or unstable factors (whether the behavior will be repeated or changed under similar circumstances). Finally, we also attribute causes of behavior to either controllable or uncontrollable factors: how much control one has over the situation at hand. Numerous biases in the attribution process have been discovered. For instance, the fundamental attribution error is the tendency to make dispositional attributions for behavior, overestimating the influence of personality and underestimating the influence of situations. The actor - observer difference is a refinement of this bias, the tendency to make dispositional attributions for other people 's behavior and situational attributions for our own. The self - serving bias is the tendency to attribute dispositional causes for successes, and situational causes for failure, particularly when self - esteem is threatened. This leads to assuming one 's successes are from innate traits, and one 's failures are due to situations, including other people. Other ways people protect their self - esteem are by believing in a just world, blaming victims for their suffering, and making defensive attributions, which explain our behavior in ways which defend us from feelings of vulnerability and mortality. Researchers have found that mildly depressed individuals often lack this bias and actually have more realistic perceptions of reality (as measured by the opinions of others). Heuristics are cognitive short cuts. Instead of weighing all the evidence when making a decision, people rely on heuristics to save time and energy.
The availability heuristic occurs when people estimate the probability of an outcome based on how easy that outcome is to imagine. As such, vivid or highly memorable possibilities will be perceived as more likely than those that are harder to picture or are difficult to understand, resulting in a corresponding cognitive bias. The representativeness heuristic is a shortcut people use to categorize something based on how similar it is to a prototype they know of. Numerous other biases have been found by social cognition researchers. The hindsight bias is a false memory of having predicted events, or an exaggeration of actual predictions, after becoming aware of the outcome. The confirmation bias is a type of bias leading to the tendency to search for, or interpret information in a way that confirms one 's preconceptions. Another key concept in social cognition is the assumption that reality is too complex to easily discern. As a result, we tend to see the world according to simplified schemas or images of reality. Schemas are generalized mental representations that organize knowledge and guide information processing. Schemas often operate automatically and unintentionally, and can lead to biases in perception and memory. Expectations from schemas may lead us to see something that is not there. One experiment found that people are more likely to misperceive a weapon in the hands of a black man than a white man. This type of schema is actually a stereotype, a generalized set of beliefs about a particular group of people (when incorrect, an ultimate attribution error). Stereotypes are often related to negative or preferential attitudes (prejudice) and behavior (discrimination). Schemas for behaviors (e.g., going to a restaurant, doing laundry) are known as scripts. Self - concept is a term referring to the whole sum of beliefs that people have about themselves. However, what specifically does self - concept consist of? 
According to Hazel Markus (1977), the self - concept is made up of cognitive molecules called self - schemas -- beliefs that people have about themselves that guide the processing of self - relevant information. For example, an athlete at a university would have multiple selves that would process different information pertinent to each self: the student would be one "self, '' who would process information pertinent to a student (taking notes in class, completing a homework assignment, etc.); the athlete would be the "self '' who processes information about things related to being an athlete (recognizing an incoming pass, aiming a shot, etc.). These "selves '' are part of one 's identity, and self - relevant information is the information that relies on the proper "self '' to process and act on it. If a "self '' is not part of one 's identity, then it is much more difficult for one to react. For example, a civilian may not know how to handle a hostile threat as a trained Marine would. The Marine contains a "self '' that would enable him / her to process the information about the hostile threat and react accordingly, whereas a civilian may not contain that self, leaving him or her unable to properly process the information from the hostile threat and act accordingly. Self - schemas are to an individual 's total self - concept as a hypothesis is to a theory, or a book is to a library. A good example is the body weight self - schema; people who regard themselves as over or underweight, or for whom body image is a significant self - concept aspect, are considered schematics with respect to weight. For these people a range of otherwise mundane events -- grocery shopping, new clothes, eating out, or going to the beach -- can trigger thoughts about the self. In contrast, people who do not regard their weight as an important part of their lives are a-schematic on that attribute.
It is rather clear that the self is a special object of our attention. Whether one is mentally focused on a memory, a conversation, a foul smell, the song that is stuck in one 's head, or this sentence, consciousness is like a spotlight. This spotlight can shine on only one object at a time, but it can switch rapidly from one object to another and process information outside of awareness. In this spotlight the self is front and center: things relating to the self have the spotlight more often. The self 's ABCs are affect, behavior, and cognition. An affective (or emotional) question: How do people evaluate themselves, enhance their self - image, and maintain a secure sense of identity? A behavioral question: How do people regulate their own actions and present themselves to others according to interpersonal demands? A cognitive question: How do individuals become themselves, build a self - concept, and uphold a stable sense of identity? Affective forecasting is the process of predicting how one would feel in response to future emotional events. Studies done by Timothy Wilson and Daniel Gilbert in 2003 have shown that people overestimate the strength of the reactions they will actually feel to anticipated positive and negative life events when those events do occur. There are many theories on the perception of our own behavior. Daryl Bem 's (1972) self - perception theory claims that when internal cues are difficult to interpret, people gain self - insight by observing their own behavior. Leon Festinger 's 1954 social comparison theory holds that people evaluate their own abilities and opinions by comparing themselves to others when they are uncertain of their own ability or opinions. There is also the facial feedback hypothesis: that changes in facial expression can lead to corresponding changes in emotion. The fields of social psychology and personality have merged over the years, and social psychologists have developed an interest in self - related phenomena.
In contrast with traditional personality theory, however, social psychologists place a greater emphasis on cognitions than on traits. Much research focuses on the self - concept, which is a person 's understanding of his or her self. The self - concept is often divided into a cognitive component, known as the self - schema, and an evaluative component, the self - esteem. The need to maintain a healthy self - esteem is recognized as a central human motivation in the field of social psychology. Self - efficacy beliefs are associated with the self - schema. These are expectations that performance on some task will be effective and successful. Social psychologists also study such self - related processes as self - control and self - presentation. People develop their self - concepts by varied means, including introspection, feedback from others, self - perception, and social comparison. By comparing themselves to relevant others, people gain information about themselves, and they make inferences that are relevant to self - esteem. Social comparisons can be either "upward '' or "downward, '' that is, comparisons to people who are either higher in status or ability, or lower in status or ability. Downward comparisons are often made in order to elevate self - esteem. Self - perception is a specialized form of attribution that involves making inferences about oneself after observing one 's own behavior. Psychologists have found that too many extrinsic rewards (e.g. money) tend to reduce intrinsic motivation through the self - perception process, a phenomenon known as overjustification. People 's attention is directed to the reward and they lose interest in the task when the reward is no longer offered. This is an important exception to reinforcement theory. Social influence is an overarching term given to describe the persuasive effects people have on each other. 
It is seen as a fundamental value in social psychology and overlaps considerably with research on attitudes and persuasion. The three main areas of social influence include: conformity, compliance, and obedience. Social influence is also closely related to the study of group dynamics, as most principles of influence are strongest when they take place in social groups. The first major area of social influence is conformity. Conformity is defined as the tendency to act or think like other members of a group. The identity of members within a group, i.e. status, similarity, expertise, as well as cohesion, prior commitment, and accountability to the group help to determine the level of conformity of an individual. Individual variation among group members plays a key role in the dynamic of how willing people will be to conform. Conformity is usually viewed as a negative tendency in American culture, but a certain amount of conformity is adaptive in some situations, as is nonconformity in other situations. The second major area of social influence research is compliance. Compliance refers to any change in behavior that is due to a request or suggestion from another person. The foot - in - the - door technique is a compliance method in which the persuader requests a small favor and then follows up with requesting a larger favor, e.g., asking for the time and then asking for ten dollars. A related trick is the bait and switch. The third major form of social influence is obedience; this is a change in behavior that is the result of a direct order or command from another person. Obedience as a form of compliance was dramatically highlighted by the Milgram study, wherein people were ready to administer shocks to a person in distress on a researcher 's command. An unusual kind of social influence is the self - fulfilling prophecy. This is a prediction that, in being made, actually causes itself to become true. 
For example, in the stock market, if it is widely believed that a crash is imminent, investors may lose confidence, sell most of their stock, and thus actually cause the crash. Similarly, people may expect hostility in others and actually induce this hostility by their own behavior. A group can be defined as two or more individuals that are connected to one another by social relationships. Groups tend to interact, influence each other, and share a common identity. They have a number of emergent qualities that distinguish them from aggregates; temporary groups and aggregates share few or none of these features and do not qualify as true social groups. People waiting in line to get on a bus, for example, do not constitute a group. Groups are important not only because they offer social support, resources, and a feeling of belonging, but because they supplement an individual 's self - concept. To a large extent, humans define themselves by the group memberships which form their social identity. The shared social identity of individuals within a group influences intergroup behavior, the way in which groups behave towards and perceive each other. These perceptions and behaviors in turn define the social identity of individuals within the interacting groups. The tendency to define oneself by membership in a group may lead to intergroup discrimination, which involves favorable perceptions and behaviors directed towards the in - group, but negative perceptions and behaviors directed towards the out - group. On the other hand, such discrimination and segregation may sometimes exist partly to facilitate a diversity which strengthens society. Intergroup discrimination leads to prejudice and stereotyping, while the processes of social facilitation and group polarization encourage extreme behaviors towards the out - group. Groups often moderate and improve decision making, and are frequently relied upon for these benefits, such as in committees and juries.
A number of group biases, however, can interfere with effective decision making. For example, group polarization, formerly known as the "risky shift, '' occurs when people polarize their views in a more extreme direction after group discussion. More problematic is the phenomenon of groupthink. This is a collective thinking defect that is characterized by a premature consensus or an incorrect assumption of consensus, caused by members of a group failing to promote views which are not consistent with the views of other members. Groupthink occurs in a variety of situations, including isolation of a group and the presence of a highly directive leader. Janis offered the 1961 Bay of Pigs Invasion as a historical case of groupthink. Groups also affect performance and productivity. Social facilitation, for example, is a tendency to work harder and faster in the presence of others. Social facilitation increases the dominant response 's likelihood, which tends to improve performance on simple tasks and reduce it on complex tasks. In contrast, social loafing is the tendency of individuals to slack off when working in a group. Social loafing is common when the task is considered unimportant and individual contributions are not easy to see. Social psychologists study group - related (collective) phenomena such as the behavior of crowds. An important concept in this area is deindividuation, a reduced state of self - awareness that can be caused by feelings of anonymity. Deindividuation is associated with uninhibited and sometimes dangerous behavior. It is common in crowds and mobs, but it can also be caused by a disguise, a uniform, alcohol, dark environments, or online anonymity. A major area in the study of people 's relations to each other is interpersonal attraction. This refers to all forces that lead people to like each other, establish relationships, and (in some cases) fall in love. 
Several general principles of attraction have been discovered by social psychologists, though research in the area continues. One of the most important factors in interpersonal attraction is how similar two particular people are. The more similar two people are in general attitudes, backgrounds, environments, worldviews, and other traits, the more likely an attraction becomes. Physical attractiveness is an important element of romantic relationships, particularly in the early stages characterized by high levels of passion. Later on, similarity and other compatibility factors become more important, and the type of love people experience shifts from passionate to companionate. Robert Sternberg has suggested that there are actually three components of love: intimacy, passion, and commitment. When two (or more) people experience all three, they are said to be in a state of consummate love. According to social exchange theory, relationships are based on rational choice and cost - benefit analysis. If one partner 's costs begin to outweigh his or her benefits, that person may leave the relationship, especially if there are good alternatives available. This theory is similar to the minimax principle proposed by mathematicians and economists (despite the fact that human relationships are not zero - sum games). With time, long term relationships tend to become communal rather than simply based on exchange. Social psychology is an empirical science that attempts to answer questions about human behavior by testing hypotheses, both in the laboratory and in the field. Careful attention to sampling, research design, and statistical analysis is important; results are published in peer reviewed journals such as the Journal of Experimental Social Psychology, Personality and Social Psychology Bulletin and the Journal of Personality and Social Psychology.
Social psychology studies also appear in general science journals such as Psychological Science and Science. Experimental methods involve the researcher altering a variable in the environment and measuring the effect on another variable. An example would be allowing two groups of children to play violent or nonviolent videogames, and then observing their subsequent level of aggression during free - play period. A valid experiment is controlled and uses random assignment. Correlational methods examine the statistical association between two naturally occurring variables. For example, one could correlate the amount of violent television children watch at home with the number of violent incidents the children participate in at school. Note that this study would not prove that violent TV causes aggression in children: it is quite possible that aggressive children choose to watch more violent TV. Observational methods are purely descriptive and include naturalistic observation, "contrived '' observation, participant observation, and archival analysis. These are less common in social psychology but are sometimes used when first investigating a phenomenon. An example would be to unobtrusively observe children on a playground (with a videocamera, perhaps) and record the number and types of aggressive actions displayed. Whenever possible, social psychologists rely on controlled experimentation. Controlled experiments require the manipulation of one or more independent variables in order to examine the effect on a dependent variable. Experiments are useful in social psychology because they are high in internal validity, meaning that they are free from the influence of confounding or extraneous variables, and so are more likely to accurately indicate a causal relationship. However, the small samples used in controlled experiments are typically low in external validity, or the degree to which the results can be generalized to the larger population. 
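The caution above -- that a correlation between violent TV and aggression cannot by itself establish causation -- can be illustrated with a short sketch. The following Python example computes a Pearson correlation coefficient using only the standard library; the data, variable names, and the `pearson_r` helper are all invented for illustration and do not come from any actual study.

```python
from statistics import mean, stdev

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(x), mean(y)
    # Sample covariance divided by the product of sample standard deviations.
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Hypothetical data: weekly hours of violent TV watched vs. aggressive
# incidents at school for ten children (illustrative numbers only).
tv_hours = [1, 2, 2, 3, 4, 5, 6, 7, 8, 9]
incidents = [0, 1, 0, 2, 2, 3, 3, 5, 4, 6]

r = pearson_r(tv_hours, incidents)
print(f"r = {r:.2f}")  # a strong positive association, not proof of causation
```

Even a coefficient near 1.0 here is consistent with the reverse explanation in the text: aggressive children may simply choose to watch more violent TV.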
There is usually a trade - off between experimental control (internal validity) and being able to generalize to the population (external validity). Because it is usually impossible to test everyone, research tends to be conducted on a sample of persons from the wider population. Social psychologists frequently use survey research when they are interested in results that are high in external validity. Surveys use various forms of random sampling to obtain a sample of respondents that are representative of a population. This type of research is usually descriptive or correlational because there is no experimental control over variables. However, new statistical methods like structural equation modeling are being used to test for potential causal relationships in this type of data. Some psychologists, including David Sears, have criticized social psychological research for relying too heavily on studies conducted on university undergraduates in academic settings. Over 70 % of experiments in Sears ' study used North American undergraduates as subjects, a subset of the population that may not be representative of the population as a whole. Regardless of which method is used, the results are used to evaluate the research hypothesis, which they should either confirm or disconfirm. Social psychologists rely on two kinds of checks to test their results. Statistical and probability testing defines a significant finding as one that has a 5 % or smaller likelihood of being due to chance. Replications are important to ensure that the result is valid and not due to chance or to some feature of a particular sample. False positive conclusions, often resulting from the pressure to publish or the author 's own confirmation bias, are a hazard in the field.
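The 5 % significance criterion mentioned above can be made concrete with a permutation test, one common way to estimate how likely an observed group difference is to arise by chance. The Python sketch below uses only the standard library; the aggression scores, group names, and the `permutation_p_value` helper are invented for illustration.

```python
import random

def permutation_p_value(group_a, group_b, n_iter=10_000, seed=0):
    """Two-sided permutation test: estimate the probability of a mean
    difference at least as large as the observed one arising by chance."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = group_a + group_b
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)  # randomly reassign scores to the two groups
        a, b = pooled[:len(group_a)], pooled[len(group_a):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            hits += 1
    return hits / n_iter

# Hypothetical aggression scores after playing violent vs. nonviolent games.
violent = [7, 9, 6, 8, 7, 9, 8]
nonviolent = [4, 5, 3, 6, 4, 5, 4]
p = permutation_p_value(violent, nonviolent)
print(f"p = {p:.4f}")  # well below the conventional 0.05 threshold
```

A p - value below 0.05 means a difference this large would arise by chance in fewer than one in twenty random reshufflings; replication, as the text notes, then guards against the finding being a fluke of one sample.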
The goal of social psychology is to understand cognition and behavior as they naturally occur in a social context, but the very act of observing people can influence and alter their behavior. For this reason, many social psychology experiments utilize deception to conceal or distort certain aspects of the study. Deception may include false cover stories, false participants (known as confederates or stooges), false feedback given to the participants, and so on. The practice of deception has been challenged by some psychologists who maintain that deception under any circumstances is unethical, and that other research strategies (e.g., role - playing) should be used instead. Unfortunately, research has shown that role - playing studies do not produce the same results as deception studies and this has cast doubt on their validity. In addition to deception, experimenters have at times put people into potentially uncomfortable or embarrassing situations (e.g., the Milgram experiment and Stanford prison experiment), and this has also been criticized for ethical reasons. To protect the rights and well - being of research participants, and at the same time discover meaningful results and insights into human behavior, virtually all social psychology research must pass an ethical review process. At most colleges and universities, this is conducted by an ethics committee or Institutional Review Board. This group examines the proposed research to make sure that no harm is likely to be done to the participants, and that the study 's benefits outweigh any possible risks or discomforts to people taking part in the study. Furthermore, a process of informed consent is often used to make sure that volunteers know what will happen in the experiment and understand that they are allowed to quit the experiment at any time. 
A debriefing is typically done at the experiment 's conclusion in order to reveal any deceptions used and generally make sure that the participants are unharmed by the procedures. Today, most research in social psychology involves no more risk of harm than can be expected from routine psychological testing or normal daily activities. Social psychology has recently found itself at the center of a "replication crisis '' due to some research findings proving difficult to replicate. Replication failures are not unique to social psychology and are found in all fields of science. However, several factors have combined to put social psychology at the center of the current controversy. Firstly, questionable research practices (QRP) have been identified as common in the field. Such practices, while not necessarily intentionally fraudulent, involve converting undesired statistical outcomes into desired outcomes via the manipulation of statistical analyses, sample size or data management, typically to convert non-significant findings into significant ones. Some studies have suggested that at least mild versions of QRP are highly prevalent. One of the critics of Daryl Bem in the "feeling the future '' controversy has suggested that the evidence for precognition in this study could (at least in part) be attributed to QRP. Secondly, social psychology has found itself at the center of several recent scandals involving outright fraudulent research, most notably the admitted data fabrication by Diederik Stapel as well as allegations against others. However, most scholars acknowledge that fraud is, perhaps, the lesser contribution to replication crises. Third, several effects in social psychology have been found to be difficult to replicate even before the current replication crisis. For example, the scientific journal Judgment and Decision Making has published several studies over the years that fail to provide support for the unconscious thought theory.
Replications appear particularly difficult when research trials are pre-registered and conducted by research groups not highly invested in the theory under question. These three elements together have resulted in renewed attention to replication, supported by Daniel Kahneman. Scrutiny of many effects has shown that several core beliefs are hard to replicate. A recent special edition of the journal Social Psychology focused on replication studies, and a number of previously held beliefs were found to be difficult to replicate. A 2012 special edition of the journal Perspectives on Psychological Science also focused on issues ranging from publication bias to null - aversion that contribute to the replication crises in psychology. It is important to note that this replication crisis does not mean that social psychology is unscientific. Rather, this process is a healthy if sometimes acrimonious part of the scientific process in which old ideas, or those that can not withstand careful scrutiny, are pruned. The consequence is that some areas of social psychology once considered solid, such as social priming, have come under increased scrutiny due to failed replications. The Asch conformity experiments demonstrated the power of conformity in small groups with a line length estimation task that was designed to be extremely easy. In well over a third of the trials, participants conformed to the majority, who had been instructed to provide incorrect answers, even though the majority judgment was clearly wrong. Seventy - five percent of the participants conformed at least once during the experiment. Additional manipulations to the experiment showed that participant conformity decreased when at least one other individual failed to conform, but rose again when that individual resumed conforming or withdrew from the experiment.
Also, participant conformity increased substantially as the number of incorrect individuals increased from one to three, and remained high as the incorrect majority grew. Participants with three incorrect opponents made mistakes 31.8 % of the time, while those with one or two incorrect opponents made mistakes only 3.6 % and 13.6 % of the time, respectively. Muzafer Sherif 's Robbers ' Cave Experiment divided boys into two competing groups to explore how much hostility and aggression would emerge. Sherif 's explanation of the results became known as realistic group conflict theory, because the intergroup conflict was induced through competition over resources. Inducing cooperation and superordinate goals later reversed this effect. In Leon Festinger 's cognitive dissonance experiment, participants were asked to perform a boring task. They were divided into two groups and given two different pay scales. At the study 's end, some participants were paid $1 to say that they enjoyed the task and another group of participants was paid $20 to say the same lie. The first group ($1) later reported liking the task better than the second group ($20). Festinger 's explanation was that for people in the first group, being paid only $1 was not sufficient incentive for lying, so they experienced dissonance. They could only overcome that dissonance by justifying their lies by changing their previously unfavorable attitudes about the task. Being paid $20 provided a reason for doing the boring task, and therefore produced no dissonance. One of the most notable experiments in social psychology was the Milgram experiment, which studied how far people would go to obey an authority figure. Following the events of The Holocaust in World War II, the experiment showed that (most) normal American citizens were capable of following orders from an authority even when they believed they were causing an innocent person to suffer. 
Albert Bandura 's Bobo doll experiment demonstrated how aggression is learned by imitation. This set of studies fueled debates regarding media violence which continue to be waged among scholars. In the Stanford prison study, by Philip Zimbardo, a simulated exercise between student prisoners and guards showed how far people would follow an adopted role. In just a few days, the "guards '' became brutal and cruel, and the prisoners became miserable and compliant. This was initially argued to be an important demonstration of the power of the immediate social situation and its capacity to overwhelm normal personality traits. However, to this day, it remains a matter of contention what conclusions may be drawn from this study. For example, it has been pointed out that participant self - selection may have affected the participants ' behaviour, and that the participants ' personality influenced their reactions in a variety of ways, including how long they chose to remain in the study. One of the most concerted empirical revisitations of the themes raised by Zimbardo came with the 2002 BBC prison study.
sons of anarchy what is wrong with clay's hands
Clay Morrow - wikipedia Clarence "Clay '' Morrow is a fictional character in the FX television series Sons of Anarchy. He is played by Ron Perlman. He is the former International President of the Sons of Anarchy Motorcycle Club, but is also something of a vigilante as he does everything in his power to ensure that drug dealers and rapists stay out of his town. However, during the fourth season, Clay involves the club with a number of drug dealings with the cartel for his own protection and greed, wavering in his allegiance, and gradually reveals himself to be one of the story 's antagonists. He is 6 ' 2 '' and has a number of tattoos, most notably a Grim Reaper on his upper right arm and a Paratrooper tattoo on his upper left, along with the words "Death From Above '' indicating that he served as a paratrooper in a military unit. He also wears a golden pin of the same symbol on his kutte. This symbol is often confused with the US Army Parachutist badge. The Paratrooper symbol that Clay wears is an unofficial symbol, and would serve to indicate service, particularly combat service, as a paratrooper, and consists of a skull, with wings coming from the bottom of the skull, and curving up to meet the top of the skull. Given that his and Gemma Teller 's machinations are the driving source of conflict throughout the story, it can be argued that both he and Gemma are the series ' main antagonists. Morrow, of Irish descent, was born in 1949 but is not a native of Charming, California. He is one of the original "First 9 '' members of the Sons of Anarchy Motorcycle Club, formed in 1967, but was not a founding member. Of the first nine, he was the youngest and one of only three who were not war veterans. He did later go on to serve in the military, however, joining the US Army as an Airborne qualified Infantryman in 1969 and was deployed to Vietnam until 1972 (commemorated by a tattoo on his left arm and the paratrooper pin on his vest). 
When he returned from service, he remained a member of the club and opened the Teller - Morrow Automotive Repair shop with John Teller, the club President and his best friend. Whilst serving as the Vice President during the early 1990s, he was responsible for a number of murders during the SAMCRO - Mayan War, including that of Lowell Harland, Sr., a mechanic at the auto shop who became an ATF informant. In 1993, he became the President of the club 's Mother Charter, based in Charming (which also means that he was International President), after the death of John Teller. He went on to marry John Teller 's widow Gemma Teller Morrow in the mid-1990s and made her and John 's son, Jax Teller, Vice-President. It is implied that under Morrow, SAMCRO has become more of a criminal enterprise than before, much to Jax 's disillusionment. On his colors he wears patches reading "First 9 '' and "President ''. He suffers from osteoarthritis (degenerative arthritis) in his hands, which is slowly worsening as the series progresses. He makes several attempts to hide it from his brothers to protect his position as President, as a rule among members of the Sons of Anarchy prevents members from leading if they can't hold the grip of their motorcycle and ride effectively. Season One begins with the Sons of Anarchy finding their weapon storage warehouse being burned down. The SOA rush to the scene, where local police officers are already investigating. Clay talks to Sheriff Vic Trammel about the blaze, who claims that propane tanks inside the building blew up, and that he suspects arson due to the bootprints inside. Trammel then shows Clay the burned corpses of the people hidden beneath the building, who were illegal immigrants. It is later revealed that they were Mexican prostitutes "owned '' by the club 's Sergeant - at - Arms, Tig Trager. Clay then goes to meet Laroy, the leader of the One - Niners, in the East Bay. 
He is due to sell guns to the gang, but as they were destroyed, he must explain what happened at the warehouse. Laroy needs the guns to protect their heroin trade from another motorcycle club, the Mayans, and eventually lets Clay have some more time to get more guns together for him. The Sons of Anarchy then work out that it was the Mayans who stole the guns and destroyed the storage warehouse, and decide to get some payback. Juice Ortiz, the club 's intelligence officer, finds out where the Mayans stored the guns and Clay, Tig, Jax and Chibs Telford all head out to San Leandro to retrieve them. When they arrive at the industrial storage warehouse and find the guns, three Mayans turn up in a car outside. Clay and Tig then shoot and kill the Mayans. Jax is shot by another man, who has a number of Nazi and White supremacist tattoos. He was a member of the Nordics, a local skinhead gang who are allied with the Mayans, and it later emerges that his name is Whistler. Jax turns around and shoots Whistler twice, killing him. They escape with the weapons and destroy the building using explosives. Wayne Unser, the Chief of the Charming Police Department, has always gotten on well with the Sons of Anarchy during his time in charge, and has even employed them as muscle, at times. However, he is dying from cancer and will retire at the end of the month, handing power over to his Deputy, David Hale, who is overtly suspicious of the SOA and will almost definitely begin an investigation into the club. To warn Unser to keep Hale off their case, Clay, Bobby Munson (the club 's Treasurer) and Opie Winston hijack a shipment that they are supposed to protect, and threaten to hijack more. As a goodwill gesture, they give the contents of the truck to the local Italian American Mafia, as their gun delivery is late. Meanwhile, Tig comes forth and tells Clay that he has been having sex with both of the women found at the warehouse and that his DNA is in the police database. 
Clay orders Tig and Bobby to get rid of the bodies. The pair then retrieve the corpses from the police site and burn them in a furnace. Hale is enraged when he discovers that the bodies are missing, and he threatens Clay that he will close the SOA down for good. Local businessman Elliot Oswald goes to Clay after his 13 - year - old daughter is raped at a carnival and asks the Sons of Anarchy to hunt down the rapist and kill him, in exchange for money. Clay refuses the money but insists that if they catch him, he must carry out the punishment himself. Juice and Gemma discover that the rapist is one of the carnies and the gang capture him and bring him to Oswald, who tries to castrate him but can not bring himself to do it. Clay then carries the punishment out, but wears gloves whilst doing it. He then frames Oswald for the crime, as only Oswald 's fingerprints are on the knife, because Oswald was about to sell off much of the land around Charming to big business and housing, which would challenge the SOA 's reign over the town. Jax is unhappy that Clay did not tell him of the plan, however, and tells him to always inform him on his future motives. When a Bureau of Alcohol, Tobacco, Firearms and Explosives agent arrives in Charming and begins investigating the club, Clay decides that they should move their weapons to Indian Hills, Nevada, where their brother club, the Devil 's Tribe, are based. He also sends Bobby and Jax to inform the Devil 's Tribe that the Sons of Anarchy will be patching over them, and Tig and Juice to steal a truck to transport the guns in. Clay also travels to Indian Hills to perform the patch - over ceremony and brings a number of SOA Washington members as protection in case the Mayans retaliate for an earlier conflict involving Jax, Bobby and five Mayans. At the patch - over party, he has sex with Cherry, a woman whom Half - Sack likes, as revenge for his calling Gemma a MILF earlier. 
The Mayans eventually retaliate, as predicted, by attacking the Devil 's Tribe clubhouse, and a large shootout ensues. Otto Delaney and a number of other imprisoned SAMCRO members have been protecting Chuck in Stockton prison. He is wanted by the Triads because he stole money from them, then informed on a number of their members when he was arrested. When Clay and Jax visit Otto in prison, he tells them that if they protect Chuck when he is released, he will inform them where the money that he has skimmed from the Asian Triads gang is located. They agree and pick him up from prison and bring him to the clubhouse. SAMCRO intends to wait until the restaurant, where the money is stored, closes before raiding it, but Chuck 's frequent unconscious masturbating is unbearable for them and they decide to raid the restaurant straight away. However, just after they steal the money and some counterfeiting plates, the Triads turn up. Instead of starting a war, Clay decides to make a deal; SAMCRO hands over Chuck and the plates for $60,000 from the Triads as the skim money is all counterfeit. Meanwhile, Kyle Hobart, a disgraced former SAMCRO member, goes to Jax and asks him if the Sons want in on his deal selling stolen car parts. Jax accepts and invites him to the clubhouse that night. This was a setup, however, to punish him for not having his SOA tattoo removed after being disowned by the club. At the clubhouse, they tie him up and Tig uses a blowtorch to burn off the tattoo. Jax and Piney sell five AK - 47s to Nate Meineke and his right - wing state militia. They then use these guns to ambush a prison convoy and free one of their members, Frank Cison. Three police officers are killed during the assault. Meineke drops his cell phone at the scene, and his last calls have been to Clay Morrow. When the ATF finds the phone, they arrest Clay and raid the clubhouse. Clay is later released because no evidence is found, meaning he can no longer be kept in custody. 
Meanwhile, Jax, Piney and Opie Winston decide to kill Meineke and his gang to keep them from snitching if they are apprehended by the authorities. They pretend to sell them more weapons. The "boxes of guns '' are actually filled with explosives, however, and the militia load their trucks with them. All of the militia are killed when the bombs are detonated. Workmen working for the Water and Power Board near the highway dig up three skeletons, two of which were Mayans killed in the war of 1992. The other was Lowell Harland, Sr., a mechanic at Teller - Morrow. He was killed for being a "junkie rat ''. To stop the bodies from being identified, Clay, Jax and Tig break into the local morgue, prepared to steal the bones. However, the corpses have already been identified. When Clay tells Lowell, Jr. about his father 's death, Lowell runs away. Clay tracks him down and eventually brings him back to town. Clay is almost killed when the Mayans attempt to shoot him while at a deal with Cameron Hayes of the True IRA at a bar. The two Mayan soldiers are gunned down by Tig and the shotgun - toting barman, however. Clay then calls in the Sons of Anarchy State Presidents and Vice-Presidents from Washington, Utah and Nevada, in a bid to wipe out the Mayans. After the shooting, he questions Jax 's commitment to the club and Jax 's willingness to kill. Clay is taken to the local police station for questioning about the recent shootings, by Wayne Unser. Ernest Darby is also being held there, and Clay tells Unser to bring Álvarez in, in order for the three gang leaders to hold a meeting and, hopefully, prevent further bloodshed. He meets with Darby first, and tells him not to retaliate because it would start a war on the streets of Charming. He then meets with Álvarez and the pair make a deal; the Sons of Anarchy will begin selling guns to the Mayans, and all Mayan - SOA disputes (over turf, businesses, etc.) end in the SOA 's favor. 
Álvarez also gives the Sons permission to kill Esai, as revenge for the attempted hit on Clay. After the clubhouse is raided and Bobby is arrested for the murder of Brenan Hefner, Clay and Tig suspect Opie of being the witness who identified him. It was another person, however. When Opie turns up at the clubhouse, Tig checks his car for bugs and finds a microphone. He also finds a recording device in Opie 's mobile phone. Both were planted by the ATF without Opie 's knowledge. Clay and Tig then decide to kill him. Clay and Tig again meet with One - Niners ' leader Laroy, and they discuss taking out the Mayans. They agree that the Sons would meet the Mayans for an arms deal and when they left, the Niners would eliminate the Mayans and take the guns as payment. However, when Clay, Tig, Opie and Jax meet Álvarez and his crew at a warehouse in Oakland, the Niners try to take out the Mayans and SOA. The group escapes, but a number of Mayans and Niners are killed in the shootout. Tig attempts to kill Opie during the havoc, but finds himself unable to do so. Later on at Jax 's son Abel 's homecoming party, Tig follows Opie 's car home and shoots the driver dead. However, the driver is Opie 's wife, Donna, who has switched vehicles with Opie. Just after Tig leaves to kill Opie, Clay is approached by Wayne Unser and told that Opie is, in fact, not an informant and that the ATF has set him up. Clay tries to phone Tig to tell him, but Tig is not carrying his phone. Rosen, the club 's lawyer, meets with Clay and tells him that the ATF has put a warrant out for Opie 's arrest and that he will most likely get convicted of Hefner 's murder because of the witness. Clay, Tig and Juice then meet with Vic Trammel and offer him money to reveal the location of the witness. Trammel does not know, however, so they go to Elliot Oswald. 
They again blackmail him with the knife that he used to kill the rapist earlier in the season, this time to get his friends in the US Attorney 's office to tell him the witness ' case number and location. He then sends Chibs, Happy and Tig to kill the witness, who is at a safe house in San Joaquin. Jax goes to the safe house and stops the trio from killing the witness, but threatens her into leaving the state. At Church, Jax confronts Clay and asks if he killed Donna. Clay denies this. Season One ends at Donna 's funeral, which is attended by Sons of Anarchy from all over the country. There, Jax and Clay stare at each other and it is plain to see that the club is coming apart from inside. In the first episode of Season Two we see that Clay is still struggling with remorse for what he did to Donna, and that he is going head to head with Jax over her death. We see him giving Opie a fake story of how a Mayan MC member killed Donna and he throws a welcome back party for Bobby. When Ethan Zobelle and the L.O.A.N. start threatening members of SAMCRO, Clay is all for immediate retaliation, but he is at odds with Jax, who feels the club may be walking into a trap. He has recently learned of his wife Gemma 's rape at the hands of L.O.A.N. and settled his differences with Jax, so the two can work together to get revenge. Clay is now shown to be more in tune with Jax 's method of operation, opting to do more recon work before entering potentially lethal situations. In the season two finale, Clay neglects to inform Marcus Alvarez that Ethan Zobelle is an F.B.I. informant, knowing the Mayan leader would kill Zobelle himself, forfeiting Clay 's opportunity to do the deed. Later in the episode, the Sons ambush the Mayan convoy containing Zobelle and Clay spares Alvarez 's life. Clay and the others corner Zobelle in a deli, but decide to abandon the scene after learning of his grandson 's kidnapping. 
He is last seen consoling his distraught stepson Jax, whose son Abel (Clay 's grandson) has been kidnapped by IRA gun dealer Cameron Hayes. In the first episodes of Season Three, we see that Clay is still saddened by the loss of Abel, and is willing to help Jax at all costs. Also, his arthritis continues to worsen. As shown in the episode "Home '', it gets so bad that Jax has to tie Clay 's hands to the handlebars. In the episode "Turas '', when SAMCRO are nearly killed by a bomb hidden in a gun shipment put there by the SAMBEL Sergeant - at - Arms, Jax has a shell - shocked vision of his biological father speaking to him, but it turns out to actually be Clay. In the episode "Firinne '', Clay kills McGee, a member of the First 9 and President of the Belfast Chapter, for his treachery against the club by pushing him off a roof after taking his cut. He later burns the cut after expressing remorse for the killing. In the Season 3 finale, when Clay learns of Jax 's betrayal, he appears enraged and says "Jax will die ''. In the end, it is revealed that he knew of Jax 's deal with Stahl and that it was part of a plan to execute Jimmy O. and Stahl. When the plan is done, Clay and the others share a laugh, much to the confusion of the ATF agents. Also at the end of the episode, as Tara is reading the letters John Teller wrote to Maureen Ashby, Teller says he fears that Clay and Gemma will kill him because of his betraying Gemma by having an affair and attempting to alter the club 's focus on criminal activity, leading to much speculation that Clay and Gemma might have killed John Teller. Clay is seen as one of the many SAMCRO members released from prison. He later escapes the sheriff 's tail and accompanies Jax and Opie 's meeting with the Russians, where he settles their differences and forms a partnership with them. He attends Opie 's wedding and goes to test a new gun given to them by Putlova. 
As he 's firing the gun, he turns and shoots Putlova 's bodyguards and Jax stabs Putlova to death as revenge for trying to kill him in prison. He also tells Gemma that his arthritis has gotten worse and that he has only one or two years left before he has to step down as President of SAMCRO. Also, while in prison, he set up a deal to run cocaine for the Gallindo Cartel, but this does not sit well with the other members of the club, as SOA avoids drugs. When the club goes to Arizona and meets the Tucson charter, SAMTAZ, he demands that the charter stop dealing meth. This request is denied, as selling generates too much money for the club to give up. Otto wants Luann 's murderer dead, and the issue is raised in the chapel. Clay is then confronted by Piney, who threatens Clay that if Clay does not kill the cocaine deal with the Gallindo Cartel, Piney will distribute letters about John Teller 's murder to the other members of the club. Clay discusses the threat with Gemma, revealing that Clay did kill Teller. Clay later meets Unser and obtains the letters from him, unaware that Gemma has the same plan. After Unser gets a copy of the letters, he confronts Clay. Clay replies he does not regret doing what he did because it protected the Club and Charming. When Clay visits a tied - up Georgie Caruso, Georgie claims that he has connections with millionaire Japanese families. Clay immediately sees an opportunity to make Jacob Hale believe he has investors in time. His idea is that he will have the investors pull out at the last minute, which will put an end to Charming Heights. After retrieving the last brick of cocaine that Juice had stolen and framed Miles for, Clay asks Romeo for help killing Tara, to keep the secret of the letters hidden. Clay comes to Piney 's cabin in the night. After discussing trust issues and differences in the cabin, Clay leaves, only to break down the door, knocking Piney off his feet. Piney begs Clay not to get Tara involved in the letters from JT. 
Clay shoots Piney in the chest with a shotgun, killing him. This makes Piney the third member of the First 9, and the second and final co-founding member of the Sons of Anarchy, to be killed at the hands of Clay. Clay leaves the markings of the cartel to implicate them for the murder. Given the club 's difficult circumstances, Clay calls the Irish Kings for a meeting to set a new deal that lets them survive the war against Lobo Sonora. He later learns that the One Niners have been dealing with the Sonora and the Sons plan an attack using the Niners to lure them. This fails, however, as Sonora 's men were equipped with grenade launchers. Luis, Romeo 's right - hand man, gives Clay a cell phone with his contact to kill Tara. Jax confronts Clay when he hears from Bobby that Clay wanted Bobby, instead of Opie, as president, as was the deal. Clay says it is Tara 's fault that Jax changed, but Jax warns him to never insult her again. Gemma tells Clay that Tara will not reveal the letters to Jax for fear that Jax will get deeper into the club out of guilt. Clay promises Gemma he won't hurt Tara; however, he uses the cell phone Luis gave him and, the next morning, pays the contact $25,000 for the murder. After finding out that Jax and the babies are with her, Clay desperately tries to stop the hit, but is unsuccessful and Tara has her hand broken by a car door while struggling to escape. Clay meets Romeo and gets a refund, with Romeo taking the matter into his hands personally, and Clay reluctantly agrees that Tara is better off dead. Gemma confronts Clay about the hit and a violent fight ensues between the two; Gemma shoots at Clay (deliberately missing) and gets in a powerful punch and a kick, injuring him, but Clay gets the upper hand and severely beats her. That night he decides to sleep in the clubhouse. 
Opie later finds out that Clay was behind the death of Piney and seeks revenge, ending up shooting Clay in the torso twice. Clay survives the shooting but is shown to be in intensive care. Later in the episode, Gemma gives Jax his father 's letters to Maureen Ashby. After reading the letters, and realizing Clay is responsible for the death of John Teller, Jax vows to kill Clay. Jax puts a knife to Clay 's throat and makes him step down as President, ordering him to never go near his family again. Although Clay tries to explain his reasons, Jax refuses to listen and takes his President 's patch, thus ending Clay 's reign as President of SAMCRO. Clay is shown to have been discharged from the hospital, but requires an oxygen tank due to his lung injury. He is shown trying to make amends with Gemma, but she coldly brushes him off. He later reveals to the club he murdered Piney, but states that Piney was drunk and tried to kill him first and Opie found out and is the one who shot him. The club 's rules (due to him having killed a member) mean they must vote on kicking him out. When Jax questions his motives for telling the club, he denies any. His arthritis is shown to have reached the point where he can't ride his bike at all. He later goes to his and Gemma 's burglarized house and expresses concern over his safe being stolen. He then goes and visits Opie and convinces him not to walk away from the club because of him, seeing as he 's "half dead already ''. He later finds out from Juice where Gemma has been going. He goes to the brothel and confronts Nero, the owner and Gemma 's new suitor. He then "seeks comfort '' with a young prostitute to anger Gemma. Gemma then attacks the girl, causing her to leave. When Nero 's operation is later shut down, and he and Gemma are arrested, it 's likely Clay was behind it. 
The next episode "Small World '' shows Clay having recovered to the point of no longer needing his oxygen tank, though he continues to wear it (either for sympathy or to keep people off - guard). He later helps Gemma take care of a dead body (Nero 's half - sister) and the two seem to be on better terms. At the end of the episode, he confronts the three Nomads responsible for the Charming burglaries and the death of Sheriff Roosevelt 's wife. He punches one in the face exclaiming "You were n't supposed to kill her! '' This shows Clay to have been pulling the strings behind the home invasions. The next episode "Toad 's Wild Ride '' reveals the Nomads, (Go - Go, Greg and Frankie) made a deal with Clay to help him get back at the head of the table in exchange for a cut of Clay 's share of the guns and cocaine money. The break - ins were Clay 's way of turning Charming against Jax. When Unser (who has been investigating the break - ins) comes close to discovering the truth, Go - Go and Greg meet with Clay to discuss killing him, while Frankie goes underground. Clay meets with Unser in his trailer talking about the trust and friendship between them. When Go - Go and Greg break in the door, Unser shoots Go - Go with a double - barreled shotgun and Clay betrays them, shooting Greg in the head with his pistol. When Juice (who saw Go - Go and Greg going to Unser 's trailer) asks Clay what is happening, Clay denies involvement. Jax and the rest of the club find out about their attack on Unser and Jax privately accuses Clay of using the Nomads to undermine his leadership. Clay states that Pope is the one who hired the Nomads, Jax claims they 'll find Frankie and learn the truth. In "Ablation '' Clay visits Gemma, Abel, and Thomas in the hospital after a marijuana - induced car crash. He tells Jax that a truck ran her off the road to protect her. 
When Jax learns about the lie, he tells Clay he understands why he lied and that Gemma is dead to his family, and he wants him to take care of her. "Andare Pescare '' shows the club discovering that Frankie is hiding with a Mafia family, paying them with money stolen from Nero. They take a unanimous vote to kill him after extracting the information they need. When Clay learns from Jimmy Cacuzza the location of a Mafia safehouse where Frankie is hiding, Clay and Juice go there to kill him to keep him from outing Clay (Juice acting under the pretense of scouting it out). When Frankie barricades himself in the house and shoots at the two, Clay drives the van into the gas tank outside, blowing up the house. Just as Clay is about to kill Frankie, Jax and the others show up, having seen the explosion. However, Frankie is quickly gunned down by the enraged Mafia Don for killing one of his men before he can tell them anything. Bobby later asks Clay if there 's anything he wants to tell him (implying he also knows about his dealings with the Nomads). When Clay claims his conscience is clear, Bobby states "I hope you 're as smart as you think you are, 'cause I 'm sick of burying friends ''. The episode ends with Gemma coming to Clay to help him with his cortisone shots. The next episode "Crucifixed '' shows Clay negotiating with Romeo and Luis for protection, as once the RICO case is gone, Romeo will no longer need Clay and will most likely kill him. Romeo suggests getting rid of Jax and putting Clay back at the head of the table. When Clay declines the deal, Romeo states, "Yes you will. '' Later, Gemma and Clay draw closer at Clay 's place, where Gemma insists they return home. They are last seen sitting on their bed talking, where Clay confesses he can't bear losing her again. Gemma kisses him passionately and they sink onto the bed. The next episode "To Thine Own Self '' shows Clay learning of Otto 's murder of a nurse, ending the RICO case. 
He is shown, for unknown reasons, trying to save Jax from Romeo, and warns Jax to work with the cartel long enough to make the Club legitimate and then leave. Jax refuses, stating "I 'm done bowing down to greedy men who have nothing to believe in ''. When Jax reveals a new deal to the Club, which would allow the Mayans and Triads to take over muling the cocaine and selling big guns, respectively, Clay votes yes along with everyone else. He is later shown moving legal documents given to him by the Nomads to another location, to protect himself from the Club finding this important evidence of his betrayal. The end of the episode shows Bobby going to Clay and Gemma 's house to talk to him about "trying to keep you alive. '' It 's revealed in "Darthy '' that Bobby convinced Clay to confess his role in the Nomad attacks in exchange for Bobby vetoing his death: club - sanctioned assassinations (known as "A Visit from Mr. Mayhem '') require a unanimous vote. Clay also meets with Galen, tells him he plans on starting his own crew to run any guns SAMCRO doesn't pick up, and asks for a plane to Belfast to wait out any immediate danger. He also gives Juice a gun that he values, in return for all Juice has done for him. At the end of the episode, Clay has been voted out of the club after revealing everything. After Jax beats Clay in frustration for not being allowed to kill him, Happy removes the former Club President 's SAMCRO tattoos on his back and arm by smudging them over with a needle and black tattoo ink as the rest of the club looks on. He is shown to be ready to leave in the season finale when Roosevelt and several policemen state they found his gun (the same one he gave Juice) at a crime scene as the murder weapon that killed Damon Pope and three of his men. When he asks Gemma to vouch for him, she states he was gone with the gun and she didn't know he was going to kill anyone. He is arrested and is last seen in a prison transport van with two black men. 
Meanwhile, Jax has convinced Pope 's lieutenants that Clay was Pope 's killer; per Pope 's standing order in the event of his death, they offer a $5 million reward for Clay 's murder. At the beginning of the season, Clay is being held in protective custody after his arrest for Damon Pope 's murder. Clay is visited by retired US Marshall Lee Toric, the brother of the nurse Otto murdered. Toric, who had sworn revenge against the motorcycle club, tells Clay he can only remain in protective custody if he cooperates in building a case against SAMCRO. Clay initially refuses, but after being transferred to general population and realizing he is certain to be murdered in retaliation for two deaths he had nothing to do with and did n't even approve, he agrees to help Toric and is returned to protective custody. He later demands a sit - down with Gemma and Jax before he signs the deal. He meets with Gemma and seems apologetic, though Gemma suspects he has ulterior motives. He meets Jax and states that he will give Toric nothing and that he is sorry. In response to this he is shipped to Stockton, where he is attacked by three black men, but they spare him and offer him a shank. He later uses it to kill a member of a Neo-Nazi group and gets protection from the black gang. He is visited once more by Toric, who shows him the brutalized Otto. Clay slips Otto a shank and leaves. Despite threats that he will meet the same fate, Clay refuses to sign. Toric is later killed by Otto with the shank Clay gave him and Otto is killed by the guards in response. Following the fallout between Jax and the IRA, Clay is contacted by Galen O'Shea with the opportunity to continue as the main weapon distributor for the IRA in Northern California. The IRA will arrange to have Clay escape from the prison transport on the way to his hearing and he will retreat to Belfast and build his own new crew. Clay requests a conjugal visit with Gemma so that he can have her relate this information to Jax. 
He pays off two guards to allow him to chat with Gemma, but when the visit is over, the guards demand to watch Clay and Gemma have sex while they masturbate. They threaten to have Clay killed and Gemma grudgingly agrees to do it. Afterwards Clay vows to kill the two guards, but Gemma tells him that SAMCRO needs him alive. When Clay 's transport date is moved up, Galen enlists SAMCRO to assist in attacking his transport truck to free him; Bobby is shot during the attack and Juice kills a guard. After meeting up with the Irish, Jax kills Galen and his men. Jax explains the situation to Clay, stating the Club took a unanimous vote on how to handle it. Clay, accepting of his fate, stands ready. Jax then executes Clay by shooting him in the neck and then five times in the chest while he is on the floor. Jax then arranges the bodies to make it look like Clay had a falling out with the Irish and they all died in a shootout, allowing him to finally get revenge against Clay as well as Galen.
did bowie and queen perform under pressure live
Under Pressure - wikipedia "Under Pressure '' is a 1981 song by the British rock band Queen and the British singer David Bowie. It was included on Queen 's 1982 album Hot Space. The song reached number one on the UK Singles Chart, becoming Queen 's second number - one hit in their home country (after 1975 's "Bohemian Rhapsody '', which topped the chart for nine weeks) and Bowie 's third (after 1980 's "Ashes to Ashes '' and the 1975 reissue of "Space Oddity ''). The song only peaked at No. 29 on the US Billboard Hot 100 in January 1982, and would re-chart for one week at No. 45 in the US following Bowie 's death in January 2016. It was also number 31 on VH1 's 100 Greatest Songs of the ' 80s. It has been voted the second best collaboration of all time in a poll by the Rolling Stone magazine. The song was played live at every Queen concert from 1981 until the end of Queen 's touring career in 1986. It is recorded on the live albums Queen Rock Montreal and Live at Wembley ' 86. The song was included on some editions of Queen 's first Greatest Hits compilations, such as the original 1981 Elektra release in the US. It is included on the band 's compilation albums Greatest Hits II, Classic Queen, and Absolute Greatest as well as Bowie compilations such as Best of Bowie (2002), The Platinum Collection (2005), Nothing Has Changed (2014) and Legacy (2016). It was certified 2x Platinum in the US by the RIAA, for over two million digital download equivalent units, on 20 March 2018. Queen had been working on a song called "Feel Like '', but was not satisfied with the result. David Bowie had originally come to Mountain Studios to sing back up vocals on another Queen song, "Cool Cat '', but his vocals were removed from the final song because he was not satisfied with his performance. Once he got there, they worked together for a while and wrote the song. 
The final version, which became "Under Pressure '', evolved from a jam session that Bowie had with the band at Queen 's studio in Montreux, Switzerland. It was credited as being co-written by the five musicians. The scat singing that dominates much of the song is evidence of the jam - beginnings as improvisation. However, according to Queen bassist John Deacon (as quoted in a French magazine in 1984), the song 's primary musical songwriter was Freddie Mercury -- though all contributed to the arrangement. Brian May recalled to Mojo magazine, in October 2008, that, "It was hard, because you had four very precocious boys and David, who was precocious enough for all of us. David took over the song lyrically. Looking back, it 's a great song but it should have been mixed differently. Freddie and David had a fierce battle over that. It 's a significant song because of David and its lyrical content. '' The earlier, embryonic version of the song without Bowie, "Feel Like '', is widely available in bootleg form, and was written by Queen drummer Roger Taylor. There has also been some confusion about who had created the song 's bassline. John Deacon said (in Japanese magazine Music life in 1982) that David Bowie created it. In more recent interviews, Brian May and Roger Taylor credited the bass riff to Deacon. Bowie, on his website, said that the bassline was already written before he became involved. Roger Taylor, in an interview for the BBC documentary Queen: the Days of Our Lives, stated that Deacon did indeed create the bassline, stating that all through the sessions in the studio he had been playing the riff over and over. He also claims that when the band returned from dinner, Deacon, amusingly, forgot the riff, but fortunately Taylor was still able to remember it. Brian May clarified matters in a 2016 Mirror Online article, writing that it was actually Bowie, not Taylor, who had inadvertently changed the riff. 
The riff began as "Deacy began playing, 6 notes the same, then one note a fourth down ''. After the dinner break, Bowie corrected (actually changed) Deacon 's memory of the riff to "Ding - Ding - Ding Diddle Ing - Ding ''. The video for the song features neither Queen nor David Bowie due to touring commitments. Taking the theme of pressure, director David Mallet edited together stock footage of traffic jams, commuter trains packed with passengers, explosions, riots, cars being crushed and various pieces of footage from silent films of the 1920s, most notably Sergei Eisenstein 's influential Soviet film Battleship Potemkin, the silent Dr. Jekyll and Mr. Hyde starring John Barrymore, and F.W. Murnau 's Nosferatu, a masterpiece of the German Expressionist movement. The video explores the pressure - cooker mentality of a culture willing to wage war against political machines, and at the same time love and have fun (there is also footage of crowds enjoying concerts, and lots of black and white kissing scenes). Top of the Pops refused to show the video due to it containing footage of explosions in Northern Ireland, so a choreographed performance was instead shown. In 2003, Slant Magazine ranked "Under Pressure '' number 27 among the 100 greatest music videos of all time. The September 2005 edition of online music magazine Stylus singled out the bassline as the best in popular music history. In November 2004, Stylus music critic Anthony Miccio commented that "Under Pressure '' "is the best song of all time '' and described it as Queen 's "opus ''. In 2012, Slant Magazine listed "Under Pressure '' as the 21st best single of the 1980s. Although very much a joint project, only Queen incorporated the song into their live shows at the time. Bowie chose not to perform the song before an audience until the 1992 Freddie Mercury Tribute Concert, when he and Annie Lennox sang it as a duet (backed by the surviving Queen members). 
However, after Mercury 's death and the Outside tour in 1995, Bowie performed the song at virtually every one of his live shows, with bassist Gail Ann Dorsey taking Mercury 's vocal part. The song also appeared in set lists from A Reality Tour mounted by Bowie in 2004, when he would frequently dedicate it to Freddie Mercury. Queen + Paul Rodgers also performed the song, and in the summer of 2012, Queen + Adam Lambert toured, including a performance of the song by Lambert and Roger Taylor in each show. While David Bowie was never present for a live performance of the song with Freddie Mercury, Roger Taylor instead filled in on backing vocals, usually in unison with Mercury, as Mercury took over all of Bowie 's parts. A remixed version (called the "Rah Mix '') was issued in December 1999 to promote Queen 's Greatest Hits III compilation, reaching No. 14 on the UK Singles Chart. The video for the Rah Mix was directed by DoRo, featuring footage of Freddie Mercury from the Wembley concert on 12 July 1986 and David Bowie at the Freddie Mercury Tribute Concert also at Wembley on 20 April 1992 spliced together using digital technology (with Annie Lennox carefully edited out). This version is featured on the Greatest Hits III compilation, the Rah Mix CD single (as an Enhanced CD video) and the 2011 iTunes LP edition of Hot Space. Two CD singles (one multimedia enhanced) were released on 6 December 1999, and a 7 '' picture disc was released on 13 December 1999. After "Bohemian Rhapsody '' won The Song of The Millennium award, "Under Pressure '' was released as the B - side of a single titled "The Song of The Millennium -- Bohemian Rhapsody ''. Mr. Mixx Remix. Mr. Mixx of 2 Live Crew produced a hip - hop remix intended for inclusion as the fourth track on the cancelled 1992 Hollywood Records compilation BASIC Queen Bootlegs. Lazy Kiss Edit. Released in October 2013 by Brazilian Electro - House duo, Lazy Kiss. 
This edit / mashup gained exposure through the blog filter site Hype Machine and the Italian music blog Frequenze Indipendenti. Mouth Pressure. Released in January 2017 as a part of the Neil Cicierega album Mouth Moods, "Mouth Pressure '' pairs the instrumentals from "Under Pressure '' with the vocals from Smash Mouth 's "All Star ''. Percy 's Pressure. A karaoke version of the song was released in September 2018 as a part of the soundtrack of the animated Warner Brothers musical film Smallfoot; its lyrics detail the fall from fame of Percy (voiced by James Corden), one of the film 's central human characters, and his need to bounce back. Additional lyrics were written by Karey Kirkpatrick, the film 's director, and his brother Wayne Kirkpatrick. In the U.K., "Under Pressure '' was Queen 's second number - one hit and Bowie 's third. Queen 's smash hit "Bohemian Rhapsody '' reached number one in November 1975, just two weeks after Bowie 's "Space Oddity '' had done the same. Bowie also topped the British charts in August 1980 with "Ashes To Ashes '', his own answer song to "Space Oddity ''. Controversy arose when Vanilla Ice sampled the bassline for his 1990 single "Ice Ice Baby ''. Initially he denied the accusation and then said he had modified it, but did not originally pay songwriting credit or royalties to Queen and Bowie. A lawsuit resulted in Bowie and all members of Queen receiving songwriting credit for the sample. Vanilla Ice later claimed that he purchased the publishing rights to "Under Pressure ''. Vanilla Ice said buying the song made more financial sense than paying out royalties. The song was covered in 2005 by American alternative rock bands the Used and My Chemical Romance for tsunami relief. The cover was originally released as an Internet download track but has subsequently been featured as a bonus track on the 2005 re-release of the Used 's second studio album In Love and Death, and received wide airplay in 2005. 
On the Billboard charts, the single reached number 28 on the Modern Rock chart and number 41 on the Hot 100. In October 2018, Canadian singer - songwriter Shawn Mendes, featuring American singer - songwriter Teddy Geiger, released a version of the song. The song was released to coincide with the release of the film Bohemian Rhapsody. Universal Music Group will release three tracks by different artists "channeling their inner Freddie Mercury ''; this is the first. A portion of the profits from the "Under Pressure '' cover will be donated to the Mercury Phoenix Trust, which was founded by Queen 's Brian May and Roger Taylor (and the group 's manager, Jim Beach) after Mercury 's death to help fight AIDS worldwide. Mendes said in a statement, "I am so honored to be able to support the amazing legacy of Freddie and Queen by doing a cover of one of my favorite songs, ' Under Pressure ' ''. Taylor Weatherby from Billboard called the track "breezy '' and said "Mendes and Geiger put their voices at the forefront of the stripped - down rendition, with Mendes ' falsetto and Geiger 's raspy tone complementing their plucky acoustic guitars. ''
who played in more ashes test winning sides at the gabba
The Gabba - Wikipedia The Brisbane Cricket Ground, commonly known as the Gabba, is a major sports stadium in Brisbane, the capital of Queensland, Australia. The nickname Gabba derives from the suburb of Woolloongabba, in which it is located. The land on which the ground sits was set aside for use as a cricket ground in 1895 and the first match was held on the site on 19 December 1896, between Parliament and The Press. Prior to this, cricket was played at a ground in the area then known as Green Hills (beside Countess Street Petrie Terrace opposite the Victoria Barracks -- now occupied by the Northern Busway), since at least the early 1860s. The Gabba shared first - class cricket matches with the Exhibition Ground until 1931. The first Sheffield Shield match at the Gabba was scheduled to be played between 31 January 1931 and 4 February 1931, but it was washed out without a ball being bowled. The first Test match at the Gabba was played between Australia and South Africa between 27 November and 3 December 1931. Over the years, the Gabba has hosted athletics, Australian rules football, baseball, concerts, cricket, cycling, rugby league, rugby union, soccer and pony and greyhound races. Between 1993 and 2005, the Gabba was redeveloped in six stages at a cost of A $128,000,000. The dimensions of the playing field are now 170.6 metres (east - west) by 149.9 metres (north - south) to accommodate the playing of Australian Football at elite level. The seating capacity of the ground is now 42,000. On 15 December 2016, Australia hosted Pakistan for the first day - night Test at the Gabba, and the first Australian day - night Test hosted outside Adelaide Oval. The First Test between Australia and England is played nowadays at Brisbane. Nobody seems to know why, and all sorts of arguments are ventilated for and against more cricket Tests on the Woolloongabba ground. I am all in favour of robbing Queensland of its greatest cricketing occasion, for the ground depresses. 
It is not a cricket ground at all. It is a concentration camp! Wire fences abound. Spectators are herded and sorted out into lots as though for all the world this was a slave market and not a game of cricket. The stands are of wood and filthy to sit on. The dining rooms are barns, without a touch of colour or a picture on the wall. Everywhere there is dust and dirt... Forgive me if I am bitter about the Woolloongabba ground... the city has many good points, and the people who live there are generous and hospitable to the highest degree, but once one goes to the cricket ground the advantages are overwhelmingly lost in the mass of rules and regulations... -- John Kay, 1950 -- 51 Ashes series The Gabba is used from October to March for cricket and is home to the Queensland Cricket Association, the Queensland Cricketers Club and the Queensland Bulls cricket team. The venue usually hosts the first Test match of the season each November in addition to a number of international one - day matches usually held in January. The pitch is usually fast and bouncy. The Gabba 's amenities were greatly improved in the 1980s from a very basic standard, especially in comparison with the other Australian cricket grounds. Test cricket was first played at the ground in November 1931, the first Test of the series between Australia and South Africa. In December 1960, Test cricket 's first - ever Tied Test took place at the ground when Richie Benaud 's Australian team tied with Frank Worrell 's West Indian side. Queensland clinched its first - ever Sheffield Shield title with victory over South Australia in the final at the ground in March 1995. The Gabba was the first Australian venue to host an International Twenty20 cricket match. In November 1968 Colin Milburn scored 243 -- in the two - hour afternoon session he scored 181 - in a Sheffield Shield match for Western Australia vs. 
Queensland. For the first day of the first Test of the 2010 -- 11 Ashes series between Australia and England the Gabba was almost sold out. Australia 's Michael Clarke holds the record for number of runs scored in one Test innings at the Gabba with 259 not out, breaking the previous record set by Alastair Cook. Australia has a formidable Test match record at the ground. In the 55 matches played at the ground, Australia has won 33, drawn 13, tied 1 and lost 8. Australia has also not lost at the Gabba in 28 matches, a record dating back to 1988. England have a notoriously poor record at the Gabba, and have only won two Test matches at the ground since the end of the Second World War. Many of their defeats have been heavy and only seven England players have scored centuries at the ground. The Gabba was the home ground for the Brisbane Bears from 1993 to 1996 and since 1997 has been the home of the Brisbane Lions AFL team. The record crowd for an Australian rules football match is 37,224 between the Brisbane Lions and Collingwood in Round 15 of the 2005 AFL season. Australian football has a long association with the ground. The Queensland Football League, a precursor to AFL Queensland, played matches at the Gabba from 1905 to 1914, 1959 to 1971, and in the late 1970s and early 1980s. AFLQ matches resumed in 1993 as curtain - raiser events to AFL games, along with occasional AFLQ Grand Finals. Interstate games, including the 1961 national carnival, have also been played there, as was a demonstration game during the 1982 Commonwealth Games. In 1991 the Gabba was host to Queensland 's only victory over a Victorian side. In the early 1900s, the Gabba hosted numerous matches between Australia and various touring nations. During the 1950s and 1960s the Gabba hosted soccer matches for English first division and Scottish clubs including Blackpool FC, Everton FC, Manchester United and Heart of Midlothian. The Chinese and South African national teams also played at the ground. 
During the 2000 Summer Olympics, the Gabba hosted association football group games. On 8 May 1909 the first match of rugby league was played in Brisbane at the Gabba. Norths played against Souths before a handful of spectators at the ground. The Gabba hosted its first rugby league Test match on 26 June 1909, when Australia defeated New Zealand Māori 16 -- 13. The Kangaroos continued to play Tests at this venue until 1956, and a ground record crowd of 47,800 people saw Australia play Great Britain in 1954. The Gabba was also used to host interstate matches from 1932 to 1959, and international rugby league finals from 1909 to 2003. The Gabba hosted 11 rugby league test matches between 1912 and 1956. The Gabba has hosted six rugby union Test matches. The Gabba hosted seven games of the 2000 Olympic Games Men 's Football tournament, including a quarter - final match. Greyhound racing was also conducted at the Gabba prior to the redevelopment. In 2009, as part of the Q150 celebrations, the Gabba was announced as one of the Q150 Icons of Queensland for its role as a "structure and engineering feat ''. Player and team records were last updated on 19 May 2015. Images: a Test match between Australia and South Africa at the Gabba in November 2012; the Gabba prior to redevelopment; Shane Warne in action at the Gabba; and the Gabba during the 2006 -- 07 Ashes series.
so you think you can dance website usa
So You Think You Can Dance (U.S. TV series) - wikipedia So You Think You Can Dance is an American televised dance competition show that airs on Fox in the United States and is the flagship series of the international So You Think You Can Dance television franchise. It was created by American Idol producers Simon Fuller and Nigel Lythgoe and is produced by 19 Entertainment, Dick Clark Productions, and Conrad Sewell Productions. The series premiered on July 20, 2005 with over ten million viewers and ended the summer season as the top - rated show on television. The first season was hosted by American news personality Lauren Sánchez. Since the second season, it has been hosted by former British children 's television personality and game show emcee Cat Deeley. The show features a format wherein dancers trained in a variety of dance genres enter open auditions held in a number of major U.S. cities to showcase their talents and move forward through successive additional rounds of auditions to test their ability to adapt to different styles. At the end of this process, a small number of dancers are chosen as finalists. These dancers move on to the competition 's main phase, where they perform solo, duet, and group dance numbers on live television, attempting to master a diverse selection of dance styles, including classical, contemporary, ballroom, hip - hop, street, club, jazz, and musical theatre styles, among others. They compete for the votes of the broadcast viewing audience which, combined with the input of a panel of judges, determines which dancers advance to the next stage from week to week, until a winner is crowned as "America 's favorite dancer ''. So You Think You Can Dance has won seven Primetime Emmy Awards for Outstanding Choreography and a total of nine Emmy Awards altogether. 
Licensed variations of the show, produced for broadcast markets in other nations, began airing in August 2005, and dozens of localized adaptations of the show have been produced since, airing in 39 countries to date. The fifteenth and most recent season of the U.S. show aired from June 4 through September 10th, 2018. A typical season of So You Think You Can Dance is divided between a selection process, during which expert judges select competitors from a wide pool of applicant dancers, and a competition phase, during which these ' finalists ' (more typically referred to as the ' Top 20 ') compete for votes from home viewers. Though it is produced over the course of months, the selection phase is highly edited and usually constitutes only the first 2 -- 4 weeks of aired episodes, with the competition episodes forming the remaining 7 -- 9 weeks of the season. The open auditions, the first stage in determining a season 's finalists, take place in 2 -- 6 major U.S. cities each season and are typically open to anyone aged 18 -- 30 at the time of their audition, although season 13 focused on a younger class of competitors, ages 8 - 14. The cities where auditions are held change from season to season but some, such as Los Angeles and New York, have featured in most seasons. During this stage, dancers perform a brief routine (typically a solo, but duet and group routines are allowed as well) before a panel of dance experts, usually headed by series creator and executive producer Nigel Lythgoe. This panel then decides on - the - spot whether the dancer demonstrated enough ability and performance value to proceed further. If the dancer exhibited exceptional ability in their performance, judges award "a ticket to Vegas '' (or in more recent seasons "a ticket to the Academy ''), moving them instantly one step forward in the competition. 
Alternatively, if judges are on the fence about the dancer, they may ask the contestant to wait until the end of that day 's auditions to participate in a short test of their ability to pick up professional choreography. The second stage of the selection process is referred to as "the callbacks '' (this round was referenced as "Vegas Week '' for much of the show 's run, as it was held in Las Vegas, but has been called Academy Week since season 13). The callbacks consist of a several - day - long process in which the remaining hopefuls are tested for overall well - rounded dance ability, stamina, creativity and their ability to perform under pressure. The dancers are put through a battery of rounds that test their ability to pick up various dance styles; these are typically some of the more well - represented genres that are later prominent in the competition phase, such as hip - hop, jazz, ballroom, and contemporary. Additionally the dancers may be asked to perform further solos in styles of their choosing and participate in a group choreography round in which small teams of contestants must display their musicality and ability to communicate professionally by choreographing a performance to a randomly selected piece of music -- this challenge is notable as being the only time competitors are asked to choreograph themselves, aside from solos. The callbacks are often collectively portrayed as one of the most exhausting and stressful stages of the competition; each successive round sees cuts in which a significant portion of the remaining dancers are eliminated from competition and dancers are given a limited amount of time to adapt to styles they are sometimes wholly unfamiliar with while being physically taxed by the rapid progression of rounds and a limited amount of rest. At the end of this process, usually less than 40 competitors remain in a pool that final contestants are chosen from. 
Most seasons have featured 20 "top '' finalists for the competition portion of the show, but season 1 was represented by a Top 16, season 7 saw a Top 11, and seasons 13 through 15 have featured a Top 10. Following the finalist selection process, the show transitions into its regular competition phase, which lasts the rest of the season. The competition stage is typically divided into eight weeks, generally with two contestants eliminated per week. Dancers are paired - up into male - female couples that will sometimes stay paired for much of the remaining competition if neither is eliminated (since season 7, competitors have also been occasionally paired with "All Stars '', returning dancers from previous seasons who partner with the contestant dancers, but who are not themselves competing). These couples perform 1 -- 2 duets per week in different styles which are typically, but not always, randomly selected. These duets, as with all non-solo performances at this stage in the competition, are choreographed by professional choreographers. Prior to most duet performances, a video packet of clips of the couple preparing to perform the routine is shown; these packets are intended not only to demonstrate the couple 's efforts to master the routine, but also to give glimpses of the personalities and personal histories of the dancers as well as insights from the choreographer as to the thematic, narrative, and artistic intentions of the piece. Following each duet performance, the week 's panel of judges gives critical feedback, often emphasizing the two key areas of technique and performance value. Duets and their accompanying video packets and critiques typically take up the majority of an episode but are often supplemented by solos, group numbers, and occasionally guest dance or musical performances. 
In season 1, each week of the competition featured a single episode, with dancers ' eliminations pre-recorded the week they occurred and then broadcast at the beginning of the next week 's episode. In seasons 2 - 8, the show 's weekly format was split between two episodes, a performance episode, as described above, and a results show which revealed the outcome of the at - home - viewer voting following the performance show of the same week. More recent seasons have returned to a one - show - per - week format, but with each week 's episode typically reflecting the results of voting for the previous week 's performances, with these results revealed at the end of the following week 's performances. Depending on the stage of the competition, each week may feature eliminations which are based entirely on an at - home viewer vote, or the vote may simply create a group of bottom dancers from which the show 's judges will select the final eliminations. Voting has also varied by season (and often within seasons) with regard to whether the voter selected individuals or couples. Following the announcement of their elimination, dancers are typically given a brief send - off via a video montage. Each competitive episode ends with a quick recap of the night 's routines accompanied by voting prompts. Episodes typically last around two hours, commercials included. There has also been variability in how long couples are kept together and how the at - home - viewer votes are balanced against judge decisions, though ultimately at some point in every season, the judges give up their power to save dancers and eliminations are determined exclusively by viewer votes. The total number of hours shown in a given week during the performance phase of the competition has varied from two to four hours. 
A season 's finale episode is often the most elaborately produced show of a season and features the last performances of the competitors, encore performances of many of the season 's most acclaimed routines, guest dancers (including returning past season competitors and cast - members from other international versions of the franchise), musical performances, and multiple video packets chronicling the course of the season 's events, all culminating in the announcement of the winner of the competition. Most seasons have featured a single winner, while seasons 9 and 10 featured both a male and female winner. Following the closure of the season, the Top Ten dancers often go on tour for several months, performing hit routines from the season among other performances. A typical season of So You Think You Can Dance is presided over by a panel of 2 -- 4 permanent judges, supplemented by occasional guest judges, with the panel sometimes ballooning up to twice or more its normal size for callback episodes or season finales. Executive producer and co-creator of the show, Nigel Lythgoe, is the only judge to have sat as a permanent member of the panel across all seasons, although ballroom specialist Mary Murphy has also sat as a permanent member of the panel in seven seasons. Other permanent judges have included film director and choreographer Adam Shankman, contemporary choreographer Mia Michaels, pop music and dance icon Paula Abdul, noted youth dancer Maddie Ziegler, music and dance artist Jason Derulo, and successful show alumnus and season 4 runner - up Stephen "tWitch '' Boss. Many earlier seasons frequently featured guest judges in occasional episodes, though this practice has become increasingly rare. These guest judge positions have typically been filled by choreographers who work regularly on the show (who in rare cases may also be former contestants themselves) and by iconic names from the entertainment industry. 
Guest judges for the show have included: Debbie Allen, Christina Applegate, Robin Antin, Toni Basil, Cicely Bradley, Kristin Chenoweth, Misty Copeland, Alex Da Silva, Ellen DeGeneres, Tyce Diorio, Joey Dowling, Napoleon and Tabitha D'umo, Carmen Electra, Brian Friedman, Jean - Marc Généreux, Jason Gilkison, Neil Patrick Harris, Hi - Hat, Katie Holmes, Dan Karaty, Lady Gaga, Carly Rae Jepsen, Lil ' C, Rob Marshall, Mandy Moore, Megan Mullally, Kenny Ortega, Toni Redpath, Debbie Reynolds, Wade Robson, Doriana Sanchez, Shane Sparks, Sonya Tayeh, Olisa Thompson, Stacey Tookey, Jesse Tyler Ferguson, and Travis Wall. † From its inception in season 6 and through season 10, the dancer showcase episode represented a non-competitive round with no viewer voting or subsequent eliminations, followed the next week by the first competitive round. In season 11 it was the first episode of the season upon which viewers voted. ‡ For seasons 8 - 10, the dancer showcase episode was combined with the Top 20 reveal episode, with groups of the dancers performing immediately after being revealed as finalists. * In both seasons 7 and 8, the judges decided not to eliminate any dancers on the occasion of one results show; in both cases this event was followed by the elimination of double the normal number of contestants the following week. Similarly, for format reasons, season 9 featured two shows with double eliminations, with four dancers eliminated instead of two for each of these shows. ⁂ Unlike all previous seasons, season 12 featured the elimination of one "street '' dancer and one "stage '' dancer each week, as opposed to one female and one male contestant (as in all previous seasons which eliminated two dancers per week). ° Season 13 (during which the show was subtitled ' The Next Generation ') featured competitors between the ages of 9 (or as young as 8 at time of application) and 14. 
* * In season 13, the judges held the audition rounds, but the all - stars, rather than the judges, made the eliminations during Academy week to choose the top 10. After this, in episodes 7 and 8, the judges decided which of the two contestants with the lowest viewer votes to eliminate. In episode 9, the two contestants with the lowest viewer votes were both eliminated, and in episodes 10 and 11, the contestant with the lowest viewer votes was eliminated. ° ° Season 15, with only 10 finalists in the live shows and the elimination of 2 per episode, was the shortest run of live show elimination rounds to date. Over the course of its fifteen seasons, So You Think You Can Dance has featured dozens of distinct dance styles in its choreographed routines. Most of these styles fall into four categories that are regularly showcased and can be found in almost every performance episode: western contemporary / classical styles, ballroom styles, hip - hop / street styles, and Jazz and its related styles. Various other forms of dance that do not especially fall into these broad categories are seen as well, but not as regularly. The following styles have all been seen in a choreographed duet or group routine; styles featured only in auditions or solos are not listed. Routines from the classically derived style of contemporary dance are the most common dances seen on the show, appearing in every performance episode of the series (and typically at least twice per episode). While contemporary, lyrical, and modern dance are typically considered three separate (if overlapping) styles of dance, the practice on So You Think You Can Dance has been to refer to all routines in this area as "contemporary '', except in the first season, where the label "lyrical '' was used for the same purpose. Ballet routines occur much more rarely, at a rate of one or two per season since their introduction in the fourth season. Hip - hop routines are also present in every performance episode. 
While these routines frequently feature elements from many different subgenres of hip - hop (locking and popping, for example) and various "street '' styles (such as breaking), they are typically all labelled under the umbrella term of hip - hop. An exception is the now frequently featured lyrical hip - hop, which is unique amongst all the styles on SYTYCD in that it is the only one that is held to have become a known distinct style at least in - part as a result of the show; the style is widely attributed to regular show choreographers Tabitha and Napoleon D'umo and the term itself to judge Adam Shankman. These two broad categories are occasionally supplemented by routines which are labelled as krump, breakdancing, waacking, and stepping. Ballroom styles are also seen regularly in every performance episode. These routines may use the movement of traditional International Standard forms or lean toward American competitive styles; other routines may use street or regional variants, or may combine elements of different variations. Jazz is featured in nearly all performance episodes. While these routines are typically labelled simply "Jazz '', the genre is notable as being one of the most fusional featured on the show and various style combinations and sub-categories have been referenced. Descended from Jazz but treated as a separate genre on SYTYCD, "Broadway '' is analogous to the label "Musical Theater '' outside the U.S. These dance styles are featured less frequently than their ballroom relatives, but have been seen intermittently since the first season. In addition to the broad categories above, many more styles that are less common in the U.S. are sometimes featured. Most of these are seen only once, but the Bollywood style has been featured several times per season since the fourth season. On September 2, 2009, as prelude to season 6, a special show aired featuring judge picks for the top 15 routines from the first five seasons. 
At the end of the show, show creator and judge Nigel Lythgoe presented his favorite performance, a contemporary piece choreographed by Tyce Diorio and performed by Melissa Sandvig and Ade Obayomi. In March 2014, Chinese television station CCTV broadcast a promotional episode in which notable all - star dancers from the U.S. and Chinese versions of So You Think You Can Dance competed directly against one another as teams. Titled Zhōngměi Wǔ Lín Guànjūn Duìkàngsài - Super Dancer Born Tonight, the show was shot in Las Vegas but never aired on U.S. television. So You Think You Can Dance premiered with over 10 million viewers in 2005. For season 1, it was the No. 1 summer show on television. However, when NBC 's America 's Got Talent premiered in the summer of 2006, it took the title of "# 1 summer show '' and, over the following few years, broadened its lead. In summer 2009, SYTYCD premiered strong with a 3.4 rating in its target demographic, although with the start of America 's Got Talent roughly a month later in the same timeslot, Dance fell to No. 4 on the ratings board. It continued to lose viewers throughout the summer of 2009 and ended up with an average of approximately 8 million viewers. Fox then moved SYTYCD to its fall 2009 schedule, where its ratings continued to decline, hitting an all - time series low of 4.6 million viewers for a "special '' episode hosted by Nigel Lythgoe on September 2, 2009. The move to the fall was short - lived. After dropping to an average of 6 million viewers, Fox moved SYTYCD back to the summer in 2010. With Mia Michaels replacing Mary Murphy and former contestants termed "All - Stars '' being used as partners, the ratings for Dance continued to slide to all - time series lows, dropping to just 5.6 million viewers on July 15, 2010. For season 7, So You Think You Can Dance averaged just over 5 million viewers. After season 7, Mia Michaels was replaced on the judge 's panel by returning personality Mary Murphy. 
The change appeared to have little effect on the ratings, and the show continued to average just over five million viewers per episode in 2011 's season 8. Season 9 saw a slight uptick in ratings early on, with each of the season 's first five episodes garnering between six and seven million viewers, but the rise was short - lived and the show 's ratings hit a new low of 4.16 million viewers on August 29, 2012. Season 10 maintained similar numbers, averaging about 4 million viewers per episode in 2013, with a 4.3 million viewership for the last episode of the season, an all - time series low for a finale. In April 2014, Nigel Lythgoe appealed on Twitter to fans to share information about the show ahead of the 11th season 's May premiere in an attempt to augment the show 's ratings for the upcoming season and bolster its chances of renewal thereafter. The show was renewed for a 12th season, but ratings continued to decline, with an average of around 3.5 million viewers per show. FOX renewed the show for a 13th season, but with a drastically re-worked format focused on child dancers. Ratings declined further for the new version, with only five episodes breaking the 3 million viewer mark; the finale saw a series low viewership of just 2.27 million viewers. In 2016, a New York Times study of the 50 TV shows with the most Facebook Likes found that "in general '', Dance "is more popular in cities, though it hits peak popularity in Utah ''. Dance competition had been a part of American television for decades before the premiere of So You Think You Can Dance, but usually in the form of all - around talent searches (such as Star Search, Soul Train, or Showtime at the Apollo). However, a season - long American Idol - like talent - search show with a sole focus on dance had never been broadcast on American network television. 
Producers and judges associated with the show have stated on numerous occasions, both within the context of the show and in interviews, that the series was meant to rejuvenate the visibility and appreciation of dance as an art form in the U.S. and to give exposure to struggling dancers. Series judge Mary Murphy says, for example, "Of course you hope you can make a living at it, because you do n't want to give up on something that you do, but the honest truth is most dancers have to carry one or two jobs and dance as much as they can on the side -- it 's a very lucky dancer who gets a full scholarship. '' A number of dance - themed competition shows have been produced for American television since the premiere of So You Think You Can Dance, including America 's Best Dance Crew, Superstars of Dance, Live to Dance, and World of Dance. In 2009, Lythgoe came together with fellow SYTYCD judge Adam Shankman as well as Katie Holmes, Carrie Ann Inaba, and others in the dance entertainment industry to launch The Dizzyfeet Foundation, with the aim of providing scholarships and training to young dancers of limited means. The foundation has been referenced sporadically on the show since. In 2010, Lythgoe, with the assistance of other SYTYCD personalities and long - time healthy lifestyles proponent Congresswoman Eleanor Holmes Norton, was successful in getting another of his dance - oriented concepts realized -- an official National Dance Day, now held annually on the last Saturday of July, to promote fitness through movement. This national dance day has been celebrated annually by the show since. Before the end of 2005, the year the series first premiered, its format had already been licensed for the first of a number of foreign adaptations. To date, the resulting So You Think You Can Dance franchise has produced 28 shows representing 39 different countries and comprising more than 90 individual seasons. 
These adaptations have aired in Armenia, Australia, Belgium, Canada, China, Denmark, Egypt, Finland, France, Georgia, Germany, Greece, Iraq, India, Israel, Jordan, Kazakhstan, Kuwait, Lebanon, Lithuania, Malaysia, Morocco, the Netherlands, New Zealand, Norway, Palestinian Territories, Poland, Portugal, Qatar, South Africa, Sudan, Sweden, Syria, Tunisia, Turkey, Ukraine, United Arab Emirates, the United Kingdom and Vietnam. 
what is the difference between kewra water and rose water
Kewra - wikipedia Kewra, keora or kewda (Hindi: केवड़ा, Bengali: কেওড়া, Odia: କିଆ, Urdu: کیوڑہ, Punjabi: ਕੇਵੜਾ / کیوڑہ) is an extract distilled from the flower of the pandanus plant. It is primarily used to flavour South Asian cuisine. Pandanus is native to tropical South Asia, Southeast Asia and Australasia, and is used as a flavoring agent throughout much of these regions. The male pandanus flower is almost exclusively used for kewra distillation. Approximately 95 % of the kewra flowers exported from India are collected from areas surrounding Berhampur city in Ganjam district. The coastal areas of Chhatrapur, Rangeilunda, Patrapur and Chikiti are famous for their aromatic pandanus plantations. Arguably, flowers from coastal locales have an exquisite floral note that rivals inland varieties, with the most famous being those endemic to and cultivated in Gopalpur - on - Sea. Cultivation of the kewra flower is a major source of income in Ganjam district, and there are nearly 200 registered kewra distillation factories. Kewra is also used in traditional Indian perfumery, both as a functional fragrance and in Ittar.