Berlin, which was the capital of the then Kingdom of Prussia (the largest and most influential state in both the Confederation and the empire). Two decades later, the current parliament building was erected. The Reichstag delegates were elected by direct and equal male suffrage (and not the three-class electoral system prevailing in Prussia until 1918). The Reichstag did not participate in the appointment of the Chancellor until the parliamentary reforms of October 1918. After the Revolution of November 1918 and the establishment of the Weimar Constitution, women were given the right to vote for (and serve in) the Reichstag, and the parliament could use the no-confidence vote to force the chancellor or any cabinet member to resign. In 1933, Adolf Hitler was appointed Chancellor and through the Reichstag Fire Decree, the Enabling Act of 1933 and the death of President Paul von Hindenburg in 1934 gained unlimited power. After this, the Reichstag met only rarely, usually at the Krolloper (Kroll Opera House) to unanimously rubber-stamp the decisions of the government. It last convened on 26 April 1942. With the new Constitution of 1949, the Bundestag was established as the new West German parliament. Because West Berlin was not officially under the jurisdiction of the Constitution, a legacy of the Cold War, the Bundestag met in Bonn in several different buildings, including (provisionally) a former waterworks facility. In addition, owing to the city's legal status, citizens of West Berlin were unable to vote in elections to the Bundestag, and were instead represented by 22 non-voting delegates chosen by the House of Representatives, the city's legislature. The Bundeshaus in Bonn is the former parliament building of Germany. The sessions of the German Bundestag were held there from 1949 until its move to Berlin in 1999. 
Today it houses the International Congress Centre Bundeshaus Bonn and, in the northern areas, the branch office of the Bundesrat ("Federal Council"), which represents the Länder – the federated states. The southern areas became part of German offices for the United Nations in 2008. The former Reichstag building housed a history exhibition (Fragen an die deutsche Geschichte) and served occasionally as a conference center. The Reichstag building was also occasionally used as a venue for sittings of the Bundestag and its committees and the Bundesversammlung (Federal Convention), the body which elects the German Federal President. However, the Soviets harshly protested against the use of the Reichstag building by institutions of the Federal Republic of Germany and tried to disturb the sittings by flying supersonic jets close to the building. Since 19 April 1999, the German parliament has again assembled in Berlin in its original Reichstag building, which was completed in 1894 to the plans of German architect Paul Wallot and underwent a significant renovation under the lead of British architect Lord Norman Foster. Parliamentary committees and subcommittees, public hearings and parliamentary group meetings take place in three auxiliary buildings which surround the Reichstag building: the Jakob-Kaiser-Haus, Paul-Löbe-Haus and Marie-Elisabeth-Lüders-Haus. In 2005, a small aircraft crashed close to the German Parliament, and private air traffic over central Berlin was subsequently banned.

Tasks

Together with the Bundesrat, the Bundestag is the legislative branch of the German political system. Although most legislation is initiated by the executive branch, the Bundestag considers the legislative function its most important responsibility, concentrating much of its energy on assessing and amending the government's legislative program. The committees (see below) play a prominent role in this process.
Plenary sessions provide a forum for members to engage in public debate on legislative issues before them, but they tend to be well attended only when significant legislation is being considered. The Bundestag members are the only federal officials directly elected by the public; the Bundestag in turn elects the Chancellor and, in addition, exercises oversight of the executive branch on issues of both substantive policy and routine administration. This check on executive power can be employed through binding legislation, public debates on government policy, investigations, and direct questioning of the chancellor or cabinet officials. For example, the Bundestag can conduct a question hour (Fragestunde), in which a government representative responds to a written question previously submitted by a member. Members can ask related questions during the question hour. The questions can concern anything from a major policy issue to a specific constituent's problem.
Use of the question hour has increased markedly over the past forty years, with more than 20,000 questions being posed during the 1987–90 term. Understandably, the opposition parties actively exercise their parliamentary right to scrutinize government actions. Constituent services also take place via the Petition Committee. In 2004, the Petition Committee received over 18,000 complaints from citizens and was able to negotiate a mutually satisfactory solution to more than half of them. In 2005, as a pilot of the potential of internet petitions, a version of e-Petitioner was produced for the Bundestag. This was a collaborative project involving the Scottish Parliament, the International Teledemocracy Centre and the Bundestag's Online Services Department. The system was formally launched on 1 September 2005, and in 2008 the Bundestag moved to a new system based on its evaluation.<ref>{{cite web|url=http://www.tab.fzk.de/de/brief/brief32.pdf |archive-url=https://web.archive.org/web/20090316102923/http://www.tab.fzk.de/de/brief/brief32.pdf |url-status=dead |archive-date=16 March 2009 |title=Öffentliche Petitionen beim deutschen Bundestag - erste Ergebnisse der Evaluation des Modellversuchs = An Evaluation Study of Public Petitions at the German Parliament |access-date=16 June 2009 |last=Trenel |first=M. |work=TAB Brief Nr 32 |publisher=Deutscher Bundestag |year=2007 }}</ref>

Electoral term

The Bundestag is elected for four years, and new elections must be held between 46 and 48 months after the beginning of its electoral term, unless the Bundestag is dissolved prematurely. Its term ends when the next Bundestag convenes, which must occur within 30 days of the election. Prior to 1976, there could be a period in which one Bundestag had been dissolved and the next Bundestag could not yet be convened; during this period, the rights of the Bundestag were exercised by a so-called "Permanent Committee".
Election

Germany uses the mixed-member proportional representation system, a system of proportional representation combined with elements of first-past-the-post voting. The Bundestag has 598 nominal members, elected for a four-year term; these seats are distributed between the sixteen German states in proportion to the states' population eligible to vote. Every elector has two votes: a constituency vote (first vote) and a party list vote (second vote). Based solely on the first votes, 299 members are elected in single-member constituencies by first-past-the-post voting. The second votes are used to produce a proportional number of seats for parties, first in the states, and then on the federal level. Seats are allocated using the Sainte-Laguë method. If a party wins fewer constituency seats in a state than its second votes would entitle it to, it receives additional seats from the relevant state list. Parties can file lists in every single state under certain conditions – for example, a fixed number of supporting signatures. Parties can receive second votes only in those states in which they have filed a state list. If a party, by winning single-member constituencies in one state, receives more seats than it would be entitled to according to its second vote share in that state (so-called overhang seats), the other parties receive compensation seats. Owing to this provision, the Bundestag usually has more than 598 members. The 20th and current Bundestag, for example, has 736 seats: 598 regular seats and 138 overhang and compensation seats. Because overhang seats are calculated at the state level, many more seats are added to balance them out among the different states – more than would be needed to compensate for overhang at the national level alone – in order to avoid negative vote weight.
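The Sainte-Laguë allocation mentioned above is a highest-averages method and can be sketched in a few lines. This is an illustrative implementation with made-up vote counts, not real election data:

```python
def sainte_lague(votes: dict[str, int], seats: int) -> dict[str, int]:
    """Allocate `seats` proportionally by the Sainte-Laguë
    (odd-divisor highest-averages) method."""
    alloc = {party: 0 for party in votes}
    for _ in range(seats):
        # The next seat goes to the party with the highest quotient
        # votes / (2 * seats_already_won + 1).
        winner = max(votes, key=lambda p: votes[p] / (2 * alloc[p] + 1))
        alloc[winner] += 1
    return alloc

# Hypothetical vote counts for three parties contesting 10 seats:
print(sainte_lague({"A": 53000, "B": 24000, "C": 23000}, 10))
# -> {'A': 6, 'B': 2, 'C': 2}
```

In a real allocation, ties between quotients are broken by lot; the sketch above simply takes the first maximum.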
To qualify for seats based on the party-list vote share, a party must either win three single-member constituencies via first votes (basic mandate clause) or exceed a threshold of 5% of the second votes nationwide. If a party wins only one or two single-member constituencies and fails to get at least 5% of the second votes, it keeps the single-member seat(s), but other parties that meet at least one of the two threshold conditions receive compensation seats. In the most recent example of this, during the 2002 election, the PDS won only 4.0% of the second votes nationwide, but won two constituencies in the state of Berlin. The same applies if an independent candidate wins a single-member constituency, which has not happened since the 1949 election. If a voter casts a first vote for a successful independent candidate, or for a successful candidate whose party failed to qualify for proportional representation, his or her second vote does not count toward proportional representation. However, it does count toward whether the elected party exceeds the 5% threshold. Parties representing recognized national minorities (currently Danes, Frisians, Sorbs, and Romani people) are exempt from both the 5% threshold and the basic mandate clause, but normally only run in state elections. The only party that has so far been able to benefit from this provision at the federal level is the South Schleswig Voters' Association, which represents the Danish and Frisian minorities in Schleswig-Holstein and won a seat in 1949 and 2021.

Latest election result

The latest federal election was held on Sunday, 26 September 2021, to elect the members of the 20th Bundestag.

Organization

Parliamentary groups

The most important organisational structures within the Bundestag are parliamentary groups (Fraktionen; sing. Fraktion). A parliamentary group must consist of at least 5% of all members of parliament.
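The electoral qualification rules described earlier (the 5% second-vote threshold and the three-constituency basic mandate clause) reduce to a simple predicate. The function name and figures below are illustrative, not official terminology:

```python
def qualifies_for_list_seats(second_vote_share: float,
                             constituency_wins: int) -> bool:
    """Return True if a party enters proportional seat allocation.

    A party qualifies if it clears 5% of second votes nationwide OR
    wins at least three constituencies (basic mandate clause).
    Minority parties' exemptions are not modeled here.
    """
    return second_vote_share >= 0.05 or constituency_wins >= 3

# The 2002 PDS case from the text: 4.0% nationwide, two constituency
# wins -> keeps the two direct seats but receives no list seats.
print(qualifies_for_list_seats(0.04, 2))  # -> False
```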
Members of parliament from different parties may only join in a group if those parties did not run against each other in any German state during the election. Normally, all parties that surpassed the 5% threshold form a parliamentary group. The CDU and CSU have always formed a single united Fraktion (CDU/CSU), which is possible because the CSU only runs in the state of Bavaria and the CDU only runs in the other 15 states. The size of a party's Fraktion determines the extent of its representation on committees, the time slots allotted for speaking, the number of committee chairs it can hold, and its representation in executive bodies of the Bundestag. The Fraktionen, not the members, receive the bulk of government funding for legislative and administrative activities. The leadership of each Fraktion consists of a parliamentary party leader, several deputy leaders, and an executive committee. The leadership's major responsibilities are to represent the Fraktion, enforce party discipline and orchestrate the party's parliamentary activities. The members of each Fraktion are distributed among working groups focused on specific policy-related topics such as social policy, economics, and foreign policy. The Fraktion meets every Tuesday afternoon in the weeks in which the Bundestag is in session to consider legislation before the Bundestag and formulate the party's position on it. Parties that do not hold 5% of the Bundestag seats may be granted the status of a Gruppe (literally "group", but a different status from Fraktion) in the Bundestag; this is decided case by case, as the rules of procedure do not specify a fixed number of seats for this status. Most recently, this applied to the Party of Democratic Socialism (PDS) from 1990 to 1998. This status entails some privileges, which are in general fewer than those of a Fraktion.

Executive bodies

The Bundestag's executive bodies include the Council of Elders and the Presidium.
The council consists of the Bundestag leadership together with the most senior representatives of each Fraktion, with the number of these representatives tied to the strength of the parliamentary groups in the chamber. The council is the coordination hub, determining the daily legislative agenda and assigning committee chairpersons based on parliamentary group representation. The council also serves as an important forum for interparty negotiations on specific legislation and procedural issues. The Presidium is responsible for the routine administration of the Bundestag, including its clerical and research activities. It consists of the chamber's president (usually elected from the largest Fraktion) and vice presidents (one from each Fraktion).

Committees

Most of the legislative work in the Bundestag is the product of standing committees, which exist largely unchanged throughout one legislative period. The number of committees approximates the number of federal ministries, and the titles of each are roughly similar (e.g., defense, agriculture, and labor).
in the Nazi era by his father allowed Herbert Quandt to buy BMW. The BMW 700 was successful and assisted in the company's recovery. The 1962 introduction of the BMW New Class compact sedans was the beginning of BMW's reputation as a leading manufacturer of sport-oriented cars. Throughout the 1960s, BMW expanded its range by adding coupe and luxury sedan models. The BMW 5 Series mid-size sedan range was introduced in 1972, followed by the BMW 3 Series compact sedans in 1975, the BMW 6 Series luxury coupes in 1976 and the BMW 7 Series large luxury sedans in 1978. The BMW M division released its first road car, a mid-engine supercar, in 1978. This was followed by the BMW M5 in 1984 and the BMW M3 in 1986. Also in 1986, BMW introduced its first V12 engine in the 750i luxury sedan. The company purchased the Rover Group in 1994; however, the takeover was not successful and caused BMW large financial losses. In 2000, BMW sold off most of the Rover brands, retaining only the Mini brand. In 1998, BMW also acquired the rights to the Rolls-Royce brand from Vickers plc. The 1995 BMW Z3 expanded the line-up to include a mass-production two-seat roadster, and the 1999 BMW X5 was the company's entry into the SUV market. The first modern mass-produced turbocharged petrol engine was introduced in 2006 (from 1973 to 1975, BMW had built 1,672 units of a turbocharged M10 engine for the BMW 2002 turbo), with most engines switching over to turbocharging over the 2010s. The first hybrid BMW was the 2010 BMW ActiveHybrid 7, and BMW's first mass-production electric car was the BMW i3 city car, released in 2013 (from 1968 to 1972, BMW had built two battery-electric BMW 1602 Elektro saloons for the 1972 Olympic Games). After many years of establishing a reputation for sporting rear-wheel-drive cars, BMW's first front-wheel-drive car was the 2014 BMW 2 Series Active Tourer multi-purpose vehicle (MPV).
In January 2021, BMW announced that its sales in 2020 fell by 8.4% due to the impact of the COVID-19 pandemic and the associated restrictions. However, in the fourth quarter of 2020, BMW saw a 3.2% rise in demand.

Branding

Company name

The name BMW is an abbreviation for Bayerische Motoren Werke. This name is grammatically incorrect (in German, compound words must not contain spaces), which is why the grammatically correct form Bayerische Motorenwerke has been used in several publications and advertisements in the past. Bayerische Motorenwerke translates into English as Bavarian Motor Works. The suffix AG, short for Aktiengesellschaft, signifies an incorporated entity owned by shareholders, akin to "Inc." (US) or PLC, "Public Limited Company" (UK). The terms Beemer, Bimmer and Bee-em are sometimes used as slang for BMW in the English language and are sometimes used interchangeably for cars and motorcycles.

Logo

The circular blue and white BMW logo or roundel evolved from the circular Rapp Motorenwerke company logo, which featured a black ring bearing the company name surrounding the company emblem, a horse's head couped on a plinth. BMW retained Rapp's black ring inscribed with the company name, but adopted as the central element a circular escutcheon bearing a quasi-heraldic reference to the coat of arms (and flag) of the Free State of Bavaria (as the state of their origin was named after 1918), being the arms of the House of Wittelsbach, Dukes and Kings of Bavaria. However, as the local law regarding trademarks forbade the use of state coats of arms or other symbols of sovereignty on commercial logos, the design was sufficiently differentiated to comply, but retained the tinctures azure (blue) and argent (white). The current iteration of the logo was introduced in 2020, removing the 3D effects that had been used in renderings of the logo and removing the black outline encircling the roundel.
The logo will be used in BMW's branding but will not be used on vehicles. The origin of the logo as a portrayal of the movement of an aircraft propeller – the white blades seeming to cut through a blue sky – is a myth which sprang from a 1929 BMW advertisement depicting the BMW emblem overlaid on a rotating propeller, with the quarters defined by a strobe-light effect, promoting an aircraft engine then being built by BMW under license from Pratt & Whitney. It is well established that this propeller portrayal was first used in a BMW advertisement in 1929 – twelve years after the logo was created – so this is not the true origin of the logo.

Slogan

The slogan 'The Ultimate Driving Machine' was first used in North America in 1974. In 2010, this long-lived campaign was mostly supplanted by 'Joy', a campaign intended to make the brand more approachable and to better appeal to women. By 2012, BMW had returned to 'The Ultimate Driving Machine'.

Finances

For the fiscal year 2017, BMW reported earnings of €8.620 billion, with an annual revenue of €98.678 billion, an increase of 4.8% over the previous fiscal cycle. BMW's shares traded at over €77 per share, and its market capitalization was valued at US$55.3 billion in November 2018.

Motorcycles

BMW began production of motorcycle engines, and then motorcycles, after World War I. Its motorcycle brand is now known as BMW Motorrad. Its first successful motorcycle, after the failed Helios and Flink, was the "R32" in 1923, though production originally began in 1921. This had a "boxer" twin engine, in which a cylinder projects into the air-flow from each side of the machine. Apart from their single-cylinder models (basically to the same pattern), all BMW motorcycles used this distinctive layout until the early 1980s. Many BMWs are still produced in this layout, which is designated the R Series. Since 1969, the entire BMW motorcycle production has been located at the company's Berlin-Spandau factory.
During the Second World War, BMW produced the BMW R75 motorcycle with a motor-driven sidecar attached; combined with a lockable differential, this made the vehicle very capable off-road. In 1982 came the K Series: still shaft-driven, but water-cooled and with either three or four cylinders mounted in a straight line from front to back. Shortly after, BMW also started making the chain-driven F and G series with single and parallel-twin Rotax engines. In the early 1990s, BMW updated the airhead boxer engine, which became known as the oilhead. In 2002, the oilhead engine had two spark plugs per cylinder. In 2004 it added a built-in balance shaft, an increased capacity to and enhanced performance to for the R1200GS, compared to of the previous R1150GS. More powerful variants of the oilhead engines are available in the R1100S and R1200S, producing , respectively. In 2004, BMW introduced the new K1200S Sports Bike, which marked a departure for BMW. It had an engine producing , derived from the company's work with the Williams F1 team, and was lighter than previous K models. Innovations include electronically adjustable front and rear suspension, and a Hossack-type front fork that BMW calls the Duolever. BMW introduced anti-lock brakes on production motorcycles starting in the late 1980s. The generation of anti-lock brakes available on the 2006 and later BMW motorcycles paved the way for the introduction of electronic stability control, or anti-skid technology, later in the 2007 model year. BMW has been an innovator in motorcycle suspension design, taking up telescopic front suspension long before most other manufacturers, then switching to an Earles fork, a front suspension by swinging fork, from 1955 to 1969. Most modern BMWs use a true rear swingarm, single-sided at the back (as opposed to the regular swinging fork, usually, and wrongly, called a swinging arm). Some BMWs use yet another trademark front suspension design, the Telelever, introduced in the early 1990s.
Like the Earles fork, the Telelever significantly reduces dive under braking. On 31 January 2013, BMW Group announced that Pierer Industrie AG had bought Husqvarna Motorcycles for an undisclosed amount. The company is headed by Stephan Pierer (CEO of KTM). Pierer Industrie AG is 51% owner of KTM and 100% owner of Husqvarna. In September 2018, BMW Motorrad unveiled a new self-driving motorcycle, with the goal of using the technology to help improve road safety. The design of the bike was inspired by the company's BMW R1200 GS model.

Automobiles

Current models

The current model lines of BMW cars are:
1 Series five-door hatchbacks (model code F40). A four-door sedan variant (model code F52) is also sold in China and Mexico.
2 Series two-door coupes (model code G42), "Active Tourer" five-seat MPVs (F45), "Gran Tourer" seven-seat MPVs (F46), and four-door "Gran Coupe" fastbacks (model code F44).
3 Series four-door sedans (model code G20) and five-door station wagons (G21).
4 Series two-door coupes (model code G22), two-door convertibles (model code G23) and five-door "Gran Coupe" fastbacks (model code G24).
5 Series four-door sedans (model code G30) and five-door station wagons (G31). A long-wheelbase sedan variant (G38) is also sold in China.
6 Series "Gran Turismo" five-door coupes (model code G32).
7 Series four-door sedans (model code G11) and long-wheelbase four-door sedans (model code G12).
8 Series two-door coupes (model code G14), two-door convertibles (G15) and "Gran Coupe" four-door fastbacks (G16).
The current model lines of the X Series SUVs and crossovers are: X1 (F48), X2 (F39), X3 (G01), X4 (G02), X5 (G05), X6 (G06) and X7 (G07). The current model line of the Z Series two-door roadsters is the Z4 (model code G29).

i models

All-electric vehicles and plug-in hybrid vehicles are sold under the BMW i sub-brand.
The current model range consists of the i3, a five-door B-segment (supermini) hatchback powered by an electric motor (with an optional REx petrol engine). BMW announced the launch of two new BMW i all-electric models: the BMW iX3 SUV by late 2020, and the BMW i4 four-door sedan in 2021. In addition, several plug-in hybrid models built on existing platforms have been marketed as iPerformance models. Examples include the 225xe, using a 1.5 L three-cylinder turbocharged petrol engine with an electric motor; the 330e/530e, using a 2.0 L four-cylinder engine with an electric motor; and the 740e, using a 2.0-litre turbocharged petrol engine with an electric motor. Crossover and SUV plug-in hybrid models have also been released using i technology: the X1 xDrive25e, X2 xDrive25e, X3 xDrive30e, and X5 xDrive40e.

M models

The BMW M GmbH subsidiary (called BMW Motorsport GmbH until 1993) has produced high-performance versions of various BMW models since 1978. The recent model range consists of:
M2 two-door coupe
M3 four-door sedan
M4 two-door coupe/convertible
M5 four-door sedan
M8 two-door coupe/convertible and four-door sedan
X3 M five-door compact SUV
X4 M five-door coupe-styled compact SUV
X5 M five-door SUV
X6 M five-door coupe-styled SUV
The letter "M" is also often used in the marketing of BMW's regular models, for example the F20 M140i model, the G11 M760Li model and various optional extras called "M Sport", "M Performance" or similar.

Motorsport

BMW has a long history of motorsport activities, including:
Touring cars, such as DTM, WTCC, ETCC and BTCC
Formula One
Endurance racing, such as the 24 Hours Nürburgring, 24 Hours of Le Mans, 24 Hours of Daytona and Spa 24 Hours
Isle of Man TT
Dakar Rally
American Le Mans Series
IMSA SportsCar Championship
Formula BMW – a junior racing formula category
Formula Two
Formula E

Involvement in the arts

Art Cars

In 1975, sculptor Alexander Calder was commissioned to paint the BMW 3.0 CSL racing car driven by Hervé Poulain at the 24 Hours of Le Mans, which became the first in the series of BMW Art Cars. Since Calder's work, many other renowned artists from throughout the world have created BMW Art Cars, including David Hockney, Jenny Holzer, Roy Lichtenstein, Robert Rauschenberg, Frank Stella, and Andy Warhol. To date, a total of 19 BMW Art Cars, based on both racing and regular production vehicles, have been created.

Architecture

The global BMW Headquarters in Munich represents the cylinder head of a four-cylinder engine. It was designed by Karl Schwanzer and was completed in 1972. The building has become a European icon and was declared a protected historic building in 1999. The main tower consists of four vertical cylinders standing next to and across from each other. Each cylinder is divided horizontally in its center by a mold in the facade. Notably, these cylinders do not stand on the ground; they are suspended on a central support tower. The BMW Museum is a futuristic cauldron-shaped building, also designed by Karl Schwanzer, which opened in 1972. The interior has a spiral theme and the roof is a 40-metre-diameter BMW logo. BMW Welt, the company's exhibition space in Munich, was designed by Coop Himmelb(l)au and opened in 2007. It includes a showroom and lifting platforms where a customer's new car is theatrically unveiled to the customer.

Film

In 2001 and 2002, BMW produced a series of eight short films called The Hire, with plots based around BMW models being driven to extremes by Clive Owen. The directors of The Hire included Guy Ritchie, John Woo, John Frankenheimer and Ang Lee. In 2016, a ninth film in the series was released. The 2006 "BMW Performance Series" was a marketing event geared to attract black car buyers.
It consisted of seven concerts by jazz musician Mike Phillips, and screenings of films by black filmmakers.

Visual arts

BMW was the principal sponsor of the 1998 The Art of the Motorcycle exhibition at various Guggenheim museums, though the financial relationship between BMW and the Guggenheim Foundation was criticised in many quarters. In 2012, BMW began sponsoring Independent Collectors' production of the BMW Art Guide, the first global guide to private and publicly accessible collections of contemporary art worldwide. The fourth edition, released in 2016, features 256 collections from 43 countries.

Production and sales

BMW produces complete automobiles in the following countries:
Germany: Munich, Dingolfing, Regensburg and Leipzig
Austria: Graz
United States: Spartanburg
Mexico: San Luis Potosí
South Africa: Rosslyn
India: Chennai
China: Shenyang

In South Korea, 4% of the affected cars had experienced failures in their EGR coolers, leading approximately 20 owners to sue the company.

Industry collaboration

BMW has collaborated with other car manufacturers on the following occasions:
McLaren Automotive: BMW designed and produced the V12 engine that powered the McLaren F1.
Peugeot and Citroën: joint production of four-cylinder petrol engines, beginning in 2004.
Daimler Benz: joint venture to produce the hybrid drivetrain components used in the ActiveHybrid 7, and development of automated driving technology.
Toyota: a three-part agreement in 2013 to jointly develop fuel cell technology, develop a joint platform for a sports car (the 2018 BMW Z4 (G29) and Toyota Supra) and research lithium-air batteries.
Audi and Mercedes: joint purchase of Nokia's Here WeGo (formerly Here Maps) in 2015.
In 2018, Horizn Studios collaborated with BMW to launch special luggage editions.

Sponsorships

BMW made a six-year sponsorship deal with the United States Olympic Committee in July 2010.
In golf, BMW has sponsored various events, including the PGA Championship since 2007, the Italian Open from 2009 to 2012, the BMW Masters in China from 2012 to 2015 and the BMW International Open in Munich since 1989. In rugby, BMW sponsored the South Africa national rugby union team from 2011 to 2015.

Environmental record

BMW is a charter member of the U.S. Environmental Protection Agency's (EPA) National Environmental Achievement Track, which recognizes companies for their environmental stewardship and performance. It is also a member of the South Carolina Environmental Excellence Program. Since 1999, BMW has been named the world's most sustainable automotive company every year by the Dow Jones Sustainability Index. The BMW Group is one of three automotive companies to be featured every year in the index. In 2001, the BMW Group committed itself to the United Nations Environment Programme, the UN Global Compact and the Cleaner Production Declaration. It was also the first company in the automotive industry to appoint an environmental officer, in 1973. BMW is a member of the World Business Council for Sustainable Development. In 2012, BMW was the highest-ranked automotive company in the Carbon Disclosure Project's Global 500 list, with a score of 99 out of 100. The BMW Group was rated the most sustainable DAX 30 company by Sustainalytics in 2012. To reduce vehicle emissions, BMW is improving the efficiency of existing fossil-fuel powered models, while researching electric power, hybrid power and hydrogen for future models. During the first quarter of 2018, BMW sold 26,858 electrified vehicles (EVs, PHEVs and hybrids).

Car-sharing services

DriveNow was a joint venture between BMW and Sixt that operated in Europe from 2011 until 2019. By December 2012, DriveNow operated over 1,000 vehicles in five cities, with approximately 60,000 customers. In 2012, the BMW-owned subsidiary Alphabet began a corporate car-sharing service in Europe called AlphaCity.
The ReachNow car-sharing service was launched in Seattle in April 2016. ReachNow currently operates in Seattle, Portland and Brooklyn. In 2018, BMW announced the launch of a pilot car subscription service for the United States called Access by BMW (its first one for the country), in Nashville, Tennessee. In January 2021, the company said that Access by BMW was "suspended". Overseas subsidiaries Production facilities China The first BMW production facility in China was opened in 2004, as a result of a joint venture between BMW and Brilliance Auto. The plant was opened in the Shenyang industrial area and produces 3 Series and 5 Series models for the Chinese market. In 2012, a second factory was opened in Shenyang. Between January and November 2014, BMW sold 415,200 vehicles in China, through a network of over 440 BMW stores and 100 Mini stores. On 13 December 2021, BMW announced that it would move production of the X5 from the United States to China. Hungary On 31 July 2018, BMW announced that it would build a 1 billion euro car factory in Hungary. The plant, to be built near Debrecen, will have a production capacity of 150,000 cars a year. Mexico In July 2014, BMW announced it was establishing a plant in Mexico, in the city and state of San Luis Potosí, involving an investment of $1 billion. The plant will employ 1,500 people, and produce 150,000 cars annually. Netherlands The Mini Convertible, Mini Countryman and BMW X1 are currently produced in the Netherlands at the VDL Nedcar factory in Born. Long-term orders for the Mini Countryman ended in 2020. South Africa BMWs have been assembled in South Africa since 1968, when Praetor Monteerders' plant was opened in Rosslyn, near Pretoria. BMW initially bought shares in the company, before fully acquiring it in 1975; in so doing, the company became BMW South Africa, the first wholly owned subsidiary of BMW to be established outside Germany.
Unlike United States manufacturers such as Ford and GM, which divested from the country in the 1980s, BMW retained full ownership of its operations in South Africa. Following the end of apartheid in 1994, and the lowering of import tariffs, BMW South Africa ended local production of the 5 Series and 7 Series in order to concentrate on production of the 3 Series for the export market. South African–built BMWs are now exported to right-hand-drive markets including Japan, Australia, New Zealand, the United Kingdom, Indonesia, Malaysia, Singapore and Hong Kong, as well as Sub-Saharan Africa. Since 1997, BMW South Africa has produced vehicles in left-hand drive for export to Taiwan, the United States and Iran, as well as South America. Three unique models that BMW Motorsport created for the South African market were the E23 M745i (1983), which used the M88 engine from the BMW M1; the BMW 333i (1986), which added a six-cylinder 3.2-litre M30 engine to the E30; and the E30 BMW 325is (1989), which was powered by an Alpina-derived 2.7-litre engine. The plant code (position 11 in the VIN) for South African-built models is "N". United States BMW cars have been officially sold in the United States since 1956 and manufactured in the United States since 1994. The first BMW dealership in the United States opened in 1975. In 2016, BMW was the twelfth highest selling brand in the United States. The manufacturing plant in Greer, South Carolina has the highest production volume of any BMW plant worldwide, currently producing approximately 1,500 vehicles per day. The models produced at the Spartanburg plant are the X3, X4, X6 and X7 SUV models. In December 2021, BMW announced that production of the X5 would move to China. In addition to the South Carolina manufacturing facility, BMW's North American companies include sales, marketing, design, and financial services operations in the United States, Mexico, Canada and Latin America.
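The plant code mentioned above sits at a fixed position in a standard 17-character VIN, so it can be read off directly. A minimal sketch (the function name and the sample VIN are hypothetical, chosen for illustration only):

```python
def vin_plant_code(vin: str) -> str:
    """Return the character at position 11 (1-indexed) of a 17-character VIN.

    For South African-built BMW models this character is "N".
    """
    vin = vin.strip().upper()
    if len(vin) != 17:
        raise ValueError("a standard VIN has exactly 17 characters")
    return vin[10]  # position 11, counted from 1

# Hypothetical VIN with "N" in position 11:
print(vin_plant_code("WBAAA11111N222222"))  # prints "N"
```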
Complete knock-down assembly facilities Brazil On 9 October 2014, BMW's new complete knock-down (CKD) assembly plant in Araquari assembled its first car, an F30 3 Series. The cars assembled at Araquari are the F20 1 Series, F30 3 Series, F48 X1, F25 X3 and Mini Countryman. Egypt Bavarian Auto Group became the importer of the BMW and Mini brands in 2003. Since 2005, the 3 Series, 5 Series, 7 Series, X1 and X3 models sold in Egypt have been assembled from complete knock-down components at the BMW plant in 6th of October City. India BMW India was established in 2006
to: Bisexual characteristics, having an ambiguous sexual identity (e.g. epicenity or androgyny) A bisexual flower, in botany, one that possesses both male (pollen-producing) and female (seed-producing) parts Dioecy, in biology, a species that has members of two distinct sexes (e.g. humans), as opposed to unisexual (only one sex present, always females) The Bisexual, a 2018 British-American comedy-drama television series See also By-Sexual, a Japanese visual kei punk
Soviet occupation (1945–1946) Bornholm was heavily bombarded by the Soviet Air Force in May 1945, as it was a part of the Eastern Front. The German garrison commander, German Navy Captain Gerhard von Kamptz (1902–1998), refused to surrender to the Soviets, as his orders were to surrender to the Western Allies. The Germans sent several telegrams to Copenhagen requesting that at least one British soldier should be transferred to Bornholm, so that the Germans could surrender to the Western Allied forces instead of the Soviets. When von Kamptz failed to provide a written capitulation as demanded by the Soviet commanders, Soviet aircraft relentlessly bombed and destroyed more than 800 civilian houses in Rønne and Nexø and seriously damaged roughly 3,000 more on 7–8 May 1945. The population had been forewarned of the bombardments, and the towns were evacuated, but 10 local people were killed. Soldiers were also killed and wounded. Some of them were conscripts from occupied Latvia fighting in German ranks against the Soviets. During the Soviet bombing of the two main towns on 7 and 8 May, Danish radio was not allowed to broadcast the news because it was thought it would spoil the liberation festivities in Denmark. On 9 May Soviet troops landed on the island, and after a short fight, the German garrison (about 12,000 strong) surrendered. Soviet forces left the island on 5 April 1946 as part of the post-war division of interests of the Soviet Union and the Western Allies. Denmark was to be Western aligned, and in return Estonia, Latvia and Lithuania were to be kept in the Soviet sphere of influence.
Cold War After the evacuation of their forces from Bornholm, the Soviets took the position that the stationing of foreign troops on Bornholm would be considered a declaration of war against the Soviet Union, and that Denmark should keep troops on it at all times to protect it from such foreign aggression. This policy remained in force after NATO was formed, with Denmark as a founding member. The Soviets accepted the stationing of Danish troops there, which were part of NATO but viewed as militarily inferior elements of the alliance, but they strongly objected to the presence of other NATO troops on Bornholm, US troops in particular. On 5 March 1953, the day of Stalin's death, Polish pilot Franciszek Jarecki defected from the Eastern Bloc and landed a MiG-15 fighter on the island. He was later granted asylum and rewarded for providing Western intelligence with the then-newest Soviet jet fighter. In 2017, Denmark's Defence Intelligence Service decided to build a listening tower near Østermarie, almost the size of the Statue of Liberty, to intercept radio communications across the Baltic Sea and in parts of Russia. Municipality Bornholm Regional Municipality is the local authority (Danish: kommune) covering the entire island. It is the result of a merger of the five former municipalities on the island (Allinge-Gudhjem, Hasle, Nexø, Rønne and Aakirkeby, in existence from 1 April 1970 until 2002) and the former Bornholm County. Bornholm Regional Municipality was also a county in its own right during its first four years, from 1 January 2003 until 31 December 2006. From 1 January 2007 all counties were abolished, and Bornholm became part of the Capital Region of Denmark, whose main responsibility is the health service. The municipality still retains its name, Bornholm Regional Municipality. The island had 21 municipalities until March 1970, of which 6 were market towns and 15 parishes.
In addition to supervising parish municipalities, which was the responsibility of the counties everywhere in Denmark, the market town municipalities of Bornholm were supervised by Bornholm County as well, and not by the Interior Ministry as was the case in the rest of Denmark. The seat of the municipal council is the island's main town, Rønne. The voters decided to merge the county with the municipalities in a referendum on 29 May 2001, effective from 1 January 2003. The question on the ballot was, "Do you want the six municipal entities of Bornholm to be joined to form one municipal entity as of 1 January 2003?" 73.9% voted in favour. The lowest percentage for the merger was in Nexø municipality (966 more people voting "Yes" than "No"), whose mayor, Annelise Molin, a Social Democrat, spoke out against the merger. It was required that each municipality had more "Yes" votes than "No" votes; otherwise the merger would have to be abandoned altogether. In the 1970s the six municipal entities had up to 122 councillors between them (of which 18 were county councillors, 15 from 1998); this was reduced to 89 councillors in the municipalities from the 1990s, and the new regional municipality would have 27 councillors from the start. They were reduced to 23 from 1 January 2018 (election November 2017). The merger was approved in a law passed by the Folketing on 19 March 2002 (signed by the Queen on 25 March), transferring the tasks of the abolished county and old municipalities to the new Bornholm Regional Municipality. The first regional mayor, in the first three years from 2003 until 2005, was Thomas Thors (born 28 July 1949), a physician and member of the Social Democrats and previously the last mayor of Rønne Municipality for five years from 1998 until 2002. He became a mayor again in 2021. Bjarne Kristiansen, the last mayor of Hasle, serving from the summer of 2000 until 2002 and representing the local Borgerlisten political party, served as mayor for four years from 1 January 2006 until 2009.
From 1 January 2007, Bornholm became a part of the Capital Region of Denmark. From 1 January 2010 until 31 December 2020 the mayor was Winni Grosbøll, a high school teacher and a member of the Social Democrats (Socialdemokratiet) political party. The deputy mayor, Morten Riis of the Red-Green Alliance, was mayor for a short interlude from 1 January until 4 January 2021. Thomas Thors, who was elected again in 2017, became mayor again from 4 January 2021. After the 2021 Danish local elections, Jacob Trøst of the Conservative party became mayor from January 2022. This followed a constituting agreement (aftale om konstituering) between the Conservatives, the Red-Green Alliance, whose Morten Riis would be deputy mayor, and the Danish People's Party. Municipal council Bornholm's municipal council today consists of 23 members, elected every four years. In the first four local elections in the newly created municipality, 27 members were elected to the municipal council. The 2002 local election took place only on Bornholm. From the election in 2017 the number of councillors elected was reduced to 23 members, serving their term of office from 1 January 2018 until 31 December 2021. Below are the election results for the new merged municipal council, beginning with the first election on 29 May 2002. Transport Ferry services connect Rønne to Świnoujście (Poland); Sassnitz (Germany); and Køge, Denmark, by road ( as the crow flies) south of Copenhagen; the Køge route replaced the direct nighttime route to and from Copenhagen (for both cargo and passengers) from 1 October 2004. Catamaran services run to Ystad (Sweden), and Simrishamn (Sweden) has a ferry connection during the summer. There are also regular catamaran services between Nexø and the Polish ports of Kołobrzeg, Łeba and Ustka. There are direct bus connections between Ystad and Copenhagen, coordinated with the catamaran. There are also flights from Bornholm Airport to Copenhagen and other locations.
Because of its remote location, Bornholm Regional Municipality has its own traffic company, BAT, is its own employment region, and also performs other tasks normally carried out by the regions in the rest of Denmark. In some respects the municipality forms a region of its own. Bornholm Regional Municipality was not merged with other municipalities on 1 January 2007 in the nationwide Municipal Reform of 2007. Towns and villages The larger towns on the island are located on the coast and have harbours. The one exception is centrally placed Aakirkeby, which was also the name of a municipality from 1970 until 2002; that municipality did, however, include the harbour of Boderne to the south. The largest town is Rønne, the seat, in the southwest on the westernmost point of the island. The other main towns (clockwise round the island) are Hasle, Sandvig, Allinge, Gudhjem, Svaneke and Nexø. On Monday morning, 22 September 2014, Folkeregistret in the municipality documented that the number of people living in the municipality that day was 39,922, the lowest number in over 100 years. Statistics Denmark gave the populations as follows: After the merger of the island's administrative entities on 1 January 2003, the town of Rønne reached a low point of 13,568 inhabitants on 1 January 2014. In 1965, 15,957 people (date unknown; number not register-based) lived in the two parishes that would become Rønne Municipality from 1 April 1970. In the table, numbers for Rønne are for the parish of Rønne, Rønne Sogn, alone. Year unknown, but between 2000 and 2005. It does not include Knudsker Sogn, which was also a part of Rønne Municipality. Other localities (with approximate populations, not updated) include Aarsballe (86), Arnager (151), Olsker (67), Rutsker (64), Rø (181), Stenseby (?) and Vang (92). In 2010 and 2018, 10,297 and 9,111 people respectively lived in rural districts, and 88 and 71 had no fixed address.
A rural district is defined by Statistics Denmark as a settlement with fewer than 200 inhabitants. Demography Population of parishes (1 January 2007; 1 January 2018):
7552. Rønne: 11,752; 11,539
7553. Knudsker: 2,821; 2,729
7554. Vestermarie: 1,460; 1,324
7555. Nylarsker: 924; 832
7556. Nyker: 1,737; 1,628
7557. Hasle: 1,887; 1,747
7558. Rutsker: 684; 570
7559. Olsker: 1,556; 1,266
7560. Allinge-Sandvig: 1,860; 1,527
7561. Klemensker: 1,737; 1,555
7562. Rø: 503; 418
7563. Ibsker: 1,322; 1,148
7564. Svaneke: 1,082; 981
7565. Østerlarsker: 997; 811
7566. Gudhjem: 752; 677
7567. Østermarie: 1,624; 1,458
7568. Christiansø: 95; 83
7569. Aaker: 3,479; 3,201
7570. Bodilsker: 981; 849
7571. Nexø: 3,884; 3,670
7572. Poulsker: 1,215; 1,061
7573. Pedersker: 715; 570
Population numbers are from 1 January. Christiansø Parish (which encompasses Ertholmene) is not a part of Bornholm Regional Municipality; it is included because Danmarks Statistik lists it as parish number 7568. Bornholm has 21 parishes (2018) that before 1 April 1970 were parish (15) or market town (6) municipalities themselves. There are 2,158 parishes (2021) in the Church of Denmark. Source: Statistikbanken.dk/Befolkning og valg/(table) FODIE (births); FOD207 (deaths); BEV107 (births; deaths; birth surplus); KM1 (parishes). Language Many inhabitants speak the Bornholmsk dialect, which is officially a dialect of Danish. Bornholmsk retains three grammatical genders, like Faroese, Icelandic and most dialects of Norwegian, but unlike standard Danish. Its phonology includes archaisms (unstressed and internal , where other dialects have and ) and innovations ( for before and after front-tongue vowels). This makes the dialect difficult to understand for some Danish speakers.
However, Swedish speakers often consider Bornholmian easier to understand than standard Danish. The intonation resembles the dialects spoken in nearby Scania, Blekinge and Halland, the southernmost provinces of Sweden. Religion Most inhabitants are members of the Lutheran Church of Denmark (Folkekirken). Various Christian denominations have become established on the island, most during the 19th century. Folkekirken (State church) (1536) Baptist church (1843) The Church of Jesus Christ of Latter-day Saints (LDS Church) (1850) Methodist church (1895) Jehovah's Witnesses (1897) Roman Catholic Church (ca 1150–1536, 1849) Sights and landmarks Older geological formations can be seen better on the surface of Bornholm than in the rest of Denmark. Stubbeløkken – which is still operating (Danish i drift) – and Klippeløkken granite quarries in Knudsker parish just east of central Rønne – and statistically a part
Bornholm formed part of the historical Lands of Denmark when the nation united out of a series of petty chiefdoms. It was originally administratively part of the province of Scania and was administered by the Scanian Law after this was codified in the 13th century. Control over the island evolved into a long-running dispute between the See of Lund and the Danish crown, culminating in several battles. The first fortress on the island was Gamleborg, which was replaced by Lilleborg, built by the king in 1150. In 1149, the king accepted the transfer of three of the island's four herreder (districts) to the archbishop. In 1250, the archbishop constructed his own fortress, Hammershus. A campaign launched from it in 1259 conquered the remaining part of the island, including Lilleborg. The island's status remained a matter of dispute for an additional 200 years. Bornholm was pawned to Lübeck for 50 years starting in 1525. Its first militia, Bornholms Milits, was formed in 1624.
Swedish forces conquered the island in 1645, but returned it to Denmark in the following peace settlement. After the war in 1658, Denmark ceded the island to Sweden under the Treaty of Roskilde, along with the rest of Skåneland, Bohuslän and Trøndelag, and it was occupied by Swedish forces. A revolt broke out the same year, culminating in Villum Clausen's shooting of the Swedish commander Johan Printzensköld on 8 December 1658. Following the revolt, a deputation of islanders presented the island as a gift to King Frederick III of Denmark on the condition that the island would never be ceded again. This status was confirmed in the Treaty of Copenhagen in 1660. Swedes, notably from Småland and Scania, emigrated to the island during the 19th century, seeking work and better conditions; most of the migrants did not remain. Bornholm also attracted many famous artists at the beginning of the 20th century, forming a group now known as the Bornholm school of painters. In addition to Oluf Høst, they include Karl Isaksson (1878–1922), from Sweden, and the Danes Edvard Weie (1879–1943), Olaf Rude (1886–1957), Niels Lergaard (1893–1982), and Kræsten Iversen (1886–1955). German occupation (1940–1945) Bornholm, as a part of Denmark, was captured by Germany on 10 April 1940, and served as a lookout post and listening station during the war, as it was a part of the Eastern Front. The island's central position in the Baltic Sea made it an important "natural fortress" between Germany and Sweden, effectively keeping submarines and destroyers away from Nazi-occupied waters. Several concrete coastal installations, including coastal batteries with tremendous range, were built during the war. However, none of them were ever used, and only a single test shot was fired during the occupation. These remnants of Nazi rule have since fallen into disrepair and are mostly regarded today as historical curiosities.
Many tourists visit the ruins each year, however, providing supplemental income to the tourist industry. On 22 August 1942 a V-1 flying bomb (numbered V83, probably launched from a Heinkel He 111) crashed on Bornholm during a test – the warhead was a dummy made of concrete. It was photographed or sketched by the Danish Naval Officer-in-Charge on Bornholm, Lieutenant Commander Hasager Christiansen, and was one of the first signs British Intelligence saw of Germany's aspirations to develop flying bombs and rockets, which were to become known as the V-1.
Morecambe Bay, the largest intertidal bay in England United States Bay, Arkansas Bay, Springfield, Massachusetts, a neighborhood Bay, Missouri Bay County, Florida Bay County, Michigan Bays, Kentucky Bays, Ohio Chesapeake Bay, an estuary in the District of Columbia, Maryland, Delaware, and Virginia Jamaica Bay, in Queens, New York San Francisco Bay, a shallow estuary in the U.S. state of California San Francisco Bay Area, or simply the Bay Area Animals and plants Animals Bay (horse), a color of the hair coats of some horses Baying, a kind of howling made by canines Plants Bay laurel, the evergreen laurel tree species Laurus nobilis Bay leaf, the aromatic leaves of several species of the Laurel family Rose bay, a common name for Rhododendron maximum Architecture and interior design Bay (architecture), a module in classical or Gothic architecture Bay, the name in English of
station located in Malta Bay Radio (Spain), a radio station serving the Valencian Community in Spain Heart North Lancashire & Cumbria, formerly The Bay, a radio station in North West England Hot Radio, originally operating as The Bay 102.8, a radio station in Dorset, England, Swansea Bay Radio, a radio station in South Wales WZBA, a classic rock radio station, operating as 100.7 The Bay, in Westminster, Maryland Other arts, entertainment, and media The Bay (film), a 2012 American found footage horror film The Bay (web series), a soap opera web series that premiered in 2010 "The Bay", a 2011 single by Metronomy The Bay (TV series), a British crime drama Businesses Bank of Ayudhya, a Thai commercial bank (Stock symbol: BAY) Bay Networks, a network hardware vendor acquired by Nortel Networks in 1998 Bay Trading Company, a retailer of woman's clothes in the UK Hudson's Bay (retailer) or The Bay, a chain of department stores in Canada Transport Baia Mare Airport in Baia Mare, Romania Bay platform, a dead-end platform at a railway station which has through lines Bay station, a subway station in Toronto
Bay, the space enclosed by a set of struts on a biplane (see ) Loading bay, a synonym for loading dock People Bay (chancellor), a royal scribe to an ancient Egyptian ruler Bay (surname) Bay Buchanan (born 1948), prominent conservative political commentator Other uses Bay (cloth), a coarse woolen cloth similar to Baize but lighter in weight and with shorter pile. Drive bay, an area for adding hardware in a computer Sick bay, nautical term for the location in a ship that is used for medical purposes The Bay School of |
the internet means that much new information is not printed in paper books, but is made available online through a digital library, on CD-ROM, in the form of ebooks or other online media. An online book is an ebook that is made available through the internet. Though many books are produced digitally, most digital versions are not available to the public, and there is no decline in the rate of paper publishing. There is an effort, however, to convert books that are in the public domain into a digital medium for unlimited redistribution and infinite availability. This effort is spearheaded by Project Gutenberg combined with Distributed Proofreaders. There have also been new developments in the process of publishing books. Technologies such as POD or "print on demand", which make it possible to print as few as one book at a time, have made self-publishing (and vanity publishing) much easier and more affordable. On-demand publishing has allowed publishers, by avoiding the high costs of warehousing, to keep low-selling books in print rather than declaring them out of print. Indian manuscripts A Goddess Saraswati image dated 132 AD, excavated from Kankali Tila, depicts her holding in her left hand a bound and tied palm-leaf or birch-bark manuscript. In India, bound manuscripts made of birch bark or palm leaf have been used since antiquity. The text in palm leaf manuscripts was inscribed with a knife pen on rectangular cut and cured palm leaf sheets; colourings were then applied to the surface and wiped off, leaving the ink in the incised grooves. Each sheet typically had a hole through which a string could pass, and the sheets were tied together with such a string to bind them like a book.
Mesoamerican Codex The codices of pre-Columbian Mesoamerica (Mexico and Central America) had the same form as the European codex, but were instead made with long folded strips of either fig bark (amatl) or plant fibers, often with a layer of whitewash applied before writing. New World codices were written as late as the 16th century (see Maya codices and Aztec codices). Those written before the Spanish conquests seem all to have been single long sheets folded concertina-style, sometimes written on both sides of the local amatl paper. Modern manufacturing The methods used for the printing and binding of books continued fundamentally unchanged from the 15th century into the early 20th century. While there was more mechanization, a book printer in 1900 had much in common with Gutenberg. Gutenberg's invention was the use of movable metal types, assembled into words, lines, and pages and then printed by letterpress to create multiple copies. Modern paper books are printed on papers designed specifically for printed books. Traditionally, book papers are off-white or low-white papers (easier to read), are opaque to minimise the show-through of text from one side of the page to the other and are (usually) made to tighter caliper or thickness specifications, particularly for case-bound books. Different paper qualities are used depending on the type of book: Machine finished coated papers, woodfree uncoated papers, coated fine papers and special fine papers are common paper grades. Today, the majority of books are printed by offset lithography. When a book is printed, the pages are laid out on the plate so that after the printed sheet is folded the pages will be in the correct sequence. Books tend to be manufactured nowadays in a few standard sizes. The sizes of books are usually specified as "trim size": the size of the page after the sheet has been folded and trimmed. 
The standard sizes result from sheet sizes (therefore machine sizes) which became popular 200 or 300 years ago, and have come to dominate the industry. British conventions in this regard prevail throughout the English-speaking world, except for the USA. The European book manufacturing industry works to a completely different set of standards. Processes Layout Modern bound books are organized according to a particular format called the book's layout. Although there is great variation in layout, modern books tend to adhere to a set of rules with regard to what the parts of the layout are and what their content usually includes. A basic layout will include a front cover, a back cover and the book's content which is called its body copy or content pages. The front cover often bears the book's title (and subtitle, if any) and the name of its author or editor(s). The inside front cover page is usually left blank in both hardcover and paperback books. The next section, if present, is the book's front matter, which includes all textual material after the front cover but not part of the book's content such as a foreword, a dedication, a table of contents and publisher data such as the book's edition or printing number and place of publication. Between the body copy and the back cover goes the end matter which would include any indices, sets of tables, diagrams, glossaries or lists of cited works (though an edited book with several authors usually places cited works at the end of each authored chapter). The inside back cover page, like that inside the front cover, is usually blank. The back cover is the usual place for the book's ISBN and maybe a photograph of the author(s)/ editor(s), perhaps with a short introduction to them. Also here often appear plot summaries, barcodes and excerpted reviews of the book. Printing Some books, particularly those with shorter runs (i.e. 
with fewer copies) will be printed on sheet-fed offset presses, but most books are now printed on web presses, which are fed by a continuous roll of paper, and can consequently print more copies in a shorter time. As the production line circulates, a complete "book" is collected together in one stack of pages, and another machine carries out the folding, pleating, and stitching of the pages into bundles of signatures (sections of pages) ready to go into the gathering line. Note that the pages of a book are printed two at a time, not as one complete book. Excess numbers are printed to make up for any spoilage due to make-readies or test pages to assure final print quality. A make-ready is the preparatory work carried out by the pressmen to get the printing press up to the required quality of impression. Included in make-ready is the time taken to mount the plate onto the machine, clean up any mess from the previous job, and get the press up to speed. As soon as the pressman decides that the printing is correct, all the make-ready sheets will be discarded, and the press will start making books. Similar make-readies take place in the folding and binding areas, each involving spoilage of paper. Binding After the signatures are folded and gathered, they move into the bindery. In the middle of the last century there were still many trade binders – stand-alone binding companies which did no printing, specializing in binding alone. At that time, because of the dominance of letterpress printing, typesetting and printing took place in one location, and binding in a different factory. When type was all metal, a typical book's worth of type would be bulky, fragile and heavy. The less it was moved in this condition the better: so printing would be carried out in the same location as the typesetting. Printed sheets on the other hand could easily be moved.
Now, because of increasing computerization of preparing a book for the printer, the typesetting part of the job has flowed upstream, where it is done either by separately contracting companies working for the publisher, by the publishers themselves, or even by the authors. Mergers in the book manufacturing industry mean that it is now unusual to find a bindery which is not also involved in book printing (and vice versa). If the book is a hardback its path through the bindery will involve more points of activity than if it is a paperback. Unsewn binding is now increasingly common. The signatures of a book can also be held together by "Smyth sewing" using needles; "McCain sewing", which uses drilled holes and is often used in schoolbook binding; or "notch binding", where gashes about an inch long are made at intervals through the fold in the spine of each signature. The rest of the binding process is similar in all instances. Sewn and notch bound books can be bound as either hardbacks or paperbacks. Finishing "Making cases" happens off-line and prior to the book's arrival at the binding line. In the most basic case-making, two pieces of cardboard are placed onto a glued piece of cloth with a space between them into which is glued a thinner board cut to the width of the spine of the book. The overlapping edges of the cloth (about 5/8" all round) are folded over the boards, and pressed down to adhere. After case-making the stack of cases will go to the foil stamping area for adding decorations and type. Digital printing Recent developments in book manufacturing include the development of digital printing. Book pages are printed, in much the same way as an office copier works, using toner rather than ink. Each book is printed in one pass, not as separate signatures. Digital printing has permitted the manufacture of much smaller quantities than offset, in part because of the absence of make-readies and of spoilage.
One might think of a web press as printing quantities over 2000, quantities from 250 to 2000 being printed on sheet-fed presses, and digital presses doing quantities below 250. These numbers are of course only approximate and will vary from supplier to supplier, and from book to book depending on its characteristics. Digital printing has opened up the possibility of print-on-demand, where no books are printed until after an order is received from a customer. Ebook In the 2000s, due to the rise in availability of affordable handheld computing devices, the opportunity to share texts through electronic means became an appealing option for media publishers. Thus, the "ebook" was made. The term ebook is a contraction of "electronic book"; it refers to a book-length publication in digital form. An ebook is usually made available through the internet, but also on CD-ROM and other forms. Ebooks may be read either via a computing device with an LED display such as a traditional computer, a smartphone or a tablet computer; or by means of a portable e-ink display device known as an ebook reader, such as the Sony Reader, Barnes & Noble Nook, Kobo eReader, or the Amazon Kindle. Ebook readers attempt to mimic the experience of reading a print book by using this technology, since the displays on ebook readers are much less reflective. Design Book design is the art of incorporating the content, style, format, design, and sequence of the various components of a book into a coherent whole. In the words of Jan Tschichold, book design "though largely forgotten today, methods and rules upon which it is impossible to improve have been developed over centuries. To produce perfect books these rules have to be brought back to life and applied." Richard Hendel describes book design as "an arcane subject" and refers to the need for a context to understand what that means. Many different creators can contribute to book design, including graphic designers, artists and editors. 
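The rough run-length thresholds quoted above (web presses for quantities over 2,000, sheet-fed offset for 250 to 2,000, digital below 250) can be sketched as a simple rule of thumb. The function name and the exact cutoffs are illustrative only; as the text notes, real figures vary from supplier to supplier and from book to book:

```python
def choose_press(quantity):
    """Pick a printing method from the approximate run-length
    thresholds given in the text (illustrative only)."""
    if quantity > 2000:
        return "web press"         # fed by a continuous roll of paper
    elif quantity >= 250:
        return "sheet-fed offset"  # printed on individual sheets
    else:
        return "digital press"     # no plates, no make-ready

print(choose_press(5000))  # web press
print(choose_press(800))   # sheet-fed offset
print(choose_press(50))    # digital press
```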
Sizes The size of a modern book is based on the printing area of a common flatbed press. The pages of type were arranged and clamped in a frame, so that when printed on a sheet of paper the full size of the press, the pages would be right side up and in order when the sheet was folded, and the folded edges trimmed. The most common book sizes are:
- Quarto (4to): the sheet of paper is folded twice, forming four leaves (eight pages), approximately 11–13 inches (c. 30 cm) tall
- Octavo (8vo): the most common size for current hardcover books. The sheet is folded three times into eight leaves (16 pages), up to 9¾ inches (c. 23 cm) tall
- Duodecimo (12mo): a size between 8vo and 16mo, up to 7¾ inches (c. 18 cm) tall
- Sextodecimo (16mo): the sheet is folded four times, forming 16 leaves (32 pages), up to 6¾ inches (c. 15 cm) tall
Sizes smaller than 16mo are:
- 24mo: up to 5¾ inches (c. 13 cm) tall
- 32mo: up to 5 inches (c. 12 cm) tall
- 48mo: up to 4 inches (c. 10 cm) tall
- 64mo: up to 3 inches (c. 8 cm) tall
Small books can be called booklets. Sizes larger than quarto are:
- Folio: up to 15 inches (c. 38 cm) tall
- Elephant Folio: up to 23 inches (c. 58 cm) tall
- Atlas Folio: up to 25 inches (c. 63 cm) tall
- Double Elephant Folio: up to 50 inches (c. 127 cm) tall
The largest extant medieval manuscript in the world is the Codex Gigas, at 92 × 50 × 22 cm. The world's largest book is made of stone and is in Kuthodaw Pagoda (Burma). Types By content A common separation by content is into fiction and non-fiction books. This simple separation can be found in most collections, libraries, and bookstores. There are other types such as books of sheet music. Fiction Many of the books published today are "fiction", meaning that they contain invented material, and are creative literature. Other literary forms such as poetry are included in the broad category. Most fiction is additionally categorized by literary form and genre. The novel is the most common form of fiction book.
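The arithmetic behind the folded formats above is simple: each fold doubles the number of leaves, and each leaf carries two pages (front and back). A minimal sketch:

```python
def leaves_and_pages(folds):
    """Each fold doubles the leaves; every leaf has two pages."""
    leaves = 2 ** folds
    return leaves, 2 * leaves

# Matches the formats described above:
assert leaves_and_pages(2) == (4, 8)    # quarto: folded twice
assert leaves_and_pages(3) == (8, 16)   # octavo: folded three times
assert leaves_and_pages(4) == (16, 32)  # sextodecimo: folded four times
```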
Novels are stories that typically feature a plot, setting, themes and characters. Stories and narrative are not restricted to any topic; a novel can be whimsical, serious or controversial. The novel has had a tremendous impact on entertainment and publishing markets. A novella is a term sometimes used for fiction prose typically between 17,500 and 40,000 words, and a novelette between 7,500 and 17,500. A short story may be any length up to 10,000 words, but these word lengths vary. Comic books or graphic novels are books in which the story is illustrated. The characters and narrators use speech or thought bubbles to express verbal language. Non-fiction Non-fiction books are in principle based on fact, on subjects such as history, politics, social and cultural issues, as well as autobiographies and memoirs. Nearly all academic literature is non-fiction. A reference book is a general type of non-fiction book which provides information as opposed to telling a story, essay, commentary, or otherwise supporting a point of view. An almanac is a very general reference book, usually one-volume, with lists of data and information on many topics. An encyclopedia is a book or set of books designed to have more in-depth articles on many topics. A book listing words, their etymology, meanings, and other information is called a dictionary. A book which is a collection of maps is an atlas. A more specific reference book with tables or lists of data and information about a certain topic, often intended for professional use, is often called a handbook. Books which try to list references and abstracts in a certain broad area may be called an index, such as Engineering Index, or abstracts such as chemical abstracts and biological abstracts. Books with technical information on how to do something or how to use some equipment are called instruction manuals. Other popular how-to books include cookbooks and home improvement books. 
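The word-count bands quoted above for prose fiction can be turned into a rough classifier. Note that the bands are conventional and overlap in practice (a short story "may be any length up to 10,000 words"); the cutoffs below simply follow the novella and novelette figures in the text:

```python
def prose_form(word_count):
    """Rough classification of prose fiction by the conventional
    word counts quoted in the text (boundaries vary in practice)."""
    if word_count > 40000:
        return "novel"
    if word_count > 17500:
        return "novella"
    if word_count > 7500:
        return "novelette"
    return "short story"

print(prose_form(90000))  # novel
print(prose_form(25000))  # novella
print(prose_form(4000))   # short story
```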
Students typically store and carry textbooks and schoolbooks for study purposes. Unpublished Many types of book are private, often filled in by the owner, for a variety of personal records. Elementary school pupils often use workbooks, which are published with spaces or blanks to be filled by them for study or homework. In US higher education, it is common for a student to take an exam using a blue book. There is a large set of books that are made only to write private ideas, notes, and accounts. These books are rarely published and are typically destroyed or remain private. Notebooks are blank papers to be written in by the user. Students and writers commonly use them for taking notes. Scientists and other researchers use lab notebooks to record their notes. They often feature spiral coil bindings at the edge so that pages may easily be torn out. Address books, phone books, and calendar/appointment books are commonly used on a daily basis for recording appointments, meetings and personal contact information. Books for recording periodic entries by the user, such as daily information about a journey, are called logbooks or simply logs. A similar book for writing the owner's daily private personal events, information, and ideas is called a diary or personal journal. Businesses use accounting books such as journals and ledgers to record financial data in a practice called bookkeeping (now usually held on computers rather than in hand-written form). Other There are several other types of books which are not commonly found under this system. Albums are books for holding a group of items belonging to a particular theme, such as a set of photographs, card collections, and memorabilia. One common example is stamp albums, which are used by many hobbyists to protect and organize their collections of postage stamps. Such albums are often made using removable plastic pages held inside in a ringed binder or other similar holder. 
Picture books are books for children with pictures on every page and less text (or even no text). Hymnals are books with collections of musical hymns that can typically be found in churches. Prayerbooks or missals are books that contain written prayers and are commonly carried by monks, nuns, and other devoted followers or clergy. Lap books are a learning tool created by students. Decodable readers and leveling A leveled book collection is a set of books organized in levels of difficulty from the easy books appropriate for an emergent reader to longer more complex books adequate for advanced readers. Decodable readers or books are a specialized type of leveled books that use decodable text only including controlled lists of words, sentences and stories consistent with the letters and phonics that have been taught to the emergent reader. New sounds and letters are added to higher level decodable books, as the level of instruction progresses, allowing for higher levels of accuracy, comprehension and fluency. By physical format Hardcover books have a stiff binding. Paperback books have cheaper, flexible covers which tend to be less durable. An alternative to paperback is the glossy cover, otherwise known as a dust cover, found on magazines, and comic books. Spiral-bound books are bound by spirals made of metal or plastic. Examples of spiral-bound books include teachers' manuals and puzzle books (crosswords, sudoku). Publishing is a process for producing pre-printed books, magazines, and newspapers for the reader/user to buy. Publishers may produce low-cost, pre-publication copies known as galleys or 'bound proofs' for promotional purposes, such as generating reviews in advance of publication. Galleys are usually made as cheaply as possible, since they are not intended for sale. 
Dummy books Dummy books (or faux books) are designed to imitate real books in appearance in order to deceive. Some are complete but with empty pages; others are hollow; in other cases a whole panel is carved with spines and then painted to look like books, and the titles themselves may be fictitious. There are many reasons to have dummy books on display: to suggest to visitors a vast wealth of information in the owner's possession and to inflate the owner's appearance of wealth, to conceal something, for shop displays, or for decorative purposes. In the early 19th century at Gwrych Castle, North Wales, Lloyd Hesketh Bamford-Hesketh was known for the vast collection of books in his library. In the later part of that same century, however, the public became aware that parts of his library were a fabrication: dummy books had been built and then locked behind glass doors to stop people from trying to access them. From this a proverb was born: "Like Hesky's library, all outside". Libraries Private or personal libraries made up of non-fiction and fiction books (as opposed to the state or institutional records kept in archives) first appeared in classical Greece. In the ancient world, maintaining a library was usually (but not exclusively) the privilege of a wealthy individual. These libraries could be either private or public, i.e. open to people who were interested in using them. The difference from a modern public library lies in the fact that they were usually not funded from public sources. It is estimated that in the city of Rome at the end of the 3rd century there were around 30 public libraries. Public libraries also existed in other cities of the ancient Mediterranean region (for example, the Library of Alexandria). Later, in the Middle Ages, monasteries and universities also had libraries that could be accessible to the general public.
Typically, not the whole collection was available to the public, the books could not be borrowed, and they were often chained to reading stands to prevent theft. The modern public library began to emerge around the 15th century, when individuals started to donate books to towns. The growth of a public library system in the United States started in the late 19th century and was much helped by donations from Andrew Carnegie. This reflected class divisions in society: the poor and the middle class had to access most books through a public library or by other means, while the rich could afford to have a private library built in their homes. In the United States, the Boston Public Library's 1852 Report of the Trustees established the justification for the public library as a tax-supported institution intended to extend educational opportunity and provide for general culture. The advent of paperback books in the 20th century led to an explosion of popular publishing. Paperback books made owning books affordable for many people. Paperback books often included works from genres that had previously been published mostly in pulp magazines. As a result of the low cost of such books and the spread of bookstores filled with them (in addition to the creation of a smaller market of extremely cheap used paperbacks), owning a private library ceased to be a status symbol for the rich. In library and booksellers' catalogues, it is common to include an abbreviation such as "Crown 8vo" to indicate the paper size from which the book is made. When rows of books are lined up on a book holder, bookends are sometimes needed to keep them from slanting. Identification and classification During the 20th century, librarians were concerned about keeping track of the many books being added yearly to the Gutenberg Galaxy.
Through a global society called the International Federation of Library Associations and Institutions (IFLA), they devised a series of tools including the International Standard Bibliographic Description (ISBD). Each book is specified by an International Standard Book Number, or ISBN, which is unique to every edition of every book produced by participating publishers, worldwide. It is managed by the ISBN Society. An ISBN has four parts: the first part is the country code, the second the publisher code, and the third the title code. The last part is the check digit. In another sense, a book can be a composition of such great length that it takes a considerable investment of time to compose and is still considered an investment of time to read. In a restricted sense, a book is a self-sufficient section or part of a longer composition, a usage reflecting that, in antiquity, long works had to be written on several scrolls and each scroll had to be identified by the book it contained. Each part of Aristotle's Physics is called a book. In an unrestricted sense, a book is the compositional whole of which such sections, whether called books or chapters or parts, are parts. The intellectual content in a physical book need not be a composition, nor even be called a book. Books can consist only of drawings, engravings or photographs, crossword puzzles or cut-out dolls. In a physical book, the pages can be left blank or can feature an abstract set of lines to support entries, such as in an account book, an appointment book, an autograph book, a notebook, a diary or a sketchbook. Some physical books are made with pages thick and sturdy enough to support other physical objects, like a scrapbook or photograph album. Books may be distributed in electronic form as ebooks and other formats.
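Returning to the ISBN described above: the final part of an ISBN-10 is a check digit, computed modulo 11 from the first nine digits using the standard weighting. A minimal sketch:

```python
def isbn10_check_digit(first9: str) -> str:
    """Compute the ISBN-10 check digit from the first nine digits.
    Digits are weighted 10 down to 2; the check digit makes the
    weighted sum divisible by 11 (a value of 10 is written 'X')."""
    total = sum((10 - i) * int(d) for i, d in enumerate(first9))
    check = (11 - total % 11) % 11
    return "X" if check == 10 else str(check)

# Well-known example: ISBN 0-306-40615-2
print(isbn10_check_digit("030640615"))  # 2
```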
Although in ordinary academic parlance a monograph is understood to be a specialist academic work, rather than a reference work on a scholarly subject, in library and information science monograph denotes more broadly any non-serial publication complete in one volume (book) or a finite number of volumes (even a novel like Proust's seven-volume In Search of Lost Time), in contrast to serial publications like a magazine, journal or newspaper. An avid reader or collector of books is a bibliophile or colloquially, "bookworm". A place where books are traded is a bookshop or bookstore. Books are also sold elsewhere and can be borrowed from libraries. Google has estimated that by 2010, approximately 130,000,000 titles had been published. In some wealthier nations, the sale of printed books has decreased because of the increased usage of ebooks. Etymology The word book comes from Old English bōc, which in turn comes from the Germanic root *bōk-, cognate to 'beech'. In Slavic languages like Russian, Bulgarian and Macedonian, the word for 'letter' (bukva) is cognate with 'beech' (buk). In Russian, Serbian and Macedonian, the word bukvar' (букварь) or bukvar (буквар) refers to a primary school textbook that helps young children master the techniques of reading and writing. It is thus conjectured that the earliest Indo-European writings may have been carved on beech wood. The Latin word codex, meaning a book in the modern sense (bound and with separate leaves), originally meant 'block of wood'. History Antiquity When writing systems were created in ancient civilizations, a variety of objects, such as stone, clay, tree bark, metal sheets, and bones, were used for writing; these are studied in epigraphy. Tablet A tablet is a physically robust writing medium, suitable for casual transport and writing. Clay tablets were flattened and mostly dry pieces of clay that could be easily carried, and impressed with a stylus. They were used as a writing medium, especially for writing in cuneiform, throughout the Bronze Age and well into the Iron Age.
Wax tablets were pieces of wood covered in a coating of wax thick enough to record the impressions of a stylus. They were the normal writing material in schools, in accounting, and for taking notes. They had the advantage of being reusable: the wax could be melted, and reformed into a blank. The custom of binding several wax tablets together (Roman pugillares) is a possible precursor of modern bound (codex) books. The etymology of the word codex (block of wood) also suggests that it may have developed from wooden wax tablets. Scroll Scrolls can be made from papyrus, a thick paper-like material made by weaving the stems of the papyrus plant, then pounding the woven sheet with a hammer-like tool until it is flattened. Papyrus was used for writing in Ancient Egypt, perhaps as early as the First Dynasty, although the first evidence is from the account books of King Neferirkare Kakai of the Fifth Dynasty (about 2400 BC). Papyrus sheets were glued together to form a scroll. Tree bark such as lime and other materials were also used. According to Herodotus (History 5:58), the Phoenicians brought writing and papyrus to Greece around the 10th or 9th century BC. The Greek word for papyrus as writing material (biblion) and book (biblos) come from the Phoenician port town Byblos, through which papyrus was exported to Greece. From Greek we also derive the word tome (tomos), which originally meant a slice or piece and from there began to denote "a roll of papyrus". Tomus was used by the Latins with exactly the same meaning as volumen (see also below the explanation by Isidore of Seville). Whether made from papyrus, parchment, or paper, scrolls were the dominant form of book in the Hellenistic, Roman, Chinese, Hebrew, and Macedonian cultures. The more modern codex book format took over the Roman world by late antiquity, but the scroll format persisted much longer in Asia.
Codex Isidore of Seville (died 636) explained the then-current relation between codex, book and scroll in his Etymologiae (VI.13): "A codex is composed of many books; a book is of one scroll. It is called codex by way of metaphor from the trunks (codex) of trees or vines, as if it were a wooden stock, because it contains in itself a multitude of books, as it were of branches." Modern usage differs. A codex (in modern usage) is the first information repository that modern people would recognize as a "book": leaves of uniform size bound in some manner along one edge, and typically held between two covers made of some more robust material. The first written mention of the codex as a form of book is from Martial, in his Apophoreta CLXXXIV at the end of the first century, where he praises its compactness. However, the codex never gained much popularity in the pagan Hellenistic world, and only within the Christian community did it gain widespread use. This change happened gradually during the 3rd and 4th centuries, and the reasons for adopting the codex form of the book are several: the format is more economical, as both sides of the writing material can be used; and it is portable, searchable, and easy to conceal. A codex is also much easier to read, to find a desired page, and to flip through; a scroll is more awkward to use. The Christian authors may also have wanted to distinguish their writings from the pagan and Judaic texts written on scrolls. In addition, some metal books were made that required smaller pages of metal, instead of an impossibly long, unbending scroll of metal. A book can also be easily stored in more compact places, or side by side in a tight library or shelf space. Manuscripts The fall of the Roman Empire in the 5th century AD saw the decline of the culture of ancient Rome. Papyrus became difficult to obtain due to lack of contact with Egypt, and parchment, which had been used for centuries, became the main writing material.
Parchment is a material made from processed animal skin and used—mainly in the past—for writing on. Parchment is most commonly made of calfskin, sheepskin, or goatskin. It was historically used for writing documents, notes, or the pages of a book. Parchment is limed, scraped and dried under tension. It is not tanned, and is thus different from leather. This makes it more suitable for writing on, but leaves it very reactive to changes in relative humidity and makes it revert to rawhide if overly wet. Monasteries carried on the Latin writing tradition in the Western Roman Empire. Cassiodorus, in the monastery of Vivarium (established around 540), stressed the importance of copying texts. St. Benedict of Nursia, in his Rule of Saint Benedict (completed around the middle of the 6th century), later also promoted reading. The Rule of Saint Benedict (Ch. XLVIII), which set aside certain times for reading, greatly influenced the monastic culture of the Middle Ages and is one of the reasons why the clergy were the predominant readers of books. The tradition and style of the Roman Empire still dominated, but slowly the peculiar medieval book culture emerged. Before the invention and adoption of the printing press, almost all books were copied by hand, which made books expensive and comparatively rare. Smaller monasteries usually had only a few dozen books, medium-sized ones perhaps a few hundred. By the 9th century, larger collections held around 500 volumes, and even at the end of the Middle Ages the papal library in Avignon and the Paris library of the Sorbonne held only around 2,000 volumes. The scriptorium of the monastery was usually located over the chapter house. Artificial light was forbidden for fear that it might damage the manuscripts.
There were five types of scribes:
Calligraphers, who dealt in fine book production
Copyists, who dealt with basic production and correspondence
Correctors, who collated and compared a finished book with the manuscript from which it had been produced
Illuminators, who painted illustrations
Rubricators, who painted in the red letters
The bookmaking process was long and laborious. The parchment had to be prepared, then the unbound pages were planned and ruled with a blunt tool or lead, after which the text was written by the scribe, who usually left blank areas for illustration and rubrication. Finally, the book was bound by the bookbinder. Different types of ink were known in antiquity, usually prepared from soot and gum, and later also from gall nuts and iron vitriol. This gave writing a brownish black color, but black or brown were not the only colors used. There are texts written in red or even gold, and different colors were used for illumination. For very luxurious manuscripts the whole parchment was colored purple, and the text was written on it with gold or silver (for example, Codex Argenteus). Irish monks introduced spacing between words in the 7th century. This facilitated reading, as these monks tended to be less familiar with Latin. However, the use of spaces between words did not become commonplace before the 12th century. It has been argued that the use of spacing between words shows the transition from semi-vocalized reading into silent reading. The first books used parchment or vellum (calfskin) for the pages. The book covers were made of wood and covered with leather. Because dried parchment tends to assume the form it had before processing, the books were fitted with clasps or straps. During the later Middle Ages, when public libraries appeared, and up to the 18th century, books were often chained to a bookshelf or a desk to prevent theft. These chained books are called libri catenati. At first, books were copied mostly in monasteries, one at a time.
With the rise of universities in the 13th century, the manuscript culture of the time led to an increase in the demand for books, and a new system for copying books appeared. The books were divided into unbound leaves (pecia), which were lent out to different copyists, so the speed of book production was considerably increased. The system was maintained by secular stationers' guilds, which produced both religious and non-religious material. Judaism has kept the art of the scribe alive up to the present. According to Jewish tradition, the Torah scroll placed in a synagogue must be written by hand on parchment; a printed book would not do, though the congregation may use printed prayer books, and printed copies of the Scriptures are used for study outside the synagogue. A sofer ("scribe") is a highly respected member of any observant Jewish community.

Middle East
People of various religious (Jews, Christians, Zoroastrians, Muslims) and ethnic backgrounds (Syriac, Coptic, Persian, Arab etc.) in the Middle East also produced and bound books in the Islamic Golden Age (mid 8th century to 1258), developing advanced techniques in Islamic calligraphy, miniatures and bookbinding. A number of cities in the medieval Islamic world had book production centers and book markets. Yaqubi (died 897) says that in his time Baghdad had over a hundred booksellers. Book shops were often situated around the town's principal mosque, as in Marrakesh, Morocco, which has a street named Kutubiyyin ("booksellers" in English); the famous Koutoubia Mosque is so named because of its location in this street. The medieval Muslim world also used a method of reproducing reliable copies of a book in large quantities known as check reading, in contrast to the traditional method of a single scribe producing only a single copy of a single manuscript.
In the check reading method, "only authors could authorize copies, and this was done in public sessions in which the copyist read the copy aloud in the presence of the author, who then certified it as accurate." With this check-reading system, "an author might produce a dozen or more copies from a single reading," and with two or more readings, "more than one hundred copies of a single book could easily be produced." By using relatively cheap paper as writing material instead of parchment or papyrus, the Muslims, in the words of Pedersen, "accomplished a feat of crucial significance not only to the history of the Islamic book, but also to the whole world of books".

Wood block printing
In woodblock printing, a relief image of an entire page was carved into blocks of wood, inked, and used to print copies of that page. This method originated in China, in the Han dynasty (before 220 AD), as a method of printing on textiles and later paper, and was widely used throughout East Asia. The oldest dated book printed by this method is The Diamond Sutra (868 AD). The method (called woodcut when used in art) arrived in Europe in the early 14th century. Books (known as block-books), as well as playing-cards and religious pictures, began to be produced by this method. Creating an entire book was a painstaking process, requiring a hand-carved block for each page, and the wood blocks tended to crack if stored for long. The monks or craftsmen who produced them were paid highly.

Movable type and incunabula
The Chinese inventor Bi Sheng made movable type of earthenware c. 1045, but there are no known surviving examples of his printing. Around 1450, in what is commonly regarded as an independent invention, Johannes Gutenberg invented movable type in Europe, along with innovations in casting the type based on a matrix and hand mould. This invention gradually made books less expensive to produce, and more widely available.
Early printed books, single sheets and images which were created before 1501 in Europe are known as incunables or incunabula. "A man born in 1453, the year of the fall of Constantinople, could look back from his fiftieth year on a lifetime in which about eight million books had been printed, more perhaps than all the scribes of Europe had produced since Constantine founded his city in AD 330."

19th to 21st centuries
Steam-powered printing presses became popular in the early 19th century. These machines could print 1,100 sheets per hour, but workers could only set 2,000 letters per hour. Monotype and Linotype typesetting machines were introduced in the late 19th century. They could set more than 6,000 letters per hour and an entire line of type at once. There have been numerous improvements in the printing press. In addition, the conditions for freedom of the press have been improved through the gradual relaxation of restrictive censorship laws. See also intellectual property, public domain, copyright. In the mid-20th century, European book production had risen to over 200,000 titles per year. Throughout the 20th century, libraries have faced an ever-increasing rate of publishing, sometimes called an information explosion. The advent of electronic publishing and the internet means that much new information is not printed in paper books, but is made available online through a digital library, on CD-ROM, in the form of ebooks or other online media. An online book is an ebook that is made available through the internet. Though many books are produced digitally, most digital versions are not available to the public, and there is no decline in the rate of paper publishing. There is an effort, however, to convert books that are in the public domain into a digital medium for unlimited redistribution and infinite availability. This effort is spearheaded by Project Gutenberg combined with Distributed Proofreaders.
There have also been new developments in the process of publishing books. Technologies such as POD or "print on demand", which make it possible to print as few as one book at a time, have made self-publishing (and vanity publishing) much easier and more affordable. On-demand publishing has allowed publishers, by avoiding the high costs of warehousing, to keep low-selling books in print rather than declaring them out of print.

Indian manuscripts
A Goddess Saraswati image dated 132 AD, excavated from Kankali Tila, depicts her holding a manuscript in her left hand, represented as a bound and tied palm leaf or birch bark manuscript. In India, bound manuscripts made of birch bark or palm leaf have existed side by side since antiquity. The text in palm leaf manuscripts was inscribed with a knife pen on rectangular cut and cured palm leaf sheets; colourings were then applied to the surface and wiped off, leaving the ink in the incised grooves. Each sheet typically had a hole through which a string could pass, and with these the sheets were tied together with a string to bind them like a book.

Mesoamerican Codex
The codices of pre-Columbian Mesoamerica (Mexico and Central America) had the same form as the European codex, but were instead made with long folded strips of either fig bark (amatl) or plant fibers, often with a layer of whitewash applied before writing. New World codices were written as late as the 16th century (see Maya codices and Aztec codices). Those written before the Spanish conquests seem all to have been single long sheets folded concertina-style, sometimes written on both sides of the local amatl paper.

Modern manufacturing
The methods used for the printing and binding of books continued fundamentally unchanged from the 15th century into the early 20th century. While there was more mechanization, a book printer in 1900 had much in common with Gutenberg.
Gutenberg's invention was the use of movable metal types, assembled into words, lines, and pages and then printed by letterpress to create multiple copies. Modern paper books are printed on papers designed specifically for printed books. Traditionally, book papers are off-white or low-white papers (easier to read), are opaque to minimise the show-through of text from one side of the page to the other and are (usually) made to tighter caliper or thickness specifications, particularly for case-bound books. Different paper qualities are used depending on the type of book: machine-finished coated papers, woodfree uncoated papers, coated fine papers and special fine papers are common paper grades. Today, the majority of books are printed by offset lithography. When a book is printed, the pages are laid out on the plate so that after the printed sheet is folded the pages will be in the correct sequence. Books tend to be manufactured nowadays in a few standard sizes. The sizes of books are usually specified as "trim size": the size of the page after the sheet has been folded and trimmed. The standard sizes result from sheet sizes (therefore machine sizes) which became popular 200 or 300 years ago, and have come to dominate the industry. British conventions in this regard prevail throughout the English-speaking world, except for the USA. The European book manufacturing industry works to a completely different set of standards.

Processes
Layout
Modern bound books are organized according to a particular format called the book's layout. Although there is great variation in layout, modern books tend to adhere to a set of rules with regard to what the parts of the layout are and what their content usually includes. A basic layout will include a front cover, a back cover and the book's content which is called its body copy or content pages. The front cover often bears the book's title (and subtitle, if any) and the name of its author or editor(s).
The inside front cover page is usually left blank in both hardcover and paperback books. The next section, if present, is the book's front matter, which includes all textual material after the front cover but not part of the book's content, such as a foreword, a dedication, a table of contents and publisher data such as the book's edition or printing number and place of publication. Between the body copy and the back cover goes the end matter, which would include any indices, sets of tables, diagrams, glossaries or lists of cited works (though an edited book with several authors usually places cited works at the end of each authored chapter). The inside back cover page, like that inside the front cover, is usually blank. The back cover is the usual place for the book's ISBN and maybe a photograph of the author(s)/editor(s), perhaps with a short introduction to them. Also here often appear plot summaries, barcodes and excerpted reviews of the book.

Printing
Some books, particularly those with shorter runs (i.e. with fewer copies) will be printed on sheet-fed offset presses, but most books are now printed on web presses, which are fed by a continuous roll of paper, and can consequently print more copies in a shorter time. As the production line circulates, a complete "book" is collected together in one stack of pages, and another machine carries out the folding, pleating, and stitching of the pages into bundles of signatures (sections of pages) ready to go into the gathering line. Note that the pages of
bomber aircraft. B-52 or B52 may also refer to:
The B-52's, an American new wave band
The B-52's (album)
B52 (New York City bus), a bus line in Brooklyn, New York
B52 (chess opening), a chess opening
park within the Orlando International Airport, Florida
Volvo B52 engine, a group of Volvo engines
Nora B-52, a Serbian self-propelled howitzer
HLA-B52, an HLA-B serotype
Bundesstraße 52, a federal highway in Germany
B52, route number for Kings Highway in Australia
B-52,
the 17th century Maratha king. Initially, Thackeray said it was not a political party but an army of Shivaji Maharaj, inclined to fight for the Marathi manoos (person). It demanded that native speakers of the state's local language Marathi (the "sons of the soil" movement) be given preferential treatment in private and public sector jobs. The early objective of the Shiv Sena was to ensure job security for Marathi speakers competing against South Indians and Gujaratis. In its 1966 party manifesto, Thackeray primarily blamed south Indians. In Marmik, Thackeray published a list of corporate officials from a local directory, many being south Indians, citing it as proof that Maharashtrians were being discriminated against. His party grew in the next ten years. Senior leaders such as Babasaheb Purandare and Madhav Mehere, chief attorney for the Trade Union of Maharashtra, joined the party, and chartered accountant Madhav Gajanan Deshpande backed various aspects of the party's operations. In 1969, Thackeray and Manohar Joshi were jailed after participating in a protest demanding the merger of the Karwar, Belgaum and Nipani regions into Maharashtra. During the 1970s, the party did not succeed in local elections and was active mainly in Mumbai rather than the rest of the state. The party set up local branch offices and settled disputes and complaints against the government. It later turned to violent tactics, with attacks against rival parties, migrants and the media; the party agitated by destroying public and private property. Thackeray publicly supported Indira Gandhi during the 1975 Emergency to avoid getting arrested; Thackeray supported the Congress party numerous times. Dr. Hemchandra Gupte, Mayor of Mumbai and the former family physician and confidant of Thackeray, left the Shiv Sena in 1976, citing the importance given to money, the violence committed by Shiv Sena members, and Thackeray's support for Indira Gandhi and the 1975 Emergency.
Politically, the Shiv Sena was anti-communist, and wrested control of trade unions in Mumbai from the Communist Party of India (CPI). Local unemployed youth from the declining textile industry joined the party, and it further expanded because of Marathi migrants from the Konkan region. By the 1980s, it became a threat to the ruling Congress party, which had initially encouraged it because it rivalled the CPI. In 1989, the Sena's newspaper Saamna was launched by Thackeray. Because Thackeray opposed the Mandal Commission report, his close aide Chhagan Bhujbal left the party in 1991. Following the 1992 Bombay riots, Thackeray took stances viewed as anti-Muslim and based on Hindutva. The Shiv Sena later allied itself with the Bharatiya Janata Party (BJP). The BJP–Shiv Sena alliance won the 1995 Maharashtra State Assembly elections and was in power from 1995 to 1999. Thackeray declared himself to be the "remote control" chief minister. In the Srikrishna Commission Report, an inquiry ordered by the government of India, Thackeray and Chief Minister Manohar Joshi were explicitly named for inciting Shiv Sainiks to violence against Muslims during the 1992–1993 riots. He had influence in the film industry. His party workers agitated against films he found controversial and would disrupt film screenings, causing losses. Bombay, a 1995 film on the riots, was opposed by them.

1999–2012
On 28 July 1999, on the recommendation of the Election Commission, Thackeray was banned from voting and from contesting any election for six years, from 11 December 1999 till 10 December 2005, for indulging in corrupt practice by seeking votes in the name of religion; he was not happy with this. In 2000, he was arrested for his role in the riots but was released because the statute of limitations expired. In 2002, Thackeray issued a call to form Hindu suicide bomber squads to take on the menace of terrorism.
In response, the Maharashtra government registered a case against him for inciting enmity between different groups. At least two organisations founded and managed by retired Indian Army officers, Lt Col Jayant Rao Chitale and Lt Gen. P.N. Hoon (former commander-in-chief of the Western Command), responded to the call with statements such as demanding that Pakistanis not be allowed to work in India, citing accusations that Pakistan supported militant attacks in India. After the six-year voting ban on Thackeray was lifted in 2005, he voted for the first time in the 2007 BMC elections. Eight or nine cases against Thackeray and Saamna for inflammatory writings were not investigated by the government. Thackeray said that the Shiv Sena had helped the Marathi people in Mumbai, especially in the public sector. Thackeray believed that Hindus must be organised to struggle against those who oppose their identity and religion. Opposition leftist parties alleged that the Shiv Sena did little during its tenure to solve the problem of unemployment facing a large proportion of Maharashtrian youth, in contradiction to its ideological foundation of 'sons of the soil.' In 2006, Thackeray's nephew Raj Thackeray broke away from the Shiv Sena to form the Maharashtra Navnirman Sena (MNS), following Thackeray's retirement and his appointment of his son Uddhav, rather than Raj, as the leader of the Shiv Sena. Narayan Rane also quit around that time. The Sena acted as a "moral police" and opposed Valentine's Day celebrations. On 14 February 2006, Thackeray condemned and apologised for the violent attacks by Shiv Sainiks on a private celebration in Mumbai. "It is said that women were beaten up in the Nallasopara incident. If that really happened, then it is a symbol of cowardice. I have always instructed Shiv Sainiks that in any situation women should not be humiliated and harassed." Thackeray and the Shiv Sena remained opposed to Valentine's Day celebrations, although they indicated support for an "Indian alternative."
In 2007, he was briefly arrested and let out on bail after referring to Muslims as "green poison" during a Shiv Sena rally. On 27 March 2008, in protest against Thackeray's editorial, leaders of the Shiv Sena in Delhi resigned, citing its "outrageous conduct" towards non-Marathis in Maharashtra, and announced that they would form a separate party. Addressing a press conference, Shiv Sena's North India chief Jai Bhagwan Goyal said the decision to leave the party was taken because of the "partial attitude" of the party high command towards Maharashtrians. Goyal further said: "Shiv Sena is no different from Khalistan and Jammu and Kashmir militant groups which are trying to create a rift between people along regional lines. The main aim of these forces is to split our country. Like the Maharashtra Navnirman Sena, the Shiv Sena too has demeaned North Indians and treated them inhumanely."

Political views
Thackeray was criticised for his praise of Adolf Hitler. He was quoted by Asiaweek as saying: "I am a great admirer of Hitler, and I am not ashamed to say so! I do not say that I agree with all the methods he employed, but he was a wonderful organiser and orator, and I feel that he and I have several things in common...What India really needs is a dictator who will rule benevolently, but with an iron hand." In a 1993 interview, Thackeray stated, "There is nothing wrong" if "Muslims are treated as Jews were in Nazi Germany." In a 1992 interview, Thackeray stated, "If you take Mein Kampf and if you remove the word 'Jew' and put in the word 'Muslim', that is what I believe in". Indian Express published an interview on 29 January 2007: "Hitler did very cruel and ugly things. But he was an artist, I love him [for that]. He had the power to carry the whole nation, the mob with him. You have to think what magic he had. He was a miracle...The killing of Jews was wrong. But the good part about Hitler was that he was an artist. He was a daredevil. He had good qualities and bad.
I may also have good qualities and bad ones." Thackeray also declared that he was "not against every Muslim, but only those who reside in this country but do not obey the laws of the land...I consider such people [to be] traitors." The Shiv Sena is viewed by the media as being anti-Muslim, though Shiv Sena members officially reject this accusation. When explaining his views on Hindutva, he conflated Islam with violence and called on Hindus to "fight terrorism and fight Islam." In an interview with Suketu Mehta, he called for the mass expulsion of illegal Bangladeshi Muslim migrants from India and for a visa system to enter Mumbai; the Indian National Congress state government had earlier considered a similar measure during the national emergency declared by Indira Gandhi. He told India Today: "[Muslims] are spreading like a cancer and should be operated on like a cancer. The country...should be saved from the Muslims and the police should support them [Hindu Maha Sangh] in their struggle just like the police in Punjab were sympathetic to the Khalistanis." However, in an interview in 1998, he said that his stance had changed on many issues that the Shiv Sena had with Muslims, particularly regarding the Babri Mosque or Ram Janmabhoomi issue: "We must look after the Muslims and treat them as part of us." He also expressed admiration for Muslims in Mumbai in the wake of the 11 July 2006 Mumbai train bombings perpetrated by Islamic fundamentalists. In response to threats made by Abu Azmi, a leader of the Samajwadi Party, that accusations of terrorism directed at Indian Muslims would bring about communal strife, Thackeray said that the unity of Mumbaikars (residents of Mumbai) in the wake of the attacks was "a slap to fanatics of Samajwadi Party leader Abu Asim Azmi" and that Thackeray "salute[s] those Muslims who participated in the two minutes' silence on 18 July to mourn the blast victims."
Again in 2008 he wrote: "Islamic terrorism is growing and Hindu terrorism is the only way to counter it. We need suicide bomb squads to protect India and Hindus." He also reiterated a desire for Hindus to unite across linguistic barriers to see "a Hindustan for Hindus" and to "bring Islam in this country down to its knees." In 2008, following agitation against Biharis and other north Indians travelling to Maharashtra to take civil service examinations for the Indian Railways due to an overlimit of the quota in their home provinces, Thackeray also said of Bihari MPs that they were "spitting in the same plate from which they ate" when they criticised Mumbaikars and Maharashtrians. He wrote: "They are trying to add fuel to the fire that has been extinguished, by saying that Mumbaikars have rotten brains." He also criticised Chhath Puja, a holiday celebrated by Biharis and those from eastern Uttar Pradesh, which occurs on six days of the Hindu month of Kartik. He said that it was not a real holiday. This was reportedly a response to MPs from Bihar who had disrupted the proceedings of the Lok Sabha in protest against the attacks on North Indians. Bihar Chief Minister Nitish Kumar, upset with the remarks, called on the prime minister and the central government to intervene in the matter.
A Saamna editorial prompted at least 16 MPs from Bihar and Uttar Pradesh, belonging to the Rashtriya Janata Dal, Janata Dal (United), Samajwadi Party and the Indian National Congress, to give notice for breach of privilege proceedings against Thackeray. After the matter was raised in the Lok Sabha, Speaker Somnath Chatterjee said: "If anybody has made any comment on our members' functioning in the conduct of business in the House, not only do we treat that with the contempt that it deserves, but also any action that may be necessary will be taken according to procedure and well established norms. Nobody will be spared." In 2009, he criticised Indian cricketer Sachin Tendulkar, a "Marathi icon", for saying he was an Indian before he was a Maharashtrian.

Opposition to caste-based reservations
Thackeray firmly opposed caste-based reservation and said: "There are only two castes in the world, the rich are rich and the poor is poor, make the poor rich but don't make the rich poor. Besides these two castes I don't believe in any other casteism." The Bharatiya Janata Party (BJP) supported caste-based reservations based on the Mandal Commission. Thackeray, despite being warned that opposition to the reservations would be politically suicidal for the Shiv Sena party, opposed the BJP over this issue and said he would initiate "divorce proceedings against the BJP" if the BJP supported caste-based reservations. This also led to his conflict with Chhagan Bhujbal, an OBC, who later left the Shiv Sena.

Views on Vinayak Damodar Savarkar
Thackeray defended Vinayak Damodar Savarkar against criticism and praised him as a great leader. In 2002, when President A. P. J. Abdul Kalam unveiled a portrait of Savarkar in the presence of Prime Minister Atal Bihari Vajpayee, the
Economy, California classification of safe practices within a lockdown economy Britain Stronger in Europe, a lobbying group Board of Secondary Education, Odisha, India Biological systems engineering Bury St Edmunds railway station (station code), Suffolk, England Backscattered electron (see scanning electron microscope) Blender Stack Exchange, a Q&A site for the Blender 3D software BSE (satellite), a Japanese satellite Bachelor of Science in Engineering, an undergraduate academic degree awarded to a student after 3-5 years of studying engineering at a university or college Black Sun Empire, a Dutch drum and bass group Bethe–Salpeter equation, an equation in quantum field theory Bendigo South East College, a secondary school in Victoria, Australia Banco de Seguros del Estado, a Uruguayan state-owned insurance company
the invitation from the Hangzhou government to serve as a "culture consultant" for the city. His film Night Train to Lisbon (2013) premiered out of competition at the 63rd Berlin International Film Festival. He had planned to direct a Gianni Versace biopic with Antonio Banderas as Versace, but this project was cancelled. In August 2021, it was announced that August would direct a feature film adaptation of Karen Blixen's novel Ehrengard, which is being produced by Netflix and SF Studios. Personal life August has been married four times, and had four children by two of his wives. He was first married to Annie Munksgård Lauritzen. His second wife was Masja Dessau. Together they had two children, Anders Frithiof August (born June 15, 1978) and Adam August (born in 1983). They have both become screenwriters. His third marriage was to Swedish Star Wars prequel actress Pernilla August from 1991 to 1997. Together they had daughters Asta Kamma August (born November 5, 1991) and Alba Adèle August (born June 6, 1993); he also became the stepfather of her daughter Agnes from her first marriage to Swedish novelist and screenwriter Klas Östergren. In 2012, he married his fourth and current wife, Sara-Marie Maltha. In addition, August fathered four other children by two women outside marriage. Filmography Film As cinematographer Television References External links 1997 interview with Bille August about Smilla's Sense of Snow 1948 births Living people People from Lyngby-Taarbæk Municipality Danish film directors Danish screenwriters Danish cinematographers Directors of Palme d'Or winners Directors of Best Foreign Language Film
body, an object in physics that represents a large amount, has mass or takes up space Body (biology), the physical material of an organism Body plan, the physical features shared by a group of animals Human body, the entire structure of a human organism Dead body, cadaver, or corpse, a dead human body (living) matter, see: Mind–body problem, the relationship between mind and matter in philosophy Aggregates within living matter, such as inclusion bodies In arts and entertainment In film and television Body (2015 Polish film), a 2015 Polish film Body (2015 American film), a 2015 American film "Body" (Wonder Showzen episode), a 2006 episode of American sketch comedy television series Wonder Showzen "Body", an episode of the Adult Swim television series, Off the Air In literature and publishing body text, the text forming the main content of any printed matter body (typography), the size of a piece of metal type B.O.D.Y. (manga), by Ao Mimori B O D Y, an international online literary magazine In music Electronic body music, a genre "Body" (Dreezy song), 2016 "Body" (The Jacksons song), a song by The Jacksons from Victory, 1984 "Body" (Ja Rule song), a 2007 hip-hop song "Body" (Loud Luxury song), a 2017 house song "Body" (Marques Houston song), a 2009 R&B song "Body" (Megan Thee Stallion song), a song
lamps for lighting the clock. The clock is on the highest of three levels. The original clock was replaced during World War II with a working one, given by the Nazis because the city had maintained German graves from World War I. The massive tower is composed of walls, massive spiral stairs, wooden mezzanine constructions, pendentives and the dome. During the construction of the tower, the façade was simultaneously decorated with simple stone relief work. Church of Saint Demetrius The Church of Saint Demetrius was built in 1830 with voluntary contributions of local merchants and craftsmen. It is plain on the outside, as all churches in the Ottoman Empire had to be, but lavishly decorated with chandeliers, a carved bishop's throne and an engraved iconostasis on the inside. According to some theories, the iconostasis is a work of the Mijak engravers. Its most impressive feature is the arc above the imperial quarters with modeled figures of Jesus and the apostles. Other engraved wood items include the bishop's throne made in the spirit of the Mijak engravers, several icon frames and five more-recent pillars shaped like thrones. The frescoes originate from two periods: the end of the 19th century, and the end of World War I to the present. The icons and frescoes were created thanks to voluntary contributions of local businessmen and citizens. The authors of many of the icons had a vast knowledge of the iconography schemes of the New Testament. The icons show a great sense of color, dominated by red, green and ochre shades. The abundance of golden ornaments is noticeable and points to the presence of late-Byzantine artwork and baroque style. The icon of Saint Demetrius is signed with the initials "D. A. Z.", showing that it was made by the iconographer Dimitar Andonov the zograph in 1889. There are many other items, including chalices made by local masters, a tabernacle (darohranilka) of Russian origin, and several paintings of scenes from the New Testament, brought from Jerusalem by pilgrims.
The opening scenes of the film The Peacemaker were shot in the "Saint Dimitrija" church in Bitola, as were some scenes of Welcome to Sarajevo. Co-Cathedral of the Sacred Heart Heraclea Lyncestis Heraclea Lyncestis () was an important ancient settlement from the Hellenistic period till the early Middle Ages. It was founded by Philip II of Macedon by the middle of the 4th century BC. Today, its ruins are in the southern part of Bitola, from the city center. The covered bazaar Situated near the city centre, the covered bazaar () is one of the most impressive and oldest buildings in Bitola from the Ottoman period. With its numerous cupolas that make it look like a fortress, its tree-branch-like inner streets and its four large metal doors, it is one of the biggest covered markets in the region. It was built in the 15th century by Kara Daut Pasha Uzuncarsili, then Rumelia's Beylerbey. Although the bazaar appears secure, it has been robbed and set on fire, yet it has managed to survive. From the 15th to the 19th century the Bezisten was repeatedly rebuilt, and it housed many stores, which often changed over time. Most of them sold textiles and other luxurious fabrics. At the same time the Bezisten was a treasury: in specially made small rooms the money from the whole Rumelian Vilaet was kept before it was transferred into the royal treasury. In the 19th century the Bezisten contained 84 shops. Today most of them are contemporary and sell different types of products, but despite the internal renovations, the outward appearance of the structure has remained unchanged. Gazi Hajdar Kadi Mosque The Gazi Hajdar Kadi Mosque is one of the most attractive monuments of Islamic architecture in Bitola. It was built in the early 1560s to the design of the architect Mimar Sinan, on the order of the Bitola kadi Ajdar-kadi.
Over time, it was abandoned and heavily damaged, and at one point used as a storehouse, but recent restoration and conservation has restored its original appearance to some extent. Jeni Mosque The Jeni Mosque is located in the center of the city. It has a square base, topped with a dome. Near the mosque is a minaret, 40 m high. Today, the mosque's rooms house permanent and temporary art exhibitions. Recent archaeological excavations have revealed that it was built upon an old church. Ishak Çelebi Mosque The Ishak Çelebi Mosque is the inheritance of the kadi Ishak Çelebi. In its spacious yard are several tombs, attractive because of the soft, molded shapes of the sarcophagi. Kodža Kadi Mosque The old bazaar The old bazaar (Macedonian: Стара Чаршија) is mentioned in a description of the city from the 16th and the 17th centuries. The present bezisten does not differ much in appearance from the original one. The bezisten had eighty-six shops and four large iron gates. The shops used to sell textiles, and today sell food products. Deboj Bath The Deboj Bath is a Turkish bath (hamam). It is not known when exactly it was constructed. At one point, it was heavily damaged, but after repairs it regained its original appearance: a façade with two large domes and several minor ones. Bitola today Bitola is the economic and industrial center of southwestern North Macedonia. Many of the largest companies in the country are based in the city. The Pelagonia agricultural combine is the largest producer of food in the country. The Streževo water system is the largest in North Macedonia and has the best technological facilities. The three thermoelectric power stations of REK Bitola produce nearly 80% of the electricity in the state. The Frinko refrigerator factory was a leading electrical and metal company. Bitola also has significant capacity in the textile and food industries. Bitola is also home to thirteen consulates, which gives the city the nickname "the city of consuls."
General consulates (since 2006) (since 2006) Honorary consulates (since 2019) (since 2014) (since 2014) (since 1996) (since 2012) (since 2008) (since 2007) (since 2001) (since 2007) (since 1998) (since 2011) Former consulates (2006-2014) (2005-2014) (2000-2014) Italy has also expressed interest in opening a consulate in Bitola. Media There is one television station in Bitola, Tera, and a few regional radio stations: the private Radio 105 (Bombarder), Radio 106,6, UKLO FM and Radio Delfin, as well as a local weekly newspaper, Bitolski Vesnik. City Council The Bitola Municipality Council () is the governing body of the city and municipality of Bitola. The city council approves and rejects projects to be carried out inside the municipality, proposed by its members and the Mayor of Bitola. The Council consists of elected representatives. The number of members of the council is determined according to the number of residents in the community and cannot be fewer than nine nor more than 33. Currently the council is composed of 31 councillors. Council members are elected for a term of four years. To examine matters within its competence, the Council sets up committees. Council committees are either permanent or temporary. The permanent committees of the council are: the Finance and Budget Committee; the Commission for Public Utilities; the Committee on Urban Planning, Public Works and Environmental Protection; the Commission for Social Activities; the Commission for Local Government; and the Commission to Mark Holidays and Events and Award Certificates and Awards. Sports The most popular sports in Bitola are football and handball. The main football team is FK Pelister and they play at the Tumbe Kafe Stadium, which has a capacity of 8,000. Gjorgji Hristov, Dragan Kanatlarovski, Toni Micevski, Nikolče Noveski, Toni Savevski and Mitko Stojkovski are some of the Bitola natives who started their careers with the club. The main handball club and most famous sports team from Bitola is RK Eurofarm Pelister.
RK Eurofarm Pelister 2 is the second club from the city and both teams play their games at the Sports Hall Boro Čurlevski. The main basketball club is KK Pelister and they also compete at the Sports Hall Boro Čurlevski. All the sports teams under the name Pelister are supported by the fans known as Čkembari. Demographics Ethnic groups During Ottoman times, Bitola had a significant Aromanian population, bigger than the Slavic and Jewish ones. In 1901, the Italian consul to the Ottoman Empire in Bitola said that "Undoubtedly, Koutzo-Vlach [Aromanian] population in Bitola is most significant in this town in terms of number of inhabitants, social status and importance in trade". According to the 1948 census Bitola had 30,761 inhabitants: 77.2% (23,734) were Macedonians, 11.5% (3,543) Turks, 4.3% (1,327) Albanians, 3% (912) Serbs and 1.3% (402) Aromanians. As of 2002, the city of Bitola has 74,550 inhabitants and the ethnic composition is the following: Language According to the 2002 census the most common languages in the city are the following: Religion Bitola is a bishopric city and the seat of the Diocese of Prespa-Pelagonia. In World War II the diocese was named Ohrid-Bitola. With the restoration of the autocephaly of the Macedonian Orthodox Church in 1967, it received its present name, the Prespa-Pelagonia diocese, which covers the following regions and cities: Bitola, Resen, Prilep, Krusevo and Demir Hisar. The first bishop of the diocese (1958-1979) was Mr. Kliment. The second and current bishop and administrator of the diocese, who has held the office since 1981, is Mr. Petar. The Prespa-Pelagonia diocese has about 500 churches and monasteries. In the last ten years about 40 churches and 140 church buildings have been built or are under construction in the diocese. The diocese has two church museums, at the cathedral "St. Martyr Demetrius" in Bitola and at the Church "St. John" in Krusevo, as well as a permanent exhibition of icons and libraries in the building of the seat of the diocese. The seat building was built between 1901 and 1902 and is an example of baroque architecture. Besides the dominant Macedonian Orthodox Church, Bitola is home to other major religious groups, such as the Islamic community and the Roman Catholic Church. According to the 2002 census the religious composition of the city is the following: Bitola's Cathedral of the Sacred Heart of Jesus is the co-cathedral of the Roman Catholic Diocese of Skopje. Culture Bitola has been part of the UNESCO Creative Cities Network since December 2015. Manaki Festival of Film and Camera Every September, the film and photo festival "Brothers Manaki" is held in memory of Milton Manaki, one of the first cameramen in the Balkans. It presents a combination of documentary and feature-length films, and is a world-class event widely recognised by the press.
A number of high-profile actors such as Catherine Deneuve, Isabelle Huppert, Victoria Abril, Predrag Manojlovic, Michael York, Juliette Binoche, and Rade Sherbedgia have attended. Ilindenski Denovi Every year, the traditional folk festival "Ilinden Days" takes place in Bitola. It is a four- to five-day festival of music, songs, and dances dedicated to the Ilinden Uprising against the Turks, with the main focus placed on the folk culture of North Macedonia. Folk dances and songs are presented, with many folklore groups and organizations taking part. Small Montmartre of Bitola In the last few years, the art exhibition "Small Montmartre of Bitola", organized by the art studio "Kiril and Metodij", has turned into a successful children's art festival. Children from all over the world come to create art, making a number of highly valued art pieces that are presented in the country and around the world. "Small Montmartre of Bitola" has won numerous awards and nominations. Bitolino Bitolino is an annual children's theater festival held in August at the Babec Theater. Every year professional children's theaters from all over the world participate in the festival. The main prize is the grand prix for best performance. Si-Do Every May, Bitola hosts the international children's song festival Si-Do, which in recent years has increased in attendance. Children from all over Europe participate in this event, which usually features about 20 songs. The festival is supported by ProMedia, which organizes the event with a new topic each year. Many Macedonian musicians have participated in the festival, including Next Time and Karolina Goceva, who have also represented North Macedonia at the Eurovision Song Contest. Festival for classical music Interfest Interfest is an international festival dedicated mainly to classical music, where musicians from around the world play their classical pieces.
In addition to the classical music concerts, there are also a few nights of pop and modern music, theater plays, art exhibitions, and a day for literature presentations during the event. In the last few years there have been artists from Russia, Slovakia, Poland, and many other countries. As Bitola has been called the city with the most pianos, one night of the festival is dedicated to piano competitions. One award is given for the best young piano player, and another for competitors over 30. Akto Festival The Akto Festival for Contemporary Arts is a regional festival. The festival includes visual arts, performing arts, music and theory of culture. The first Akto festival was held in 2006. The aim of the festival is to open the cultural frameworks of a modern society by "recomposing" and redefining them in a new context. In the past, the festival has featured artists from regional countries such as Slovenia, Greece and Bulgaria, but also from Germany, Italy, France and Austria. International Monodrama Festival The International Monodrama Festival is an annual festival of monodrama held in April, organized by the Centre of Culture of Bitola; every year many actors from all over the world come to Bitola to perform monodramas. Lokum fest Lokum fest is a cultural and tourist event which has existed since 2007. The founder and organizer of the festival is the Association of Citizens Center for Cultural Decontamination Bitola. The festival is held every year in mid-July in the heart of the old Turkish bazaar in Bitola, as part of the Bitola Cultural Summer Bit Fest. Education St. Clement of Ohrid University of Bitola (Св. Климент Охридски — Битола) was founded in 1979, as a result of an increasing demand for highly skilled professionals outside the country's capital. Since 1994, it has carried the name of the Slavic educator St. Clement of Ohrid. The university has institutes in Bitola, Ohrid, and Prilep, and its headquarters is in Bitola. It has become a well-established university, and cooperates with the University of St.
Cyril and Methodius from Skopje and other universities in the Balkans and Europe. The following institutes and scientific organizations are part of the university: Technical Faculty – Bitola Economical Faculty – Prilep Faculty of Tourism and Leisure Management – Ohrid Teachers Faculty – Bitola Faculty of Biotechnological Sciences – Bitola Faculty of Information and Communication Technologies – Bitola Medical College – Bitola Faculty of Veterinary Sciences – Bitola Tobacco Institute – Prilep Hydro-biological Institute – Ohrid Slavic Cultural Institute – Prilep There are seven high schools in Bitola: "Josip Broz-Tito", a gymnasium "Taki Daskalo", a gymnasium Stopansko School (mining survey, part of Taki Daskalo) "Dr. Jovan Kalauzi", a medical high school "Jane Sandanski", an economics high school "Gjorgji Naumov", a technological high school "Kuzman Šapkarev", an agricultural high school "Toše Proeski", a musical high school The ten primary schools in Bitola are: "Todor Angelevski" "Sv. Kliment Ohridski" "Goce Delčev" "Elpida Karamandi" "Dame Gruev" "Kiril i Metodij" "Kole Kaninski" "Trifun Panovski" "Stiv Naumov" "Gjorgji Sugarev" People from Bitola Twin towns — sister cities Bitola is twinned with: Épinal, France, since 1976 Kranj, Slovenia, since 1976 Požarevac, Serbia, since 1976 Trelleborg, Sweden, since 1981 Rockdale, Australia, since 1985 Bursa, Turkey, since 1995 Pleven, Bulgaria, since 1999 Pushkin, Russia, since 2005 Kremenchuk, Ukraine, since 2006 Stari Grad (Belgrade), Serbia, since 2006 Veliko Tarnovo, Bulgaria, since 2006 Nizhny Novgorod, Russia, since 2008 Rijeka, Croatia, since 2011 Ningbo, China, since 2014 Cetinje, Montenegro, since 2020 Gallery References
promoted by subsequent Tudor administrations, became the authoritative sources for writers for the next four hundred years. As such, Tudor literature paints a flattering picture of Henry's reign, depicting the Battle of Bosworth as the final clash of the civil war and downplaying the subsequent uprisings. For England the Middle Ages ended in 1485, and English Heritage claims that other than William the Conqueror's successful invasion of 1066, no other year holds more significance in English history. By portraying Richard as a hunchbacked tyrant who usurped the throne by killing his nephews, the Tudor historians attached a sense of myth to the battle: it became an epic clash between good and evil with a satisfying moral outcome. According to Reader Colin Burrow, André was so overwhelmed by the historic significance of the battle that he represented it with a blank page in his Henry VII (1502). For Professor Peter Saccio, the battle was indeed a unique clash in the annals of English history, because "the victory was determined, not by those who fought, but by those who delayed fighting until they were sure of being on the winning side." Historians such as Adams and Horrox believe that Richard lost the battle not for any mythic reasons, but because of morale and loyalty problems in his army. Most of the common soldiers found it difficult to fight for a liege whom they distrusted, and some lords believed that their situation might improve if Richard were dethroned. According to Adams, against such duplicities Richard's desperate charge was the only knightly behaviour on the field. As fellow historian Michael Bennett puts it, the attack was "the swan-song of [mediaeval] English chivalry". Adams believes this view was shared at the time by the printer William Caxton, who enjoyed sponsorship from Edward IV and Richard III.
Nine days after the battle, Caxton published Thomas Malory's story about chivalry and death by betrayal—Le Morte d'Arthur—seemingly as a response to the circumstances of Richard's death. Elton does not believe Bosworth Field has any true significance, pointing out that the 20th-century English public largely ignored the battle until its quincentennial celebration. In his view, the dearth of specific information about the battle—no-one even knows exactly where it took place—demonstrates its insignificance to English society. Elton considers the battle as just one part of Henry's struggles to establish his reign, underscoring his point by noting that the young king had to spend ten more years pacifying factions and rebellions to secure his throne. Mackie asserts that, in hindsight, Bosworth Field is notable as the decisive battle that established a dynasty which would rule unchallenged over England for more than a hundred years. Mackie notes that historians of that time, wary of the three royal successions during the long Wars of the Roses, considered Bosworth Field just another in a lengthy series of such battles. It was through the works and efforts of Francis Bacon and his successors that the public started to believe the battle had decided their futures by bringing about "the fall of a tyrant". Shakespearian dramatisation William Shakespeare gives prominence to the Battle of Bosworth in his play, Richard III. It is the "one big battle"; no other fighting scene distracts the audience from this action, represented by a one-on-one sword fight between Henry Tudor and Richard III. Shakespeare uses their duel to bring a climactic end to the play and the Wars of the Roses; he also uses it to champion morality, portraying the "unequivocal triumph of good over evil".
Richard, the villainous lead character, has been built up in the battles of Shakespeare's earlier play, Henry VI, Part 3, as a "formidable swordsman and a courageous military leader"—in contrast to the dastardly means by which he becomes king in Richard III. Although the Battle of Bosworth has only five sentences to direct it, three scenes and more than four hundred lines precede the action, developing the background and motivations for the characters in anticipation of the battle. Shakespeare's account of the battle was mostly based on chroniclers Edward Hall's and Raphael Holinshed's dramatic versions of history, which were sourced from Vergil's chronicle. However, Shakespeare's attitude towards Richard was shaped by scholar Thomas More, whose writings displayed extreme bias against the Yorkist king. The result of these influences is a script that vilifies the king, and Shakespeare had few qualms about departing from history to incite drama. Margaret of Anjou died in 1482, but Shakespeare had her speak to Richard's mother before the battle to foreshadow Richard's fate and fulfill the prophecy she had given in Henry VI. Shakespeare exaggerated the cause of Richard's restless night before the battle, imagining it as a haunting by the ghosts of those whom the king had murdered, including Buckingham. Richard is portrayed as suffering a pang of conscience, but as he speaks he regains his confidence and asserts that he will be evil, if such is needed to retain his crown. The fight between the two armies is simulated by rowdy noises made off-stage (alarums or alarms) while actors walk on-stage, deliver their lines, and exit. To build anticipation for the duel, Shakespeare requests more alarums after Richard's councillor, William Catesby, announces that the king is "[enacting] more wonders than a man". Richard punctuates his entrance with the classic line, "A horse, a horse! My kingdom for a horse!"
He refuses to withdraw, continuing to seek to slay Henry's doubles until he has killed his nemesis. There is no documentary evidence that Henry had five decoys at Bosworth Field; the idea was Shakespeare's invention. He drew inspiration from Henry IV's use of them at the Battle of Shrewsbury (1403) to amplify the perception of Richard's courage on the battlefield. Similarly, the single combat between Henry and Richard is Shakespeare's creation. The True Tragedy of Richard III, by an unknown playwright, earlier than Shakespeare's, has no signs of staging such an encounter: its stage directions give no hint of visible combat. Despite the dramatic licences taken, Shakespeare's version of the Battle of Bosworth was the model of the event for English textbooks for many years during the 18th and 19th centuries. This glamorised version of history, promulgated in books and paintings and played out on stages across the country, perturbed humorist Gilbert Abbott à Beckett. He voiced his criticism in the form of a poem, equating the romantic view of the battle to watching a "fifth-rate production of Richard III": shabbily costumed actors fight the Battle of Bosworth on-stage while those with lesser roles lounge at the back, showing no interest in the proceedings. In Laurence Olivier's 1955 film adaptation of Richard III, the Battle of Bosworth is represented not by a single duel but a general melee that became the film's most recognised scene and a regular screening at Bosworth Battlefield Heritage Centre. The film depicts the clash between the Yorkist and Lancastrian armies on an open field, focusing on individual characters amidst the savagery of hand-to-hand fighting, and received accolades for the realism portrayed. One reviewer for The Manchester Guardian newspaper, however, was not impressed, finding the number of combatants too sparse for the wide plains and a lack of subtlety in Richard's death scene. 
The means by which Richard is shown to prepare his army for the battle also earned acclaim. As Richard speaks to his men and draws his plans in the sand using his sword, his units appear on-screen, arraying themselves according to the lines that Richard had drawn. Intimately woven together, the combination of pictorial and narrative elements effectively turns Richard into a storyteller, who acts out the plot he has constructed. Shakespearian critic Herbert Coursen extends that imagery: Richard sets himself up as a creator of men, but dies amongst the savagery of his creations. Coursen finds the depiction a contrast to that of Henry V and his "band of brothers". The adaptation of the setting for Richard III to a 1930s fascist England in Ian McKellen's 1995 film, however, did not sit well with historians. Adams posits that the original Shakespearian setting for Richard's fate at Bosworth teaches the moral of facing one's fate, no matter how unjust it is, "nobly and with dignity". By overshadowing the dramatic teaching with special effects, McKellen's film reduces its version of the battle to a pyrotechnic spectacle about the death of a one-dimensional villain. Coursen agrees that, in this version, the battle and Richard's end are trite and underwhelming. Battlefield location Officially the site of the battle is deemed by Leicestershire County Council to be in the vicinity of the town of Market Bosworth. The council engaged historian Daniel Williams to research the battle, and in 1974 his findings were used to build the Bosworth Battlefield Heritage Centre and the presentation it houses. Williams's interpretation, however, has since been questioned. Sparked by the battle's quincentenary celebration in 1985, a dispute among historians has led many to suspect the accuracy of Williams's theory. 
In particular, geological surveys conducted from 2003 to 2009 by the Battlefields Trust, a charitable organisation that protects and studies old English battlefields, show that the southern and eastern flanks of Ambion Hill were solid ground in the 15th century, contrary to Williams's claim that it was a large area of marshland. Landscape archaeologist Glenn Foard, leader of the survey, said the collected soil samples and finds of medieval military equipment suggest that the battle took place southwest of Ambion Hill (52°34′41″N 1°26′02″W), contrary to the popular belief that it was fought near the foot of the hill. Historians' theories The Historic Buildings and Monuments Commission for England (popularly referred to as "English Heritage") argues that the battle was named after Market Bosworth because the town was the nearest significant settlement to the battlefield in the 15th century. As explored by Professor Philip Morgan, a battle might initially not be named specifically at all. As time passes, writers of administrative and historical records find it necessary to identify a notable battle, ascribing it a name that is usually toponymical in nature and sourced from combatants or observers. This official name becomes accepted by society and future generations without question. Early records associated the Battle of Bosworth with "Brownehethe", "bellum Miravallenses", "Sandeford" and "Dadlyngton field". The earliest record, a municipal memorandum of 23 August 1485 from York, locates the battle "on the field of Redemore". This is corroborated by a 1485–86 letter that mentions "Redesmore" as its site. According to historian Peter Foss, records did not associate the battle with "Bosworth" until 1510. Foss is named by English Heritage as the principal advocate for "Redemore" as the battle site. He suggests the name is derived from "Hreod Mor", an Anglo-Saxon phrase that means "reedy marshland". 
Basing his opinion on 13th- and 16th-century church records, he believes "Redemore" was an area of wetland that lay between Ambion Hill and the village of Dadlington, and was close to the Fenn Lanes, a Roman road running east to west across the region. Foard believes this road to be the most probable route that both armies took to reach the battlefield. Williams dismisses the notion of "Redmore" as a specific location, saying that the term refers to a large area of reddish soil; Foss argues that Williams's sources are local stories and flawed interpretations of records. Moreover, he proposes that Williams was influenced by William Hutton's 1788 The Battle of Bosworth-Field, which Foss blames for introducing the notion that the battle was fought west of Ambion Hill on the north side of the River Sence. Hutton, as Foss suggests, misinterpreted a passage from his source, Raphael Holinshed's 1577 Chronicle. Holinshed wrote, "King Richard pitched his field on a hill called Anne Beame, refreshed his soldiers and took his rest." Foss believes that Hutton mistook "field" to mean "field of battle", thus creating the idea that the fight took place on Anne Beame (Ambion) Hill. To "[pitch] his field", as Foss clarifies, was a period expression for setting up a camp. Foss brings further evidence for his "Redemore" theory by quoting Edward Hall's 1550 Chronicle. Hall stated that Richard's army stepped onto a plain after breaking camp the next day. Furthermore, historian William Burton, author of Description of Leicestershire (1622), wrote that the battle was "fought in a large, flat, plaine, and spacious ground, three miles [5 km] distant from [Bosworth], between the Towne of Shenton, Sutton [Cheney], Dadlington and Stoke [Golding]". In Foss's opinion both sources are describing an area of flat ground north of Dadlington. Physical site English Heritage, responsible for managing England's historic sites, used both theories to designate the site for Bosworth Field. 
Without preference for either theory, they constructed a single continuous battlefield boundary that encompasses the locations proposed by both Williams and Foss. The region has experienced extensive changes over the years, starting after the battle. Holinshed stated in his chronicle that he found firm ground where he expected the marsh to be, and Burton confirmed that by the end of the 16th century, areas of the battlefield were enclosed and had been improved to make them agriculturally productive. Trees were planted on the south side of Ambion Hill, forming Ambion Wood. In the 18th and 19th centuries, the Ashby Canal carved through the land west and south-west of Ambion Hill. Winding alongside the canal at a distance, the Ashby and Nuneaton Joint Railway crossed the area on an embankment. The changes to the landscape were so extensive that when Hutton revisited the region in 1807 after an earlier 1788 visit, he could not readily find his way around. Bosworth Battlefield Heritage Centre was built on Ambion Hill, near Richard's Well. According to legend, Richard III drank from one of the several springs in the region on the day of the battle. In 1788, a local pointed out one of the springs to Hutton as the one mentioned in the legend. A stone structure bearing an inscription commemorating the legend was later built over the location. Northwest of Ambion Hill, just across the northern tributary of the River Sence, a flag and memorial stone mark Richard's Field. Erected in 1973, the site was selected on the basis of Williams's theory. St James's Church at Dadlington is the only structure in the area that is reliably associated with the Battle of Bosworth; the bodies of those killed in the battle were buried there. Rediscovered battlefield and possible battle scenario The extensive survey carried out between 2005 and 2009 by the Battlefields Trust, headed by Glenn Foard, eventually led to the discovery of the real location of the core battlefield.
This lies about a kilometre further west than the location suggested by Peter Foss. It is in what was at the time of the battle an area of marginal land at the meeting of several township boundaries. There was a cluster of field names suggesting the presence of marshland and heath. Thirty-four lead round shot were discovered as a result of systematic metal detecting (more than the total found previously on all other 15th-century European battlefields), as well as other significant finds, including a small silver-gilt badge depicting a boar. Henry Percy, 4th Earl of Northumberland, had traditionally been charged to defend northern England and maintain its peace. Initially the earl had issues with Richard III as Edward groomed his brother to be the leading power of the north. Northumberland was mollified when he was promised he would be the Warden of the East March, a position that was formerly hereditary for the Percys. He served under Richard during the 1482 invasion of Scotland, and the allure of being in a position to dominate the north of England if Richard went south to assume the crown was his likely motivation for supporting Richard's bid for kingship. However, after becoming king, Richard began moulding his nephew, John de la Pole, 1st Earl of Lincoln, to manage the north, passing over Northumberland for the position. According to Carpenter, although the earl was amply compensated, he despaired of any possibility of advancement under Richard. Lancastrians Henry Tudor was unfamiliar with the arts of war and was a stranger to the land he was trying to conquer. He spent the first fourteen years of his life in Wales and the next fourteen in Brittany and France. Slender but strong and decisive, Henry lacked a penchant for battle and was not much of a warrior; chroniclers such as Polydore Vergil and ambassadors like Pedro de Ayala found him more interested in commerce and finance. Having not fought in any battles, Henry recruited several experienced veterans to command his armies.
John de Vere, 13th Earl of Oxford, was Henry's principal military commander. He was adept in the arts of war. At the Battle of Barnet, he commanded the Lancastrian right wing and routed the division opposing him. However, as a result of confusion over identities, Oxford's group came under friendly fire from the Lancastrian main force and retreated from the field. The earl fled abroad and continued his fight against the Yorkists, raiding shipping and eventually capturing the island fort of St Michael's Mount in 1473. He surrendered after receiving no aid or reinforcement, but in 1484 escaped from prison and joined Henry's court in France, bringing along his erstwhile gaoler Sir James Blount. Oxford's presence raised morale in Henry's camp and troubled Richard III. Stanleys In the early stages of the Wars of the Roses, the Stanleys of Cheshire had been predominantly Lancastrians. Sir William Stanley, however, was a staunch Yorkist supporter, fighting in the Battle of Blore Heath in 1459 and helping Hastings to put down uprisings against Edward IV in 1471. When Richard took the crown, Sir William showed no inclination to turn against the new king, refraining from joining Buckingham's rebellion, for which he was amply rewarded. Sir William's elder brother, Thomas Stanley, 2nd Baron Stanley, was not as steadfast. By 1485, he had served three kings, namely Henry VI, Edward IV, and Richard III. Lord Stanley's skilled political manoeuvrings—vacillating between opposing sides until it was clear who would be the winner—gained him high positions; he was Henry's chamberlain and Edward's steward. His non-committal stance, until the crucial point of a battle, earned him the loyalty of his men, who felt he would not needlessly send them to their deaths. Lord Stanley's relations with the king's brother, the eventual Richard III, were not cordial. The two had conflicts that erupted into violence around March 1470.
Furthermore, having taken Lady Margaret as his second wife in June 1472, Stanley was Henry Tudor's stepfather, a relationship which did nothing to win him Richard's favour. Despite these differences, Stanley did not join Buckingham's revolt in 1483. When Richard executed those conspirators who had been unable to flee England, he spared Lady Margaret. However, he declared her titles forfeit and transferred her estates to Stanley's name, to be held in trust for the Yorkist crown. Richard's act of mercy was calculated to reconcile him with Stanley, but it may have been to no avail—Carpenter has identified a further cause of friction in Richard's intention to reopen an old land dispute that involved Thomas Stanley and the Harrington family. Edward IV had ruled the case in favour of Stanley in 1473, but Richard planned to overturn his brother's ruling and give the wealthy estate to the Harringtons. Immediately before the Battle of Bosworth, being wary of Stanley, Richard took his son, Lord Strange, as hostage to discourage him from joining Henry. Crossing the English Channel and through Wales Henry's initial force consisted of the English and Welsh exiles who had gathered around Henry, combined with a contingent of mercenaries put at his disposal by Charles VIII of France. The history of Scottish author John Major (published in 1521) claims that Charles had granted Henry 5,000 men, of whom 1,000 were Scots, headed by Sir Alexander Bruce. No mention of Scottish soldiers was made by subsequent English historians. Henry's crossing of the English Channel in 1485 was without incident. Thirty ships sailed from Harfleur on 1 August and, with fair winds behind them, landed in his native Wales, at Mill Bay (near Dale) on the north side of Milford Haven on 7 August, easily capturing nearby Dale Castle. Henry received a muted response from the local population. No joyous welcome awaited him on shore, and at first few individual Welshmen joined his army as it marched inland. 
Historian Geoffrey Elton suggests only Henry's ardent supporters felt pride over his Welsh blood. Contemporary Welsh bards such as Dafydd Ddu and Gruffydd ap Dafydd hailed his arrival, portraying him as the true prince and "the youth of Brittany defeating the Saxons" who would bring their country back to glory. When Henry moved to Haverfordwest, the county town of Pembrokeshire, Richard's lieutenant in South Wales, Sir Walter Herbert, failed to move against Henry, and two of his officers, Richard Griffith and Evan Morgan, deserted to Henry with their men. The most important defector to Henry in this early stage of the campaign was probably Rhys ap Thomas, who was the leading figure in West Wales. Richard had appointed Rhys Lieutenant in West Wales for his refusal to join Buckingham's rebellion, asking that he surrender his son Gruffydd ap Rhys ap Thomas as surety, although by some accounts Rhys had managed to evade this condition. However, Henry successfully courted Rhys, offering the lieutenancy of all Wales in exchange for his fealty. Henry marched via Aberystwyth while Rhys followed a more southerly route, recruiting a force of Welshmen en route, variously estimated at 500 or 2,000 men, to swell Henry's army when they reunited at Cefn Digoll, Welshpool. By 15 or 16 August, Henry and his men had crossed the English border, making for the town of Shrewsbury. Shrewsbury: the gateway to England Since 22 June Richard had been aware of Henry's impending invasion, and had ordered his lords to maintain a high level of readiness. News of Henry's landing reached Richard on 11 August, but it took three to four days for his messengers to notify his lords of their king's mobilisation. On 16 August, the Yorkist army started to gather; Norfolk set off for Leicester, the assembly point, that night. The city of York, a historical stronghold of Richard's family, asked the king for instructions, and receiving a reply three days later sent 80 men to join the king.
Simultaneously Northumberland, whose northern territory was the most distant from the capital, had gathered his men and ridden to Leicester. Although London was his goal, Henry did not move directly towards the city. After resting in Shrewsbury, his forces went eastwards and picked up Sir Gilbert Talbot and other English allies, including deserters from Richard's forces. Although its size had increased substantially since the landing, Henry's army was still considerably outnumbered by Richard's forces. Henry's pace through Staffordshire was slow, delaying the confrontation with Richard so that he could gather more recruits to his cause. Henry had been communicating on friendly terms with the Stanleys for some time before setting foot in England, and the Stanleys had mobilised their forces on hearing of Henry's landing. They ranged themselves ahead of Henry's march through the English countryside, meeting twice in secret with Henry as he moved through Staffordshire. At the second of these, at Atherstone in Warwickshire, they conferred "in what sort to arraign battle with King Richard, whom they heard to be not far off". On 21 August, the Stanleys were making camp on the slopes of a hill north of Dadlington, while Henry encamped his army at White Moors to the northwest of their camp. On 20 August, Richard rode from Nottingham to Leicester, joining Norfolk. He spent the night at the Blue Boar inn (demolished 1836). Northumberland arrived the following day. The royal army proceeded westwards to intercept Henry's march on London. Passing Sutton Cheney, Richard moved his army towards Ambion Hill—which he thought would be of tactical value—and made camp on it. Richard's sleep was not peaceful and, according to the Croyland Chronicle, in the morning his face was "more livid and ghastly than usual". Engagement The Yorkist army, variously estimated at between 7,500 and 12,000 men, deployed on the hilltop along the ridgeline from west to east.
Norfolk's force (or "battle" in the parlance of the time) of spearmen stood on the right flank, protecting the cannon and about 1,200 archers. Richard's group, comprising 3,000 infantry, formed the centre. Northumberland's men guarded the left flank; he had approximately 4,000 men, many of them mounted. Standing on the hilltop, Richard had a wide, unobstructed view of the area. He could see the Stanleys and their 4,000–6,000 men holding positions on and around Dadlington Hill, while to the southwest was Henry's army. Henry's force has been variously estimated at between 5,000 and 8,000 men, his original landing force of exiles and mercenaries having been augmented by the recruits gathered in Wales and the English border counties (in the latter area probably mustered chiefly by the Talbot interest), and by deserters from Richard's army. Historian John Mackie believes that 1,800 French mercenaries, led by Philibert de Chandée, formed the core of Henry's army. John Mair, writing thirty-five years after the battle, claimed that this force contained a significant Scottish component, and this claim is accepted by some modern writers, but Mackie reasons that the French would not have released their elite Scottish knights and archers, and concludes that there were probably few Scottish troops in the army, although he accepts the presence of captains like Bernard Stewart, Lord of Aubigny. In their interpretations of the vague mentions of the battle in the old text, historians placed areas near the foot of Ambion Hill as likely regions where the two armies clashed, and thought up possible scenarios of the engagement. In their recreations of the battle, Henry started by moving his army towards Ambion Hill where Richard and his men stood. As Henry's army advanced past the marsh at the southwestern foot of the hill, Richard sent a message to Stanley, threatening to execute his son, Lord Strange, if Stanley did not join the attack on Henry immediately. 
Stanley replied that he had other sons. Incensed, Richard gave the order to behead Strange but his officers temporised, saying that battle was imminent, and it would be more convenient to carry out the execution afterwards. Henry had also sent messengers to Stanley asking him to declare his allegiance. The reply was evasive—the Stanleys would "naturally" come, after Henry had given orders to his army and arranged them for battle. Henry had no choice but to confront Richard's forces alone. Well aware of his own military inexperience, Henry handed command of his army to Oxford and retired to the rear with his bodyguards. Oxford, seeing the vast line of Richard's army strung along the ridgeline, decided to keep his men together instead of splitting them into the traditional three battles: vanguard, centre, and rearguard. He ordered the troops to stray no further than from their banners, fearing that they would become enveloped. Individual groups clumped together, forming a single large mass flanked by horsemen on the wings. The Lancastrians were harassed by Richard's cannon as they manoeuvred around the marsh, seeking firmer ground. Once Oxford and his men were clear of the marsh, Norfolk's battle and several contingents of Richard's group, under the command of Sir Robert Brackenbury, started to advance. Hails of arrows showered both sides as they closed. Oxford's men proved the steadier in the ensuing hand-to-hand combat; they held their ground and several of Norfolk's men fled the field. Norfolk lost one of his senior officers, Walter Devereux, in this early clash. Recognising that his force was at a disadvantage, Richard signalled for Northumberland to assist but Northumberland's group showed no signs of movement. Historians, such as Horrox and Pugh, believe Northumberland chose not to aid his king for personal reasons. 
Ross doubts the aspersions cast on Northumberland's loyalty, suggesting instead that Ambion Hill's narrow ridge hindered him from joining the battle. The earl would have had to either go through his allies or execute a wide flanking move—near impossible to perform given the standard of drill at the time—to engage Oxford's men. At this juncture Richard saw Henry at some distance behind his main force and decided to end the fight quickly by killing the enemy commander. He led a charge of mounted men around the melee and tore into Henry's group; several accounts state that Richard's force numbered 800–1,000 knights, but Ross says it was more likely that Richard was accompanied only by his household men and closest friends. Richard killed Henry's standard-bearer Sir William Brandon in the initial charge and unhorsed burly John Cheyne, Edward IV's former standard-bearer, with a blow to the head from his broken lance. French mercenaries in Henry's retinue related how the attack had caught them off guard and that Henry sought protection by dismounting and concealing himself among them to present less of a target. Henry made no attempt to engage in combat himself. Oxford had left a small reserve of pike-equipped men with Henry. They slowed the pace of Richard's mounted charge, and bought Tudor some critical time. The remainder of Henry's bodyguards surrounded their master, and succeeded in keeping him away from the Yorkist king. Meanwhile, seeing Richard embroiled with Henry's men and separated from his main force, William Stanley made his move and rode to the aid of Henry. Now outnumbered, Richard's group was surrounded and gradually pressed back. Richard's force was driven several hundred yards away from Tudor, near to the edge of a marsh, into which the king's horse toppled. Richard, now unhorsed, gathered himself and rallied his dwindling followers, supposedly refusing to retreat: "God forbid that I retreat one step.
I will either win the battle as a king, or die as one." In the fighting Richard's banner man—Sir Percival Thirlwall—lost his legs, but held the Yorkist banner aloft until he was killed. It is likely that James Harrington also died in the charge. The king's trusted advisor Richard Ratcliffe was also slain. Polydore Vergil, Henry Tudor's official historian, recorded that "King Richard, alone, was killed fighting manfully in the thickest press of his enemies". Richard had come within a sword's length of Henry Tudor before being surrounded by William Stanley's men and killed. The Burgundian chronicler Jean Molinet says that a Welshman struck the death-blow with a halberd while Richard's horse was stuck in the marshy ground. It was said that the blows were so violent that the king's helmet was driven into his skull. The contemporary Welsh poet Guto'r Glyn implies the leading Welsh Lancastrian Rhys ap Thomas, or one of his men, killed the king, writing that he " Lladd y baedd, eilliodd ei ben" (In English, "killed the boar, shaved his head"). Analysis of King Richard's skeletal remains found 11 wounds, nine of them to the head; a blade consistent with a halberd had sliced off part of the rear of Richard's skull, suggesting he had lost his helmet. Richard's forces disintegrated as news of his death spread. Northumberland and his men fled north on seeing the king's fate, and Norfolk was killed by the knight Sir John Savage in single combat according to the Ballad of Lady Bessy. After the battle Although he claimed fourth-generation, maternal Lancastrian descendancy, Henry seized the crown by right of conquest. After the battle, Richard's circlet is said to have been found and brought to Henry, who was proclaimed king at the top of Crown Hill, near the village of Stoke Golding. According to Vergil, Henry's official historian, Lord Stanley found the circlet. 
Historians Stanley Chrimes and Sydney Anglo dismiss the legend of the circlet's finding in a hawthorn bush; none of the contemporary sources reported such an event. Ross, however, does not ignore the legend. He argues that the hawthorn bush would not be part of Henry's coat of arms if it did not have a strong relationship to his ascendance. Baldwin points out that a hawthorn bush motif was already used by the House of Lancaster, and Henry merely added the crown. In Vergil's chronicle, 100 of Henry's men, compared to 1,000 of Richard's, died in this battle—a ratio Chrimes believes to be an exaggeration. The bodies of the fallen were brought to St James Church at Dadlington for burial. However, Henry denied any immediate rest for Richard; instead the last Yorkist king's corpse was stripped naked and strapped across a horse. His body was brought to Leicester and openly exhibited to prove that he was dead. Early accounts suggest that this was in the major Lancastrian collegiate foundation, the Church of the Annunciation of Our Lady of the Newarke. After two days, the corpse was interred in a plain tomb, within the church of the Greyfriars. The church was demolished following the friary's dissolution in 1538, and the location of Richard's tomb was long uncertain. On 12 September 2012, archaeologists announced the discovery of a buried skeleton with spinal abnormalities and head injuries under a car park in Leicester, and their suspicions that it was Richard III. On 4 February 2013, it was announced that DNA testing had convinced Leicester University scientists and researchers "beyond reasonable doubt" that the remains were those of King Richard. On 26 March 2015, these remains were ceremonially buried in Leicester Cathedral. Richard's tomb was unveiled on the following day. Henry dismissed the mercenaries in his force, retaining only a small core of local soldiers to form a "Yeomen of his Garde", and proceeded to establish his rule of England. 
Parliament reversed his attainder and recorded Richard's kingship as illegal, although the Yorkist king's reign remained officially in the annals of England's history. The proclamation of Edward IV's children as illegitimate was also reversed, restoring Elizabeth's status to a royal princess. The marriage of Elizabeth, the heiress to the House of York, to Henry, the master of the House of Lancaster, marked the end of the feud between the two houses and the start of the Tudor dynasty. The royal matrimony, however, was delayed until Henry was crowned king and had established his claim on the throne firmly enough to preclude that of Elizabeth and her kin. Henry further convinced Parliament to backdate his reign to the day before the battle, enabling him retrospectively to declare as traitors those who had fought against him at Bosworth Field. Northumberland, who had remained inactive during the battle, was imprisoned but later released and reinstated to pacify the north in Henry's name. The purge of those who fought for Richard occupied Henry's first two years of rule, although later he proved prepared to accept those who submitted to him regardless of their former allegiances. Of his supporters, Henry rewarded the Stanleys the most generously. Aside from making William his chamberlain, he bestowed the earldom of Derby upon Lord Stanley along with grants and offices in other estates. Henry rewarded Oxford by restoring to him the lands and titles confiscated by the Yorkists and appointing him as Constable of the Tower and admiral of England, Ireland, and Aquitaine. For his kin, Henry created Jasper Tudor the Duke of Bedford. He returned to his mother the lands and grants stripped from her by Richard, and proved to be a filial son, granting her a place of honour in the palace and faithfully attending to her throughout his reign. Parliament's declaration of Margaret as femme sole effectively empowered her; she no longer needed to manage her estates through Stanley.
Elton points out that despite his initial largesse, Henry's supporters at Bosworth would enjoy his special favour for only the short term; in later years, he would instead promote those who best served his interests. Like the kings before him, Henry faced dissenters. The first open revolt occurred two years after Bosworth Field; Lambert Simnel claimed to be Edward Plantagenet, 17th Earl of Warwick, who was Edward IV's nephew. The Earl of Lincoln backed him for the throne and led rebel forces in the name of the House of York. The rebel army fended off several attacks by Northumberland's forces, before engaging Henry's army at the Battle of Stoke Field on 16 June 1487. Oxford and Bedford led Henry's men, including several former supporters of Richard III. Henry won this battle easily, but other malcontents and conspiracies would follow. A rebellion in 1489 started with Northumberland's murder; military historian Michael C. C. Adams says that the author of a note, which was left next to Northumberland's body, blamed the earl for Richard's death. Legacy and historical significance Contemporary accounts of the Battle of Bosworth can be found in four main sources, one of which is the English Croyland Chronicle, written by a senior Yorkist chronicler who relied on second-hand information from nobles and soldiers. The other accounts were written by foreigners—Vergil, Jean Molinet, and Diego de Valera. Whereas Molinet was sympathetic to Richard, Vergil was in Henry's service and drew information from the king and his subjects to portray them in a good light. Diego de Valera, whose information Ross regards as unreliable, compiled his work from letters of Spanish merchants. However, other historians have used Valera's work to deduce possibly valuable insights not readily evident in other sources. Ross finds the poem, The Ballad of Bosworth Field, a useful source to ascertain certain details of the battle. 
The multitude of different accounts, mostly based on second- or third-hand information, has proved an obstacle to historians as they try to reconstruct the battle. Their common complaint is that, except for its outcome, very few details of the battle are found in the chronicles. According to historian Michael Hicks, the Battle of Bosworth is one of the worst-recorded clashes of the Wars of the Roses. Historical depictions and interpretations Henry tried to present his victory as a new beginning for the country; he hired chroniclers to portray his reign as a "modern age" with its dawn in 1485. Hicks states that the works of Vergil and the blind historian Bernard André, promoted by subsequent Tudor administrations, became the authoritative sources for writers for the next four hundred years. As such, Tudor literature paints a flattering picture of Henry's reign, depicting the Battle of Bosworth as the final clash of the civil war and downplaying the subsequent uprisings. For England the Middle Ages ended in 1485, and English Heritage claims that other than William the Conqueror's successful invasion of 1066, no other year holds more significance in English history. By portraying Richard as a hunchbacked tyrant who usurped the throne by killing his nephews, the Tudor historians attached a sense of myth to the battle: it became an epic clash between good and evil with a satisfying moral outcome. According to Reader Colin Burrow, André was so overwhelmed by the historic significance of the battle that he represented it with a blank page in his Henry VII (1502). For Professor Peter Saccio, the battle was indeed a unique clash in the annals of English history, because "the victory was determined, not by those who fought, but by those who delayed fighting until they were sure of being on the winning side." Historians such as Adams and Horrox believe that Richard lost the battle not for any mythic reasons, but because of morale and loyalty problems in his army. 
Most of the common soldiers found it difficult to fight for a liege whom they distrusted, and some lords believed that their situation might improve if Richard were dethroned. According to Adams, against such duplicities Richard's desperate charge was the only knightly behaviour on the field. As fellow historian Michael Bennet puts it, the attack was "the swan-song of [mediaeval] English chivalry". Adams believes this view was shared at the time by the printer William Caxton, who enjoyed sponsorship from Edward IV and Richard III. Nine days after the battle, Caxton published Thomas Malory's story about chivalry and death by betrayal—Le Morte d'Arthur—seemingly as a response to the circumstances of Richard's death. Elton does not believe Bosworth Field has any true significance, pointing out that the 20th-century English public largely ignored the battle until its quincentennial celebration. In his view, the dearth of specific information about the battle—no-one even knows exactly where it took place—demonstrates its insignificance to English society. Elton considers the battle as just one part of Henry's struggles to establish his reign, underscoring his point by noting that the young king had to spend ten more years pacifying factions and rebellions to secure his throne. Mackie asserts that, in hindsight, Bosworth Field is notable as the decisive battle that established a dynasty which would rule unchallenged over England for more than a hundred years. Mackie notes that contemporary historians of that time, wary of the three royal successions during the long Wars of the Roses, considered Bosworth Field just another in a lengthy series of such battles. It was through the works and efforts of Francis Bacon and his successors that the public started to believe the battle had decided their futures by bringing about "the fall of a tyrant". Shakespearian dramatisation William Shakespeare gives prominence to the Battle of Bosworth in his play, Richard III. 
It is the "one big battle"; no other fighting scene distracts the audience from this action, represented by a one-on-one sword fight between Henry Tudor and Richard III. Shakespeare uses their duel to bring a climactic end to the play and the Wars of the Roses; he also uses it to champion morality, portraying the "unequivocal triumph of good over evil". Richard, the villainous lead character, has been built up in the battles of Shakespeare's earlier play, Henry VI, Part 3, as a "formidable swordsman and a courageous military leader"—in contrast to the dastardly means by which he becomes king in Richard III. Although the Battle of Bosworth has only five sentences to direct it, three scenes and more than four hundred lines precede the action, developing the background and motivations for the characters in anticipation of the battle. Shakespeare's account of the battle was mostly based on chroniclers Edward Hall's and Raphael Holinshed's dramatic versions of history, which were sourced from Vergil's chronicle. However, Shakespeare's attitude towards Richard was shaped by scholar Thomas More, whose writings displayed extreme bias against the Yorkist king. The result of these influences is a script that vilifies the king, and Shakespeare had few qualms about departing from history to incite drama. Margaret of Anjou died in 1482, but Shakespeare had her speak to Richard's mother before the battle to foreshadow Richard's fate and fulfill the prophecy she had given in Henry VI. Shakespeare exaggerated the cause of Richard's restless night before the battle, imagining it as a haunting by the ghosts of those whom the king had murdered, including Buckingham. Richard is portrayed as suffering a pang of conscience, but as he speaks he regains his confidence and asserts that he will be evil, if such needed to retain his crown. The fight between the two armies is simulated by rowdy noises made off-stage (alarums |
although sometimes deeply incised, alternatingly set leaves without stipules or in leaf rosettes, with terminal inflorescences without bracts, containing flowers with four free sepals, four free alternating petals, two short and four longer free stamens, and a fruit with seeds in rows, divided by a thin wall (or septum). The family contains 372 genera and 4,060 accepted species. The largest genera are Draba (440 species), Erysimum (261 species), Lepidium (234 species), Cardamine (233 species), and Alyssum (207 species). The family contains the cruciferous vegetables, including species such as Brassica oleracea (e.g. broccoli, cabbage, cauliflower, kale, collards), Brassica rapa (turnip, Chinese cabbage, etc.), Brassica napus (rapeseed, etc.), Raphanus sativus (common radish), Armoracia rusticana (horseradish), but also the cut flower Matthiola (stock) and the model organism Arabidopsis thaliana (thale cress). Pieris rapae and other butterflies of the family Pieridae are some of the best-known pests of Brassicaceae species planted as commercial crops. The Trichoplusia ni (cabbage looper) moth is also becoming increasingly problematic for crucifers due to its resistance to commonly used pest control methods. Some rarer Pieris butterflies, such as Pieris virginiensis, depend upon native mustards for their survival in their native habitats. Some non-native mustards, such as garlic mustard, Alliaria petiolata, an extremely invasive species in the United States, can be toxic to their larvae. Taxonomy Carl Linnaeus in 1753 regarded the Brassicaceae as a natural group, naming them "Klass" Tetradynamia. Alfred Barton Rendle placed the family in the order Rhoeadales, while George Bentham and Joseph Dalton Hooker, in their system published between 1862 and 1883, assigned it to their cohort Parietales (now the class Violales). Following Bentham and Hooker, John Hutchinson in 1948 and again in 1964 thought the Brassicaceae to stem from near the Papaveraceae.
In 1994, a group of scientists including Walter Stephen Judd suggested including the Capparaceae in the Brassicaceae. Early DNA-analysis showed that the Capparaceae—as defined at that moment—were paraphyletic, and it was suggested to assign the genera closest to the Brassicaceae to the Cleomaceae. The Cleomaceae and Brassicaceae diverged approximately 41 million years ago. All three families have consistently been placed in one order (variably called Capparales or Brassicales). The APG II system merged Cleomaceae and Brassicaceae. Other classifications have continued to recognize the Capparaceae, but with a more restricted circumscription, either including Cleome and its relatives in the Brassicaceae or recognizing them in the segregate family Cleomaceae. The APG III system has recently adopted this last solution, but this may change as a consensus arises on this point. Current insights into the relationships of the Brassicaceae, based on a 2012 DNA-analysis, are summarized in the following tree. Relationships within the family Early classifications depended on morphological comparison only, but because of extensive convergent evolution, these do not provide a reliable phylogeny. Although a substantial effort was made through molecular phylogenetic studies, the relationships within the Brassicaceae have not yet been fully resolved. It has long been clear that the Aethionema are sister to the remainder of the family. One analysis from 2014 represented the relationships between 39 tribes with the following tree. Etymology The name Brassicaceae comes to international scientific vocabulary from New Latin, from Brassica, the type genus, + -aceae, a standardized suffix for plant family names in modern taxonomy. The genus name comes from the Classical Latin word brassica, referring to cabbage and other cruciferous vegetables. The alternative older name, Cruciferae, meaning "cross-bearing", describes the four petals of mustard flowers, which resemble a cross.
Cruciferae is one of eight plant family names not derived from a genus name and without the suffix -aceae that are authorized as alternative names. Genera Version 1 of the Plantlist website lists 349 genera. Description Species belonging to the Brassicaceae are mostly annual, biennial, or perennial herbaceous plants; some are dwarf shrubs or shrubs, and very few are vines. Although generally terrestrial, a few species such as water awlwort live submerged in fresh water. They may have a taproot or a sometimes woody caudex that may have few or many branches, some have thin or tuberous rhizomes, or rarely develop runners. Few species have multi-cellular glands. Hairs consist of one cell and occur in many forms: from simple to forked, star-, tree- or T-shaped, rarely taking the form of a shield or scale. They are never topped by a gland. The stems may be upright, rise up towards the tip, or lie flat, are mostly herbaceous but sometimes woody. Stems carry leaves or the stems may be leafless (in Caulanthus), and some species lack stems altogether. The leaves do not have stipules, but there may be a pair of glands at the base of leafstalks and flowerstalks. The leaf may be seated or have a leafstalk. The leaf blade is usually simple, entire or dissected, rarely trifoliolate or pinnately compound. A leaf rosette at the | leaves. The orientation of the pedicels when fruits are ripe varies depending on the species. The flowers are bisexual, star symmetrical (zygomorphic in Iberis and Teesdalia), with the ovary positioned above the other floral parts. Each flower has four free or rarely merged sepals, the lateral two sometimes with a shallow spur, which are mostly shed after flowering, rarely persistent, may be reflexed, spreading, ascending, or erect, together forming a tube-, bell- or urn-shaped calyx. Each flower has four petals, set alternating with the sepals, although in some species these are rudimentary or absent.
They may be differentiated into a blade and a claw or not, and consistently lack basal appendages. The blade is entire or has an indent at the tip, and may sometimes be much smaller than the claws. The mostly six stamens are set in two whorls: usually the two lateral, outer ones are shorter than the four inner stamens, but very rarely the stamens can all have the same length, and very rarely species have different numbers of stamens such as sixteen to twenty-four in Megacarpaea, four in Cardamine hirsuta, and two in Coronopus. The filaments are slender and not fused, while the anthers consist of two pollen-producing cavities, and open with longitudinal slits. The pollen grains are tricolpate. The receptacle carries a variable number of nectaries, but these are always present opposite the base of the lateral stamens. Ovary, fruit and seed There is one superior pistil that consists of two carpels that may either sit directly above the base of the stamens or on a stalk. It initially consists of only one cavity, but during its further development a thin wall grows between the two placentas, dividing the cavity and separating the two valves (a so-called false septum). Rarely, there is only one cavity without a septum. The 2–600 ovules are usually along the side margin of the carpels, or rarely at the top. Fruits are capsules that open with two valves, usually towards the top. These are called a silique if at least three times longer than wide, or a silicle if the length is less than three times the width. The fruit is very variable in its other traits. There may be one persistent style that connects the ovary to the globular or conical stigma, which is undivided or has two spreading or connivent lobes. The variously shaped seeds are usually yellow or brown in color, and arranged in one or two rows in each cavity. The seed leaves are entire or have a notch at the tip. The seed does not contain endosperm.
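The silique/silicle distinction described above reduces to a simple aspect-ratio rule. As an illustrative sketch (the function name, units, and error handling are our own, not part of any botanical standard):

```python
def classify_capsule(length_mm: float, width_mm: float) -> str:
    """Apply the conventional rule: a Brassicaceae capsule at least three
    times longer than wide is a silique; otherwise it is a silicle."""
    if length_mm <= 0 or width_mm <= 0:
        raise ValueError("dimensions must be positive")
    return "silique" if length_mm >= 3 * width_mm else "silicle"
```

For example, the long narrow fruit of Arabidopsis thaliana (roughly 15 mm by 1 mm) falls on the silique side of the rule, while the short rounded pods of Lepidium fall on the silicle side.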
Differences with similar families Brassicaceae have a bisymmetrical corolla (left is mirrored by right, stem-side by out-side, but each quarter is not symmetrical), a septum dividing the fruit, lack stipules and have simple (although sometimes deeply incised) leaves. The sister family Cleomaceae has bilaterally symmetrical corollas (left is mirrored by right, but stem-side is different from out-side), stipules and mostly palmately divided leaves, and mostly no septum. Capparaceae generally have a gynophore, sometimes an androgynophore, and a variable number of stamens. Phytochemistry Almost all Brassicaceae have C3 carbon fixation. The only exceptions are a few Moricandia species, which have a hybrid system between C3 and C4 carbon fixation, C4 fixation being more efficient in drought, high temperature and low nitrate availability. Brassicaceae contain different cocktails of dozens of glucosinolates. They also contain enzymes called myrosinases, which convert the glucosinolates into isothiocyanates, thiocyanates and nitriles, which are toxic to many organisms and so help guard against herbivory. Distribution Brassicaceae can be found on almost the entire land surface of the planet, but the family is absent from Antarctica, and also absent from some areas in the tropics such as northeastern Brazil, the Congo basin, Maritime Southeast Asia and tropical Australasia. The area of origin of the family is possibly the Irano-Turanian Region, where approximately 900 species occur in 150 different genera. About 530 of those 900 species are endemics. Next in abundance comes the Mediterranean Region, with around 630 species (290 of which are endemic) in 113 genera. The family is less prominent in the Saharo-Arabian Region—65 genera, 180 species of which 62 are endemic—and North America (comprising the North American Atlantic Region and the Rocky Mountain Floristic Region)—99 genera, 780 species of which 600 are endemic.
South America has 40 genera containing 340 native species, Southern Africa 15 genera with over 100 species, and Australia and New Zealand have 19 genera with 114 species between them. Ecology Brassicaceae are almost exclusively pollinated by insects. A chemical mechanism in the pollen is active in many species to avoid selfing. Two notable exceptions are exclusive self-pollination in closed flowers in Cardamine chenopodifolia, and wind pollination in Pringlea antiscorbutica. Although it can be cross-pollinated, Alliaria petiolata is self-fertile. Most species reproduce sexually through seed, but Cardamine bulbifera produces gemmae, and in others, such as Cardamine pentaphyllos, the coral-like roots easily break into segments that will grow into separate plants. In some species, such as in the genus Cardamine, seed pods open with force and so catapult the seeds quite far. Many of these have sticky seed coats, assisting long-distance dispersal by animals, and this may also explain several intercontinental dispersal events in the genus, and its near-global distribution. Brassicaceae are common on serpentine
few core statistics have been traditionally referenced – batting average, RBI, and home runs. To this day, a player who leads the league in all three of these statistics earns the "Triple Crown". For pitchers, wins, ERA, and strikeouts are the most often-cited statistics, and a pitcher leading his league in these statistics may also be referred to as a "triple crown" winner. General managers and baseball scouts have long used the major statistics, among other factors and opinions, to understand player value. Managers, catchers and pitchers use the statistics of batters of opposing teams to develop pitching strategies and set defensive positioning on the field. Managers and batters study opposing pitcher performance and motions in attempting to improve hitting. Scouts use stats when they are looking at a player whom they may end up drafting or signing to a contract. Some sabermetric statistics that measure a batter's overall performance have entered the mainstream baseball world, including on-base plus slugging, commonly referred to as OPS. OPS adds the hitter's on-base percentage (number of times reached base by any means divided by total plate appearances) to their slugging percentage (total bases divided by at-bats). Some argue that the OPS formula is flawed and that more weight should be shifted towards OBP (on-base percentage). The statistic wOBA (weighted on-base average) attempts to correct for this. OPS is also useful when determining a pitcher's level of success. "Opponent on-base plus slugging" (OOPS) is becoming a popular tool to evaluate a pitcher's actual performance. When analyzing a pitcher's statistics, some useful categories include K/9IP (strikeouts per nine innings), K/BB (strikeouts per walk), HR/9 (home runs per nine innings), WHIP (walks plus hits per inning pitched), and OOPS (opponent on-base plus slugging).
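The rate statistics named above are simple ratios of counting totals. A minimal sketch (function names are ours, and inputs are assumed to be season totals):

```python
def on_base_pct(h, bb, hbp, ab, sf):
    # OBP: times reached base (H + BB + HBP) over (AB + BB + HBP + SF)
    return (h + bb + hbp) / (ab + bb + hbp + sf)

def slugging_pct(singles, doubles, triples, hr, ab):
    # SLG: total bases over at-bats
    return (singles + 2 * doubles + 3 * triples + 4 * hr) / ab

def ops(obp, slg):
    # OPS: on-base percentage plus slugging percentage
    return obp + slg

def whip(bb, h, ip):
    # WHIP: walks plus hits per inning pitched
    return (bb + h) / ip

def k_per_9(k, ip):
    # K/9IP: strikeouts per nine innings pitched
    return 9 * k / ip
```

Note that this sketch uses the official OBP denominator (AB + BB + HBP + SF) rather than total plate appearances; dividing by plate appearances, as in the prose above, is a common simplification.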
However, since 2001, more emphasis has been placed on defense-independent pitching statistics, including defense-independent ERA (dERA), in an attempt to evaluate a pitcher's performance regardless of the strength of the defensive players behind them. All of the above statistics may be used in certain game situations. For example, a certain hitter's ability to hit left-handed pitchers might incline a manager to increase their opportunities to face left-handed pitchers. Other hitters may have a history of success against a given pitcher (or vice versa), and the manager may use this information to create a favorable match-up. This is often referred to as "playing the percentages". Commonly used statistics Most of these terms also apply to softball. Commonly used statistics with their abbreviations are explained here. The explanations below are for quick reference and do not fully or completely define the statistic; for the strict definition, see the linked article for each statistic. Batting statistics 1B – Single: hits on which the batter reaches first base safely without the contribution of a fielding error 2B – Double: hits on which the batter reaches second base safely without the contribution of a fielding error 3B – Triple: hits on which the batter reaches third base safely without the contribution of a fielding error AB – At bat: plate appearances, not including bases on balls, being hit by pitch, sacrifices, interference, or obstruction AB/HR – At bats per home run: at bats divided by home runs BA – Batting average (also abbreviated AVG): hits divided by at bats (H/AB) BB – Base on balls (also called a "walk"): hitter not swinging at four pitches called out of the strike zone and awarded first base. BABIP – Batting average on balls in play: frequency at which a batter reaches a base after putting the ball in the field of play. Also a pitching category. 
BB/K – Walk-to-strikeout ratio: number of bases on balls divided by number of strikeouts BsR – Base runs: Another run estimator, like runs created EQA – Equivalent average: a player's batting average absent park and league factors FC – Fielder's choice: times reaching base safely because a fielder chose to try for an out on another runner GO/AO – Ground ball fly ball ratio: number of ground ball outs divided by number of fly ball outs GDP or GIDP – Ground into double play: number of ground balls hit that became double plays GPA – Gross production average: 1.8 times on-base percentage plus slugging percentage, divided by four GS – Grand slam: a home run with the bases loaded, resulting in four runs scoring, and four RBIs credited to the batter H – Hit: reaching base because of a batted, fair ball without error by the defense HBP – Hit by pitch: times touched by a pitch and awarded first base as a result HR – Home runs: hits on which the batter successfully touched all four bases, without the contribution of a fielding error HR/H – Home runs per hit: home runs divided by total hits ITPHR – Inside-the-park home run: hits on which the batter successfully touched all four bases, without the contribution of a fielding error or the ball going outside the ball park. IBB – Intentional base on balls: times awarded first base on balls (see BB above) deliberately thrown by the pitcher. Also known as IW (intentional walk). ISO – Isolated power: a hitter's ability to hit for extra bases, calculated by subtracting batting average from slugging percentage K – Strike out (also abbreviated SO): number of times that a third strike is taken or swung at and missed, or bunted foul. Catcher must catch the third strike or batter may attempt to run to first base. 
LOB – Left on base: number of runners neither out nor scored at the end of an inning OBP – On-base percentage: times reached base (H + BB + HBP) divided by at bats plus walks plus hit by pitch plus sacrifice flies (AB + BB + HBP + SF) OPS – On-base plus slugging: on-base percentage plus slugging average PA – Plate appearance: number of completed batting appearances PA/SO – Plate appearances per strikeout: number of times a batter strikes out to their plate appearance R – Runs scored: number of times a player crosses home plate RC – Runs created: an attempt to measure how many runs a player has contributed to their team RP – Runs produced: an attempt to measure how many runs a player has contributed RBI – Run batted in: number of runners who score due to a batter's action, except when the batter grounded into a double play or reached on an error RISP – Runner in scoring position: a breakdown of a batter's batting average with runners in scoring position, which includes runners at second or third base SF – Sacrifice fly: fly balls hit to the outfield which, although caught for an out, allow a baserunner to advance SH – Sacrifice hit: number of sacrifice bunts which allow runners to advance on the basepaths SLG – Slugging percentage: total bases achieved on hits divided by at-bats (TB/AB) TA – Total average: total bases, plus walks, plus hit by pitch, plus steals, minus caught stealing divided by at bats, minus hits, plus caught stealing, plus grounded into double plays [(TB + BB + HBP + SB – CS)/(AB – H + CS + GIDP)] TB – Total bases: one for each single, two for each double, three for each triple, and four for each home run [H + 2B + (2 × 3B) + (3 × HR)] or [1B + (2 × 2B) + (3 × 3B) + (4 × HR)] TOB – Times on base: times reaching base as a result of hits, walks, and hit-by-pitches (H + BB + HBP) XBH – Extra base hits: total hits greater than singles (2B + 3B + HR) Baserunning statistics SB – Stolen base: number of bases advanced by the runner while the ball is in 
the possession of the defense CS – Caught stealing: times tagged out while attempting to steal a base SBA or ATT – Stolen base attempts: total number of times the player has attempted to steal a base (SB+CS) SB% – Stolen base percentage: the percentage of bases stolen successfully. (SB) divided by (SBA) (stolen bases attempted). DI – Defensive Indifference: if the catcher does not attempt to throw out a runner (usually because the base would be insignificant), the runner is not awarded a steal. Scored as a fielder's choice. R – Runs scored: times reached home plate legally and safely UBR – Ultimate base running: a metric that assigns linear weights to every individual baserunning event in order to measure the impact of a player's baserunning skill Pitching statistics BB – Base on balls (also called a "walk"): times pitching four balls, allowing the batter to take first base BB/9 – Bases on balls per 9 innings pitched: base on balls multiplied by nine, divided by innings pitched BF – Total batters faced: opponent team's total plate appearances BK – Balk: number of times pitcher commits an illegal pitching action while in contact with the pitching rubber as judged by umpire, resulting in baserunners advancing one base BS – Blown save: number of times entering the game in a save situation, and being charged the run (earned or not) which eliminates his team's lead CERA – Component ERA: an estimate of a pitcher's ERA based upon the individual components of his statistical line (K, H, 2B, 3B, HR, BB, HBP) CG – Complete game: number of games where player was the only pitcher for their team DICE – Defense-Independent Component ERA: an estimate of a pitcher's ERA based upon the defense-independent components of his statistical line (K, HR, BB, HBP) but which also uses number of outs (IP), which is not defense independent. 
ER – Earned run: number of runs that did not occur as a result of errors or passed balls ERA – Earned run average: total number of earned runs (see "ER" above), multiplied by 9, divided by innings pitched ERA+ – Adjusted ERA+: earned run average adjusted for the ballpark and the league average FIP – Fielding independent pitching: a metric, scaled to resemble an ERA, that focuses on events within the pitcher's control – home runs, walks, and strikeouts – but also uses in its denominator the number of outs the team gets (see IP), which is not entirely within the pitcher's control. xFIP: This variant substitutes a pitcher's own home run percentage with the league average G – Games (AKA "appearances"): number of times a pitcher pitches in a season GF – Games finished: number of games pitched where player was the final pitcher for their team as a relief pitcher GIDP – Double plays induced: number of double play groundouts induced GIDPO – Double play opportunities: number of groundout-induced double play opportunities GIR – Games in relief: games as a non-starting pitcher GO/AO or G/F – Ground ball fly ball ratio: number of ground ball outs divided by number of fly ball outs

Statistics were also kept from leagues such as the National Association of Professional Base Ball Players and the Negro leagues, although the consistency of whether these records were kept, of the standards with respect to which they were calculated, and of their accuracy has varied. Development The practice of keeping records of player achievements was started in the 19th century by Henry Chadwick. Based on his experience with the sport of cricket, Chadwick devised the predecessors to modern-day statistics including batting average, runs scored, and runs allowed. Traditionally, statistics such as batting average (the number of hits divided by the number of at bats) and earned run average (the average number of earned runs allowed by a pitcher per nine innings) have dominated attention in the statistical world of baseball.
However, the recent advent of sabermetrics has created statistics drawing from a greater breadth of player performance measures and playing field variables. Sabermetrics and comparative statistics attempt to provide an improved measure of a player's performance and contributions to his team from year to year, frequently against a statistical performance average. Comprehensive, historical baseball statistics were difficult for the average fan to access until 1951, when researcher Hy Turkin published The Complete Encyclopedia of Baseball. In 1969, Macmillan Publishing printed its first Baseball Encyclopedia, using a computer to compile statistics for the first time. Known as "Big Mac", the encyclopedia became the standard baseball reference until 1988, when Total Baseball was released by Warner Books using more sophisticated technology. The publication of Total Baseball led to the discovery of several "phantom ballplayers", such as Lou Proctor, who did not belong in official record books and were removed.
Some sabermetric statistics have entered the mainstream baseball world that measure a batter's overall performance including on-base plus slugging, commonly referred to as OPS. OPS adds the hitter's on-base percentage (number of times reached base by any means divided by total plate appearances) to their slugging percentage (total bases divided by at-bats). Some argue that the OPS formula is flawed and that more weight should be shifted towards OBP (on-base percentage). The statistic wOBA (weighted on-base average) attempts to correct for this. OPS is also useful when determining a pitcher's level of success. "Opponent on-base plus slugging" (OOPS) is becoming a popular tool to evaluate a pitcher's actual performance. When analyzing a pitcher's statistics, some useful categories include K/9IP (strikeouts per nine innings), K/BB (strikeouts per walk), HR/9 (home runs per nine innings), WHIP (walks plus hits per inning pitched), and OOPS (opponent on-base plus slugging). However, since 2001, more emphasis has been placed on defense-independent pitching statistics, including defense-independent ERA (dERA), in an attempt to evaluate a pitcher's performance regardless of the strength of the defensive players behind them. All of the above statistics may be used in certain game situations. For example, a certain hitter's ability to hit left-handed pitchers might incline a manager to increase their opportunities to face left-handed pitchers. Other hitters may have a history of success against a given pitcher (or vice versa), and the manager may use this information to create a favorable match-up. This is often referred to as "playing the percentages". Commonly used statistics Most of these terms also apply to softball. Commonly used statistics with their abbreviations are explained here. The explanations below are for quick reference and do not fully or completely define the statistic; for the strict definition, see the linked article for each statistic. 
Batting statistics
1B – Single: hits on which the batter reaches first base safely without the contribution of a fielding error
2B – Double: hits on which the batter reaches second base safely without the contribution of a fielding error
3B – Triple: hits on which the batter reaches third base safely without the contribution of a fielding error
AB – At bat: plate appearances, not including bases on balls, being hit by pitch, sacrifices, interference, or obstruction
AB/HR – At bats per home run: at bats divided by home runs
BA – Batting average (also abbreviated AVG): hits divided by at bats (H/AB)
BB – Base on balls (also called a "walk"): times the hitter is awarded first base after not swinging at four pitches called out of the strike zone
BABIP – Batting average on balls in play: frequency at which a batter reaches a base after putting the ball in the field of play. Also a pitching category.
BB/K – Walk-to-strikeout ratio: number of bases on balls divided by number of strikeouts
BsR – Base runs: another run estimator, like runs created
EQA – Equivalent average: a player's batting average absent park and league factors
FC – Fielder's choice: times reaching base safely because a fielder chose to try for an out on another runner
GO/AO – Ground ball fly ball ratio: number of ground ball outs divided by number of fly ball outs
GDP or GIDP – Ground into double play: number of ground balls hit that became double plays
GPA – Gross production average: 1.8 times on-base percentage plus slugging percentage, divided by four
GS – Grand slam: a home run with the bases loaded, resulting in four runs scoring and four RBIs credited to the batter
H – Hit: reaching base because of a batted, fair ball without error by the defense
HBP – Hit by pitch: times touched by a pitch and awarded first base as a result
HR – Home runs: hits on which the batter successfully touched all four bases, without the contribution of a fielding error
HR/H – Home runs per hit: home runs divided by total hits
ITPHR – Inside-the-park home run: hits on which the batter successfully touched all four bases, without the contribution of a fielding error or the ball going outside the ballpark
IBB – Intentional base on balls: times awarded first base on balls (see BB above) deliberately thrown by the pitcher. Also known as IW (intentional walk).
ISO – Isolated power: a hitter's ability to hit for extra bases, calculated by subtracting batting average from slugging percentage
K – Strikeout (also abbreviated SO): number of times that a third strike is taken or swung at and missed, or bunted foul. The catcher must catch the third strike or the batter may attempt to run to first base.
LOB – Left on base: number of runners neither out nor scored at the end of an inning
OBP – On-base percentage: times reached base (H + BB + HBP) divided by at bats plus walks plus hit by pitch plus sacrifice flies (AB + BB + HBP + SF)
OPS – On-base plus slugging: on-base percentage plus slugging average
PA – Plate appearance: number of completed batting appearances
PA/SO – Plate appearances per strikeout: number of plate appearances divided by number of strikeouts
R – Runs scored: number of times a player crosses home plate
RC – Runs created: an attempt to measure how many runs a player has contributed to their team
RP – Runs produced: an attempt to measure how many runs a player has contributed
RBI – Run batted in: number of runners who score due to a batter's action, except when the batter grounded into a double play or reached on an error
RISP – Runner in scoring position: a breakdown of a batter's batting average with runners in scoring position, which includes runners at second or third base
SF – Sacrifice fly: fly balls hit to the outfield which, although caught for an out, allow a baserunner to advance
SH – Sacrifice hit: number of sacrifice bunts which allow runners to advance on the basepaths
SLG – Slugging percentage: total bases achieved on hits divided by at bats (TB/AB)
TA – Total average: total bases, plus walks, plus hit by pitch, plus steals, minus caught stealing, divided by at bats, minus hits, plus caught stealing, plus grounded into double plays [(TB + BB + HBP + SB – CS)/(AB – H + CS + GIDP)]
TB – Total bases: one for each single, two for each double, three for each triple, and four for each home run [H + 2B + (2 × 3B) + (3 × HR)] or [1B + (2 × 2B) + (3 × 3B) + (4 × HR)]
TOB – Times on base: times reaching base as a result of hits, walks, and hit-by-pitches (H + BB + HBP)
XBH – Extra base hits: total hits greater than singles (2B + 3B + HR)

Baserunning statistics
SB – Stolen base: number of bases advanced by the runner while the ball is in the possession of the defense
CS – Caught stealing: times tagged out while attempting to steal a base
SBA or ATT – Stolen base attempts: total number of times the player has attempted to steal a base (SB + CS)
SB% – Stolen base percentage: the percentage of bases stolen successfully, stolen bases (SB) divided by stolen base attempts (SBA)
DI – Defensive indifference: if the catcher does not attempt to throw out a runner (usually because the base would be insignificant), the runner is not awarded a steal. Scored as a fielder's choice.
R – Runs scored: times reached home plate legally and safely
UBR – Ultimate base running: a metric that assigns linear weights to every individual baserunning event in order to measure the impact of a player's baserunning skill

Pitching statistics
BB – Base on balls (also called a "walk"): times pitching four balls, allowing the batter to take first base
BB/9 – Bases on balls per nine innings pitched: bases on balls multiplied by nine, divided by innings pitched
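Several of the rate statistics above are simple functions of the counting statistics. As a minimal illustrative sketch (the function and parameter names are invented for this example, not taken from any baseball library):

```python
def batting_rates(ab, h, bb, hbp, sf, doubles, triples, hr):
    """Derive BA, OBP, SLG, OPS and ISO from the counting stats defined above."""
    singles = h - doubles - triples - hr               # 1B = H - 2B - 3B - HR
    tb = singles + 2 * doubles + 3 * triples + 4 * hr  # total bases (TB)
    ba = h / ab                                        # batting average: H/AB
    obp = (h + bb + hbp) / (ab + bb + hbp + sf)        # on-base percentage
    slg = tb / ab                                      # slugging percentage: TB/AB
    return {"BA": ba, "OBP": obp, "SLG": slg,
            "OPS": obp + slg,                          # on-base plus slugging
            "ISO": slg - ba}                           # isolated power

def sb_pct(sb, cs):
    """Stolen base percentage: SB divided by attempts (SB + CS)."""
    return sb / (sb + cs)
```

For a hypothetical 500-at-bat season with 150 hits (30 doubles, 5 triples, 20 home runs), 60 walks, 5 hit-by-pitches and 5 sacrifice flies, this yields a .300 batting average, a .377 on-base percentage and a .500 slugging percentage.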
Batters will not receive credit for an at bat if their plate appearances end under the following circumstances:
They receive a base on balls (BB).
They are hit by a pitch (HBP).
They hit a sacrifice fly or a sacrifice bunt (also known as a sacrifice hit).
They are awarded first base due to interference or obstruction, usually by the catcher.
They are replaced by another hitter before their at bat is completed, in which case the plate appearance and any related statistics go to the pinch hitter (unless they are replaced with two strikes and their replacement completes a strikeout, in which case the at bat and strikeout are still charged to the first batter).
In addition, if the inning ends while they are still at bat (due to the third out being made by a runner caught stealing, for example), no at bat or plate appearance will result. In this case, the batter will come to bat again in the next inning, though the count will be reset to no balls and no strikes. Put shortly, an at-bat is a specific type of plate appearance in which the batter puts the ball in play intending to get on base. This is why at-bats, and not plate appearances, are used to calculate batting average: plate appearances can end in many outcomes that do not involve the ball being put in play, and batting average specifically measures a batter's contact hitting. Rule 9.02(a)(1) of the official rules of Major League Baseball defines an at bat as: "Number of times batted, except that no time at bat shall be charged when a player: (A) hits a sacrifice bunt or sacrifice fly; (B) is awarded first base on four called balls; (C) is hit by a pitched ball; or (D) is awarded first base because of interference or obstruction[.]"
Examples
An at bat is counted when:
The batter reaches first base on a hit
The batter reaches first base on an error
The batter is called out for any reason other than as part of a sacrifice
There is a fielder's choice
Records
Pete Rose had 14,053 career at bats, the all-time major league and National League record. The American League record is held by Carl Yastrzemski, whose 11,988 career at bats were all in the AL. The single season record is held by Jimmy Rollins, who had 716 at bats in 2007; Willie Wilson, Ichiro Suzuki and Juan Samuel also had more than 700 at bats in a season. Fourteen players share the single-game record of 11 at bats, all in extra-inning games. In games of nine innings or fewer, the record is 7 at bats and has occurred more than 200 times. The team record for most at bats in a single season is 5,781 by the 1997 Boston Red Sox.
At bat as a phrase
"At bat", "up", "up at bat", and "at the plate" are all phrases describing a batter who is facing the pitcher. Note that just because a player is described as being "at bat" in this sense, he will not necessarily be given an at bat in his statistics; the phrase actually signifies a plate appearance (assuming it is eventually completed). This ambiguous terminology is usually clarified by context. To refer explicitly to the technical meaning of "at bat" described above, the term
runs are tabulated as part of a pitcher's statistics. However, earned runs are specially denoted because of their use in calculating a pitcher's earned run average (ERA), the number of earned runs allowed by the pitcher per nine innings pitched (i.e., averaged over a regulation game). Thus, in effect, the pitcher is held personally accountable for earned runs, while the responsibility for unearned runs is shared with the rest of the team. To determine whether a run is earned, the official scorer must reconstruct the inning as it would have occurred without errors or passed balls.
Details
If no errors and no passed balls occur during the inning, all runs scored are automatically earned (responsibility is assigned to the pitcher(s) who allowed each runner to reach base). Also, in some cases, an error can be rendered harmless as the inning progresses. For example, a runner on first base advances to second on a passed ball and the next batter walks. Since the runner would now have been at second anyway, the passed ball no longer has any effect on the earned/unearned calculation. On the other hand, a batter/runner may make his entire circuit around the bases without the aid of an error, yet the run would be counted as unearned if an error prevented the third out from being made before he crossed the plate to score. An error made by the pitcher in fielding at his position is counted the same as an error by any other player.
A run is counted as unearned when:
A batter reaches base on an error (including catcher's interference) that would have retired the batter except for the error, and later scores a run in that inning by any means.
A batter hits a foul fly ball that is dropped by a fielder for an error, extending the at bat, and later scores. In this case, the manner in which the batter reached base becomes irrelevant.
A baserunner remains on base or advances to the next base as the result of an error on a fielder's choice play that would put the baserunner out except for the error, and later scores.
A batter reaches first base on a passed ball (but not a wild pitch) and later scores.
A baserunner scores by any means after the third out would have been made except for an error other than catcher's interference.
A batter or runner advances one or more bases on an error or passed ball (but not a wild pitch) and scores on a play that would otherwise not have provided the opportunity to score.
Under the MLB rule used in extra innings in 2020 and 2021, in which each half-inning starts with the last batter from the previous inning being placed on second base, a run scored by this runner is unearned. If the runner is erased on a fielder's choice which places a batter on base, and the new batter-runner later scores, this would also be an unearned run. This rule was implemented on an interim basis for those two seasons only to reduce the length of extra inning games during the height of the COVID-19 pandemic.
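The earned run average defined above, earned runs per nine innings pitched, is direct to express in code. A minimal sketch (the function name is illustrative, not from any library):

```python
def era(earned_runs, innings_pitched):
    """Earned run average: earned runs allowed, scaled to a nine-inning game.

    Note: partial innings are conventionally written as .1 or .2 (thirds of an
    inning); convert to a true fraction first, e.g. 133.2 IP -> 133 + 2/3.
    """
    return 9 * earned_runs / innings_pitched
```

A pitcher charged with 50 earned runs over 180 innings has an ERA of 2.50; unearned runs are excluded from the numerator entirely, which is why the earned/unearned distinction above matters.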
While the inning is still being played, the second and the second-last scenario can cause a temporary situation where a run has already |
The term "base on balls" distinguishes a walk from the other manners in which a batter can be awarded first base without liability to be put out (e.g., hit by pitch (HBP), catcher's interference). Though a base on balls, catcher's interference, or a batter hit by a pitched ball all result in the batter (and possibly runners on base) being awarded a base, the term "walk" usually refers only to a base on balls, and not the other methods of reaching base without the bat touching the ball. An important difference is that for a hit batter or catcher's interference, the ball is dead and no one may advance unless forced; the ball is live after a walk (see below for details). A batter who draws a base on balls is commonly said to have been "walked" by the pitcher. When the batter is walked, runners advance one base without liability to be put out only if forced to vacate their base to allow the batter to take first base. If a batter draws a walk with the bases loaded, all preceding runners are forced to advance, including the runner on third base who is forced to home plate to score a run; when a run is forced on a walk, the batter is credited with an RBI per rule 9.04.
Receiving a base on balls does not count as a hit or an at bat for a batter but does count as a time on base and a plate appearance. Therefore, a base on balls does not affect a player's batting average, but it can increase his on-base percentage. A hit by pitch is not counted statistically as a walk, though the effect is mostly the same, with the batter receiving a free pass to first base. One exception is that on a HBP, the ball is dead: any runners attempting to steal on the play must return to their original base unless forced to the next base anyway. When a walk occurs, the ball is still live: any runner not forced to advance may nevertheless attempt to advance at his own risk, which might occur on a steal play, passed ball, or wild pitch. Also, because the ball is live when a base on balls occurs, runners forced to advance one base may attempt to advance beyond it, at their own risk, as may the batter-runner himself beyond first base. Rule 6.08 addresses this matter as well. An attempt to advance an additional base beyond the base awarded might occur when ball four is a passed ball or a wild pitch.
History
In early baseball, there was no concept of a "ball." It was created by the NABBP in 1863, originally as a sort of unsportsmanlike-conduct penalty: "Should the pitcher repeatedly fail to deliver to the striker fair balls, for the apparent purpose of delaying the game, or for any other cause, the umpire, after warning him, shall call one ball, and if the pitcher persists in such action, two and three balls; when three balls shall have been called, the striker shall be entitled to the first base; and should any base be occupied at that time, each player occupying them shall be entitled to one base without being put out." Note that this rule in effect gave the pitcher nine balls, since each penalty ball could only be called on a third offense. In 1869 the rule was modified so that only those baserunners forced to advance could advance. From 1871 through 1886, the batter was entitled to call "high" or "low," i.e. above or below the waist; a pitch which failed to conform was "unfair." Certain pitches were defined as automatic balls in 1872: any ball delivered over the batter's head, that hit the ground in front of home plate, was delivered to the opposite side from the batter, or came within one foot of him. In 1880, the National League changed the rules so that eight "unfair balls" instead of nine were required for a walk. In 1884, the National League changed the rules so that six balls were required for a walk. In 1886, the American Association changed the rules so that six balls instead of seven were required for a walk; however, the National League changed the rules so that seven balls were required for a walk instead of six. In 1887, the National League and American Association agreed to abide by some uniform rule changes, including, for the first time, a strike zone which
by other means (such as a base on balls) or advancing further after the hit (such as when a subsequent batter gets a hit) does not increase the player's total bases. The total bases divided by the number of at bats is the player's slugging average. Hank Aaron is the career leader in total bases with 6,856. Stan Musial (6,134), Willie Mays (6,080), and Albert Pujols (6,038) are the only other players with at least 6,000 career total bases.
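Aaron's 6,856 can be checked against the total-bases weighting (one base per single, two per double, three per triple, four per home run), using his commonly published career hit breakdown; the split below (3,771 hits: 624 doubles, 98 triples, 755 home runs) is an assumption taken from standard career records, not from the text above:

```python
hits, doubles, triples, hr = 3771, 624, 98, 755   # Hank Aaron career totals
singles = hits - doubles - triples - hr           # singles = H - 2B - 3B - HR
total_bases = singles + 2 * doubles + 3 * triples + 4 * hr
print(total_bases)  # 6856
```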
be awarded first unless he made no attempt to avoid it (and he had an opportunity to avoid it). A batter hit by a pitch is not credited with a hit or at bat, but is credited with a time on base and a plate appearance; therefore, being hit by a pitch does not increase or decrease a player's batting average but does increase his on-base percentage. A batter hit by a pitch with the bases loaded is also credited with an RBI per MLB rule 10.04(a)(2). A pitch ruled a hit by pitch is recorded as a ball in the pitcher's pitch count, since by definition the ball must be outside the strike zone and not have been swung at. The rule awarding first base to a batter hit by a pitch was instituted in 1887.
Tactical use
Inside pitching is a common and legal tactic in baseball, and many players make use of brushback pitches, or pitches aimed underneath the chin, commonly referred to as "chin music", to keep players away from the plate. "Headhunter" is a common term for pitchers who have a reputation for throwing these kinds of pitches. However, throwing at a batter intentionally is illegal, and can be very dangerous. When an umpire suspects that a pitcher has thrown at a batter intentionally, but is not certain, a warning is issued to the pitcher and the managers of both teams. From that point on, any pitch thrown at a batter can cause the pitcher and the manager of the offending team to be ejected immediately from the game. Serious offenses such as a ball thrown at the head (called a beanball) can result in the immediate ejection of the pitcher, and the manager if he ordered the beanball, even without a warning. If the umpire is certain that the pitcher intentionally hit the batter with the pitch, the pitcher is ejected from the game with no warning. This infamously happened on 15 August 2018, when José Ureña was ejected from a game against the Atlanta Braves after hitting Ronald Acuña Jr.
on the elbow with the first pitch of the game, which led to the Braves' and Marlins' benches clearing. Occasionally, if a player is acting rude or unsportsmanlike, or having an extraordinarily good day, the pitcher may intentionally hit the batter, disguising it as a pitch that accidentally slipped his control. Managers may also order a pitcher to throw such a pitch (sometimes called a "plunking"). These pitches are typically aimed at the lower back and slower than normal, designed to send a message more than anything else. The opposing team usually hits a batter in retaliation for this act. The plunkings generally end there because of umpire warnings, but in some cases things can get out of hand, and sometimes they lead to the batter charging the mound, bench-clearing brawls, and several ejections.
Records
The all-time record for a player being hit by a pitch is held by Hughie Jennings, who was hit by 287 pitches between 1891 and 1903. The modern-era record is held by Craig Biggio of the Houston Astros, who had 285 when he retired at the end of the 2007 season. Prior to Biggio, the modern-era record belonged to Don Baylor, who was hit 267 times. The all-time single-season record also belongs to Jennings, who was hit 51 times during the 1896 season. Ron Hunt of the 1971 Montreal Expos was hit 50 times during that year, the modern-era record. The single-game record is three, held by numerous players. The all-time record for pitchers is held by Gus Weyhing with 277 (1887–1901). The modern-era career pitching record for most hit batsmen is 205 by Hall-of-Famer Walter Johnson. The season record is 54 by Phil Knell in 1891, and the game record is six, held by Ed Knouff and John Grimes. Brady Anderson was the first player to be hit by a pitch twice in the same inning in an American League game. On April 25, 2014, Brandon Moss became the second when he was hit twice in the top of the 9th inning by Houston Astros pitchers. Five players have been hit by a pitch twice in the same inning in the National League. On September 1, 2021, Austin Adams became the first pitcher to hit 20 or more batters in a season with 120 or fewer innings pitched; Ed Doheny hit batters 22 times in 133.2 innings in 1900. Twice a perfect game has been broken up by the 27th batter being hit by a pitch; Hooks Wiltse and Max Scherzer share this rare feat, and both finished with no-hitters after the hit by pitch.
Scherzer's team was leading 6-0 when he pitched his no-hitter, but Wiltse's team was scoreless through 9; he pitched a 10-inning 1–0 no-hitter. The record for most hit batters in a no-hitter is three, held by Chris Heston of the San Francisco Giants for his 2015 effort against the New York Mets. Postseason career records are held by Greg Maddux and Tim Wakefield—each of whom hit 9 batters—and Shane Victorino, who was hit by pitch 11 times.
Dangers
One major league player died as a result of being struck by a pitch: Ray Chapman of the Cleveland Indians was hit in the head by Carl Mays on August 16, 1920, and died the next morning. Serious injuries may result from being hit by a pitch, even when wearing a batting helmet. On August 18, 1967, Red Sox batter Tony Conigliaro was hit almost directly in the left eye by a fastball thrown by Jack Hamilton of the California Angels. His cheekbone was shattered; he nearly lost the sight of the eye,
or tag first base while carrying the ball. The hit is scored the moment the batter reaches first base safely; if he is put out while attempting to stretch his hit to a double or triple or home run on the same play, he still gets credit for a hit (according to the last base he reached safely on the play). If a batter reaches first base because of offensive interference by a preceding runner (including if a preceding runner is hit by a batted ball), he is also credited with a hit.
Types of hits
A hit for one base is called a single, for two bases a double, and for three bases a triple. A home run is also scored as a hit. Doubles, triples, and home runs are also called extra base hits. An "infield hit" is a hit where the ball does not leave the infield. Infield hits are uncommon by nature, and most often earned by speedy runners.
Pitching a no-hitter
A no-hitter is a game in which one of the teams prevented the other from getting a hit. Throwing a no-hitter is rare and considered an extraordinary accomplishment for a pitcher or pitching staff. In most cases in the professional game, no-hitters are accomplished by a single pitcher who throws a complete game. A pitcher who throws a no-hitter could still allow runners to reach base safely, by way of walks, errors, hit batsmen, or a batter reaching base due to interference or obstruction. If the pitcher allows no runners to reach base in any manner whatsoever (hit, walk, hit batsman, error, etc.), the no-hitter is a perfect game.
History
In 1887, Major League Baseball counted bases on balls (walks) as hits. The result was skyrocketing batting averages, including some near .500; Tip O'Neill of the St. Louis Browns batted .485 that season, which would still be a major league record if recognized. The experiment was abandoned the following season. There is controversy regarding how the records of 1887 should be interpreted.
The number of legitimate walks and at-bats are known for all players that year, so computing averages using the same method as in other years is straightforward. In 1968, Major League Baseball formed a Special Baseball Records Committee to resolve this (and other) issues. The Committee ruled that walks in 1887 should not be counted as hits. In 2000, Major League Baseball reversed its decision, ruling that the statistics which were recognized in each year's official records should stand, even in cases where they were later proven incorrect. Most current sources list O'Neill's 1887 average as .435, as calculated by omitting his walks. He would retain his American Association batting championship. However, the variance between methods results in differing recognition for the 1887 National League batting champion. Cap Anson would be recognized, with his .421 average, if walks are included, but Sam Thompson would be the champion at .372 if they are not.
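The two O'Neill figures can be reproduced from his commonly cited 1887 totals (225 hits and 50 walks in 517 at bats; these counting stats are an assumption taken from standard records, not stated above). Under the 1887 rule, each walk counts both as a hit and as an at bat:

```python
hits, walks, at_bats = 225, 50, 517                # Tip O'Neill, 1887
modern_avg = hits / at_bats                        # walks excluded (the modern method)
avg_1887 = (hits + walks) / (at_bats + walks)      # walks counted as hits and at bats
print(round(modern_avg, 3), round(avg_1887, 3))    # 0.435 0.485
```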
Major League Baseball rules
The official rulebook of Major League Baseball states in Rule 10.05:
(a) The official scorer shall credit a batter with a base hit when:
(1) the batter reaches first base (or any succeeding base) safely on a fair ball that settles on the ground, that touches a fence before being touched by a fielder or that clears a fence;
(2) the batter reaches first base safely on a fair ball hit with such force, or so slowly, that any fielder attempting to make a play with the ball has no opportunity to do so;
Rule 10.05(a)(2) Comment: The official scorer shall credit a hit if the fielder attempting to handle the ball cannot make a play, even if such fielder deflects the ball from or cuts off another fielder who could have put out a runner.
(3) the batter reaches first base safely on a fair ball that takes an unnatural bounce so that a fielder cannot handle it with ordinary effort, or that touches the pitcher's plate or any base (including home plate) before being touched by a fielder and bounces so that a fielder cannot handle the ball with ordinary effort;
(4) the batter reaches first base safely on a fair ball that has not been touched by a fielder and that is in fair territory when the ball reaches the outfield, unless in the scorer's judgment the ball could have been handled with ordinary effort;
(5) a fair ball that has not been touched by a fielder touches a runner or an umpire, unless a runner is called out for having been touched by an Infield Fly, in which case the official scorer shall not score a hit; or
(6) a fielder unsuccessfully attempts to put out a preceding runner and, in the official scorer's judgment, the batter-runner would not have been put out at first base by ordinary effort.
Rule 10.05(a) Comment: In applying Rule 10.05(a), the official scorer shall always give the batter the benefit of the doubt. A safe course for the official scorer to follow is to score a hit when exceptionally good fielding of a ball fails to result in a putout.
(b) The official scorer shall not credit a base hit when a:
(1) runner is forced out by a batted ball, or would have been forced out except for a fielding error;
(2) batter apparently hits safely and a runner who is forced to advance by reason of the batter becoming a runner fails to touch the first base to which such runner is advancing and is called
In 1954, Branch Rickey, then general manager of the Pittsburgh Pirates, was featured in a Life Magazine graphic in which the formula for on-base percentage was shown as the first component of an all-encompassing "offense" equation. However, it was not named as on-base percentage, and there is little evidence that Roth's statistic was taken seriously at the time by the baseball community at large. On-base percentage became an official MLB statistic in 1984. Since then, and especially since the publication of the influential 2003 book Moneyball, which prominently highlighted Oakland Athletics general manager Billy Beane's focus on the statistic, on-base percentage has seen its perceived importance as a measure of offensive success increase dramatically. Many baseball observers, particularly those influenced by the field of sabermetrics, now consider on-base percentage superior to the traditional measure of offensive skill, batting average. This is chiefly because, while both statistics count a batter's hits, batting average ignores the rate at which a batter also avoids making an out by reaching base via bases on balls, or "walks," whereas on-base percentage counts it.
Overview
Traditionally, players with the best on-base percentages bat as leadoff hitter, unless they are power hitters, who traditionally bat slightly lower in the batting order. The league average for on-base percentage in Major League Baseball has varied considerably over time; at its peak in the late 1990s, it was around .340, whereas it was typically .300 during the dead-ball era. On-base percentage can also vary quite considerably from player to player. The highest career OBP of a batter with more than 3,000 plate appearances is .482, by Ted Williams. The lowest is .194, by Bill Bergen.
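The passage above contrasts what the two statistics count without stating the formula. The official modern definition puts hits, walks, and hit-by-pitches in the numerator, and adds sacrifice flies to the denominator alongside at-bats, walks, and hit-by-pitches:

```python
def on_base_percentage(hits, walks, hbp, at_bats, sac_flies):
    """OBP = (H + BB + HBP) / (AB + BB + HBP + SF), rounded to three places."""
    return round((hits + walks + hbp) / (at_bats + walks + hbp + sac_flies), 3)

# Hypothetical season line: 150 hits, 60 walks, 5 HBP, 500 AB, 5 sacrifice flies.
obp = on_base_percentage(150, 60, 5, 500, 5)  # .377
```

Note how a walk raises OBP but leaves batting average untouched, which is the asymmetry the text describes.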
The batter is credited with a sacrifice fly, as well as a second RBI if a runner on third also scores. At the professional level this will typically occur only in unusual circumstances that prevent the defense from making an immediate throw back to the infield, such as an outfielder colliding with the wall while making a catch on the warning track. The sacrifice fly is credited even if another runner is put out so long as the run scores. The sacrifice fly is credited on a dropped ball even if another runner is forced out by reason of the batter becoming a runner. On any fly ball, a runner can initiate an attempt to advance bases as soon as a fielder touches the ball by tagging up, even before the fielder has full control of the ball.
Records
The most sacrifice flies by a team in one game in Major League Baseball (MLB) is five; the record was established by the Seattle Mariners in 1988, tied by the Colorado Rockies in 2006, and tied again by the Mariners in 2008. Three sacrifice flies in an inning has been accomplished five times in MLB: by the Chicago White Sox (fifth inning, July 1, 1962 against the Cleveland Indians); twice by the New York Yankees (fourth inning, June 29, 2000 against the Detroit Tigers and third inning, August 19, 2000 against the Anaheim Angels); by the New York Mets (second inning, June 24, 2005 against the Yankees); and by the Houston Astros (seventh inning, June 26, 2005 against the Texas Rangers). In these cases one or more of the flies did not result in a putout due to an error. Since the rule was reinstated in its present form in MLB in 1954, Gil Hodges of the Dodgers holds the record for most sacrifice flies in one season with 19, in 1954; Eddie Murray holds the MLB record for most sacrifice flies in a career with 128. As of the end of the 2021 Major League Baseball season, the ten players who had hit the most sacrifice flies were as follows: Eddie Murray (128), Cal Ripken, Jr. (127), Robin Yount (123), Hank Aaron (121), Frank Thomas (121), George Brett (120), Rubén Sierra (120), Rafael Palmeiro (119), Rusty Staub (119), Andre Dawson (118). Only once has the World Series been won on a sac fly. In 1912, Larry Gardner of the Boston Red Sox hit a fly ball off a pitch from the New York Giants' Christy Mathewson. Steve Yerkes tagged up and scored from third base to win game 8 in the tenth inning and take the series for the Red Sox.
History
Batters have not been charged with a time at-bat for a sacrifice hit since 1893, but baseball has changed the sacrifice fly rule multiple times. The sacrifice fly as a statistical category was instituted in 1908, only to be discontinued in 1931. The rule was again adopted in 1939, only to be eliminated again in 1940, before being adopted for the last time in 1954. For some baseball fans, it is significant that the sacrifice-fly rule was eliminated in 1940 because, in 1941, Ted Williams was hitting .39955 on the last day of the season and needed one hit in a doubleheader against the Philadelphia A's to become the first hitter since Bill Terry in 1930 to hit .400. He got six hits, finishing with an official .406 average; he remains the last major leaguer to bat .400 or more. In his book, Baseball in '41, author Robert Creamer, citing estimates, points out that if Williams' 14 at-bats on sacrifice flies that year were deducted from the 456 official at-bats he was charged with, his final average in 1941 would have been .419.
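Creamer's estimate above is easy to check. The sketch below assumes Williams' well-known 1941 line of 185 hits in 456 official at-bats; the hit total is from the historical record, not stated in the text.

```python
def avg(hits, at_bats):
    """Batting average rounded to the conventional three places."""
    return round(hits / at_bats, 3)

# Williams, 1941: 185 hits, 456 official at-bats.
official = avg(185, 456)        # .406
# Creamer's estimate: remove the 14 at-bats charged on sacrifice flies.
adjusted = avg(185, 456 - 14)   # .419
```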
and broadcasters picked it up. The popularity of OPS gradually spread, and by 2004 it began appearing on Topps baseball cards. OPS was formerly sometimes known as production. For instance, production was included in early versions of Thorn's Total Baseball encyclopedia, and in the Strat-O-Matic Computer Baseball game. This term has fallen out of use. OPS gained popularity because of the ready availability of its components, OBP and SLG, and because team OPS correlates well with team runs scored.
An OPS scale
Bill James, in his essay titled "The 96 Families of Hitters", uses seven different categories for classification by OPS, effectively transforming OPS into a seven-point ordinal scale. Substituting quality labels such as excellent (A), very good (B), good (C), average (D), fair (E), poor (F) and very poor (G) for the A–G categories creates a subjective reference for OPS values.
Leaders
The top ten Major League Baseball players in lifetime OPS, with at least 3,000 plate appearances through August 5, 2020, were:
Babe Ruth, 1.1636
Ted Williams, 1.1155
Lou Gehrig, 1.0798
Barry Bonds, 1.0512
Jimmie Foxx, 1.0376
Hank Greenberg, 1.0169
Rogers Hornsby, 1.0103
Mike Trout, 1.0009
Manny Ramirez, 0.9960
Mark McGwire, 0.9823
The top four were all left-handed batters. Jimmie Foxx has the highest career OPS for a right-handed batter. The top ten single-season performances in MLB are (all left-handed hitters):
Barry Bonds, 1.4217
Barry Bonds, 1.3807
Babe Ruth, 1.3791
Barry Bonds, 1.3785
Babe Ruth, 1.3586
Babe Ruth, 1.3089
Ted Williams, 1.2875
Barry Bonds, 1.2778
Babe Ruth, 1.2582
Ted Williams, 1.2566
The highest single-season mark for a right-handed hitter was 1.2449 by Rogers Hornsby, 13th on the all-time list. Since 1935, the highest single-season OPS for a right-hander is 1.2224 by Mark McGwire, which was 16th all-time.
Adjusted OPS (OPS+)
OPS+, adjusted OPS, is a closely related statistic.
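The A-G scale above can be sketched in a few lines. OPS itself is simply OBP plus SLG; the cutoffs used here are the commonly reproduced boundaries of James's seven families, so treat them as illustrative rather than authoritative.

```python
def ops(obp, slg):
    """On-base plus slugging."""
    return obp + slg

# Commonly reproduced boundaries of James's A-G families (illustrative).
CUTOFFS = [(0.9000, "A"), (0.8334, "B"), (0.7667, "C"),
           (0.7000, "D"), (0.6334, "E"), (0.5667, "F")]

def ops_family(value):
    """Map an OPS value onto the seven-point A-G ordinal scale."""
    for floor, label in CUTOFFS:
        if value >= floor:
            return label
    return "G"  # very poor: below the lowest cutoff
```

For example, every lifetime OPS in the top-ten list above falls in family A.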
OPS+ is OPS adjusted for the park and the league in which the player played, but not for fielding position. An OPS+ of 100 is defined to be the league average. An OPS+ of 150 or more is excellent and 125 very good, while an OPS+ of 75 or below is poor. The basic equation for OPS+ is
OPS+ = 100 * (OBP/*lgOBP + SLG/*lgSLG - 1)
where *lgOBP is the park-adjusted OBP of the league (not counting pitchers hitting) and *lgSLG is the park-adjusted SLG of the league. A common misconception is that OPS+ closely matches the ratio of a player's OPS to that of their league. In fact, due to the additive nature of the two components in OPS+, a player with an OBP and SLG both 50% better than league average in those metrics will have an OPS+ of 200 (twice the league average OPS+) while still having an OPS that is only 50% better than the average OPS of the league. It would be a better (although not exact) approximation to say that a player with an OPS+ of 150 produces 50% more runs, in a given set of plate appearances, than a player with an OPS+ of 100 (though see clarification above, under "History").
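The additive behavior described above can be sketched directly. The league figures below are hypothetical, chosen for exact arithmetic, and reproduce the OPS+-of-200 example from the text:

```python
def ops_plus(obp, slg, lg_obp, lg_slg):
    """OPS+ = 100 * (OBP/lgOBP + SLG/lgSLG - 1).

    lg_obp and lg_slg are the park-adjusted league figures.
    """
    return round(100 * (obp / lg_obp + slg / lg_slg - 1))

# Hypothetical league: *lgOBP = 0.250, *lgSLG = 0.500.
# A hitter 50% better than league average in both components
# scores an OPS+ of 200 (twice the league-average 100), even though
# his raw OPS (1.125) is only 50% above the league OPS (0.750).
star = ops_plus(0.375, 0.750, 0.250, 0.500)     # 200
average = ops_plus(0.250, 0.500, 0.250, 0.500)  # 100
```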
Leaders in OPS+
Through the end of the 2019 season, the career top twenty leaders in OPS+ (minimum 3,000 plate appearances) were:
Babe Ruth, 206
Ted Williams, 190
Barry Bonds, 182
Lou Gehrig, 179
Mike Trout, 176
Rogers Hornsby, 175
Mickey Mantle, 172
Dan Brouthers, 171
Joe Jackson, 170
Ty Cobb, 168
Pete Browning, 163
Jimmie Foxx, 163
Mark McGwire, 163
Dave Orr, 162
Stan Musial, 159
Hank Greenberg, 158
Johnny Mize, 158
Tris Speaker, 157
Dick Allen, 156
Willie Mays, 156
Frank Thomas, 156
The only purely right-handed batters to appear on this list are Browning, Hornsby, Foxx, Trout, McGwire, Allen, Mays, and Thomas. Mantle is the only switch-hitter in the group.
The highest single-season performances were:
Barry Bonds, 268
Barry Bonds, 263
Barry Bonds, 259
Fred Dunlap, 258 (1884) *
Babe Ruth, 256
Babe Ruth, 239
Babe Ruth, 239
Ted Williams, 235
Ted Williams, 233
Ross Barnes, 231 (1876) **
Barry Bonds, 231
* Fred Dunlap's historic 1884 season came in the Union Association, which some baseball experts consider not to be a true major league.
** Ross Barnes may have been aided by a rule that made a bunt fair if it first rolled in fair territory. He did not play nearly so well when this rule was removed, although injuries may have been mostly to blame, as his fielding statistics similarly declined.
If Dunlap's and Barnes' seasons were to be eliminated from the list, two other Ruth seasons
to swing, it becomes a pure steal attempt. In the delayed steal, the runner does not take advantage of the pitcher's duty to complete a pitch, but relies on surprise and takes advantage of any complacency by the fielders. The runner gives the impression he is not trying to steal, and does not break for the next base until the ball crosses the plate. It is rare for Major League defenses to be fooled, but the play is used effectively at the college level. The first delayed steal on record was performed by Miller Huggins in 1903. The delayed steal was famously practiced by Eddie Stanky of the Brooklyn Dodgers. Second base is the base most often stolen, because once a runner is on second base he is considered to be in scoring position, meaning that he is expected to be able to run home and score on most routine singles hit into the outfield. Second base is also the easiest to steal, as it is farthest from home plate and thus a longer throw from the catcher is required to prevent it. Third base is a shorter throw for the catcher, but the runner is able to take a longer lead off second base and can leave for third base earlier against a left-handed pitcher. A steal of home plate is the riskiest, as the catcher only needs to tag out the runner after receiving the ball from the pitcher. It is difficult for the runner to cover the distance between the bases before the ball arrives home. Ty Cobb holds the records for most steals of home in a single season (8) as well as for a career (54). Steals of home are not officially recorded statistics, and must be researched through individual game accounts. Thus Cobb's totals may be even greater than is recorded. Jackie Robinson famously stole home in Game 1 of the 1955 World Series. Thirty-five games have ended with a runner stealing home, but only two have occurred since 1980. In a variation on the steal of home, the batter is signaled to simultaneously execute a sacrifice bunt, which results in the squeeze play. 
The suicide squeeze is a squeeze in which the runner on third begins to steal home without seeing the outcome of the bunt; it is so named because if the batter fails to bunt, the runner will surely be out. In contrast, when the runner on third does not commit until seeing that the ball is bunted advantageously, it is called a safety squeeze. In more recent years, most steals of home involve a delayed double steal, in which a runner on first attempts to steal second, while the runner on third breaks for home as soon as the catcher throws to second base. If it is important to prevent the run from scoring, the catcher may hold on to the ball (conceding the steal of second) or may throw to the pitcher; this may deceive the runner at third and the pitcher may throw back to the catcher for the out. Statistics In baseball statistics, stolen bases are denoted by SB. Attempts to steal that result in the baserunner being out are caught stealing (CS). The sum of these statistics is steal attempts. Successful steals as a percentage of total steal attempts is called the success rate. The rule on stolen bases states that: Advances that are credited to some other play are not steal attempts. For example, on a wild pitch or a passed ball, the official scorer must notice whether the runner broke for the next base before the pitch got away. As usual, statistics in the case of a defensive error are based on error-free play. If a runner would have been out, but for the error, it is scored as "caught stealing, safe on the error." A catcher does not commit an error by throwing poorly to the destination base, but if any runner takes an extra base on the bad throw, it is "stolen base plus error." There is no steal attempt on a dead ball, whether the runner is sent back to the original base (as on a foul ball) or is awarded the next base (as on a hit batsman). On a base award when the ball is live (such as a walk), the runner could make a steal attempt beyond the base awarded. 
Cases where the defense intentionally allows the runner to advance without attempting to put him out are scored as defensive indifference, also called fielder's indifference, and do not count as stolen bases. This is usually only scored late in games when it is clear that the defense's priority is getting the batter out. The lack of a putout attempt does not by itself indicate defensive indifference; the official scorer must also factor in the game situation and the defensive players' actions. Relative skill at stealing bases can be judged by evaluating either a player's total number of steals or the success rate. Noted statistician Bill James has argued that unless a player has a high success rate (67-70% or better), attempting to steal a base is detrimental to a team. Comparing skill against players from other eras is problematic, because the definition has not been constant. Caught stealing was not recorded regularly until the middle of the 20th century. Ty Cobb, for example, was known as a great base-stealer, with 892 steals and a success rate of over 83%. However, the data on Cobb's caught stealing is missing from 12 seasons, strongly suggesting he was unsuccessful many more times than his stats indicate. Carlos Beltrán, with 286 steals, has the highest career success rate of all players with over 300 stolen base attempts, at 88.3%. Evolution of rules and scoring The first mention of the stolen base as a statistic was in the 1877 scoring rules adopted by the National League, which noted credit toward a player's total bases when a base is stolen. It was not until 1886 that the stolen base appeared as something to be tracked, but was only to "appear in the summary of the game". 
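The success-rate arithmetic above is straightforward; the sketch below also expresses James's break-even argument as a configurable threshold, with the 0.70 default taken from the top of the 67-70% range quoted in the text.

```python
def steal_success_rate(sb, cs):
    """Successful steals as a share of all attempts: SB / (SB + CS)."""
    return sb / (sb + cs)

def stealing_helps(sb, cs, break_even=0.70):
    # Bill James's argument: below roughly a 67-70% success rate,
    # attempting to steal costs a team more than it gains.
    return steal_success_rate(sb, cs) >= break_even

# A hypothetical 90-for-100 base stealer clears the bar; 60-for-100 does not.
```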
In 1887, the stolen base was given its own individual statistical column in the box score, and was defined for purposes of scoring: "...every base made after first base has been reached by a base runner, except for those made by reason of or with the aid of a battery error (wild pitch or passed ball), or by batting, balks or by being forced off. In short, shall include all bases made by a clean steal, or through a wild throw or muff of the ball by a fielder who is directly trying to put the base runner out while attempting to steal." The next year, it was clarified that any attempt to steal must be credited to the runner, and that fielders committing errors during this play must also be charged with an error. This rule also clarified that advancement of another base(s) beyond the one being stolen is not credited as a stolen base on the same play, and that an error is charged to the fielder who permitted the extra advancement. There was clarification that a runner is credited with a steal if the attempt began before a battery error. Finally, batters were credited with a stolen base if they were tagged out after over running the base. In 1892, a rule credited runners with stolen bases if a base runner advanced on a fly out, or if they advanced more than one base on any safe hit or attempted out, providing an attempt was made by the defense to put the runner out. The rule was rescinded in 1897. In 1898, stolen base scoring was narrowed to no longer include advancement in the event of a fielding error, or advancement caused by a hit batsman. 1904 saw an attempt to reduce the already wordy slew of rules governing stolen bases, with the stolen base now credited when "the advances a base unaided by a base hit, a put out, (or) a fielding or batter error." 1910 saw the first addressing of the double and triple steal attempts. Under the new rule, when any runner is thrown out, and the other(s) are successful, the successful runners will not be credited with a stolen base. 
Without using the term, 1920 saw the first rule that would be referred to today as defensive indifference, as stolen bases would not be credited, unless an effort was made to stop the runner by the defense. This is usually called if such is attempted in the ninth inning while that player's team is trailing, unless the runner represents the potential tying run. 1931 saw a further narrowing of the criteria for awarding a stolen base. Power was given to the official scorer, in the event of a muff by the catcher in throwing, that in the judgment of the scorer the runner would have been out, to credit the catcher with an error, and not credit the runner with a stolen base. Further, any successful steal on a play resulting in a wild pitch, passed ball, or balk would no longer be credited as a steal, even if the runner had started to steal before the play. One of the largest rewrites to the rules in history came in 1950. The stolen base was specifically to be credited "to a runner whenever he advances one base unaided by a base hit, a putout, a forceout, a fielder's choice, a passed ball, a wild pitch, or a balk." There were noted exceptions, such as denying a stolen base to an otherwise successful steal as a part of a double or triple steal, if one other runner was thrown out in the process. A stolen base would be awarded to runners who successfully stole second base as a part of a double steal with a man on third, if the other runner failed to steal home, but instead was able to return safely to third base. Runners who are tagged out oversliding the base after an otherwise successful steal would not be credited with a stolen base. Indifference was also credited as an exception. Runners would now be credited with stolen bases if they had begun the act of stealing, and the resulting pitch was wild, or a passed ball. 
Finally, for 1950 only, runners would be credited with a stolen base if they were "well advanced" toward the base they were attempting to steal, and the pitcher is charged with a balk, with the further exception of a player attempting to steal, who would otherwise have been forced to advance on the balk by a runner behind them. This rule was removed in 1951. A clarification came in 1955 that awarded a stolen base to a runner even if he became involved in a rundown, provided he evaded the rundown and advanced to the base he intended to steal. The criteria for "caught stealing" were fine-tuned
is put out on the basepaths for the third out in a way other than by the batter putting the ball into play (i.e., picked off, caught stealing). In this case, the same batter continues his turn batting in the next inning with no balls or strikes against him. A batter is not credited with a plate appearance if, while batting, the game ends as the winning run scores from third base on a balk, stolen base, wild pitch or passed ball. A batter may or may not be credited with a plate appearance (and possibly at bat) in the rare instance when he is replaced by a pinch hitter after having already started his turn at bat. Under Rule 9.15(b), the pinch hitter would receive the plate appearance (and potential of an at-bat) unless the original batter is replaced when having 2 strikes against him and the pinch hitter subsequently completes the strikeout, in which case the plate appearance and at-bat are charged to the first batter. Relation to at bat Under Official Baseball Rule 9.02(a)(1), an at bat results from a completed plate appearance, unless the batter: hits a sacrifice bunt or sacrifice fly; or is awarded first base on four called balls; or is hit by a pitched ball; or is awarded first base because of interference or obstruction. In common terminology, the term "at bat" is sometimes used to mean "plate appearance" (for example, "he fouled off the ball to keep the at bat alive"). The intent is usually clear from the context, although the term "official at bat" is sometimes used to explicitly refer to an at bat as distinguished from a plate appearance. However, terms such as turn at bat or time at bat are synonymous with plate appearance. "Time at bat" in the rulebook Official Baseball Rule 5.06(c) provides that "[a] batter has legally completed his time at bat when he is put out or becomes a runner" (emphasis added). 
The "time at bat" defined in this rule is more commonly referred to as a plate appearance, and the playing rules (Rules 1 through 8) use the phrase "time at bat" in this sense (e.g. Rule 5.04(a)(3), which states that "[t]he first batter in each inning after the first inning shall be the player whose name follows that of the last player who legally completed his time at bat in the preceding inning" (emphasis added)). In contrast, the scoring rules use the phrase "time at bat" to refer to the statistic at bat, defined in Rule 9.02(a)(1), but sometimes use the phrase "official time at bat" or refer back to Rule 9.02(a)(1) when mentioning the statistic. The phrase "plate appearance" is used in Rules 9.22 and 9.23 dealing with batting titles and hitting streaks, and in Rule 5.10(g) Comment | average. And suppose Player B, with 490 plate appearances and 400 at bats, gets 110 hits during the season and finishes the season with a .275 batting average. Player B, even though he had the same number of at bats as Player A and even though his batting average is higher, will not be eligible for season-ending rankings because he did not accumulate the required 502 plate appearances, while Player A did and therefore will be eligible. Exception for batting titles Rule 9.22(a) of the Official Baseball Rules makes a single allowance to the minimum requirement of 502 plate appearances for the purposes of determining the batting, slugging or on-base percentage title. If a player: leads the league in one of the statistics; does not have the required 502 plate appearances; and would still lead the league in that statistic if as many at bats (without hits or reaching base) were added to his records as necessary to meet the requirement, he will win that title, but with his original statistic (before the extra at bats were added).
In the example above, Player B is 12 plate appearances short of the required 502, but were he charged with 12 additional at bats, he would go 110-for-412 for a batting average of .267. If no one else has a batting average (similarly modified if appropriate) higher than .267, Player B will be awarded the batting title (with his original batting average of .275) despite the lack of 502 plate appearances. In a real-life example, in 2012, Melky Cabrera, then of the San Francisco Giants, finished the season with a league-high .346 batting average, but he had only 501 plate appearances, one short of the required 502. Per the rule, he would have won the batting title because after an extra at bat is added and his batting average recalculated, he still would have led the league in batting average. Cabrera's case, however, turned out differently. Cabrera finished the season with only 501 plate appearances because he was suspended in mid-August after testing positive for illegal performance-enhancing drugs. Cabrera was still eligible for that extra plate appearance, but he requested that the extra plate appearance not be added to his total, and that he not be considered for the batting crown, because he admitted that his use of performance-enhancing drugs had given him an unfair advantage over other players. As a result, Cabrera's name is nowhere to be found on the list of 2012 National League batting leaders.
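The relationship between plate appearances and at bats, and the Rule 9.22(a) adjustment, can be sketched with the worked numbers above. This is an illustrative sketch, not an official scoring implementation; the outcome labels and function names are hypothetical.

```python
# Rule 9.02(a)(1): every completed plate appearance is an at bat unless
# it falls into one of these excluded outcomes (labels are hypothetical).
NOT_AN_AT_BAT = {
    "sacrifice_bunt", "sacrifice_fly",  # sacrifices
    "walk",                             # four called balls
    "hit_by_pitch",                     # hit by a pitched ball
    "interference", "obstruction",      # awarded first base
}

def is_at_bat(outcome: str) -> bool:
    return outcome not in NOT_AN_AT_BAT

# Rule 9.22(a): a player short of 502 plate appearances may still win the
# batting title if hitless at bats covering the shortfall leave him in the lead.
PA_REQUIRED = 502

def wins_batting_title(hits, at_bats, plate_appearances, best_other_avg):
    shortfall = max(0, PA_REQUIRED - plate_appearances)
    adjusted_avg = hits / (at_bats + shortfall)
    return adjusted_avg > best_other_avg

# Player B from the text: 110 hits, 400 at bats, 490 PA (12 short).
print(round(110 / (400 + 12), 3))  # 0.267, as in the example
```

Note that the title, if won, is recorded with the original average (.275 for Player B); the adjusted figure is used only for the comparison.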
the game is contested. Baseball In baseball, the statistic applies to players who, prior to a game, are included on a starting lineup card or are announced as an ex ante substitute, whether or not they play, although, in Major League Baseball, the application of this statistic doesn't extend to consecutive games played streaks. A starting pitcher, then, may be credited with a game played even as he is not credited with a game started or an inning pitched. For pitchers only, the statistic games pitched is used. Association football In association football, a game played is counted if a player is in the Starting XI, or if they come off the bench before full-time. See also
Bronx, one of the five boroughs of New York City. The group pioneered the fusion of dancehall reggae and hip hop music, and their debut LP Criminal Minded contained frank descriptions of life in the South Bronx during the late 1980s, thus setting the stage for what would eventually become gangsta rap. Members BDP's membership changed throughout its existence, the only constant being KRS-One. The group was founded by KRS-One and DJ Scott La Rock, with producer Lee Smith, who was essential in the production of the songs on Criminal Minded, added as a member shortly after. From those beginnings, BDP members and collaborators included Ced Gee of Ultramagnetic MC's, Lee Smith, Scott La Rock, D-Nice, Henry Wilkerson, PoppyDa, Kenny Parker (KRS-One's younger brother), Just-Ice, ICU, McBoo, Ms. Melodie, Heather B., Scottie Morris, Tony Rahsan, Willie D., RoboCop, Harmony, DJ Red Alert, Jay Kramer, D-Square, Rebekah Foster, Scott Whitehill, Scott King, Chris Tait and Sidney Mills. BDP as a group essentially ended because KRS-One began recording and performing under his own name rather than the group name. Lee Smith, who has co-producer credit on the original 12” "South Bronx" single, was the last to be inexplicably jettisoned by KRS-One and the future new label after Scott's death. In the liner notes on BDP's 1992 album Sex and Violence, KRS-One writes: "BDP in 1992 is KRS-One, Willie D, and Kenny Parker! BDP is not D-Nice, Jamal-ski, Harmony, Ms. Melodie, and Scottie Morris. They are not down with BDP so stop frontin'." Steve "Flash" Juon of RapReviews.com claimed that this initiated the ultimate breakup of the group. Cultural influences and impact "The Bridge Wars" A conflict arose in the late 1980s concerning the origins of hip-hop, and BDP made conscious efforts in its early work to establish its interpretation of the issue. Many, including BDP, believe hip-hop originated in the Bronx.
The lyrics of a rival hip-hop collective, the Juice Crew, were misunderstood to contain a claim in the song "The Bridge" that hip hop was directly a result of artists originating from Queensbridge. Boogie Down and KRS retorted angrily with | use of the "Mad Mad" or "Diseases" riddim started in 1981 with reggae star Yellowman's song "Zunguzung." BDP used this riff in the song "Remix for P is Free," and it was later resampled by artists such as Black Star and dead prez. As an album regarded by many as the start of the gangsta rap movement, Criminal Minded played an important role in reaffirming the social acceptance of having Jamaican roots. BDP referenced reggae in a way that helped to solidify Jamaica's place in modern hip-hop culture. Political and social activism From its start, BDP affected the development of hip-hop and gave a sincere voice to the reality of life in the South Bronx, a section of New York City clouded with poverty and crime. With Criminal Minded, the group combined the sounds of LaRock's harsh, spare, reggae-influenced beats and KRS-One's long-winded rhyme style on underground classics such as “9mm Goes Bang” and “South Bronx.” The album's gritty portrait of life on the streets (as well as the firearms that adorned its cover) influenced the gangsta rap movement that began in earnest two years later. BDP's influence in the creation and development of gangsta rap highlights the cultural significance and impact of the type of music BDP and other early hip-hop artists like it created. This subgenre of hip-hop is most closely associated with hard-core hip-hop and is widely misinterpreted as promoting violence and gang activity. This misinterpretation or stigma is closely related to Boogie Down Productions and the general purpose behind their underlying themes of violence. For instance, the cover art of Criminal Minded displays the two artists in the group brandishing drawn guns and displaying other firearms.
This is not an encouragement of the violence described in BDP's music, but a portrayal of the violence in the South Bronx as a means of expression, escape, and even condemnation. This album art is not meant to advocate violence but to challenge the conception of a criminal, to assert that those who are really criminally minded are those who hold power. BDP's music became significantly more politically astute after Scott La Rock's death. KRS-One published four more albums under the Boogie Down Productions name, and each was increasingly innovative, expanding beyond the thuggish imagery of Criminal Minded to explore themes like black-on-black crime and black radicalism. A riff on the words of Malcolm X, “by any means necessary”, became the title of the second BDP album, which remains one of the most political hip-hop albums to date. It was in this album that KRS defined himself as the “teacha” or “teacher”, symbolizing his emphasis on educating his audience members and fans about relevant social issues surrounding the African-American experience. During his time in association with Boogie Down Productions, KRS-One joined other rappers to create the Stop the Violence Movement, which addressed many of the issues brought up in BDP's music and was the most visible effort by KRS-One and BDP at political activism and engagement. The movement created the single “Self-Destruction” in 1989 through the collaboration of BDP (KRS-One, D-Nice & Ms. Melodie), Stetsasonic (Delite, Daddy-O, Wise, and Frukwan), Kool Moe Dee, MC Lyte, Doug E. Fresh, Just-Ice, Heavy D, Biz Markie, and Public Enemy (Chuck D & Flavor Flav), with the aim of spreading awareness about violence in African-American and hip-hop communities. All proceeds from this effort went to the National Urban League. Discography Studio albums Criminal Minded (1987) By All Means
and "Excess-3". For example, the BCD digit 6, 0110 in 8421 notation, is 1100 in 4221 (two encodings are possible), 0110 in 7421, while in Excess-3 it is 1001 (6 + 3 = 9). The following table represents decimal digits from 0 to 9 in various BCD encoding systems. In the headers, the "8421" indicates the weight of each bit. In the fifth column ("BCD 84−2−1"), two of the weights are negative. Both ASCII and EBCDIC character codes for the digits, which are examples of zoned BCD, are also shown. As most computers deal with data in 8-bit bytes, it is possible to use one of the following methods to encode a BCD number: Unpacked: Each decimal digit is encoded into one byte, with four bits representing the number and the remaining bits having no significance. Packed: Two decimal digits are encoded into a single byte, with one digit in the least significant nibble (bits 0 through 3) and the other digit in the most significant nibble (bits 4 through 7). As an example, encoding the decimal number 91 using unpacked BCD results in the following binary pattern of two bytes: Decimal: 9 1 Binary: 0000 1001 0000 0001 In packed BCD, the same number would fit into a single byte: Decimal: 9 1 Binary: 1001 0001 Hence the numerical range for one unpacked BCD byte is zero through nine inclusive, whereas the range for one packed BCD byte is zero through ninety-nine inclusive. To represent numbers larger than the range of a single byte any number of contiguous bytes may be used. For example, to represent the decimal number 12345 in packed BCD, using big-endian format, a program would encode as follows: Decimal: 0 1 2 3 4 5 Binary: 0000 0001 0010 0011 0100 0101 Here, the most significant nibble of the most significant byte has been encoded as zero, so the number is stored as 012345 (but formatting routines might replace or remove leading zeros). Packed BCD is more efficient in storage usage than unpacked BCD; encoding the same number (with the leading zero) in unpacked format would consume twice the storage.
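The unpacked and packed examples above (91 and 12345) can be reproduced with a short sketch; Python is used here purely for illustration, and the helper names are hypothetical.

```python
def to_unpacked_bcd(n: int) -> bytes:
    """Unpacked BCD: one decimal digit per byte, in the low nibble."""
    return bytes(int(d) for d in str(n))

def to_packed_bcd(n: int) -> bytes:
    """Packed BCD: two decimal digits per byte, big-endian, zero-padded."""
    s = str(n)
    if len(s) % 2:          # pad to an even number of digits
        s = "0" + s
    return bytes((int(s[i]) << 4) | int(s[i + 1])
                 for i in range(0, len(s), 2))

print(to_unpacked_bcd(91).hex())   # '0901' -> bytes 0000 1001, 0000 0001
print(to_packed_bcd(91).hex())     # '91'   -> byte  1001 0001
print(to_packed_bcd(12345).hex())  # '012345', with the padding zero nibble
```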
Shifting and masking operations are used to pack or unpack a packed BCD digit. Other bitwise operations are used to convert a numeral to its equivalent bit pattern or reverse the process. Packed BCD In packed BCD (or simply packed decimal), each of the two nibbles of each byte represents a decimal digit. Packed BCD has been in use since at least the 1960s and is implemented in all IBM mainframe hardware since then. Most implementations are big endian, i.e. with the more significant digit in the upper half of each byte, and with the leftmost byte (residing at the lowest memory address) containing the most significant digits of the packed decimal value. The lower nibble of the rightmost byte is usually used as the sign flag, although some unsigned representations lack a sign flag. As an example, a 4-byte value consists of 8 nibbles, wherein the upper 7 nibbles store the digits of a 7-digit decimal value, and the lowest nibble indicates the sign of the decimal integer value. Standard sign values are 1100 (hex C) for positive (+) and 1101 (D) for negative (−). This convention comes from the zone field for EBCDIC characters and the signed overpunch representation. Other allowed signs are 1010 (A) and 1110 (E) for positive and 1011 (B) for negative. IBM System/360 processors will use the 1010 (A) and 1011 (B) signs if the A bit is set in the PSW, for the ASCII-8 standard that never passed. Most implementations also provide unsigned BCD values with a sign nibble of 1111 (F). ILE RPG uses 1111 (F) for positive and 1101 (D) for negative. These match the EBCDIC zone for digits without a sign overpunch. In packed BCD, the number 127 is represented by 0001 0010 0111 1100 (127C) and −127 is represented by 0001 0010 0111 1101 (127D). Burroughs systems used 1101 (D) for negative, and any other value is considered a positive sign value (the processors will normalize a positive sign to 1100 (C)).
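The C/D sign-nibble convention can be sketched as follows; this is an illustration of the common convention described above, not any particular machine's packed-decimal instructions, and the helper names are hypothetical.

```python
def encode_signed_packed(n: int) -> bytes:
    """Signed packed BCD: digits followed by a sign nibble (C=+, D=-)."""
    sign = 0xD if n < 0 else 0xC
    nibbles = [int(d) for d in str(abs(n))] + [sign]
    if len(nibbles) % 2:
        nibbles.insert(0, 0)  # pad a leading zero nibble
    return bytes((nibbles[i] << 4) | nibbles[i + 1]
                 for i in range(0, len(nibbles), 2))

def decode_signed_packed(data: bytes) -> int:
    nibbles = [x for b in data for x in (b >> 4, b & 0xF)]
    sign = -1 if nibbles[-1] in (0xB, 0xD) else 1  # B and D mean negative
    return sign * int("".join(str(d) for d in nibbles[:-1]))

print(encode_signed_packed(127).hex())   # '127c', i.e. 0001 0010 0111 1100
print(encode_signed_packed(-127).hex())  # '127d'
print(decode_signed_packed(bytes.fromhex("127d")))  # -127
```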
No matter how many bytes wide a word is, there is always an even number of nibbles because each byte has two of them. Therefore, a word of n bytes can contain up to (2n)−1 decimal digits, which is always an odd number of digits. A decimal number with d digits requires ½(d+1) bytes of storage space, rounded up. For example, a 4-byte (32-bit) word can hold seven decimal digits plus a sign and can represent values ranging from ±9,999,999. Thus the number −1,234,567 is 7 digits wide and is encoded as: 0001 0010 0011 0100 0101 0110 0111 1101 1 2 3 4 5 6 7 − Like character strings, the first byte of the packed decimal (that with the most significant two digits) is usually stored in the lowest address in memory, independent of the endianness of the machine. In contrast, a 4-byte binary two's complement integer can represent values from −2,147,483,648 to +2,147,483,647. While packed BCD does not make optimal use of storage (using about 20% more memory than binary notation to store the same numbers), conversion to ASCII, EBCDIC, or the various encodings of Unicode is made trivial, as no arithmetic operations are required. The extra storage requirements are usually offset by the need for the accuracy and compatibility with calculator or hand calculation that fixed-point decimal arithmetic provides. Denser packings of BCD exist which avoid the storage penalty and also need no arithmetic operations for common conversions. Packed BCD is supported in the COBOL programming language as the "COMPUTATIONAL-3" (an IBM extension adopted by many other compiler vendors) or "PACKED-DECIMAL" (part of the 1985 COBOL standard) data type. It is supported in PL/I as "FIXED DECIMAL".
Besides the IBM System/360 and later compatible mainframes, packed BCD is implemented in the native instruction set of the original VAX processors from Digital Equipment Corporation and some models of the SDS Sigma series mainframes, and is the native format for the Burroughs Corporation Medium Systems line of mainframes (descended from the 1950s Electrodata 200 series). Ten's complement representations for negative numbers offer an alternative approach to encoding the sign of packed (and other) BCD numbers. In this case, positive numbers always have a most significant digit between 0 and 4 (inclusive), while negative numbers are represented by the ten's complement of the corresponding positive number. As a result, this system allows for 32-bit packed BCD numbers to range from −50,000,000 to +49,999,999, and −1 is represented as 99999999. (As with two's complement binary numbers, the range is not symmetric about zero.) Fixed-point packed decimal Fixed-point decimal numbers are supported by some programming languages (such as COBOL and PL/I). These languages allow the programmer to specify an implicit decimal point in front of one of the digits. For example, a packed decimal value encoded with the bytes 12 34 56 7C represents the fixed-point value +1,234.567 when the implied decimal point is located between the 4th and 5th digits: 12 34 56 7C 12 34.56 7+ The decimal point is not actually stored in memory, as the packed BCD storage format does not provide for it. Its location is simply known to the compiler, and the generated code acts accordingly for the various arithmetic operations. Higher-density encodings If a decimal digit requires four bits, then three decimal digits require 12 bits. However, since 2¹⁰ (1,024) is greater than 10³ (1,000), if three decimal digits are encoded together, only 10 bits are needed. Two such encodings are Chen–Ho encoding and densely packed decimal (DPD).
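The implied-decimal-point example (bytes 12 34 56 7C read as +1,234.567) can be decoded with a sketch like the following, where the scale argument stands in for the compiler's knowledge of where the point sits; the helper name is hypothetical.

```python
from decimal import Decimal

def decode_fixed_point(data: bytes, scale: int) -> Decimal:
    """Decode signed packed decimal with `scale` digits after the point."""
    nibbles = [x for b in data for x in (b >> 4, b & 0xF)]
    sign = -1 if nibbles[-1] in (0xB, 0xD) else 1
    value = int("".join(str(d) for d in nibbles[:-1]))
    return Decimal(sign * value).scaleb(-scale)  # shift the decimal point

print(decode_fixed_point(bytes.fromhex("1234567C"), 3))  # 1234.567
```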
The latter has the advantage that subsets of the encoding encode two digits in the optimal seven bits and one digit in four bits, as in regular BCD. Zoned decimal Some implementations, for example IBM mainframe systems, support zoned decimal numeric representations. Each decimal digit is stored in one byte, with the lower four bits encoding the digit in BCD form. The upper four bits, called the "zone" bits, are usually set to a fixed value so that the byte holds a character value corresponding to the digit. EBCDIC systems use a zone value of 1111 (hex F); this yields bytes in the range F0 to F9 (hex), which are the EBCDIC codes for the characters "0" through "9". Similarly, ASCII systems use a zone value of 0011 (hex 3), giving character codes 30 to 39 (hex). For signed zoned decimal values, the rightmost (least significant) zone nibble holds the sign digit, which is the same set of values that are used for signed packed decimal numbers (see above). Thus a zoned decimal value encoded as the hex bytes F1 F2 D3 represents the signed decimal value −123: F1 F2 D3 1 2 −3 EBCDIC zoned decimal conversion table (*) Note: These characters vary depending on the local character code page setting. Fixed-point zoned decimal Some languages (such as COBOL and PL/I) directly support fixed-point zoned decimal values, assigning an implicit decimal point at some location between the decimal digits of a number. For example, given a six-byte signed zoned decimal value with an implied decimal point to the right of the fourth digit, the hex bytes F1 F2 F7 F9 F5 C0 represent the value +1,279.50: F1 F2 F7 F9 F5 C0 1 2 7 9. 5 +0 BCD in computers IBM IBM used the terms Binary-Coded Decimal Interchange Code (BCDIC, sometimes just called BCD), for 6-bit alphanumeric codes that represented numbers, upper-case letters and special characters. 
Some variation of BCDIC alphamerics is used in most early IBM computers, including the IBM 1620 (introduced in 1959), IBM 1400 series, and non-Decimal Architecture members of the IBM 700/7000 series. The IBM 1400 series are character-addressable machines, each location being six bits labeled B, A, 8, 4, 2 and 1, plus an odd parity check bit (C) and a word mark bit (M). For encoding digits 1 through 9, B and A are zero and the digit value is represented by standard 4-bit BCD in bits 8 through 1. For most other characters bits B and A are derived simply from the "12", "11", and "0" "zone punches" in the punched card character code, and bits 8 through 1 from the 1 through 9 punches. A "12 zone" punch set both B and A, an "11 zone" set B, and a "0 zone" (a 0 punch combined with any others) set A. Thus the letter A, which is (12,1) in the punched card format, is encoded (B,A,1). The currency symbol $, (11,8,3) in the punched card, was encoded in memory as (B,8,2,1). This allows the circuitry that converts between the punched card format and the internal storage format to be very simple, with only a few special cases. One important special case is digit 0, represented by a lone 0 punch in the card, and (8,2) in core memory. The memory of the IBM 1620 is organized into 6-bit addressable digits, the usual 8, 4, 2, 1 plus F, used as a flag bit, and C, an odd parity check bit. BCD alphamerics are encoded using digit pairs, with the "zone" in the even-addressed digit and the "digit" in the odd-addressed digit, the "zone" being related to the 12, 11, and 0 "zone punches" as in the 1400 series. Input/Output translation hardware converted between the internal digit pairs and the external standard 6-bit BCD codes. In the Decimal Architecture IBM 7070, IBM 7072, and IBM 7074 alphamerics are encoded using digit pairs (using two-out-of-five code in the digits, not BCD) of the 10-digit word, with the "zone" in the left digit and the "digit" in the right digit.
Input/Output translation hardware converted between the internal digit pairs and the external standard 6-bit BCD codes. With the introduction of System/360, IBM expanded 6-bit BCD alphamerics to 8-bit EBCDIC, allowing the addition of many more characters (e.g., lowercase letters). A variable length Packed BCD numeric data type is also implemented, providing machine instructions that perform arithmetic directly on packed decimal data. On the IBM 1130 and 1800, packed BCD is supported in software by IBM's Commercial Subroutine Package. Today, BCD data is still heavily used in IBM processors and databases, such as IBM DB2, mainframes, and Power6. In these products, the BCD is usually zoned BCD (as in EBCDIC or ASCII), Packed BCD (two decimal digits per byte), or "pure" BCD encoding (one decimal digit stored as BCD in the low four bits of each byte). All of these are used within hardware registers and processing units, and in software. To convert packed decimals in EBCDIC table unloads to readable numbers, you |
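The zoned decimal representation described earlier (hex F1 F2 D3 for −123) can likewise be sketched as a round trip. This assumes the EBCDIC F zone with C/D sign nibbles in the last byte; the helper names are hypothetical.

```python
def encode_zoned(n: int) -> bytes:
    """EBCDIC zoned decimal: F zone per digit, sign in the last byte's zone."""
    digits = [int(d) for d in str(abs(n))]
    body = [0xF0 | d for d in digits]
    body[-1] = ((0xD if n < 0 else 0xC) << 4) | digits[-1]
    return bytes(body)

def decode_zoned(data: bytes) -> int:
    sign = -1 if data[-1] >> 4 in (0xB, 0xD) else 1
    return sign * int("".join(str(b & 0xF) for b in data))

print(encode_zoned(-123).hex())               # 'f1f2d3'
print(decode_zoned(bytes.fromhex("F1F2D3")))  # -123
```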
US Bid Closing Date The closing date for a bid is a specific date (and usually a specific time) when the bid is closed to the public for bid submissions. At this point, only the submitted proposals will be considered eligible. The British Columbia Dragoons, a Canadian Forces armoured regiment Places Bacolod–Silay International Airport (IATA code), Silay City, Philippines Beirut Central District, Beirut, Lebanon Other uses Bad conduct discharge, a form of discharge from US military service, sometimes referred to colloquially as a "big chicken dinner". Barrels per calendar day, a unit for measuring output of oil refineries Blue compact dwarf galaxy, a small galaxy which contains large clusters of young, hot, massive stars Board-certified diplomate, in the list of credentials in psychology | in binary BCD (character encoding), a 6-bit superset of binary-coded decimal derived from the binary encoding of the same name Boot Configuration Data, the configuration data required to boot Microsoft Windows Vista and later Bipolar-CMOS-DMOS, a type of BiCMOS semiconductor technology Organisations Basnahira Cricket Dundee, a Sri Lankan cricket team BCD Travel, a provider of global corporate travel management Belarusian Christian Democracy, a Christian-democratic political party in Belarus. Berkshire Country Day School, an independent school in |
relation involving two elements Binary-coded decimal, a method for encoding decimal digits in binary sequences Finger binary, a system for counting in binary numbers on the fingers of human hands Computing Binary code, the digital representation of text and data Bit, or binary digit, the basic unit of information in computers Binary file, composed of something other than human-readable text Executable, a type of binary file that contains machine code for the computer to execute Binary tree, a computer tree data structure in which each node has at most two children Astronomy Binary star, a star system with two stars in it Binary planet, two planetary bodies of | most two children Astronomy Binary star, a star system with two stars in it Binary planet, two planetary bodies of comparable mass orbiting each other Binary asteroid, two asteroids orbiting each other Biology Binary fission, the splitting of a single-celled organism into two daughter cells Chemistry Binary phase, a chemical compound containing two different chemical elements Arts and entertainment Binary (comics), a superheroine in the Marvel Universe Binary (Doctor Who audio) Music Binary form, a way of structuring a piece of music Binary (Ani DiFranco album), 2017 Binary (Kay Tse album), 2008 "Binary" (song), a 2007 single by Assemblage 23 Novel Binary (novel), a 1972 novel
Anagui's emissary, and severed relations with the Rouran Khaganate. Anagui's "blacksmith" (鍛奴 / 锻奴, Pinyin: duàn nú, Wade–Giles: tuan-nu) insult was recorded in Chinese chronicles. Some sources state that members of the Tujue did serve as blacksmiths for the Rouran elite, and that "blacksmith slavery" may refer to a kind of vassalage that prevailed in Rouran society. Nevertheless, after this incident Bumin emerged as the leader of the revolt against the Rouran. In 551, Bumin requested a Western Wei princess in marriage. Yuwen Tai permitted it and sent Princess Changle (長樂公主) of Western Wei to Bumin. In the same year, when Emperor Wen of Western Wei died, Bumin sent a mission and gave two hundred horses. The beginning of formal diplomatic relations with China propped up Bumin's authority among the Turks. He eventually united the local Turkic tribes and threw off the yoke of Rouran domination. In 552 Bumin's army defeated Anagui's forces north of Huaihuang, and Anagui committed suicide. With the Rouran defeated, Bumin proclaimed himself "Illig Qaghan" and made his wife qaghatun. "Illig" corresponds to Ilkhan (i.e. ruler of the people) in Old Turkic. According to the memorial complexes of Bilge Qaghan and Kul Tigin, Bumin and Istemi ruled the people by Turkic laws and developed them. Death and family Bumin died within several months of proclaiming himself Illig Qaghan. He was married to Princess Changle of Western Wei. Issue: Ashina Keluo (阿史那科罗) - Issig Qaghan Ashina Qijin (阿史那俟斤) - Muqan Qaghan Taspar Qaghan Ashina Kutou (阿史那庫頭) - Ditou Qaghan (appointed by Muqan Qaghan to be lesser khagan of the eastern wing of the Turkic Empire) | Qaghan (Chinese: 伊利可汗, Pinyin: Yīlì Kèhán, Wade–Giles: i-li k'o-han) or Yamï Qaghan (, died 552 AD) was the founder of the Turkic Khaganate. He was the eldest son of Ashina Tuwu (吐務 / 吐务). He was the chieftain of the Turks under the sovereignty of the Rouran Khaganate.
He is also mentioned as "Tumen" (, , commander of ten thousand) of the Rouran Khaganate. Early life According to the History of Northern Dynasties and the Zizhi Tongjian, in 545 Tumen's tribe started to rise and frequently invaded the western frontier of Wei. The chancellor of Western Wei, Yuwen Tai, sent An Nuopanto (Nanai-Banda, a Sogdian from Bukhara) as an emissary to the Göktürk chieftain Tumen, in an attempt to establish a commercial relationship. In 546, Tumen paid tribute to the Western Wei state. In that same year, Tumen put down a revolt of the Tiele tribes against the Rouran Khaganate, their overlords. Following this, Tumen felt entitled to request a princess from the Rouran as his wife. The
forcibly moved into the heart of the empire to prevent them from doing so. Before Wang's suggestion could be acted upon, however, there was an uprising by the Göktürks who had surrendered, under the leadership of Xiedie Sitai (𨁂跌思泰) and Axilan (阿悉爛). Xue and Wang tried to intercept them and dealt them defeats, but they were able to flee back to the Göktürk state anyway. This defeat led to Xue Ne's retirement. Religious policy At some point in his life, he wanted to convert to Buddhism and settle in cities. However, Tonyukuk discouraged him from this, citing the Turks' small numbers and vulnerability to Chinese attack. While the Turks' power rested on their mobility, conversion to Buddhism would have bred pacifism among the population. Therefore, sticking to Tengrism was necessary for survival. Later reign In 720, Wang believed that the Pugu (僕固) and Xiedie tribes of the region were planning to defect to Eastern Tujue and attack with Eastern Tujue troops. He thus held a feast and invited the chieftains, and, at the feast, massacred them. He then attacked the Pugu and Xiedie tribes in the area, nearly wiping them out. He then proposed a plan to attack the Qaghan along with the Baximi, Xi, and Khitan. Emperor Xuanzong also recruited Qapaghan Khagan's sons Bilgä Tigin and Mo Tigin, Yenisei Kyrgyz Qaghan Kutluk Bilgä Qaghan and Huoba Guiren to fight against Tujue. Tonyukuk cunningly launched the first attack on the Baximi in the autumn of 721, completely crushing them. Meanwhile, Bilgä raided Gansu, taking much of the livestock. Later that year the Khitans, and the next year the Xi, were also crushed. In 726, his father-in-law and chancellor Tonyukuk died. In 727, he sent Buyruk Chor () as an emissary to Xuanzong with 30 horses as a gift. He also warned him of Me Agtsom's proposal of an anti-Tang alliance.
This warning proved true when the Tibetan general We Tadra Khonglo invaded Tang China in 727, sacking Guazhou (瓜州, in modern Gansu), Changle (常樂, south of modern Guazhou County), Changmenjun (長門軍, north of modern Yumen) and Anxi (安西, modern Lintan). On 27 February 731, Kul Tigin | his father from early childhood. He was created Tardush shad and given command over the western wing of the empire in 697 by Qapaghan. He managed to annihilate Wei Yuanzhong's army in 701 with his brother. He also reconquered the Basmyl tribes in 703. He also subdued Yenisei Kyrgyz forces in 709; after their disobedience he had to reconquer them and kill their Qaghan in 710. He killed the Türgesh khagan Suoge at the Battle of Bolchu. In Qapaghan's later years, he had to fight four battles in a year starting from 714, resubduing tribes, and was nearly killed in an ambush by Uyghur forces in 716. Reign In 716, Qapaghan Qaghan, the second Qaghan, was killed in his campaign against the Toquz Oghuz alliance and his severed head was sent to Chang'an. Although his son Inel Khagan succeeded him, Bilgä's brother Kul Tigin and Tonyukuk carried out a coup d'état against Inel Qaghan, killed him, and enthroned Bilgä as qaghan. His name literally means "wise king". He appointed his brother Kul Tigin to be Left Wise Prince, which made him the second most powerful person in the realm. He re-subdued the Huige in 716. He also appointed his father-in-law Tonyukuk to be Master Strategist. New reforms and the stabilization of the regime caused tribes that had fled Tujue to return. Tang chancellor Wang Jun, believing that the Göktürks who had surrendered would try to flee back to the Göktürk state, suggested that they be forcibly moved into the heart of the empire to prevent them from doing so. Before Wang's suggestion could be acted upon, however, there was an uprising by the Göktürks who had surrendered, under the leadership of Xiedie Sitai (𨁂跌思泰) and Axilan (阿悉爛).
Xue and Wang tried to intercept them and dealt them defeats, but they were able to flee back to the Göktürk state anyway. This defeat led to Xue Ne's retirement. Religious policy At some point in his life, he wanted to convert to Buddhism and settle in cities. However, Tonyukuk discouraged him from this, citing the Turks' small numbers and vulnerability to Chinese attack. While the Turks' power rested on their mobility, conversion to Buddhism would have bred pacifism among the population. Therefore, sticking to Tengrism was necessary for survival. Later reign In 720, Wang believed that the Pugu (僕固) and Xiedie tribes of the region were planning to defect to Eastern Tujue and attack with Eastern Tujue troops. He thus held a feast and invited the chieftains, and, at the feast, massacred them. He then attacked the Pugu and Xiedie tribes in the area, nearly wiping them out. He then
many of the bridges and service stations were "bold examples of modernism", and among those submitting designs was Mies van der Rohe. Architectural output The paradox of the early Bauhaus was that, although its manifesto proclaimed that the aim of all creative activity was building, the school did not offer classes in architecture until 1927. During the years under Gropius (1919–1927), he and his partner Adolf Meyer observed no real distinction between the output of his architectural office and the school. So the built output of Bauhaus architecture in these years is the output of Gropius: the Sommerfeld house in Berlin, the Otte house in Berlin, the Auerbach house in Jena, and the competition design for the Chicago Tribune Tower, which brought the school much attention. The definitive 1926 Bauhaus building in Dessau is also attributed to Gropius. Apart from contributions to the 1923 Haus am Horn, student architectural work amounted to un-built projects, interior finishes, and craft work like cabinets, chairs and pottery. In the next two years under Meyer, the architectural focus shifted away from aesthetics and towards functionality. There were major commissions: one from the city of Dessau for five tightly designed "Laubenganghäuser" (apartment buildings with balcony access), which are still in use today, and another for the Bundesschule des Allgemeinen Deutschen Gewerkschaftsbundes (ADGB Trade Union School) in Bernau bei Berlin. Meyer's approach was to research users' needs and scientifically develop the design solution. Mies van der Rohe repudiated Meyer's politics, his supporters, and his architectural approach. As opposed to Gropius's "study of essentials", and Meyer's research into user requirements, Mies advocated a "spatial implementation of intellectual decisions", which effectively meant an adoption of his own aesthetics. Neither Mies van der Rohe nor his Bauhaus students saw any projects built during the 1930s. 
The popular conception of the Bauhaus as the source of extensive Weimar-era worker housing is not accurate. Two projects, the apartment building project in Dessau and the Törten row housing also in Dessau, fall in that category, but developing worker housing was not the first priority of either Gropius or Mies. It was the Bauhaus contemporaries Bruno Taut, Hans Poelzig and particularly Ernst May, as the city architects of Berlin, Dresden and Frankfurt respectively, who are rightfully credited with the thousands of socially progressive housing units built in Weimar Germany. The housing Taut built in south-west Berlin during the 1920s, close to the U-Bahn stop Onkel Toms Hütte, is still occupied. Impact The Bauhaus had a major impact on art and architecture trends in Western Europe, Canada, the United States and Israel in the decades following its demise, as many of the artists involved fled, or were exiled by the Nazi regime. Tel Aviv in 2004 was named to the list of world heritage sites by UNESCO due to its abundance of Bauhaus architecture; it had some 4,000 Bauhaus buildings erected from 1933 onwards. In 1928, the Hungarian painter Alexander Bortnyik founded a school of design in Budapest called Műhely, which means "the studio". Located on the seventh floor of a house on Nagymezo Street, it was meant to be the Hungarian equivalent to the Bauhaus. The literature sometimes refers to it—in an oversimplified manner—as "the Budapest Bauhaus". Bortnyik was a great admirer of László Moholy-Nagy and had met Walter Gropius in Weimar between 1923 and 1925. Moholy-Nagy himself taught at the Műhely. Victor Vasarely, a pioneer of Op Art, studied at this school before settling in Paris in 1930. Walter Gropius, Marcel Breuer, and Moholy-Nagy re-assembled in Britain during the mid-1930s and lived and worked in the Isokon housing development in Lawn Road in London before the war caught up with them.
Gropius and Breuer went on to teach at the Harvard Graduate School of Design and worked together before their professional split. Their collaboration produced, among other projects, the Aluminum City Terrace in New Kensington, Pennsylvania and the Alan I W Frank House in Pittsburgh. The Harvard School was enormously influential in America in the late 1930s and 1940s, producing such students as Philip Johnson, I. M. Pei, Lawrence Halprin and Paul Rudolph, among many others. In the late 1930s, Mies van der Rohe re-settled in Chicago, enjoyed the sponsorship of the influential Philip Johnson, and became one of the world's pre-eminent architects. Moholy-Nagy also went to Chicago and founded the New Bauhaus school under the sponsorship of industrialist and philanthropist Walter Paepcke. This school became the Institute of Design, part of the Illinois Institute of Technology. Printmaker and painter Werner Drewes was also largely responsible for bringing the Bauhaus aesthetic to America and taught at both Columbia University and Washington University in St. Louis. Herbert Bayer, sponsored by Paepcke, moved to Aspen, Colorado in support of Paepcke's Aspen projects at the Aspen Institute. In 1953, Max Bill, together with Inge Aicher-Scholl and Otl Aicher, founded the Ulm School of Design (German: Hochschule für Gestaltung – HfG Ulm) in Ulm, Germany, a design school in the tradition of the Bauhaus. The school is notable for its inclusion of semiotics as a field of study. The school closed in 1968, but the "Ulm Model" concept continues to influence international design education. Another series of projects at the school was the Bauhaus typefaces, mostly realized in the decades afterward. The influence of the Bauhaus on design education was significant. One of the main objectives of the Bauhaus was to unify art, craft, and technology, and this approach was incorporated into the curriculum of the Bauhaus.
The structure of the Bauhaus Vorkurs (preliminary course) reflected a pragmatic approach to integrating theory and application. In their first year, students learnt the basic elements and principles of design and colour theory, and experimented with a range of materials and processes. This approach to design education became a common feature of architectural and design schools in many countries. For example, the Shillito Design School in Sydney stands as a unique link between Australia and the Bauhaus. The colour and design syllabus of the Shillito Design School was firmly underpinned by the theories and ideologies of the Bauhaus. Its first-year foundational course mimicked the Vorkurs and focused on the elements and principles of design plus colour theory and application. The school, which opened in 1962 and closed in 1980, was founded by Phyllis Shillito, who firmly believed that "A student who has mastered the basic principles of design, can design anything from a dress to a kitchen stove". In Britain, largely under the influence of painter and teacher William Johnstone, Basic Design, a Bauhaus-influenced art foundation course, was introduced at Camberwell School of Art and the Central School of Art and Design, whence it spread to all art schools in the country, becoming universal by the early 1960s. One of the most important contributions of the Bauhaus is in the field of modern furniture design. The characteristic Cantilever chair and Wassily Chair designed by Marcel Breuer are two examples. (Breuer eventually lost a legal battle in Germany with Dutch architect/designer Mart Stam over patent rights to the cantilever chair design. Although Stam had worked on the design of the Bauhaus's 1923 exhibit in Weimar, and guest-lectured at the Bauhaus later in the 1920s, he was not formally associated with the school, and he and Breuer had worked independently on the cantilever concept, leading to the patent dispute.)
The most profitable product of the Bauhaus was its | the new faith of the future." By 1923, however, Gropius was no longer evoking images of soaring Romanesque cathedrals and the craft-driven aesthetic of the "Völkisch movement", instead declaring "we want an architecture adapted to our world of machines, radios and fast cars." Gropius argued that a new period of history had begun with the end of the war. He wanted to create a new architectural style to reflect this new era. His style in architecture and consumer goods was to be functional, cheap and consistent with mass production. To these ends, Gropius wanted to reunite art and craft to arrive at high-end functional products with artistic merit. The Bauhaus issued a magazine called Bauhaus and a series of books called "Bauhausbücher". Since the Weimar Republic lacked the quantity of raw materials available to the United States and Great Britain, it had to rely on the proficiency of a skilled labour force and an ability to export innovative and high-quality goods. Therefore, designers were needed and so was a new type of art education. The school's philosophy stated that the artist should be trained to work with the industry. Weimar was in the German state of Thuringia, and the Bauhaus school received state support from the Social Democrat-controlled Thuringian state government. The school in Weimar experienced political pressure from conservative circles in Thuringian politics, increasingly so after 1923 as political tension rose. One condition placed on the Bauhaus in this new political environment was the exhibition of work undertaken at the school. This condition was met in 1923 with the Bauhaus' exhibition of the experimental Haus am Horn. The Ministry of Education placed the staff on six-month contracts and cut the school's funding in half. The Bauhaus issued a press release on 26 December 1924, setting the closure of the school for the end of March 1925.
At this point it had already been looking for alternative sources of funding. After the Bauhaus moved to Dessau, a school of industrial design with teachers and staff less antagonistic to the conservative political regime remained in Weimar. This school was eventually known as the Technical University of Architecture and Civil Engineering, and in 1996 changed its name to Bauhaus-University Weimar. Dessau The Bauhaus moved to Dessau in 1925 and new facilities there were inaugurated in late 1926. Gropius's design for the Dessau facilities was a return to the futuristic Gropius of 1914 that had more in common with the International style lines of the Fagus Factory than the stripped down Neo-classical of the Werkbund pavilion or the Völkisch Sommerfeld House. During the Dessau years, there was a remarkable change in direction for the school. According to Elaine Hoffman, Gropius had approached the Dutch architect Mart Stam to run the newly founded architecture program, and when Stam declined the position, Gropius turned to Stam's friend and colleague in the ABC group, Hannes Meyer. Meyer became director when Gropius resigned in February 1928, and brought the Bauhaus its two most significant building commissions, both of which still exist: five apartment buildings in the city of Dessau, and the Bundesschule des Allgemeinen Deutschen Gewerkschaftsbundes (ADGB Trade Union School) in Bernau bei Berlin. Meyer favoured measurements and calculations in his presentations to clients, along with the use of off-the-shelf architectural components to reduce costs. This approach proved attractive to potential clients. The school turned its first profit under his leadership in 1929. But Meyer also generated a great deal of conflict. As a radical functionalist, he had no patience with the aesthetic program and forced the resignations of Herbert Bayer, Marcel Breuer, and other long-time instructors. 
Even though Meyer shifted the orientation of the school further to the left than it had been under Gropius, he didn't want the school to become a tool of left-wing party politics. He prevented the formation of a student communist cell, and in the increasingly dangerous political atmosphere, this became a threat to the existence of the Dessau school. Dessau mayor Fritz Hesse fired him in the summer of 1930. The Dessau city council attempted to convince Gropius to return as head of the school, but Gropius instead suggested Ludwig Mies van der Rohe. Mies was appointed in 1930 and immediately interviewed each student, dismissing those whom he deemed uncommitted. He halted the school's manufacture of goods so that the school could focus on teaching, and appointed no new faculty other than his close confidant Lilly Reich. By 1931, the Nazi Party was becoming more influential in German politics. When it gained control of the Dessau city council, it moved to close the school. Berlin In late 1932, Mies rented a derelict factory in Berlin (Birkbusch Street 49) to use as the new Bauhaus with his own money. The students and faculty rehabilitated the building, painting the interior white. The school operated for ten months without further interference from the Nazi Party. In 1933, the Gestapo closed down the Berlin school. Mies protested the decision, eventually speaking to the head of the Gestapo, who agreed to allow the school to re-open. However, shortly after receiving a letter permitting the opening of the Bauhaus, Mies and the other faculty agreed to voluntarily shut down the school. Although neither the Nazi Party nor Adolf Hitler had a cohesive architectural policy before they came to power in 1933, Nazi writers like Wilhelm Frick and Alfred Rosenberg had already labelled the Bauhaus "un-German" and criticized its modernist styles, deliberately generating public controversy over issues like flat roofs.
Increasingly through the early 1930s, they characterized the Bauhaus as a front for communists and social liberals. Indeed, when Meyer was fired in 1930, a number of communist students loyal to him moved to the Soviet Union. Even before the Nazis came to power, political pressure on Bauhaus had increased. The Nazi movement, from nearly the start, denounced the Bauhaus for its "degenerate art", and the Nazi regime was determined to crack down on what it saw as the foreign, probably Jewish, influences of "cosmopolitan modernism". Despite Gropius's protestations that as a war veteran and a patriot his work had no subversive political intent, the Berlin Bauhaus was pressured to close in April 1933. Emigrants did succeed, however, in spreading the concepts of the Bauhaus to other countries, including the "New Bauhaus" of Chicago: Mies decided to emigrate to the United States for the directorship of the School of Architecture at the Armour Institute (now Illinois Institute of Technology) in Chicago and to seek building commissions. The simple engineering-oriented functionalism of stripped-down modernism, however, did lead to some Bauhaus influences living on in Nazi Germany. When Hitler's chief engineer, Fritz Todt, began opening the new autobahns (highways) in 1935, many of the bridges and service stations were "bold examples of modernism", and among those submitting designs was Mies van der Rohe. Architectural output The paradox of the early Bauhaus was that, although its manifesto proclaimed that the aim of all creative activity was building, the school did not offer classes in architecture until 1927. During the years under Gropius (1919–1927), he and his partner Adolf Meyer observed no real distinction between the output of his architectural office and the school. 
So the built output of Bauhaus architecture in these years is the output of Gropius: the Sommerfeld house in Berlin, the Otte house in Berlin, the Auerbach house in Jena, and the competition design for the Chicago Tribune Tower, which brought the school much attention. The definitive 1926 Bauhaus building in Dessau is also attributed to Gropius. Apart from contributions to the 1923 Haus am Horn, student architectural work amounted to un-built projects, interior finishes, and craft work like cabinets, chairs and pottery. In the next two years under Meyer, the architectural focus shifted away from aesthetics and towards functionality. There were major commissions: one from the city of Dessau for five tightly designed "Laubenganghäuser" (apartment buildings with balcony access), which are still in use today, and another for the Bundesschule des Allgemeinen Deutschen Gewerkschaftsbundes (ADGB Trade Union School) in Bernau bei Berlin. Meyer's approach was to research users' needs and scientifically develop the design solution. Mies van der Rohe repudiated Meyer's politics, his supporters, and his architectural approach. As opposed to Gropius's "study of essentials", and Meyer's research into user requirements, Mies advocated a "spatial implementation of intellectual decisions", which effectively meant an adoption of his own aesthetics. Neither Mies van der Rohe nor his Bauhaus students saw any projects built during the 1930s. The popular conception of the Bauhaus as the source of extensive Weimar-era worker housing is not accurate. Two projects, the apartment building project in Dessau and the Törten row housing also in Dessau, fall in that category, but developing worker housing was not the first priority of either Gropius or Mies.
It was the Bauhaus contemporaries Bruno Taut, Hans Poelzig and particularly Ernst May, as the city architects of Berlin, Dresden and Frankfurt respectively, who are rightfully credited with the thousands of socially progressive housing units built in Weimar Germany. The housing Taut built in south-west Berlin during the 1920s, close to the U-Bahn stop Onkel Toms Hütte, is still occupied. Impact The Bauhaus had a major impact on art and architecture trends in Western Europe, Canada, the United States and Israel in the decades following its demise, as many of the artists involved fled, or were exiled by the Nazi regime. Tel Aviv in 2004 was named to the list of world heritage sites by UNESCO due to its abundance of Bauhaus architecture; it had some 4,000 Bauhaus buildings erected from 1933 onwards. In 1928, the Hungarian painter Alexander Bortnyik founded a school of design in Budapest called Műhely, which means "the studio". Located on the seventh floor of a house on Nagymezo Street, it was meant to be the Hungarian equivalent to the Bauhaus. The literature sometimes refers to it—in an oversimplified manner—as "the Budapest Bauhaus". Bortnyik was a great admirer of László Moholy-Nagy and had met Walter Gropius in Weimar between 1923 and 1925. Moholy-Nagy himself taught at the Műhely. Victor Vasarely, a pioneer of Op Art, studied at this school before settling in Paris in 1930. Walter Gropius, Marcel Breuer, and Moholy-Nagy re-assembled in Britain during the mid-1930s and lived and worked in the Isokon housing development in Lawn Road in London before the war caught up with them. Gropius and Breuer went on to teach at the Harvard Graduate School of Design and worked together before their professional split. Their collaboration produced, among other projects, the Aluminum City Terrace in New Kensington, Pennsylvania and the Alan I W Frank House in Pittsburgh.
The Harvard School was enormously influential in America in the late 1930s and 1940s, producing such students as Philip Johnson, I. M. Pei, Lawrence Halprin and Paul Rudolph, among many others. In the late 1930s, Mies van der Rohe re-settled in Chicago, enjoyed the sponsorship of the influential Philip Johnson, and became one of the world's pre-eminent architects. Moholy-Nagy also went to Chicago and founded the New Bauhaus school under the sponsorship of industrialist and philanthropist Walter Paepcke. This school became the Institute of Design, part of the Illinois Institute of Technology. Printmaker and painter Werner Drewes was also largely responsible for bringing the Bauhaus aesthetic to America and taught at both Columbia University and Washington University in St. Louis. Herbert Bayer, sponsored by Paepcke, moved to Aspen, Colorado in support of Paepcke's Aspen projects at the Aspen Institute. In 1953, Max Bill, together with Inge Aicher-Scholl and Otl Aicher, founded the Ulm School of Design (German: Hochschule für Gestaltung – HfG Ulm) in Ulm, Germany, a design school in the tradition of the Bauhaus. The school is notable for its inclusion of semiotics as a field of study. The school closed in 1968, but the "Ulm Model" concept continues to influence international design education. Another series of projects at the school was the Bauhaus typefaces, mostly realized in the decades afterward. The influence of the Bauhaus on design education was significant. One of the main objectives of the Bauhaus was to unify art, craft, and technology, and
and literate patterns. Larry Benson proposed that Germanic literature contains "kernels of tradition" which Beowulf expands upon. Ann Watts argued against the imperfect application of one theory to two different traditions: traditional, Homeric, oral-formulaic poetry and Anglo-Saxon poetry. Thomas Gardner agreed with Watts, arguing that the Beowulf text is too varied to be completely constructed from set formulae and themes. John Miles Foley wrote that comparative work must observe the particularities of a given tradition; in his view, there was a fluid continuum from traditionality to textuality. Editions and translations Editions Many editions of the Old English text of Beowulf have been published; this section lists the most influential. The Icelandic scholar Grímur Jónsson Thorkelin made the first transcriptions of the Beowulf-manuscript in 1786, working as part of a Danish government historical research commission. He made one himself, and had another done by a professional copyist who knew no Old English (and was therefore in some ways more likely to make transcription errors, but in other ways more likely to copy exactly what he saw). Since that time, the manuscript has crumbled further, making these transcripts prized witnesses to the text. While the recovery of at least 2000 letters can be attributed to them, their accuracy has been called into question, and the extent to which the manuscript was actually more readable in Thorkelin's time is uncertain. Thorkelin used these transcriptions as the basis for the first complete edition of Beowulf, in Latin. In 1922, Frederick Klaeber published his edition Beowulf and The Fight at Finnsburg; it became the "central source used by graduate students for the study of the poem and by scholars and teachers as the basis of their translations." The edition included an extensive glossary of Old English terms. His third edition was published in 1936, with the last version in his lifetime being a revised reprint in 1950. 
Klaeber's text was re-presented with new introductory material, notes, and glosses, in a fourth edition in 2008. Another widely used edition is Elliott Van Kirk Dobbie's, published in 1953 in the Anglo-Saxon Poetic Records series. The British Library, meanwhile, took a prominent role in supporting Kevin Kiernan's Electronic Beowulf; the first edition appeared in 1999, and the fourth in 2014. Translations The tightly interwoven structure of Old English poetry makes translating Beowulf a severe technical challenge. Despite this, a great number of translations and adaptations are available, in poetry and prose. Andy Orchard, in A Critical Companion to Beowulf, lists 33 "representative" translations in his bibliography, while the Arizona Center for Medieval and Renaissance Studies published Marijane Osborn's annotated list of over 300 translations and adaptations in 2003. Beowulf has been translated many times in verse and in prose, and adapted for stage and screen. By 2020, the Beowulf's Afterlives Bibliographic Database listed some 688 translations and other versions of the poem. Beowulf has been translated into at least 38 other languages. In 1805, the historian Sharon Turner translated selected verses into modern English. This was followed in 1814 by John Josias Conybeare who published an edition "in English paraphrase and Latin verse translation." N. F. S. Grundtvig reviewed Thorkelin's edition in 1815 and created the first complete verse translation in Danish in 1820. In 1837, John Mitchell Kemble created an important literal translation in English. In 1895, William Morris and A. J. Wyatt published the ninth English translation. In 1909, Francis Barton Gummere's full translation in "English imitative metre" was published, and was used as the text of Gareth Hinds's 2007 graphic novel based on Beowulf. In 1975, John Porter published the first complete verse translation of the poem entirely accompanied by facing-page Old English. 
Seamus Heaney's 1999 translation of the poem (Beowulf: A New Verse Translation, called "Heaneywulf" by Howell Chickering and many others) was both praised and criticized. The US publication was commissioned by W. W. Norton & Company, and was included in the Norton Anthology of English Literature. Many retellings of Beowulf for children appeared in the 20th century. In 2000 (2nd edition 2013), Liuzza published his own version of Beowulf in a parallel text with the Old English, with his analysis of the poem's historical, oral, religious and linguistic contexts. R. D. Fulk, of Indiana University, published a facing-page edition and translation of the entire Nowell Codex manuscript in 2010. Hugh Magennis's 2011 Translating Beowulf: Modern Versions in English Verse discusses the challenges and history of translating the poem, as well as the question of how to approach its poetry, and discusses several post-1950 verse translations, paying special attention to those of Edwin Morgan, Burton Raffel, Michael J. Alexander, and Seamus Heaney. Translating Beowulf is one of the subjects of the 2012 publication Beowulf at Kalamazoo, containing a section with 10 essays on translation, and a section with 22 reviews of Heaney's translation, some of which compare Heaney's work with Liuzza's. Tolkien's long-awaited translation (edited by his son Christopher) was published in 2014 as Beowulf: A Translation and Commentary. The book includes Tolkien's own retelling of the story of Beowulf in his tale Sellic Spell, but not his incomplete and unpublished verse translation. The Mere Wife, by Maria Dahvana Headley, was published in 2018. It relocates the action to a wealthy community in 20th century America and is told primarily from the point of view of Grendel's mother. In 2020, Headley published a translation in which the opening "Hwæt!" is rendered "Bro!".
Sources and analogues Neither identified sources nor analogues for Beowulf can be definitively proven, but many conjectures have been made. These are important in helping historians understand the Beowulf manuscript, as possible source-texts or influences would suggest time-frames of composition, geographic boundaries within which it could be composed, or range (both spatial and temporal) of influence (i.e. when it was "popular" and where its "popularity" took it). The poem has been related to Scandinavian, Celtic, and international folkloric sources. Scandinavian parallels and sources 19th century studies proposed that Beowulf was translated from a lost original Scandinavian work; surviving Scandinavian works have continued to be studied as possible sources. In 1886 Gregor Sarrazin suggested that an Old Norse original version of Beowulf must have existed, but in 1914 Carl Wilhelm von Sydow pointed out that Beowulf is fundamentally Christian and was written at a time when any Norse tale would have most likely been pagan. Another proposal was a parallel with the Grettis Saga, but in 1998, Magnús Fjalldal challenged that, stating that tangential similarities were being overemphasized as analogies. The story of Hrolf Kraki and his servant, the legendary bear-shapeshifter Bodvar Bjarki has also been suggested as a possible parallel; he survives in Hrólfs saga kraka and Saxo's Gesta Danorum, while Hrolf Kraki, one of the Scyldings, appears as "Hrothulf" in Beowulf. New Scandinavian analogues to Beowulf continue to be proposed regularly, with Hrólfs saga Gautrekssonar being the most recently adduced text. International folktale sources Friedrich Panzer (1910) wrote a thesis that the first part of Beowulf (the Grendel Story) incorporated preexisting folktale material, and that the folktale in question was of the Bear's Son Tale (Bärensohnmärchen) type, which has surviving examples all over the world. 
This tale type was later catalogued as international folktale type 301, now formally entitled "The Three Stolen Princesses" type in Hans Uther's catalogue, although the "Bear's Son" is still used in Beowulf criticism, if not so much in folkloristic circles. However, although this folkloristic approach was seen as a step in the right direction, "The Bear's Son" tale was later regarded by many as not a close enough parallel to be a viable choice. Later, Peter A. Jorgensen, looking for a more concise frame of reference, coined a "two-troll tradition" that covers both Beowulf and Grettis saga: "a Norse 'ecotype' in which a hero enters a cave and kills two giants, usually of different sexes"; this has emerged as a more attractive folktale parallel, according to a 1998 assessment by Andersson. The epic's similarity to the Irish folktale "The Hand and the Child" was noted in 1899 by Albert S. Cook, and others even earlier. In 1914, the Swedish folklorist Carl Wilhelm von Sydow made a strong argument for parallelism with "The Hand and the Child", because the folktale type demonstrated a "monstrous arm" motif that corresponded with Beowulf's wrenching off Grendel's arm. No such correspondence could be perceived in the Bear's Son Tale or in the Grettis saga. James Carney and Martin Puhvel agree with this "Hand and the Child" contextualisation. Puhvel supported the "Hand and the Child" theory through such motifs as (in Andersson's words) "the more powerful giant mother, the mysterious light in the cave, the melting of the sword in blood, the phenomenon of battle rage, swimming prowess, combat with water monsters, underwater adventures, and the bear-hug style of wrestling." In the Mabinogion, Teyrnon discovers the otherworldly boy child Pryderi, the principal character of the cycle, after cutting off the arm of a monstrous beast which is stealing foals from his stables. The medievalist R.
Mark Scowcroft notes that the tearing off of the monster's arm without a weapon is found only in Beowulf and fifteen of the Irish variants of the tale; he identifies twelve parallels between the tale and Beowulf. Classical sources Attempts to find classical or Late Latin influence or analogue in Beowulf are almost exclusively linked with Homer's Odyssey or Virgil's Aeneid. In 1926, Albert S. Cook suggested a Homeric connection due to equivalent formulas, metonymies, and analogous voyages. In 1930, James A. Work supported the Homeric influence, stating that the encounter between Beowulf and Unferth was parallel to the encounter between Odysseus and Euryalus in Books 7–8 of the Odyssey, even to the point of both characters giving the hero the same gift of a sword upon being proven wrong in their initial assessment of the hero's prowess. This theory of Homer's influence on Beowulf remained very prevalent in the 1920s, but started to die out in the following decade when a handful of critics stated that the two works were merely "comparative literature", although Greek was known in late 7th century England: Bede states that Theodore of Tarsus, a Greek, was appointed Archbishop of Canterbury in 668, and he taught Greek. Several English scholars and churchmen are described by Bede as being fluent in Greek due to being taught by him; Bede claims to be fluent in Greek himself. Frederick Klaeber, among others, argued for a connection between Beowulf and Virgil near the start of the 20th century, claiming that the very act of writing a secular epic in a Germanic world represents Virgilian influence. Virgil was seen as the pinnacle of Latin literature, and Latin was the dominant literary language of England at the time, therefore making Virgilian influence highly likely.
Similarly, in 1971, Alistair Campbell stated that the apologue technique used in Beowulf is so rare in epic poetry aside from Virgil that the poet who composed Beowulf could not have written the poem in such a manner without first coming across Virgil's writings. Biblical influences It cannot be denied that Biblical parallels occur in the text, whether seen as a pagan work with "Christian colouring" added by scribes or as a "Christian historical novel, with selected bits of paganism deliberately laid on as 'local colour'", as Margaret E. Goldsmith did in "The Christian Theme of Beowulf". Beowulf channels the Book of Genesis, the Book of Exodus, and the Book of Daniel in its inclusion of references to the Genesis creation narrative, the story of Cain and Abel, Noah and the flood, the Devil, Hell, and the Last Judgment. Dialect Beowulf predominantly uses the West Saxon dialect of Old English, like other Old English poems copied at the time. However, it also uses many other linguistic forms; this leads some scholars to believe that it has endured a long and complicated transmission through all the main dialect areas. It retains a complicated mix of Mercian, Northumbrian, Early West Saxon, Anglian, Kentish and Late West Saxon dialectal forms. Form and metre An Old English poem such as Beowulf is very different from modern poetry. Anglo-Saxon poets typically used alliterative verse, a form of verse in which the first half of the line (the a-verse) is linked to the second half (the b-verse) through similarity in initial sound. In addition, the two halves are divided by a caesura. This verse form maps stressed and unstressed syllables onto abstract entities known as metrical positions. There is no fixed number of beats per line: one line may have three while another has only two. The poet had a choice of formulae to assist in fulfilling the alliteration scheme.
These were memorised phrases that conveyed a | designation is "British Library, Cotton Vitellius A.XV" because it was one of Sir Robert Bruce Cotton's holdings in the Cotton library in the middle of the 17th century. Many private antiquarians and book collectors, such as Sir Robert Cotton, used their own library classification systems. "Cotton Vitellius A.XV" translates as: the 15th book from the left on shelf A (the top shelf) of the bookcase with the bust of Roman Emperor Vitellius standing on top of it, in Cotton's collection. Kevin Kiernan argues that Nowell most likely acquired it through William Cecil, 1st Baron Burghley, in 1563, when Nowell entered Cecil's household as a tutor to his ward, Edward de Vere, 17th Earl of Oxford. The earliest extant reference to the first foliation of the Nowell Codex was made sometime between 1628 and 1650 by Franciscus Junius (the younger). The ownership of the codex before Nowell remains a mystery. The Reverend Thomas Smith (1638–1710) and Humfrey Wanley (1672–1726) both catalogued the Cotton library (in which the Nowell Codex was held). Smith's catalogue appeared in 1696, and Wanley's in 1705. The Beowulf manuscript itself is identified by name for the first time in an exchange of letters in 1700 between George Hickes, Wanley's assistant, and Wanley. In the letter to Wanley, Hickes responds to an apparent charge against Smith, made by Wanley, that Smith had failed to mention the Beowulf script when cataloguing Cotton MS. Vitellius A. XV. Hickes replies to Wanley "I can find nothing yet of Beowulph." Kiernan theorised that Smith failed to mention the Beowulf manuscript because of his reliance on previous catalogues, because he had no idea how to describe it, or because it was temporarily out of the codex. The manuscript passed to Crown ownership in 1702, on the death of its then owner, Sir John Cotton, who had inherited it from his grandfather, Robert Cotton.
It suffered damage in a fire at Ashburnham House in 1731, in which around a quarter of the manuscripts bequeathed by Cotton were destroyed. Since then, parts of the manuscript have crumbled along with many of the letters. Rebinding efforts, though saving the manuscript from much degeneration, have nonetheless covered up other letters of the poem, causing further loss. Kiernan, in preparing his electronic edition of the manuscript, used fibre-optic backlighting and ultraviolet lighting to reveal letters in the manuscript lost from binding, erasure, or ink blotting. Writing The Beowulf manuscript was transcribed from an original by two scribes, one of whom wrote the prose at the beginning of the manuscript and the first 1939 lines, before breaking off in mid-sentence. The first scribe made a point of carefully regularizing the spelling of the original document into the common West Saxon, removing any archaic or dialectal features. The second scribe, who wrote the remainder, with a difference in handwriting noticeable after line 1939, seems to have written more vigorously and with less interest. As a result, the second scribe's script retains more archaic dialectal features, which allow modern scholars to ascribe a cultural context to the poem. While both scribes appear to have proofread their work, there are nevertheless many errors. The second scribe was ultimately the more conservative copyist as he did not modify the spelling of the text as he wrote, but copied what he saw in front of him. In the way that it is currently bound, the Beowulf manuscript is followed by the Old English poem Judith. Judith was written by the same scribe that completed Beowulf, as evidenced by similar writing style. Wormholes found in the last leaves of the Beowulf manuscript that are absent in the Judith manuscript suggest that at one point Beowulf ended the volume.
The rubbed appearance of some leaves suggests that the manuscript stood on a shelf unbound, as was the case with other Old English manuscripts. Knowledge of books held in the library at Malmesbury Abbey and available as source works, as well as the identification of certain words particular to the local dialect found in the text, suggest that the transcription may have taken place there. Performance The scholar Roy Liuzza notes that the practice of oral poetry is by its nature invisible to history as evidence is in writing. Comparison with other bodies of verse such as Homer's, coupled with ethnographic observation of early 20th century performers, has provided a vision of how an Anglo-Saxon singer-poet or scop may have practised. The resulting model is that performance was based on traditional stories and a repertoire of word formulae that fitted the traditional metre. The scop moved through the scenes, such as putting on armour or crossing the sea, each one improvised at each telling with differing combinations of the stock phrases, while the basic story and style remained the same. Liuzza notes that Beowulf itself describes the technique of a court poet in assembling materials, in lines 867–874 in his translation, "full of grand stories, mindful of songs ... found other words truly bound together; ... to recite with skill the adventure of Beowulf, adeptly tell a tall tale, and (wordum wrixlan) weave his words." The poem further mentions (lines 1065–1068) that "the harp was touched, tales often told, when Hrothgar's scop was set to recite among the mead tables his hall-entertainment". Debate over oral tradition The question of whether Beowulf was passed down through oral tradition prior to its present manuscript form has been the subject of much debate, and involves more than simply the issue of its composition. 
Rather, given the implications of the theory of oral-formulaic composition and oral tradition, the question concerns how the poem is to be understood, and what sorts of interpretations are legitimate. In his landmark 1960 work, The Singer of Tales, Albert Lord, citing the work of Francis Peabody Magoun and others, considered it proven that Beowulf was composed orally. Later scholars have not all been convinced; while they agree that "themes" like "arming the hero" or the "hero on the beach" do exist across Germanic works, some conclude that Anglo-Saxon poetry is a mix of oral-formulaic and literate patterns. Larry Benson proposed that Germanic literature contains "kernels of tradition" which Beowulf expands upon. Ann Watts argued against the imperfect application of one theory to two different traditions: traditional, Homeric, oral-formulaic poetry and Anglo-Saxon poetry. Thomas Gardner agreed with Watts, arguing that the Beowulf text is too varied to be completely constructed from set formulae and themes. John Miles Foley wrote that comparative work must observe the particularities of a given tradition; in his view, there was a fluid continuum from traditionality to textuality. Editions and translations Editions Many editions of the Old English text of Beowulf have been published; this section lists the most influential. The Icelandic scholar Grímur Jónsson Thorkelin made the first transcriptions of the Beowulf-manuscript in 1786, working as part of a Danish government historical research commission. He made one himself, and had another done by a professional copyist who knew no Old English (and was therefore in some ways more likely to make transcription errors, but in other ways more likely to copy exactly what he saw). Since that time, the manuscript has crumbled further, making these transcripts prized witnesses to the text.
While the recovery of at least 2000 letters can be attributed to them, their accuracy has been called into question, and the extent to which the manuscript was actually more readable in Thorkelin's time is uncertain. Thorkelin used these transcriptions as the basis for the first complete edition of Beowulf, in Latin. In 1922, Frederick Klaeber published his edition Beowulf and The Fight at Finnsburg; it became the "central source used by graduate students for the study of the poem and by scholars and teachers as the basis of their translations." The edition included an extensive glossary of Old English terms. His third edition was published in 1936, with the last version in his lifetime being a revised reprint in 1950. Klaeber's text was re-presented with new introductory material, notes, and glosses, in a fourth edition in 2008. Another widely used edition is Elliott Van Kirk Dobbie's, published in 1953 in the Anglo-Saxon Poetic Records series. The British Library, meanwhile, took a prominent role in supporting Kevin Kiernan's Electronic Beowulf; the first edition appeared in 1999, and the fourth in 2014. Translations The tightly interwoven structure of Old English poetry makes translating Beowulf a severe technical challenge. Despite this, a great number of translations and adaptations are available, in poetry and prose. Andy Orchard, in A Critical Companion to Beowulf, lists 33 "representative" translations in his bibliography, while the Arizona Center for Medieval and Renaissance Studies published Marijane Osborn's annotated list of over 300 translations and adaptations in 2003. Beowulf has been translated many times in verse and in prose, and adapted for stage and screen. By 2020, the Beowulf's Afterlives Bibliographic Database listed some 688 translations and other versions of the poem. Beowulf has been translated into at least 38 other languages. In 1805, the historian Sharon Turner translated selected verses into modern English. 
This was followed in 1814 by John Josias Conybeare who published an edition "in English paraphrase and Latin verse translation." N. F. S. Grundtvig reviewed Thorkelin's edition in 1815 and created the first complete verse translation in Danish in 1820. In 1837, John Mitchell Kemble created an important literal translation in English. In 1895, William Morris and A. J. Wyatt published the ninth English translation. In 1909, Francis Barton Gummere's full translation in "English imitative metre" was published, and was used as the text of Gareth Hinds's 2007 graphic novel based on Beowulf. In 1975, John Porter published the first complete verse translation of the poem entirely accompanied by facing-page Old English. Seamus Heaney's 1999 translation of the poem (Beowulf: A New Verse Translation, called "Heaneywulf" by Howell Chickering and many others) was both praised and criticized. The US publication was commissioned by W. W. Norton & Company, and was included in the Norton Anthology of English Literature. Many retellings of Beowulf for children appeared in the 20th century. In 2000 (2nd edition 2013), Liuzza published his own version of Beowulf in a parallel text with the Old English, with his analysis of the poem's historical, oral, religious and linguistic contexts. R. D. Fulk, of Indiana University, published a facing-page edition and translation of the entire Nowell Codex manuscript in 2010. Hugh Magennis's 2011 Translating Beowulf: Modern Versions in English Verse discusses the challenges and history of translating the poem, as well as the question of how to approach its poetry, and discusses several post-1950 verse translations, paying special attention to those of Edwin Morgan, Burton Raffel, Michael J. Alexander, and Seamus Heaney. 
Translating Beowulf is one of the subjects of the 2012 publication Beowulf at Kalamazoo, containing a section with 10 essays on translation, and a section with 22 reviews of Heaney's translation, some of which compare Heaney's work with Liuzza's. Tolkien's long-awaited translation (edited by his son Christopher) was published in 2014 as Beowulf: A Translation and Commentary. The book includes Tolkien's own retelling of the story of Beowulf in his tale Sellic Spell, but not his incomplete and unpublished verse translation. The Mere Wife, by Maria Dahvana Headley, was published in 2018. It relocates the action to a wealthy community in 20th century America and is told primarily from the point of view of Grendel's mother. In 2020, Headley published a translation in which the opening "Hwæt!" is rendered "Bro!". Sources and analogues Neither identified sources nor analogues for Beowulf can be definitively proven, but many conjectures have been made. These are important in helping historians understand the Beowulf manuscript, as possible source-texts or influences would suggest time-frames of composition, geographic boundaries within which it could be composed, or range (both spatial and temporal) of influence (i.e. when it was |
published by Comics Greatest World, an imprint of Dark Horse Comics. The character first appeared in Comics' Greatest World: Steel Harbor in 1993. The original Barb Wire series published nine issues between 1994 and 1995 and was followed by a four-issue miniseries in 1996. A reboot was published in 2015 and lasted eight issues. In 1996, the character was adapted into a Barb Wire film starring Pamela Anderson. Unlike the comics, the film takes place in a possible future rather than an alternate version of present-day Earth. Creators Regular series: 1: John Arcudi, writer/Lee Moder, pencils/Ande Parks, inks 2–3: Arcudi, writer/Dan Lawlis, pencils/Parks, inks 4–5: Arcudi, writer/Lawlis, pencils/Ian Akin, inks 6–7: Arcudi, writer/Mike Manley, pencils/Parks, inks 8: Arcudi, writer/Andrew Robinson, pencils/Jim Royal, inks 9: Anina Bennett & Paul Guinan, writers/Robert Walker, pencils/Jim Royal, inks Ace of Spades (miniseries): 1–4: Chris Warner, script and pencils/Tim Bradstreet, inks Character history Barb Wire's stories take place on an alternate version of present-day Earth with superhumans and more advanced technology. In this Earth's history, an alien entity called the Vortex arrived in 1931 and began conducting secret experiments. In 1947, an atom bomb test detonated in a desert near the alien's experiments. The result was the creation of a trans-dimensional wormhole referred to as "the Vortex" or "the Maelstrom", which released energy that gave different people across Earth superpowers for years to come. Decades later, Barbara Kopetski grows up in Steel Harbor when it is still a thriving steel industry city. Barbara and her brother Charlie live with their grandmother and parents, their mother being a police officer while their father is a former marine who became a steelworker. Officer Kopetski later dies, after which her husband becomes so ill he is confined to a bed for years, developing Alzheimer's disease as well before passing away.
Following the death of her father, Barbara leaves Steel Harbor for a time as the city's economy starts to spiral and crime begins rising. Soon, much of the city is controlled by warring gangs rather than local government. Years later, Barbara returns to Steel Harbor, now an experienced bounty hunter operating under the name Barb | ruin, with hundreds of buildings destroyed or abandoned in the area known as "Metal City". Many are forced to leave the city or take to the streets, and the gangs (all of whom have superhuman members) start moving to take more control. To help contain the chaos and keep her home from descending further, Barb Wire now acts at times as a vigilante, intervening when the police can't or won't. Fighting alongside the Wolf Gang, she defied criminal Mace Blitzkrieg's attempts to bring all gangs under his leadership and control the city. Thanks to learning from her police officer mother and marine father, and her life experiences while traveling outside of the city, Barb Wire is an excellent hand-to-hand combatant, skilled in various firearms, and an expert driver and motorcycle rider. Her bar, the Hammerhead, has been considered neutral meeting ground by the Steel Harbor gangs. Aiding her bounty hunting are her brother Charlie, who acts as her mechanic and engineer, and others such as Avram Roman Jr., a cyborg sometimes known simply as "the Machine". Though she has loyal allies, including Charlie, Barb Wire is a harsh, guarded person who looks at the world with suspicion and cynicism, considering herself a loner at heart. Other characters Supporting characters Charlie Kopetski, Barb's brother, a blind mechanic and engineering genius. He invents and maintains most of her weapons and superhuman restraining devices. He openly complains about how often he must fix the equipment she continuously breaks during her adventures. Allies The Machine, real name: Avram Roman Jr.
A man whose body is inhabited by a self-repairing machine colony, making him an advanced cyborg. Along with a reinforced skeleton, superhuman strength and enhanced durability, he is capable of rebuilding parts of his body. Over time, he becomes more machine-like in nature, no longer requiring food. Motörhead, real name: Frank Fletcher. A drifter with psychic powers who is bonded to an ancient, powerful artifact known as the Motor. Wolf Gang, a group that believes gangs shouldn't go too far in their activities or victimize the city, preferring independence and a balance of power rather than uniting all gangs under one leader. The Wolf Gang is formidable and its members are known for discipline and loyalty. The gang includes five superhumans: Burner (fire abilities); Bomber (creates energy bombs); Breaker (superhuman strength); Cutter (energy blades); and their leader Wolf Ferrell, also known as Hunter (enhanced senses). Ghost, real name: Elisa Cameron. A popular Dark Horse Comics character with ghost-like abilities who has a brief crossover story with Barb Wire. Enemies The Prime Movers, a collective of street gang leaders who agree to serve under the leadership of superhumanly |
of weeks. Then, it was basically me, Mel, Richie Pryor and Norman Steinberg. Richie left after the first draft and then Norman, Mel and I wrote the next three or four drafts. It was a riot. It was a rioter’s room!" The original title, Tex X, was rejected to avoid it being mistaken for an X-rated film, as were Black Bart – a reference to Black Bart, a white highwayman of the 19th century – and Purple Sage. Brooks said he finally conceived Blazing Saddles one morning while taking a shower. Casting was problematic. Richard Pryor was Brooks' original choice to play Sheriff Bart, but the studio, claiming his history of drug arrests made him uninsurable, refused to approve financing with Pryor as the star. Cleavon Little was cast in the role, and Pryor remained as a writer. Brooks offered the other leading role, the Waco Kid, to John Wayne; he declined, deeming the film "too blue" for his family-oriented image, but assured Brooks that "he would be the first one in line to see it." Gig Young was cast, but he collapsed during his first scene from what was later determined to be alcohol withdrawal syndrome, and Gene Wilder was flown in to replace him. Johnny Carson and Wilder both turned down the Hedley Lamarr role before Harvey Korman was cast. Madeline Kahn objected when Brooks asked to see her legs during her audition. "She said, 'So it's THAT kind of an audition?'" Brooks recalled. "I explained that I was a happily married man and that I needed someone who could straddle a chair with her legs like Marlene Dietrich in Destry Rides Again. So she lifted her skirt and said, 'No touching.'" Brooks had numerous conflicts over content with Warner Bros. executives, including frequent use of the word "nigger", Lili Von Shtupp's seduction scene, the cacophony of flatulence around the campfire, and Mongo punching out a horse.
Brooks, whose contract gave him final content control, declined to make any substantive changes, with the exception of cutting Bart's final line during Lili's seduction: "I hate to disappoint you, ma'am, but you're sucking my arm." When asked later about the many "nigger" references, Brooks said he received consistent support from Pryor and Little. He added, "If they did a remake of Blazing Saddles today [2012], they would leave out the N-word. And then, you've got no movie." Brooks said he received many letters of complaint after the film's release. The film was almost not released. "When we screened it for executives, there were few laughs", said Brooks. "The head of distribution said, 'Let's dump it and take a loss.' But [studio president John] Calley insisted they open it in New York, Los Angeles and Chicago as a test. It became the studio's top moneymaker that summer." The world premiere took place on February 7, 1974, at the Pickwick Drive-In Theater in Burbank; 250 invited guests—including Little and Wilder—watched the film on horseback. Songs and music Mel Brooks wrote the music and lyrics for three of Blazing Saddles' songs, "The Ballad of Rock Ridge", "I'm Tired", and "The French Mistake". Brooks also wrote the lyrics to the title song, with music by John Morris, the composer of the film's score. To sing the title song, Brooks advertised in the trade papers for a "Frankie Laine–type" singer; to his surprise, Laine himself offered his services. "Frankie sang his heart out ... and we didn't have the heart to tell him it was a spoof. He never heard the whip cracks; we put those in later. We got so lucky with his serious interpretation of the song." The choreographer for "I'm Tired" and "The French Mistake" was Alan Johnson. "I'm Tired" is a homage to and parody of Marlene Dietrich's singing of Cole Porter's song "I'm the Laziest Gal in Town" in Alfred Hitchcock's 1950 film Stage Fright, as well as "Falling in Love Again (Can't Help It)" from The Blue Angel.
The orchestrations were by Morris and Jonathan Tunick. The first studio-licensed release of the full music soundtrack to Blazing Saddles was on La-La Land Records on August 26, 2008. Remastered from original studio vault elements, the limited edition CD – a run of 3000 – features the songs from the film as well as Morris's score. Instrumental versions of all the songs are bonus tracks on the disc. The disc includes liner notes with comments from Mel Brooks and John Morris. Reception While Blazing Saddles is now considered a classic comedy, critical reaction was mixed when the film was released. Vincent Canby wrote: Roger Ebert gave the film four stars out of four, calling it a "crazed grab bag of a movie that does everything to keep us laughing except hit us over the head with a rubber chicken. Mostly, it succeeds. It's an audience picture; it doesn't have a lot of classy polish and its structure is a total mess. But of course! What does that matter while Alex Karras is knocking a horse cold with a right cross to the jaw?" Gene Siskel awarded three stars out of four and called it "bound to rank with the funniest of the year," adding, "Whenever the laughs begin to run dry, Brooks and his quartet of gag writers splash about in a pool of obscenities that score belly laughs if your ears aren't sensitive and if you're hip to western movie conventions being parodied." Variety wrote, "If comedies are measured solely by the number of yocks they generate from audiences, then 'Blazing Saddles' must be counted a success ... Few viewers will have time between laughs to complain that pic is essentially a raunchy, protracted version of a television comedy skit." Charles Champlin of the Los Angeles Times called the film "irreverent, outrageous, improbable, often as blithely tasteless as a stag night at the Friar's Club and almost continuously funny."
Gary Arnold of The Washington Post was negative, writing that "Mel Brooks squanders a snappy title on a stockpile of stale jokes. To say that this slapdash Western spoof lacks freshness and spontaneity and originality is putting it mildly. 'Blazing Saddles' is at once a messy and antiquated gag machine." Jan Dawson of The Monthly Film Bulletin wrote, "Perhaps it is pedantic to complain that the whole is not up to the sum of its parts when, for the curate's egg that it is, Blazing Saddles contains so many good parts and memorable performances." John Simon wrote a negative review of Blazing Saddles, saying, "All kinds of gags—chiefly anachronisms, irrelevancies, reverse ethnic jokes, and out and out vulgarities—are thrown together pell-mell, batted about insanely in all directions, and usually beaten into the ground." On review aggregator Rotten Tomatoes, the film has an 88% approval rating based on 59 reviews, with a weighted average of 8.1/10. The site's consensus reads: "Daring, provocative, and laugh-out-loud funny, Blazing Saddles is a gleefully vulgar spoof of Westerns that marks a high point in Mel Brooks' storied career." During production of the film, retired longtime film star Hedy Lamarr sued Warner Bros. for $100,000, charging that the film's running parody of her name infringed on her right to privacy. Brooks said that he was flattered and chose to not fight it in court; the studio settled out of court for a small sum and an apology for "almost using her name." Brooks said that Lamarr "never got the joke." The lawsuit is referenced by an in-film joke in which Brooks' character, the Governor, tells Hedley Lamarr, "This is 1874; you'll be able to sue HER." Ishmael Reed's 1969 novel Yellow Back Radio Broke-Down has been cited as an important precursor or influence for Blazing Saddles, a connection that Reed himself has made. Box office The film earned theatrical rentals of $26.7 million in its initial release in the United States and Canada.
In its 1976 reissue, it earned a further $10.5 million and another $8 million in 1979. Its total rentals in the United States and Canada totalled $47.8 million from a gross of $119.5 million, becoming only the tenth film up to that

"Alan Arkin was hired to direct and James Earl Jones was going to play the sheriff. That fell apart, as things often do." Brooks was taken with the story, which he described as "hip talk—1974 talk and expressions—happening in 1874 in the Old West", and purchased the film rights from Bergman. Though he had not worked with a writing team since Your Show of Shows, he hired a group of writers (including Bergman) to expand the outline, and posted a large sign: "Please do not write a polite script." Brooks described the writing process as chaotic: "Blazing Saddles was more or less written in the middle of a drunken fistfight. There were five of us all yelling loudly for our ideas to be put into the movie. Not only was I the loudest, but luckily I also had the right as director to decide what was in or out." Bergman remembers the room being just as chaotic, telling Creative Screenwriting, "In the beginning, we had five people. One guy left after a couple of weeks. Then, it was basically me, Mel, Richie Pryor and Norman Steinberg. Richie left after the first draft and then Norman, Mel and I wrote the next three or four drafts. It was a riot. It was a rioter's room!" The original title, Tex X, was rejected to avoid it being mistaken for an X-rated film, as were Black Bart – a reference to Black Bart, a white highwayman of the 19th century – and Purple Sage. Brooks said he finally conceived Blazing Saddles one morning while taking a shower. Casting was problematic. Richard Pryor was Brooks' original choice to play Sheriff Bart, but the studio, claiming his history of drug arrests made him uninsurable, refused to approve financing with Pryor as the star. Cleavon Little was cast in the role, and Pryor remained as a writer.
Brooks offered the other leading role, the Waco Kid, to John Wayne; he declined, deeming the film "too blue" for his family-oriented image, but assured Brooks that "he would be the first one in line to see it." Gig Young was cast, but he collapsed during his first scene from what was later determined to be alcohol withdrawal syndrome, and Gene Wilder was flown in to replace him. Johnny Carson and Wilder both turned down the Hedley Lamarr role before Harvey Korman was cast. Madeline Kahn objected when Brooks asked to see her legs during her audition. "She said, 'So it's THAT kind of an audition?'" Brooks recalled. "I explained that I was a happily married man and that I needed someone who could straddle a chair with her legs like Marlene Dietrich in Destry Rides Again. So she lifted her skirt and said, 'No touching.'" Brooks had numerous conflicts over content with Warner Bros. executives, including frequent use of the word "nigger", Lili Von Shtupp's seduction scene, the cacophony of flatulence around the campfire, and Mongo punching out a horse.
collection Crystal Express and the collection Schismatrix Plus, which contains the novel Schismatrix and all of the stories set in the Shaper/Mechanist universe. Alastair Reynolds identified Schismatrix and the other Shaper/Mechanist stories as one of the greatest influences on his own work. In the 1980s, Sterling edited the science fiction critical fanzine Cheap Truth under the alias of Vincent Omniaveritas. He wrote a column called Catscan for the now-defunct science fiction critical magazine SF Eye. He contributed a chapter to Sound Unbound: Sampling Digital Music and Culture (The MIT Press, 2008) edited by Paul D. Miller a.k.a. DJ Spooky. From April 2009 through May 2009, he was an editor at Cool Tools. Since October 2003, Sterling has blogged at "Beyond the Beyond", hosted by Wired, while also contributing to other print and online platforms such as the Magazine of Fantasy & Science Fiction. His most recent novel (as of 2013) is Love Is Strange (December 2012), a Paranormal Romance (40k). Projects He has been the instigator of three projects which can be found on the Web:
The Dead Media Project - A collection of "research notes" on dead media technologies, from Incan quipus, through Victorian phenakistoscopes, to the departed video game and home computers of the 1980s. The Project's homepage, including Sterling's original Dead Media Manifesto, can be found at http://www.deadmedia.org
The Viridian Design Movement - his attempt to create a "green" design movement focused on high-tech, stylish, and ecologically sound design. The Viridian Design home page, including Sterling's Viridian Manifesto and all of his Viridian Notes, is managed by Jon Lebkowsky at http://www.viridiandesign.org. The Viridian Movement helped to spawn the popular "bright green" environmental weblog Worldchanging. WorldChanging contributors include many of the original members of the Viridian "curia".
Embrace the Decay - a web-only art piece commissioned by the LA Museum of Contemporary Art in 2003. Incorporating contributions solicited through The Viridian Design 'movement', Embrace the Decay was the most visited piece/page at LA MOCA's Digital Gallery, and included contributions from Jared Tarbell of levitated.net, co-author of several books on advanced Flash programming, and Monty Zukowski, creator of the winning 'decay algorithm' sponsored by Bruce. Neologisms Sterling has coined various neologisms to describe things that he believes will be common in the future, especially items which already exist in limited numbers. In the December 2005 issue of Wired magazine, Sterling coined the term buckyjunk to refer to future, difficult-to-recycle consumer waste made of carbon nanotubes (a.k.a. buckytubes, based on buckyballs or buckminsterfullerene). In his 2005 book Shaping Things, he coined the term design fiction, which refers to a type of speculative design that focuses on world building. In July 1989, in SF Eye #5, he was the first to use the word "slipstream" to refer to a type of speculative fiction between traditional science fiction and fantasy and mainstream literature. In December 1999 he coined the term "Wexelblat disaster", for a disaster caused when a natural disaster triggers a secondary, and more damaging, failure of human technology. In August 2004, he suggested a type of technological device (he called it "spime") that, through pervasive RFID and GPS tracking, can track its history of use and interact with the world. He discussed and expanded on Sophia Al Maria's neologism "Gulf Futurism" in his column for Wired magazine, "Beyond The Beyond". Personal He lived in Galveston, Texas, in early childhood until his family moved to India. Sterling spent several years in India and has a fondness for Bollywood films.
In 1976, he graduated from the University of Texas with a degree in journalism. In 1978, he was a Dungeon Master for a Dungeons & Dragons game whose players included Warren Spector, who cited Sterling's game as a major inspiration for the game design of
most frequent presenting symptoms are headache, drowsiness, confusion, seizures, hemiparesis or speech difficulties, together with fever, with a rapidly progressive course. Headache is characteristically worse at night and in the morning, as the intracranial pressure naturally increases when in the supine position. This elevation similarly stimulates the medullary vomiting center and area postrema, leading to morning vomiting. Other symptoms and findings depend largely on the specific location of the abscess in the brain. An abscess in the cerebellum, for instance, may cause additional complaints as a result of brain stem compression and hydrocephalus. Neurological examination may reveal a stiff neck in occasional cases (erroneously suggesting meningitis). Pathophysiology Bacterial Anaerobic and microaerophilic cocci and gram-negative and gram-positive anaerobic bacilli are the predominant bacterial isolates. Many brain abscesses are polymicrobial. The predominant organisms include: Staphylococcus aureus, aerobic and anaerobic streptococci (especially Streptococcus intermedius), Bacteroides, Prevotella, and Fusobacterium species, Enterobacteriaceae, Pseudomonas species, and other anaerobes. Less common organisms include: Haemophilus influenzae, Streptococcus pneumoniae and Neisseria meningitidis. Bacterial abscesses rarely (if ever) arise de novo within the brain, although establishing a cause can be difficult in many cases. There is almost always a primary lesion elsewhere in the body that must be sought assiduously, because failure to treat the primary lesion will result in relapse. In cases of trauma, for example in compound skull fractures where fragments of bone are pushed into the substance of the brain, the cause of the abscess is obvious. Similarly, bullets and other foreign bodies may become sources of infection if left in place.
The location of the primary lesion may be suggested by the location of the abscess: infections of the middle ear result in lesions in the middle and posterior cranial fossae; congenital heart disease with right-to-left shunts often results in abscesses in the distribution of the middle cerebral artery; and infection of the frontal and ethmoid sinuses usually results in collection in the subdural sinuses. Other organisms Fungi and parasites may also cause the disease; they are especially associated with immunocompromised patients. Other causes include: Nocardia asteroides, Mycobacterium, Fungi (e.g. Aspergillus, Candida, Cryptococcus, Mucorales, Coccidioides, Histoplasma capsulatum, Blastomyces dermatitidis, Bipolaris, Exophiala dermatitidis, Curvularia pallescens, Ochroconis gallopava, Ramichloridium mackenziei, Pseudallescheria boydii), Protozoa (e.g. Toxoplasma gondii, Entamoeba histolytica, Trypanosoma cruzi, Schistosoma, Paragonimus), and Helminths (e.g. Taenia solium). Organisms that are most frequently associated with brain abscess in patients with AIDS are poliovirus, Toxoplasma gondii, and Cryptococcus neoformans, though in infection with the latter organism, symptoms of meningitis generally predominate. These organisms are associated with certain predisposing conditions:
Sinus and dental infections—Aerobic and anaerobic streptococci, anaerobic gram-negative bacilli (e.g. Prevotella, Porphyromonas, Bacteroides), Fusobacterium, S. aureus, and Enterobacteriaceae
Penetrating trauma—S. aureus, aerobic streptococci, Enterobacteriaceae, and Clostridium spp.
Pulmonary infections—Aerobic and anaerobic streptococci, anaerobic gram-negative bacilli (e.g. Prevotella, Porphyromonas, Bacteroides), Fusobacterium, Actinomyces, and Nocardia
Congenital heart disease—Aerobic and microaerophilic streptococci, and S. aureus
HIV infection—T. gondii, Mycobacterium, Nocardia, Cryptococcus, and Listeria monocytogenes
Transplantation—Aspergillus, Candida, Cryptococcus, Mucorales, Nocardia, and T. gondii
Neutropenia—Aerobic gram-negative bacilli, Aspergillus, Candida, and Mucorales
Diagnosis The diagnosis is established by a computed tomography (CT) (with contrast) examination. At the initial phase of the inflammation (which is referred to as cerebritis), the immature lesion does not have a capsule and it may be difficult to distinguish it from other space-occupying lesions or infarcts of the brain.
Within 4–5 days the inflammation and the concomitant dead brain tissue are surrounded by a capsule, which gives the lesion its characteristic ring-enhancing appearance on CT examination with contrast (since intravenously applied contrast material cannot pass through the capsule, it collects around the lesion and appears as a ring surrounding the relatively dark lesion). A lumbar puncture, which is performed in many infectious disorders of the central nervous system, is contraindicated in this condition (as it is in all space-occupying lesions of the brain) because removing a certain portion of the cerebrospinal fluid may alter the delicate intracranial pressure balance and cause the brain tissue to move across structures within the skull (brain herniation). Ring enhancement may also be observed in cerebral hemorrhages (bleeding) and some brain tumors. However, in the presence of a rapidly progressive course with fever, focal neurologic findings (hemiparesis, aphasia etc.) and signs of increased intracranial pressure, the most likely diagnosis is a brain abscess. Treatment The treatment includes lowering the increased intracranial pressure and starting intravenous antibiotics (and meanwhile identifying the causative organism, mainly by blood culture studies). Hyperbaric oxygen therapy (HBO2 or HBOT) is indicated as a primary and adjunct treatment which provides four primary functions. Firstly, HBOT
Trintignant at the time was married to actress Stéphane Audran. Bardot and Vadim divorced in 1957; they had no children together, but remained in touch, and even collaborated on later projects. The stated reason for the divorce was Bardot's affairs with two other men. Bardot and Trintignant lived together for about two years, spanning the period before and after Bardot's divorce from Vadim, but they never married. Their relationship was complicated by Trintignant's frequent absence due to military service and Bardot's affair with musician Gilbert Bécaud. After her separation from Vadim, Bardot acquired a historic property dating from the 16th century, called Le Castelet, in Cannes. The fourteen-bedroom villa, surrounded by lush gardens, olive trees, and vineyards, consisted of several buildings. In 1958, she bought a second property called La Madrague, located in Saint-Tropez. In early 1958, her break-up with Trintignant was followed in quick succession by a nervous breakdown in Italy, according to newspaper reports. A suicide attempt with sleeping pills two days earlier was also reported but was denied by her public relations manager. She recovered within weeks and began a relationship with actor Jacques Charrier. She became pregnant well before they were married on 18 June 1959. Bardot's only child, her son Nicolas-Jacques Charrier, was born on 11 January 1960. Bardot had an affair with Glenn Ford in the early 1960s. After she and Charrier divorced in 1962, Nicolas was raised in the Charrier family and had little contact with his biological mother until his adulthood. Sami Frey was cited as the reason for her divorce from Charrier. Bardot was enamoured of Frey, but he quickly left her. From 1963 to 1965, she lived with musician Bob Zagury. Bardot's third marriage was to German millionaire playboy Gunter Sachs, lasting from 14 July 1966 to 7 October 1969, though they had separated the previous year.
While she was on the set of Shalako, she rejected Sean Connery's advances; she said, "It didn't last long because I wasn't a James Bond girl! I have never succumbed to his charm!" In 1968, she began dating Patrick Gilles, who co-starred with her in The Bear and the Doll (1970), but she ended their relationship in spring 1971. Over the next few years, Bardot dated in succession bartender/ski instructor Christian Kalt, club owner Luigi Rizzi, singer Serge Gainsbourg, writer John Gilmore, actor Warren Beatty, and Laurent Vergez, her co-star in Don Juan, or If Don Juan Were a Woman. In 1974, Bardot appeared in a nude photo shoot in Playboy magazine, which celebrated her 40th birthday. In 1975, she entered a relationship with artist Miroslav Brozek and posed for some of his sculptures. Brozek was also an actor; his stage name is . The couple lived together at La Madrague. They separated in December 1979. From 1980 to 1985, Bardot had a live-in relationship with French TV producer . On 28 September 1983, her 49th birthday, Bardot took an overdose of sleeping pills or tranquilizers with red wine. She was rushed to the hospital, where her life was saved after a stomach pump was used to remove the pills from her body. Bardot was diagnosed with breast cancer in 1984. She refused to undergo chemotherapy and decided to have only radiation therapy. She recovered in 1986. Bardot's fourth and current husband is Bernard d'Ormale; they have been married since 16 August 1992. In 2018, in an interview with Le Journal du Dimanche, she denied rumors of relationships with Johnny Hallyday, Jimi Hendrix, and Mick Jagger. Politics and legal issues Bardot expressed support for President Charles de Gaulle in the 1960s. In her 1999 book Le Carré de Pluton (Pluto's Square), Bardot criticizes the procedure used in the ritual slaughter of sheep during the Muslim festival of Eid al-Adha.
Additionally, in a section in the book entitled "Open Letter to My Lost France", she writes that "my country, France, my homeland, my land is again invaded by an overpopulation of foreigners, especially Muslims". For this comment, a French court fined her 30,000 francs ( US dollars) in June 2000. She had been fined in 1997 for the original publication of this open letter in Le Figaro and again in 1998 for making similar remarks. In her 2003 book, Un cri dans le silence (A Scream in the Silence), she contrasted her close gay friends with homosexuals who "jiggle their bottoms, put their little fingers in the air and with their little castrato voices moan about what those ghastly heteros put them through," and said some contemporary homosexuals behave like "fairground freaks". In her own defence, Bardot wrote in a letter to a French gay magazine: "Apart from my husband — who maybe will cross over one day as well — I am entirely surrounded by homos. For years, they have been my support, my friends, my adopted children, my confidants." In her book, she wrote about issues such as racial mixing, immigration, the role of women in politics, and Islam. The book also contained a section attacking what she called the mixing of genes and praised previous generations who, she said, had given their lives to push out invaders. On 10 June 2004, Bardot was convicted for a fourth time by a French court for inciting racial hatred and fined €5,000 ( US dollars). Bardot denied the racial hatred charge and apologized in court, saying: "I never knowingly wanted to hurt anybody. It is not in my character." In 2008, Bardot was convicted of inciting racial/religious hatred in regard to a letter she wrote, a copy of which she sent to Nicolas Sarkozy when he was Interior Minister of France. The letter stated her objections to Muslims in France ritually slaughtering sheep by slitting their throats without anesthetizing them first. 
She also said, in reference to Muslims, that she was "fed up with being under the thumb of this population which is destroying us, destroying our country and imposing its habits". The trial concluded on 3 June 2008, with a conviction and fine of €15,000 ( US dollars). The prosecutor stated she was tired of charging Bardot with offences related to racial hatred. During the 2008 United States presidential election, she branded the Republican Party vice-presidential candidate Sarah Palin as "stupid" and a "disgrace to women". She criticized the former governor of Alaska for her stance on global warming and gun control. She was also offended by Palin's support for Arctic oil exploration and by her lack of consideration in protecting polar bears. On 13 August 2010, Bardot criticised director Kyle Newman for his plan to make a biographical film on her life. She told him, "Wait until I'm dead before you make a movie about my life!" otherwise "sparks will fly". In 2015, she threatened to sue a Saint-Tropez boutique selling items with her face on them. In 2014, Bardot wrote an open letter demanding the ban in France of shechita, describing it as "ritual sacrifice". In response, the European Jewish Congress released a statement saying "Bardot has once again shown her clear insensitivity for minority groups with the substance and style of her letter...She may well be concerned for the welfare of animals but her longstanding support for the far-Right and for discrimination against minorities in France shows a constant disdain for human rights instead." In 2018, Bardot expressed support for the yellow vests movement. On 19 March 2019, Bardot sent an open letter to the prefect of Réunion in which she accused inhabitants of the Indian Ocean island of animal cruelty and referred to them as "autochthonous who have kept the genes of savages".
In her letter relating to animal abuse and sent through her foundation, she mentioned the "beheadings of goats and billy goats" during festivals, and associated these practices with "reminiscences of cannibalism from past centuries". The public prosecutor filed a lawsuit against her the following day. In June 2021, eighty-six-year-old Bardot was fined €5,000 ( US dollars) by the Arras court for public insults against hunters and their national president Willy Schraen. Initially, she had published a post at the end of 2019 on the Brigitte Bardot Foundation's website, calling hunters "sub-men" and "drunkards" and carriers of "genes of cruel barbarism inherited from our primitive ancestors". Schraen was also insulted. At the time of the hearing, she had not removed the comments from the website. Following her letter sent to the prefect of Réunion in 2019, she was convicted on 4 November 2021 by a French court for public insults and fined €20,000 ( US dollars), the largest of her fines to date. Bardot's husband Bernard d'Ormale is a former adviser to Jean-Marie Le Pen, former leader of the National Front (now National Rally), the main far-right party in France, known for its nationalist beliefs. Bardot expressed support for Marine Le Pen, leader of the National Front (National Rally), calling her "the Joan of Arc of the 21st century". She endorsed Le Pen in the 2012 and 2017 French presidential elections. Legacy The Guardian named Bardot "one of the most iconic faces, models, and actors of the 1950s and 1960s". She has been called a "style icon" and a "muse for Dior, Balmain, and Pierre Cardin". In fashion, the Bardot neckline (a wide-open neck that exposes both shoulders) is named after her. Bardot popularized this style, which is especially used for knitted sweaters or jumpers, although it is also used for other tops and dresses.
Bardot popularized the bikini in her early films such as Manina (1952) (released in France as Manina, la fille sans voiles). The following year she was also photographed in a bikini on every beach in the south of France during the Cannes Film Festival. She gained additional attention when she filmed ...And God Created Woman (1956) with Jean-Louis Trintignant (released in France as Et Dieu créa la femme). In it, Bardot portrays an immoral teenager cavorting in a bikini who seduces men in a respectable small-town setting. The film was an international success. Bardot's image was linked to the shoemaker Repetto, which created a pair of ballerina flats for her in 1956. In the 1950s the bikini was relatively well accepted in France but was still considered risqué in the United States. As late as 1959, Anne Cole, one of the United States' largest swimsuit designers, said, "It's nothing more than a G-string. It's at the razor's edge of decency." Bardot also brought into fashion the choucroute ("sauerkraut") hairstyle (a sort of beehive) and gingham clothes, after wearing a checkered pink dress designed by Jacques Esterel at her wedding to Charrier. She was the subject of an Andy Warhol painting. Isabella Biedenharn of Elle wrote that Bardot "has inspired thousands (millions?) of women to tease their hair or try out winged eyeliner over the past few decades". In a well-known modeling portrait shot around 1960, Bardot poses wearing only a pair of black pantyhose, legs crossed in front of her and arms crossed over her breasts. This pose has been emulated numerous times by models and celebrities such as Lindsay Lohan, Elle Macpherson, Gisele Bündchen, and Rihanna. In the late 1960s, Bardot's silhouette was used as a model for the bust of Marianne, a symbol of the French Republic. In addition to popularizing the bikini swimming suit, Bardot has been credited with popularizing the city of St.
Tropez and the town of Armação dos Búzios in Brazil, which she visited in 1964 with her boyfriend at the time, Brazilian musician Bob Zagury. The place where she stayed in Búzios is today a small hotel, Pousada do Sol, and also a French restaurant, Cigalon. The town hosts a Bardot statue by Christina Motta. Bardot was idolized by the young John Lennon and Paul McCartney. They made plans to shoot a film featuring The Beatles and Bardot, similar to A Hard Day's Night, but the plans were never fulfilled. Lennon's first wife Cynthia Powell lightened her hair colour to more closely resemble Bardot, while George Harrison made comparisons between Bardot and his first wife Pattie Boyd, as Cynthia wrote later in A Twist of Lennon. Lennon and Bardot met in person once, in 1968 at the May Fair Hotel, introduced by Beatles press agent Derek Taylor; a nervous Lennon took LSD before arriving, and neither star impressed the other. Lennon recalled in a memoir: "I was on acid, and she was on her way out." According to the liner notes of his first (self-titled) album, musician Bob Dylan dedicated the first song he ever wrote to Bardot. He also mentioned her by name in "I Shall Be Free", which appeared on his second album, The Freewheelin' Bob Dylan. The first-ever official exhibition spotlighting Bardot's influence and legacy opened in Boulogne-Billancourt on 29 September 2009 – a day after her 75th birthday. The Australian pop group Bardot was named after her. Women who emulated and were inspired by Bardot include Claudia Schiffer, Emmanuelle Béart, Elke Sommer, Kate Moss, Faith Hill, Isabelle Adjani, Diane Kruger, Lara Stone, Kylie Minogue, Amy Winehouse, Georgia May Jagger, Zahia Dehar, Scarlett Johansson, Louise Bourgoin, and Paris Hilton. Bardot said: "None have my personality." Laetitia Casta embodied Bardot in the 2010 French drama film Gainsbourg: A Heroic Life by Joann Sfar. 
In 2011, Los Angeles Times Magazine's list of "50 Most Beautiful Women In Film" ranked her number two. Bardot inspired Nicole Kidman to promote the 2013 campaign shoot of the British brand

A musical, Naughty Girl (1956), in which Bardot played a troublesome schoolgirl, was directed by Michel Boisrond, co-written by Roger Vadim, and was a big hit, the 12th most popular film of the year in France. It was followed by a comedy, Plucking the Daisy (1956), written by Vadim with the director Marc Allégret, another success in France. So too was the comedy The Bride Is Much Too Beautiful (1956) with Louis Jourdan. Finally there was the melodrama And God Created Woman (1956), Vadim's debut as director, with Bardot starring opposite Jean-Louis Trintignant and Curt Jurgens. The film, about an immoral teenager in a respectable small-town setting, was a huge success, not just in France but around the world – it was among the ten most popular films in Britain in 1957. It turned Bardot into an international star. From at least 1956, she was being hailed as a "sex kitten". The film scandalized the United States, and theatre managers were arrested for screening it. During her early career, professional photographer Sam Lévin's photos contributed to the image of Bardot's sensuality. One showed Bardot from behind, dressed in a white corset. British photographer Cornel Lucas made images of Bardot in the 1950s and 1960s that have become representative of her public persona. Bardot followed And God Created Woman with La Parisienne (1957), a comedy co-starring Charles Boyer for director Boisrond. She was reunited with Vadim in another melodrama, The Night Heaven Fell (1958), and played a criminal who seduced Jean Gabin in In Case of Adversity (1958). The latter was the 13th most-seen film of the year in France. In 1958, Bardot became the highest-paid French actress.
The Female (1959) for director Julien Duvivier was popular, while Babette Goes to War (1959), a comedy set in World War II, was a huge hit, the fourth biggest movie of the year in France. Also widely seen was Come Dance with Me (1959) from Boisrond. Her next film was the courtroom drama The Truth (1960), from Henri-Georges Clouzot. It was a highly publicised production, during which Bardot had an affair and attempted suicide. The film was Bardot's biggest ever commercial success in France, the third biggest hit of the year, and was nominated for the Academy Award for Best Foreign Language Film. Bardot was awarded a David di Donatello Award for Best Foreign Actress for her role in the film. She made a comedy with Vadim, Please, Not Now! (1961), and had a role in the all-star anthology Famous Love Affairs (1962). Bardot starred alongside Marcello Mastroianni in a film inspired by her life, A Very Private Affair (Vie privée, 1962), directed by Louis Malle. More popular in France was Love on a Pillow (1962), another for Vadim.

International films and singing career: 1962–1968

In the mid-1960s, Bardot made films that seemed to be aimed more at the international market. She starred in Jean-Luc Godard's film Le Mépris (1963), produced by Joseph E. Levine and co-starring Jack Palance. The following year she co-starred with Anthony Perkins in the comedy Une ravissante idiote (1964). Dear Brigitte (1965), Bardot's first Hollywood film, was a comedy starring James Stewart as an academic whose son develops a crush on Bardot. Bardot's appearance was relatively brief and the film was not a big hit. More successful was the Western buddy comedy Viva Maria! (1965) for director Louis Malle, in which she appeared opposite Jeanne Moreau. It was a big hit in France and worldwide, although it did not break through in the US as much as had been hoped.
After a cameo in Godard's Masculin Féminin (1966), she had her first outright flop in some years, Two Weeks in September (1968), a French–English co-production. She had a small role in the all-star Spirits of the Dead (1968), acting opposite Alain Delon, then tried a Hollywood film again: Shalako (1968), a Western starring Sean Connery, which was a box-office disappointment. She participated in several musical shows and recorded many popular songs in the 1960s and 1970s, mostly in collaboration with Serge Gainsbourg, Bob Zagury and Sacha Distel, including "Harley Davidson"; "Je Me Donne À Qui Me Plaît"; "Bubble Gum"; "Contact"; "Je Reviendrai Toujours Vers Toi"; "L'Appareil À Sous"; "La Madrague"; "On Déménage"; "Sidonie"; "Tu Veux, Ou Tu Veux Pas?"; "Le Soleil De Ma Vie" (a cover of Stevie Wonder's "You Are the Sunshine of My Life"); and "Je t'aime... moi non plus". Bardot pleaded with Gainsbourg not to release this duet, and he complied with her wishes; the following year, he rerecorded a version with British-born model and actress Jane Birkin that became a massive hit all over Europe. The version with Bardot was issued in 1986 and became a popular download hit in 2006 when Universal Music made its back catalogue available to purchase online, with this version of the song ranking as the third most popular download.

Final films: 1969–1973

From 1969 to 1978, Bardot was the official face of Marianne (who had previously been anonymous), representing the liberty of France. Les Femmes (1969) was a flop, although the screwball comedy The Bear and the Doll (1970) performed better. Her last few films were mostly comedies: Les Novices (1970) and Boulevard du Rhum (1971, with Lino Ventura). The Legend of Frenchie King (1971) was more popular, helped by Bardot co-starring with Claudia Cardinale. She made one more film with Vadim, Don Juan, or If Don Juan Were a Woman (1973), playing the title role.
Vadim said of the film: "Underneath what people call 'the Bardot myth' was something interesting, even though she was never considered the most professional actress in the world. For years, since she has been growing older, and the Bardot myth has become just a souvenir... I was curious in her as a woman and I had to get to the end of something with her, to get out of her and express many things I felt were in her. Brigitte always gave the impression of sexual freedom – she is a completely open and free person, without any aggression. So I gave her the part of a man – that amused me". "If Don Juan is not my last movie it will be my next to last", said Bardot during filming. She kept her word and made only one more film, The Edifying and Joyous Story of Colinot (1973). In 1973, Bardot announced she was retiring from acting as "a way to get out elegantly".

Animal rights activism

After appearing in more than 40 motion pictures and recording several music albums, Bardot used her fame to promote animal rights. In 1986, she established the Brigitte Bardot Foundation for the Welfare and Protection of Animals. She became a vegetarian and raised three million francs to fund the foundation by auctioning off jewellery and personal belongings. She once had a neighbour's donkey castrated while looking after it, on the grounds of its "sexual harassment" of her own donkey and mare, for which she was taken to court by the donkey's owner in 1989. In a 1999 letter to Chinese President Jiang Zemin, published in the French magazine VSD, Bardot accused the Chinese of "torturing bears and killing the world's last tigers and rhinos to make aphrodisiacs". She has donated more than $140,000 over two years for a mass sterilization and adoption program for Bucharest's stray dogs, estimated to number 300,000. Bardot is a strong animal rights activist and a major opponent of the consumption of horse meat.
In support of animal protection, she condemned seal hunting in Canada during a visit to that country with Paul Watson of the Sea Shepherd Conservation Society. In August 2010, Bardot addressed a letter to Queen Margrethe II of Denmark, appealing for the sovereign to halt the killing of dolphins in the Faroe Islands. In the letter, Bardot describes the activity as a "macabre spectacle" that "is a shame for Denmark and the Faroe Islands ... This is not a hunt but a mass slaughter ... an outmoded tradition that has no acceptable justification in today's world". On 22 April 2011, French culture minister Frédéric Mitterrand officially included bullfighting in the country's cultural heritage; Bardot wrote him a highly critical letter of protest. On 25 May 2011, the Sea Shepherd Conservation Society renamed its fast interceptor vessel MV Gojira as MV Brigitte Bardot in appreciation of her support. From 2013 onwards, the Brigitte Bardot Foundation, in collaboration with the Kagyupa International Monlam Trust of India, has operated an annual veterinary care camp, committing to the cause of animal welfare in Bodhgaya year after year. On 23 July 2015, Bardot condemned Australian politician Greg Hunt's plan to eradicate 2 million cats to save endangered species such as the warru and the night parrot.

Personal life

Marriages and relationships

Throughout her life, Bardot had seventeen relationships with men and was married four times. Bardot would leave for another relationship when "the present was getting lukewarm"; she said, "I have always looked for passion. That's why I was often unfaithful. And when the passion was coming to an end, I was packing my suitcase". On 20 December 1952, aged 18, Bardot married director Roger Vadim. In 1956, she became romantically involved with Jean-Louis Trintignant, who was her co-star in And God Created Woman. Trintignant at the time was married to actress Stéphane Audran.
Bardot and Vadim divorced in 1957; they had no children together, but remained in touch and even collaborated on later projects. The stated reason for the divorce was Bardot's affairs with two other men. Bardot and Trintignant lived together for about two years, spanning the period before and after Bardot's divorce from Vadim, but they never married. Their relationship was complicated by Trintignant's frequent absence due to military service and Bardot's affair with musician Gilbert Bécaud. After her separation from Vadim, Bardot acquired a historic 16th-century property called Le Castelet, in Cannes. The fourteen-bedroom villa, surrounded by lush gardens, olive trees, and vineyards, consisted of several buildings. In 1958, she bought a second property, La Madrague, located in Saint-Tropez. In early 1958, her break-up with Trintignant was followed in quick succession by a nervous breakdown in Italy, according to newspaper reports. A suicide attempt with sleeping pills two days earlier was also reported but was denied by her public relations manager. She recovered within weeks and began a relationship with actor Jacques Charrier. She became pregnant well before they were married on 18 June 1959. Bardot's only child, her son Nicolas-Jacques Charrier, was born on 11 January 1960. Bardot had an affair with Glenn Ford in the early 1960s. After she and Charrier divorced in 1962, Nicolas was raised in the Charrier family and had little contact with his biological mother until adulthood. Sami Frey was cited as the reason for her divorce from Charrier. Bardot was enamoured of Frey, but he quickly left her. From 1963 to 1965, she lived with musician Bob Zagury. Bardot's third marriage was to German millionaire playboy Gunter Sachs, lasting from 14 July 1966 to 7 October 1969, though they had separated the previous year.
While on the set of Shalako, she rejected Sean Connery's advances; she said, "It didn't last long because I wasn't a James Bond girl! I have never succumbed to his charm!" In 1968, she began dating Patrick Gilles, who co-starred with her in The Bear and the Doll (1970); she ended their relationship in spring 1971. Over the next few years, Bardot dated in succession bartender and ski instructor Christian Kalt, club owner Luigi Rizzi, singer Serge Gainsbourg, writer John Gilmore, actor Warren Beatty, and Laurent Vergez, her co-star in Don Juan, or If Don Juan Were a Woman. In 1974, Bardot appeared in a nude photo shoot in Playboy magazine, which celebrated her 40th birthday. In 1975, she entered a relationship with artist Miroslav Brozek, who also acted under a stage name, and posed for some of his sculptures. The couple lived together at La Madrague. They separated in December 1979. From 1980 to 1985, Bardot had a live-in relationship with a French TV producer. On 28 September 1983, her 49th birthday, Bardot took an overdose of sleeping pills or tranquilizers with red wine. She had to be rushed to hospital, where her life was saved after a stomach pump was used to remove the pills from her body. Bardot was diagnosed with breast cancer in 1984. She refused to undergo chemotherapy and decided on radiation therapy alone. She recovered in 1986. Bardot's fourth and current husband is Bernard d'Ormale; they have been married since 16 August 1992. In 2018, in an interview given to Le Journal du Dimanche, she denied rumors of relationships with Johnny Hallyday, Jimi Hendrix, and Mick Jagger.

Politics and legal issues

Bardot expressed support for President Charles de Gaulle in the 1960s. In her 1999 book Le Carré de Pluton (Pluto's Square), Bardot criticizes the procedure used in the ritual slaughter of sheep during the Muslim festival of Eid al-Adha.
Additionally, in a section of the book entitled "Open Letter to My Lost France", she writes that "my country, France, my homeland, my land is again invaded by an overpopulation of foreigners, especially Muslims". For this comment, a French court fined her 30,000 francs in June 2000. She had been fined in 1997 for the original publication of this open letter in Le Figaro, and again in 1998 for making similar remarks. In her 2003 book, Un cri dans le silence (A Scream in the Silence), she contrasted her close gay friends with homosexuals who "jiggle their bottoms, put their little fingers in the air and with their little castrato voices moan about what those ghastly heteros put them through", and said some contemporary homosexuals behave like "fairground freaks". In her own defence, Bardot wrote in a letter to a French gay magazine: "Apart from my husband — who maybe will cross over one day as well — I am entirely surrounded by homos. For years, they have been my support, my friends, my adopted children, my confidants." In the book, she wrote about issues such as racial mixing, immigration, the role of women in politics, and Islam. The book also contained a section attacking what she called the mixing of genes, and praised previous generations who, she said, had given their lives to push out invaders. On 10 June 2004, Bardot was convicted for a fourth time by a French court for inciting racial hatred and fined €5,000. Bardot denied the racial hatred charge and apologized in court, saying: "I never knowingly wanted to hurt anybody. It is not in my character." In 2008, Bardot was convicted of inciting racial or religious hatred over a letter she wrote, a copy of which she sent to Nicolas Sarkozy when he was Interior Minister of France. The letter stated her objections to Muslims in France ritually slaughtering sheep by slitting their throats without anesthetizing them first.
The plectrum banjo is played with a guitar-style pick (that is, a single one held between thumb and forefinger), unlike the five-string banjo, which is played either with a thumbpick and two fingerpicks or with bare fingers. The plectrum banjo evolved out of the five-string banjo, to cater to styles of music involving strummed chords. The plectrum is also featured in many early jazz recordings and arrangements. Four-string banjos can be used for chordal accompaniment (as in early jazz), for single-string melody playing (as in Irish traditional music), in "chord melody" style (a succession of chords in which the highest notes carry the melody), in tremolo style (both on chords and single strings), and in a mixed technique called duo style that combines single-string tremolo and rhythm chords. Four-string banjos are used from time to time in musical theater. Examples include: Hello, Dolly!, Mame, Chicago, Cabaret, Oklahoma!, Half a Sixpence, Annie, Barnum, The Threepenny Opera, Monty Python's Spamalot, and countless others. Joe Raposo used it variably in the imaginative seven-piece orchestration for the long-running TV show Sesame Street, sometimes overdubbed with itself or an electric guitar. The banjo is still (albeit rarely) in use in the show's arrangements.

Tenor banjo

The shorter-necked tenor banjo, with 17 ("short-scale") or 19 frets, is also typically played with a plectrum. It became a popular instrument after about 1910. Early models used for melodic picking typically had 17 frets on the neck and a scale length of 19 to 21 inches. By the mid-1920s, when the instrument was used primarily for strummed chordal accompaniment, 19-fret necks with a scale length of 21 to 23 inches became standard. The usual tuning is the all-fifths tuning C3 G3 D4 A4, in which exactly seven semitones (a perfect fifth) separate the open notes of consecutive strings; this is identical to the tuning of a viola.
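The interval arithmetic is easy to check. As a small sketch of mine (not from the article), converting scientific pitch names to MIDI note numbers shows that each pair of adjacent open strings in the C3 G3 D4 A4 tuning is exactly seven semitones apart:

```python
# Sketch (assumptions mine): verify the tenor banjo's all-fifths tuning
# C3 G3 D4 A4 by converting scientific pitch notation to MIDI numbers.
NOTE_OFFSETS = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}

def midi(note: str) -> int:
    """'C3' -> 48 under the MIDI convention that C-1 is note 0."""
    return 12 * (int(note[-1]) + 1) + NOTE_OFFSETS[note[:-1]]

tenor = ["C3", "G3", "D4", "A4"]  # low to high, same as a viola
intervals = [midi(hi) - midi(lo) for lo, hi in zip(tenor, tenor[1:])]
print(intervals)  # [7, 7, 7] – a perfect fifth between every pair
```

The same function shows why the Irish tuning discussed next works: G2 D3 A3 E4 is also stacked fifths, each string exactly twelve semitones below the fiddle's G3 D4 A4 E5, which is what lets a banjoist reuse fiddle and mandolin fingering.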
Other players (particularly in Irish traditional music) tune the banjo G2 D3 A3 E4, like an octave mandolin, which lets the banjoist duplicate fiddle and mandolin fingering. The popularization of this tuning is usually attributed to the late Barney McKenna, banjoist with The Dubliners. Fingerstyle on tenor banjo retuned to open-G tuning dgd'g', or to the lower open-D tuning Adad' (three-finger picking, frailing), has been explored by Mirek Patek. The tenor banjo was a common rhythm instrument in early 20th-century dance bands. Its volume and timbre suited early jazz (and jazz-influenced popular music styles), and it could both compete with other instruments (such as brass instruments and saxophones) and be heard clearly on acoustic recordings. George Gershwin's Rhapsody in Blue, in Ferde Grofé's original jazz-orchestra arrangement, includes tenor banjo, with widely spaced chords not easily playable on plectrum banjo in its conventional tunings. With the development of the archtop and electric guitar, the tenor banjo largely disappeared from jazz and popular music, though it kept its place in traditional "Dixieland" jazz. Some 1920s Irish banjo players picked out the melodies of jigs, reels, and hornpipes on tenor banjos, decorating the tunes with snappy triplet ornaments. The most important Irish banjo player of this era was Mike Flanagan of the New York-based Flanagan Brothers, one of the most popular Irish-American groups of the day. Other pre-WWII Irish banjo players included Neil Nolan, who recorded with Dan Sullivan's Shamrock Band in Boston, and Jimmy McDade, who recorded with the Four Provinces Orchestra in Philadelphia. Meanwhile, in Ireland, the rise of céilí bands provided a new market for a loud instrument like the tenor banjo. Use of the tenor banjo in Irish music has increased greatly since the folk revival of the 1960s.

Six-string banjos

The six-string banjo began as a British innovation by William Temlett, one of England's earliest banjo makers.
He opened a shop in London in 1846, and sold seven-string banjos which he marketed as "zither" banjos from his 1869 patent. A zither banjo usually has a closed back and sides, with the drum body and skin-tensioning system suspended inside the wooden rim, the neck and string tailpiece mounted on the outside of the rim, and the drone string led through a tube in the neck so that the tuning peg can be mounted on the head. They were often made by builders who used guitar tuners that came in banks of three, so five-stringed instruments had a redundant tuner; such banjos could be converted to six-string instruments fairly easily. American Alfred Davis Cammeyer (1862–1949), a young violinist turned concert banjo player, devised the six-string zither banjo around 1880. British opera diva Adelina Patti advised Cammeyer that the zither banjo might be popular with English audiences as it had been invented there, and Cammeyer went to London in 1888. With his virtuoso playing, he helped show that banjos could make more sophisticated music than normally played by blackface minstrels. He was soon performing for London society, where he met Sir Arthur Sullivan, who recommended that Cammeyer progress from arranging the music of others for banjo to composing his own music. Modern six-string bluegrass banjos have also been made. These add a bass string between the lowest string and the drone string of a five-string banjo, and are usually tuned G4 G2 D3 G3 B3 D4. Sonny Osborne played one of these instruments for several years; it was modified by luthier Rual Yarbrough from a Vega five-string model, and a picture of Sonny with this banjo appears in Pete Wernick's Bluegrass Banjo method book. Six-string banjos known as banjo guitars basically consist of a six-string guitar neck attached to a bluegrass or plectrum banjo body, which allows players who have learned the guitar to play a banjo sound without having to relearn fingerings. This was the instrument of the early jazz great Johnny St.
Cyr, jazzmen Django Reinhardt, Danny Barker, Papa Charlie Jackson and Clancy Hayes, as well as the blues and gospel singer Reverend Gary Davis. Today, musicians as diverse as Keith Urban, Rod Stewart, Taj Mahal, Joe Satriani, David Hidalgo, Larry Lalonde and Doc Watson play the six-string guitar banjo. They have become increasingly popular since the mid-1990s.

Other banjos

Low banjos

In the late 19th and early 20th centuries, plucked-string instrument ensembles – guitar orchestras, mandolin orchestras, banjo orchestras – were in vogue, with instrumentation made to parallel that of the string section in symphony orchestras. Thus, "violin, viola, 'cello, bass" became "mandolin, mandola, mandocello, mandobass", or in the case of banjos, "banjolin, banjola, banjo cello, bass banjo". Because the range of plucked-string instruments is generally not as great as that of comparably sized bowed-string instruments, other instruments were often added to these plucked orchestras to extend the range of the ensemble upwards and downwards. The banjo cello was normally tuned C2-G2-D3-A3, one octave below the tenor banjo, like the cello and mandocello. A five-string cello banjo, set up like a bluegrass banjo (with the short fifth string) but tuned one octave lower, has been produced by the Gold Tone company. Bass banjos have been produced in both upright-bass formats and with standard, horizontally carried banjo bodies. Contrabass banjos with either three or four strings have also been made; some of these had headstocks similar to those of bass violins. Tuning varies on these large instruments, with four-string models sometimes tuned in fourths like a bass violin (E1-A1-D2-G2) and sometimes in fifths, like a four-string cello banjo one octave lower (C1-G1-D2-A2).

Banjo hybrids and variants

A number of hybrid instruments exist, crossing the banjo with other stringed instruments.
Most of these use the body of a banjo, often with a resonator, and the neck of the other instrument. Examples include the banjo mandolin (first patented in 1882) and the banjo ukulele, most famously played by the English comedian George Formby. These were especially popular in the early decades of the 20th century, and were probably a result of a desire either to allow players of other instruments to jump on the banjo bandwagon at the height of its popularity, or to get the natural amplification benefits of the banjo resonator in an age before electric amplification. Conversely, the tenor and plectrum guitars use the respective banjo necks on guitar bodies. They arose in the early 20th century as a way for banjo players to double on guitar without having to relearn the instrument entirely. Instruments that have a five-string banjo neck on a wooden body (for example, a guitar, bouzouki, or dobro body) have also been made, such as the banjola. A 20th-century Turkish instrument similar to the banjo is called the cümbüş, which combines a banjo-like resonator with a neck derived from an oud. At the end of the 20th century, a development of the five-string banjo was the BanSitar. This features a bone bridge, giving the instrument a sitar-like resonance. The Brazilian samba banjo is basically a cavaquinho neck on a banjo body, producing a louder sound than the cavaquinho. It is tuned the same as the top four strings of a five-string banjo up an octave (or in any cavaquinho tuning).

Notable banjoists

Vess Ossman (1868–1923) was a leading five-string banjoist whose career spanned the late 19th and early 20th centuries. Ossman started playing banjo at the age of 12. He was a popular recording artist – in fact, one of the first recording artists ever, active when audio recording first became commercially available. He formed various recording groups, his most popular being the Ossman-Dudley trio.
Joel Sweeney (1810–1860), also known as Joe Sweeney, was a musician and early blackface minstrel performer. He is known for popularizing the playing of the banjo and has often been credited with advancing the physical development of the modern five-string banjo. Fred Van Eps (1878–1960) was a noted five-string player and banjo maker who learned to play from listening to cylinder recordings of Vess Ossman. He recorded for Edison's company, producing some of the earliest disk recordings, and also the earliest ragtime recordings in any medium other than player piano. Uncle Dave Macon (1870–1952) was a banjo player and comedian from Tennessee known for his "plug hat, gold teeth, chin whiskers, gates ajar collar and that million dollar Tennessee smile". Gid Tanner (1885–1960) was a notable banjo player, active mostly in the 1920s, who led the band the Skillet Lickers alongside Riley Puckett. Eddie Peabody (1902–1970) was a great proponent of the plectrum banjo who performed for nearly five decades (1920–1968) and left a considerable legacy of recordings. An early reviewer dubbed him "King of the Banjo", and his was a household name for decades. He went on to develop new instruments, produce records, and appear in movies. Frank Lawes (1894–1970), of the United Kingdom, developed a unique fingerstyle technique on the four-string plectrum instrument and was a prolific composer of four-string banjo music, much of which is still performed and recorded today. Harry Reser (1896–1965), who played plectrum and tenor banjo, was regarded by some as the best tenor banjoist of the 1920s. He wrote a large number of works for tenor banjo, as well as instructional material, authoring numerous banjo method books and over a dozen other instrumental method books (for guitar, ukulele, mandolin, etc.), and was well known in the banjo community.
Reser's accomplished single-string and "chord melody" technique set a "high mark" that many later players aspired to.

Pete Seeger's How to Play the 5-String Banjo was for years the only banjo method on the market. He was followed by a movement of folk musicians, such as Dave Guard of the Kingston Trio and Erik Darling of the Weavers and the Tarriers. Earl Scruggs was seen both as a legend and a "contemporary musical innovator" who gave his name to his style of playing, the Scruggs style. Scruggs played the banjo "with heretofore unheard of speed and dexterity," using a picking technique for the five-string banjo that he perfected from two-finger and three-finger picking techniques in rural North Carolina. His playing reached Americans through the Grand Ole Opry and, through the theme music of The Beverly Hillbillies, entered the living rooms of Americans who did not listen to country or bluegrass music. Over the last one hundred years, the tenor banjo has become an intrinsic part of the world of Irish traditional music, though it is a relative newcomer to the genre.

Technique

Two techniques closely associated with the five-string banjo are rolls and drones. Rolls are right-hand accompanimental fingering patterns consisting of eight eighth notes that subdivide each measure. Drone notes are quick notes (typically eighth notes), usually played on the fifth (short) string, that fill in around the melody notes. These techniques are idiomatic to the banjo in all styles, and their sound is characteristic of bluegrass. Historically, the banjo was played in the clawhammer style by the Africans who brought their version of the instrument with them. Several other styles of play developed from this. Clawhammer consists of downward striking of one or more of the four main strings with the index finger, middle finger, or both, while the drone or fifth string is played with a "lifting" (as opposed to downward-plucking) motion of the thumb.
The notes sounded by the thumb in this fashion typically fall on the off-beat. Melodies can be quite intricate, adding techniques such as double-thumbing and drop-thumb. In old-time Appalachian Mountain music, a style called two-finger up-picking is also used, along with a three-finger version that Earl Scruggs developed into "Scruggs style" picking, nationally aired in 1945 on the Grand Ole Opry. While five-string banjos are traditionally played with either fingerpicks or the fingers themselves, tenor banjos and plectrum banjos are played with a pick, either to strum full chords or, most commonly in Irish traditional music, to play single-note melodies. Modern forms The modern banjo comes in a variety of forms, including four- and five-string versions. A six-string version, tuned and played similarly to a guitar, has gained popularity. In almost all of its forms, banjo playing is characterized by a fast arpeggiated plucking, though many different playing styles exist. The body, or "pot", of a modern banjo typically consists of a circular rim (generally made of wood, though metal was also common on older banjos) and a tensioned head, similar to a drum head. Traditionally, the head was made from animal skin, but today it is often made of various synthetic materials. Most modern banjos also have a metal "tone ring" assembly that helps further clarify and project the sound, but many older banjos do not include a tone ring. The banjo is usually tuned with friction tuning pegs or planetary gear tuners, rather than the worm gear machine head used on guitars. Frets have become standard since the late 19th century, though fretless banjos are still manufactured and played by those wishing to execute glissando, play quarter tones, or otherwise achieve the sound and feeling of early playing styles. Modern banjos are typically strung with metal strings. Usually, the fourth string is wound with either steel or bronze-phosphor alloy.
Some players may string their banjos with nylon or gut strings to achieve a more mellow, old-time tone. Some banjos have a separate resonator plate on the back of the pot to project the sound forward and give the instrument more volume. This type of banjo is usually used in bluegrass music, though resonator banjos are played by players of all styles, and are also used in old-time, sometimes as a substitute for electric amplification when playing in large venues. Open-back banjos generally have a mellower tone and weigh less than resonator banjos. They usually have a different setup than a resonator banjo, often with a higher string action. Five-string banjo The modern five-string banjo is a variation on Sweeney's original design. The fifth string is usually the same gauge as the first, but starts from the fifth fret, three-quarters the length of the other strings. This lets the string be tuned to a higher open pitch than possible for the full-length strings. Because of the short fifth string, the five-string banjo uses a reentrant tuning – the string pitches do not proceed lowest to highest across the fingerboard. Instead, the fourth string is lowest, then third, second, first, and the fifth string is highest. The short fifth string presents special problems for a capo. For small changes (going up or down one or two semitones, for example), simply retuning the fifth string is possible. Otherwise, various devices called "fifth-string capos" effectively shorten the vibrating part of the string. Many banjo players use model-railroad spikes or titanium spikes (usually installed at the seventh fret and sometimes at others), under which they hook the string to press it down on the fret. Five-string banjo players use many tunings. (Tunings are given in left-to-right order, as viewed from the front of the instrument with the neck pointing up.) Probably the most common, particularly in bluegrass, is the Open-G tuning G4 D3 G3 B3 D4.
In earlier times, the tuning G4 C3 G3 B3 D4 was commonly used instead, and this is still the preferred tuning for some types of folk music and for classic banjo. Other tunings found in old-time music include double C (G4 C3 G3 C4 D4), "sawmill" (G4 D3 G3 C4 D4), also called "mountain modal", and open D (F#4 D3 F#3 A3 D4). These tunings are often taken up a tone, either by tuning up or using a capo. For example, "double-D" tuning (A4 D3 A3 D4 E4) – commonly reached by tuning up from double C – is often played to accompany fiddle tunes in the key of D, and Open-A (A4 E3 A3 C#4 E4) is usually used for playing tunes in the key of A. Dozens of other banjo tunings are used, mostly in old-time music. These tunings are used to make playing specific tunes easier, usually fiddle tunes or groups of fiddle tunes. The size of the five-string banjo is largely standardized, but smaller and larger sizes exist, including the long-neck or "Seeger neck" variation designed by Pete Seeger. Petite variations on the five-string banjo have been available since the 1890s. S.S. Stewart introduced the banjeaurine, tuned one fourth above a standard five-string. Piccolo banjos are smaller, and tuned one octave above a standard banjo. Between these sizes and standard lies the A-scale banjo, which is two frets shorter and usually tuned one full step above standard tunings. Many makers have produced banjos of other scale lengths, and with various innovations. American old-time music typically uses the five-string, open-back banjo. It is played in a number of different styles, the most common being clawhammer or frailing, characterized by the use of a downward rather than upward stroke when striking the strings with a fingernail. Frailing techniques use the thumb to catch the fifth string for a drone after most strums or after each stroke ("double thumbing"), or to pick out additional melody notes in what is known as drop-thumb.
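The tunings described above lend themselves to simple interval arithmetic. As a sketch (the note-to-MIDI helper and tuning table are illustrative, not from the article), the following models a few five-string tunings as scientific pitch names, checks the reentrant property of Open G, and confirms that "double D" is double C raised by a whole tone (two semitones):

```python
# Map pitch-class names to semitone offsets within an octave.
NOTE_OFFSETS = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
                "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11}

def midi(note: str) -> int:
    """Convert a name like 'F#4' to a MIDI note number (C4 = 60)."""
    name, octave = note[:-1], int(note[-1])
    return 12 * (octave + 1) + NOTE_OFFSETS[name]

# Tunings listed fifth string first, as in the text.
TUNINGS = {
    "open G":   ["G4", "D3", "G3", "B3", "D4"],
    "double C": ["G4", "C3", "G3", "C4", "D4"],
    "sawmill":  ["G4", "D3", "G3", "C4", "D4"],
    "double D": ["A4", "D3", "A3", "D4", "E4"],
}

# Reentrant tuning: the short fifth string is the highest pitch even though
# it sits before the lowest-pitched (fourth) string.
g = [midi(n) for n in TUNINGS["open G"]]
assert g[0] == max(g)   # fifth string is highest
assert g[1] == min(g)   # fourth string is lowest

# "Double D" is double C taken up a tone, as done for fiddle tunes in D.
c = [midi(n) for n in TUNINGS["double C"]]
d = [midi(n) for n in TUNINGS["double D"]]
assert all(hi - lo == 2 for hi, lo in zip(d, c))
```

This also makes explicit why a standard capo alone does not suffice: raising all five strings by the same offset includes the short fifth string, which on a real instrument must be retuned or spiked separately.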
Pete Seeger popularized a folk style by combining clawhammer with up picking, usually without the use of fingerpicks. Another common style of old-time banjo playing is fingerpicking banjo or classic banjo. This style is based upon parlor-style guitar. Bluegrass music, which uses the five-string resonator banjo almost exclusively, is played in several common styles. These include Scruggs style, named after Earl Scruggs; melodic, or Keith style, named for Bill Keith; and three-finger style with single-string work, also called Reno style after Don Reno. In these styles, the emphasis is on arpeggiated figures played in a continuous eighth-note rhythm, known as rolls. All of these styles are typically played with fingerpicks. The first five-string, electric, solid-body banjo was developed by Charles Wilburn (Buck) Trent, Harold "Shot" Jackson, and David Jackson in 1960. The five-string banjo has been used in classical music since before the turn of the 20th century. Contemporary and modern works have been written or arranged for the instrument by Jerry Garcia, Buck Trent, Béla Fleck, Tony Trischka, Ralph Stanley, Steve Martin, George Crumb, Modest Mouse, Jo Kondo, Paul Elwood, Hans Werner Henze (notably in his Sixth Symphony), Daniel Mason of Hank Williams III's Damn Band, Beck, the Water Tower Bucket Boys, Todd Taylor, J.P. Pickens, Peggy Honeywell, Norfolk & Western, Putnam Smith, Iron & Wine, The Avett Brothers, The Well Pennies, Punch Brothers, Julian Koster, Sufjan Stevens, Sarah Jarosz, and sisters Leah Song and Chloe Smith of Rising Appalachia. Frederick Delius wrote for a banjo in his opera Koanga. Ernst Krenek includes two banjos in his Kleine Symphonie (Little Symphony). Kurt Weill has a banjo in his opera The Rise and Fall of the City of Mahagonny. Viktor Ullmann included a tenor banjo part in his Piano Concerto (op. 25). Four-string banjos The four-string plectrum banjo is a standard banjo without the short drone string.
It usually has 22 frets on the neck and a scale length of 26 to 28 inches, and was originally tuned C3 G3 B3 D4. It can also be tuned like the top four strings of a guitar, which is known as "Chicago tuning". As the name suggests, it is usually played with a guitar-style pick (that is, a single one held between thumb and forefinger), unlike the five-string banjo, which is either played with a thumbpick and two fingerpicks, or with bare fingers. The plectrum banjo evolved out of the five-string banjo, to cater to styles of music involving strummed chords. The plectrum is also featured in many early jazz recordings and arrangements. Four-string banjos can be used for chordal accompaniment (as in early jazz), for single-string melody playing (as in Irish traditional music), in "chord melody" style (a succession of chords in which the highest notes carry the melody), in tremolo style (both on chords and single strings), and in a mixed technique called duo style that combines single-string tremolo and rhythm chords. Four-string banjos are used from time to time in musical theater. Examples include: Hello, Dolly!, Mame, Chicago, Cabaret, Oklahoma!, Half a Sixpence, Annie, Barnum, The Threepenny Opera, Monty Python's Spamalot, and countless others. Joe Raposo used it variably in his imaginative seven-piece orchestration for the long-running TV show Sesame Street, sometimes overdubbed with itself or with an electric guitar. The banjo is still, albeit rarely, used in the show's arrangements. Tenor banjo The shorter-necked tenor banjo, with 17 ("short scale") or 19 frets, is also typically played with a plectrum. It became a popular instrument after about 1910. Early models used for melodic picking typically had 17 frets on the neck and a scale length of 19 to 21 inches. By the mid-1920s, when the instrument was used primarily for strummed chordal accompaniment, 19-fret necks with a scale length of 21 to 23 inches became standard.
The usual tuning is the all-fifths tuning C3 G3 D4 A4, in which exactly seven semitones (a perfect fifth) occur between the open notes of consecutive strings; this is identical to the tuning of a viola. Other players (particularly in Irish traditional music) tune the banjo G2 D3 A3 E4 like an octave mandolin, which lets the banjoist duplicate fiddle and mandolin fingering. The popularization of this tuning is usually attributed to the late Barney McKenna, banjoist with The Dubliners. Fingerstyle playing on a tenor banjo retuned to open-G tuning (dgd'g') or the lower open-D tuning (Adad'), using three-finger picking and frailing, has been explored by Mirek Patek. The tenor banjo was a common rhythm instrument in early 20th-century dance bands. Its volume and timbre suited early jazz (and jazz-influenced popular music styles) and could both compete with other instruments (such as brass instruments and saxophones) and be heard clearly on acoustic recordings. George Gershwin's Rhapsody in Blue, in Ferde Grofé's original jazz-orchestra arrangement, includes tenor banjo, with widely spaced chords not easily playable on plectrum banjo in its conventional tunings. With the development of the archtop and electric guitar, the tenor banjo largely disappeared from jazz and popular music, though it kept its place in traditional "Dixieland" jazz. Some 1920s Irish banjo players picked out the melodies of jigs, reels, and hornpipes on tenor banjos, decorating the tunes with snappy triplet ornaments. The most important Irish banjo player of this era was Mike Flanagan of the New York-based Flanagan Brothers, one of the most popular Irish-American groups of the day. Other pre-WWII Irish banjo players included Neil Nolan, who recorded with Dan Sullivan's Shamrock Band in Boston, and Jimmy McDade, who recorded with the Four Provinces Orchestra in Philadelphia. Meanwhile, in Ireland, the rise of ceili bands provided a new market for a loud instrument like the tenor banjo.
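The seven-semitone claim for the tenor tunings above can be checked mechanically. A brief sketch (the helper functions are illustrative, not from the article) verifying that both the standard C3 G3 D4 A4 tuning and the Irish G2 D3 A3 E4 tuning ascend by a perfect fifth per string:

```python
# Semitone offsets for pitch-class names.
NOTE_OFFSETS = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
                "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11}

def midi(note: str) -> int:
    """Convert a name like 'A4' to a MIDI note number (C4 = 60)."""
    name, octave = note[:-1], int(note[-1])
    return 12 * (octave + 1) + NOTE_OFFSETS[name]

def is_all_fifths(tuning) -> bool:
    """True if each open string is exactly 7 semitones above the previous."""
    pitches = [midi(n) for n in tuning]
    return all(b - a == 7 for a, b in zip(pitches, pitches[1:]))

assert is_all_fifths(["C3", "G3", "D4", "A4"])  # standard tenor, like a viola
assert is_all_fifths(["G2", "D3", "A3", "E4"])  # Irish tuning, like an octave mandolin
```

The same check fails for the five-string Open-G tuning, whose reentrant fifth string breaks any uniform interval pattern.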
Use of the tenor banjo in Irish music has increased greatly since the folk revival of the 1960s. Six-string banjos The six-string banjo began as a British innovation by William Temlett, one of England's earliest banjo makers. He opened a shop in London in 1846, and sold seven-string banjos which he marketed as "zither" banjos from his 1869 patent. A zither banjo usually has a closed back and sides with the drum body and skin tensioning system suspended inside the wooden rim, the neck and string tailpiece mounted on the outside of the rim, and the drone string led through a tube in the neck so that the tuning peg can be mounted on the head. They were often made by builders who used guitar tuners that came in banks of three, so five-stringed instruments had a redundant tuner; these banjos could be somewhat easily converted over to a six-string banjo. American Alfred Davis Cammeyer (1862–1949), a young violinist turned concert banjo player, devised the six-string zither banjo around 1880. British opera diva Adelina Patti advised Cammeyer that the zither banjo might be popular with English audiences as it had been invented there, and Cammeyer went to London in 1888. With his virtuoso playing, he helped show that banjos could make more sophisticated music than normally played by blackface minstrels. He was soon performing for London society, where he met Sir Arthur Sullivan, who recommended that Cammeyer progress from arranging the music of others for banjo to composing his own music. Modern six-string bluegrass banjos have been made. These add a bass string between the lowest string and the drone string on a five-string banjo, and are usually tuned G4 G2 D3 G3 B3 D4. Sonny Osborne played one of these instruments for several years. It was modified by luthier Rual Yarbrough from a Vega five-string model. A picture of Sonny with this banjo appears in Pete Wernick's Bluegrass Banjo method book. 
Six-string banjos known as banjo guitars basically consist of a six-string guitar neck attached to a bluegrass or plectrum banjo body, which allows players who have learned the guitar to play a banjo sound without having to relearn fingerings. This was the instrument of the early jazz great Johnny St. Cyr, jazzmen Django Reinhardt, Danny Barker, Papa Charlie Jackson and Clancy Hayes, as well as the blues and gospel singer Reverend Gary Davis. Today, musicians as diverse as Keith Urban, Rod Stewart, Taj Mahal, Joe Satriani, David Hidalgo, Larry Lalonde and Doc Watson play the six-string guitar banjo. They have become increasingly popular since the mid-1990s. Other banjos Low banjos In the late 19th and early 20th centuries, plucked-string instrument ensembles – guitar orchestras, mandolin orchestras, banjo orchestras – were in vogue, with instrumentation made to parallel that of the string section in a symphony orchestra. Thus, "violin, viola, 'cello, bass" became "mandolin, mandola, mandocello, mandobass", or in the case of banjos, "banjolin, banjola, banjo cello, bass banjo". Because the range of a plucked-string instrument is generally not as great as that of a comparably sized bowed-string instrument, other instruments were often added to these plucked orchestras to extend the range of the ensemble upwards and downwards. The banjo cello was normally tuned C2-G2-D3-A3, one octave below the tenor banjo, like the cello and mandocello. A five-string cello banjo, set up like a bluegrass banjo (with the short fifth string), but tuned one octave lower, has been produced by the Goldtone company. Bass banjos have been produced in both upright bass formats and with standard, horizontally carried banjo bodies. Contrabass banjos with either three or four strings have also been made; some of these had headstocks similar to those of bass violins. Tuning varies on these large instruments, with four-string models sometimes being tuned in fourths like a bass violin.
The dead-ball era ended in the early 1920s with several changes in rule and circumstance that were advantageous to hitters. Strict new regulations governing the ball's size, shape and composition, along with a new rule officially banning the spitball and other pitches that depended on the ball being treated or roughed up with foreign substances, resulted in a ball that traveled farther when hit. The rise of the legendary player Babe Ruth, the first great power hitter of the new era, helped permanently alter the nature of the game. In the late 1920s and early 1930s, St. Louis Cardinals general manager Branch Rickey invested in several minor league clubs and developed the first modern farm system. A new Negro National League was organized in 1933; four years later, it was joined by the Negro American League. The first elections to the National Baseball Hall of Fame took place in 1936. In 1939, Little League Baseball was founded in Pennsylvania. A large number of minor league teams disbanded when World War II led to a player shortage. Chicago Cubs owner Philip K. Wrigley led the formation of the All-American Girls Professional Baseball League to help keep the game in the public eye. The first crack in the unwritten agreement barring blacks from white-controlled professional ball occurred in 1945: Jackie Robinson was signed by the National League's Brooklyn Dodgers and began playing for their minor league team in Montreal. In 1947, Robinson broke the major leagues' color barrier when he debuted with the Dodgers. Latin American players, largely overlooked before, also started entering the majors in greater numbers. In 1951, two Chicago White Sox, Venezuelan-born Chico Carrasquel and black Cuban-born Minnie Miñoso, became the first Hispanic All-Stars. Integration proceeded slowly: by 1953, only six of the 16 major league teams had a black player on the roster.
Attendance records and the age of steroids In 1975, the union's power—and players' salaries—began to increase greatly when the reserve clause was effectively struck down, leading to the free agency system. Significant work stoppages occurred in 1981 and 1994, the latter forcing the cancellation of the World Series for the first time in 90 years. Attendance had been growing steadily since the mid-1970s and in 1994, before the stoppage, the majors were setting their all-time record for per-game attendance. After play resumed in 1995, non-division-winning wild card teams became a permanent fixture of the post-season. Regular-season interleague play was introduced in 1997 and the second-highest attendance mark for a full season was set. In 2000, the National and American Leagues were dissolved as legal entities. While their identities were maintained for scheduling purposes (and the designated hitter distinction), the regulations and other functions—such as player discipline and umpire supervision—they had administered separately were consolidated under the rubric of MLB. In 2001, Barry Bonds established the current record of 73 home runs in a single season. There had long been suspicions that the dramatic increase in power hitting was fueled in large part by the abuse of illegal steroids (as well as by the dilution of pitching talent due to expansion), but the issue only began attracting significant media attention in 2002 and there was no penalty for the use of performance-enhancing drugs before 2004. In 2007, Bonds became MLB's all-time home run leader, surpassing Hank Aaron, as total major league and minor league attendance both reached all-time highs. Around the world Though known by the historic popular moniker "America's national pastime", baseball is well established in several other countries as well. As early as 1877, a professional league, the International Association, featured teams from both Canada and the US.
While baseball is widely played in Canada and many minor league teams have been based in the country, the American major leagues did not include a Canadian club until 1969, when the Montreal Expos joined the National League as an expansion team. In 1977, the expansion Toronto Blue Jays joined the American League. In 1847, American soldiers played what may have been the first baseball game in Mexico at Parque Los Berros in Xalapa, Veracruz. The first formal baseball league outside of the United States and Canada was founded in 1878 in Cuba, which maintains a rich baseball tradition. The Dominican Republic held its first islandwide championship tournament in 1912. Professional baseball tournaments and leagues began to form in other countries between the world wars, including the Netherlands (formed in 1922), Australia (1934), Japan (1936), Mexico (1937), and Puerto Rico (1938). The Japanese major leagues have long been considered the highest quality professional circuits outside of the United States. After World War II, professional leagues were founded in many Latin American countries, most prominently Venezuela (1946) and the Dominican Republic (1955). Since the early 1970s, the annual Caribbean Series has matched the championship clubs from the four leading Latin American winter leagues: the Dominican Professional Baseball League, Mexican Pacific League, Puerto Rican Professional Baseball League, and Venezuelan Professional Baseball League. In Asia, South Korea (1982), Taiwan (1990) and China (2003) all have professional leagues. The English football club Aston Villa were the first British baseball champions, winning the 1890 National League of Baseball of Great Britain. The 2020 National Champions were the London Mets. Other European countries have seen professional leagues; the most successful, other than the Dutch league, is the Italian league, founded in 1948. In 2004, Australia won a surprise silver medal at the Olympic Games.
The Confédération Européenne de Baseball (European Baseball Confederation), founded in 1953, organizes a number of competitions between clubs from different countries. Other competitions between national teams, such as the Baseball World Cup and the Olympic baseball tournament, were administered by the International Baseball Federation (IBAF) from its formation in 1938 until its 2013 merger with the International Softball Federation to create the current joint governing body for both sports, the World Baseball Softball Confederation (WBSC). Women's baseball is played on an organized amateur basis in numerous countries. After being admitted to the Olympics as a medal sport beginning with the 1992 Games, baseball was dropped from the 2012 Summer Olympic Games at the 2005 International Olympic Committee meeting. It remained part of the 2008 Games. While the sport's lack of a following in much of the world was a factor, more important was MLB's reluctance to allow its players to participate during the major league season. MLB initiated the World Baseball Classic, scheduled to precede its season, partly as a replacement, high-profile international tournament. The inaugural Classic, held in March 2006, was the first tournament involving national teams to feature a significant number of MLB participants. The Baseball World Cup was discontinued after its 2011 edition in favor of an expanded World Baseball Classic. Distinctive elements Baseball has certain attributes that set it apart from the other popular team sports in the countries where it has a following. All of those other sports use a clock, play in them is less individual, and the variation between their playing fields is not as substantial or important. The comparison between cricket and baseball demonstrates that many of baseball's distinctive elements are shared in various ways with its cousin sports.
No clock to kill In clock-limited sports, games often end with a team that holds the lead killing the clock rather than competing aggressively against the opposing team. In contrast, baseball has no clock; thus a team cannot win without getting the last batter out, and rallies are not constrained by time. At almost any turn in any baseball game, the most advantageous strategy is some form of aggressive strategy. In multi-day Test and first-class cricket, by contrast, the possibility of a draw (which arises from restrictions on time that, as in baseball, did not originally exist) often encourages a team that is batting last and well behind to bat defensively and run out the clock, giving up any faint chance at a win in order to avoid an overall loss. While nine innings has been the standard since the beginning of professional baseball, the duration of the average major league game has increased steadily through the years. At the turn of the 20th century, games typically took an hour and a half to play. In the 1920s, they averaged just less than two hours, which eventually ballooned to 2:38 in 1960. By 1997, the average American League game lasted 2:57 (National League games were about 10 minutes shorter—pitchers at the plate making for quicker outs than designated hitters). In 2004, Major League Baseball declared that its goal was an average game of 2:45. By 2014, though, the average MLB game took over three hours to complete. The lengthening of games is attributed to longer breaks between half-innings for television commercials, increased offense, more pitching changes, and a slower pace of play with pitchers taking more time between each delivery, and batters stepping out of the box more frequently. Other leagues have experienced similar issues. In 2008, Nippon Professional Baseball took steps aimed at shortening games by 12 minutes from the preceding decade's average of 3:18.
In 2016, the average nine-inning playoff game in Major League Baseball was 3 hours and 35 minutes. This was up 10 minutes from 2015 and 21 minutes from 2014. Individual focus Although baseball is a team sport, individual players are often placed under scrutiny and pressure. While rewarding, it has sometimes been described as "ruthless" due to the pressure on the individual player. In 1915, a baseball instructional manual pointed out that every single pitch, of which there are often more than two hundred in a game, involves an individual, one-on-one contest: "the pitcher and the batter in a battle of wits". Pitcher, batter, and fielder all act essentially independently of each other. While coaching staffs can signal pitcher or batter to pursue certain tactics, the execution of the play itself is a series of solitary acts. If the batter hits a line drive, the outfielder is solely responsible for deciding to try to catch it or play it on the bounce and for succeeding or failing. The statistical precision of baseball is both facilitated by this isolation and reinforces it. Cricket is more similar to baseball than many other team sports in this regard: while the individual focus in cricket is mitigated by the importance of the batting partnership and the practicalities of tandem running, it is enhanced by the fact that a batsman may occupy the wicket for an hour or much more. There is no statistical equivalent in cricket for the fielding error and thus less emphasis on personal responsibility in this area of play. Uniqueness of parks Unlike those of most sports, baseball playing fields can vary significantly in size and shape. While the dimensions of the infield are specifically regulated, the only constraint on outfield size and shape for professional teams, following the rules of MLB and Minor League Baseball, is that fields built or remodeled since June 1, 1958, must maintain a specified minimum distance from home plate to the fences in left and right field and to center field.
Major league teams often skirt even this rule. For example, at Minute Maid Park, which became the home of the Houston Astros in 2000, the Crawford Boxes in left field are unusually close to home plate. There are no rules at all that address the height of fences or other structures at the edge of the outfield. The most famously idiosyncratic outfield boundary is the left-field wall at Boston's Fenway Park, in use since 1912: the Green Monster sits unusually close to home plate down the line and is unusually tall. Similarly, there are no regulations at all concerning the dimensions of foul territory. Thus a foul fly ball may be entirely out of play in a park with little space between the foul lines and the stands, but a foulout in a park with more expansive foul ground. A fence in foul territory that is close to the outfield line will tend to direct balls that strike it back toward the fielders, while one that is farther away may actually prompt more collisions, as outfielders run full speed to field balls deep in the corner. These variations can make the difference between a double and a triple or inside-the-park home run. The surface of the field is also unregulated. While there is a traditional field surfacing arrangement (the one used by virtually all MLB teams with naturally surfaced fields), teams are free to decide what areas will be grassed or bare. Some fields—including several in MLB—use an artificial surface, such as AstroTurf. Surface variations can have a significant effect on how ground balls behave and are fielded as well as on baserunning. Similarly, the presence of a roof (seven major league teams play in stadiums with permanent or retractable roofs) can greatly affect how fly balls are played. While football and soccer players deal with similar variations of field surface and stadium covering, the size and shape of their fields are much more standardized.
The area out-of-bounds on a football or soccer field does not affect play the way foul territory in baseball does, so variations in that regard are largely insignificant. These physical variations create a distinctive set of playing conditions at each ballpark. Other local factors, such as altitude and climate, can also significantly affect play. A given stadium may acquire a reputation as a pitcher's park or a hitter's park, if one or the other discipline notably benefits from its unique mix of elements. The most exceptional park in this regard is Coors Field, home of the Colorado Rockies. Its high altitude is partly responsible for giving it the strongest hitter's park effect in the major leagues due to the low air pressure. Wrigley Field, home of the Chicago Cubs, is known for its fickle disposition: a hitter's park when the strong winds off Lake Michigan are blowing out, it becomes more of a pitcher's park when they are blowing in. The absence of a standardized field affects not only how particular games play out, but the nature of team rosters and players' statistical records. For example, hitting a fly ball into right field might result in an easy catch on the warning track at one park, and a home run at another. A team that plays in a park with a relatively short right field, such as the New York Yankees, will tend to stock its roster with left-handed pull hitters, who can best exploit it. On the individual level, a player who spends most of his career with a team that plays in a hitter's park will gain an advantage in batting statistics over time—even more so if his talents are especially suited to the park. Statistics Organized baseball lends itself to statistics to a greater degree than many other sports. Each play is discrete and has a relatively small number of possible outcomes.
In the late 19th century, a former cricket player, English-born Henry Chadwick of Brooklyn, was responsible for the "development of the box score, tabular standings, the annual baseball guide, the batting average, and most of the common statistics and tables used to describe baseball." The statistical record is so central to the game's "historical essence" that Chadwick came to be known as Father Baseball. In the 1920s, American newspapers began devoting more and more attention to baseball statistics, initiating what journalist and historian Alan Schwarz describes as a "tectonic shift in sports, as intrigue that once focused mostly on teams began to go to individual players and their statistics lines." The Official Baseball Rules administered by MLB require the official scorer to categorize each baseball play unambiguously. The rules provide detailed criteria to promote consistency. The score report is the official basis for both the box score of the game and the relevant statistical records. General managers, managers, and baseball scouts use statistics to evaluate players and make strategic decisions. Certain traditional statistics are familiar to most baseball fans. The basic batting statistics include:
At bats: plate appearances, excluding walks and hit by pitches—where the batter's ability is not fully tested—and sacrifices and sacrifice flies—where the batter intentionally makes an out in order to advance one or more baserunners
Hits: times a base is reached safely, because of a batted, fair ball without a fielding error or fielder's choice
Runs: times circling the bases and reaching home safely
Runs batted in (RBIs): number of runners who scored due to a batter's action
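The counting statistics defined above combine into familiar rate statistics; batting average, for instance, is simply hits divided by at-bats. A minimal sketch, using hypothetical numbers rather than any real player's line:

```python
def batting_average(hits: int, at_bats: int) -> float:
    """Batting average = hits / at-bats (walks, hit-by-pitches, and
    sacrifices are excluded from at-bats, per the definitions above)."""
    return hits / at_bats if at_bats else 0.0

# Hypothetical season line: 190 hits in 600 at-bats.
avg = batting_average(190, 600)
# Batting averages are conventionally reported to three decimal places.
print(f"{avg:.3f}")
```

A player with 190 hits in 600 at-bats thus bats roughly .317; the three-decimal convention is why such figures are spoken of as "batting three-seventeen".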
In multi-day Test and first-class cricket, by contrast, the possibility of a draw (which arises from restrictions on time that, as in baseball, originally did not exist) often encourages a team that is batting last and well behind to bat defensively, running out the clock and giving up any faint chance at a win in order to avoid an overall loss. While nine innings has been the standard since the beginning of professional baseball, the duration of the average major league game has increased steadily through the years. At the turn of the 20th century, games typically took an hour and a half to play. In the 1920s, they averaged just less than two hours, which eventually ballooned to 2:38 in 1960. By 1997, the average American League game lasted 2:57 (National League games were about 10 minutes shorter—pitchers at the plate making for quicker outs than designated hitters). In 2004, Major League Baseball declared that its goal was an average game of 2:45. By 2014, though, the average MLB game took over three hours to complete. The lengthening of games is attributed to longer breaks between half-innings for television commercials, increased offense, more pitching changes, and a slower pace of play with pitchers taking more time between each delivery, and batters stepping out of the box more frequently. Other leagues have experienced similar issues. In 2008, Nippon Professional Baseball took steps aimed at shortening games by 12 minutes from the preceding decade's average of 3:18. In 2016, the average nine-inning playoff game in Major League Baseball was 3 hours and 35 minutes. This was up 10 minutes from 2015 and 21 minutes from 2014. Individual focus Although baseball is a team sport, individual players are often placed under scrutiny and pressure. While the sport can be rewarding, it has sometimes been described as "ruthless" due to the pressure on the individual player.
In 1915, a baseball instructional manual pointed out that every single pitch, of which there are often more than two hundred in a game, involves an individual, one-on-one contest: "the pitcher and the batter in a battle of wits". Pitcher, batter, and fielder all act essentially independently of each other. While coaching staffs can signal pitcher or batter to pursue certain tactics, the execution of the play itself is a series of solitary acts. If the batter hits a line drive, the outfielder is solely responsible for deciding to try to catch it or play it on the bounce and for succeeding or failing. The statistical precision of baseball is facilitated by this isolation and in turn reinforces it. Cricket is more similar to baseball than many other team sports in this regard: while the individual focus in cricket is mitigated by the importance of the batting partnership and the practicalities of tandem running, it is enhanced by the fact that a batsman may occupy the wicket for an hour or much more. There is no statistical equivalent in cricket for the fielding error and thus less emphasis on personal responsibility in this area of play. Uniqueness of parks Unlike those of most sports, baseball playing fields can vary significantly in size and shape. While the dimensions of the infield are specifically regulated, the only constraint on outfield size and shape for professional teams, following the rules of MLB and Minor League Baseball, is that fields built or remodeled since June 1, 1958, must have a minimum distance of from home plate to the fences in left and right field and to center. Major league teams often skirt even this rule. For example, at Minute Maid Park, which became the home of the Houston Astros in 2000, the Crawford Boxes in left field are only from home plate. There are no rules at all that address the height of fences or other structures at the edge of the outfield.
The most famously idiosyncratic outfield boundary is the left-field wall at Boston's Fenway Park, in use since 1912: the Green Monster is from home plate down the line and tall. Similarly, there are no regulations at all concerning the dimensions of foul territory. Thus a foul fly ball may be entirely out of play in a park with little space between the foul lines and the stands, but a foulout in a park with more expansive foul ground. A fence in foul territory that is close to the outfield line will tend to direct balls that strike it back toward the fielders, while one that is farther away may actually prompt more collisions, as outfielders run full speed to field balls deep in the corner. These variations can make the difference between a double and a triple or inside-the-park home run. The surface of the field is also unregulated. While the adjacent image shows a traditional field surfacing arrangement (and the one used by virtually all MLB teams with naturally surfaced fields), teams are free to decide what areas will be grassed or bare. Some fields—including several in MLB—use an artificial surface, such as AstroTurf. Surface variations can have a significant effect on how ground balls behave and are fielded as well as on baserunning. Similarly, the presence of a roof (seven major league teams play in stadiums with permanent or retractable roofs) can greatly affect how fly balls are played. While football and soccer players deal with similar variations of field surface and stadium covering, the size and shape of their fields are much more standardized. The area out-of-bounds on a football or soccer field does not affect play the way foul territory in baseball does, so variations in that regard are largely insignificant. These physical variations create a distinctive set of playing conditions at each ballpark. Other local factors, such as altitude and climate, can also significantly affect play. 
A given stadium may acquire a reputation as a pitcher's park or a hitter's park, if one or the other discipline notably benefits from its unique mix of elements. The most exceptional park in this regard is Coors Field, home of the Colorado Rockies. Its high altitude— above sea level—is partly responsible for giving it the strongest hitter's park effect in the major leagues due to the low air pressure. Wrigley Field, home of the Chicago Cubs, is known for its fickle disposition: a hitter's park when the strong winds off Lake Michigan are blowing out, it becomes more of a pitcher's park when they are blowing in. The absence of a standardized field affects not only how particular games play out, but the nature of team rosters and players' statistical records. For example, hitting a fly ball into right field might result in an easy catch on the warning track at one park, and a home run at another. A team that plays in a park with a relatively short right field, such as the New York Yankees, will tend to stock its roster with left-handed pull hitters, who can best exploit it. On the individual level, a player who spends most of his career with a team that plays in a hitter's park will gain an advantage in batting statistics over time—even more so if his talents are especially suited to the park. Statistics Organized baseball lends itself to statistics to a greater degree than many other sports. Each play is discrete and has a relatively small number of possible outcomes. In the late 19th century, a former cricket player, English-born Henry Chadwick of Brooklyn, was responsible for the "development of the box score, tabular standings, the annual baseball guide, the batting average, and most of the common statistics and tables used to describe baseball." The statistical record is so central to the game's "historical essence" that Chadwick came to be known as Father Baseball. 
In the 1920s, American newspapers began devoting more and more attention to baseball statistics, initiating what journalist and historian Alan Schwarz describes as a "tectonic shift in sports, as intrigue that once focused mostly on teams began to go to individual players and their statistics lines." The Official Baseball Rules administered by MLB require the official scorer to categorize each baseball play unambiguously. The rules provide detailed criteria to promote consistency. The score report is the official basis for both the box score of the game and the relevant statistical records. General managers, managers, and baseball scouts use statistics to evaluate players and make strategic decisions. Certain traditional statistics are familiar to most baseball fans. The basic batting statistics include:

At bats: plate appearances, excluding walks and hit by pitches—where the batter's ability is not fully tested—and sacrifices and sacrifice flies—where the batter intentionally makes an out in order to advance one or more baserunners
Hits: times a base is reached safely, because of a batted, fair ball without a fielding error or fielder's choice
Runs: times circling the bases and reaching home safely
Runs batted in (RBIs): number of runners who scored due to a batter's action (including the batter, in the case of a home run), except when the batter grounded into a double play or reached on an error
Home runs: hits on which the batter successfully touched all four bases, without the contribution of a fielding error
Batting average: hits divided by at bats—the traditional measure of batting ability

The basic baserunning statistics include:

Stolen bases: times advancing to the next base entirely due to the runner's own efforts, generally while the pitcher is preparing to deliver or delivering the ball
Caught stealing: times tagged out while attempting to steal a base

The basic pitching statistics include:

Wins: credited to the pitcher on the winning team who last pitched before the team took a lead that it never relinquished (a starting pitcher must pitch at least five innings to qualify for a win)
Losses: charged to the pitcher on the losing team who was pitching when the opposing team took a lead that it never relinquished
Saves: games where the pitcher enters a game led by the pitcher's team, finishes the game without surrendering the lead, is not the winning pitcher, and either (a) the lead was three runs or less when the pitcher entered the game; (b) the potential tying run was on base, at bat, or on deck; or (c) the pitcher pitched three or more innings
Innings pitched: outs recorded while pitching divided by three (partial innings are conventionally recorded as, e.g., "5.2" or "7.1", the last digit actually representing thirds, not tenths, of an inning)
Strikeouts: times pitching three strikes to a batter
Winning percentage: wins divided by decisions (wins plus losses)
Earned run average (ERA): runs allowed, excluding those resulting from fielding errors, per nine innings pitched

The basic fielding statistics include:

Putouts: times the fielder catches a fly ball, tags or forces out a runner, or otherwise directly effects an out
Assists: times a putout by another fielder was recorded following the fielder touching the ball
Errors: times the fielder fails to make a play that should have been made with common effort, and the batting team benefits as a result
Total chances: putouts plus assists plus errors
Fielding average: successful chances (putouts plus assists) divided by total chances

Among the many other statistics that are kept are those collectively known as situational statistics. For example, statistics can indicate which specific pitchers a certain batter performs best against. If a given situation statistically favors a certain batter, the manager of the fielding team may be more likely to change pitchers or have the pitcher intentionally walk the batter in order to face one who is less likely to succeed.
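Several of the rate statistics defined above reduce to simple arithmetic. The following is a minimal sketch in Python; the function names are illustrative, not taken from any real statistics library:

```python
# Minimal sketch of several traditional rate statistics defined above.
# Function names are illustrative, not from a real statistics library.

def batting_average(hits: int, at_bats: int) -> float:
    """Hits divided by at bats."""
    return hits / at_bats if at_bats else 0.0

def innings_pitched(outs_recorded: int) -> str:
    """Outs recorded divided by three, in the conventional notation:
    the digit after the point represents thirds of an inning, not tenths."""
    return f"{outs_recorded // 3}.{outs_recorded % 3}"

def earned_run_average(earned_runs: int, outs_recorded: int) -> float:
    """Earned runs allowed per nine innings pitched (27 outs)."""
    return 9 * earned_runs / (outs_recorded / 3)

def winning_percentage(wins: int, losses: int) -> float:
    """Wins divided by decisions (wins plus losses)."""
    return wins / (wins + losses)

def fielding_average(putouts: int, assists: int, errors: int) -> float:
    """Successful chances (putouts plus assists) divided by total chances."""
    return (putouts + assists) / (putouts + assists + errors)

print(innings_pitched(17))  # "5.2", i.e. five and two-thirds innings
```

Note the innings-pitched convention: a pitcher who records 17 outs is credited with "5.2" innings, meaning five and two-thirds, not five and two-tenths.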
Sabermetrics Sabermetrics refers to the field of baseball statistical study and the development of new statistics and analytical tools. The term is also used to refer directly to new statistics themselves. The term was coined around 1980 by one of the field's leading proponents, Bill James, and derives from the Society for American Baseball Research (SABR). The growing popularity of sabermetrics since the early 1980s has brought more attention to two batting statistics that sabermetricians argue are much better gauges of a batter's skill than batting average:

On-base percentage (OBP) measures a batter's ability to get on base. It is calculated by taking the sum of the batter's successes in getting on base (hits plus walks plus hit by pitches) and dividing that by the batter's total plate appearances (at bats plus walks plus hit by pitches plus sacrifice flies), except for sacrifice bunts.
Slugging percentage (SLG) measures a batter's ability to hit for power. It is calculated by taking the batter's total bases (one per single, two per double, three per triple, and four per home run) and dividing that by the batter's at bats.

Some of the new statistics devised by sabermetricians have gained wide use:

On-base plus slugging (OPS) measures a batter's overall ability. It is calculated by adding the batter's on-base percentage and slugging percentage.
Walks plus hits per inning pitched (WHIP) measures a pitcher's ability at preventing hitters from reaching base. It is calculated by adding the number of walks and hits a pitcher surrendered, then dividing by the number of innings pitched.
Wins Above Replacement (WAR) measures the number of additional wins a player's team has achieved above the number of expected team wins if that player were substituted with a replacement-level player.

Popularity and cultural impact Writing in 1919, philosopher Morris Raphael Cohen described baseball as the national religion of the US.
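The sabermetric measures defined above likewise translate directly into arithmetic. A hedged sketch follows; the function names and the sample batting line are hypothetical, chosen only for illustration:

```python
# Sketch of the sabermetric rate statistics defined in the Sabermetrics
# passage above; function names are illustrative, not from a real library.

def on_base_percentage(hits, walks, hit_by_pitch, at_bats, sac_flies):
    """Times reaching base divided by plate appearances
    (sacrifice bunts excluded, per the definition above)."""
    return (hits + walks + hit_by_pitch) / (
        at_bats + walks + hit_by_pitch + sac_flies)

def slugging_percentage(singles, doubles, triples, home_runs, at_bats):
    """Total bases (1, 2, 3, or 4 per hit type) divided by at bats."""
    total_bases = singles + 2 * doubles + 3 * triples + 4 * home_runs
    return total_bases / at_bats

def on_base_plus_slugging(obp, slg):
    """OPS: on-base percentage plus slugging percentage."""
    return obp + slg

def whip(walks, hits, innings):
    """Walks plus hits per inning pitched."""
    return (walks + hits) / innings

# Hypothetical batter: 150 hits (100 singles, 30 doubles, 5 triples,
# 15 home runs), 70 walks, 5 hit-by-pitches, 500 at bats, 5 sac flies.
obp = on_base_percentage(150, 70, 5, 500, 5)    # 225/580, about .388
slg = slugging_percentage(100, 30, 5, 15, 500)  # 235/500 = .470
print(round(on_base_plus_slugging(obp, slg), 3))  # prints 0.858
```

The sample line illustrates why sabermetricians prefer these measures: the same 150 hits produce very different OBP and SLG depending on how many were walks-adjacent events or extra-base hits, detail that a bare batting average discards.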
In the words of sports columnist Jayson Stark, baseball has long been "a unique paragon of American culture"—a status he sees as devastated by the steroid abuse scandal. Baseball has an important place in other national cultures as well: Scholar Peter Bjarkman describes "how deeply the sport is ingrained in the history and culture of a nation such as Cuba, [and] how thoroughly it was radically reshaped and nativized in Japan." In the United States The major league game in the United States was originally targeted toward a middle-class, white-collar audience: relative to other spectator pastimes, the National League's set ticket price of 50 cents in 1876 was high, while the location of playing fields outside the inner city and the workweek daytime scheduling of games were also obstacles to a blue-collar audience. A century later, the situation was very different. With the rise in popularity of other team sports with much higher average ticket prices—football, basketball, and hockey—professional baseball had become among the most blue-collar-oriented of leading American spectator sports. Overall, baseball has a large following in the United States; a 2006 poll found that nearly half of Americans are fans. In the late 1900s and early 2000s, baseball's position compared to football in the United States moved in contradictory directions. In 2008, MLB set a revenue record of $6.5 billion, matching the NFL's revenue for the first time in decades. A new MLB revenue record of more than $10 billion was set in 2017. On the other hand, the percentage of American sports fans polled who named baseball as their favorite sport was 9%, compared to pro football at 37%. In 1985, the respective figures were pro football 24%, baseball 23%. Because there are so many more major league games played, there is no comparison in overall attendance. In 2008, total attendance at major league games was the second-highest in history: 78.6 million, 0.7% off the record set the previous year. 
The following year, amid the U.S. recession, attendance fell by 6.6% to 73.4 million. Eight years later, it dropped under 73 million. Attendance at games held under the Minor League Baseball umbrella set a record in 2008, with 43.3 million. While MLB games have not drawn the same national TV viewership as football games, MLB games are dominant in teams' local markets and regularly lead all programs in primetime in their markets during the summer. Caribbean Since the early 1980s, the Dominican Republic, in particular the city of San Pedro de Macorís, has been the major leagues' primary source of foreign talent. In 2017, 83 of the 868 players on MLB Opening Day rosters (and disabled lists) were from the country. Among other Caribbean countries and territories, a combined 97 MLB players were born in Venezuela, Cuba, and Puerto Rico. Hall-of-Famer Roberto Clemente remains one of the greatest national heroes in Puerto Rico's history. While baseball has long been the island's primary athletic pastime, its once well-attended professional winter league has declined in popularity since 1990, when young Puerto Rican players began to be included in the major leagues' annual first-year player draft. Asia In Asia, baseball is among the most popular sports in Japan and South Korea. In Japan, where baseball is inarguably the leading spectator team sport, combined revenue for the twelve teams in Nippon Professional Baseball (NPB), the body that oversees both the Central and Pacific Leagues, was estimated at $1 billion in 2007. Total NPB attendance for the year was approximately 20 million. While in the preceding two decades, MLB attendance grew by 50 percent and revenue nearly tripled, the comparable NPB figures were stagnant. There are concerns that MLB's growing interest in acquiring star Japanese players will hurt the game in their home country. 
In Cuba, where baseball is by every reckoning the national sport, the national team overshadows the city and provincial teams that play in the top-level domestic leagues. Revenue figures are not released for the country's amateur system. Similarly, according to one official pronouncement, the sport's governing authority "has never taken into account attendance ... because its greatest interest has always been the development of athletes". Among children, Little League Baseball oversees leagues with close to 2.4 million participants in over 80 countries. The number of players has fallen since the 1990s, when 3 million children took part in Little League Baseball annually. Babe Ruth League teams have over 1 million participants. According to the president of the International Baseball Federation, between 300,000 and 500,000 women and girls play baseball around the world, including Little League and the introductory game of Tee Ball. A varsity baseball team is an established part of physical education departments at most high schools and colleges in the United States. In 2015, nearly half a million high schoolers and over 34,000 collegians played on their schools' baseball teams. By early in the 20th century, intercollegiate baseball was Japan's leading sport. Today, high school baseball in particular is immensely popular there. The final rounds of the two annual tournaments—the National High School Baseball Invitational Tournament in the spring, and the even more important National High School Baseball Championship in the summer—are broadcast around the country. The tournaments are known, respectively, as Spring Koshien and Summer Koshien after the 55,000-capacity stadium where they are played.
stopping, and retrieving a hit ball, and then setting themselves up to transfer the ball, all with the end goal of getting the ball as quickly as possible to another fielder. They also run the risk of colliding with incoming runners during a tag attempt at a base. Fielders may have different responsibilities depending on the game situation. For example, when an outfielder is attempting to throw the ball from near the fence to one of the bases, an infielder may need to "cut off" the throw and then act as a relay thrower to help the ball cover its remaining distance to the target destination. As a group, the outfielders are responsible for preventing home runs by reaching over the fence (and potentially doing a wall climb) for fly balls that are catchable. The infielders are the ones who generally handle plays that involve tagging a base or runner. The pitcher and catcher have special responsibilities to prevent base stealing, as they are the ones who handle the ball whenever it has not been hit. Other roles Designated hitter Pinch hitter Pinch
and coaching for his college team at Ohio Wesleyan University, Rickey had a black teammate named Charles Thomas. On a road trip through southern Ohio, his fellow player was refused a room in a hotel. Although Rickey was able to get the player into his room for that night, he was taken aback when he reached his room to find Thomas upset and crying about this injustice. Rickey related this incident as an example of why he wanted a full desegregation of not only baseball, but the entire nation. In the mid-1940s, Rickey had compiled a list of Negro league ballplayers for possible Major League contracts. Realizing that the first African-American signee would be a magnet for prejudiced sentiment, however, Rickey was intent on finding a player with the distinguished personality and character that would allow him to tolerate the inevitable abuse. Rickey's sights eventually settled on Jackie Robinson, a shortstop with the Kansas City Monarchs. Although probably not the best player in the Negro leagues at the time, Robinson was an exceptional talent, was college-educated, and had the marketable distinction of having served as an officer during World War II. Even more importantly, Rickey judged Robinson to possess the inner strength to withstand the inevitable harsh animosity to come. To prepare him for the task, Rickey played Robinson in 1946 for the Dodgers' minor league team, the Montreal Royals, which proved an arduous emotional challenge, though Robinson enjoyed fervently enthusiastic support from the Montreal fans. On April 15, 1947, Robinson broke the color barrier, which had been tacitly recognized for almost 75 years, with his appearance for the Brooklyn Dodgers at Ebbets Field. Eleven weeks later, on July 5, 1947, the American League was integrated by the signing of Larry Doby to the Cleveland Indians. 
Over the next few years, a handful of black baseball players made appearances in the majors, including Roy Campanella (teammate to Robinson in Brooklyn) and Satchel Paige (teammate to Doby in Cleveland). Paige, who had pitched more than 2,400 innings in the Negro leagues, sometimes two and three games a day, was still effective at 42, and still playing at 59. His ERA in the Major Leagues was 3.29. However, the initial pace of integration was slow. By 1953, only six of the sixteen major league teams had a black player on the roster. The Boston Red Sox became the last major league team to integrate its roster with the addition of Pumpsie Green on July 21, 1959. While limited in numbers, the on-field performance of early black Major League players was outstanding. In the fourteen years from 1947 to 1960, black players won one or more of the Rookie of the Year awards nine times. While never prohibited in the same fashion as African Americans, Latin American players also benefitted greatly from the integration era. In 1951, two Chicago White Sox, Venezuelan-born Chico Carrasquel and Cuban-born (and black) Minnie Miñoso, became the first Hispanic All-Stars. According to some baseball historians, Jackie Robinson and the other African-American players helped reestablish the importance of baserunning and similar elements of play that were previously de-emphasized by the predominance of power hitting. From 1947 to the 1970s, African-American participation in baseball rose steadily. By 1974, 27% of baseball players were African American. As a result of this on-field experience, minorities began to experience long-delayed gains in managerial positions within baseball. In 1975, Frank Robinson (who had been the 1956 Rookie of the Year with the Cincinnati Reds) was named player-manager of the Cleveland Indians, making him the first African-American manager in the major leagues. 
Although these front-office gains continued, Major League Baseball saw a long, slow decline in the percentage of black players after the mid-1970s. By 2007, African Americans made up less than 9% of Major League players. While this trend is largely attributed to an increased emphasis on recruitment of players from Latin America (with the number of Hispanic players in the major leagues rising to 29% by 2007), other factors have been cited as well. Hall of Fame player Dave Winfield, for instance, has pointed out that urban America provides fewer resources for youth baseball than in the past. Despite the continued prevalence of Hispanic players, the percentage of black players rose again in 2008, to 10.2%. Arturo Moreno became the first Hispanic owner of an MLB franchise when he purchased the Anaheim Angels in 2004. In 2005, a Racial and Gender Report Card on Major League Baseball was issued, which generally found positive results on the inclusion of African Americans and Latinos in baseball, and gave Major League Baseball a grade of "A" or better for opportunities for players, managers, and coaches as well as for MLB's central office. At that time, 37% of major league players were people of color: Latino (26 percent), African American (9 percent), or Asian (2 percent). Also by 2004, 29% of the professional staff in MLB's central office were people of color, 11% of team vice presidents were people of color, and seven of the league's managers were of color (four African Americans and three Latinos). The Major Leagues move west Baseball had been in the West for almost as long as the National League and the American League had been around. It evolved into the Pacific Coast League (PCL), which included the Hollywood Stars, Los Angeles Angels, Oakland Oaks, Portland Beavers, Sacramento Solons, San Francisco Seals, San Diego Padres, and Seattle Rainiers. The PCL was huge in the West.
A member of the National Association of Professional Baseball Leagues, it kept losing great players to the National and the American leagues for less than $8,000 a player. The PCL was far more independent than the other "minor" leagues, and rebelled continuously against its Eastern masters. Clarence "Pants" Rowland, the president of the PCL, took on baseball commissioners Kenesaw Mountain Landis and Happy Chandler at first to get better equity from the major leagues, then to form a third major league. His efforts were rebuffed by both commissioners. Chandler and several of the owners, who saw the value of the markets in the West, started to plot the extermination of the PCL. They had one thing that Rowland did not: the financial power of the Eastern major league baseball establishment. No one was going to back a PCL club building a major-league-size stadium if the National or the American League was going to build one too, which discouraged investment in PCL ballparks. PCL games and rivalries still drew fans, but the league's days of dominance in the West were numbered. 1953–1955 Until the 1950s, major league baseball franchises had been largely confined to the northeastern United States, with the teams and their locations remaining unchanged from 1903 to 1952. The first team to relocate in fifty years was the Boston Braves, who moved in 1953 to Milwaukee, where the club set attendance records. In 1954, the St. Louis Browns moved to Baltimore and were renamed the Baltimore Orioles. These relocations can be seen as a full-circle ending to the classic era, which began with the moves of teams from Milwaukee and Baltimore. In 1955, the Philadelphia Athletics moved to Kansas City. National League Baseball leaves New York In 1958, the New York market was ripped apart.
The Yankees were becoming the dominant draw, and the cities of the West offered generations of new fans in much more sheltered markets for the other venerable New York clubs, the Brooklyn Dodgers and the New York Giants. Placing these storied, powerhouse clubs in the two biggest cities in the West had the specific design of crushing any attempt by the PCL to form a third major league. Eager to bring these big names to the West, Los Angeles gave Walter O'Malley, owner of the Dodgers, a helicopter tour of the city and asked him to pick his spot. The Giants were given the lease of the PCL San Francisco Seals while Candlestick Park was built for them. California The logical first candidates for major league "expansion" were the same metropolitan areas that had just attracted the Dodgers and Giants. It is said that the Dodgers and Giants—National League rivals in New York City—chose their new cities because Los Angeles (in southern California) and San Francisco (in northern California) already had a fierce rivalry (geographical, economic, cultural and political), dating back to the state's founding. The only California expansion team—and also the first in Major League Baseball in over 70 years—was the Los Angeles Angels (later the California Angels, the Anaheim Angels, and, as of 2005, the Los Angeles Angels of Anaheim), who brought the American League to southern California in 1961. Northern California, however, would later gain its own American League team, in 1968, when the Athletics would move again, settling in Oakland, across San Francisco Bay from the Giants. 1961–1962 Along with the Angels, the other 1961 expansion team was the Washington Senators, who joined the American League and took over the nation's capital when the previous Senators moved to Minnesota and became the Twins. 
1961 is also noted as being the year in which Roger Maris surpassed Babe Ruth's single-season home run record, hitting 61 for the New York Yankees, albeit in a slightly longer season than Ruth's. To keep pace with the American League—which now had ten teams—the National League likewise expanded to ten teams, in 1962, with the addition of the Houston Colt .45s and New York Mets. 1969 In 1969, the American League expanded when the Kansas City Royals and Seattle Pilots, the latter in a longtime PCL stronghold, were admitted to the league. The Pilots stayed just one season in Seattle before moving to Milwaukee and becoming today's Milwaukee Brewers. The National League also added two teams that year, the Montreal Expos and San Diego Padres. With the expanded leagues at 12 teams apiece, each was split into East and West divisions, with a playoff series to determine the pennant winner and World Series contender—the first post-season baseball instituted since the advent of the World Series itself. The Padres were the last of the core PCL teams to be absorbed. The Coast League did not die, though. After reforming and moving into new markets, it successfully transformed into a Class AAA league. 1972–2013 In 1972, the second Washington Senators moved to the Dallas-Fort Worth area and became the Texas Rangers. In 1977, the American League expanded to fourteen teams, with the newly formed Seattle Mariners and Toronto Blue Jays. Sixteen years later, in 1993, the National League likewise expanded to fourteen teams, with the newly formed Colorado Rockies and Florida Marlins (now Miami Marlins). Beginning with the 1994 season, both the AL and the NL were divided into three divisions (East, West, and Central), with the addition of a wild card team (the team with the best record among those finishing in second place) to enable four teams in each league to advance to the preliminary division series.
However, due to the 1994–95 Major League Baseball strike (which canceled the 1994 World Series), the new rules did not go into effect until the 1995 World Series. In 1998, the AL and the NL each added a fifteenth team, for a total of thirty teams in Major League Baseball. The Arizona Diamondbacks joined the National League, and the Tampa Bay Devil Rays—now called simply the Rays—joined the American League. In order to keep the number of teams in each league at an even number (14 – AL, 16 – NL), Milwaukee changed leagues and became a member of the National League. Two years later, the NL and AL ended their independent corporate existences and merged into a new legal entity named Major League Baseball; the two leagues remained as playing divisions. In 2001, MLB took over the struggling Montreal Expos franchise and, after the 2004 season, moved it to Washington, DC, which had been clamoring for a team ever since the second Senators' departure in 1972; the club was renamed the Nationals. In 2013, in keeping with Commissioner Bud Selig's desire for expanded interleague play, the Houston Astros were shifted from the National to the American League; with an odd number (15) in each league, an interleague contest was played somewhere almost every day during the season. At this time the divisions within each league were shuffled to create six equal divisions of five teams. Pitching dominance and rules changes By the late 1960s, the balance between pitching and hitting had swung back in favor of the pitchers once more. In 1968 Carl Yastrzemski won the American League batting title with an average of just .301, the lowest title-winning average in history. That same year, Detroit Tigers pitcher Denny McLain won 31 games—making him the last pitcher to win 30 games in a season. St. Louis Cardinals starting pitcher Bob Gibson achieved an equally remarkable feat by recording an ERA of just 1.12.
In response to these events, Major League Baseball implemented rule changes in 1969 to benefit the batters: the pitcher's mound was lowered, and the strike zone was reduced. In 1973 the American League, which had been suffering from much lower attendance than the National League, moved to increase scoring even further by adopting the designated hitter rule. Players assert themselves From the formation of the Major Leagues to the 1960s, the team owners controlled the game. After the so-called "Brotherhood Strike" of 1890 and the failure of the Brotherhood of Professional Base Ball Players and its Players' National League, the owners' control of the game seemed absolute. It lasted over 70 years despite a number of short-lived players' organizations. In 1966, however, the players enlisted the help of labor union activist Marvin Miller to form the Major League Baseball Players Association (MLBPA). The same year, Sandy Koufax and Don Drysdale—both Cy Young Award winners for the Los Angeles Dodgers—refused to re-sign their contracts, and the era of the reserve clause, which bound players to one team, began drawing to an end. The first legal challenge came in 1970. Backed by the MLBPA, St. Louis Cardinals outfielder Curt Flood took the leagues to court to negate a player trade, citing the 13th Amendment and antitrust legislation. In 1972, he finally lost his case before the United States Supreme Court by a vote of 5 to 3, but he gained large-scale public sympathy, and the damage had been done. The reserve clause survived, but it had been irrevocably weakened. In 1975, Andy Messersmith of the Dodgers and Dave McNally of the Montreal Expos played without contracts, and then declared themselves free agents in response to an arbitrator's ruling.
Handcuffed by concessions made in the Flood case, the owners had no choice but to accept the collective bargaining package offered by the MLBPA, and the reserve clause was effectively ended, replaced by the current system of free agency and arbitration. While the legal challenges were going on, the game continued. In 1969, the "Miracle Mets", just seven years after their formation, recorded their first winning season, won the National League East, and finally took the World Series. On the field, the 1970s saw some of the longest-standing records fall, along with the rise of two powerhouse dynasties. In Oakland, the Swinging A's were overpowering, winning the Series in 1972, 1973 and 1974, along with five straight division titles. The strained relationships between teammates, who included Catfish Hunter, Vida Blue and Reggie Jackson, gave the lie to the supposed need for "chemistry" between players. The National League, on the other hand, belonged to the Big Red Machine in Cincinnati, where Sparky Anderson's team, which included Pete Rose as well as Hall of Famers Tony Pérez, Johnny Bench and Joe Morgan, succeeded the A's as champions in 1975. The decade also contained great individual achievements. On April 8, 1974, Hank Aaron of the Atlanta Braves hit his 715th career home run, surpassing Babe Ruth's all-time record. He would retire in 1976 with 755, just one of numerous records he set, many of which, including total bases, still stand today. There was great pitching too: between 1973 and 1975, Nolan Ryan threw four no-hitters. He would add a record-breaking fifth in 1981 and two more before his retirement in 1993, by which time he had also accumulated 5,714 strikeouts, another record, in a 27-year career.
The marketing and hype era From the 1980s onward, the major league game changed dramatically, due to the combined effects of free agency, improvements in the science of sports conditioning, changes in the marketing and television broadcasting of sporting events, and the push by brand-name products for greater visibility. These changes led to greater labor difficulties, fan disaffection, skyrocketing prices, changes in game-play, and problems with performance-enhancing substances like steroids tainting the race for records. In spite of all this, stadium crowds generally grew. Average attendances first broke 20,000 in 1979 and 30,000 in 1993. That year total attendance hit 70 million, but baseball was hit hard by a strike in 1994, and as of 2005 it had only marginally improved on those 1993 records. (Update: Between 2009 and 2017, average attendance hovered just over the 30,000 mark, with numbers falling into the 28,000s in 2018 and 2019. The 2019 season saw a million fewer tickets sold than in the banner year of 2007; however, rising media-rights fees pushed Major League Baseball's total revenue to $10 billion in 2018, a 70% rise from a decade before.) The science of the sport changes the game During the 1980s, significant advances were made in the science of physical conditioning. Weight rooms and training equipment were improved. Trainers and doctors developed better diets and regimens to make athletes bigger, healthier, and stronger than they had ever been. Another major change during this time was the adoption of the pitch count. Complete games by starting pitchers had historically been routine. Now, pitchers were throwing harder than ever, and pitching coaches tracked how many pitches a player had thrown over the game. At anywhere from 100 to 125 pitches, starters would increasingly be pulled to preserve their arms.
Bullpens began to specialize more, with more pitchers being trained as middle relievers, and a few hurlers, usually possessing high velocity but not much durability, as closers. The science of maximizing effectiveness and career duration, while attempting to minimize injury and downtime, remains an ongoing pursuit of coaches and kinesiologists. Along with the expansion of teams, the need for more pitchers to get through a game strained the supply of quality players in a system that at the time restricted its talent searches to the United States, Canada, Latin America, and the Caribbean. Television The arrival of live televised sports in the 1950s at first increased attention and revenue for all major league clubs. The television programming was extremely regional, hurting the non-televised minor and independent leagues most. People stayed home to watch Maury Wills rather than watch unknowns at their local baseball park. Major League Baseball made sure that it controlled the rights and fees charged for broadcasts of all games, just as it had on radio. The national networks began televising national games of the week, opening the door for a national audience to see particular clubs. While most teams were broadcast in the course of a season, emphasis tended toward the league leaders with famous players and the major-market franchises that could draw the largest audience. The rise of cable In the 1970s the cable revolution began. The Atlanta Braves became a power contender on the greater revenues generated by WTBS, Ted Turner's Atlanta-based superstation, which broadcast the club as "America's Team" to cable households nationwide. The roll-out of ESPN, then of regional sports networks (now mostly under the umbrella of Fox Sports Net), changed sports news in general, and baseball in particular, given its relatively huge number of games per season.
Now under the microscope of news organizations that needed to fill 24 programming hours per day, the amount of attention—and salary—paid to major league players grew exponentially. Players who would have sought off-season jobs to make ends meet just 20 years earlier were now well-paid professionals at least, and multi-millionaires in many cases. This superstar status often rested on careers that were not as compelling as those of the baseball heroes of a less media-intense time. As player contract values soared, the number of broadcasters, commentators, columnists, and sportswriters multiplied as well. The competition for a fresh angle on any story became fierce. Media pundits began questioning the high salaries paid to players when on-field performance was deemed less than deserving. Critical commentary was more of a draw than praise, and coverage became intensely negative. Players' personal lives, which had always been off-limits except under extreme circumstances, became the fodder of editorials, insider stories on TV, and features in magazines. When the use of performance-enhancing drugs became an issue, drawing scornful criticism from fans and pundits, the gap between the sports media and the players whom they covered widened further. With the development of satellite television and digital cable, Major League Baseball launched channels with season-subscription fees, making it possible for fans to watch virtually every game played in both major leagues, everywhere, in real time. Team networks The next refinement of baseball on cable was the creation of single-team cable networks. YES Network and NESN, the cable television networks of the New York Yankees and Boston Red Sox, respectively, took in millions to broadcast games not only in New York and Boston but around the country. These networks annually generated as much revenue as, or more than, the baseball operations of large-market teams themselves.
By fencing these channels off in separate corporate entities, owners were able to exclude the income from consideration during contract negotiations. Merchandise, endorsements and sponsorships The first merchandise produced in response to the growing popularity of the game was the baseball trading card. The earliest known player cards were produced in 1868 by a pair of New York baseball-equipment purveyors. Since that time, many enterprises, notably tobacco and candy companies, have used trading cards to promote and sell their products. These cards rarely, if ever, provided any benefit directly to the players, but a growing mania for collecting and trading cards helped personalize baseball, giving some fans a more personal connection to their favorite players and introducing them to new ones. Eventually, older cards became “vintage” and rare cards gained in value until the secondary market for trading cards became a billion-dollar industry in itself, with the rarest individual cards bringing mid six figures to millions of dollars at auction. The advent of the Internet and websites such as eBay provided huge new venues for buyers, sellers and traders, some of whom have made baseball cards their living. In recent years baseball cards have been dissociated from unrelated products like tobacco and bubble gum, becoming products in their own right. Following the exit of competitor Donruss from the baseball-card industry, former bubble-gum giants Topps and Fleer came to dominate that market through exclusive contracts with players and Major League Baseball. Fleer, in turn, exited the market in 2007, leaving Topps as the only card manufacturer with an MLB contract. Genuine baseball memorabilia is also traded and sold, often at high prices. Much of what is for sale as "memorabilia" is manufactured strictly for sale and rarely has a direct connection to teams or players beyond the labeling, unless signed in person by a player.
Souvenir balls caught by fans during important games, especially significant home run balls, have great rarity value, and balls signed by players have always been treasured, traded and sold. The high value of autographs has created a class of dealers whose sole means of making a living is acquiring autographs and memorabilia from the athletes; these memorabilia hounds compete with fans to get signatures worth $20, $60, or even $100 or more into their inventory. Of great value to individual top players are endorsement contracts, in which the player's fame is used to sell anything from sports equipment to automobiles, soda and underwear. Top players can receive as much as a million dollars a year or more directly from the companies. In deals with players, teams and Major League Baseball, large corporations like Nike and Champion pay big money to make sure that their logos are seen on the clothing and shoes worn by athletes on the field. This "association branding" has become a significant revenue stream. In the late 1990s and into the 21st century, the dugout, the backstops behind home plate, and other prominent in-stadium surfaces likewise became advertising space. The Depression era During the Great Depression, the per-game attendance average fell below five thousand for the only time between the wars. At first wary of radio's potential to hurt ticket sales at the park, owners began to make broadcast deals, and by the late 1930s all teams' games went out over the air. 1933 also saw the introduction of the yearly All-Star Game, a mid-season break in which the greatest players in each league play against one another in a hard-fought but officially meaningless exhibition game. In 1936 the Baseball Hall of Fame in Cooperstown, NY, was instituted and its first five players elected: Ty Cobb, Walter Johnson, Christy Mathewson, Babe Ruth and Honus Wagner. The Hall formally opened in 1939 and remains open to this day. The war years In 1941, a year which saw the premature death of Lou Gehrig, Boston's great left fielder Ted Williams batted .406—the last time anyone has hit over .400.
During the same season Joe DiMaggio hit successfully in 56 consecutive games, an accomplishment both unprecedented and unequaled. After the United States entered World War II following the attack on Pearl Harbor, Commissioner Kenesaw Mountain Landis asked Franklin D. Roosevelt whether professional baseball should continue during the war. In the "Green Light Letter", the US president replied that baseball was important to national morale, and asked for more night games so day workers could attend. Thirty-five Hall of Fame members and more than 500 Major League Baseball players served in the war, but with the exception of D-Day, games continued. Both Williams and DiMaggio would miss playing time while in the services, with Williams also flying later in the Korean War. During this period Stan Musial led the St. Louis Cardinals to the 1942, 1944 and 1946 World Series titles. The war years also saw the founding of the All-American Girls Professional Baseball League. Baseball boomed after World War II. 1945 saw a new attendance record, and the following year average crowds leapt nearly 70% to 14,914. Further records followed in 1948 and 1949, when the average reached 16,913. While average attendances slipped to somewhat lower levels through the 1950s, 1960s and the first half of the 1970s, they remained well above pre-war levels, and total seasonal attendance regularly hit new highs from 1962 onward as the number of major league teams—and games—increased. Racial integration in baseball The post-war years in baseball also witnessed the racial integration of the sport. Participation by African Americans in organized baseball had been precluded since the 1890s by formal and informal agreements, with only a few players being surreptitiously included in lineups on a sporadic basis. American society as a whole moved toward integration in the post-war years, partially as a result of the distinguished service of African American military units such as the Tuskegee Airmen, 366th Infantry Regiment, and others.
During the baseball winter meetings in 1943, noted African-American athlete and actor Paul Robeson campaigned for integration of the sport. After World War II ended, several team managers considered recruiting members of the Negro leagues for entry into organized baseball. As early as 1901, John McGraw, later the longtime manager of the New York Giants, had tried to slip a black player, Charlie Grant, into his Baltimore Orioles lineup (reportedly by passing him off as an Indian), and after McGraw's death his wife reported finding the names of dozens of black players he had fantasized about signing. Pittsburgh Pirates owner Bill Benswanger reportedly signed Josh Gibson to a contract in 1943, and the Washington Senators were also said to be interested in his services. But those efforts (and others) were opposed by Kenesaw Mountain Landis, baseball's powerful commissioner and a staunch segregationist. Bill Veeck claimed that Landis blocked his purchase of the Philadelphia Phillies because he planned to integrate the team. While this account is disputed, Landis was in fact opposed to integration, and his death in 1944 (and subsequent replacement as commissioner by Happy Chandler) removed a major obstacle for black players in the Major Leagues. The general manager who would eventually succeed in breaking the color barrier was Branch Rickey of the Brooklyn Dodgers. Rickey himself had experienced segregation firsthand. While playing for and coaching his college team at Ohio Wesleyan University, Rickey had a black teammate named Charles Thomas. On a road trip through southern Ohio, Thomas was refused a room in a hotel. Rickey was able to get him into his own room for the night, but was taken aback to find Thomas upset and crying over the injustice. Rickey related this incident as an example of why he wanted full desegregation not only of baseball, but of the entire nation.
In the mid-1940s, Rickey compiled a list of Negro league ballplayers for possible Major League contracts. Realizing that the first African-American signee would be a magnet for prejudiced sentiment, however, Rickey was intent on finding a player with the character and temperament to tolerate the inevitable abuse. Rickey's sights eventually settled on Jackie Robinson, a shortstop with the Kansas City Monarchs. Although probably not the best player in the Negro leagues at the time, Robinson was an exceptional talent, was college-educated, and had the marketable distinction of having served as an officer during World War II. Even more importantly, Rickey judged Robinson to possess the inner strength to withstand the harsh animosity to come. To prepare him for the task, Rickey assigned Robinson in 1946 to the Dodgers' minor league team, the Montreal Royals, a season that proved an arduous emotional challenge, though Robinson enjoyed fervently enthusiastic support from the Montreal fans. On April 15, 1947, Robinson broke the color barrier, which had been tacitly recognized for almost 75 years, with his appearance for the Brooklyn Dodgers at Ebbets Field. Eleven weeks later, on July 5, 1947, the American League was integrated by the signing of Larry Doby to the Cleveland Indians. Over the next few years, a handful of black baseball players made appearances in the majors, including Roy Campanella (a teammate of Robinson in Brooklyn) and Satchel Paige (a teammate of Doby in Cleveland). Paige, who had pitched more than 2,400 innings in the Negro leagues, sometimes two and three games a day, was still effective at 42, and still playing at 59. His ERA in the Major Leagues was 3.29. However, the initial pace of integration was slow. By 1953, only six of the sixteen major league teams had a black player on the roster.
The Boston Red Sox became the last major league team to integrate its roster with the addition of Pumpsie Green on July 21, 1959. While limited in numbers, early black Major League players were outstanding on the field. In the fourteen years from 1947 to 1960, black players won one or more of the Rookie of the Year awards nine times. While never prohibited in the same fashion as African Americans, Latin American players also benefited greatly from the integration era. In 1951, two Chicago White Sox players, Venezuelan-born Chico Carrasquel and Cuban-born (and black) Minnie Miñoso, became the first Hispanic All-Stars. According to some baseball historians, Jackie Robinson and the other African-American players helped reestablish the importance of baserunning and similar elements of play that had been de-emphasized by the predominance of power hitting. From 1947 to the 1970s, African-American participation in baseball rose steadily. By 1974, 27% of baseball players were African American. As a result of this on-field experience, minorities began to make long-delayed gains in managerial positions within baseball. In 1975, Frank Robinson (who had been the 1956 Rookie of the Year with the Cincinnati Reds) was named player-manager of the Cleveland Indians, making him the first African-American manager in the major leagues. Although these front-office gains continued, Major League Baseball saw a long, slow decline in the percentage of black players after the mid-1970s. By 2007, African Americans made up less than 9% of Major League players. While this trend is largely attributed to an increased emphasis on recruitment of players from Latin America (with the proportion of Hispanic players in the major leagues rising to 29% by 2007), other factors have been cited as well. Hall of Fame player Dave Winfield, for instance, has pointed out that urban America provides fewer resources for youth baseball than in the past.
Despite the continued prevalence of Hispanic players, the percentage of black players rose again in 2008, to 10.2%. Arturo Moreno became the first Hispanic owner of an MLB franchise when he purchased the Anaheim Angels in 2003. In 2005, a Racial and Gender Report Card on Major League Baseball was issued, which generally found positive results on the inclusion of African Americans and Latinos in baseball, and gave Major League Baseball a grade of "A" or better for opportunities for players, managers and coaches as well as for MLB's central office. At that time, 37% of major league players were people of color: Latino (26 percent), African American (9 percent) or Asian (2 percent). Also by 2004, 29% of the professional staff in MLB's central office were people of color, 11% of team vice presidents were people of color, and seven of the league's managers were of color (four African Americans and three Latinos). The Major Leagues move west Baseball had been played in the West for almost as long as the National League and the American League had existed. It evolved into the Pacific Coast League (PCL), which included the Hollywood Stars, Los Angeles Angels, Oakland Oaks, Portland Beavers, Sacramento Solons, San Francisco Seals, San Diego Padres, and Seattle Rainiers. The PCL was huge in the West. A member of the National Association of Professional Baseball Leagues, it kept losing great players to the National and American leagues for less than $8,000 a player. The PCL was far more independent than the other "minor" leagues, and rebelled continuously against its Eastern masters. Clarence "Pants" Rowland, the president of the PCL, took on baseball commissioners Kenesaw Mountain Landis and Happy Chandler, at first to get better equity from the major leagues, then to form a third major league. His efforts were rebuffed by both commissioners. Chandler and several of the owners, who saw the value of the markets in the West, started to plot the extermination of the PCL.
They had one thing that Rowland did not: the financial power of the Eastern major league baseball establishment. No one was going to back a PCL club in building a major-league-size stadium if the National or the American League was going to build one too, which discouraged investment in PCL ballparks. PCL games and rivalries still drew fans, but the league's days of dominance in the West were numbered. 1953–1955 Until the 1950s, major league baseball franchises had been largely confined to the northeastern United States, with the teams and their locations remaining unchanged from 1903 to 1952. The first team to relocate in fifty years was the Boston Braves, who moved in 1953 to Milwaukee, where the club set attendance records. In 1954, the St. Louis Browns moved to Baltimore and were renamed the Baltimore Orioles. These relocations can be seen as a full-circle ending to the classic era, which had begun with the moves of teams from Milwaukee and Baltimore. In 1955, the Philadelphia Athletics moved to Kansas City. National League Baseball leaves New York In 1958 the New York market ripped apart. The Yankees were becoming the dominant draw, and the cities of the West offered generations of new fans in much more sheltered markets for the other venerable New York clubs, the Brooklyn Dodgers and the New York Giants. Placing these storied, powerhouse clubs in the two biggest cities in the West had the specific design of crushing any attempt by the PCL to form a third major league. Eager to bring these big names to the West, Los Angeles gave Walter O'Malley, owner of the Dodgers, a helicopter tour of the city and asked him to pick his spot. The Giants took over the home of the PCL's San Francisco Seals, Seals Stadium, while Candlestick Park was built for them. California The logical first candidates for major league "expansion" were the same metropolitan areas that had just attracted the Dodgers and Giants.
It is said that the Dodgers and Giants—National League rivals in New York City—chose their new cities because Los Angeles (in southern California) and San Francisco (in northern California) already had a fierce rivalry (geographical, economic, cultural and political), dating back to the state's founding. The only California expansion team—and also the first in Major League Baseball in over 70 years—was the Los Angeles Angels (later the California Angels, the Anaheim Angels, and, as of 2005, the Los Angeles Angels of Anaheim), who brought the American League to southern California in 1961. Northern California, however, would later gain its own American League team, in 1968, when the Athletics would move again, settling in Oakland, across San Francisco Bay from the Giants. 1961–1962 Along with the Angels, the other 1961 expansion team was the Washington Senators, who joined the American League and took over the nation's capital when the previous Senators moved to Minnesota and became the Twins. 1961 is also noted as being the year in which Roger Maris surpassed Babe Ruth's single season home run record, hitting 61 for the New York Yankees, albeit in a slightly longer season than Ruth's. To keep pace with the American League—which now had ten teams—the National League likewise expanded to ten teams, in 1962, with the addition of the Houston Colt .45s and New York Mets. 1969 In 1969, the American League expanded when the Kansas City Royals and Seattle Pilots, the latter in a longtime PCL stronghold, were admitted to the league. The Pilots stayed just one season in Seattle before moving to Milwaukee and becoming today's Milwaukee Brewers. The National League also added two teams that year, the Montreal Expos and San Diego Padres. 
Given the size of the expanded leagues, 12 teams apiece, each split into East and West divisions, with a playoff series to determine the pennant winner and World Series contender—the first post-season baseball instituted since the advent of the World Series itself. The Padres were the last of the core PCL teams to be absorbed. The Coast League did not die, though. After reforming and moving into new markets, it successfully transformed into a Class AAA league. 1972–2013 In 1972, the second Washington Senators moved to the Dallas-Fort Worth area and became the Texas Rangers. In 1977, the American League expanded to fourteen teams, with the newly formed Seattle Mariners and Toronto Blue Jays. Sixteen years later, in 1993, the National League likewise expanded to fourteen teams, with the newly formed Colorado Rockies and Florida Marlins (now Miami Marlins). Beginning with the 1994 season, both the AL and the NL were divided into three divisions (East, West, and Central), with the addition of a wild card team (the team with the best record among those finishing in second place) to enable four teams in each league to advance to the preliminary division series. However, due to the 1994–95 Major League Baseball strike (which canceled the 1994 World Series), the new rules did not go into effect until the 1995 World Series. In 1998, the AL and the NL each added a fifteenth team, for a total of thirty teams in Major League Baseball. The Arizona Diamondbacks joined the National League, and the Tampa Bay Devil Rays—now called simply the Rays—joined the American League. In order to keep the number of teams in each league at an even number (14 – AL, 16 – NL), Milwaukee changed leagues and became a member of the National League.<ref>See Major League Baseball#League organization.</ref> Two years later, the NL and AL ended their independent corporate existences and merged into a new legal entity named Major League Baseball; the two leagues remained as playing divisions. 
In 2001, MLB took over the struggling Montreal Expos franchise and, after the 2004 season, moved it to Washington, DC, which had been clamoring for a team ever since the second Senators' departure in 1972; the club was renamed the Nationals. In 2013, in keeping with Commissioner Bud Selig's desire for expanded interleague play, the Houston Astros were shifted from the National to the American League; with an odd number (15) in each league, an interleague contest was played somewhere almost every day during the season. At this time the divisions within each league were shuffled to create six equal divisions of five teams. Pitching dominance and rules changes By the late 1960s, the balance between pitching and hitting had swung back to favor of the pitchers once more. In 1968 Carl Yastrzemski won the American League batting title with an average of just .301, the lowest in history. That same year, Detroit Tigers pitcher Denny McLain won 31 games—making him the last pitcher to win 30 games in a season. St. Louis Cardinals starting pitcher Bob Gibson achieved an equally remarkable feat by allowing an ERA of just 1.12. In response to these events, major league baseball implemented certain rule changes in 1969 to benefit the batters. The pitcher's mound was lowered, and the strike zone was reduced. In 1973 the American League, which had been suffering from much lower attendance than the National League, made a move to increase scoring even further by initiating the designated hitter rule. Players assert themselves From the time of the formation of the Major Leagues to the 1960s, the team owners controlled the game. After the so-called "Brotherhood Strike" of 1890 and the failure of the Brotherhood of Professional Base Ball Players and its Players National League, the owners' control of the game seemed absolute. It lasted over 70 years despite a number of short-lived players organizations. 
In 1966, however, the players enlisted the help of labor union activist Marvin Miller to form the Major League Baseball Players Association (MLBPA). The same year, Sandy Koufax and Don Drysdale—both Cy Young Award winners for the Los Angeles Dodgers—refused to re-sign their contracts, and the era of the reserve clause, which held players to one team, was drawing to an end. The first legal challenge came in 1970. Backed by the MLBPA, St. Louis Cardinals outfielder Curt Flood took the leagues to court to negate a player trade, citing the 13th Amendment and antitrust legislation. In 1972, he finally lost his case before the United States Supreme Court by a vote of 5 to 3, but gained large-scale public sympathy, and the damage had been done. The reserve clause survived, but it had been irrevocably weakened. In 1975, Andy Messersmith of the Dodgers and Dave McNally of the Montreal Expos played without contracts, and then declared themselves free agents in response to an arbitrator's ruling. Handcuffed by concessions made in the Flood case, the owners had no choice but to accept the collective bargaining package offered by the MLBPA, and the reserve clause was effectively ended, to be replaced by the current system of free-agency and arbitration. While the legal challenges were going on, the game continued. In 1969, the "Miracle Mets", just seven years after their formation, recorded their first winning season, won the National League East and finally the World Series. On the field, the 1970s saw some of the longest-standing records fall, along with the rise of two powerhouse dynasties. In Oakland, the Swinging A's were overpowering, winning the Series in 1972, 1973 and 1974, and five straight division titles. The strained relationships between teammates, who included Catfish Hunter, Vida Blue and Reggie Jackson, gave the lie to the need for "chemistry" between players. 
The National League, on the other hand, belonged to the Big Red Machine in Cincinnati, where Sparky Anderson's team, which included Pete Rose as well as Hall of Famers Tony Pérez, Johnny Bench and Joe Morgan, succeeded the A's run in 1975. The decade also contained great individual achievements. On April 8, 1974, Hank Aaron of the Atlanta Braves hit his 715th career home run, surpassing Babe Ruth's all-time record. He would retire in 1976 with 755, and that was just one of numerous records he achieved, many of which, including total bases, still stand today. There was great pitching too: between 1973 and 1975, Nolan Ryan threw four "no-hit" games. He would add a record-breaking fifth in 1981 and two more before his retirement in 1993, by which time he had also accumulated 5,714 strikeouts, another record, in a 27-year career. The marketing and hype era From the 1980s onward, the major league game changed dramatically, due to the combined effects of free agency, improvements in the science of sports conditioning, changes in the marketing and television broadcasting of sporting events, and the push by brand-name products for greater visibility. These events led to greater labor difficulties, fan disaffection, skyrocketing prices, changes in game-play, and problems with the use of performance-enhancing substances like steroids tainting the race for records. In spite of all this, stadium crowds generally grew. Average attendances first broke 20,000 in 1979 and 30,000 in 1993. That year total attendance hit 70 million, but baseball was hit hard by a strike in 1994, and as of 2005 it had only marginally improved on those 1993 records. (Update: Between 2009 and 2017, average attendance hovered just over the 30,000 mark, with numbers falling into the 28,000s in '18 and '19.
The 2019 season saw a million fewer tickets sold than the banner year of 2007; however, growing media-rights fees lifted Major League Baseball's total revenue to $10 billion in 2018, a 70% rise from a decade before.) The science of the sport changes the game During the 1980s, significant advances were made in the science of physical conditioning. Weight rooms and training equipment were improved. Trainers and doctors developed better diets and regimens to make athletes bigger, healthier, and stronger than they had ever been. Another major change occurring during this time was the adoption of the pitch count. Starting pitchers throwing complete games had been nothing unusual in baseball's history. Now, pitchers were throwing harder than ever, and pitching coaches tracked how many pitches a player had thrown over the game. At anywhere from 100 to 125, pitchers increasingly would be pulled out to preserve their arms. Bullpens began to specialize more, with more pitchers being trained as middle relievers, and a few hurlers, usually possessing high velocity but not much durability, as closers. The science of maximizing effectiveness and career duration, while attempting to minimize injury and downtime, is an ongoing pursuit by coaches and kinesiologists. Along with the expansion of teams, the addition of more pitchers needed to play a complete game stressed the total number of quality players available in a system that restricted its talent searches at that time to America, Canada, Latin America, and the Caribbean. Television The arrival of live televised sports in the 1950s increased attention and revenue for all major league clubs at first. The television programming was extremely regional, hurting the non-televised minor and independent leagues most. People stayed home to watch Maury Wills rather than watch unknowns at their local baseball park.
Major League Baseball made sure that it controlled the rights to and fees charged for broadcasts of all games, just as it had with radio. The national networks began televising national games of the week, opening the door for a national audience to see particular clubs. While most teams were broadcast in the course of a season, emphasis tended toward the league leaders with famous players and the major market franchises that could draw the largest audience. The rise of cable In the 1970s the cable revolution began. The Atlanta Braves became a power contender with greater revenues generated by WTBS, Ted Turner's Atlanta-based superstation, broadcast as "America's Team" to cable households nationwide. The roll-out of ESPN, then regional sports networks (now mostly under the umbrella of Fox Sports Net) changed sports news in general and particularly baseball, with its relatively huge number of games per season. Now under the microscope of news organizations that needed to fill 24 programming hours per day, the amount of attention—and salary—paid to major league players grew exponentially. Players who would have sought off-season jobs to make ends meet just 20 years earlier were now well-paid professionals at least, and multi-millionaires in many cases. This super-star status often rested on careers that were not as compelling as those of the baseball heroes of a less media-intense time. As player contract values soared, the number of broadcasters, commentators, columnists, and sports writers also multiplied. The competition for a fresh angle on any story became fierce. Media pundits began questioning the high salaries paid to players when on-field performance was deemed less than deserving. Critical commentary was more of a draw than praise, and coverage turned intensely negative.
Players' personal lives, which had always been off-limits except under extreme circumstances, became the fodder of editorials, insider stories on TV, and features in magazines. When the use of performance-enhancing drugs became an issue, drawing scornful criticism from fans and pundits, the gap between the sports media and the players whom they covered widened further. With the development of satellite television and digital cable, Major League Baseball launched channels with season-subscription fees, making it possible for fans to watch virtually every game played, in both major leagues, everywhere, in real time. Team networks The next refinement of baseball on cable was the creation of single-team cable networks. YES Network and NESN, the cable television networks of the New York Yankees and Boston Red Sox, respectively, took in millions to broadcast games not only in New York and Boston but around the country. These networks annually generated as much revenue as, or more than, the baseball operations of their large-market teams. By fencing these channels off in separate corporate entities, owners were able to exclude the income from consideration during contract negotiations. Merchandise, endorsements and sponsorships The first merchandise produced in response to the growing popularity of the game was the baseball trading card. The earliest known player cards were produced in 1868 by a pair of New York baseball-equipment purveyors. Since that time, many enterprises, notably tobacco and candy companies, have used trading cards to promote and sell their products. These cards rarely, if ever, provided any benefit directly to the players, but a growing mania for collecting and trading cards helped personalize baseball, giving some fans a more personal connection to their favorite players and introducing them to new ones.
Eventually, older cards became "vintage" and rare cards gained in value until the secondary market for trading cards became a billion-dollar industry in itself, with the rarest individual cards bringing mid six figures to millions of dollars at auction. The advent of the Internet and websites such as eBay provided huge new venues for buyers, sellers and traders, some of whom have made baseball cards their living. In recent years baseball cards have been disassociated from unrelated products like tobacco and bubble gum, becoming products in their own right. Following the exit of competitor Donruss from the baseball-card industry, former bubble-gum giants Topps and Fleer came to dominate that market through exclusive contracts with players and Major League Baseball. Fleer, in turn, exited the market in 2007, leaving Topps as the only card manufacturer with an MLB contract. Other genuine baseball memorabilia also trades and sells, often at high prices. Much of what is for sale as "memorabilia" is manufactured strictly for sale and rarely has a direct connection to teams or players beyond the labeling, unless signed in person by a player. Souvenir balls caught by fans during important games, especially significant home run balls, have great rarity value, and balls signed by players have always been treasured, traded and sold. The high value of autographs created a new class of businessmen whose sole means of making a living was acquiring autographs and memorabilia from the athletes. Memorabilia hounds fought with fans to get signatures worth $20, $60, or even $100 or more into their inventory. Of great value to individual top players are endorsement contracts wherein the player's fame is used to sell anything from sports equipment to automobiles, soda and underwear. Top players can receive as much as a million dollars a year or more directly from the companies.
In deals with players, teams and Major League Baseball, large corporations like NIKE and Champion pay big money to make sure that their logos are seen on the clothing and shoes worn by athletes on the field. This "association branding" has become a significant revenue stream. In the late 1990s and into the 21st century, the dugout, the backstops behind home plate, and anywhere else that might be seen by a camera became fair game for the insertion of advertising. Player wealth Beginning with the 1972 Flood v. Kuhn Supreme Court case, management's grip on players, as embodied in the reserve clause, began to slip. In 1976, the Messersmith/McNally arbitration, also known as the Seitz Decision, effectively destroyed the reserve clause. Players who had been dramatically underpaid for generations came to be replaced by players who were paid extremely well for their services. Sports agents A new generation of sports agents arose, hawking the talents of free-agent players who knew baseball but didn't know the business end of the game. The agents broke down what the teams were generating in revenue off the players' performances. They calculated what their player might be worth to energize a television contract, or provide more merchandise revenue, or put more fans into stadium seats. Management pushed back; the dynamic produced a variety of compromises which, ideally, left all parties equally unsatisfied. Business Under the Major League Baseball contract, players must play for minimum salary for six years, at which time they become free agents.
The winners are not announced until after the World Series. The BBWAA began by polling three writers in each league city in 1938, reducing that number to two per league city in 1961. The BBWAA does not offer a clear-cut definition of what "most valuable" means, instead leaving the judgment to the individual voters. First basemen, with 34 winners, have won the most MVPs among infielders, followed by second basemen (16), third basemen (15), and shortstops (15). Of the 25 pitchers who have won the award, 15 are right-handed while 10 are left-handed. Walter Johnson, Carl Hubbell, and Hal Newhouser are the only pitchers who have won multiple times, Newhouser winning consecutively in 1944 and 1945. Hank Greenberg, Stan Musial, Alex Rodriguez, and Robin Yount have won at different positions, while Rodriguez is the only player who has won the award with two different teams at two different positions. Barry Bonds has won the most often (seven times) and the most consecutively (four: 2001–04). Jimmie Foxx was the first player to win multiple times; ten players have won three times, and 19 have won twice. Frank Robinson is the only player to win the award in both the American and National Leagues. The award's only tie occurred in the National League in 1979, when Keith Hernandez and Willie Stargell received an equal number of points. There have been 19 unanimous winners, who received all the first-place votes. The New York Yankees have the most winning players with 22, followed by the St. Louis Cardinals with 17 winners. The award has never been presented to a member of the following three teams: Arizona Diamondbacks, New York Mets, and Tampa Bay Rays. In recent decades, pitchers have rarely won the award. When Shohei Ohtani won the AL award in 2021, he became the first pitcher in either league to be named the MVP since Clayton Kershaw in 2014, and the first in the American League since Justin Verlander in 2011. Ohtani also became the first two-way player to win this award.
Chalmers Award (1911–1914) Before the 1910 season, Hugh Chalmers of Chalmers Automobile announced he would present a Chalmers Model 30 automobile to the player with the highest batting average in Major League Baseball at the end of the season. The 1910 race for best average in the American League was between the Detroit Tigers' widely disliked Ty Cobb and Nap Lajoie of the Cleveland Indians. On the last day of the season, Lajoie overtook Cobb's batting average with seven bunt hits against the St. Louis Browns. American League President Ban Johnson said a recalculation showed that Cobb had won the race anyway, and Chalmers ended up awarding cars to both players. The following season, Chalmers created the Chalmers Award. A committee of baseball writers was to convene after the season to determine the "most important and useful player to the club and to the league". Since the award was not as effective at advertising as Chalmers had hoped, it was discontinued after 1914. League Awards (1922–1929) In 1922 the American League created a new award to honor "the baseball player who is of the greatest all-around service to his club". Winners, voted on by a committee of eight baseball writers chaired by James Crusinberry, received a bronze medal and a cash prize. Voters were required to select one player from each team, and player-coaches and prior award winners were ineligible. Famously, these criteria resulted in Babe Ruth winning only a single MVP award before it was dropped after 1928. The National League award, without these restrictions, lasted from 1924 to 1929.
Baseball Writers' Association of America's Most Valuable Player (1931–present) The BBWAA first awarded the modern MVP after the 1931 season, adopting the format the National League used to distribute its league award. One writer in each city with a team filled out a ten-place ballot, with ten points for the recipient of a first-place vote, nine for a second-place vote, and so on. In 1938, the BBWAA raised the number of voters to three per city and gave 14 points for a first-place vote. The only significant change since then occurred in 1961, when the number of voters was reduced to two per league city. Key Wins by team See also "Esurance MLB Awards" Best Major Leaguer (in MLB; all positions) (there are also Best Hitter and Best Pitcher awards (in MLB)) "Players Choice Awards" Player of the Year (in MLB; all positions) (there are also Outstanding Player and Outstanding Pitcher awards (in each league)) Baseball America Major League Player of the Year (in MLB; all positions) Baseball Digest Player of the Year (in MLB; position players only; from 1969 to 1993, included all positions; in 1994, a separate Pitcher of the Year award was added) Best Major League Baseball Player ESPY Award (in MLB; all positions) The Sporting News Most Valuable Player Award (in each league) (discontinued in 1946) Sporting News Player of the Year (in MLB; position players only) List of Major League Baseball awards Baseball awards Notes A player is considered inactive if he has announced his retirement or not played for a full season. A unanimous victory indicates that the player received all possible first-place votes. Torre is a member of the Hall of Fame, but not as a player. He was inducted as a manager. Hernandez and Stargell both received 216 points in the 1979 voting. References External links Most Valuable Player MVP Awards & Cy Young Awards Winners (1911–present) (and "Multiple Winners of the MVP and Cy Young Awards"). Baseball-Reference.com. Retrieved 2016-11-07.
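The ballot arithmetic described above (a ten-place ballot; since 1938, 14 points for a first-place vote, then 9 down to 1) can be sketched in a few lines of Python. This is an illustrative sketch only: the player names and the two mini-ballots are hypothetical, not actual BBWAA ballots, and the totals shown have nothing to do with the real 1979 vote.

```python
# Illustrative sketch of BBWAA-style MVP ballot tallying (not an official tool).
from collections import defaultdict

# Post-1938 point values: 14 for 1st place, then 9, 8, ..., 1 for places 2-10.
POINTS = [14, 9, 8, 7, 6, 5, 4, 3, 2, 1]

def tally(ballots):
    """Each ballot is a list of up to ten player names, best first."""
    totals = defaultdict(int)
    for ballot in ballots:
        for place, player in enumerate(ballot):
            totals[player] += POINTS[place]
    # Highest total wins; equal totals produce a tie, as happened in the NL in 1979.
    return dict(totals)

ballots = [
    ["Hernandez", "Stargell", "Schmidt"],  # hypothetical three-place ballots
    ["Stargell", "Hernandez", "Schmidt"],
]
print(tally(ballots))  # Hernandez and Stargell tie at 23 points; Schmidt has 16
```

The 14-point first-place bonus means a single first-place vote outweighs a first-and-second split at every lower position, which is why unanimous winners are singled out in the article's notes.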
The Rookie of the Year Award was established in 1940 by the Chicago chapter of the BBWAA, which selected an annual winner from 1940 through 1946. The award became national in 1947; Jackie Robinson, the Brooklyn Dodgers' second baseman, won the inaugural award. One award was presented for all of MLB in 1947 and 1948; since 1949, the honor has been given to one player each in the NL and AL. Originally, the award was known as the J. Louis Comiskey Memorial Award, named after the Chicago White Sox owner of the 1930s. The award was renamed the Jackie Robinson Award in July 1987, 40 years after Robinson broke the baseball color line. Seventeen players have been elected to the National Baseball Hall of Fame—Robinson, six AL players, and ten others from the NL. The award has been shared twice: once by Butch Metzger and Pat Zachry of the NL in 1976; and once by John Castino and Alfredo Griffin of the AL in 1979. Members of the Brooklyn and Los Angeles Dodgers have won the most awards of any franchise (18), twice the total of the New York Yankees; members of the Philadelphia and Oakland Athletics (eight) have produced the most in the AL. Fred Lynn and Ichiro Suzuki are the only two players who have been named Rookie of the Year and Most Valuable Player in the same year, and Fernando Valenzuela is the only player to have won Rookie of the Year and the Cy Young Award in the same year. Sam Jethroe is the oldest player to have won the award, at age 32, 33 days older than 2000 winner Kazuhiro Sasaki (also 32). Randy Arozarena of the Tampa Bay Rays and Jonathan India of the Cincinnati Reds are the most recent winners. Qualifications and voting From 1947 through 1956, each BBWAA voter used discretion as to who qualified as a rookie. In 1957, the term was first defined as someone with fewer than 75 at-bats or 45 innings pitched in any previous Major League season. This guideline was later amended to 90 at-bats, 45 innings pitched, or 45 days on a Major League roster before September 1 of the previous year.
The current standard of 130 at-bats, 50 innings pitched or 45 days on the active roster of a Major League club (excluding time in military service or on the injury list) before September 1 was adopted in 1971. The award has drawn criticism in recent years because several players with experience in Nippon Professional Baseball (NPB) have won it, such as Hideo Nomo in 1995, Kazuhiro Sasaki in 2000, Ichiro Suzuki in 2001, and Shohei Ohtani in 2018. The current definition of rookie status for the award is based only on Major League experience, but some feel that past NPB players are not true rookies because of their past professional experience. Others, however, believe it should make no difference, since the first recipient and the award's namesake played in the Negro leagues before his MLB career and thus could also not be considered a "true rookie". This issue arose in 2003 when Hideki Matsui narrowly lost the AL award to Ángel Berroa. Jim Souhan of the Minneapolis Star Tribune said he did not see Matsui as a rookie in 2003 because "it would be an insult to the Japanese league to pretend that experience didn't count." The Japan Times ran a story in 2007 on the labeling of Daisuke Matsuzaka, Kei Igawa, and Hideki Okajima as rookies, saying "[t]hese guys aren't rookies." Past winners such as Jackie Robinson, Don Newcombe, and Sam Jethroe had professional experience in the Negro leagues. Winners Key Major Leagues combined (1947–48) American
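The 1971 eligibility thresholds above reduce to a simple "all limits still unmet" check: crossing any one of the three ends a player's rookie status. A minimal sketch, assuming a hypothetical helper (the function and argument names are mine, not MLB's; it also ignores the military-service and injury-list exclusions the rule carves out):

```python
# Illustrative check of the 1971 rookie-status limits quoted above
# (130 at-bats, 50 innings pitched, 45 days on an active roster before
# September 1). Not an official MLB rule engine; the exclusions for
# military service and injury-list time are deliberately omitted here.
def is_rookie(at_bats, innings_pitched, roster_days_before_sept1):
    """A player retains rookie status only while under ALL three limits."""
    return (at_bats < 130
            and innings_pitched < 50
            and roster_days_before_sept1 < 45)

print(is_rookie(at_bats=100, innings_pitched=0, roster_days_before_sept1=30))  # True
print(is_rookie(at_bats=150, innings_pitched=0, roster_days_before_sept1=30))  # False
```

Note that the check is a conjunction: a September call-up with 140 at-bats but only 30 roster days before September 1 would still lose rookie status on the at-bat limit alone.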
The NLCS began in 1969, when each league was reorganized into two divisions, East and West. The two division winners within each league played each other in a best-of-five series to determine who would advance to the World Series. In 1985, the format changed to best-of-seven. The NLCS and ALCS, since the expansion to seven games, are always played in a 2–3–2 format: games 1, 2, 6, and 7 are played in the stadium of the team that has home field advantage, and games 3, 4, and 5 are played in the stadium of the team that does not. Home field advantage is given to the team that has the better record, except that the team that made the postseason as the Wild Card shall not get home field advantage. From 1969 to 1993, home field advantage was alternated between divisions each year regardless of regular season record, and from 1995 to 1997 home field advantage was predetermined before the season. In 1981, a one-off divisional series was held due to a split season caused by a players' strike. In 1994, the league was restructured into three divisions, with the three division winners and a wild card team advancing to a best-of-five postseason round, the now-permanent National League Division Series (NLDS). The winners of that round advance to the best-of-seven NLCS. The Milwaukee Brewers, an American League team between 1969 and 1997, and the Houston Astros, a National League team between 1962 and 2012, are the only franchises to play in both the ALCS and NLCS. The Astros are the only team to have won both an NLCS (2005) and an ALCS (2017, 2019). The Astros made four NLCS appearances before moving to the AL in 2013. Every current National League franchise has appeared in the NLCS. Championship Trophy The Warren C. Giles Trophy is awarded to the NLCS winner.
Warren Giles served as president of the National League from 1951 to 1969. Most Valuable Player Award See: League Championship Series Most Valuable Player Award#National League winners A Most Valuable Player (MVP) award is given to the outstanding player in the NLCS. No MVP award is given for Division Series play. The MVP award has been given to a player on the losing team twice: in 1986 to Mike Scott of the Houston Astros, and in 1987 to Jeffrey Leonard of the San Francisco Giants. Although the National League began its LCS MVP award in 1977, the American League did not begin its own until 1980. The winners are listed in several locations: in the "Series MVP" column of the NLCS results table below, in the article League Championship Series Most Valuable Player Award, and on the MLB website. Results Appearances by team Years of appearance In the sortable table below, teams are ordered first by number of wins, then by number of appearances, and finally by year of first appearance. In the "Season(s)" column, bold years indicate winning appearances. Frequent matchups See also List of National League pennant winners List of National League Wild Card winners National League Division Series American League
In 1994, the league was restructured into three divisions, with the three division winners and a wild-card team advancing to a best-of-five postseason round, known as the American League Division Series (ALDS). The winners of that round then advanced to the best-of-seven ALCS. In 2012, the playoffs were expanded again so that two wild card teams face off in a one-game wild card round to determine which team advances to the division series, with the playoffs then continuing as before 2012 (though with the possibility of a fifth seed being in the playoffs and a fourth seed being out) after the end of the wild card round. This is the system currently in use. The ALCS and NLCS, since the expansion to best-of-seven, are always played in a 2–3–2 format: Games 1, 2, 6, and 7 are played in the stadium of the team that has home field advantage, and Games 3, 4, and 5 are played in the stadium of the team that does not. The series concludes when one team records its fourth win. Since 1998, home field advantage has been given to the team that has the better regular season record, except that the team that made the postseason as the Wild Card shall not get home field advantage. If both teams have identical records in the regular season, then home field advantage goes to the team that has the winning head-to-head record. From 1969 to 1993, home-field advantage alternated between the two divisions, and from 1995 to 1997 home-field advantage was determined before the season. Eight managers have led a team to the ALCS in three consecutive seasons; the record for most consecutive ALCS appearances by one manager, however, belongs to Joe Torre, who led the New York Yankees to four straight from 1998 to 2001.
The Oakland Athletics (1971–75) and the Houston Astros (2017–present) are the only teams in the American League to have made five consecutive American League Championship Series appearances (with the latter being the first team ever to win the ALDS five straight years). The Milwaukee Brewers, an American League team between 1969 and 1997, and the Houston Astros, a National League team between 1962 and 2012, are the only franchises to play in both the ALCS and NLCS. Before 1969, the American League champion (the "pennant winner") was determined by the best win-loss record at the end of the regular season. There was one ad hoc single-game playoff held, in , due to a tie under this formulation. (The National League had to resolve ties four times, but used three-game playoff series.) The ALCS started in 1969, when the AL reorganized into two divisions, East and West. The winners of each division played each other in a best-of-five series to determine who would advance to the World Series. In 1985, the format changed to best-of-seven. In 1981, a division series was held due to a split season caused by a players' strike.
For the 2020 Major League Baseball season only, there was an expanded playoff format, owing to an abbreviated 60-game regular season due to the COVID-19 pandemic. Eight teams qualified from the American League: the top two teams in each division plus the next two best records among the remaining teams. These eight teams played a best-of-three game series to determine placement in the ALDS. The regular format returned for the 2021 season. As of 2021, the Yankees have played in and won the most division series, with thirteen wins in twenty-two appearances. In 2015, the Toronto Blue Jays and Houston Astros were the final American League teams to make their first appearances in the ALDS. The Astros had been in the National League through 2012, and had played in the National League Division Series (NLDS) seven times. Determining the matchups The ALDS is a best-of-five series in which the wild card team is assigned to play the divisional winner with the best winning percentage in the regular season in one series, while the other two division winners meet in the other series, with the team with the second-best winning percentage getting home field. (From 1998 to 2011, if the wild-card team and the division winner with the best record were from the same division, the wild-card team played the division winner with the second-best record, and the remaining two division leaders played each other.) According to Nate Silver, the advent of this playoff series, and especially of the wild card, has caused teams to focus more on "getting to the playoffs" rather than "winning the pennant" as the primary goal of the regular season. Beginning with the 2012 season, the wild card team that advances to the Division Series faces the number 1 seed, regardless of whether they are in the same division. The two series winners move on to the best-of-seven ALCS.
A division series was first held in 1981, because of a split season caused by a players' strike, in which the New York Yankees won the Eastern Division series over the Milwaukee Brewers (who were in the American League until 1998) in five games, while in the Western Division the Oakland Athletics swept the Kansas City Royals (the only team with an overall losing record to ever make the postseason). In 1994, it was returned permanently when Major League Baseball (MLB) restructured each league into three divisions, but with a different format than in 1981. Each of the division winners, along with one wild card team, qualify for the Division Series. Despite being planned for the 1994 season, the post-season was cancelled that year due to the 1994–95 Major League Baseball strike. In 1995, the first season to feature a division series, the Western Division champion Seattle Mariners defeated the wild card New York Yankees three games to two, while the Central Division champion Cleveland Indians defeated the Eastern Division champion Boston Red Sox in a three-game sweep. From 1994–2011, the wild card was given to the team in the American League with the best overall record that was not a division champion. Beginning with the 2012 season, a second wild card team was added, and the two wild card teams play a single-game playoff to determine which team would play in the ALDS.
Format The NLDS is a best-of-five series in which the wild card team is assigned to play the divisional winner with the best winning percentage in the regular season in one series, while the other two division winners meet in the other series, with the team with the second-best winning percentage getting home field. (From 1998 to 2011, if the wild-card team and the division winner with the best record were from the same division, the wild-card team played the division winner with the second-best record, and the remaining two division leaders played each other.) The winner of the wild card has won the first round seven out of the 11 years since the re-alignment and creation of the NLDS. According to Nate Silver, the advent of this playoff series, and especially of the wild card, has caused teams to focus more on "getting to the playoffs" rather than "winning the pennant" as the primary goal of the regular season. Beginning with the 2012 season, the wild card team that advances to the Division Series faces the number 1 seed, regardless of whether they are in the same division. The two series winners move on to the best-of-seven NLCS. Home-field advantage goes to the team with the better regular season record (or head-to-head record if there is a tie between two or more teams), except for the wild-card team, which never receives home field advantage. Beginning in 2003, MLB implemented a rule giving a slightly greater advantage to the team with the best regular season record from the league that wins the All-Star Game. In order to spread out the Division Series games for broadcast purposes, the two NLDS series follow one of two off-day schedules. Starting in 2007, after consulting the MLBPA, MLB decided to allow the team with the best record in the league that wins the All-Star Game to choose whether to use the seven-day schedule (1-2-off-3-4-off-5) or the eight-day schedule (1-off-2-off-3-4-off-5).
The team only gets to choose the schedule; the opponent is still determined by win-loss records. Initially, the best-of-five series was played in a 2–3 format, with the first two games set at home for the lower-seeded team and the last three for the higher seed. Since 1998, the series has followed a 2–2–1 format, where the higher-seeded team plays at home in Games 1 and 2, the lower seed plays at home in Game 3 and Game 4 (if necessary), and if a Game 5 is needed, the teams return to the higher seed's field. When MLB added a second wild card team in 2012, the Division Series re-adopted the 2–3 format due to scheduling conflicts. It reverted to the 2–2–1 format starting in 2013. Results Appearances by team Years of appearance In the sortable table below, teams are ordered first by number of appearances. In 2020, the postseason field was expanded in response to a regular season shortened by the COVID-19 pandemic. Eight teams qualified from the National League: the top two teams in each division plus the next two best records among the remaining teams. These eight teams played a best-of-three series to determine placement in the NLDS. The regular format returned for the 2021 season. As of 2021, the Atlanta Braves have played in the most NL Division Series, with seventeen appearances. The St. Louis Cardinals have won the most NL Division Series, winning eleven of the fourteen series in which they have played. The Pittsburgh Pirates (who finished with a losing record every year from 1993 to 2012) were the last team to make their first appearance in the NL Division Series, debuting in 2013 after winning the 2013 National League Wild Card Game. In 2008, the Milwaukee Brewers became the first team to play in a Division Series in both leagues when they won the National League wild card, their first postseason berth since winning the American League East Division title in 1982 before switching leagues in 1998. Milwaukee had competed in an American League Division Series in the strike-shortened 1981 season.
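The 2–3 and 2–2–1 home-site patterns described above can be written out game by game; a small sketch (the table and helper function are illustrative, not from the source):

```python
# Home sites by game under the two Division Series formats described above:
# "H" = higher seed hosts, "L" = lower seed hosts.
FORMATS = {
    "2-3":   ["L", "L", "H", "H", "H"],  # initial format, re-adopted in 2012
    "2-2-1": ["H", "H", "L", "L", "H"],  # used since 1998, and again since 2013
}

def host(fmt, game):
    """Return which seed hosts the given game (1-5) of a best-of-five series."""
    return FORMATS[fmt][game - 1]

print(host("2-2-1", 5))  # a deciding Game 5 returns to the higher seed's field
```

Note how the 2–2–1 layout guarantees the higher seed hosts a deciding Game 5, while the 2–3 layout concedes the first two games to the lower seed's park.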
Game 4 The Yankees sent Orlando Hernandez to the mound while the Diamondbacks elected to bring back Curt Schilling on three days' rest. Both pitchers gave up home runs, with Schilling doing so to Shane Spencer in the third inning and Hernandez doing so to Mark Grace in the fourth. Hernandez pitched solid innings, giving up four hits, while Schilling went seven innings and gave up three. With the game still tied entering the eighth, Arizona struck. After Mike Stanton recorded the first out of the inning, Luis Gonzalez singled and Erubiel Durazo hit a double to bring him in. Matt Williams followed by grounding into a fielder's choice off of Ramiro Mendoza, which scored pinch runner Midre Cummings and gave the team a 3–1 lead. With his team on the verge of taking a commanding 3–1 series lead, Diamondbacks manager Bob Brenly elected to bring in closer Byung-hyun Kim in the bottom of the eighth for a two-inning save. Kim, at 22, became the first Korean-born player ever to play in a World Series. Kim struck out the side in the eighth, but ran into trouble in the ninth. Derek Jeter led off by trying to bunt for a hit but was thrown out by Williams. Paul O'Neill then lined a single in front of Gonzalez. After Bernie Williams struck out, Kim seemed to be out of trouble with Tino Martinez coming to the plate. However, Martinez drove the first pitch he saw from Kim into the right-center field bleachers, tying the score at 3–3. The Yankees were not done, as Jorge Posada walked and David Justice moved him into scoring position with a single. Kim struck Spencer out to end the threat. When the scoreboard clock in Yankee Stadium passed midnight, World Series play in November began, with the scoreboard message "Welcome to November Baseball". Mariano Rivera took the hill for the Yankees in the tenth and retired the Diamondbacks in order. Kim went out for a third inning of work and retired Scott Brosius and Alfonso Soriano, but Jeter hit an opposite-field home run on a 3–2 count.
This home run gave the Yankees a 4–3 victory and tied the Series at two games apiece, guaranteeing a return trip to Arizona. Jeter became the first player to hit a November home run, earning him the tongue-in-cheek nickname "Mr. November". Game 5 Game 5 saw the Yankees return to Mike Mussina for the start while the Diamondbacks sent Miguel Batista, who had not pitched in twelve days, to the mound. Batista pitched strong scoreless innings, striking out six. Mussina bounced back from his poor Game 1 start, recording ten strikeouts, but allowed solo home runs in the fifth inning to Steve Finley and Rod Barajas. With the Diamondbacks leading 2–0 in the ninth, Byung-hyun Kim was called upon for the save despite having thrown three innings the night before. Jorge Posada doubled to open the inning, but Kim got Shane Spencer to ground out and then struck out Chuck Knoblauch. As had happened the previous night, Kim could not hold the lead: Scott Brosius hit a 1–0 pitch over the left-field wall, the second straight game-tying ninth-inning home run for the Yankees. Kim was pulled from the game in favor of Mike Morgan, who recorded the final out. Morgan retired the Yankees in order in the 10th and 11th innings, while the Diamondbacks got to Mariano Rivera in the 11th. Danny Bautista and Erubiel Durazo opened the inning with hits, and Matt Williams advanced them into scoring position with a sacrifice bunt. Rivera then intentionally walked Steve Finley to load the bases, but got Reggie Sanders to line out and Mark Grace to ground out to end the inning. Arizona went to midseason trade acquisition Albie Lopez in the 12th, and the first batter he faced, Knoblauch (who had entered the game as a pinch runner), singled. Brosius moved him over with a bunt, and Alfonso Soriano then ended the game with an RBI single to give the Yankees a 3–2 victory and a 3–2 series lead as the series went back to Phoenix. Lopez would not pitch again in the series.
Sterling Hitchcock got the win for the Yankees after he relieved Rivera for the twelfth. Game 6 With Arizona in a must-win situation, Randy Johnson pitched seven innings, struck out seven, and gave up just two runs, and Bobby Witt and Troy Brohawn finished the blowout. The Diamondbacks struck first when Tony Womack hit a leadoff double off of Andy Pettitte and scored on Danny Bautista's single in the first. The next inning, Womack's bases-loaded single scored two and Bautista's single scored another. The Yankees loaded the bases in the third on a single and two walks, but Johnson struck out Jorge Posada to end the inning. The Diamondbacks broke the game open with eight runs in the bottom half. Pettitte allowed a leadoff walk to Greg Colbrunn and a subsequent double to Matt Williams before being relieved by Jay Witasick, who allowed four straight singles to Reggie Sanders, Jay Bell, Damian Miller, and Johnson that scored three runs. After Womack struck out, Bautista's single scored two more runs and Luis Gonzalez's double scored another, with Bautista being thrown out at home. Colbrunn's single and Williams's double scored a run each before Sanders struck out to end the inning. In the fourth, Bell reached first on a strike-three wild pitch before scoring on Miller's double. Johnson struck out before Womack singled to knock Witasick out of the game. With Randy Choate pitching, Yankees second baseman Alfonso Soriano's error on Bautista's ground ball allowed Miller to score and put runners on first and second before Gonzalez's single scored the Diamondbacks' final run. Choate and Mike Stanton kept them scoreless for the rest of the game. Pettitte was charged with six runs in two innings, while Witasick was charged with nine runs. The Yankees scored their only runs in the sixth on back-to-back one-out singles by Shane Spencer and Luis Sojo with runners on second and third. The Diamondbacks hit six doubles, and Danny Bautista batted 3-for-4 with five RBIs.
The team set a World Series record with 22 hits and handed the New York Yankees their most lopsided loss in 293 postseason games, a mark since surpassed by a 16–1 loss to the Boston Red Sox in 2018. The 15–2 win evened the series at three games apiece and set up a Game 7 for the ages between Roger Clemens and Curt Schilling, the latter again pitching on three days' rest. Game 7 It was a matchup of two 20-game winners in the Series finale. Roger Clemens, at 39 years old, became the oldest Game 7 starter ever. Curt Schilling had already started two games of the Series and pitched his 300th inning of the season on just three days' rest. The two aces matched each other inning by inning, and after seven full innings the game was tied at 1–1. The Diamondbacks scored first in the sixth inning on a Steve Finley single and a Danny Bautista double (Bautista was called out at third base). The Yankees responded with an RBI single from Tino Martinez, which drove in Derek Jeter, who had singled earlier. Brenly stayed with Schilling into the eighth, and the move backfired as Alfonso Soriano hit a home run on an 0–2 pitch. After Schilling struck out Scott Brosius, he gave up a single to David Justice, and he left the game trailing 2–1. When Brenly came to the mound to remove Schilling, he was heard on the Sounds of the Game microphone telling his clearly upset pitcher, "love you brother, you're my hero" and assuring him that "that ain't gonna beat us, we're gonna get that back and then some." He then brought in Game 5 starter Miguel Batista to get Jeter out and then, in an unconventional move, brought in the previous night's starter and winner Randy Johnson, who had thrown 104 pitches, in relief to keep it a one-run game.
It proved to be a smart move, as Johnson retired pinch hitter Chuck Knoblauch (who batted for the left-handed Paul O'Neill) on a fly out to Bautista in right field, then returned to the mound for the top of the ninth, where he got Bernie Williams to fly out to Steve Finley in center field, got Martinez to ground out to Tony Womack at shortstop, and then struck out catcher Jorge Posada to send the game to the bottom of the ninth inning. With the Yankees ahead 2–1 in the bottom of the eighth, manager Joe Torre had turned the game over to his ace closer Mariano Rivera for a two-inning save. Rivera struck out the side in the eighth, including Arizona's Luis Gonzalez, Matt Williams, and Bautista, lowering his postseason ERA to a Major League-best 0.70. Although he was effective in the eighth, this game would end in the third ninth-inning comeback of the Series. Mark Grace led off the inning with a single to center on a 1–0 pitch. Rivera's errant throw to second base on a bunt attempt by catcher Damian Miller on an 0–1 pitch put runners on first and second. Jeter tried to reach for the ball, but got tangled in the legs of pinch-runner David Dellucci, who was sliding in an attempt to break up the double play. During the next at bat, Rivera appeared to regain control when he fielded a bunt by pinch hitter Jay Bell (who was hitting for Johnson) and threw out Dellucci at third base, but third baseman Brosius decided to hold onto the baseball instead of throwing to first to complete the double play. Midre Cummings was sent in to pinch-run for Damian Miller, who had reached second base safely. With Cummings at second and Bell at first, the next batter, Womack, hit a double down the right-field line on a 2–2 pitch that tied the game and earned Rivera a blown save, his first in a postseason since 1997. Bell advanced to third, and the Yankees pulled the infield and outfield in as the potential winning run (Bell) stood at third with fewer than two outs.
After Rivera unintentionally hit Craig Counsell with an 0–1 pitch, the bases were loaded. On an 0–1 pitch, with Williams in the on-deck circle, Gonzalez lofted a soft floater of a single over the drawn-in Jeter that barely reached the outfield grass, plating Jay Bell with the winning run. Gonzalez's single ended New York's bid for a fourth consecutive title and brought Arizona its first championship in its fourth year of existence, making the Diamondbacks the fastest expansion team to win a World Series (beating out the 1997 Florida Marlins, who had done it in their fifth season). It was also the first, and remains the only, major professional sports championship for the state of Arizona. Randy Johnson picked up his third win. Rivera took the loss, the only postseason loss of his career. In 2009, Game 7 of the 2001 World Series was chosen by Sports Illustrated as the Best Postseason Game of the Decade (2000–2009). Composite box 2001 World Series (4–3): Arizona Diamondbacks (N.L.) over New York Yankees (A.L.) Media coverage For the second consecutive year, Fox carried the World Series over its network with its top broadcast team, Joe Buck and Tim McCarver (himself a Yankees broadcaster). This was the first year of Fox's exclusive rights to the World Series, which it has held ever since; under the previous contract, Fox broadcast the World Series only in even-numbered years while NBC broadcast it in odd-numbered years. This contract also gave Fox exclusive rights to the entire baseball postseason, which aired over its family of networks; the contract was modified following Disney's purchase of Fox Family Channel shortly after the World Series ended, as ESPN regained its postseason rights following a year of postseason games on ABC Family, Fox Family's successor. ESPN Radio provided national radio coverage for the fourth consecutive year, with Jon Miller and Joe Morgan calling the action.
Locally, the Series was broadcast by KTAR-AM in Phoenix with Thom Brennaman, Greg Schulte, Rod Allen and Jim Traber, and by WABC-AM in New York City with John Sterling and Michael Kay. This would be Sterling and Kay's last World Series working together, and Game 7 would be the last Yankee broadcast on WABC. Kay moved to television and the new YES Network the following season, and WCBS picked up radio rights to the Yankees. It was Kay who announced Derek Jeter's game-winning home run in Game 4 of the series and subsequently anointed him "Mr. November". Aftermath Days after Game 7, the Yankees' loss proved to be life-saving for Enrique Wilson. As there was no Yankees victory parade down the Canyon of Heroes, Wilson moved up his flight home to the Dominican Republic and was thus spared from boarding American Airlines Flight 587, which crashed in Belle Harbor, Queens, killing everyone on board. During the offseason, several Yankees moved on to other teams or retired, the most notable changes being the signing of Jason Giambi to replace Martinez and the retirements of Brosius and O'Neill. Martinez would later finish his career with the Yankees in 2005 after spending the previous three years in St. Louis and Tampa Bay. After winning the NL West again in 2002, the Diamondbacks were swept 3–0 by St. Louis in the NLDS. From there they declined, losing 111 games in 2004 as Bob Brenly was fired during that season. Arizona would not win another NL West title until 2007. Schilling was traded to the Boston Red Sox after the 2003 season and in 2004 helped lead them to their first world championship since 1918. He helped them win another championship in 2007 and retired after four years with Boston, having missed the entire 2008 season with a shoulder injury.
Johnson was traded to the Yankees after the 2004 season, a season that saw him throw a perfect game against the Atlanta Braves, though he would be traded back to the Diamondbacks two years later and finish his career with the San Francisco Giants in 2009. The last player from the 2001 Diamondbacks roster, Lyle Overbay, retired following the 2014 season with the Milwaukee Brewers, while the last player from the 2001 Yankees, Randy Choate, retired on February 16, 2017. From 2002 through 2007, the Yankees' postseason misfortune continued: the team lost the ALDS to the Anaheim Angels in 2002, the World Series to the Florida Marlins in 2003, the ALCS to the Boston Red Sox in 2004 (in the process becoming the first team in postseason history to blow a 3–0 series lead), the ALDS again to the Angels in 2005, the ALDS to Detroit in 2006, and the ALDS to Cleveland in 2007. In addition, including the World Series loss in 2001, every World Series champion from 2001 to 2004 won the title at the Yankees' expense in postseason play, an AL record and, as of 2021, tied for the MLB record with the Los Angeles Dodgers from 2016 to 2019. Joe Torre's contract was allowed to expire and he was replaced by Joe Girardi in 2008, a season in which the Yankees missed the playoffs for the first time since 1993. The Yankees won their 27th World Series championship in 2009, defeating the defending champion Philadelphia Phillies in six games. Buster Olney, who covered the Yankees for The New York Times before joining ESPN, would write a book titled The Last Night of the Yankee Dynasty, a play-by-play account of Game 7 along with stories about key players, executives, and moments from the 1996–2001 dynasty. In a 2005 reprinting, Olney included a new epilogue covering the aftermath of the 2001 World Series.
The 1903 World Series was the first modern World Series played in Major League Baseball. It matched the American League (AL) champion Boston Americans against the National League (NL) champion Pittsburgh Pirates in a best-of-nine series, with Boston prevailing five games to three, winning the last four. The first three games were played in Boston, the next four in Allegheny (home of the Pirates), and the eighth (last) game in Boston. Pittsburgh pitcher Sam Leever injured his shoulder while trap shooting, so his teammate Deacon Phillippe pitched five complete games. Phillippe won three of his games, but it was not enough to overcome the club from the new American League. Boston pitchers Bill Dinneen and Cy Young led Boston to victory. In Game 1, Phillippe struck out ten Boston batters. The next day, Dinneen bettered that mark, striking out 11 Pittsburgh batters in Game 2. Honus Wagner, bothered by injuries, batted only 6-for-27 (.222) in the Series and committed six errors. The shortstop was deeply distraught by his performance. The following spring, Wagner (who in 1903 led the league in batting average) refused to send his portrait to a "Hall of Fame" for batting champions. "I was too bum last year", he wrote. "I was a joke in that Boston-Pittsburgh Series. What does it profit a man to hammer along and make a few hits when they are not needed only to fall down when it comes to a pinch? I would be ashamed to have my picture up now." Due to overflow crowds at the Exposition Park games in Allegheny City, if a batted ball rolled under a rope in the outfield that held spectators back, a "ground-rule triple" was scored; seventeen ground-rule triples were hit in the four games played there. In the Series, Boston came back from a three-games-to-one deficit, winning the final four games to capture the title. Such a large comeback would not happen again until the Pirates came back to defeat the Washington Senators in the 1925 World Series, and it has happened only 11 times in baseball history.
(The Pirates themselves repeated this feat against the Baltimore Orioles.) Much was made of the influence of Boston's "Royal Rooters", who traveled to Exposition Park and sang their theme song "Tessie" to distract the opposing players (especially Wagner). Boston wound up winning three out of four games in Allegheny City. Pirates owner Barney Dreyfuss added his share of the gate receipts to the players' share, so the losing team's players actually finished with a larger individual share than the winning team's. The Series brought the new American League prestige and proved its best could beat the best of the National League, thus strengthening the demand for future World Series competitions. Background A new league In 1901, Ban Johnson, president of the Western League, a minor league organization, formed the American League to take advantage of the National League's 1900 contraction from twelve teams to eight. Johnson and fellow owners raided the National League and signed away many star players, including Cy Young and Jimmy Collins. Johnson had a list of 46 National Leaguers he targeted for the American League; by 1902, all but one had made the jump. The constant raiding, however, nixed the idea of a championship between the two leagues. Pirates owner Barney Dreyfuss, whose team ran away with the 1902 National League pennant, was open to a postseason contest and even said he would allow the American League champion to stock its roster with all-stars. However, Johnson had spoken of putting a team in Pittsburgh and even attempted to raid the Pirates' roster in August 1902, which soured Dreyfuss. At the end of the season, however, the Pirates played a group of American League All-Stars in a four-game exhibition series, winning two games to one, with one tie. The leagues finally called a truce in the winter of 1902–03 and formed the National Commission to preside over organized baseball.
The following season, the Boston Americans and Pittsburgh Pirates had secured their respective championship pennants by September. That August, Dreyfuss challenged the American League to an 11-game championship series. Encouraged by Johnson and National League President Harry Pulliam, Americans owner Henry J. Killilea met with Dreyfuss in Pittsburgh in September and instead agreed to a best-of-nine championship, with the first three games played in Boston, the next four in Allegheny City, and the remaining two (if necessary) in Boston. One significant point about this agreement was that it was an arrangement primarily between the two clubs rather than a formal arrangement between the leagues. In short, it was a voluntary event, a fact which would later result in a year with no Series at all before the Series was formally established as a compulsory event. The teams The Pirates won their third straight pennant in 1903 thanks to a powerful lineup that included legendary shortstop Honus Wagner, who hit .355 and drove in 101 runs, player-manager Fred Clarke, who hit .351, and Ginger Beaumont, who hit .341 and led the league in hits and runs. The Pirates' pitching was weaker than it had been in previous years but boasted 24-game winner Deacon Phillippe and 25-game winner Sam Leever. The Americans had a strong pitching staff, led by Cy Young, who went 28–9 in 1903 and became the all-time wins leader that year. Bill Dinneen and Long Tom Hughes, right-handers like Young, had won 21 and 20 games, respectively.
The Boston outfield, featuring Chick Stahl (.274), Buck Freeman (.287, 104 RBI) and Patsy Dougherty (.331, 101 runs scored), was considered excellent. Although the Pirates had dominated their league for the previous three years, they went into the series riddled with injuries and plagued by bizarre misfortunes. Otto Krueger, the team's only utility player, was beaned on September 19 and was never fully fit for the series. 16-game winner Ed Doheny left the team three days later, exhibiting signs of paranoia; he was committed to an insane asylum the following month. Leever had been battling an injury to his pitching arm (which he made worse by entering a trapshooting competition). Worst of all, Wagner, who had a sore thumb throughout the season, injured his right leg in September and was never 100 percent for the postseason. Some sources say Boston was a heavy underdog. Boston bookies actually gave even odds to the teams (and only because Dreyfuss and other "sports" were alleged to have bet on Pittsburgh to bring down the odds). The teams were generally thought to be evenly matched, with the Americans credited with stronger pitching and the Pirates with superior offense and fielding. The outcome, many believed, hinged on Wagner's health. "If Wagner does not play, bet your money at two to one on Boston", said the Sporting News, "but if he does play, place your money at two to one on Pittsburg." Summary Matchups Game 1 The Pirates started Game 1 strong, scoring six runs in the first four innings, and held on to win the first World Series game in baseball history. They extended their lead to 7–0 on an inside-the-park home run by Jimmy Sebring in the seventh, the first home run in World Series history. Boston tried to mount a comeback in the last three innings, but it was too little, too late; they ended up losing, 7–3.
Both Phillippe and Young threw complete games, with Phillippe striking out ten and Young fanning five, but Young also gave up twice as many hits and allowed three earned runs to Phillippe's two. Game 2 After starting out strong in Game 1, the Pirates simply shut down offensively, eking out a mere three hits, all singles. Pittsburgh starter Sam Leever went just one inning and gave up three hits and two runs before his ailing arm forced him to leave in favor of Bucky Veil, who finished the game. Bill Dinneen struck out 11 and pitched a complete game for the Americans, while Patsy Dougherty hit home runs in the first and sixth innings for two of Boston's three runs. Dougherty led off the Boston scoring with an inside-the-park home run; no lead-off batter would match that in a World Series until Alcides Escobar of the Kansas City Royals duplicated the feat in the 2015 World Series, 112 years later. Dougherty's second home run was the first in World Series history to actually sail over the fence, an incredibly rare feat at the time. Game 3 Phillippe, pitching after only a single day of rest, started Game 3 for the Pirates and didn't let them down, hurling his second complete-game victory of the Series to put Pittsburgh up two games to one. Game 4 After two days of rest, Phillippe was ready to pitch a second straight game. He threw his third complete-game victory of the series against Bill Dinneen, who was making his second start of the series. But Phillippe's second straight win was almost not to be, as the Americans, down 5–1 in the top of the ninth, rallied to narrow the deficit to one run. The comeback attempt failed, as Phillippe managed to put an end to it and give the Pirates a commanding 3–1 series lead. Game 5 Game 5 was a pitcher's duel for the first five innings, with Boston's Cy Young and Pittsburgh's Brickyard Kennedy giving up no runs.
That changed in the top of the sixth, however, when the Americans scored a then-record six runs before being retired. Young, on the other hand, managed to keep his shutout intact before finally giving up a pair of runs in the bottom of the eighth. He went the distance and struck out four for his first World Series win. Game 6 Game 6 was a rematch between the starters of Game 2, Boston's Dinneen and Pittsburgh's Leever. Leever pitched a complete game this time but so did Dinneen, who outmatched him to earn his second complete-game victory of the series. After losing three of the first four games of the World Series, the underdog Americans had tied the series at three games apiece. Game 7 The fourth
the target cell. The virus makes initial contact with the cell with VP2, triggering receptor-mediated endocytosis of the virus. The low pH within the endosome then triggers BTV's membrane penetration protein VP5 to undergo a conformational change that disrupts the endosomal membrane. Uncoating yields a transcriptionally active 470S core particle which is composed of the two major proteins, VP7 and VP3, and the three minor proteins, VP1, VP4 and VP6, in addition to the dsRNA genome. There is no evidence that any trace of the outer capsid remains associated with these cores, as has been described for reovirus. The cores may be further uncoated to form 390S subcore particles that lack VP7, also in contrast to reovirus. Subviral particles are probably akin to cores derived in vitro from virions by physical or proteolytic treatments that remove the outer capsid and cause activation of the BTV transcriptase. In addition to the seven structural proteins, three non-structural (NS) proteins, NS1, NS2, NS3 (and a related NS3A), are synthesised in BTV-infected cells. Of these, NS3/NS3A is involved in the egress of the progeny virus. The two remaining non-structural proteins, NS1 and NS2, are produced at high levels in the cytoplasm and are believed to be involved in virus replication, assembly and morphogenesis. Epidemiology Bluetongue has been observed in Australia, the US, Africa, the Middle East, Asia, and Europe. An outline of the transmission cycle of BTV is illustrated in the article Parasitic flies of domestic animals. Its occurrence is seasonal in the affected Mediterranean countries, subsiding when temperatures drop and hard frosts kill the adult midge vectors. Both viral survival and vector longevity are extended during milder winters. A significant contribution to the northward spread of bluetongue disease has been the ability of C. obsoletus and C. pulicaris to acquire and transmit the pathogen, both of which are spread widely throughout Europe.
This is in contrast to the original C. imicola vector, which is limited to North Africa and the Mediterranean. These relatively recent novel vectors have facilitated a far more rapid spread than the simple expansion of habitats north through global warming. In August 2006, cases of bluetongue were found in the Netherlands, then Belgium, Germany, and Luxembourg. In 2007, the first case of bluetongue in the Czech Republic was detected in one bull near Cheb at the Czech-German border. In September 2007, the UK reported its first ever suspected case of the disease, in a Highland cow on a rare-breeds farm near Ipswich, Suffolk. Since then, the virus has spread from cattle to sheep in Britain. By October 2007, bluetongue had become a serious threat in Scandinavia and Switzerland, and the first outbreak in Denmark was reported. In autumn 2008, several cases were reported in the southern Swedish provinces of Småland, Halland, and Skåne, as well as in areas of the Netherlands bordering Germany, prompting veterinary authorities in Germany to intensify controls. Norway had its first finding in February 2009, when cows at two farms in Vest-Agder in the south of Norway showed an immune response to bluetongue. Norway was declared free of the disease in 2011. Although the disease is not a threat to humans, the most vulnerable common domestic ruminants in the UK are cattle, goats, and, especially, sheep. Overwintering A puzzling aspect of BTV is its survival between midge seasons in temperate regions. Adults of Culicoides are killed by cold winter temperatures, and BTV infections typically do not last for more than 60 days, which is not long enough for BTV to last until the next spring. It is believed that the virus somehow survives in overwintering midges or animals. Multiple mechanisms have been proposed. A few adult Culicoides midges infected with BTV may survive the mild winters of the temperate zone.
Some midges may even move indoors to avoid the cold temperature of the winter. Additionally, BTV could cause a chronic or latent infection in some animals, providing another means for BTV to survive the winter. BTV can also be transmitted from mother to fetus. The outcome is abortion or stillbirth if fetal infection occurs early in gestation and survival if infection occurs late.
However, infection at an intermediate stage, before the fetal immune system is fully developed, may result in a chronic infection that lingers until the first months after birth of the lamb. Midges then spread the pathogen from the young animals to others, starting a new season of infection. Treatment and prevention Prevention is effected via quarantine, inoculation with live modified virus vaccine, and control of the midge vector, including inspection of aircraft. Livestock management and insect control Vaccines Protection by live attenuated vaccines (LAVs) is serotype specific. Multiserotype LAV cocktails can induce neutralizing antibodies against unincluded serotypes, and subsequent
Perens left OSI in 1999, a year after co-founding it. In February 1999, in an email to the Debian developers mailing list, he explained his decision and stated that, though "most hackers know that Free Software and Open Source are just two words for the same thing", the success of "open source" as a marketing term had "de-emphasized the importance of the freedoms involved in Free Software"; he added, "It's time for us to fix that." He stated his regret that OSI co-founder Eric Raymond "seems to be losing his free software focus." In the 2000s, however, he began speaking in support of open source again. Perens presently volunteers as the Open Source Initiative's representative to the European Telecommunications Standards Institute (ETSI), and is a frequent participant in review of license texts submitted to OSI for certification as Open Source licenses. Linux Capital Group In 1999, Perens left Pixar and became the president of Linux Capital Group, a business incubator and venture capital firm focusing on Linux-based businesses. Their major investment was in Progeny Linux Systems, a company headed by Debian founder Ian Murdock. In 2000, as a result of the economic downturn, Perens shut down Linux Capital Group. (Progeny Linux Systems would end operations in 2007.) Hewlett-Packard From December 2000 to September 2002, Perens served as "Senior Global Strategist for Linux and Open Source" at Hewlett-Packard, internally evangelizing for the use of Linux and other open-source software. He was fired as a result of his anti-Microsoft statements, which especially became an issue after HP acquired Compaq, a major manufacturer of Microsoft Windows-based PCs, in 2002. Linux Standard Base In 1998, Perens founded the Linux Standard Base project and became its first project leader; it is a joint project by several Linux distributions, now under the organizational structure of the Linux Foundation, to standardize the Linux software system structure.
UserLinux In 2003 Perens created UserLinux, a Debian-based distribution whose stated goal was, "Provide businesses with freely available, high quality Linux operating systems accompanied by certifications, service, and support options designed to encourage productivity and security while reducing overall costs." UserLinux was eventually overtaken in popularity by Ubuntu, another Debian-based distribution, which was started in 2004, and UserLinux became unmaintained in 2006. SourceLabs Perens was an employee of SourceLabs, a Seattle-based open source software and services company, from June 2005 until December 2007. He produced a video commercial, Impending Security Breach, for SourceLabs in 2007. (SourceLabs was acquired by EMC in 2009.) University faculty Between 1981 and 1986, Perens was on the staff of the New York Institute of Technology Computer Graphics Lab as a Unix kernel programmer. In 2002, Perens was a remote Senior Scientist for Open Source with the Cyber Security Policy Laboratory of George Washington University under the direction of Tony Stanco. Stanco was director of the laboratory for a year, while its regular director was on sabbatical. Between 2006 and 2007, Perens was a visiting lecturer and researcher for the University of Agder under a three-year grant from the Competence Fund of Southern Norway. During this time he consulted the Norwegian Government and other entities on government policy issues related to computers and software. After this time Perens worked remotely on Agder programs, mainly concerning the European Internet Accessibility Observatory. 
Other activities In 2007, some of Perens's government advisory roles included a meeting with the President of the Chamber of Deputies (the lower house of parliament) in Italy and testimony to the Culture Committee of the Chamber of Deputies; a keynote speech at the foundation of Norway's Open Source Center, following Norway's Minister of Governmental Reform (Perens is on the advisory board of the center); he provided input on the revision of the European Interoperability Framework; and he was keynote speaker at a European Commission conference on Digital Business Ecosystems at the Centre Borschette, Brussels, on November 7. In 2009, Perens acted as an expert witness on open source in the Jacobsen v. Katzer U.S. federal lawsuit. His report, which was made publicly available by Jacobsen, presented the culture and impact of open-source software development to the federal courts. Perens delivered one of the keynote addresses at the 2012 linux.conf.au conference in Ballarat, Australia. He discussed the need for open source software to market itself better to non-technical users. He also discussed some of the latest developments in open-source hardware, such as Papilio and Bus Pirate. In 2013, Perens spoke in South America, as the closing keynote at Latinoware 2013. He was the keynote speaker at CISL – Conferencia Internacional de Software Libre, in Buenos Aires, Argentina, and keynoted a special event along with the Minister of Software and Innovation of Chubut Province, in Puerto Madryn, Patagonia, Argentina. He keynoted the Festival de Software Libre 2013, in Puerto Vallarta, Mexico. In 2014–2015, Perens took a break from Open Source conferences, having spoken at them often since 1996. In 2016, he returned to the conference circuit, keynoting the Open Source Insight conference in Seoul, sponsored by the Copyright Commission of South Korea. Perens's web site presently advertises his availability to keynote conferences as long as travel and lodging expenses are compensated.
In 2020, Perens delivered the talk "What Comes After Open Source?" for DebConf 2020. He discussed the future of open source licensing and the need to develop alternative licensing structures so that open source developers could get paid for their work. Views Perens poses "Open Source" as a means of marketing the free and open-source software idea to business people and the mainstream, who might be more interested in the practical benefits of an open source development model and ecosystem than abstract ethics. He states that open source and free software are only two ways of talking about the same phenomenon, a point of view not shared by Stallman and his free software movement. Perens postulated in 2004 an economic theory. A childhood disability led his school to fail to teach him to read. He developed an interest in technology at an early age: besides his interest in amateur radio, he ran a pirate radio station in the town of Lido Beach and briefly engaged in phone phreaking. Career Computer graphics Perens worked for seven years at the New York Institute of Technology Computer Graphics Lab. After that, he worked at Pixar for 12 years, from 1987 to 1999. He is credited as a studio tools engineer on the Pixar films A Bug's Life (1998) and Toy Story 2 (1999). No-Code International Perens founded No-Code International in 1998 with the goal of ending the Morse Code test then required for an amateur radio license. His rationale was that amateur radio should be a tool for young people to learn advanced technology and networking, rather than something that preserved antiquity and required new hams to master outmoded technology before they were allowed on the air. Perens lobbied intensively on the Internet, at amateur radio events in the United States, and during visits to other nations. One of his visits was to Iceland, where he had half of that nation's radio amateurs in the room, and their vote in the International Amateur Radio Union was equivalent to that of the entire United States.
BusyBox In 1995, Perens created BusyBox, a package of Unix-style utilities for operating systems including Linux-based ones and FreeBSD. He stopped working on it in 1996, after which it was taken over by other developers. Starting in 2007, several lawsuits were filed for infringement of BusyBox copyright and licensing. These lawsuits were filed by the Software Freedom Law Center (SFLC) and some of the later managing developers of BusyBox. In 2009, Perens released a statement about the lawsuits and those filing them. In it, he claims that he maintains a significant or even majority ownership of the software in the litigation, but was not contacted nor represented by the plaintiffs; and that some of the plaintiffs had themselves modified BusyBox and its distribution package in such a way as to violate applicable licensing terms and copyright owned by Perens and additional BusyBox developers. Perens supports enforcement of the GPL license used on BusyBox. Because he was denied participation in the BusyBox cases on the side of the plaintiffs, Perens started a consulting business to assist the defendants in coming into compliance with the GPL and arriving at an amicable settlement with the Software Freedom Law Center. Debian Project Leader From April 1996 to December 1997, while still working at Pixar, Perens served as Debian Project Leader, the person who coordinates development of the Debian open source operating system. He replaced Ian Murdock, the creator of Debian, who had been the first project leader. Software in the Public Interest In 1997, Perens was a co-founder of Software in the Public Interest (SPI), a nonprofit organization intended to serve as an umbrella organization to aid open-source software and hardware projects. It was originally created to allow the Debian Project to accept donations.
Debian Social Contract In 1997, Perens was carbon-copied on an email conversation between Donnie Barnes of Red Hat and Ean Schuessler, who was then working on Debian. Schuessler bemoaned that Red Hat had never stated its social contract with the developer community. Perens took this as inspiration to create a formal social contract for Debian. In a blog posting, Perens claims not to have made use of the Three Freedoms (later the Four Freedoms) published by the Free Software Foundation in composing his document. Perens proposed a draft of the Debian Social Contract to the Debian developers on the debian-private mailing list early in June 1997. Debian developers contributed discussion and changes for the rest of the month while Perens edited, and the completed document was then announced as Debian project policy. Part of the Debian Social Contract was the Debian Free Software Guidelines, a set of 10 guidelines for determining whether a set of software can be described as "free software", and thus whether it could be included in Debian. Open Source Definition and The Open Source Initiative On February 3, 1998, a group of people (not including Perens) met at VA Linux Systems to discuss the promotion of Free Software to business in pragmatic terms, rather than the moral terms preferred by Richard Stallman. Christine Peterson of the nanotechnology organization Foresight Institute, who was present because Foresight took an early interest in Free Software, suggested the term "Open Source". The next day, Eric S. Raymond recruited Perens to work with him on the formation of Open Source. Perens modified the Debian Free Software Guidelines into the Open Source Definition by removing Debian references and replacing them with "Open Source". The original announcement of The Open Source Definition was made on February 9, 1998, on Slashdot and elsewhere; the definition was given in Linux Gazette on February 10, 1998.
Concurrently, Perens and Raymond established the Open Source Initiative, an organization intended to promote open source software.
inherent. Arguments for The difficulty in conceiving of or describing an object without also conceiving of or describing its properties is a common justification for bundle theory, especially among current philosophers in the Anglo-American tradition. The inability to comprehend any aspect of the thing other than its properties implies, this argument maintains, that one cannot conceive of a bare particular (a substance without properties), an implication that directly opposes substance theory. The conceptual difficulty of bare particulars was illustrated by John Locke when he described a substance by itself, apart from its properties, as "something, I know not what. [...] The idea then we have, to which we give the general name substance, being nothing but the supposed, but unknown, support of those qualities we find existing, which we imagine cannot subsist sine re substante, without something to support them, we call that support substantia; which, according to the true import of the word, is, in plain English, standing under or upholding." Whether a relation of an object is one of its properties may complicate such an argument. However, the argument concludes that the conceptual challenge of bare particulars leaves a bundle of properties and nothing more as the only possible conception of an object, thus justifying bundle theory. Objections Bundle theory maintains that properties are bundled together in a collection without describing how they are tied together. For example, bundle theory regards an apple as red, four inches (100 mm) wide, and juicy but lacking an underlying substance. The apple is said to be a bundle of properties including redness, being four inches (100 mm) wide, and juiciness. David Hume used the term "bundle" in this sense, referring also to personal identity, in his main work, A Treatise of Human Nature: "I may venture to affirm of the rest of mankind, that they are nothing but a bundle or collection of different perceptions, which succeed each other with inconceivable rapidity, and are in a perpetual flux and movement". Critics question how bundle theory accounts for the properties' compresence (the togetherness relation between those properties) without an underlying substance. Critics also question how any two given properties are determined to be properties of the same object if there is no substance in which they both inhere. Traditional bundle theory explains the compresence of properties by defining an object as a collection of properties bound together. Thus, different combinations of properties and relations produce different objects. Redness and juiciness, for example, may be found together on top of the table because they are part of a bundle of properties located on the table, one of which is the "looks like an apple" property. By contrast, substance theory explains the compresence of properties by asserting that the properties are found together because it is the substance that has those properties. In substance theory, a substance is the thing in which properties inhere. For example, redness
to turn the left flank of the Eighth Army at the Battle of Alam el Halfa from 31 August 1942. The German/Italian armoured corps infantry attack was stopped in very heavy fighting. Rommel's forces had to withdraw urgently lest their retreat through the British minefields be cut off. Montgomery was criticised for not counter-attacking the retreating forces immediately, but he felt strongly that his methodical build-up of British forces was not yet ready. A hasty counter-attack risked ruining his strategy for an offensive on his own terms in late October, planning for which had begun soon after he took command. He was confirmed in the permanent rank of lieutenant-general in mid-October. The conquest of Libya was essential for airfields to support Malta and to threaten the rear of Axis forces opposing Operation Torch. Montgomery prepared meticulously for the new offensive after convincing Churchill that the time was not being wasted. (Churchill sent a telegram to Alexander on 23 September 1942 which began, "We are in your hands and of course a victorious battle makes amends for much delay.") He was determined not to fight until he thought there had been sufficient preparation for a decisive victory, and put into action his beliefs with the gathering of resources, detailed planning, the training of troops—especially in clearing minefields and fighting at night—and in the use of 252 of the latest American-built Sherman tanks, 90 M7 Priest self-propelled howitzers, and making a personal visit to every unit involved in the offensive. By the time the offensive was ready in late October, Eighth Army had 231,000 men on its ration strength. El Alamein The Second Battle of El Alamein began on 23 October 1942, and ended 12 days later with one of the first large-scale, decisive Allied land victories of the war. Montgomery correctly predicted both the length of the battle and the number of casualties (13,500). 
Historian Correlli Barnett has pointed out that the rain also fell on the Germans, and that the weather is therefore an inadequate explanation for the failure to exploit the breakthrough, but nevertheless the Battle of El Alamein had been a great success. Over 30,000 prisoners of war were taken, including the German second-in-command, General von Thoma, as well as eight other general officers. Rommel, having been in a hospital in Germany at the start of the battle, was forced to return on 25 October 1942 after Stumme—his replacement as German commander—died of a heart attack in the early hours of the battle. Tunisia Montgomery was advanced to KCB and promoted to full general. He kept the initiative, applying superior strength when it suited him, forcing Rommel out of each successive defensive position. On 6 March 1943, Rommel's attack on the over-extended Eighth Army at Medenine (Operation Capri) with the largest concentration of German armour in North Africa was successfully repulsed. At the Mareth Line, 20 to 27 March, when Montgomery encountered fiercer frontal opposition than he had anticipated, he switched his major effort into an outflanking inland pincer, backed by low-flying RAF fighter-bomber support. For his role in North Africa he was awarded the Legion of Merit by the United States government in the rank of Chief Commander. Sicily The next major Allied attack was the Allied invasion of Sicily (Operation Husky). Montgomery considered the initial plans for the Allied invasion, which had been agreed in principle by General Dwight D. Eisenhower, the Supreme Allied Commander Allied Forces Headquarters, and General Alexander, the 15th Army Group commander, to be unworkable because of the dispersion of effort. 
He managed to have the plans recast to concentrate the Allied forces, having Lieutenant General George Patton's US Seventh Army land in the Gulf of Gela (on the Eighth Army's left flank, which landed around Syracuse in the south-east of Sicily) rather than near Palermo in the west and north of Sicily. Inter-Allied tensions grew as the American commanders, Patton and Omar Bradley (then commanding US II Corps under Patton), took umbrage at what they saw as Montgomery's attitudes and boastfulness. However, while they were considered three of the greatest soldiers of their time, due to their competitiveness they were renowned for "squabbling like three schoolgirls" thanks to their "bitchiness", "whining to their superiors" and "showing off". Italian campaign During late 1943, Montgomery continued to command the Eighth Army during the landings on the mainland of Italy itself, beginning with Operation Baytown. In conjunction with the Anglo-American landings at Salerno (near Naples) by Lieutenant General Mark Clark's US Fifth Army and seaborne landings by British paratroops in the heel of Italy (including the key port of Taranto, where they disembarked without resistance directly into the port), Montgomery led the Eighth Army up the toe of Italy. Montgomery abhorred what he considered to be a lack of coordination, a dispersion of effort, a strategic muddle and a lack of opportunism in the Allied effort in Italy, and he said that he was glad to leave the "dog's breakfast" on 23 December 1943. Normandy Montgomery returned to Britain in January 1944. He was assigned to command the 21st Army Group consisting of all Allied ground forces participating in Operation Overlord, codename for the Allied invasion of Normandy. Overall direction was assigned to the Supreme Allied Commander of the Allied Expeditionary Forces, American General Dwight D. Eisenhower. 
Both Churchill and Eisenhower had found Montgomery difficult to work with in the past and wanted the position to go to the more affable General Sir Harold Alexander. However Montgomery's patron, General Sir Alan Brooke, firmly argued that Montgomery was a much superior general to Alexander and ensured his appointment. Without Brooke's support, Montgomery would have remained in Italy. At St Paul's School on 7 April and 15 May Montgomery presented his strategy for the invasion. He envisaged a ninety-day battle, with all forces reaching the Seine. The campaign would pivot on an Allied-held Caen in the east of the Normandy bridgehead, with relatively static British and Canadian armies forming a shoulder to attract and defeat German counter-attacks, relieving the US armies who would move and seize the Cotentin Peninsula and Brittany, wheeling south and then east on the right forming a pincer. During the ten weeks of the Battle of Normandy, unfavourable autumnal weather conditions disrupted the Normandy landing areas. Montgomery's initial plan was for the Anglo-Canadian troops under his command to break out immediately from their beachheads on the Calvados coast towards Caen with the aim of taking the city on either D Day or two days later. Montgomery attempted to take Caen with the 3rd Infantry Division, 50th (Northumbrian) Infantry Division and the 3rd Canadian Division but was stopped from 6–8 June by 21st Panzer Division and 12th SS Panzer Division Hitlerjugend, who hit the advancing Anglo-Canadian troops very hard. Rommel followed up this success by ordering the 2nd Panzer Division to Caen while Field Marshal Gerd von Rundstedt asked for and received permission from Hitler to have the elite 1st Waffen SS Division Leibstandarte Adolf Hitler and 2nd Waffen SS Division Das Reich sent to Caen as well. Montgomery thus had to face what Stephen Badsey called the "most formidable" of all the German divisions in France. 
The 12th Waffen SS Division Hitlerjugend, as its name implies, was drawn entirely from the more fanatical elements of the Hitler Youth and commanded by the ruthless SS-Brigadeführer Kurt Meyer, aka "Panzer Meyer". The failure to take Caen immediately has been the source of an immense historiographical dispute with bitter nationalist overtones. Broadly, there has been a "British school" which accepts Montgomery's post-war claim that he never intended to take Caen at once, and instead the Anglo-Canadian operations around Caen were a "holding operation" intended to attract the bulk of the German forces towards the Caen sector to allow the Americans to stage the "break out operation" on the left flank of the German positions, which was all part of Montgomery's "Master Plan" that he had conceived long before the Normandy campaign. By contrast, the "American school" argued that Montgomery's initial "master plan" was for the 21st Army Group to take Caen at once and move his tank divisions into the plains south of Caen, to then stage a breakout that would lead the 21st Army Group into the plains of northern France and hence into Antwerp and finally the Ruhr. Letters written by Eisenhower at the time of the battle make it clear that Eisenhower was expecting from Montgomery "the early capture of the important focal point of Caen". Later, when this plan had clearly failed, Eisenhower wrote that Montgomery had "evolved" the plan to have the US forces achieve the break-out instead. As the campaign progressed, Montgomery altered his initial plan for the invasion and continued the strategy of attracting and holding German counter-attacks in the area north of Caen rather than to the south, to allow the US First Army in the west to take Cherbourg. 
A memo summarising Montgomery's operations written by Eisenhower's chief of staff, General Walter Bedell Smith, who met with Montgomery in late June 1944, says nothing about Montgomery conducting a "holding operation" in the Caen sector, and instead speaks of him seeking a "breakout" into the plains south of the Seine. On 12 June, Montgomery ordered the 7th Armoured Division into an attack against the Panzer Lehr Division that made good progress at first but ended when the Panzer Lehr was joined by the 2nd Panzer Division. At Villers-Bocage on 14 June, the British lost twenty Cromwell tanks to five Tiger tanks led by SS Obersturmführer Michael Wittmann, in about five minutes. Despite the setback at Villers-Bocage, Montgomery was still optimistic as the Allies were landing more troops and supplies than they were losing in battle, and though the German lines were holding, the Wehrmacht and Waffen SS were suffering considerable attrition. Air Marshal Sir Arthur Tedder complained that it was impossible to move fighter squadrons to France until Montgomery had captured some airfields, something he asserted that Montgomery appeared incapable of doing. The first V-1 flying bomb attacks on London, which started on 13 June, further increased the pressure on Montgomery from Whitehall to speed up his advance. On 18 June, Montgomery ordered Bradley to take Cherbourg while the British were to take Caen by 23 June. In Operation Epsom, the British VIII Corps commanded by Sir Richard O'Connor attempted to outflank Caen from the west by breaking through the dividing line between the Panzer Lehr and the 12th SS to take the strategic Hill 112. Epsom began well with O'Connor's assault force (the British 15th Scottish Division) breaking through and with the 11th Armoured Division stopping the counter-attacks of the 12th SS Division. General Friedrich Dollmann of the 7th Army had to commit the newly arrived II SS Corps to stop the British offensive.
Dollmann, fearing that Epsom would be a success, committed suicide and was replaced by SS Oberstgruppenführer Paul Hausser. O'Connor, at the cost of about 4,000 men, had won a deep salient and left the Germans in an unviable long-term position. There was nonetheless a strong sense of crisis in the Allied command, as the Allies had advanced only a short distance inland, at a time when their plans called for them to have already taken Rennes, Alençon and St. Malo. After Epsom, Montgomery had to tell General Harry Crerar that the activation of the First Canadian Army would have to wait, as there was only room at present in the Caen sector for the newly arrived XII Corps under Lieutenant-General Neil Ritchie, which caused some tension with Crerar, who was anxious to get into the field. Epsom had forced further German forces into Caen, but all through June and the first half of July Rommel, Rundstedt, and Hitler were engaged in planning for a great offensive to drive the British into the sea; it was never launched and would have required the commitment of a large number of German forces to the Caen sector. It was only after several failed attempts to break out in the Caen sector that Montgomery devised what he later called his "master plan" of having the 21st Army Group hold the bulk of the German forces, thus allowing the Americans to break out. The Canadian historians Terry Copp and Robert Vogel wrote about this dispute between the "American school" and "British school". After suffering several setbacks in June 1944, and hampered by stormy weather and the bocage terrain, Montgomery had to ensure that Rommel focused on the British in the east rather than the Americans in the west, who had to take the Cotentin Peninsula and Brittany before the Germans could be trapped by a general swing east.
Montgomery told General Sir Miles Dempsey, the commander of the 2nd British Army: "Go on hitting, drawing the German strength, especially some of the armour, onto yourself – so as to ease the way for Brad [Bradley]." The Germans had deployed 12 divisions, of which six were Panzer divisions, against the British, while deploying eight divisions, of which three were Panzer divisions, against the Americans. By the middle of July Caen had not been taken, as Rommel continued to give priority to preventing a British break-out rather than to holding the western territories against the Americans. This was broadly as Montgomery had planned, albeit not with the same speed as he outlined at St Paul's, although as the American historian Carlo D'Este pointed out, the actual situation in Normandy was "vastly different" from what was envisioned at the St Paul's conference, as only one of four goals outlined in May had been achieved by 10 July. On 7 July, Montgomery began Operation Charnwood with a carpet-bombing offensive that turned much of the French countryside and the city of Caen into a wasteland. The British and Canadians succeeded in advancing into northern Caen before the Germans, who used the ruins to their advantage, stopped the offensive. On 10 July, Montgomery ordered Bradley to take Avranches, after which the 3rd US Army would be activated to drive towards Le Mans and Alençon. On 14 July 1944, Montgomery wrote to his patron Brooke, saying he had decided on a "real show down on the eastern flanks, and to loose a Corps of three armoured divisions in the open country about the Caen-Falaise road ... The possibilities are immense; with seven hundred tanks loosed to the South-east of Caen, and the armoured cars operating far ahead, anything can happen."
The French Resistance had launched Plan Violet in June 1944 to systematically destroy the telephone system of France, which forced the Germans to use their radios more and more to communicate, and as the code-breakers of Bletchley Park had broken many of the German codes, Montgomery had—via Ultra intelligence—a good idea of the German situation. Montgomery thus knew German Army Group B had lost 96,400 men while receiving 5,200 replacements, and that the Panzer Lehr Division, now based at St. Lô, was down to only 40 tanks. Montgomery later wrote that he knew he had the Normandy campaign won at this point, as the Germans had almost no reserves while he had three armoured divisions in reserve. An American break-out was achieved with Operation Cobra and the encirclement of German forces in the Falaise pocket, at the cost of British losses with the diversionary Operation Goodwood. On the early morning of 18 July 1944, Operation Goodwood began with British heavy bombers beginning carpet-bombing attacks that further devastated what was left of Caen and the surrounding countryside. A British tank crewman from the Guards Armoured Division later recalled: "At 0500 hours a distant thunder in the air brought all the sleepy-eyed tank crews out of their blankets. 1,000 Lancasters were flying from the sea in groups of three or four. Ahead of them the pathfinders were scattering their flares and before long the first bombs were dropping". A German tankman from the 21st Panzer Division at the receiving end of this bombardment remembered: "We saw little dots detach themselves from the planes, so many of them that the crazy thought occurred to us: are those leaflets? ... Among the thunder of the explosions, we could hear the wounded scream and the insane howling of men who had [been] driven mad". The British bombing had badly smashed the German front-line units; tanks were even thrown up on the roofs of French farmhouses.
Initially, the three British armoured divisions assigned to lead the offensive, the 7th, 11th and the Guards, made rapid progress and by noon were approaching the Bourguébus Ridge, which dominated the landscape south of Caen. If the British could take the Bourguébus Ridge, the way to the plains of northern France would be wide open, and potentially Paris could be taken, which explains the ferocity with which the Germans defended the ridge. One German officer, Lieutenant Baron von Rosen, recalled that to motivate a Luftwaffe officer commanding a battery of four 88 mm guns to fight against the British tanks, he had to hold his handgun to the officer's head "and asked him whether he would like to be killed immediately or get a high decoration. He decided for the latter". The well dug-in 88 mm guns around the Bourguébus Ridge began taking a toll on the British Sherman tanks, and the countryside was soon dotted with dozens of burning Shermans. One British officer reported with worry: "I see palls of smoke and tanks brewing up with flames belching forth from their turrets. I see men climbing out, on fire like torches, rolling on the ground to try and douse the flames". Despite Montgomery's orders to try to press on, fierce German counter-attacks stopped the British offensive. The objectives of Operation Goodwood were all achieved except the complete capture of the Bourguébus Ridge, which was only partially taken. The operation was a strategic Allied success in drawing the last German reserves in Normandy towards the Caen sector and away from the American sector, greatly assisting the American breakout in Operation Cobra. By the end of Goodwood on 25 July 1944, the Canadians had finally taken Caen while the British tanks had reached the plains south of Caen, giving Montgomery the "hinge" he had been seeking, while forcing the Germans to commit the last of their reserves to stop the Anglo-Canadian offensive.
Ultra decrypts indicated that the Germans now facing Bradley were seriously understrength, with Operation Cobra about to commence. During Operation Goodwood, the British had 400 tanks knocked out, though many were recovered and returned to service. The casualties were 5,500 for a modest gain of ground. Bradley recognised Montgomery's plan to pin down German armour and allow US forces to break out. The long-running dispute over what Montgomery's "master plan" in Normandy actually was has led historians to differ greatly about the purpose of Goodwood. The British journalist Mark Urban wrote that the purpose of Goodwood was to draw German troops to their left flank to allow the Americans to break out on the right flank, arguing that Montgomery had to lie to his soldiers about the purpose of Goodwood, as the average British soldier would not have understood why they were being asked to create a diversion to allow the Americans to have the glory of staging the breakout with Operation Cobra. By contrast, the American historian Stephen Power argued that Goodwood was intended to be the "breakout" offensive and not a "holding operation", writing: "It is unrealistic to assert that an operation which called for the use of 4,500 Allied aircraft, 700 artillery pieces and over 8,000 armored vehicles and trucks and that cost the British over 5,500 casualties was conceived and executed for so limited an objective". Power noted that Goodwood and Cobra were supposed to take effect on the same day, 18 July 1944, but Cobra was cancelled owing to heavy rain in the American sector, and argued that both operations were meant to be breakout operations to trap the German armies in Normandy. American military writer Drew Middleton wrote that there is no doubt that Montgomery wanted Goodwood to provide a "shield" for Bradley, but at the same time Montgomery was clearly hoping for more than merely diverting German attention away from the American sector.
British historian John Keegan pointed out that Montgomery made differing statements before Goodwood about the purpose of the operation. Keegan wrote that Montgomery engaged in what he called a "hedging of his bets" when drafting his plans for Goodwood, with a plan for a "break out if the front collapsed, if not, sound documentary evidence that all he had intended in the first place was a battle of attrition". Again, Bradley confirmed Montgomery's plan and that the capture of Caen was only incidental to his mission, not critical; the American LIFE magazine quoted Bradley to this effect in 1951. With Goodwood drawing the Wehrmacht towards the British sector, the First American Army enjoyed a two-to-one numerical superiority. General Omar Bradley accepted Montgomery's advice to begin the offensive by concentrating at one point instead of on a "broad front" as Eisenhower would have preferred. Operation Goodwood almost cost Montgomery his job, as Eisenhower seriously considered sacking him and only chose not to do so because sacking the popular "Monty" would have caused such a political backlash in Britain against the Americans at a critical moment in the war that the resulting strains in the Atlantic alliance were not considered worth it. Montgomery expressed his satisfaction at the results of Goodwood when calling the operation off. Eisenhower was under the impression that Goodwood was to be a break-out operation; either there was a miscommunication between the two men or Eisenhower did not understand the strategy. Alan Brooke, chief of the British Imperial General Staff, wrote: "Ike knows nothing about strategy and is quite unsuited to the post of Supreme Commander. It is no wonder that Monty's real high ability is not always realised". Bradley fully understood Montgomery's intentions; neither man would give away to the press the true intentions of their strategy.
Many American officers had found Montgomery a difficult man to work with, and after Goodwood, pressured Eisenhower to fire Montgomery. Although the Eisenhower–Montgomery dispute is sometimes depicted in nationalist terms as an Anglo-American struggle, it was the British Air Marshal Arthur Tedder who was pressing Eisenhower most strongly after Goodwood to fire Montgomery. An American officer wrote in his diary that Tedder had come to see Eisenhower to "pursue his current favourite subject, the sacking of Monty". Tedder's leadership of the "sack Monty" campaign encouraged Montgomery's American enemies to press Eisenhower to fire Montgomery. Brooke was sufficiently worried about the "sack Monty" campaign to visit Montgomery at his Tactical Headquarters (TAC) in France and, as he wrote in his diary, "warned [Montgomery] of a tendency in the PM [Churchill] to listen to suggestions that Monty played for safety and was not prepared to take risks". Brooke advised Montgomery to invite Churchill to Normandy, arguing that if the "sack Monty" campaign had won the Prime Minister over, then his career would be over, as having Churchill's backing would give Eisenhower the political "cover" to fire Montgomery. On 20 July, Montgomery met Eisenhower and on 21 July, Churchill, at the TAC in France. One of Montgomery's staff officers wrote afterwards that it was "common knowledge at Tac that Churchill had come to sack Monty". No notes were taken at the Eisenhower–Montgomery and Churchill–Montgomery meetings, but Montgomery was able to persuade both men not to fire him. With the success of Cobra, which was soon followed by the unleashing of the 3rd American Army under General George S. Patton, Eisenhower wrote to Montgomery: "Am delighted that your basic plan has begun brilliantly to unfold with Bradley's initial success".
The success of Cobra was aided by Operation Spring, when the II Canadian Corps under General Guy Simonds (the only Canadian general whose skill Montgomery respected) began an offensive south of Caen that made little headway, but which the Germans regarded as the main offensive. Once the 3rd American Army arrived, Bradley was promoted to take command of the newly created 12th Army Group, consisting of the 1st and 3rd American Armies. After the American breakout came the Battle of the Falaise Gap, as the British, Canadian and Polish soldiers of the 21st Army Group commanded by Montgomery advanced south, while the American and French soldiers of Bradley's 12th Army Group advanced north to encircle the German Army Group B at Falaise, and Montgomery waged what Urban called "a huge battle of annihilation" in August 1944. Montgomery began his offensive into the Suisse Normande region with Operation Bluecoat, with Sir Richard O'Connor's VIII Corps and Gerard Bucknall's XXX Corps heading south. A dissatisfied Montgomery sacked Bucknall for being insufficiently aggressive and replaced him with General Brian Horrocks. At the same time, Montgomery ordered Patton—whose Third Army was supposed to advance into Brittany—to instead capture Nantes, which was soon taken. Hitler waited too long to order his soldiers to retreat from Normandy, leading Montgomery to write: "He [Hitler] refused to face the only sound military course. As a result the Allies caused the enemy staggering losses in men and materials". Knowing via Ultra that Hitler was not planning to retreat from Normandy, Montgomery, on 6 August 1944, ordered an envelopment operation against Army Group B—with the First Canadian Army under Harry Crerar to advance towards Falaise, the Second British Army under Miles Dempsey to advance towards Argentan, and the Third American Army under George S. Patton to advance to Alençon.
On 11 August, Montgomery changed his plan, with the Canadians to take Falaise and to meet the Americans at Argentan. The First Canadian Army launched two operations: Operation Totalize on 7 August, which advanced only slowly over four days in the face of fierce German resistance, and Operation Tractable on 14 August, which finally took Falaise on 17 August. In view of the slow Canadian advance, Patton requested permission to take Falaise, but was refused by Bradley on 13 August, which prompted much controversy, with many historians arguing that Bradley lacked aggression and that Montgomery should have overruled Bradley. The so-called Falaise Gap was closed on 22 August 1944, but several American generals, most notably Patton, accused Montgomery of being insufficiently aggressive in closing it. About 60,000 German soldiers were trapped in Normandy, but before 22 August, about 20,000 Germans had escaped through the Falaise Gap. About 10,000 Germans had been killed in the Battle of the Falaise Gap, which led a stunned Eisenhower, who viewed the battlefield on 24 August, to comment with horror that it was impossible to walk without stepping on corpses. The successful conclusion of the Normandy campaign saw the beginning of the debate between the "American school" and "British school", as both American and British generals started to advance claims about who was most responsible for this victory. Brooke wrote in defence of his protégé Montgomery: "Ike knows nothing about strategy and is 'quite' unsuited to the post of Supreme Commander. It is no wonder that Monty's real high ability is not always realised. Especially so when 'national' spectacles pervert the perspective of the strategic landscape".
Badsey also assessed Montgomery's overall conduct of the Normandy campaign. Replaced as Ground Forces Commander General Eisenhower took over Ground Forces Command on 1 September, while continuing as Supreme Commander, with Montgomery continuing to command the 21st Army Group, now consisting mainly of British and Canadian units. Montgomery bitterly resented this change, although it had been agreed before the D-Day invasion. The British journalist Mark Urban writes that Montgomery seemed unable to grasp that, as the majority of the 2.2 million Allied soldiers fighting against Germany on the Western Front were now American (the ratio was 3:1), it was politically unacceptable to American public opinion to have Montgomery remain as Land Forces Commander: "Politics would not allow him to carry on giving orders to great armies of Americans simply because, in his view, he was better than their generals." Winston Churchill had Montgomery promoted to field marshal by way of compensation. Advance to the Rhine By September, ports like Cherbourg were too far away from the front line, causing the Allies great logistical problems. Antwerp was the third largest port in Europe, a deep-water inland port connected to the North Sea via the river Scheldt, which was wide enough and dredged deep enough to allow the passage of ocean-going ships. On 3 September 1944 Hitler ordered the 15th German Army, which had been stationed in the Pas de Calais region and was withdrawing north into the Low Countries, to hold the mouth of the river Scheldt to deprive the Allies of the use of Antwerp. Field Marshal Gerd von Rundstedt, the German commander of the Western Front, ordered General Gustav-Adolf von Zangen, the commander of the 15th Army, that: "The attempt of the enemy to occupy the West Scheldt in order to obtain the free use of the harbor of Antwerp must be resisted to the utmost" (emphasis in the original).
Rundstedt argued with Hitler that as long as the Allies could not use the port of Antwerp, they would lack the logistical capacity for an invasion of Germany. The Witte Brigade (White Brigade) of the Belgian resistance had captured the Port of Antwerp before the Germans could destroy key port facilities, and on 4 September, Antwerp was captured by Horrocks with its harbour mostly intact. The British declined to advance immediately over the Albert Canal, and an opportunity to destroy the German 15th Army was lost. The Germans had mined the river Scheldt, and as the mouth of the Scheldt was still in German hands it was impossible for the Royal Navy to clear the mines in the river; the port of Antwerp was therefore still useless to the Allies. On 5 September, SHAEF's naval commander, Admiral Sir Bertram Ramsay, had urged Montgomery to make clearing the mouth of the Scheldt his number-one priority. Alone among the senior commanders, only Ramsay saw opening Antwerp as crucial. Thanks to ULTRA, Montgomery was aware of Hitler's order by 5 September. On 9 September, Montgomery wrote to Brooke that "one good Pas de Calais port" would be sufficient to meet the logistical needs of the 21st Army Group, but only of that formation. At the same time, Montgomery noted that "one good Pas de Calais port" would be insufficient for the American armies in France, which would thus force Eisenhower, if for no other reason than logistics, to favour Montgomery's plans for an invasion of northern Germany by the 21st Army Group, whereas if Antwerp were opened up, then all of the Allied armies could be supplied. The importance of ports closer to Germany was highlighted by the liberation of the city of Le Havre, which was assigned to John Crocker's I Corps.
To take Le Havre, two infantry divisions, two tank brigades, most of the artillery of the Second British Army, the specialised armoured "gadgets" of Percy Hobart's 79th Armoured Division, a battleship and a monitor were all committed. On 10 September 1944, Bomber Command dropped 4,719 tons of bombs on Le Havre, the prelude to Operation Astonia, the assault on Le Havre by Crocker's men, which was taken two days later. The Canadian historian Terry Copp wrote that the commitment of this much firepower and manpower to take only one French city might "seem excessive", but by this point the Allies desperately needed ports closer to the front line to sustain their advance. In September 1944, Montgomery ordered Crerar and his First Canadian Army to take the French ports on the English Channel, namely Calais, Boulogne and Dunkirk, and to clear the Scheldt, a task that Crerar stated was impossible as he lacked enough troops to perform both operations at once. Montgomery refused Crerar's request to have the British XII Corps under Neil Ritchie assigned to help clear the Scheldt, as Montgomery stated he needed XII Corps for Operation Market Garden. On 6 September 1944, Montgomery told Crerar that "I want Boulogne badly" and that the city should be taken no matter what the cost. On 22 September 1944, General Guy Simonds's II Canadian Corps took Boulogne, followed by Calais on 1 October 1944. Montgomery was highly impatient with Simonds, complaining that it had taken Crocker's I Corps only two days to take Le Havre while it took Simonds two weeks to take Boulogne and Calais, but Simonds noted that at Le Havre three divisions and two brigades had been employed, whereas at Boulogne and Calais only two brigades were sent in to take each city.
After an attempt to storm the Leopold Canal by the 4th Canadian Division had been badly smashed by the German defenders, Simonds ordered a stop to further attempts to clear the river Scheldt until his mission of capturing the French ports on the English Channel had been accomplished; this allowed the German 15th Army ample time to dig into its new home on the Scheldt. The only port that was not captured by the Canadians was Dunkirk, as Montgomery ordered the 2nd Canadian Division on 15 September to hold his flank at Antwerp as a prelude for an advance up the Scheldt. Montgomery pulled away from the First Canadian Army (temporarily commanded now by Simonds as Crerar was ill) the British 51st Highland Division, 1st Polish Division, British 49th (West Riding) Division and 2nd Canadian Armoured Brigade, and sent all of these formations to help the 2nd British Army expand its position. Montgomery had commanded all Allied ground forces during the Battle of Normandy (Operation Overlord), from D-Day on 6 June 1944 until 1 September 1944. He then continued in command of the 21st Army Group for the rest of the North West Europe campaign, including the failed attempt to cross the Rhine during Operation Market Garden. When German armoured forces broke through the American lines in Belgium during the Battle of the Bulge, Montgomery received command of the northern shoulder of the Bulge. This included temporary command of the US First Army and the US Ninth Army, which held up the German advance to the north of the Bulge while the US Third Army under Patton relieved Bastogne from the south. Montgomery's 21st Army Group, including the US Ninth Army and the First Allied Airborne Army, crossed the Rhine in Operation Plunder in March 1945, two weeks after the US First Army had crossed the Rhine in the Battle of Remagen. By the end of the war, troops under Montgomery's command had taken part in the encirclement of the Ruhr Pocket, liberated the Netherlands, and captured much of north-west Germany.
On 4 May 1945, Montgomery accepted the surrender of the German forces in north-western Europe at Lüneburg Heath, south of Hamburg, after the surrender of Berlin to the USSR on 2 May. After the war he became Commander-in-Chief of the British Army of the Rhine (BAOR) in Germany and then Chief of the Imperial General Staff (1946–1948). From 1948 to 1951, he served as Chairman of the Commanders-in-Chief Committee of the Western Union. He then served as NATO's Deputy Supreme Allied Commander Europe until his retirement in 1958. Early life Montgomery was born in Kennington, Surrey, in 1887, the fourth child of nine, to a Church of Ireland minister, Henry Montgomery, and his wife, Maud (née Farrar). The Montgomerys, an Ulster-Scots 'Ascendancy' gentry family, were the County Donegal branch of the Clan Montgomery. Henry Montgomery, at that time Vicar of St Mark's Church, Kennington, was the second son of Sir Robert Montgomery, a native of Inishowen in County Donegal in Ulster and a noted colonial administrator in British India, who died a month after his grandson's birth. Sir Robert was probably a descendant of Colonel Alexander Montgomery (1686–1729). Bernard's mother, Maud, was the daughter of the Very Reverend Frederic William Farrar, the famous preacher, and was eighteen years younger than her husband. After the death of Sir Robert Montgomery, Henry inherited the Montgomery ancestral estate of New Park in Moville in Inishowen in Ulster. There was still £13,000 to pay on a mortgage, a large debt in the 1880s, and Henry was at the time still only an Anglican vicar. Despite selling off all the farms that were at Ballynally, "there was barely enough to keep up New Park and pay for the blasted summer holiday" (i.e., at New Park). It was a financial relief of some magnitude when, in 1889, Henry was made Bishop of Tasmania, then still a British colony, and Bernard spent his formative years there.
Bishop Montgomery considered it his duty to spend as much time as possible in the rural areas of Tasmania and was away for up to six months at a time. While he was away, his wife, still in her mid-twenties, gave her children "constant" beatings, then ignored them most of the time as she performed the public duties of the bishop's wife. Of Bernard's siblings, Sibyl died prematurely in Tasmania, and Harold, Donald and Una all emigrated. Maud Montgomery took little active interest in the education of her young children other than to have them taught by tutors brought from Britain. The loveless environment made Bernard something of a bully, as he himself recalled, "I was a dreadful little boy. I don't suppose anybody would put up with my sort of behaviour these days." Later in life Montgomery refused to allow his son David to have anything to do with his grandmother, and refused to attend her funeral in 1949. The family returned to England once for a Lambeth Conference in 1897, and Bernard and his brother Harold were educated for a term at The King's School, Canterbury. In 1901, Bishop Montgomery became secretary of the Society for the Propagation of the Gospel, and the family returned to London. Montgomery attended St Paul's School and then the Royal Military College, Sandhurst, from which he was almost expelled for rowdiness and violence. On graduation in September 1908 he was commissioned into the 1st Battalion the Royal Warwickshire Regiment as a second lieutenant, and first saw overseas service later that year in India. He was promoted to lieutenant in 1910, and in 1912 became adjutant of the 1st Battalion of his regiment at Shorncliffe Army Camp. First World War The Great War began in August 1914 and Montgomery moved to France with his battalion that month, which was at the time part of the 10th Brigade of the 4th Division. He saw action at the Battle of Le Cateau that month and during the retreat from Mons. 
At Méteren, near the Belgian border at Bailleul, on 13 October 1914, during an Allied counter-offensive, he was shot through the right lung by a sniper. Montgomery was hit once more, in the knee. He was awarded the Distinguished Service Order for gallant leadership: the citation for this award, published in the London Gazette in December 1914, reads: "Conspicuous gallant leading on 13th October, when he turned the enemy out of their trenches with the bayonet. He was severely wounded." After recovering in early 1915, he was appointed brigade major, first of the 112th Brigade and then of the 104th Brigade, training in Lancashire. He returned to the Western Front in early 1916 as a general staff officer in the 33rd Division and took part in the Battle of Arras in April–May 1917. He became a general staff officer with IX Corps, part of General Sir Herbert Plumer's Second Army, in July 1917. Montgomery served at the Battle of Passchendaele in late 1917 before finishing the war as GSO1 (effectively chief of staff) of the 47th (2nd London) Division, with the temporary rank of lieutenant-colonel. A photograph from October 1918, reproduced in many biographies, shows the then unknown Lieutenant-Colonel Montgomery standing in front of Winston Churchill (then the Minister of Munitions) at the parade following the liberation of Lille. Between the world wars 1920s After the First World War Montgomery commanded the 17th (Service) Battalion of the Royal Fusiliers, a battalion in the British Army of the Rhine, before reverting to his substantive rank of captain (brevet major) in November 1919. He had not at first been selected for the Staff College in Camberley, Surrey (his only hope of ever achieving high command), but at a tennis party in Cologne he was able to persuade the Commander-in-chief (C-in-C) of the British Army of Occupation, Field Marshal Sir William Robertson, to add his name to the list.
After graduating from the Staff College, he was appointed brigade major in the 17th Infantry Brigade in January 1921. The brigade was stationed in County Cork, Ireland, carrying out counter-insurgency operations during the final stages of the Irish War of Independence. Montgomery came to the conclusion that the conflict could not be won without harsh measures, and that self-government for Ireland was the only feasible solution; in 1923, after the establishment of the Irish Free State and during the Irish Civil War, Montgomery wrote to Colonel Arthur Ernest Percival of the Essex Regiment to that effect. In May 1923, Montgomery was posted to the 49th (West Riding) Infantry Division, a Territorial Army (TA) formation. He returned to the 1st Battalion, Royal Warwickshire Regiment in 1925 as a company commander and was promoted to major in July 1925. From January 1926 to January 1929 he served as Deputy Assistant Adjutant General at the Staff College, Camberley, in the temporary rank of lieutenant-colonel. Marriage and family In 1925, in his first known courtship of a woman, Montgomery, then in his late thirties, proposed to a 17-year-old girl, Miss Betty Anderson. His approach included drawing diagrams in the sand of how he would deploy his tanks and infantry in a future war, a contingency which seemed very remote at that time. She respected his ambition and single-mindedness, but declined his proposal of marriage. In 1927, he met and married Elizabeth (Betty) Carver, née Hobart. She was the sister of the future Second World War commander Major-General Sir Percy Hobart. Betty Carver had two sons in their early teens, John and Dick, from her first marriage to Oswald Carver. Dick Carver later wrote that it had been "a very brave thing" for Montgomery to take on a widow with two children. Montgomery's son, David, was born in August 1928.
While on holiday in Burnham-on-Sea in 1937, Betty suffered an insect bite which became infected, and she died in her husband's arms from septicaemia following amputation of her leg. The loss devastated Montgomery, who was then serving as a brigadier, but he insisted on throwing himself back into his work immediately after the funeral. Montgomery's marriage had been extremely happy. Much of his correspondence with his wife was destroyed when his quarters at Portsmouth were bombed during the Second World War. After Montgomery's death, John Carver wrote that his mother had arguably done the country a favour by keeping his personal oddities—his extreme single-mindedness, and his intolerance of and suspicion of the motives of others—within reasonable bounds long enough for him to have a chance of attaining high command. Both of Montgomery's stepsons became army officers in the 1930s (both were serving in India at the time of their mother's death), and both served in the Second World War, each eventually attaining the rank of colonel. While serving as a GSO2 with Eighth Army, Dick Carver was sent forward during the pursuit after El Alamein to help identify a new site for Eighth Army HQ. He was taken prisoner at Mersa Matruh on 7 November 1942. Montgomery wrote to his contacts in England asking that inquiries be made via the Red Cross as to where his stepson was being held, and that parcels be sent to him. Like many British POWs, the most famous being General Richard O'Connor, Dick Carver escaped in September 1943 during the brief hiatus between Italy's departure from the war and the German seizure of the country. He eventually reached British lines on 5 December 1943, to the delight of his stepfather, who sent him home to Britain to recuperate. 1930s In January 1929 Montgomery was promoted to brevet lieutenant-colonel. 
That month he returned to the 1st Battalion, Royal Warwickshire Regiment again, as Commander of Headquarters Company; he went to the War Office to help write the Infantry Training Manual in mid-1929. In 1931 Montgomery was promoted to substantive lieutenant-colonel, became the Commanding Officer (CO) of the 1st Battalion, Royal Warwickshire Regiment, and saw service in Palestine and British India. He was promoted to colonel in June 1934 (seniority from January 1932). He then served as an instructor at the Indian Army Staff College (now the Pakistan Command and Staff College) in Quetta, British India. On completion of his tour of duty in India, Montgomery returned to Britain in June 1937, where he took command of the 9th Infantry Brigade with the temporary rank of brigadier. His wife died that year. In 1938, he organised an amphibious combined operations landing exercise that impressed the new C-in-C of Southern Command, General Sir Archibald Percival Wavell. He was promoted to major-general on 14 October 1938 and took command of the 8th Infantry Division in the British mandate of Palestine. In Palestine, Montgomery was involved in suppressing an Arab revolt which had broken out over opposition to Jewish immigration. Reporting the suppression of the revolt in April 1939, Montgomery wrote, "I shall be sorry to leave Palestine in many ways, as I have enjoyed the war out here". He returned to Britain in July 1939, suffering a serious illness on the way, to command the 3rd (Iron) Infantry Division. Second World War British Expeditionary Force Retreat to Dunkirk and evacuation Britain declared war on Germany on 3 September 1939. The 3rd Division was deployed to France as part of the British Expeditionary Force (BEF).
During this time, Montgomery faced serious trouble from his military superiors and the clergy for his frank attitude regarding the sexual health of his soldiers, but was defended from dismissal by his superior Alan Brooke, commander of II Corps. Montgomery had issued a circular on the prevention of venereal disease, worded in such "obscene language" that both the Church of England and Roman Catholic senior chaplains objected; Brooke told Monty that he did not want any further errors of this kind, though deciding not to get him to formally withdraw it as it would remove any "vestige of respect" left for him. Montgomery's training paid off when the Germans began their invasion of the Low Countries on 10 May 1940 and the 3rd Division advanced to the River Dijle and then withdrew to Dunkirk with great professionalism, entering the Dunkirk perimeter in a famous night-time march that placed his forces on the left flank, which had been left exposed by the Belgian surrender. Early in the campaign, when the 3rd Division was near Leuven, they were fired on by members of the Belgian 10th Infantry Division who mistook them for German paratroopers; Montgomery resolved the incident by approaching them and offering to place himself under Belgian command. The 3rd Division returned to Britain intact with minimal casualties. During Operation Dynamo—the evacuation of 330,000 BEF and French troops to Britain—Montgomery assumed command of the II Corps. On his return Montgomery antagonised the War Office with trenchant criticisms of the command of the BEF and was briefly relegated back to divisional command of 3rd Division. 3rd Division was at that time the only fully equipped division in Britain. He was made a Companion of the Order of the Bath. Montgomery was ordered to make ready his 3rd Division to invade the neutral Portuguese Azores. Models of the islands were prepared and detailed plans worked out for the invasion. 
The invasion plans did not go ahead, and planning switched to invading the Cape Verde islands, which also belonged to neutral Portugal. These invasion plans also did not go ahead. Montgomery was then ordered to prepare plans for the invasion of neutral Ireland and to seize Cork, Cobh and Cork harbour. These invasion plans, like those for the Portuguese islands, also did not go ahead, and in July 1940 Montgomery was appointed acting lieutenant-general and placed in command of V Corps, responsible for the defence of Hampshire and Dorset, where he started a long-running feud with the new Commander-in-chief (C-in-C) of Southern Command, Lieutenant-General Claude Auchinleck. In April 1941, he became commander of XII Corps, responsible for the defence of Kent. During this period he instituted a regime of continuous training and insisted on high levels of physical fitness for both officers and other ranks. He was ruthless in sacking officers he considered would be unfit for command in action. Promoted to temporary lieutenant-general in July, in December Montgomery was given command of South-Eastern Command, overseeing the defence of Kent, Sussex and Surrey. He renamed his command the South-Eastern Army to promote offensive spirit. During this time he further developed and rehearsed his ideas and trained his soldiers, culminating in Exercise Tiger in May 1942, a combined forces exercise involving 100,000 troops. North Africa and Italy Montgomery's early command In 1942, a new field commander was required in the Middle East, where Auchinleck was filling the dual role of Commander-in-chief (C-in-C) of Middle East Command and commander of the Eighth Army. He had stabilised the Allied position at the First Battle of El Alamein, but after a visit in August 1942, the Prime Minister, Winston Churchill, replaced him as C-in-C with General Sir Harold Alexander, and appointed William Gott commander of the Eighth Army in the Western Desert.
However, after Gott was killed flying back to Cairo, Churchill was persuaded by Brooke, who by this time was Chief of the Imperial General Staff (CIGS), to give Montgomery command of the Eighth Army instead; Montgomery had only just been nominated to replace Alexander as commander of the British First Army for Operation Torch, the invasion of French North Africa. A story, probably apocryphal but popular at the time, is that the appointment caused Montgomery to remark that "After having an easy war, things have now got much more difficult." A colleague is supposed to have told him to cheer up—at which point Montgomery said "I'm not talking about me, I'm talking about Rommel!" Montgomery's assumption of command transformed the fighting spirit and abilities of the Eighth Army. Taking command on 13 August 1942, he immediately became a whirlwind of activity. He ordered the creation of the X Corps, which contained all armoured divisions, to fight alongside his XXX Corps, which was all infantry divisions. This arrangement differed from the German Panzer Corps: one of Rommel's Panzer Corps combined infantry, armour and artillery units under one corps commander. The only common commander for Montgomery's all-infantry and all-armour corps was the Eighth Army Commander himself. Correlli Barnett commented that Montgomery's solution "... was in every way opposite to Auchinleck's and in every way wrong, for it carried the existing dangerous separatism still further." Montgomery reinforced the long front line at El Alamein, something that would take two months to accomplish. He asked Alexander to send him two new British divisions (51st Highland and 44th Home Counties) that were then arriving in Egypt and were scheduled to be deployed in defence of the Nile Delta. He moved his field HQ to Burg al Arab, close to the Air Force command post, in order to better coordinate combined operations.
Montgomery was determined that the army, navy and air forces should fight their battles in a unified, focused manner according to a detailed plan. He ordered immediate reinforcement of the vital heights of Alam Halfa, just behind his own lines, expecting the German commander, Erwin Rommel, to attack with the heights as his objective, something that Rommel soon did. Montgomery ordered all contingency plans for retreat to be destroyed. "I have cancelled the plan for withdrawal. If we are attacked, then there will be no retreat. If we cannot stay here alive, then we will stay here dead", he told his officers at the first meeting he held with them in the desert, though, in fact, Auchinleck had no plans to withdraw from the strong defensive position he had chosen and established at El Alamein. Montgomery made a great effort to appear before troops as often as possible, frequently visiting various units and making himself known to the men, often arranging for cigarettes to be distributed. Although he still wore a standard British officer's cap on arrival in the desert, he briefly wore an Australian broad-brimmed hat before switching to the black beret (with the badge of the Royal Tank Regiment and the British General Officer's badge) for which he became notable. The black beret was offered to him by Jim Fraser while the latter was driving him on an inspection tour. Both Brooke and Alexander were astonished by the transformation in atmosphere when they visited on 19 August, less than a week after Montgomery had taken command. Alanbrooke said that Churchill was always impatient for his generals to attack at once, and wrote that Montgomery was always "my Monty" when out of favour with Churchill. Eden had some late-night drinks with Churchill, and at a meeting of the Chiefs of Staff the next day (29 October 1942) Eden said that the Middle East offensive was "petering out".
Alanbrooke had told Churchill "fairly plainly" what he thought of Eden's ability to judge the tactical situation from a distance, and was supported at the Chiefs of Staff meeting by Smuts. First battles with Rommel Rommel attempted to turn the left flank of the Eighth Army at the Battle of Alam el Halfa from 31 August 1942. The attack by the German and Italian armour and infantry was stopped in very heavy fighting. Rommel's forces had to withdraw urgently lest their retreat through the British minefields be cut off. Montgomery was criticised for not counter-attacking the retreating forces immediately, but he felt strongly that his methodical build-up of British forces was not yet ready. A hasty counter-attack risked ruining his strategy for an offensive on his own terms in late October, planning for which had begun soon after he took command. He was confirmed in the permanent rank of lieutenant-general in mid-October. The conquest of Libya was essential for airfields to support Malta and to threaten the rear of Axis forces opposing Operation Torch. Montgomery prepared meticulously for the new offensive after convincing Churchill that the time was not being wasted. (Churchill sent a telegram to Alexander on 23 September 1942 which began, "We are in your hands and of course a victorious battle makes amends for much delay.") He was determined not to fight until he thought there had been sufficient preparation for a decisive victory, and put his beliefs into action with the gathering of resources, detailed planning, the training of troops—especially in clearing minefields and fighting at night—and the use of 252 of the latest American-built Sherman tanks and 90 M7 Priest self-propelled howitzers; he also made a personal visit to every unit involved in the offensive. By the time the offensive was ready in late October, Eighth Army had 231,000 men on its ration strength.
El Alamein The Second Battle of El Alamein began on 23 October 1942, and ended 12 days later with one of the first large-scale, decisive Allied land victories of the war. Montgomery correctly predicted both the length of the battle and the number of casualties (13,500). Montgomery blamed heavy rain for the failure to cut off the retreating Axis forces; historian Correlli Barnett has pointed out that the rain also fell on the Germans, and that the weather is therefore an inadequate explanation for the failure to exploit the breakthrough, but nevertheless the Battle of El Alamein had been a great success. Over 30,000 prisoners of war were taken, including the German second-in-command, General von Thoma, as well as eight other general officers. Rommel, having been in a hospital in Germany at the start of the battle, was forced to return on 25 October 1942 after Stumme—his replacement as German commander—died of a heart attack in the early hours of the battle. Tunisia Montgomery was advanced to KCB and promoted to full general. He kept the initiative, applying superior strength when it suited him, forcing Rommel out of each successive defensive position. On 6 March 1943, Rommel's attack on the over-extended Eighth Army at Medenine (Operation Capri) with the largest concentration of German armour in North Africa was successfully repulsed. At the Mareth Line, 20 to 27 March, when Montgomery encountered fiercer frontal opposition than he had anticipated, he switched his major effort into an outflanking inland pincer, backed by low-flying RAF fighter-bomber support. For his role in North Africa he was awarded the Legion of Merit by the United States government in the rank of Chief Commander. Sicily The next major Allied attack was the Allied invasion of Sicily (Operation Husky). Montgomery considered the initial plans for the Allied invasion, which had been agreed in principle by General Dwight D.
Eisenhower, the Supreme Allied Commander at Allied Forces Headquarters, and General Alexander, the 15th Army Group commander, to be unworkable because of the dispersion of effort. He managed to have the plans recast to concentrate the Allied forces, having Lieutenant General George Patton's US Seventh Army land in the Gulf of Gela (on the left flank of the Eighth Army, which landed around Syracuse in the south-east of Sicily) rather than near Palermo in the west and north of Sicily. Inter-Allied tensions grew as the American commanders, Patton and Omar Bradley (then commanding US II Corps under Patton), took umbrage at what they saw as Montgomery's attitudes and boastfulness. However, while the three were considered among the greatest soldiers of their time, their competitiveness made them renowned for "squabbling like three schoolgirls", thanks to their "bitchiness", "whining to their superiors" and "showing off". Italian campaign During late 1943, Montgomery continued to command the Eighth Army during the landings on the mainland of Italy itself, beginning with Operation Baytown. In conjunction with the Anglo-American landings at Salerno (near Naples) by Lieutenant General Mark Clark's US Fifth Army and seaborne landings by British paratroops in the heel of Italy (including the key port of Taranto, where they disembarked without resistance directly into the port), Montgomery led the Eighth Army up the toe of Italy. Montgomery abhorred what he considered to be a lack of coordination, a dispersion of effort, a strategic muddle and a lack of opportunism in the Allied effort in Italy, and he said that he was glad to leave the "dog's breakfast" on 23 December 1943. Normandy Montgomery returned to Britain in January 1944. He was assigned to command the 21st Army Group, consisting of all Allied ground forces participating in Operation Overlord, codename for the Allied invasion of Normandy.
Overall direction was assigned to the Supreme Allied Commander of the Allied Expeditionary Forces, American General Dwight D. Eisenhower. Both Churchill and Eisenhower had found Montgomery difficult to work with in the past and wanted the position to go to the more affable General Sir Harold Alexander. However, Montgomery's patron, General Sir Alan Brooke, firmly argued that Montgomery was a much superior general to Alexander and ensured his appointment. Without Brooke's support, Montgomery would have remained in Italy. At St Paul's School on 7 April and 15 May Montgomery presented his strategy for the invasion. He envisaged a ninety-day battle, with all forces reaching the Seine. The campaign would pivot on an Allied-held Caen in the east of the Normandy bridgehead, with relatively static British and Canadian armies forming a shoulder to attract and defeat German counter-attacks, relieving the US armies, who would move to seize the Cotentin Peninsula and Brittany before wheeling south and then east on the right to form a pincer. During the ten weeks of the Battle of Normandy, unseasonably stormy weather disrupted the Normandy landing areas. Montgomery's initial plan was for the Anglo-Canadian troops under his command to break out immediately from their beachheads on the Calvados coast towards Caen, with the aim of taking the city either on D-Day or within the following two days. Montgomery attempted to take Caen with the 3rd Infantry Division, 50th (Northumbrian) Infantry Division and the 3rd Canadian Division, but was stopped from 6–8 June by 21st Panzer Division and 12th SS Panzer Division Hitlerjugend, who hit the advancing Anglo-Canadian troops very hard. Rommel followed up this success by ordering the 2nd Panzer Division to Caen, while Field Marshal Gerd von Rundstedt asked for and received permission from Hitler to have the elite 1st Waffen SS Division Leibstandarte Adolf Hitler and 2nd Waffen SS Division Das Reich sent to Caen as well.
Montgomery thus had to face what Stephen Badsey called the "most formidable" of all the German divisions in France. The 12th Waffen SS Division Hitlerjugend, as its name implies, was drawn entirely from the more fanatical elements of the Hitler Youth and commanded by the ruthless SS-Brigadeführer Kurt Meyer, aka "Panzer Meyer". The failure to take Caen immediately has been the source of an immense historiographical dispute with bitter nationalist overtones. Broadly, there has been a "British school" which accepts Montgomery's post-war claim that he never intended to take Caen at once, and instead the Anglo-Canadian operations around Caen were a "holding operation" intended to attract the bulk of the German forces towards the Caen sector to allow the Americans to stage the "break out operation" on the left flank of the German positions, which was all part of Montgomery's "Master Plan" that he had conceived long before the Normandy campaign. By contrast, the "American school" argued that Montgomery's initial "master plan" was for the 21st Army Group to take Caen at once and move his tank divisions into the plains south of Caen, to then stage a breakout that would lead the 21st Army Group into the plains of northern France and hence into Antwerp and finally the Ruhr. Letters written by Eisenhower at the time of the battle make it clear that Eisenhower was expecting from Montgomery "the early capture of the important focal point of Caen". Later, when this plan had clearly failed, Eisenhower wrote that Montgomery had "evolved" the plan to have the US forces achieve the break-out instead. As the campaign progressed, Montgomery altered his initial plan for the invasion and continued the strategy of attracting and holding German counter-attacks in the area north of Caen rather than to the south, to allow the US First Army in the west to take Cherbourg. 
A memo summarising Montgomery's operations, written by Eisenhower's chief of staff, General Walter Bedell Smith, who met with Montgomery in late June 1944, says nothing about Montgomery conducting a "holding operation" in the Caen sector, and instead speaks of him seeking a "breakout" into the plains south of the Seine. On 12 June, Montgomery ordered the 7th Armoured Division into an attack against the Panzer Lehr Division that made good progress at first but ended when the Panzer Lehr was joined by the 2nd Panzer Division. At Villers Bocage on 14 June, the British lost twenty Cromwell tanks to five Tiger tanks led by SS Obersturmführer Michael Wittmann, in about five minutes. Despite the setback at Villers Bocage, Montgomery was still optimistic, as the Allies were landing more troops and supplies than they were losing in battle, and though the German lines were holding, the Wehrmacht and Waffen SS were suffering considerable attrition. Air Marshal Sir Arthur Tedder complained that it was impossible to move fighter squadrons to France until Montgomery had captured some airfields, something he asserted that Montgomery appeared incapable of doing. The first V-1 flying bomb attacks on London, which started on 13 June, further increased the pressure on Montgomery from Whitehall to speed up his advance. On 18 June, Montgomery ordered Bradley to take Cherbourg while the British were to take Caen by 23 June. In Operation Epsom, the British VIII Corps, commanded by Sir Richard O'Connor, attempted to outflank Caen from the west by breaking through the dividing line between the Panzer Lehr and the 12th SS to take the strategic Hill 112. Epsom began well, with O'Connor's assault force (the British 15th (Scottish) Division) breaking through and the 11th Armoured Division stopping the counter-attacks of the 12th SS Division. General Friedrich Dollmann of the 7th Army had to commit the newly arrived II SS Corps to stop the British offensive.
Dollmann, fearing that Epsom would be a success, committed suicide and was replaced by SS Obergruppenführer Paul Hausser. O'Connor, at the cost of about 4,000 men, had won a deep and wide salient, but placed the Germans into an unviable long-term position.
He is often hailed as the "Dutch Hippocrates". Biography Boerhaave was born at Voorhout near Leiden. The son of a Protestant pastor, in his youth Boerhaave studied for a divinity degree and wanted to become a preacher. After the death of his father, however, he was offered a scholarship and he entered the University of Leiden, where he took his master's degree in philosophy in 1690, with a dissertation titled De distinctione mentis a corpore (On the Difference of the Mind from the Body). There he attacked the doctrines of Epicurus, Thomas Hobbes and Baruch Spinoza. He then turned to the study of medicine. He earned his medical doctorate from the University of Harderwijk (present-day Gelderland) in 1693, with a dissertation titled De utilitate explorandorum in aegris excrementorum ut signorum (The Utility of Examining Signs of Disease in the Excrement of the Sick). In 1701 he was appointed lecturer on the institutes of medicine at Leiden; in his inaugural discourse, De commendando Hippocratis studio, he recommended to his pupils that great physician as their model. In 1709 he became professor of botany and medicine, and in that capacity he did good service, not only to his own university, but also to botanical science, by his improvements and additions to the botanic garden of Leiden, and by the publication of numerous works descriptive of new species of plants. On 14 September 1710, Boerhaave married Maria Drolenvaux, the daughter of the rich merchant, Alderman Abraham Drolenvaux. They had four children, of whom one daughter, Maria Joanna, lived to adulthood. In 1722, he began to suffer from an extreme case of gout, recovering the next year. In 1714, when he was appointed rector of the university, he succeeded Govert Bidloo in the chair of practical medicine, and in this capacity he introduced the modern system of clinical instruction. Four years later he was appointed to the chair of chemistry as well.
In 1728 he was elected into the French Academy of Sciences, and two years later into the Royal Society of London. In 1729 declining health obliged him to resign the chairs of chemistry and botany; and he died, after a lingering and painful illness, at Leiden. Legacy His reputation so increased the fame of the University of Leiden, especially as a school of medicine, that it became popular with visitors from every part of Europe. All the princes of Europe sent him pupils, who found in this skilful professor not only an indefatigable teacher, but an affectionate guardian. When Peter the Great went to Holland in 1716 (he had been in Holland before, in 1697, to instruct himself in maritime affairs), he also took lessons from Boerhaave. Voltaire travelled to see him, as did Carl Linnaeus, who became a close friend and named the genus Boerhavia for him. His reputation was not confined to Europe; a Chinese mandarin sent him a letter addressed to "the illustrious Boerhaave, physician in Europe," and it reached him in due course. The operating theatre of the University of Leiden in which he once worked as an anatomist is now at the centre of a museum named after him: the Boerhaave Museum. The asteroid 8175 Boerhaave is also named after him. From 1955 to 1961 Boerhaave's image was printed on Dutch 20-guilder banknotes. The Leiden University Medical Centre organises medical training courses called Boerhaave courses. He had a prodigious influence on the development of medicine and chemistry in Scotland. British medical schools credit Boerhaave for developing the system of medical education upon which their current institutions are based. Every founding member of the Edinburgh Medical School had studied at Leiden and attended Boerhaave's lectures on chemistry, including John Rutherford and Francis Home. Boerhaave's Elementa Chemiae (1732) is recognised as the first text on chemistry.
Boerhaave first described Boerhaave syndrome, which involves tearing of the oesophagus, usually a consequence of vigorous vomiting. Notoriously, in 1724 he described the case of Baron Jan van Wassenaer, a Dutch admiral who died of this condition following a gluttonous feast and subsequent regurgitation. The condition was uniformly fatal prior to modern surgical techniques allowing repair of the oesophagus. Boerhaave was critical of his Dutch contemporary Baruch Spinoza, attacking him in his 1688 dissertation. At the same time, he admired Isaac Newton and was a devout Christian who often wrote about God in his works. A collection of his religious thoughts on medicine, translated from Latin to English, has been compiled by the Sir Thomas Browne Instituut Leiden under the name Boerhaave's Orations (meaning "Boerhaave's Prayers"). Among other things, he considered nature as God's Creation and he used to say that the poor were his best patients because God was their paymaster. Medical contributions Boerhaave devoted himself intensively to the study of the human body. He was strongly influenced by the mechanistic theories of René Descartes, and those of the 17th-century astronomer and mathematician Giovanni Borelli,
was signed on 13 July 1878 at the Radziwill Palace in Berlin. Disraeli and Salisbury returned home to heroes' receptions at Dover and in London. At the door of 10 Downing Street, Disraeli received flowers sent by the Queen. There, he told the gathered crowd, "Lord Salisbury and I have brought you back peace—but a peace I hope with honour." The Queen offered him a dukedom, which he declined, though he accepted the Garter, provided that Salisbury also received it. In Berlin, word spread of Bismarck's admiring description of Disraeli: "Der alte Jude, das ist der Mann!" ("The old Jew, that is the man!") Afghanistan to Zululand In the weeks after Berlin, Disraeli and the cabinet considered calling a general election to capitalise on the public applause he and Salisbury had received. Parliaments then sat for a seven-year term, and it was the custom not to go to the country until the sixth year unless forced to by events. Only four and a half years had passed since the last general election. Additionally, they did not see any clouds on the horizon that might forecast Conservative defeat if they waited. This decision not to call an election has often been cited as a great mistake on Disraeli's part. Blake, however, pointed out that results in local elections had been moving against the Conservatives, and doubted if Disraeli missed any great opportunity by waiting. As successful invasions of India generally came through Afghanistan, the British had observed and sometimes intervened there since the 1830s, hoping to keep the Russians out. In 1878 the Russians sent a mission to Kabul; it was not rejected by the Afghans, as the British had hoped. The British then proposed to send their own mission, insisting that the Russians be sent away. The Viceroy of India, Lord Lytton, concealed from Disraeli his plans to issue this ultimatum, and when the Prime Minister insisted he take no action, went ahead anyway.
When the Afghans made no answer, the British advanced against them in the Second Anglo-Afghan War, and under Lord Roberts easily defeated them. The British installed a new ruler, and left a mission and garrison in Kabul. British policy in South Africa was to encourage federation between the British-run Cape Colony and Natal and the Boer republics, the Transvaal (annexed by Britain in 1877) and the Orange Free State. The governor of Cape Colony, Sir Bartle Frere, believing that the federation could not be accomplished until the native tribes acknowledged British rule, made demands on the Zulu and their king, Cetewayo, which they were certain to reject. As Zulu troops could not marry until they had washed their spears in blood, they were eager for combat. Frere did not send word to the cabinet of what he had done until the ultimatum was about to expire. Disraeli and the cabinet reluctantly backed him, and in early January 1879 resolved to send reinforcements. Before they could arrive, on 22 January, a Zulu impi, or army, moving with great speed and endurance, destroyed a British encampment in South Africa in the Battle of Isandlwana. Over a thousand British and colonial troops were killed. Word of the defeat did not reach London until 12 February. Disraeli wrote the next day, "the terrible disaster has shaken me to the centre". He reprimanded Frere but left him in charge, a compromise that attracted fire from all sides. Disraeli sent General Sir Garnet Wolseley as High Commissioner and Commander in Chief, and Cetewayo and the Zulus were crushed at the Battle of Ulundi on 4 July 1879. On 8 September 1879 Sir Louis Cavagnari, in charge of the mission in Kabul, was killed with his entire staff by rebelling Afghan soldiers. Roberts undertook a successful punitive expedition against the Afghans over the next six weeks.
1880 election Gladstone, in the 1874 election, had been returned for Greenwich, finishing second behind a Conservative in the two-member constituency, a result he termed more like a defeat than a victory. In December 1878, he was offered the Liberal nomination at the next election for Edinburghshire, a constituency popularly known as Midlothian. The small Scottish electorate was dominated by two noblemen, the Conservative Duke of Buccleuch and the Liberal Earl of Rosebery. The Earl, a friend of both Disraeli and Gladstone who would succeed the latter after his final term as Prime Minister, had journeyed to the United States to view politics there, and was convinced that aspects of American electioneering techniques could be translated to Britain. On his advice, Gladstone accepted the offer in January 1879, and later that year began his Midlothian campaign, speaking not only in Edinburgh but across Britain, attacking Disraeli before huge crowds. Conservative chances of re-election were damaged by poor weather and its consequent effects on agriculture. Four consecutive wet summers through 1879 had led to poor harvests. In the past, the farmer had the consolation of higher prices at such times, but with bumper crops cheaply transported from the United States, grain prices remained low. Other European nations, faced with similar circumstances, opted for protection, and Disraeli was urged to reinstitute the Corn Laws. He declined, stating that he regarded the matter as settled. Protection would have been highly unpopular among the newly enfranchised urban working classes, as it would raise their cost of living. Amid a general economic slump, the Conservatives lost support among farmers. Disraeli's health continued to fail through 1879. Owing to his infirmities, he was three-quarters of an hour late for the Lord Mayor's Dinner at the Guildhall in November, at which it is customary that the Prime Minister speaks.
Though many commented on how healthy he looked, it took him great effort to appear so, and when he told the audience he expected to speak to the dinner again the following year, attendees chuckled—Gladstone was then in the midst of his campaign. Despite his public confidence, Disraeli recognised that the Conservatives would probably lose the next election, and was already contemplating his Resignation Honours. Despite this pessimism, Conservative hopes were buoyed in early 1880 by successes in by-elections the Liberals had expected to win, concluding with victory in Southwark, normally a Liberal stronghold. The cabinet had resolved to wait before dissolving Parliament; in early March they reconsidered, agreeing to go to the country as soon as possible. Parliament was dissolved on 24 March; the first borough constituencies began voting a week later. Disraeli took no public part in the electioneering, it being deemed improper for peers to make speeches to influence Commons elections. This meant that the chief Conservatives—Disraeli, Salisbury, and India Secretary Lord Cranbrook—would not be heard from. The election was thought likely to be close. Once returns began to be announced, it became clear that the Conservatives were being decisively beaten. The final result gave the Liberals an absolute majority of about 50. Final months, death, and memorials Disraeli refused to cast blame for the defeat, which he understood was likely to be final for him. He wrote to Lady Bradford that it was just as much work to end a government as to form one, without any of the fun. Queen Victoria was bitter at his departure as Prime Minister. Among the honours he arranged before resigning as Prime Minister on 21 April 1880 was one for his private secretary, Montagu Corry, who became Baron Rowton. Returning to Hughenden, Disraeli brooded over his electoral dismissal, but also resumed work on Endymion, which he had begun in 1872 and laid aside before the 1874 election.
The work was rapidly completed and published by November 1880. He carried on a correspondence with Victoria, with letters passed through intermediaries. When Parliament met in January 1881, he served as Conservative leader in the Lords, attempting to act as a moderating influence on Gladstone's legislation. Suffering from asthma and gout, Disraeli went out as little as possible, fearing more serious episodes of illness. In March, he fell ill with bronchitis, and emerged from bed only for a meeting with Salisbury and other Conservative leaders on the 26th. As it became clear that this might be his final sickness, friends and opponents alike came to call. Disraeli declined a visit from the Queen, saying, "She would only ask me to take a message to Albert." Almost blind, when he received the last letter from Victoria of which he was aware on 5 April, he held it momentarily, then had it read to him by Lord Barrington, a Privy Councillor. One card, signed "A Workman" and reading "Don't die yet, we can't do without you", delighted its recipient. Despite the gravity of Disraeli's condition, the doctors concocted optimistic bulletins for public consumption. The Prime Minister, Gladstone, called several times to enquire about his rival's condition, and wrote in his diary, "May the Almighty be near his pillow." There was intense public interest in the former Prime Minister's struggles for life. Disraeli had customarily taken the sacrament at Easter; when this day was observed on 17 April, there was discussion among his friends and family as to whether he should be given the opportunity, but those against, fearing that he would lose hope, prevailed. On the morning of the following day, Easter Monday, he became incoherent, then comatose. Disraeli's last confirmed words before dying at his home at 19 Curzon Street in the early morning of 19 April were "I had rather live but I am not afraid to die".
The anniversary of Disraeli's death was for some years commemorated in the United Kingdom as Primrose Day. Although Queen Victoria had offered a state funeral, Disraeli's executors decided against a public procession and funeral, fearing that too large a crowd would gather to do him honour. The chief mourners at the service at Hughenden on 26 April were his brother Ralph and nephew Coningsby, to whom Hughenden would eventually pass. Queen Victoria was prostrated with grief, and considered ennobling Ralph or Coningsby as a memorial to Disraeli (without children, his titles became extinct with his death), but decided against it on the ground that their means were too small for a peerage. Protocol forbade her attending Disraeli's funeral (this would not be changed until 1965, when Elizabeth II attended the rites for the former Prime Minister Sir Winston Churchill), but she sent primroses ("his favourite flowers") to the funeral, and visited the burial vault to place a wreath of china flowers four days later. Disraeli is buried with his wife in a vault beneath the Church of St Michael and All Angels, which stands in the grounds of his home, Hughenden Manor, accessed from the churchyard. There is also a memorial to him in the chancel of the church, erected in his honour by Queen Victoria. His literary executor was his private secretary, Lord Rowton. The Disraeli vault also contains the body of Sarah Brydges Willyams, the wife of James Brydges Willyams of St Mawgan in Cornwall. Disraeli carried on a long correspondence with Mrs Willyams, writing frankly about political affairs. At her death in 1865, she left him a large legacy, which helped clear his debts. His will was proved in April 1882 at £84,019 18s. 7d. Disraeli has a memorial in Westminster Abbey. This monument was erected by the nation on the motion of Gladstone in his memorial speech on Disraeli in the House of Commons.
Gladstone had absented himself from the funeral; his plea of the press of public business was met with public mockery. His speech was widely anticipated, if only because his dislike for Disraeli was well known, and it caused the Prime Minister much worry. In the event, the speech was a model of its kind: he avoided comment on Disraeli's politics while praising his personal qualities. Legacy Disraeli's literary and political career interacted over his lifetime and fascinated Victorian Britain, making him "one of the most eminent figures in Victorian public life", and occasioned a large output of commentary. Critic Shane Leslie noted three decades after his death that "Disraeli's career was a romance such as no Eastern vizier or Western plutocrat could tell. He began as a pioneer in dress and an aesthete of words ... Disraeli actually made his novels come true." Literary Disraeli's novels are his main literary achievement. They have from the outset divided critical opinion. The writer R. W. Stewart observed that there have always been two criteria for judging Disraeli's novels—one political and the other artistic. The critic Robert O'Kell, concurring, writes, "It is after all, even if you are a Tory of the staunchest blue, impossible to make Disraeli into a first-rate novelist. And it is equally impossible, no matter how much you deplore the extravagances and improprieties of his works, to make him into an insignificant one." Disraeli's early "silver fork" novels Vivian Grey (1826) and The Young Duke (1831) featured romanticised depictions of aristocratic life (despite his ignorance of it) with character sketches of well-known public figures lightly disguised. In some of his early fiction Disraeli also portrayed himself and what he felt to be his Byronic dual nature: the poet and the man of action. His most autobiographical novel was Contarini Fleming (1832), an avowedly serious work that did not sell well.
The critic William Kuhn suggests that Disraeli's fiction can be read as "the memoirs he never wrote", revealing the inner life of a politician for whom the norms of Victorian public life appeared to represent a social straitjacket—particularly with regard to what Kuhn sees as the author's "ambiguous sexuality". Of the other novels of the early 1830s, Alroy is described by Blake as "profitable but unreadable", and The Rise of Iskander (1833), The Infernal Marriage and Ixion in Heaven (1834) made little impact. Henrietta Temple (1837) was Disraeli's next major success. It draws on the events of his affair with Henrietta Sykes to tell the story of a debt-ridden young man torn between a mercenary loveless marriage and a passionate love-at-first-sight for the eponymous heroine. Venetia (1837) was a minor work, written to raise much-needed cash. In the 1840s Disraeli wrote a trilogy of novels with political themes. The first, Coningsby; or, The New Generation (1844), attacks the evils of the Whig Reform Bill of 1832 and castigates the leaderless Conservatives for failing to respond; with it, in Blake's view, Disraeli "infused the novel genre with political sensibility, espousing the belief that England's future as a world power depended not on the complacent old guard, but on youthful, idealistic politicians." Sybil; or, The Two Nations (1845), which reveals Peel's betrayal over the Corn Laws, was less idealistic than Coningsby; the "two nations" of its sub-title referred to the huge economic and social gap between the privileged few and the deprived working classes. These themes were expanded in the last of the trilogy, Tancred; or, The New Crusade (1847), which promotes the Church of England's role in reviving Britain's flagging spirituality. Disraeli often wrote about religion, for he was a strong promoter of the Church of England.
He was troubled by the growth of elaborate rituals in the late 19th century, such as the use of incense and vestments, and heard warnings to the effect that the ritualists were going to turn control of the Church of England over to the Pope. He consequently was a strong supporter of the Public Worship Regulation Act 1874, which allowed the archbishops to go to court to stop the ritualists. Disraeli's last completed novels were Lothair (1870) and Endymion (1880). Lothair was "Disraeli's ideological Pilgrim's Progress"; it tells a story of political life with particular regard to the roles of the Anglican and Roman Catholic churches. It reflected anti-Catholicism of the sort that was popular in Britain, and which fuelled support for Italian unification ("Risorgimento"). Endymion, despite having a Whig as hero, is a last exposition of the author's economic policies and political beliefs. Disraeli continued to the last to pillory his enemies in barely disguised caricatures: the character St Barbe in Endymion is widely seen as a parody of Thackeray, who had offended Disraeli more than thirty years earlier by lampooning him in Punch as "Codlingsby". Disraeli left an unfinished novel in which the priggish central character, Falconet, is unmistakably a caricature of Gladstone. Blake comments that Disraeli "produced an epic poem, unbelievably bad, and a five-act blank verse tragedy, if possible worse. Further he wrote a discourse on political theory and a political biography, the Life of Lord George Bentinck, which is excellent ... remarkably fair and accurate." Political In the years after Disraeli's death, as Salisbury began his reign of more than twenty years over the Conservatives, the party emphasised the late leader's "One Nation" views: that the Conservatives at root shared the beliefs of the working classes, while the Liberals were the party of the urban élite. Disraeli had, for example, stressed the need to improve the lot of the urban labourer.
The memory of Disraeli was used by the Conservatives to appeal to the working classes, with whom he was said to have had a rapport. This aspect of his policies has been re-evaluated by historians in the 20th and 21st centuries. In 1972 B. H. Abbott stressed that it was not Disraeli but Lord Randolph Churchill who invented the term "Tory democracy", though it was Disraeli who made it an essential part of Conservative policy and philosophy. In 2007 Parry wrote, "The tory democrat myth did not survive detailed scrutiny by professional historical writing of the 1960s [which] demonstrated that Disraeli had very little interest in a programme of social legislation and was very flexible in handling parliamentary reform in 1867." Despite this, Parry sees Disraeli, rather than Peel, as the founder of the modern Conservative party. The Conservative politician and writer Douglas Hurd wrote in 2013, "[Disraeli] was not a one-nation Conservative—and this was not simply because he never used the phrase. He rejected the concept in its entirety." Disraeli's enthusiastic propagation of the British Empire has also been seen as appealing to working-class voters. Before his leadership of the Conservative Party, imperialism was the province of the Liberals, most notably Palmerston, with the Conservatives murmuring dissent across the aisle. Disraeli made the Conservatives the party that most loudly supported both the Empire and military action to assert its primacy. This came about in part because Disraeli's own views tended that way, in part because he saw advantage for the Conservatives, and in part in reaction against Gladstone, who disliked the expense of empire. Blake argued that Disraeli's imperialism "decisively orientated the Conservative party for many years to come, and the tradition which he started was probably a bigger electoral asset in winning working-class support during the last quarter of the century than anything else".
Some historians have commented on a romantic impulse behind Disraeli's approach to Empire and foreign affairs: Abbott writes, "To the mystical Tory concepts of Throne, Church, Aristocracy and People, Disraeli added Empire." Others have identified a strongly pragmatic aspect to his policies. Gladstone's biographer Philip Magnus contrasted Disraeli's grasp of foreign affairs with that of Gladstone, who "never understood that high moral principles, in their application to foreign policy, are more often destructive of political stability than motives of national self-interest." In Parry's view, Disraeli's foreign policy "can be seen as a gigantic castle in the air (as it was by Gladstone), or as an overdue attempt to force the British commercial classes to awaken to the realities of European politics." During his lifetime Disraeli's opponents, and sometimes even his friends and allies, questioned whether he sincerely held the views he propounded, or whether they were adopted by him as essential to one who sought to spend his life in politics, and were mouthed by him without conviction. Lord John Manners, in 1843 at the time of Young England, wrote, "could I only satisfy myself that D'Israeli believed all that he said, I should be more happy: his historical views are quite mine, but does he believe them?" Blake (writing in 1966) suggested that it is no more possible to answer that question now than it was then. Nevertheless, Paul Smith, in his journal article on Disraeli's politics, argues that Disraeli's ideas were coherently argued over a political career of nearly half a century, and "it is impossible to sweep them aside as a mere bag of burglar's tools for effecting felonious entry to the British political pantheon." Stanley Weintraub, in his biography of Disraeli, points out that his subject did much to advance Britain towards the 20th century, carrying one of the two great Reform Acts of the 19th despite the opposition of his Liberal rival, Gladstone. 
"He helped preserve constitutional monarchy by drawing the Queen out of mourning into a new symbolic national role and created the climate for what became 'Tory democracy'. He articulated an imperial role for Britain that would last into World War II and brought an intermittently self-isolated Britain into the concert of Europe." Frances Walsh comments on Disraeli's multifaceted public life: The debate about his place in the Conservative pantheon has continued since his death. Disraeli fascinated and divided contemporary opinion; he was seen by many, including some members of his own party, as an adventurer and a charlatan and by others as a far-sighted and patriotic statesman. As an actor on the political stage he played many roles: Byronic hero, man of letters, social critic, parliamentary virtuoso, squire of Hughenden, royal companion, European statesman. His singular and complex personality has provided historians and biographers with a particularly stiff challenge. Historian Llewellyn Woodward has evaluated Disraeli: Disraeli's political ideas have not stood the test of time ... His detachment from English prejudices did not give him any particular insight into foreign affairs; as a young man he accepted the platitudes of Metternich and failed to understand the meaning of the nationalist movements in Europe. The imperialism of his later years was equally superficial: an interpretation of politics without economics. Disraeli liked to think of himself in terms of pure intellect, but his politics were more personal than intellectual in character. He had far-reaching schemes but little administrative ability, and there was some foundation for Napoleon III's judgement that he was 'like all literary men, from Chateaubriand to Guizot, ignorant of the world' ... In spite of these faults ... Disraeli's courage, quickness of wit, capacity for affection, and freedom from sordid motives earned him his position.
His ambition was of the nobler sort. He brought politics nearer to poetry, or, at all events, to poetical prose, than any English politician since Burke. Historical writers have often played Disraeli and Gladstone against each other as great rivals. Roland Quinault, however, cautions us not to exaggerate the confrontation: they were not direct antagonists for most of their political careers. Indeed, initially they were both loyal to the Tory party, the Church and the landed interest. Although their paths diverged over the repeal of the Corn Laws in 1846 and later over fiscal policy more generally, it was not until the later 1860s that their differences over parliamentary reform, Irish and Church policy assumed great partisan significance. Even then their personal relations remained fairly cordial until their dispute over the Eastern Question in the later 1870s. Role of Judaism By 1882, 46,000 Jews lived in England and, by 1890, Jewish emancipation was complete in every walk of life. Since 1858, Parliament has never been without practising Jewish members. The first Jewish Lord Mayor of London, Sir David Salomons, was elected in 1855, followed by the 1858 emancipation of the Jews. On 26 July 1858, Lionel de Rothschild was finally allowed to sit in the British House of Commons when the law restricting the oath of office to Christians was changed. Disraeli, a baptised Christian of Jewish parentage, was at this point already an MP. In 1884 Nathan Mayer Rothschild, 1st Baron Rothschild, became the first Jewish member of the British House of Lords; Disraeli had already been a member. Though born a Jew, Disraeli had been baptised as a child, which made him eligible for political office, free of the restriction of a mandated Christian oath. As a leader of the Conservative Party, with its ties to the landed aristocracy, Disraeli used his Jewish ancestry to claim an aristocratic heritage of his own.
His biographer Jonathan Parry argues: Disraeli convinced himself (wrongly) that he derived from the Sephardi aristocracy of Iberian Jews driven from Spain at the end of the fifteenth century ... Presenting himself as Jewish symbolized Disraeli's uniqueness when he was fighting for respect, and explained his set-backs. Presenting Jewishness as aristocratic and religious legitimized his claim to understand the perils facing modern England and to offer 'national' solutions to them. English toryism was 'copied from the mighty [Jewish] prototype' (Coningsby, bk 4, chap. 15). Disraeli was thus able to square his Jewishness with his equally deep attachment to England and her history. Todd Endelman points out that "The link between Jews and old clothes was so fixed in the popular imagination that Victorian political cartoonists regularly drew Benjamin Disraeli (1804–81) as an old clothes man in order to stress his Jewishness." He adds, "Before the 1990s ... few biographers of Disraeli or historians of Victorian politics acknowledged the prominence of the antisemitism that accompanied his climb up the greasy pole or its role in shaping his own singular sense of Jewishness." According to Michael Ragussis: What began in the 1830s as scattered anti-Semitic remarks aimed at him [Disraeli] by the crowds in his early electioneering became in the 1870s a kind of national scrutiny of his Jewishness — a scrutiny that erupted into a kind of anti-Semitic attack led by some of the most prominent intellectuals and politicians of the time and anchored in the charge that Disraeli was a crypto-Jew. Popular culture Depiction in 19th- and early 20th-century culture Historian Michael Diamond reports that for British music hall patrons in the 1880s and 1890s, "xenophobia and pride in empire" were reflected in the halls' most popular political heroes: all were Conservatives and Disraeli stood out above all, even decades after his death, while Gladstone was used as a villain.
Film historian Roy Armes has argued that historical films helped maintain the political status quo in Britain in the 1920s and 1930s by imposing an establishment viewpoint that emphasized the greatness of monarchy, empire, and tradition. The films created "a facsimile world where existing values were invariably validated by events in the film and where all discord could be turned into harmony by an acceptance of the status quo." Steven Fielding has argued that Disraeli was an especially popular film hero: "historical dramas favoured Disraeli over Gladstone and, more substantively, promulgated an essentially deferential view of democratic leadership." Stage and screen actor George Arliss was known for his portrayals of Disraeli, winning the Academy Award for Best Actor for 1929's Disraeli. Fielding says Arliss "personified the kind of paternalistic, kindly, homely statesmanship that appealed to a significant proportion of the cinema audience ... Even workers attending Labour party meetings deferred to leaders with an elevated social background who showed they cared."

Works by Disraeli

Novels
Vivian Grey (1826)
Popanilla (1828)
The Young Duke (1831)
Contarini Fleming (1832)
Ixion in Heaven (1832/3)
The Wondrous Tale of Alroy (1833)
The Rise of Iskander (1833)
The Infernal Marriage (1834)
A Year at Hartlebury, or The Election (with Sarah Disraeli, 1834)
Henrietta Temple (1837)
Venetia (1837)
Coningsby, or the New Generation (1844)
Sybil, or The Two Nations (1845)
Tancred, or the New Crusade (1847)
Lothair (1870)
Endymion (1880)
Falconet (unfinished, 1881)

Poetry
The Revolutionary Epick (1834)

Drama
The Tragedy of Count Alarcos (1839)

Non-fiction
An Inquiry into the Plans, Progress, and Policy of the American Mining Companies (1825)
Lawyers and Legislators: or, Notes on the American Mining Companies (1825)
The Present State of Mexico (1825)
England and France, or a Cure for the Ministerial Gallomania (1832)
What Is He? (1833)
The Vindication of the English Constitution (1835)
The Letters of Runnymede (1836)
Lord George Bentinck (1852)

Notes and references

Sources
Text also available online at the Oxford Dictionary of National Biography.
Woodward, Llewellyn. The Age of Reform, 1815–1870 (Oxford University Press, 1938; 2nd ed. 1962).

Further reading
Braun, Thom. Disraeli the Novelist (Routledge, 2016).
Bright, J. Franck. A History of England. Period 4: Growth of Democracy: Victoria 1837–1880 (1893).

States. In 1862, Disraeli met Prussian Count Otto von Bismarck for the first time and said of him, "be careful about that man, he means what he says". The party truce ended in 1864, with Tories outraged over Palmerston's handling of the territorial dispute between the German Confederation and Denmark known as the Schleswig-Holstein Question. Disraeli had little help from Derby, who was ill, but he united the party enough on a no-confidence vote to limit the government to a majority of 18—Tory defections and absentees kept Palmerston in office. Despite rumours about Palmerston's health as he passed his eightieth birthday, he remained personally popular, and the Liberals increased their margin in the July 1865 general election. In the wake of the poor election results, Derby predicted to Disraeli that neither of them would ever hold office again. Political plans were thrown into disarray by Palmerston's death on 18 October 1865. Russell became Prime Minister again, with Gladstone clearly the Liberal Party's leader-in-waiting, and as Leader of the House Disraeli's direct opponent. One of Russell's early priorities was a Reform Bill, but the proposed legislation that Gladstone announced on 12 March 1866 divided his party. The Conservatives and the dissident Liberals repeatedly attacked Gladstone's bill, and in June finally defeated the government; Russell resigned on 26 June.
The dissidents were unwilling to serve under Disraeli in the House of Commons, and Derby formed a third Conservative minority government, with Disraeli again as Chancellor. Tory Democrat: the 1867 Reform Act It was Disraeli's belief that if given the vote British people would use it instinctively to put their natural and traditional rulers, the gentlemen of the Conservative Party, into power. Responding to renewed agitation in the country for popular suffrage, Disraeli persuaded a majority of the cabinet to agree to a Reform bill. With what Derby cautioned was "a leap in the dark", Disraeli had outflanked the Liberals who, as the supposed champions of Reform, dared not oppose him. In the absence of a credible party rival, and for fear of having an election called on the issue, Conservatives felt obliged to support Disraeli despite their misgivings. There were Tory dissenters, most notably Lord Cranborne (as Robert Cecil was by then known), who resigned from the government and spoke against the bill, accusing Disraeli of "a political betrayal which has no parallel in our Parliamentary annals". Even as Disraeli accepted Liberal amendments (although pointedly refusing those moved by Gladstone) that further lowered the property qualification, Cranborne was unable to lead an effective rebellion. Disraeli gained wide acclaim and became a hero to his party for the "marvellous parliamentary skill" with which he secured the passage of Reform in the Commons. From the Liberal benches, too, there was admiration. The recognised wit and MP for Nottingham, Bernal Osborne, declared: "I have always thought the Chancellor of the Exchequer was the greatest Radical in the House. He has achieved what no other man in the country could have done. He has lugged up that great omnibus full of stupid, heavy, country gentlemen—I only say 'stupid' in the parliamentary sense—and has converted these Conservatives into Radical Reformers." The Reform Act 1867 passed that August.
It extended the franchise by 938,427 men—an increase of 88%—by giving the vote to male householders and male lodgers paying at least £10 for rooms. It eliminated rotten boroughs with fewer than 10,000 inhabitants, and granted constituencies to 15 unrepresented towns, with extra representation to large municipalities such as Liverpool and Manchester. First term as Prime Minister; Opposition leader Derby had long suffered from attacks of gout which sent him to his bed, unable to deal with politics. As the new session of Parliament approached in February 1868, he was bedridden at his home, Knowsley Hall, near Liverpool. He was reluctant to resign, reasoning that he was only 68, much younger than either Palmerston or Russell at the end of their premierships. Derby knew that his "attacks of illness would, at no distant period, incapacitate me from the discharge of my public duties"; doctors had warned him that his health required his resignation from office. In late February, with Parliament in session and Derby absent, he wrote to Disraeli asking for confirmation that "you will not shrink from the additional heavy responsibility". Reassured, he wrote to the Queen, resigning and recommending Disraeli as "only he could command the cordial support, en masse, of his present colleagues". Disraeli went to Osborne House on the Isle of Wight, where the Queen asked him to form a government. The monarch wrote to her daughter, Prussian Crown Princess Victoria, "Mr. Disraeli is Prime Minister! A proud thing for a man 'risen from the people' to have obtained!" The new Prime Minister told those who came to congratulate him, "I have climbed to the top of the greasy pole." First government (February–December 1868) The Conservatives remained a minority in the House of Commons and the passage of the Reform Bill required the calling of a new election once the new voting register had been compiled. 
Disraeli's term as Prime Minister, which began in February 1868, would therefore be short unless the Conservatives won the general election. He made only two major changes in the cabinet: he replaced Lord Chelmsford as Lord Chancellor with Lord Cairns, and brought in George Ward Hunt as Chancellor of the Exchequer. Derby had intended to replace Chelmsford once a vacancy in a suitable sinecure developed. Disraeli was unwilling to wait, and Cairns, in his view, was a far stronger minister. Disraeli's first premiership was dominated by the heated debate over the Church of Ireland. Although Ireland was largely Roman Catholic, the Anglican Church of Ireland represented most landowners. It remained the established church and was funded by direct taxation, which was greatly resented by the Catholics and Presbyterians. An initial attempt by Disraeli to negotiate with Archbishop Manning the establishment of a Catholic university in Dublin foundered in March when Gladstone moved resolutions to disestablish the Irish Church altogether. The proposal united the Liberals under Gladstone's leadership, while causing divisions among the Conservatives. The Conservatives remained in office because the new electoral register was not yet ready; neither party wished a poll under the old roll. Gladstone began using the Liberal majority in the House of Commons to push through resolutions and legislation. Disraeli's government survived until the December general election, at which the Liberals were returned to power with a majority of about 110. In its short life, the first Disraeli government passed noncontroversial laws. It ended public executions, and the Corrupt Practices Act did much to end electoral bribery. It authorised an early version of nationalisation, having the Post Office buy up the telegraph companies. Amendments to the school law, the Scottish legal system, and the railway laws were passed.
Disraeli sent the successful expedition against Tewodros II of Ethiopia under Sir Robert Napier. Opposition leader; 1874 election With Gladstone's Liberal majority dominant in the Commons, Disraeli could do little but protest as the government advanced legislation. Accordingly, he chose to await Liberal mistakes. Having leisure time as he was not in office, he wrote a new novel, Lothair (1870). A work of fiction by a former prime minister was a novelty for Britain, and the book became a best seller. By 1872 there was dissent in the Conservative ranks over the failure to challenge Gladstone and his Liberals. This was quieted as Disraeli took steps to assert his leadership of the party, and as divisions among the Liberals became clear. Public support for Disraeli was shown by cheering at a thanksgiving service in 1872 on the recovery of the Prince of Wales from illness, while Gladstone was met with silence. Disraeli had supported the efforts of party manager John Eldon Gorst to put the administration of the Conservative Party on a modern basis. On Gorst's advice, Disraeli gave a speech to a mass meeting in Manchester that year. To roaring approval, he compared the Liberal front bench to "a range of exhausted volcanoes. Not a flame flickers on a single pallid crest. But the situation is still dangerous. There are occasional earthquakes and ever and anon the dark rumbling of the sea." Gladstone, Disraeli stated, dominated the scene and "alternated between a menace and a sigh". At his first departure from 10 Downing Street in 1868, Disraeli had had Victoria create Mary Anne Viscountess Beaconsfield in her own right in lieu of a peerage for himself. Through 1872 the eighty-year-old peeress was suffering from stomach cancer. She died on 15 December. Urged by a clergyman to turn her thoughts to Jesus Christ in her final days, she said she could not: "You know Dizzy is my J.C." In 1873, Gladstone brought forward legislation to establish a Catholic university in Dublin.
This divided the Liberals, and on 12 March an alliance of Conservatives and Irish Catholics defeated the government by three votes. Gladstone resigned, and the Queen sent for Disraeli, who refused to take office. Without a general election, a Conservative government would be another minority, dependent for survival on the division of its opponents. Disraeli wanted the power a majority would bring, and felt he could gain it later by leaving the Liberals in office now. Gladstone's government struggled on, beset by scandal and unimproved by a reshuffle. As part of that change, Gladstone took on the office of Chancellor, leading to questions as to whether he had to stand for re-election on taking on a second ministry—until the 1920s, MPs becoming ministers, thus taking an office of profit under the Crown, had to seek re-election. In January 1874, Gladstone called a general election, convinced that if he waited longer, he would do worse at the polls. Balloting was spread over two weeks, beginning on 1 February. Disraeli devoted much of his campaign to decrying the Liberal programme of the past five years. As the constituencies voted, it became clear that the result would be a Conservative majority, the first since 1841. In Scotland, where the Conservatives were perennially weak, they increased from seven seats to nineteen. Overall, they won 350 seats to 245 for the Liberals and 57 for the Irish Home Rule League. The Queen sent for Disraeli, and he became Prime Minister for the second time. Second government (1874–1880) Disraeli's cabinet of twelve, with six peers and six commoners, was the smallest since Reform. Of the peers, five had been in Disraeli's 1868 cabinet; the sixth, Lord Salisbury, was reconciled to Disraeli after negotiation and became Secretary of State for India. Lord Stanley (who had succeeded his father, the former Prime Minister, as Earl of Derby) became Foreign Secretary and Sir Stafford Northcote the Chancellor.
In August 1876, Disraeli was elevated to the House of Lords as Earl of Beaconsfield and Viscount Hughenden. The Queen had offered to ennoble him as early as 1868; he had then declined. She did so again in 1874, when he fell ill at Balmoral, but he was reluctant to leave the Commons for a house in which he had no experience. Continued ill health during his second premiership caused him to contemplate resignation, but his lieutenant, Derby, was unwilling, feeling that he could not manage the Queen. For Disraeli, the Lords, where the debate was less intense, was the alternative to resignation from office. Five days before the end of the 1876 session of Parliament, on 11 August, Disraeli was seen to linger and look around the chamber before departing the Commons. Newspapers reported his ennoblement the following morning. In addition to the viscounty bestowed on Mary Anne Disraeli, an earldom of Beaconsfield was to have been bestowed on Edmund Burke in 1797, but he had died before receiving it. The name Beaconsfield, a town near Hughenden, also was given to a minor character in Vivian Grey. Disraeli made various statements about his elevation, writing to Selina, Lady Bradford on 8 August 1876, "I am quite tired of that place [the Commons]", but when asked by a friend how he liked the Lords, replied, "I am dead; dead but in the Elysian fields." Domestic policy Reforming legislation Under the stewardship of Richard Assheton Cross, the Home Secretary, Disraeli's new government enacted many reforms, including the Artisans' and Labourers' Dwellings Improvement Act 1875, which made inexpensive loans available to towns and cities to construct working-class housing. Also enacted were the Public Health Act 1875, modernising sanitary codes throughout the nation, the Sale of Food and Drugs Act (1875), and the Education Act (1876).
Disraeli's government also introduced a new Factory Act meant to protect workers, the Conspiracy and Protection of Property Act 1875, which allowed peaceful picketing, and the Employers and Workmen Act (1875) to enable workers to sue employers in the civil courts if they broke legal contracts. As a result of these social reforms the Liberal-Labour MP Alexander Macdonald told his constituents in 1879, "The Conservative party have done more for the working classes in five years than the Liberals have in fifty." Patronage and Civil Service reform Gladstone in 1870 had sponsored an Order in Council, introducing competitive examination into the Civil Service, diminishing the political aspects of government hiring. Disraeli did not agree, and while he did not seek to reverse the order, his actions often frustrated its intent. For example, Disraeli made political appointments to positions previously given to career civil servants. In this, he was backed by his party, hungry for office and its emoluments after almost thirty years with only brief spells in government. Disraeli gave positions to hard-up Conservative leaders, even—to Gladstone's outrage—creating one office at £2,000 per year. Nevertheless, Disraeli made fewer peers (only 22, one of them a son of Victoria) than had Gladstone—the Liberal leader had arranged for the bestowal of 37 peerages during his just over five years in office. As he had in government posts, Disraeli rewarded old friends with clerical positions, making Sydney Turner, son of a good friend of Isaac D'Israeli, Dean of Ripon. He favoured Low Church clergymen for promotion, disliking other movements in Anglicanism for political reasons. In this, he came into disagreement with the Queen, who out of loyalty to her late husband, Albert, Prince Consort, preferred Broad Church teachings. One controversial appointment had occurred shortly before the 1868 election.
When the position of Archbishop of Canterbury fell vacant, Disraeli reluctantly agreed to the Queen's preferred candidate, Archibald Tait, the Bishop of London. To fill Tait's vacant see, Disraeli was urged by many people to appoint Samuel Wilberforce, the former Bishop of Winchester and leading figure in London society. Disraeli disliked Wilberforce and instead appointed John Jackson, the Bishop of Lincoln. Blake suggested that, on balance, these appointments cost Disraeli more votes than they gained him. Foreign policy Disraeli always considered foreign affairs to be the most critical and most interesting part of statesmanship. Nevertheless, his biographer Robert Blake doubts that his subject had specific ideas about foreign policy when he took office in 1874. He had rarely travelled abroad; since his youthful tour of the Middle East in 1830–1831, he had left Britain only for his honeymoon and three visits to Paris, the last of which was in 1856. As he had criticised Gladstone for a do-nothing foreign policy, he most probably contemplated what actions would reassert Britain's place in Europe. His brief first premiership, and the first year of his second, gave him little opportunity to make his mark in foreign affairs. Suez The Suez Canal, opened in 1869, cut weeks and thousands of miles off the sea journey between Britain and India; in 1875, approximately 80% of the ships using the canal were British. In the event of another rebellion in India, or of a Russian invasion, the time saved at Suez might be crucial. The canal had been built by French interests, and 56% of the shares remained in their hands, while 44% belonged to Isma'il Pasha, the Khedive of Egypt. He was notorious for his profligate spending. The canal was losing money, and an attempt by Ferdinand de Lesseps, builder of the canal, to raise the tolls had fallen through when the Khedive threatened to use military force to prevent it; the episode had also attracted Disraeli's attention.
The Khedive governed Egypt under the Ottoman Empire; as in the Crimea, the issue of the Canal raised the Eastern Question of what to do about the decaying empire governed from Constantinople. With much of the pre-canal trade and communications between Britain and India passing through the Ottoman Empire, Britain had done its best to prop up the Ottomans against the threat that Russia would take Constantinople, cutting those communications, and giving Russian ships unfettered access to the Mediterranean. The French might also threaten those lines. Britain had had the opportunity to purchase shares in the canal but had declined to do so. Disraeli, recognising the British interest in the canal, sent the Liberal MP Nathan Rothschild to Paris to enquire about buying de Lesseps's shares. On 14 November 1875, the editor of the Pall Mall Gazette, Frederick Greenwood, learned from London banker Henry Oppenheim that the Khedive was seeking to sell his shares in the Suez Canal Company to a French firm. Greenwood quickly told Lord Derby, the Foreign Secretary, who notified Disraeli. The Prime Minister moved immediately to secure the shares. On 23 November, the Khedive offered to sell the shares for 100,000,000 francs. Rather than seek the aid of the Bank of England, Disraeli asked Lionel de Rothschild to loan funds. Rothschild did so and took a commission on the deal. The banker's capital was at risk as Parliament could have refused to ratify the transaction. The contract for purchase was signed at Cairo on 25 November and the shares deposited at the British consulate the following day. Disraeli told the Queen, "it is settled; you have it, madam!" The public saw the venture as a daring statement of British dominance of the seas. Sir Ian Malcolm described the Suez Canal share purchase as "the greatest romance of Mr. Disraeli's romantic career". In the following decades, the security of the Suez Canal, as the pathway to India, became a major concern of British foreign policy. 
Under Gladstone Britain took control of Egypt in 1882. A later Foreign Secretary, Lord Curzon, described the canal in 1909 as "the determining influence of every considerable movement of British power to the east and south of the Mediterranean". Royal Titles Act Although initially curious about Disraeli when he entered Parliament in 1837, Victoria came to detest him over his treatment of Peel. Over time, her dislike softened, especially as Disraeli took pains to cultivate her. He told Matthew Arnold, "Everybody likes flattery; and, when you come to royalty, you should lay it on with a trowel". Disraeli's biographer, Adam Kirsch, suggests that Disraeli's obsequious treatment of his queen was part flattery, part belief that this was how a queen should be addressed by a loyal subject, and part awe that a middle-class man of Jewish birth should be the companion of a monarch. By the time of his second premiership, Disraeli had built a strong relationship with Victoria, probably closer to her than any of her Prime Ministers except her first, Lord Melbourne. When Disraeli returned as Prime Minister in 1874 and went to kiss hands, he did so literally, on one knee; and, according to Richard Aldous in his book on the rivalry between Disraeli and Gladstone, "for the next six years Victoria and Disraeli would exploit their closeness for mutual advantage." Victoria had long wished to have an imperial title, reflecting Britain's expanding domain. She was irked when Tsar Alexander II held a higher rank than her as an emperor, and was appalled that her daughter, the Prussian Crown Princess, would outrank her when her husband came to the throne. She also saw an imperial title as proclaiming Britain's increased stature in the world. The title "Empress of India" had been used informally with respect to Victoria for some time and she wished to have that title formally bestowed on her. 
The Queen prevailed upon Disraeli to introduce a Royal Titles Bill, and also told him of her intent to open Parliament in person, which during this time she did only when she wanted something from legislators. Disraeli was cautious in response, as careful soundings of MPs brought a negative reaction, and he declined to place such a proposal in the Queen's Speech. Once the desired bill was finally prepared, Disraeli's handling of it was not adept. He neglected to notify either the Prince of Wales or the Opposition, and was met by irritation from the prince and a full-scale attack from the Liberals. An old enemy of Disraeli, former Liberal Chancellor Robert Lowe, alleged during the debate in the Commons that two previous Prime Ministers had refused to introduce such legislation for the Queen. Gladstone immediately stated that he was not one of them, and the Queen gave Disraeli leave to quote her saying she had never approached a Prime Minister with such a proposal. According to Blake, Disraeli "in a brilliant oration of withering invective proceeded to destroy Lowe", who apologised and never held office again. Disraeli said of Lowe that he was the only person in London with whom he would not shake hands and, "he is in the mud and there I leave him." Fearful of losing, Disraeli was reluctant to bring the bill to a vote in the Commons, but when he eventually did, it passed with a majority of 75. Once the bill was formally enacted, Victoria began signing her letters "Victoria R & I" (Regina et Imperatrix, that is, Queen and Empress). According to Aldous, "the unpopular Royal Titles Act, however, shattered Disraeli's authority in the House of Commons". Balkans and Bulgaria In July 1875 Serb populations in Bosnia and Herzegovina, then provinces of the Ottoman Empire, rose in revolt against their Turkish masters, alleging religious persecution and poor administration.
The following January, Sultan Abdülaziz agreed to reforms proposed by Hungarian statesman Julius Andrássy, but the rebels, suspecting they might win their freedom, continued their uprising, joined by militants in Serbia and Bulgaria. The Turks suppressed the Bulgarian uprising harshly, and when reports of these actions emerged, Disraeli and Derby stated in Parliament that they did not believe them. Disraeli called them "coffee-house babble" and dismissed allegations of torture by the Ottomans since "Oriental people usually terminate their connections with culprits in a more expeditious fashion". Gladstone, who had left the Liberal leadership and retired from public life, was appalled by reports of atrocities in Bulgaria, and in August 1876 penned a hastily written pamphlet arguing that the Turks should be deprived of Bulgaria because of what they had done there. He sent a copy to Disraeli, who called it "vindictive and ill-written ... of all the Bulgarian horrors perhaps the greatest". Gladstone's pamphlet became an immense best-seller and rallied the Liberals to urge that the Ottoman Empire should no longer be a British ally. Disraeli wrote to Lord Salisbury on 3 September, "Had it not been for these unhappy 'atrocities', we should have settled a peace very honourable to England and satisfactory to Europe. Now we are obliged to work from a new point of departure, and dictate to Turkey, who has forfeited all sympathy." In spite of this, Disraeli's policy favoured Constantinople and the territorial integrity of its empire. Disraeli and the cabinet sent Salisbury as lead British representative to the Constantinople Conference, which met in December 1876 and January 1877. In advance of the conference, Disraeli sent Salisbury private word to seek British military occupation of Bulgaria and Bosnia, and British control of the Ottoman Army. Salisbury ignored these instructions, which his biographer, Andrew Roberts, deemed "ludicrous".
Nevertheless, the conference failed to reach agreement with the Turks. Parliament opened in February 1877, with Disraeli now in the Lords as Earl of Beaconsfield. He spoke only once there in the 1877 session on the Eastern Question, stating on 20 February that there was a need for stability in the Balkans, and that forcing Turkey into territorial concessions would do nothing to secure it. The Prime Minister wanted a deal with the Ottomans whereby Britain would temporarily occupy strategic areas to deter the Russians from war, to be returned on the signing of a peace treaty, but found little support in his cabinet, which favoured partition of the Ottoman Empire. As Disraeli, by then in poor health, continued to battle within the cabinet, Russia invaded Turkey on 21 April, beginning the Russo-Turkish War. Congress of Berlin The Russians pushed through Ottoman territory and by December 1877 had captured the strategic Bulgarian town of Plevna; their march on Constantinople seemed inevitable. The war divided the British, but the Russian success caused some to forget the atrocities and call for intervention on the Turkish side. Others hoped for further Russian successes. The fall of Plevna was a major story for weeks in the newspapers, and Disraeli's warnings that Russia was a threat to British interests in the eastern Mediterranean were deemed prophetic. The jingoistic attitude of many Britons increased Disraeli's political support, and the Queen acted to help him as well, showing her favour by visiting him at Hughenden—the first time she had visited the country home of her Prime Minister since the Melbourne ministry. At the end of January 1878, the Ottoman Sultan appealed to Britain to save Constantinople. Amid war fever in Britain, the government asked Parliament to vote £6,000,000 to prepare the Army and Navy for war. Gladstone opposed the measure, but less than half his party voted with him. 
Popular opinion was with Disraeli, though some thought him too soft for not immediately declaring war on Russia. With the Russians close to Constantinople, the Turks yielded and in March 1878 signed the Treaty of San Stefano, conceding a Bulgarian state which would cover a large part of the Balkans. It would be initially Russian-occupied and many feared that it would give them a client state close to Constantinople. Other Ottoman possessions in Europe would become independent; additional territory was to be ceded directly to Russia. This was unacceptable to the British, who protested, hoping to get the Russians to agree to attend an international conference which German Chancellor Bismarck proposed to hold at Berlin. The cabinet discussed Disraeli's proposal to position Indian troops at Malta for possible transit to the Balkans and call out reserves. Derby resigned in protest, and Disraeli appointed Salisbury as Foreign Secretary. Amid British preparations for war, the Russians and Turks agreed to discussions at Berlin. In advance of the meeting, confidential negotiations took place between Britain and Russia in April and May 1878. The Russians were willing to make changes to the big Bulgaria, but were determined to retain their new possessions, Bessarabia in Europe and Batum and Kars on the east coast of the Black Sea. To counterbalance this, Britain required a possession in the Eastern Mediterranean where it might base ships and troops, and negotiated with the Ottomans for the cession of Cyprus. Once this was secretly agreed, Disraeli was prepared to allow Russia's territorial gains. The Congress of Berlin was held in June and July 1878, the central relationship in it being that between Disraeli and Bismarck. In later years, the German chancellor would show visitors to his office three pictures on the wall: "the portrait of my Sovereign, there on the right that of my wife, and on the left, there, that of Lord Beaconsfield".
Disraeli caused an uproar in the congress by making his opening address in English, rather than in French, hitherto accepted as the international language of diplomacy. By one account, the British ambassador in Berlin, Lord Odo Russell, hoping to spare the delegates Disraeli's awful French accent, told Disraeli that the congress was hoping to hear a speech in the English tongue by one of its masters. Disraeli left much of the detailed work to Salisbury, concentrating his efforts on making it as difficult as possible for the broken-up big Bulgaria to reunite. Disraeli did not have things all his own way: he intended that Batum be demilitarised, but the Russians obtained their preferred language, and in 1886, fortified the town. Nevertheless, the Cyprus Convention ceding the island to Britain was announced during the congress, and again made Disraeli a sensation. Disraeli gained agreement that Turkey should retain enough of its European possessions to safeguard the Dardanelles. By one account, when met with Russian intransigence, Disraeli told his secretary to order a special train to return them home to begin the war. Although Russia yielded, Tsar Alexander II later described the congress as "a European coalition against Russia, under Bismarck". The Treaty of Berlin was signed on 13 July 1878 at the Radziwill Palace in Berlin. Disraeli and Salisbury returned home to heroes' receptions at Dover and in London. At the door of 10 Downing Street, Disraeli received flowers sent by the Queen. There, he told the gathered crowd, "Lord Salisbury and I have brought you back peace—but a peace I hope with honour." The Queen offered him a dukedom, which he declined, though accepting the Garter, as long as Salisbury also received it. In Berlin, word spread of Bismarck's admiring description of Disraeli, "Der alte Jude, das ist der Mann!
" Afghanistan to Zululand In the weeks after Berlin, Disraeli and the cabinet considered calling a general election to capitalise on the public applause he and Salisbury had received. Parliaments were then for a seven-year term, and it was the custom not to go to the country until the sixth year unless forced to by events. Only four and a half years had passed since the last general election. Additionally, they did not see any clouds on the horizon that might forecast Conservative defeat if they waited. This decision not to seek re-election has often been cited as a great mistake by Disraeli. Blake, however, pointed out that results in local elections had been moving against the Conservatives, and doubted if Disraeli missed any great opportunity by waiting. As successful invasions of India generally came through Afghanistan, the British had observed and sometimes intervened there since the 1830s, hoping to keep the Russians out. In 1878 the Russians sent a mission to Kabul; it was not rejected by the Afghans, as the British had hoped. The British then proposed to send their own mission, insisting that the Russians be sent away. The Viceroy of India Lord Lytton concealed his plans to issue this ultimatum from Disraeli, and when the Prime Minister insisted he take no action, went ahead anyway. When the Afghans made no answer, the British advanced against them in the Second Anglo-Afghan War, and under Lord Roberts easily defeated them. The British installed a new ruler, and left a mission and garrison in Kabul. British policy in South Africa was to encourage federation between the British-run Cape Colony and Natal, and the Boer republics, the Transvaal (annexed by Britain in 1877) and the Orange Free State. The governor of Cape Colony, Sir Bartle Frere, believing that the federation could not be accomplished until the native tribes acknowledged British rule, made demands on the Zulu and their king, Cetewayo, which they were certain to reject. 
As Zulu troops could not marry until they had washed their spears in blood, they were eager for combat. Frere did not send word to the cabinet of what he had done until the ultimatum was about to expire. Disraeli and the cabinet reluctantly backed him, and in early January 1879 resolved to send reinforcements. Before they could arrive, on 22 January, a Zulu impi, or army, moving with great speed and endurance, destroyed a British encampment in South Africa in the Battle of Isandlwana. Over a thousand British and colonial troops were killed. Word of the defeat did not reach London until 12 February. Disraeli wrote the next day, "the terrible disaster has shaken me to the centre". He reprimanded Frere, but left him in charge, attracting fire from all sides. Disraeli sent General Sir Garnet Wolseley as High Commissioner and Commander in Chief, and Cetewayo and the Zulus were crushed at the Battle of Ulundi on 4 July 1879. On 8 September 1879 Sir Louis Cavagnari, in charge of the mission in Kabul, was killed with his entire staff by rebelling Afghan soldiers. Roberts undertook a successful punitive expedition against the Afghans over the next six weeks.

1880 election

Gladstone, in the 1874 election, had been returned for Greenwich, finishing second behind a Conservative in the two-member constituency, a result he termed more like a defeat than a victory. In December 1878, he was offered the Liberal nomination at the next election for Edinburghshire, a constituency popularly known as Midlothian. The small Scottish electorate was dominated by two noblemen, the Conservative Duke of Buccleuch and the Liberal Earl of Rosebery. The Earl, a friend of both Disraeli and Gladstone, who would later succeed the latter as Prime Minister, had journeyed to the United States to view politics there, and was convinced that aspects of American electioneering techniques could be translated to Britain.
On his advice, Gladstone accepted the offer in January 1879, and later that year began his Midlothian campaign, speaking not only in Edinburgh but across Britain, attacking Disraeli, to huge crowds. Conservative chances of re-election were damaged by the poor weather, and consequent effects on agriculture. Four consecutive wet summers through 1879 had led to poor harvests. In the past, the farmer had the consolation of higher prices at such times, but with bumper crops cheaply transported from the United States, grain prices remained low. Other European nations, faced with similar circumstances, opted for protection, and Disraeli was urged to reinstitute the Corn Laws. He declined, stating that he regarded the matter as settled. Protection would have been highly unpopular among the newly enfranchised urban working classes, as it would raise their cost of living. Amid a general economic slump, the Conservatives lost support among farmers. Disraeli's health continued to fail through 1879. Owing to his infirmities, he was three-quarters of an hour late for the Lord Mayor's Dinner at the Guildhall in November, at which it is customary that the Prime Minister speaks. Though many commented on how healthy he looked, it took him great effort to appear so, and when he told the audience he expected to speak to the dinner again the following year, attendees chuckled—Gladstone was then in the midst of his campaign. Despite his public confidence, Disraeli recognised that the Conservatives would probably lose the next election, and was already contemplating his Resignation Honours. Despite this pessimism, Conservative hopes were buoyed in early 1880 with successes in by-elections the Liberals had expected to win, concluding with victory in Southwark, normally a Liberal stronghold. The cabinet had resolved to wait before dissolving Parliament; in early March they reconsidered, agreeing to go to the country as soon as possible.
Parliament was dissolved on 24 March; the first borough constituencies began voting a week later. Disraeli took no public part in the electioneering, it being deemed improper for peers to make speeches to influence Commons elections. This meant that the chief Conservatives—Disraeli, Salisbury, and India Secretary Lord Cranbrook—would not be heard from. The election was thought likely to be close. Once returns began to be announced, it became clear that the Conservatives were being decisively beaten. The final result gave the Liberals an absolute majority of about 50.

Final months, death, and memorials

Disraeli refused to cast blame for the defeat, which he understood was likely to be final for him. He wrote to Lady Bradford that it was just as much work to end a government as to form one, without any of the fun. Queen Victoria was bitter at his departure as Prime Minister. Among the honours he arranged before resigning as Prime Minister on 21 April 1880 was one for his private secretary, Montagu Corry, who became Baron Rowton. Returning to Hughenden, Disraeli brooded over his electoral dismissal, but also resumed work on Endymion, which he had begun in 1872 and laid aside before the 1874 election. The work was rapidly completed and published by November 1880. He carried on a correspondence with Victoria, with letters passed through intermediaries. When Parliament met in January 1881, he served as Conservative leader in the Lords, attempting to serve as a moderating influence on Gladstone's legislation. Suffering from asthma and gout, Disraeli went out as little as possible, fearing more serious episodes of illness. In March, he fell ill with bronchitis, and emerged from bed only for a meeting with Salisbury and other Conservative leaders on the 26th. As it became clear that this might be his final sickness, friends and opponents alike came to call. Disraeli declined a visit from the Queen, saying, "She would only ask me to take a message to Albert."
Almost blind, when he received the last letter from Victoria of which he was aware on 5 April, he held it momentarily, then had it read to him by Lord Barrington, a Privy Councillor. One card, signed "A Workman", delighted its recipient: "Don't die yet, we can't do without you." Despite the gravity of Disraeli's condition, the doctors concocted optimistic bulletins for public consumption. The Prime Minister, Gladstone, called several times to enquire about his rival's condition, and wrote in his diary, "May the Almighty be near his pillow." There was intense public interest in the former Prime Minister's struggle for life. Disraeli had customarily taken the sacrament at Easter; when this day was observed on 17 April, there was discussion among his friends and family as to whether he should be given the opportunity, but those against, fearing that he would lose hope, prevailed. On the morning of the following day, Easter Monday, he became incoherent, then comatose. Disraeli's last confirmed words before dying at his home at 19 Curzon Street in the early morning of 19 April were "I had rather live but I am not afraid to die". The anniversary of Disraeli's death was for some years commemorated in the United Kingdom as Primrose Day. Despite having been offered a state funeral by Queen Victoria, Disraeli's executors decided against a public procession and funeral, fearing that too large a crowd would gather to do him honour. The chief mourners at the service at Hughenden on 26 April were his brother Ralph and nephew Coningsby, to whom Hughenden would eventually pass. Queen Victoria was prostrated with grief, and considered ennobling Ralph or Coningsby as a memorial to Disraeli (without children, his titles became extinct with his death) but decided against it on the ground that their means were too small for a peerage.
Protocol forbade her attending Disraeli's funeral (this would not be changed until 1965, when Elizabeth II attended the rites for the former Prime Minister Sir Winston Churchill), but she sent primroses ("his favourite flowers") to the funeral, and visited the burial vault to place a wreath of china flowers four days later. Disraeli is buried with his wife in a vault beneath the Church of St Michael and All Angels, which stands in the grounds of his home, Hughenden Manor, accessed from the churchyard. There is also a memorial to him in the chancel.
Comparison

The so-called "exact" (Clopper–Pearson) method is the most conservative. (Exact does not mean perfectly accurate; rather, it indicates that the estimates will not be less conservative than the true value.) The Wald method, although commonly recommended in textbooks, is the most biased.

Related distributions

Sums of binomials

If X ~ B(n, p) and Y ~ B(m, p) are independent binomial variables with the same probability p, then X + Y is again a binomial variable; its distribution is Z = X + Y ~ B(n + m, p). A binomially distributed random variable X ~ B(n, p) can be considered as the sum of n Bernoulli-distributed random variables, so the sum of two binomially distributed random variables X ~ B(n, p) and Y ~ B(m, p) is equivalent to the sum of n + m Bernoulli-distributed random variables, which means Z = X + Y ~ B(n + m, p). This can also be proven directly using the addition rule. However, if X and Y do not have the same probability p, then the variance of the sum will be smaller than the variance of a binomial variable distributed as B(n + m, p̄), where p̄ is the weighted average of the two success probabilities.

Poisson binomial distribution

The binomial distribution is a special case of the Poisson binomial distribution, which is the distribution of a sum of n independent non-identical Bernoulli trials B(pi).

Ratio of two binomial distributions

This result was first derived by Katz and coauthors in 1978. Let X ~ B(n, p1) and Y ~ B(m, p2) be independent, and let T = (X/n)/(Y/m). Then log(T) is approximately normally distributed with mean log(p1/p2) and variance ((1/p1) − 1)/n + ((1/p2) − 1)/m.

Conditional binomials

If X ~ B(n, p) and Y | X ~ B(X, q) (the conditional distribution of Y, given X), then Y is a simple binomial random variable with distribution Y ~ B(n, pq). For example, imagine throwing n balls to a basket UX and taking the balls that hit and throwing them to another basket UY. If p is the probability to hit UX, then X ~ B(n, p) is the number of balls that hit UX. If q is the probability to hit UY, then the number of balls that hit UY is Y ~ B(X, q) and therefore Y ~ B(n, pq).
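The ball-and-baskets illustration above can be checked by simulation. This is a minimal sketch; the parameter values (n = 50, p = 0.6, q = 0.5, and the seed) are arbitrary choices for the demonstration, not taken from the text:

```python
import random

def throw_twice(n, p, q):
    """Throw n balls at basket UX (hit prob. p), then rethrow the
    hits at basket UY (hit prob. q); return the number landing in UY."""
    x = sum(random.random() < p for _ in range(n))      # X ~ B(n, p)
    return sum(random.random() < q for _ in range(x))   # Y | X ~ B(X, q)

random.seed(1)  # fixed seed so the sketch is reproducible
n, p, q, trials = 50, 0.6, 0.5, 20000
mean_y = sum(throw_twice(n, p, q) for _ in range(trials)) / trials
# If Y ~ B(n, pq), the sample mean of Y should be close to n*p*q = 15.
```

The empirical mean of Y settles near n·p·q, consistent with Y ~ B(n, pq).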
Since Pr(X = k) = C(n, k) p^k (1 − p)^(n − k) and Pr(Y = m | X = k) = C(k, m) q^m (1 − q)^(k − m), by the law of total probability,

Pr(Y = m) = Σ_{k=m}^{n} Pr(Y = m | X = k) Pr(X = k) = Σ_{k=m}^{n} C(n, k) C(k, m) p^k q^m (1 − q)^(k − m) (1 − p)^(n − k).

Since C(n, k) C(k, m) = C(n, m) C(n − m, k − m), the equation above can be expressed as

Pr(Y = m) = Σ_{k=m}^{n} C(n, m) C(n − m, k − m) p^k q^m (1 − q)^(k − m) (1 − p)^(n − k).

Factoring p^k = p^m p^(k − m) and pulling all the terms that don't depend on k out of the sum now yields

Pr(Y = m) = C(n, m) p^m q^m ( Σ_{k=m}^{n} C(n − m, k − m) p^(k − m) (1 − q)^(k − m) (1 − p)^(n − k) ).

After substituting i = k − m in the expression above, we get

Pr(Y = m) = C(n, m) (pq)^m ( Σ_{i=0}^{n−m} C(n − m, i) (p − pq)^i (1 − p)^(n − m − i) ).

Notice that the sum (in the parentheses) above equals (p − pq + 1 − p)^(n − m) = (1 − pq)^(n − m) by the binomial theorem. Substituting this in finally yields

Pr(Y = m) = C(n, m) (pq)^m (1 − pq)^(n − m),

and thus Y ~ B(n, pq) as desired.

Bernoulli distribution

The Bernoulli distribution is a special case of the binomial distribution, where n = 1. Symbolically, X ~ B(1, p) has the same meaning as X ~ Bernoulli(p). Conversely, any binomial distribution, B(n, p), is the distribution of the sum of n independent Bernoulli trials, Bernoulli(p), each with the same probability p.

Normal approximation

If n is large enough, then the skew of the distribution is not too great. In this case a reasonable approximation to B(n, p) is given by the normal distribution N(np, np(1 − p)), and this basic approximation can be improved in a simple way by using a suitable continuity correction. The basic approximation generally improves as n increases (at least 20) and is better when p is not near to 0 or 1. Various rules of thumb may be used to decide whether n is large enough, and p is far enough from the extremes of zero or one:

One rule is that for n > 5 the normal approximation is adequate if the absolute value of the skewness is strictly less than 1/3; that is, if

|1 − 2p| / √(np(1 − p)) = (1/√n) |√((1 − p)/p) − √(p/(1 − p))| < 1/3.

This can be made precise using the Berry–Esseen theorem.

A stronger rule states that the normal approximation is appropriate only if everything within 3 standard deviations of its mean is within the range of possible values; that is, only if

μ ± 3σ = np ± 3√(np(1 − p)) ∈ (0, n).

This 3-standard-deviation rule is equivalent to the following conditions, which also imply the first rule above:

n > 9 (1 − p)/p and n > 9 p/(1 − p).

The rule np ± 3√(np(1 − p)) ∈ (0, n) is totally equivalent to requesting that

np − 3√(np(1 − p)) > 0 and np + 3√(np(1 − p)) < n.

Moving terms around yields:

np > 3√(np(1 − p)) and n(1 − p) > 3√(np(1 − p)).

Since 0 < p < 1, we can apply the square power and divide by the respective factors np² and n(1 − p)², to obtain the desired conditions:

n > 9 (1 − p)/p and n > 9 p/(1 − p).

Notice that these conditions automatically imply that n > 9.
On the other hand, apply again the square root and divide by 3:

√n/3 > √((1 − p)/p) > 0 and √n/3 > √(p/(1 − p)) > 0.

Subtracting the second set of inequalities from the first one yields:

√n/3 > √((1 − p)/p) − √(p/(1 − p)) > −√n/3,

and so, the desired first rule is satisfied:

|√((1 − p)/p) − √(p/(1 − p))| < √n/3.

Another commonly used rule is that both values np and n(1 − p) must be greater than or equal to 5. However, the specific number varies from source to source, and depends on how good an approximation one wants. In particular, if one uses 9 instead of 5, the rule implies the results stated in the previous paragraphs. Assume that both values np and n(1 − p) are greater than 9. Since 0 < p < 1, we easily have that

np > 9 > 9(1 − p) and n(1 − p) > 9 > 9p.

We only have to divide now by the respective factors p and 1 − p, to deduce the alternative form of the 3-standard-deviation rule:

n > 9 (1 − p)/p and n > 9 p/(1 − p).

The following is an example of applying a continuity correction. Suppose one wishes to calculate Pr(X ≤ 8) for a binomial random variable X. If Y has a distribution given by the normal approximation, then Pr(X ≤ 8) is approximated by Pr(Y ≤ 8.5). The addition of 0.5 is the continuity correction; the uncorrected normal approximation gives considerably less accurate results.

This approximation, known as the de Moivre–Laplace theorem, is a huge time-saver when undertaking calculations by hand (exact calculations with large n are very onerous); historically, it was the first use of the normal distribution, introduced in Abraham de Moivre's book The Doctrine of Chances in 1738. Nowadays, it can be seen as a consequence of the central limit theorem, since B(n, p) is a sum of n independent, identically distributed Bernoulli variables with parameter p. This fact is the basis of a hypothesis test, a "proportion z-test", for the value of p using x/n, the sample proportion and estimator of p, in a common test statistic.
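The Pr(X ≤ 8) continuity-correction example above can be checked numerically. A standard-library sketch, with n = 20 and p = 0.5 chosen arbitrarily for illustration (the text does not fix them):

```python
import math

def binom_cdf(k, n, p):
    """Exact Pr(X <= k) for X ~ B(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def norm_cdf(x, mu, sigma):
    """Pr(Y <= x) for Y ~ N(mu, sigma^2), via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Illustrative values (not from the text): X ~ B(20, 0.5), so the
# approximating normal is N(np, np(1-p)) = N(10, 5).
n, p, k = 20, 0.5, 8
mu, sigma = n * p, math.sqrt(n * p * (1 - p))
exact = binom_cdf(k, n, p)
uncorrected = norm_cdf(k, mu, sigma)      # Pr(Y <= 8), no correction
corrected = norm_cdf(k + 0.5, mu, sigma)  # Pr(Y <= 8.5), continuity correction
```

With these values the corrected approximation agrees with the exact probability to about three decimal places, while the uncorrected one is off by roughly 0.07.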
If groups of n people were sampled repeatedly and truly randomly, the proportions would follow an approximate normal distribution with mean equal to the true proportion p of agreement in the population and with standard deviation √(p(1 − p)/n).

Poisson approximation

The binomial distribution converges towards the Poisson distribution as the number of trials goes to infinity while the product np remains fixed or at least p tends to zero. Therefore, the Poisson distribution with parameter λ = np can be used as an approximation to B(n, p) if n is sufficiently large and p is sufficiently small. According to two rules of thumb, this approximation is good if n ≥ 20 and p ≤ 0.05, or if n ≥ 100 and np ≤ 10. Concerning the accuracy of the Poisson approximation, see Novak, ch. 4, and references therein.

Limiting distributions

Poisson limit theorem: As n approaches ∞ and p approaches 0 with the product np held fixed, the Binomial(n, p) distribution approaches the Poisson distribution with expected value λ = np.

de Moivre–Laplace theorem: As n approaches ∞ while p remains fixed, the distribution of (X − np)/√(np(1 − p)) approaches the normal distribution with expected value 0 and variance 1. This result is sometimes loosely stated by saying that the distribution of X is asymptotically normal with expected value np and variance np(1 − p). This result is a specific case of the central limit theorem.

Beta distribution

The binomial distribution and beta distribution are different views of the same model of repeated Bernoulli trials. The binomial distribution is the PMF of k successes given n independent events, each with a probability p of success. Mathematically, when α = k + 1 and β = n − k + 1, the beta distribution and the binomial distribution are related by a factor of n + 1:

Beta(p; α, β) = (n + 1) B(k; n, p).

Beta distributions also provide a family of prior probability distributions for binomial distributions in Bayesian inference: given a uniform prior, the posterior distribution for the probability of success p, after observing k successes in n trials, is a beta distribution.
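The Poisson rule of thumb (n ≥ 100 and np ≤ 10) can be illustrated with a quick comparison of the two probability mass functions; n = 200 and p = 0.02 are arbitrary values inside that regime:

```python
import math

def binom_pmf(k, n, p):
    """Pr(X = k) for X ~ B(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """Pr(X = k) for X ~ Poisson(lam), computed in log space to
    avoid overflow in the factorial for large k."""
    return math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))

# Arbitrary values inside the rule-of-thumb regime: n >= 100, np <= 10.
n, p = 200, 0.02
lam = n * p  # Poisson parameter lambda = np = 4
max_gap = max(abs(binom_pmf(k, n, p) - poisson_pmf(k, lam)) for k in range(n + 1))
# The largest pointwise difference between the two pmfs is on the
# order of 1e-3 here, so the approximation is quite good.
```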
based on the three basic principles of experimental statistics: randomization, replication, and local control.

Research question

The research question defines the objective of a study. The research is guided by the question, so the question needs to be concise, while focusing on interesting and novel topics that may advance science and knowledge in that field. To define the way the scientific question is asked, an exhaustive literature review might be necessary, so that the research adds value to the scientific community.

Hypothesis definition

Once the aim of the study is defined, the possible answers to the research question can be proposed, transforming this question into a hypothesis. The main proposition is called the null hypothesis (H0) and is usually based on permanent knowledge about the topic or an obvious occurrence of the phenomenon, sustained by a deep literature review. We can say it is the standard expected answer for the data under the situation in test. In general, H0 assumes no association between treatments. On the other hand, the alternative hypothesis is the denial of H0. It assumes some degree of association between the treatment and the outcome. In any case, the hypothesis is grounded in the research question and its expected and unexpected answers.

As an example, consider groups of similar animals (mice, for example) under two different diet systems. The research question would be: what is the best diet? In this case, H0 would be that there is no difference between the two diets in mice metabolism (H0: μ1 = μ2), and the alternative hypothesis would be that the diets have different effects on the animals' metabolism (H1: μ1 ≠ μ2).

The hypothesis is defined by the researcher, according to his or her interest in answering the main question. Besides that, there can be more than one alternative hypothesis. It can assume not only differences across observed parameters, but also their degree of difference (i.e. greater or smaller).
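One simple way to confront a hypothesis like H0: μ1 = μ2 with data is a permutation test (a technique not covered in this text, shown only as an illustrative sketch; the measurements below are invented):

```python
import random
import statistics

# Invented weight-gain data (g) for two diet groups of mice; H0: mu1 == mu2.
diet_a = [21.3, 20.8, 22.1, 21.9, 20.5, 22.4]
diet_b = [19.8, 20.1, 19.5, 20.7, 19.9, 20.3]

observed = abs(statistics.mean(diet_a) - statistics.mean(diet_b))

# Under H0 the group labels are exchangeable: reshuffle them and count how
# often a mean difference at least as large arises by chance alone.
random.seed(0)  # fixed seed for reproducibility
pooled = diet_a + diet_b
reps, extreme = 10000, 0
for _ in range(reps):
    random.shuffle(pooled)
    a, b = pooled[:len(diet_a)], pooled[len(diet_a):]
    if abs(statistics.mean(a) - statistics.mean(b)) >= observed:
        extreme += 1
p_value = extreme / reps  # small values favour H1: mu1 != mu2
```

For this (cleanly separated) invented data the permutation p-value comes out well below the usual 0.05 threshold, so H0 would be rejected.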
Sampling

Usually, a study aims to understand the effect of a phenomenon on a population. In biology, a population is defined as all the individuals of a given species, in a specific area at a given time. In biostatistics, this concept is extended to a variety of possible collections of study. Indeed, in biostatistics, a population is not only the individuals, but can also be the total of one specific component of their organisms, such as the whole genome, all the sperm cells of an animal, or the total leaf area of a plant.

It is not possible to take measures from all the elements of a population. Because of that, the sampling process is very important for statistical inference. Sampling is defined as randomly obtaining a representative part of the entire population, in order to make posterior inferences about the population. So, the sample should capture as much of the variability across the population as possible. The sample size is determined by several things, from the scope of the research to the resources available. In clinical research, the trial type, such as non-inferiority, equivalence, or superiority, is key in determining sample size.

Experimental design

Experimental designs sustain the basic principles of experimental statistics. There are three basic experimental designs to randomly allocate treatments in all plots of the experiment: the completely randomized design, the randomized block design, and factorial designs. Treatments can be arranged in many ways inside the experiment. In agriculture, the correct experimental design is the root of a good study, and the arrangement of treatments within the study is essential because the environment largely affects the plots (plants, livestock, microorganisms). These main arrangements can be found in the literature under the names of "lattices", "incomplete blocks", "split plot", "augmented blocks", and many others. All of the designs may include control plots, determined by the researcher, to provide an error estimation during inference.
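The completely randomized design mentioned above can be sketched in a few lines: replication supplies repeated copies of each treatment, and randomization shuffles them over the plots (the treatment labels, plot count, and seed are hypothetical):

```python
import random
from collections import Counter

# Completely randomized design: 3 hypothetical treatments ("A", "B", "C"),
# 4 replicates each, allocated at random to 12 plots.
random.seed(42)                      # fixed seed for reproducibility
treatments = ["A", "B", "C"] * 4     # replication: 4 copies per treatment
random.shuffle(treatments)           # randomization: random allocation to plots
layout = {plot: t for plot, t in enumerate(treatments, start=1)}
counts = Counter(layout.values())    # each treatment still appears 4 times
```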
In clinical studies, the samples are usually smaller than in other biological studies, and in most cases the environment effect can be controlled or measured. It is common to use randomized controlled clinical trials, whose results are usually compared with observational study designs such as case–control or cohort studies.

Data collection

Data collection methods must be considered in research planning, because they highly influence the sample size and experimental design. Data collection varies according to the type of data. For qualitative data, collection can be done with structured questionnaires or by observation, considering the presence or intensity of disease and using score criteria to categorize levels of occurrence. For quantitative data, collection is done by measuring numerical information using instruments. In agriculture and biology studies, yield data and its components can be obtained by metric measures. However, pest and disease injuries in plants are obtained by observation, considering score scales for levels of damage. Especially in genetic studies, modern methods for data collection in the field and laboratory should be considered, such as high-throughput platforms for phenotyping and genotyping. These tools allow bigger experiments, making it possible to evaluate many plots in less time than human-only methods of data collection would require. Finally, all collected data of interest must be stored in an organized data frame for further analysis.

Analysis and data interpretation

Descriptive Tools

Data can be represented through tables or graphical representations, such as line charts, bar charts, histograms, and scatter plots. Also, measures of central tendency and variability can be very useful to describe an overview of the data. Some examples follow:

Frequency tables

One type of table is the frequency table, which consists of data arranged in rows and columns, where the frequency is the number of occurrences or repetitions of data.
Frequency can be:

Absolute: represents the number of times that a determined value appears;

Relative: obtained by dividing the absolute frequency by the total number of observations.

In the next example, we have the number of genes in ten operons of the same organism.

Line graph

Line graphs represent the variation of a value over another metric, such as time. In general, values are represented on the vertical axis, while the time variation is represented on the horizontal axis.

Bar chart

A bar chart is a graph that shows categorical data as bars with heights (vertical bars) or widths (horizontal bars) proportional to the values they represent. Bar charts provide an image that could also be represented in a tabular format. In the bar chart example, we have the birth rate in Brazil for the December months from 2010 to 2016. The sharp fall in December 2016 reflects the effect of the Zika virus outbreak on the birth rate in Brazil.

Histograms

The histogram (or frequency distribution) is a graphical representation of a dataset, tabulated and divided into uniform or non-uniform classes. It was first introduced by Karl Pearson.

Scatter Plot

A scatter plot is a mathematical diagram that uses Cartesian coordinates to display values of a dataset. A scatter plot shows the data as a set of points, each one presenting the value of one variable determining the position on the horizontal axis and another variable on the vertical axis. They are also called scatter graph, scatter chart, scattergram, or scatter diagram.

Mean

The arithmetic mean is the sum of a collection of values divided by the number of items in the collection.

Median

The median is the value in the middle of a dataset.

Mode

The mode is the value of a set of data that appears most often.

Box Plot

A box plot is a method for graphically depicting groups of numerical data. The maximum and minimum values are represented by the lines (whiskers), and the interquartile range (IQR) represents the middle 25–75% of the data. Outliers may be plotted as circles.
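The frequency and central-tendency measures above can be computed directly with Python's standard library; the gene counts below are invented stand-ins for the ten-operon example mentioned in the text:

```python
import statistics
from collections import Counter

# Invented stand-in data: number of genes in ten operons of one organism.
genes = [3, 2, 3, 4, 2, 3, 5, 2, 3, 4]

absolute = Counter(genes)                                    # absolute frequencies
relative = {v: c / len(genes) for v, c in absolute.items()}  # relative frequencies

mean = statistics.mean(genes)      # arithmetic mean: sum / count
median = statistics.median(genes)  # middle value of the sorted data
mode = statistics.mode(genes)      # most frequent value
```

Here the value 3 occurs four times (absolute frequency 4, relative frequency 0.4), and the relative frequencies sum to 1.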
Correlation Coefficients

Although correlations between two different kinds of data can be inferred from graphs, such as a scatter plot, it is necessary to validate this through numerical information. For this reason, correlation coefficients are required. They provide a numerical value that reflects the strength of an association.

Pearson Correlation Coefficient

The Pearson correlation coefficient is a measure of association between two variables, X and Y. This coefficient, usually represented by ρ (rho) for the population and r for the sample, assumes values between −1 and 1, where ρ = 1 represents a perfect positive correlation, ρ = −1 represents a perfect negative correlation, and ρ = 0 indicates no linear correlation.

Inferential Statistics

Inferential statistics is used to make inferences about an unknown population, by estimation and/or hypothesis testing. In other words, it is desirable to obtain parameters that describe the population of interest, but since the data are limited, it is necessary to make use of a representative sample in order to estimate them. With that, it is possible to test previously defined hypotheses and apply the conclusions to the entire population. The standard error of the mean is a measure of variability that is crucial for making inferences.

Hypothesis testing

Hypothesis testing is essential for making inferences about populations, aiming to answer research questions, as settled in the "Research planning" section. Authors have defined four steps to be set:

The hypothesis to be tested: as stated earlier, we have to work with the definition of a null hypothesis (H0), which is going to be tested, and an alternative hypothesis. But they must be defined before the experiment implementation.

Significance level and decision rule: a decision rule depends on the level of significance, or, in other words, the acceptable error rate (α). One can think of it as a predefined critical value that determines statistical significance when a test statistic is compared with it.
So, α also has to be predefined before the experiment. Experiment and statistical analysis: This is when the experiment is actually implemented following the appropriate experimental design, data are collected and the most suitable statistical tests are applied. Inference: This is made when the null hypothesis is rejected or not rejected, based on the evidence that the comparison of the p-value and α brings. Note that the failure to reject H0 just means that there is not enough evidence to support its rejection, not that the hypothesis is true. Confidence intervals A confidence interval is a range of values that contains the true parameter value with a given level of confidence. The first step is to compute the best unbiased estimate of the population parameter. The upper limit of the interval is obtained by adding to this estimate the product of the standard error of the mean and the critical value corresponding to the confidence level. The lower limit is calculated similarly, but with a subtraction instead of an addition. Statistical considerations Power and statistical error When testing a hypothesis, two types of statistical error are possible: Type I error and Type II error. The Type I error, or false positive, is the incorrect rejection of a true null hypothesis, and the Type II error, or false negative, is the failure to reject a false null hypothesis. The significance level, denoted by α, is the Type I error rate and should be chosen before performing the test. The Type II error rate is denoted by β, and the statistical power of the test is 1 − β. p-value The p-value is the probability of obtaining results as extreme as or more extreme than those observed, assuming the null hypothesis (H0) is true. It is also called the calculated probability. It is common to confuse the p-value with the significance level (α), but α is a predefined threshold for declaring a result significant. If p is less than α, the null hypothesis (H0) is rejected.
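As a minimal illustration of the decision rule, the p-value comparison and the confidence-interval construction, here is a standard-library Python sketch using a large-sample two-sample z-test. The measurements are invented, and 1.96 is the critical value for 95% confidence; for small samples a t-test would normally be preferred:

```python
# Two-sample z-test and 95% confidence interval (large-sample sketch).
# The measurements are invented for illustration (e.g. weight gain under two diets).
from math import erf, sqrt
from statistics import mean, stdev

def normal_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

group1 = [20.1, 21.3, 19.8, 22.0, 20.6, 21.1, 20.9, 21.7]
group2 = [18.2, 19.0, 18.7, 19.5, 18.9, 19.3, 18.5, 19.1]

# Standard error of the difference in means.
se = sqrt(stdev(group1) ** 2 / len(group1) + stdev(group2) ** 2 / len(group2))
z = (mean(group1) - mean(group2)) / se   # test statistic
p_value = 2 * (1 - normal_cdf(abs(z)))   # two-sided p-value

alpha = 0.05                 # predefined significance level
reject_h0 = p_value < alpha  # decision rule: reject H0 when p < alpha

# 95% CI for the difference: estimate +/- critical value * standard error.
diff = mean(group1) - mean(group2)
ci = (diff - 1.96 * se, diff + 1.96 * se)
```

Because the interval for the difference in means excludes zero, the conclusion agrees with the test: the null hypothesis of equal means is rejected at the 5% level.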
Multiple testing In multiple tests of the same hypothesis, the probability of false positives (the familywise error rate) increases, and strategies are used to control this. This is commonly achieved by using a more stringent threshold to reject null hypotheses. The Bonferroni correction defines an acceptable global significance level, denoted by α*, and each test is individually compared with α = α*/m. This ensures that the familywise error rate across all m tests is less than or equal to α*. When m is large, the Bonferroni correction may be overly conservative. An alternative is to control the false discovery rate (FDR). The FDR controls the expected proportion of the rejected null hypotheses (the so-called discoveries) that are false (incorrect rejections). This procedure ensures that, for independent tests, the false discovery rate is at most a chosen level q*. Thus, FDR control is less conservative than the Bonferroni correction and has more power, at the cost of more false positives. Mis-specification and robustness checks The main hypothesis being tested (e.g., no association between treatments and outcomes) is often accompanied by other technical assumptions (e.g., about the form of the probability distribution of the outcomes) that are also part of the null hypothesis. When the technical assumptions are violated in practice, the null may be frequently rejected even if the main hypothesis is true. Such rejections are said to be due to model mis-specification. Verifying that the outcome of a statistical test does not change when the technical assumptions are slightly altered (so-called robustness checks) is the main way of combating mis-specification. Model selection criteria Model selection criteria choose the model that best approximates the true model. Akaike's Information Criterion (AIC) and the Bayesian Information Criterion (BIC) are examples of asymptotically efficient criteria.
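Both corrections can be sketched in a few lines of Python; the p-values below are invented, and the Benjamini–Hochberg step-up procedure is used here as the usual way of controlling the FDR at a level q*:

```python
# Multiple-testing corrections: Bonferroni and Benjamini-Hochberg (FDR).
# The p-values are invented for illustration.
def bonferroni(p_values, alpha_star=0.05):
    """Reject H0 for each test whose p-value is below alpha*/m."""
    m = len(p_values)
    return [p < alpha_star / m for p in p_values]

def benjamini_hochberg(p_values, q_star=0.05):
    """Step-up procedure: reject the k smallest p-values, where k is the
    largest rank with p_(k) <= (k/m) * q* (assumes independent tests)."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank / m * q_star:
            k = rank
    rejected = [False] * m
    for i in order[:k]:
        rejected[i] = True
    return rejected

pvals = [0.001, 0.010, 0.012, 0.041, 0.27, 0.60]
print(bonferroni(pvals))          # only 0.001 survives the 0.05/6 threshold
print(benjamini_hochberg(pvals))  # the three smallest p-values are rejected
```

With the same input, Bonferroni makes one discovery while Benjamini–Hochberg makes three, illustrating that FDR control is the less conservative of the two.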
Developments and Big Data Recent developments have made a large impact on biostatistics. Two important changes have been the ability to collect data on a high-throughput scale, and the ability to perform much more complex analyses using computational techniques. This comes from developments in areas such as sequencing technologies, bioinformatics and machine learning. Use in high-throughput data New biomedical technologies like microarrays, next-generation sequencers (for genomics) and mass spectrometry (for proteomics) generate enormous amounts of data, allowing many tests to be performed simultaneously. Careful analysis with biostatistical methods is required to separate the signal from the noise. For example, a microarray could be used to measure many thousands of genes simultaneously, determining which of them have different expression in diseased cells compared to normal cells. However, only a fraction of genes will be differentially expressed. Multicollinearity often occurs in high-throughput biostatistical settings. Due to high intercorrelation between the predictors (such as gene expression levels), the information of one predictor might be contained in another one. It could be that only 5% of the predictors are responsible for 90% of the variability of the response. In such a case, one could apply the biostatistical technique of dimension reduction (for example via principal component analysis). Classical statistical techniques like linear or logistic regression and linear discriminant analysis do not work well for high-dimensional data (i.e. when the number of observations n is smaller than the number of features p). Research planning A research plan typically includes the research question, the hypothesis to be tested, the experimental design, data collection methods, data analysis perspectives and the costs involved. It is essential to carry out the study based on the three basic principles of experimental statistics: randomization, replication, and local control. Research question The research question will define the objective of a study.
The research will be guided by the question, so it needs to be concise while remaining focused on interesting and novel topics that may improve science and knowledge in that field. To define the way the scientific question is asked, an exhaustive literature review might be necessary, so that the research can add value to the scientific community. Hypothesis definition Once the aim of the study is defined, the possible answers to the research question can be proposed, transforming this question into a hypothesis. The main proposal is called the null hypothesis (H0) and is usually based on established knowledge about the topic or an obvious occurrence of the phenomenon, supported by a deep literature review. We can say it is the standard expected answer for the data under the situation being tested. In general, H0 assumes no association between treatments. On the other hand, the alternative hypothesis is the denial of H0. It assumes some degree of association between the treatment and the outcome. In either case, the hypothesis is grounded in the research question and its expected and unexpected answers. As an example, consider groups of similar animals (mice, for example) under two different diet systems. The research question would be: what is the best diet? In this case, H0 would be that there is no difference between the two diets with respect to mouse metabolism (H0: μ1 = μ2) and the alternative hypothesis would be that the diets have different effects on the animals' metabolism (H1: μ1 ≠ μ2). The hypothesis is defined by the researcher, according to his or her interest in answering the main question. Besides that, there can be more than one alternative hypothesis. It can assume not only differences across observed parameters, but also their degree of difference (i.e. higher or lower). Sampling Usually, a study aims to understand the effect of a phenomenon on a population.
In biology, a population is defined as all the individuals of a given species, in a specific area at a given time. In biostatistics, this concept is extended to a variety of possible collections of study. Indeed, in biostatistics, a population is not only the individuals, but can also be the total of one specific component of their organisms, such as the whole genome, or all the sperm cells for animals, or the total leaf area for a plant, for example. It is not possible to take measures from all the elements of a population. Because of that, the sampling process is very important for statistical inference. Sampling is defined as randomly obtaining a representative part of the entire population, in order to make posterior inferences about the population. So, the sample should capture the variability across the population. The sample size is determined by several factors, from the scope of the research to the resources available. In clinical research, the trial type (non-inferiority, equivalence, or superiority) is key in determining sample size. Experimental design Experimental designs uphold the basic principles of experimental statistics. There are three basic experimental designs for randomly allocating treatments to all plots of the experiment: the completely randomized design, the randomized block design, and factorial designs. Treatments can be arranged in many ways inside the experiment. In agriculture, the correct experimental design is the root of a good study, and the arrangement of treatments within the study is essential because the environment largely affects the plots (plants, livestock, microorganisms). These main arrangements can be found in the literature under the names of "lattices", "incomplete blocks", "split plot", "augmented blocks", and many others. All of the designs might include control plots, determined by the researcher, to provide an error estimate during inference.
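A minimal sketch of the completely randomized design, the simplest of the three; the treatment labels, replicate count and seed are illustrative:

```python
# Completely randomized design: every plot has the same chance of receiving
# any treatment (randomization), and each treatment appears in several
# plots (replication). Treatment labels and counts are illustrative.
import random

def completely_randomized_design(treatments, replicates, seed=None):
    """Return a randomly ordered list of plot-to-treatment assignments."""
    layout = [t for t in treatments for _ in range(replicates)]
    random.Random(seed).shuffle(layout)
    return layout

plots = completely_randomized_design(["control", "diet A", "diet B"], replicates=4, seed=42)
print(plots)  # 12 plots in random order, exactly 4 per treatment
```

A randomized block design would instead perform the same shuffle separately within each block (for example, within each field section), so that every treatment appears in every block.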
In clinical studies, the samples are usually smaller than in other biological studies, and in most cases the environmental effect can be controlled or measured. It is common to use randomized controlled clinical trials, whose results are usually compared with observational study designs such as case–control or cohort studies. Data collection Data collection methods must be considered in research planning, because they highly influence the sample size and experimental design. Data collection varies according to the type of data. For qualitative data, collection can be done with structured questionnaires or by observation, considering presence or intensity of disease and using score criteria to categorize levels of occurrence. For quantitative data, collection is done by measuring numerical information using instruments. In agriculture and biology studies, yield data and its components can be obtained by metric measures. However, pest and disease injuries in plants are obtained by observation, considering score scales for levels of damage. Especially in genetic studies, modern methods for data collection in the field and laboratory should be considered, such as high-throughput platforms for phenotyping and genotyping. These tools allow bigger experiments, while making it possible to evaluate many plots in less time than a human-only method of data collection. Finally, all collected data of interest must be stored in an organized data frame for further analysis. Analysis and data interpretation Descriptive tools Data can be represented through tables or graphical representations, such as line charts, bar charts, histograms and scatter plots. Also, measures of central tendency and variability can be very useful for describing an overview of the data. Some examples follow: Frequency tables One type of table is the frequency table, which consists of data arranged in rows and columns, where the frequency is the number of occurrences or repetitions of data.
rulers Astyages Darius III Others Baruch Tobit Judith Susanna New Testament Jesus and his relatives Jesus Mary, mother of Jesus Joseph Brothers of Jesus James (often identified with James, son of Alphaeus) Joseph (Joses) Judas (Jude) (often identified with Thaddeus) Simon Mary of Clopas Cleopas (often identified with Alphaeus and Clopas) Apostles of Jesus The Thirteen: Peter (a.k.a. Simon or Cephas) Andrew (Simon Peter's brother) James, son of Zebedee John, son of Zebedee Philip Bartholomew also known as "Nathanael" Thomas also known as "Doubting Thomas" Matthew also known as "Levi" James, son of Alphaeus Judas, son of James (a.k.a. Thaddeus or Lebbaeus) Simon the Zealot Judas Iscariot (the traitor) Matthias Others: Paul Barnabas Mary Magdalene (the one who discovered Jesus’ empty tomb) Priests Caiaphas, high priest Annas, first high priest of Roman Judea Zechariah, father of John the Baptist Prophets Agabus Anna Simeon John the Baptist Other believers Apollos Aquila Dionysius the Areopagite Epaphras, fellow prisoner of Paul, fellow worker John Mark (often identified with Mark) Joseph of Arimathea Lazarus Luke Mark Martha Mary Magdalene Mary, sister of Martha Nicodemus Onesimus Philemon Priscilla Silas Sopater Stephen, first martyr Timothy Titus Secular rulers Agrippa I, called "King Herod" or "Herod" in Acts 12 Felix, governor of Judea who was present at the trial of Paul, and his wife Drusilla in Acts 24:24 Herod Agrippa II, king over several territories, before whom Paul made his defense.
caused some controversy. The convention in rugby is for the home side to accommodate its guests when there is a clash of kit. The New Zealand side, by then already synonymous with the appellation "All Blacks", had an all black kit that clashed with the Lions' blue. After much reluctance and debate, New Zealand agreed to change for the Tests, playing in all white for the first time. On the 1930 tour a delegation led by the Irish lock George Beamish expressed its displeasure that, while the blue of Scotland, the white of England and the red of Wales were represented in the strip, there was no green for Ireland. A green flash was added to the socks, which from 1938 became a green turnover (although on blue socks, thus eliminating red from the kit), and that has remained a feature of the strip ever since. In 1936, the four-quartered badge returned for the tour to Argentina and has remained on the kits ever since, but other than that the strip remained the same. Red jerseys The adoption of the red jersey happened on the 1950 tour. A return to New Zealand was accompanied by a desire to avoid the controversy of 1930, and so red replaced blue for the jersey, with the resultant kit being that which is still worn today: the combination of red jersey, white shorts, and green and blue socks, representing the four unions. The only additions to the strip since 1950 began appearing in 1993, with kit suppliers' logos in prominent positions. Umbro had in 1989 asked for "maximum brand exposure whenever possible" but this did not affect the kit's appearance. Since then, Nike, Adidas and Canterbury have had more overt branding on the shirts, with sponsors Scottish Provident (1997), NTL (2001), Zurich (2005), HSBC (2009 and 2013), Standard Life Investments (2017) and Vodafone (2021). Jersey evolution Current squad History 1888–1909 The earliest tours date back to 1888, when a 21-man squad visited Australia and New Zealand.
The squad drew players from England, Scotland and Wales, though English players predominated. The 35-match tour of the two host nations included no tests, but the side played provincial, city and academic sides, winning 27 matches. They played 19 games of Australian rules football, against prominent clubs in Victoria and South Australia, winning six and drawing one of these (see Australian rules football in England). The first tour, although unsanctioned by rugby bodies, established the concept of Northern Hemisphere sporting sides touring the Southern Hemisphere. Three years after the first tour, the Western Province union invited rugby bodies in Britain to tour South Africa. Some saw the 1891 team – the first sanctioned by the Rugby Football Union – as the English national team, though others referred to it as "the British Isles". The tourists played a total of twenty matches, three of them tests against a South Africa side (South Africa did not exist as a political unit in 1891), winning all three. In a notable event of the tour, the touring side presented the Currie Cup to Griqualand West, the province they thought produced the best performance on the tour. Five years later a British Isles side returned to South Africa. They played one extra match on this tour, making a total of 21 games, including four tests against South Africa, with the British Isles winning three of them. The squad had a notable Irish orientation, with the Ireland national team contributing six players to the 21-man squad. In 1899 the British Isles touring side returned to Australia for the first time since the unofficial tour of 1888. For the first time, the squad of 23 included players from each of the home nations. The team again participated in 21 matches, playing state teams as well as northern Queensland sides and Victorian teams. A four-test series took place against Australia, the tourists winning three out of the four.
The team returned via Hawaii and Canada, playing additional games en route. Four years later, in 1903, the British Isles team returned to South Africa. The side's early performances proved disappointing from the tourists' point of view, with defeats in its opening three matches by Western Province sides in Cape Town. From then on the team experienced mixed results, though with more wins than losses. The side lost the test series to South Africa, drawing twice, with the South Africans winning the decider 8 to nil. No more than twelve months passed before the British Isles team ventured to Australia and New Zealand in 1904. The tourists devastated the Australian teams, winning every single game. Australia also lost all three tests to the visitors, even being held to a standstill in two of the three games. Though the New Zealand leg of the tour was brief in comparison with the Australian leg, the British Isles experienced considerable difficulty across the Tasman after whitewashing the Australians. The team managed two early wins before losing the test to New Zealand, winning only one more game and drawing once. Despite their difficulties in New Zealand, the tour proved a raging success on the field for the British Isles. In 1908, another tour took place to Australia and New Zealand. In a reversal of previous practice, the planners allocated more matches in New Zealand than in Australia: perhaps the strength of the New Zealand teams and the heavy defeats of all Australian teams on the previous tour influenced this decision. Some commentators thought that this tour hoped to reach out to rugby communities in Australia, as rugby league had (infamously) started in Australia in 1908. The Anglo-Welsh side (the Irish and Scottish unions did not participate) performed well in all the non-test matches, but drew a test against New Zealand and lost the other two.
1910–1949 Visits that took place before the 1910 South Africa tour (the first selected by a committee from the four Home Unions) had enjoyed a growing degree of support from the authorities, although only one of these included representatives of all four nations. The 1910 tour to South Africa marked the official beginning of British Isles rugby tours: the inaugural tour operating under all four unions. The team performed moderately against the non-test teams, claiming victories in just over half their matches, and the test series went to South Africa, who won two of the three games. A side managed by Oxford University — supposedly the England rugby team, but actually including three Scottish players — toured Argentina at the time: the people of Argentina termed it the "Combined British". The next British Isles team tour did not take place until 1924, again in South Africa. The team, led by Ronald Cove-Smith, struggled with injuries and lost three of the four test matches, drawing the other 3–3. In total, 21 games were played, with the touring side winning 9, drawing 3 and losing 9. In 1927 a short, nine-game series took place in Argentina, with the British Isles winning all nine encounters, and the tour was a financial success for Argentine rugby. The Lions returned to New Zealand in 1930 with some success. The Lions won all of their games that did not have test status except for the matches against Auckland, Wellington and Canterbury, but they lost three of their four test matches against New Zealand, winning the first test 6–3. The side also visited Australia, losing a test but winning five out of the six non-test games. In 1936 the British Isles visited Argentina for the third time, winning all ten of their matches. 1960–1969 After the glittering decade of the 1950s, the first tour of the 1960s proved not nearly as successful as previous ones.
The 1962 tour to South Africa saw the Lions still win 16 of their 25 games, but they did not fare well against the Springboks, losing three of the four tests. For the 1966 tour to Australia and New Zealand, John Robins became the first Lions coach, and the trip started off very well for the Lions, who stormed through Australia, winning five non-tests and drawing one, and defeating Australia in two tests. The Lions experienced mixed results during the New Zealand leg of the tour and lost all of the tests against New Zealand. The Lions also played a test against Canada on their way home, winning 19 to 8 in Toronto. The 1968 tour of South Africa saw the Lions win 15 of their 16 provincial matches, but the team lost three tests against the Springboks and drew one. 1970–1979 The 1970s saw a renaissance for the Lions. The 1971 British Lions tour to New Zealand and Australia, centred around the skilled Welsh half-back pairing of Gareth Edwards and Barry John, secured a series win over New Zealand. The tour started with a loss to Queensland, but the side proceeded to storm through the next provincial fixtures, winning 11 games in a row. The Lions then went on to defeat New Zealand in Dunedin. The Lions lost only one match on the rest of the tour and won the test series against New Zealand, winning and drawing the last two games, to take the series two wins to one. The 1974 British Lions tour to South Africa produced one of the best-known and most successful Lions teams. Apartheid concerns meant some players declined the tour. Nonetheless, led by the esteemed Irish forward Willie John McBride, the tourists went through 22 games unbeaten and triumphed 3–0 (with one drawn) in the test series. The series featured considerable violence. The management of the Lions concluded that the Springboks dominated their opponents with physical aggression.
At that time, test match referees came from the home nation, substitutions took place only if a doctor found a player unable to continue, and there were no video cameras or sideline officials to prevent violent play. The Lions decided "to get their retaliation in first" with the infamous "99 call". The Lions postulated that a South African referee would probably not send off all of the Lions if they all retaliated against "blatant thuggery". Famous video footage of the 'battle of Boet Erasmus Stadium' shows JPR Williams running over half of the pitch and launching himself at Van Heerden after such a call. The 1977 British Lions tour to New Zealand saw the Lions drop only one non-test out of 21 games, a loss to a Universities side. The team did not win the test series, though, winning one game but losing the other three. In August 1977 the British Lions made a stopover in Fiji on the way home from their tour of New Zealand. Fiji beat them 25–21 at Buckhurst Park, Suva. 1980–1989 The Lions toured South Africa in 1980 and completed a flawless non-test record, winning 14 out of 14 matches. The Lions lost the first three tests to South Africa, winning only the last one, once the Springboks were already guaranteed the series. The 1983 tour to New Zealand saw the team successful in the non-test games, winning all but two, but whitewashed in the test series against New Zealand. A Lions tour to South Africa was anticipated in 1986, but the invitation was never accepted because of controversy surrounding Apartheid, and the tour did not go ahead. The Lions did not return to South Africa until 1997, after the Apartheid era. A Lions team was selected in April 1986 for the International Rugby Board centenary match against 'The Rest'. The team was organised by the Four Home Unions Committee and the players were given the status of official British Lions. The Lions tour to Australia in 1989 was a shorter affair, comprising only 12 matches in total.
The tour was very successful for the Lions, who won all eight non-test matches and won the test series against Australia, two to one. 1990–1999 The tour to New Zealand in 1993 was the last of the amateur era. The Lions won six and lost four non-test matches, and lost the test series 2–1. The tour to South Africa in 1997 was a success for the Lions, who completed the tour with only two losses, and won the test series 2–1. 2000–2009 In 2001, the ten-game tour to Australia saw the Wallabies win the test series 2–1. This series saw the first award of the Tom Richards Trophy. On the 2005 tour to New Zealand, coached by Clive Woodward, the Lions won seven games against provincial teams, were defeated by the New Zealand Maori team, and suffered heavy defeats in all three tests. In 2009, the Lions toured South Africa. There they faced the reigning World Cup winners, with Ian McGeechan leading a coaching team including Warren Gatland, Shaun Edwards and Rob Howley. The Lions were captained by Irish lock Paul O'Connell. The initial Lions selection consisted of fourteen Irish players, thirteen Welsh, eight English and two Scots in the 37-man squad. In the first Test on 20 June, they lost 26–21, and lost the series in the second, 28–25, in a tightly fought game at Loftus Versfeld on 27 June. The Lions won the third Test 28–9 at Ellis Park, and the series finished 2–1 to South Africa. 2010–2019 During June 2013 the British & Irish Lions toured Australia. Former Scotland and Lions full-back Andy Irvine was appointed as tour manager in 2010. Wales head coach Warren Gatland was the Lions' head coach, and their tour captain was Sam Warburton. The tour started in Hong Kong with a match against the Barbarians before moving on to Australia for the main tour featuring six provincial matches and three tests. The Lions won all but one of the non-test matches, losing to the Brumbies 14–12 on 18 June.
The first test followed shortly afterwards and saw the Lions go 1-up over Australia, winning 23–21. Australia had a chance to take the win in the final moments of the game, but Kurtley Beale missed a penalty and the Lions held on. The Wallabies drew the series in the second test, winning 16–15, though the Lions would have stolen the win but for a missed penalty by Leigh Halfpenny. With tour captain Warburton out of the final test due to injury, Alun Wyn Jones took over the captaincy for the final test in Sydney. The Lions won the final test in record fashion, 41–16, to earn their first series win since 1997 and their first over Australia since 1989. Following his winning tour of Australia in 2013, Warren Gatland was reappointed as Lions head coach for the tour to New Zealand in June and July 2017. In April 2016, it was announced that the side would again be captained by Sam Warburton. The touring schedule included 10 games: an opening game against the Provincial Barbarians, challenge matches against all five of New Zealand's Super Rugby sides, a match against the Māori All Blacks and three tests against New Zealand. The Lions defeated the Provincial Barbarians in the first game of the tour, before being beaten by the Blues three days later. The team recovered to beat the Crusaders, but this was followed by another midweek loss, this time against the Highlanders. The Lions then faced the Māori All Blacks, winning comfortably to restore optimism, and followed up with their first midweek victory of the tour against the Chiefs. On 24 June, the Lions, captained by Peter O'Mahony, faced New Zealand at Eden Park in the first Test and were beaten 30–15. This was followed by the final midweek game of the tour, a draw against the Hurricanes. For the second Test, Gatland recalled Warburton to the starting team as captain.
At Wellington Regional Stadium, the Lions beat a 14-man New Zealand side 24–21 after Sonny Bill Williams was red-carded in the 24th minute for a shoulder charge on Anthony Watson. The win, which ended New Zealand's 47-game winning run at home, tied the series going into the final game.
a Guitar, usually with four heavy strings tuned E1'–A1'–D2–G2." It also defines bass as "Bass (iv). A contraction of Double bass or Electric bass guitar." According to some authors the proper term is "electric bass". Common names for the instrument are "bass guitar", "electric bass guitar", and "electric bass", and some authors claim that these are historically accurate. As the electric alternative to a double bass (which is not a guitar), many manufacturers such as Fender list the instrument in the electric bass category rather than the guitar category. Like the double bass, the bass guitar is a transposing instrument: it is notated in bass clef an octave higher than it sounds, to reduce the need for ledger lines in music written for the instrument and to simplify reading. History 1930s In the 1930s, musician and inventor Paul Tutmarc of Seattle, Washington, developed the first electric bass guitar in its modern form, a fretted instrument designed to be played horizontally. The 1935 sales catalog for Tutmarc's company Audiovox featured his "Model 736 Bass Fiddle", a solid-bodied electric bass guitar with four strings, a 30½-inch scale length, and a single pickup. Around 100 were made during this period. Audiovox also sold their "Model 236" bass amplifier. 1950s In the 1950s, Leo Fender and George Fullerton developed the first mass-produced electric bass guitar. The Fender Electric Instrument Manufacturing Company began producing the Precision Bass, or P-Bass, in October 1951. The design featured a simple uncontoured "slab" body and a single-coil pickup similar to that of a Telecaster. By 1957 the Precision more closely resembled the Fender Stratocaster, with the body edges beveled for comfort, and the pickup was changed to a split-coil design. The Fender Bass was a revolutionary instrument for gigging musicians. In comparison with the large, heavy upright bass, which had been the main bass instrument in popular music from the early 20th century to the 1940s, the bass guitar could be easily transported to shows. When amplified, the bass guitar was also less prone than acoustic basses to unwanted audio feedback. The addition of frets enabled bassists to play in tune more easily than on fretless acoustic or electric upright basses, and allowed guitarists to transition to the instrument more easily. In 1953, Monk Montgomery became the first bassist to tour with the Fender bass, in Lionel Hampton's postwar big band. Montgomery was also possibly the first to record with the electric bass, on July 2, 1953, with the Art Farmer Septet. Roy Johnson (with Lionel Hampton) and Shifty Henry (with Louis Jordan and His Tympany Five) were other early Fender bass pioneers. Bill Black, who played with Elvis Presley, switched from upright bass to the Fender Precision Bass around 1957. The bass guitar was intended to appeal to guitarists as well as upright bass players, and many early pioneers of the instrument, such as Carol Kaye, Joe Osborn, and Paul McCartney, were originally guitarists. Also in 1953, Gibson released the first short-scale violin-shaped electric bass, the EB-1, with an extendable end pin so a bassist could play it upright or horizontally. In 1958, Gibson released the maple arched-top EB-2, described in the Gibson catalog as a "hollow-body electric bass that features a Bass/Baritone pushbutton for two different tonal characteristics". In 1959, these were followed by the more conventional-looking EB-0 Bass. The EB-0 was very similar to a Gibson SG in appearance (although the earliest examples have a slab-sided body shape closer to that of the double-cutaway Les Paul Special). The Fender and Gibson versions used bolt-on and set necks, respectively. Several other companies also began manufacturing bass guitars during the 1950s. In 1956, the distinctive Höfner 500/1 violin-shaped bass appeared at the German trade fair "Musikmesse Frankfurt"; it was made using violin construction techniques by Walter Höfner, a second-generation violin luthier. Due to its use by Paul McCartney, it became known as the "Beatle bass".
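The standard four-string tuning mentioned above (E1–A1–D2–G2) and the octave transposition can be checked numerically. A minimal sketch, assuming equal temperament with A4 = 440 Hz and MIDI note numbers (the names here are illustrative, not from any particular library):

```python
def frequency(midi_note: int) -> float:
    """Equal-temperament frequency in Hz, with A4 (MIDI note 69) at 440 Hz."""
    return 440.0 * 2 ** ((midi_note - 69) / 12)

# Sounding pitches of the four open strings: E1, A1, D2, G2.
STRINGS = {"E1": 28, "A1": 33, "D2": 38, "G2": 43}

for name, note in STRINGS.items():
    # The part is notated an octave (12 semitones) above the sounding pitch.
    print(f"{name}: sounds at {frequency(note):.1f} Hz, written as MIDI note {note + 12}")
```

A1, for example, comes out at exactly 55 Hz, three octaves below the 440 Hz reference.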
Commercial Athletic Association, which was tightly controlled by the Basketball Association of the Philippines (now defunct), the then FIBA-recognized national association. Nine teams from the MICAA participated in the league's first season, which opened on April 9, 1975. The NBL is Australia's pre-eminent men's professional basketball league. The league commenced in 1979, playing a winter season (April–September) and did so until the completion of the 20th season in 1998. The 1998–99 season, which commenced only months later, was the first season after the shift to the current summer season format (October–April). This shift was an attempt to avoid competing directly against Australia's various football codes. It features eight teams from around Australia and one in New Zealand. A few players, including Luc Longley, Andrew Gaze, Shane Heal, Chris Anstey and Andrew Bogut, made it big internationally, becoming poster figures for the sport in Australia. The Women's National Basketball League began in 1981. Women's basketball Women's basketball began in 1892 at Smith College when Senda Berenson, a physical education teacher, modified Naismith's rules for women. Shortly after she was hired at Smith, she went to Naismith to learn more about the game. Fascinated by the new sport and the values it could teach, she organized the first women's collegiate basketball game on March 21, 1893, when her Smith freshmen and sophomores played against one another. However, the first women's interinstitutional game was played in 1892 between the University of California and Miss Head's School. Berenson's rules were first published in 1899, and two years later she became the editor of A. G. Spalding's first Women's Basketball Guide. Also in 1893, Mount Holyoke and Sophie Newcomb College (coached by Clara Gregory Baer) women began playing basketball.
By 1895, the game had spread to colleges across the country, including Wellesley, Vassar, and Bryn Mawr. The first intercollegiate women's game was on April 4, 1896. Stanford women played Berkeley, 9-on-9, ending in a 2–1 Stanford victory. Women's basketball development was more structured than that for men in the early years. In 1905, the executive committee on Basket Ball Rules (National Women's Basketball Committee) was created by the American Physical Education Association. These rules called for six to nine players per team and 11 officials. The International Women's Sports Federation (1924) included a women's basketball competition. 37 women's high school varsity basketball or state tournaments were held by 1925. And in 1926, the Amateur Athletic Union backed the first national women's basketball championship, complete with men's rules. The Edmonton Grads, a touring Canadian women's team based in Edmonton, Alberta, operated between 1915 and 1940. The Grads toured all over North America, and were exceptionally successful. They posted a record of 522 wins and only 20 losses over that span, as they met any team that wanted to challenge them, funding their tours from gate receipts. The Grads also shone on several exhibition trips to Europe, and won four consecutive exhibition Olympics tournaments, in 1924, 1928, 1932, and 1936; however, women's basketball was not an official Olympic sport until 1976. The Grads' players were unpaid, and had to remain single. The Grads' style focused on team play, without overly emphasizing skills of individual players. The first women's AAU All-America team was chosen in 1929. Women's industrial leagues sprang up throughout the United States, producing famous athletes, including Babe Didrikson of the Golden Cyclones, and the All American Red Heads Team, which competed against men's teams, using men's rules. By 1938, the women's national championship changed from a three-court game to two-court game with six players per team. 
The NBA-backed Women's National Basketball Association (WNBA) began in 1997. Though it had shaky attendance figures, several marquee players (Lisa Leslie, Diana Taurasi, and Candace Parker among others) have helped the league's popularity and level of competition. Other professional women's basketball leagues in the United States, such as the American Basketball League (1996–98), have folded in part because of the popularity of the WNBA. The WNBA has been looked at by many as a niche league. However, the league has recently taken steps forward. In June 2007, the WNBA signed a contract extension with ESPN. The new television deal ran from 2009 to 2016. Along with this deal came the first-ever rights fees to be paid to a women's professional sports league. Over the eight years of the contract, "millions and millions of dollars" were "dispersed to the league's teams." In a March 12, 2009 article, NBA commissioner David Stern said that in the bad economy, "the NBA is far less profitable than the WNBA. We're losing a lot of money among a large number of teams. We're budgeting the WNBA to break even this year." Rules and regulations Measurements and time limits discussed in this section often vary among tournaments and organizations; international and NBA rules are used here. The object of the game is to outscore one's opponents by throwing the ball through the opponents' basket from above while preventing the opponents from doing so on their own. An attempt to score in this way is called a shot. A successful shot is worth two points, or three points if it is taken from beyond the three-point arc, which is 6.75 meters (22 ft 2 in) from the basket in international games and 23 feet 9 inches (7.24 m) in NBA games. A one-point shot can be earned when shooting from the foul line after a foul is made. After a team has scored from a field goal or free throw, play is resumed with a throw-in awarded to the non-scoring team, taken from a point beyond the endline of the court where the point(s) were scored.
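The scoring values just described can be summarized in a short sketch (the shot-type labels are illustrative, not official rulebook terms):

```python
def points_for_shot(shot_type: str) -> int:
    """Points awarded for a successful shot, per the rules above."""
    values = {
        "free_throw": 1,     # unopposed shot from the foul line after a foul
        "field_goal": 2,     # made from inside the three-point arc
        "three_pointer": 3,  # taken from beyond the three-point arc
    }
    return values[shot_type]
```

Note that "field_goal" is used loosely here for a two-point shot; in the rulebooks a three-pointer is also a field goal.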
Playing regulations Games are played in four quarters of 10 (FIBA) or 12 minutes (NBA). College men's games use two 20-minute halves, college women's games use 10-minute quarters, and most United States high school varsity games use 8-minute quarters; however, this varies from state to state. 15 minutes are allowed for a half-time break under FIBA, NBA, and NCAA rules and 10 minutes in United States high schools. Overtime periods are five minutes in length except for high school, which is four minutes in length. Teams exchange baskets for the second half. The time allowed is actual playing time; the clock is stopped while the play is not active. Therefore, games generally take much longer to complete than the allotted game time, typically about two hours. Five players from each team may be on the court at one time. Substitutions are unlimited but can only be done when play is stopped. Teams also have a coach, who oversees the development and strategies of the team, and other team personnel such as assistant coaches, managers, statisticians, doctors and trainers. For both men's and women's teams, a standard uniform consists of a pair of shorts and a jersey with a clearly visible number, unique within the team, printed on both the front and back. Players wear high-top sneakers that provide extra ankle support. Typically, team names, players' names and, outside of North America, sponsors are printed on the uniforms. A limited number of time-outs, clock stoppages requested by a coach (or sometimes mandated in the NBA) for a short meeting with the players, are allowed. They generally last no longer than one minute (100 seconds in the NBA) unless, for televised games, a commercial break is needed. The game is controlled by the officials consisting of the referee (referred to as crew chief in the NBA), one or two umpires (referred to as referees in the NBA) and the table officials. 
For college, the NBA, and many high schools, there are a total of three referees on the court. The table officials are responsible for keeping track of each team's scoring, timekeeping, individual and team fouls, player substitutions, team possession arrow, and the shot clock. Equipment The only essential equipment in a basketball game is the ball and the court: a flat, rectangular surface with baskets at opposite ends. Competitive levels require the use of more equipment such as clocks, score sheets, scoreboard(s), alternating possession arrows, and whistle-operated stop-clock systems. A regulation basketball court in international games is 28 meters (92 ft) long and 15 meters (49 ft) wide. In the NBA and NCAA the court is 94 by 50 feet (29 by 15 m). Most courts have wood flooring, usually constructed from maple planks running in the same direction as the longer court dimension. The name and logo of the home team is usually painted on or around the center circle. The basket is a steel rim 18 inches (46 cm) in diameter with an attached net affixed to a backboard that measures 6 by 3.5 feet (1.8 by 1.05 m); one basket is at each end of the court. The white outlined box on the backboard is 18 inches (46 cm) high and 24 inches (59 cm) wide. At almost all levels of competition, the top of the rim is exactly 10 feet (3.05 m) above the court and 4 feet (1.22 m) inside the baseline. While variation is possible in the dimensions of the court and backboard, it is considered important for the basket to be of the correct height – a rim that is off by just a few inches can have an adverse effect on shooting. The net must "check the ball momentarily as it passes through the basket" to aid the visual confirmation that the ball went through. The act of checking the ball has the further advantage of slowing down the ball so the rebound doesn't go as far. The size of the basketball is also regulated. For men, the official ball is 29.5 inches (75 cm) in circumference (size 7, or a "295 ball") and weighs 22 ounces (624 g). If women are playing, the official basketball size is 28.5 inches (72 cm) in circumference (size 6, or a "285 ball") with a weight of 20 ounces (567 g).
In 3x3, a formalized version of the halfcourt 3-on-3 game, a dedicated ball with the circumference of a size 6 ball but the weight of a size 7 ball is used in all competitions (men's, women's, and mixed teams). Violations The ball may be advanced toward the basket by being shot, passed between players, thrown, tapped, rolled or dribbled (bouncing the ball while running). The ball must stay within the court; the last team to touch the ball before it travels out of bounds forfeits possession. The ball is out of bounds if it touches a boundary line, or touches any player or object that is out of bounds. There are limits placed on the steps a player may take without dribbling, which commonly results in an infraction known as traveling. A player may also not stop his dribble and then resume dribbling. A dribble that touches both hands is considered stopping the dribble, giving this infraction the name double dribble. Within a dribble, the player cannot carry the ball by placing his hand on the bottom of the ball; doing so is known as carrying the ball. A team, once having established ball control in the front half of their court, may not return the ball to the backcourt and be the first to touch it. A violation of these rules results in loss of possession. The ball may not be kicked, nor be struck with the fist. For the offense, a violation of these rules results in loss of possession; for the defense, most leagues reset the shot clock and the offensive team is given possession of the ball out of bounds. There are limits imposed on the time taken before progressing the ball past halfway (8 seconds in FIBA and the NBA; 10 seconds in NCAA and high school for both sexes), before attempting a shot (24 seconds in FIBA, the NBA, and U Sports (Canadian universities) play for both sexes, and 30 seconds in NCAA play for both sexes), holding the ball while closely guarded (5 seconds), and remaining in the restricted area known as the free-throw lane (or the "key") (3 seconds).
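The possession time limits listed above fit naturally into a lookup table. A minimal sketch, assuming the limits as stated (league keys and function names are illustrative; U Sports and the 5-second and 3-second counts are omitted for brevity):

```python
# Time limits in seconds, as summarized above.
TIME_LIMITS = {
    "FIBA": {"cross_halfway": 8, "shot_clock": 24},
    "NBA": {"cross_halfway": 8, "shot_clock": 24},
    "NCAA": {"cross_halfway": 10, "shot_clock": 30},
}

def possession_violation(league: str, seconds_in_backcourt: float,
                         seconds_without_shot: float) -> bool:
    """True if the possession has exceeded the backcourt or shot-clock limit."""
    limits = TIME_LIMITS[league]
    return (seconds_in_backcourt > limits["cross_halfway"]
            or seconds_without_shot > limits["shot_clock"])
```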
These rules are designed to promote more offense. There are also limits on how players may block an opponent's field goal attempt or help a teammate's field goal attempt. Goaltending is a defender's touching of a ball that is on a downward flight toward the basket, while the related violation of basket interference is the touching of a ball that is on the rim or above the basket, or by a player reaching through the basket from below. Goaltending and basket interference committed by a defender result in awarding the basket to the offense, while basket interference committed by an offensive player results in cancelling the basket if one is scored. The defense gains possession in all cases of goaltending or basket interference. Fouls An attempt to unfairly disadvantage an opponent through certain types of physical contact is illegal and is called a personal foul. These are most commonly committed by defensive players; however, they can be committed by offensive players as well. Players who are fouled either receive the ball to pass inbounds again, or receive one or more free throws if they are fouled in the act of shooting, depending on whether the shot was successful. One point is awarded for making a free throw, which is attempted from a line 15 feet (4.6 m) from the basket. The referee is responsible for judging whether contact is illegal, sometimes resulting in controversy. The calling of fouls can vary between games, leagues and referees. There is a second category of fouls called technical fouls, which may be charged for various rules violations including failure to properly record a player in the scorebook, or for unsportsmanlike conduct. These infractions result in one or two free throws, which may be taken by any of the five players on the court at the time. Repeated incidents can result in disqualification. A blatant foul involving physical contact that is either excessive or unnecessary is called an intentional foul (flagrant foul in the NBA).
In FIBA and NCAA women's basketball, a foul resulting in ejection is called a disqualifying foul, while in leagues other than the NBA, such a foul is referred to as flagrant. If a team exceeds a certain limit of team fouls in a given period (quarter or half) – four for NBA, NCAA women's, and international games – the opposing team is awarded one or two free throws on all subsequent non-shooting fouls for that period, the number depending on the league. In the US college men's game and high school games for both sexes, if a team reaches 7 fouls in a half, the opposing team is awarded one free throw, along with a second shot if the first is made. This is called shooting "one-and-one". If a team exceeds 10 fouls in the half, the opposing team is awarded two free throws on all subsequent fouls for the half. When a team shoots foul shots, the opponents may not interfere with the shooter, nor may they try to regain possession until the last or potentially last free throw is in the air. After a team has committed a specified number of fouls, the other team is said to be "in the bonus". On scoreboards, this is usually signified with an indicator light reading "Bonus" or "Penalty" with an illuminated directional arrow or dot indicating that team is to receive free throws when fouled by the opposing team. (Some scoreboards also indicate the number of fouls committed.) If a team misses the first shot of a two-shot situation, the opposing team must wait for the completion of the second shot before attempting to reclaim possession of the ball and continuing play. If a player is fouled while attempting a shot and the shot is unsuccessful, the player is awarded a number of free throws equal to the value of the attempted shot. A player fouled while attempting a regular two-point shot thus receives two shots, and a player fouled while attempting a three-point shot receives three shots. 
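The free-throw awards described above can be sketched as follows; this is a rough model of US college men's and high school play, reading the thresholds as 7–9 team fouls for the one-and-one and 10 or more for two shots (function names are illustrative):

```python
def bonus_award(team_fouls_in_half: int) -> str:
    """Free throws on a non-shooting foul in US college men's / high school play."""
    if team_fouls_in_half >= 10:
        return "two shots"    # the 'double bonus'
    if team_fouls_in_half >= 7:
        return "one-and-one"  # a second shot only if the first is made
    return "none"             # the team is not yet in the bonus

def shooting_foul_free_throws(attempted_shot_value: int) -> int:
    """A player fouled on an unsuccessful shot attempt receives free throws
    equal to the value of the attempted shot (2 or 3)."""
    return attempted_shot_value
```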
If a player is fouled while attempting a shot and the shot is successful, typically the player will be awarded one additional free throw for one point. In combination with a regular shot, this is called a "three-point play" or "four-point play" (or more colloquially, an "and one") because of the basket made at the time of the foul (2 or 3 points) and the additional free throw (1 point). Common techniques and practices Positions Although the rules do not specify any positions whatsoever, they have evolved as part of basketball. During the early years of basketball's evolution, two guards, two forwards, and one center were used. In more recent times specific positions evolved, but the current trend, advocated by many top coaches including Mike Krzyzewski, is towards positionless basketball, where big players are free to shoot from outside and dribble if their skill allows it. Popular descriptions of positions include: Point guard (often called the "1"): usually the fastest player on the team, organizes the team's offense by controlling the ball and making sure that it gets to the right player at the right time. Shooting guard (the "2"): creates a high volume of shots on offense, mainly long-ranged; and guards the opponent's best perimeter player on defense. Small forward (the "3"): often primarily responsible for scoring points via cuts to the basket and dribble penetration; on defense seeks rebounds and steals, but sometimes plays more actively. Power forward (the "4"): plays offensively often with their back to the basket; on defense, plays under the basket (in a zone defense) or against the opposing power forward (in man-to-man defense). Center (the "5"): uses height and size to score (on offense), to protect the basket closely (on defense), or to rebound. The above descriptions are flexible.
For most teams today, the shooting guard and small forward have very similar responsibilities and are often called the wings; similarly, the power forward and center have overlapping roles and are often called post players. While most teams describe two players as guards, two as forwards, and one as a center, on some occasions teams choose to call them by different designations. Strategy There are two main defensive strategies: zone defense and man-to-man defense. In a zone defense, each player is assigned to guard a specific area of the court. Zone defenses often allow the defense to double team the ball, a maneuver known as a trap. In a man-to-man defense, each defensive player guards a specific opponent. Offensive plays are more varied, normally involving planned passes and movement by players without the ball. A quick movement by an offensive player without the ball to gain an advantageous position is known as a cut. A legal attempt by an offensive player to stop an opponent from guarding a teammate, by standing in the defender's way such that the teammate cuts next to him, is a screen or pick. The two plays are combined in the pick and roll, in which a player sets a pick and then "rolls" away from the pick towards the basket. Screens and cuts are very important in offensive plays; these allow the quick passes and teamwork, which can lead to a successful basket. Teams almost always have several offensive plays planned to ensure their movement is not predictable. On court, the point guard is usually responsible for indicating which play will occur. Shooting Shooting is the act of attempting to score points by throwing the ball through the basket, methods varying with players and situations. Typically, a player faces the basket with both feet pointed toward it. A player will rest the ball on the fingertips of the dominant hand (the shooting arm) slightly above the head, with the other hand supporting the side of the ball.
The ball is usually shot by jumping (though not always) and extending the shooting arm. The shooting arm, fully extended with the wrist fully bent, is held stationary for a moment following the release of the ball, known as a follow-through. Players often try to put a steady backspin on the ball to absorb its impact with the rim. The ideal trajectory of the shot is somewhat controversial, but generally a proper arc is recommended. Players may shoot directly into the basket or may use the backboard to redirect the ball into the basket. The two most common shots that use the above-described setup are the set shot and the jump shot. Both are preceded by a crouching action which preloads the muscles and increases the power of the shot. In a set shot, the shooter straightens up and throws from a standing position with neither foot leaving the floor; this is typically used for free throws. For a jump shot, the throw is taken in mid-air with the ball being released near the top of the jump. This provides much greater power and range, and it also allows the player to elevate over the defender. Failure to release the ball before the feet return to the floor is considered a traveling violation. Another common shot is called the layup. This shot requires the player to be in motion toward the basket, and to "lay" the ball "up" and into the basket, typically off the backboard (the backboard-free, underhand version is called a finger roll). The most crowd-pleasing and typically highest-percentage shot is the slam dunk, in which the player jumps very high and throws the ball downward through the basket while touching it. Another shot that is less common than the layup is the "circus shot". The circus shot is a low-percentage shot that is flipped, heaved, scooped, or flung toward the hoop while the shooter is off-balance, airborne, falling down, and/or facing away from the basket.
A back-shot is a shot taken when the player is facing away from the basket, and may be shot with the dominant hand or with both; there is a very low chance that the shot will be successful. A shot that misses both the rim and the backboard completely is referred to as an air ball. A particularly bad shot, or one that only hits the backboard, is jocularly called a brick. The hang time is the length of time a player stays in the air after jumping, either to make a slam dunk, layup or jump shot. Rebounding The objective of rebounding is to successfully gain possession of the basketball after a missed field goal or free throw, as it rebounds from the hoop or backboard. This plays a major role in the game, as most possessions end when a team misses a shot. There are two categories of rebounds: offensive rebounds, in which the ball is recovered by the offensive side and does not change possession, and defensive rebounds, in which the defending team gains possession of the loose ball. The majority of rebounds are defensive, as the team on defense tends to be in better position to recover missed shots. Passing A pass is a method of moving the ball between players. Most passes are accompanied by a step forward to increase power and are followed through with the hands to ensure accuracy. A staple pass is the chest pass.
For the offense, a violation of these rules results in loss of possession; for the defense, most leagues reset the shot clock and the offensive team is given possession of the ball out of bounds. There are limits imposed on the time taken before progressing the ball past halfway (8 seconds in FIBA and the NBA; 10 seconds in NCAA and high school for both sexes), before attempting a shot (24 seconds in FIBA, the NBA, and U Sports (Canadian universities) play for both sexes, and 30 seconds in NCAA play for both sexes), holding the ball while closely guarded (5 seconds), and remaining in the restricted area known as the free-throw lane, (or the "key") (3 seconds). These rules are designed to promote more offense. There are also limits on how players may block an opponent's field goal attempt or help a teammate's field goal attempt. Goaltending is a defender's touching of a ball that is on a downward flight toward the basket, while the related violation of basket interference is the touching of a ball that is on the rim or above the basket, or by a player reaching through the basket from below. Goaltending and basket interference committed by a defender result in awarding the basket to the offense, while basket interference committed by an offensive player results in cancelling the basket if one is scored. The defense gains possession in all cases of goaltending or basket interference. Fouls An attempt to unfairly disadvantage an opponent through certain types of physical contact is illegal and is called a personal foul. These are most commonly committed by defensive players; however, they can be committed by offensive players as well. Players who are fouled either receive the ball to pass inbounds again, or receive one or more free throws if they are fouled in the act of shooting, depending on whether the shot was successful. One point is awarded for making a free throw, which is attempted from a line from the basket. 
The referee is responsible for judging whether contact is illegal, sometimes resulting in controversy. The calling of fouls can vary between games, leagues and referees. There is a second category of fouls called technical fouls, which may be charged for various rules violations including failure to properly record a player in the scorebook, or for unsportsmanlike conduct. These infractions result in one or two free throws, which may be taken by any of the five players on the court at the time. Repeated incidents can result in disqualification. A blatant foul involving physical contact that is either excessive or unnecessary is called an intentional foul (flagrant foul in the NBA). In FIBA and NCAA women's basketball, a foul resulting in ejection is called a disqualifying foul, while in leagues other than the NBA, such a foul is referred to as flagrant. If a team exceeds a certain limit of team fouls in a given period (quarter or half) – four for NBA, NCAA women's, and international games – the opposing team is awarded one or two free throws on all subsequent non-shooting fouls for that period, the number depending on the league. In the US college men's game and high school games for both sexes, if a team reaches 7 fouls in a half, the opposing team is awarded one free throw, along with a second shot if the first is made. This is called shooting "one-and-one". If a team exceeds 10 fouls in the half, the opposing team is awarded two free throws on all subsequent fouls for the half. When a team shoots foul shots, the opponents may not interfere with the shooter, nor may they try to regain possession until the last or potentially last free throw is in the air. After a team has committed a specified number of fouls, the other team is said to be "in the bonus". 
On scoreboards, this is usually signified with an indicator light reading "Bonus" or "Penalty" with an illuminated directional arrow or dot indicating that team is to receive free throws when fouled by the opposing team. (Some scoreboards also indicate the number of fouls committed.) If a team misses the first shot of a two-shot situation, the opposing team must wait for the completion of the second shot before attempting to reclaim possession of the ball and continuing play. If a player is fouled while attempting a shot and the shot is unsuccessful, the player is awarded a number of free throws equal to the value of the attempted shot. A player fouled while attempting a regular two-point shot thus receives two shots, and a player fouled while attempting a three-point shot receives three shots. If a player is fouled while attempting a shot and the shot is successful, typically the player will be awarded one additional free throw for one point. In combination with a regular shot, this is called a "three-point play" or "four-point play" (or more colloquially, an "and one") because of the basket made at the time of the foul (2 or 3 points) and the additional free throw (1 point). Common techniques and practices Positions Although the rules do not specify any positions whatsoever, they have evolved as part of basketball. During the early years of basketball's evolution, two guards, two forwards, and one center were used. In more recent times specific positions evolved, but the current trend, advocated by many top coaches including Mike Krzyzewski, is towards positionless basketball, where big players are free to shoot from outside and dribble if their skill allows it. Popular descriptions of positions include: Point guard (often called the "1") : usually the fastest player on the team, organizes the team's offense by controlling the ball and making sure that it gets to the right player at the right time. 
Shooting guard (the "2"): creates a high volume of shots on offense, mainly long-ranged, and guards the opponent's best perimeter player on defense. Small forward (the "3"): often primarily responsible for scoring points via cuts to the basket and dribble penetration; on defense seeks rebounds and steals, but sometimes plays more actively. Power forward (the "4"): plays offensively often with their back to the basket; on defense, plays under the basket (in a zone defense) or against the opposing power forward (in man-to-man defense). Center (the "5"): uses height and size to score (on offense), to protect the basket closely (on defense), or to rebound. The above descriptions are flexible. For most teams today, the shooting guard and small forward have very similar responsibilities and are often called the wings; likewise, the power forward and center have overlapping roles and are often called the post players. While most teams describe two players as guards, two as forwards, and one as a center, on some occasions teams choose to call them by different designations. Strategy There are two main defensive strategies: zone defense and man-to-man defense. In a zone defense, each player is assigned to guard a
Tetraodontidae. Blowfish may also refer to: Porcupinefish, belonging to the family Diodontidae Blowfish (cipher), an encryption algorithm Blowfish (company), an American erotic | also refer to: Porcupinefish, belonging to the family Diodontidae Blowfish (cipher), an encryption algorithm Blowfish (company), an American erotic goods supplier The Blowfish, a |
Ancient Greeks Among the ancient Greeks, games with balls (σφαῖραι) were regarded as a useful subsidiary to the more violent athletic exercises, as a means of keeping the body supple, and rendering it graceful, but were generally left to boys and girls. Of regular rules for the playing of ball games, little trace remains, if there were any such. The names in Greek for various forms, which have come down to us in such works as the Ὀνομαστικόν of Julius Pollux, imply little or nothing of such; thus, ἀπόρραξις (aporraxis) only means the putting of the ball on the ground with the open hand, οὐρανία (ourania), the flinging of the ball in the air to be caught by two or more players; φαινίνδα (phaininda) would seem to be a game of catch played by two or more, where feinting is used as a test of quickness and skill. Pollux (i. x. 104) mentions a game called episkyros (ἐπίσκυρος), which has often been looked on as the origin of football. It seems to have been played by two sides, arranged in lines; how far there was any form of "goal" seems uncertain. It was impossible to produce a ball that was perfectly spherical; children usually made their own balls by inflating pig's bladders and heating them in the ashes of a fire to make them rounder, although Plato (fl. 420s BC – 340s BC) described "balls which have leather coverings in twelve pieces". Ancient Romans Among the Romans, ball games were looked upon as an adjunct to the bath, and were graduated to the age and health of the bathers, and usually a place (sphaeristerium) was set apart for them in the baths (thermae). 
There appear to have been three types or sizes of ball, the pila, or small ball, used in catching games, the | is played with was in 1205, in the phrase, "" The word came from the Middle English bal (inflected as ball-e, -es), in turn from Old Norse böllr (compare Old Swedish baller, and Swedish boll), from Proto-Germanic *ballu-z (whence probably Middle High German bal, ball-es, Middle Dutch bal), a cognate with Old High German ballo, pallo, Middle High German balle from Proto-Germanic *ballon (weak masculine), and Old High German ballâ, pallâ, Middle High German balle, Proto-Germanic *ballôn (weak feminine). No Old English representative of any of these is known. (The answering forms in Old English would have been beallu, -a, -e—compare bealluc, ballock.) If ball- was native in Germanic, it may have been a cognate with the Latin foll-is in the sense of a "thing blown up or inflated." In the later Middle English spelling balle the word coincided graphically with the French balle "ball" and "bale", which has hence been erroneously assumed to be its source. French balle (but not boule) is itself assumed to be of Germanic origin, however. In Ancient Greek the word πάλλα (palla) for "ball" is attested besides the word σφαίρα (sfaíra), sphere. History A ball, as the essential feature in many forms of gameplay requiring physical exertion, must date from the very earliest times. A rolling object appeals not only to a human baby, but to a kitten and a puppy. Some form of game with a ball is found portrayed on Egyptian monuments, and is played among aboriginal tribes at the present day. In Homer, Nausicaa was playing at ball with her maidens when Odysseus first saw her in the land of the Phaeacians (Od. vi. 100). And Halios and Laodamas performed before Alcinous
Steiner systems which have an n-element set S and a set of k-element subsets called blocks, such that a subset with t elements lies in just one block. These incidence structures have been generalized with block designs. The incidence matrix used in these geometrical contexts corresponds to the logical matrix used generally with binary relations. An incidence structure is a triple D = (V, B, I) where V and B are any two disjoint sets and I is a binary relation between V and B, i.e. I ⊆ V × B. The elements of V will be called points, those of B blocks, and those of I flags. Special types of binary relations Some important types of binary relations R over sets X and Y are listed below. Uniqueness properties: Injective (also called left-unique): for all x, z ∈ X and all y ∈ Y, if xRy and zRy then x = z. For such a relation, {Y} is called a primary key of R. For example, the green and blue binary relations in the diagram are injective, but the red one is not (as it relates both −1 and 1 to 1), nor the black one (as it relates both −1 and 1 to 0). Functional (also called right-unique, right-definite or univalent): for all x ∈ X and all y, z ∈ Y, if xRy and xRz then y = z. Such a binary relation is called a partial function. For such a relation, {X} is called a primary key of R. For example, the red and green binary relations in the diagram are functional, but the blue one is not (as it relates 1 to both −1 and 1), nor the black one (as it relates 0 to both −1 and 1). One-to-one: injective and functional. For example, the green binary relation in the diagram is one-to-one, but the red, blue and black ones are not. One-to-many: injective and not functional. For example, the blue binary relation in the diagram is one-to-many, but the red, green and black ones are not. Many-to-one: functional and not injective. For example, the red binary relation in the diagram is many-to-one, but the green, blue and black ones are not. Many-to-many: neither injective nor functional. For example, the black binary relation in the diagram is many-to-many, but the red, green and blue ones are not.
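For finite relations stored as sets of ordered pairs, the uniqueness properties above can be checked mechanically. The following is an illustrative Python sketch; the function names are our own, and the sample relations are modelled loosely on the red and green relations described in the text:

```python
# Illustrative sketch: classifying a finite binary relation,
# stored as a set of (x, y) pairs, by its uniqueness properties.

def is_injective(R):
    # left-unique: no two distinct x are related to the same y
    seen = {}
    for x, y in R:
        if y in seen and seen[y] != x:
            return False
        seen[y] = x
    return True

def is_functional(R):
    # right-unique: each x is related to at most one y
    seen = {}
    for x, y in R:
        if x in seen and seen[x] != y:
            return False
        seen[x] = y
    return True

red = {(-1, 1), (1, 1)}            # relates both -1 and 1 to 1
green = {(-1, -1), (0, 0), (1, 1)}

print(is_injective(red))     # False: -1 and 1 share the value 1
print(is_functional(red))    # True: each x maps to one y (many-to-one)
print(is_injective(green) and is_functional(green))  # True: one-to-one
```

A relation that passes both checks is one-to-one; failing exactly one of them gives the one-to-many or many-to-one cases listed above.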
Totality properties (only definable if the domain X and codomain Y are specified): Serial (also called left-total): for all x in X there exists a y in Y such that xRy. In other words, the domain of definition of R is equal to X. This property, although also referred to as total by some authors, is different from the definition of connected (also called total by some authors) in Properties. Such a binary relation is called a multivalued function. For example, the red and green binary relations in the diagram are serial, but the blue one is not (as it does not relate −1 to any real number), nor the black one (as it does not relate 2 to any real number). As another example, > is a serial relation over the integers. But it is not a serial relation over the positive integers, because there is no y in the positive integers such that 1 > y. However, < is a serial relation over the positive integers, the rational numbers and the real numbers. Every reflexive relation is serial: for a given x, choose y = x. Surjective (also called right-total or onto): for all y in Y, there exists an x in X such that xRy. In other words, the codomain of definition of R is equal to Y. For example, the green and blue binary relations in the diagram are surjective, but the red one is not (as it does not relate any real number to −1), nor the black one (as it does not relate any real number to 2). Uniqueness and totality properties (only definable if the domain X and codomain Y are specified): A function: a binary relation that is functional and serial. For example, the red and green binary relations in the diagram are functions, but the blue and black ones are not. An injection: a function that is injective. For example, the green binary relation in the diagram is an injection, but the red, blue and black ones are not. A surjection: a function that is surjective. For example, the green binary relation in the diagram is a surjection, but the red, blue and black ones are not. A bijection: a function that is injective and surjective.
For example, the green binary relation in the diagram is a bijection, but the red, blue and black ones are not. If relations over proper classes are allowed: Set-like (or local): for all x in X, the class of all y in Y such that yRx, i.e. {y ∈ Y : yRx}, is a set. For example, the relation ∈ is set-like, and every relation on two sets is set-like. The usual ordering < over the class of ordinal numbers is a set-like relation, while its inverse > is not. Operations on binary relations Union If R and S are binary relations over sets X and Y then R ∪ S = {(x, y) : xRy or xSy} is the union relation of R and S over X and Y. The identity element is the empty relation. For example, ≤ is the union of < and =, and ≥ is the union of > and =. Intersection If R and S are binary relations over sets X and Y then R ∩ S = {(x, y) : xRy and xSy} is the intersection relation of R and S over X and Y. The identity element is the universal relation. For example, the relation "is divisible by 6" is the intersection of the relations "is divisible by 3" and "is divisible by 2". Composition If R is a binary relation over sets X and Y, and S is a binary relation over sets Y and Z then S ∘ R = {(x, z) : there exists y ∈ Y such that xRy and ySz} (also denoted by R; S) is the composition relation of R and S over X and Z. The identity element is the identity relation. The order of R and S in the notation S ∘ R used here agrees with the standard notational order for composition of functions. For example, the composition "is mother of" ∘ "is parent of" yields "is maternal grandparent of", while the composition "is parent of" ∘ "is mother of" yields "is grandmother of". For the former case, if x is the parent of y and y is the mother of z, then x is the maternal grandparent of z. Converse If R is a binary relation over sets X and Y then R^T = {(y, x) : xRy} is the converse relation of R over Y and X. For example, = is the converse of itself, as is ≠, and < and > are each other's converse, as are ≤ and ≥. A binary relation is equal to its converse if and only if it is symmetric. Complement If R is a binary relation over sets X and Y then its complement R̄ = {(x, y) : not xRy} (also denoted by ¬R) is the complementary relation of R over X and Y.
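For finite relations stored as sets of pairs, the operations above have direct implementations. A minimal Python sketch (the names are our own; compose(R, S) reads "first R, then S", so its argument order is the reverse of function-composition notation):

```python
# Illustrative sketch: union, intersection, composition, and converse
# of finite binary relations stored as sets of (x, y) pairs.

def union(R, S):
    return R | S

def intersection(R, S):
    return R & S

def compose(R, S):
    # x is related to z if some y has (x, y) in R and (y, z) in S
    # ("first R, then S")
    return {(x, z) for (x, y) in R for (y2, z) in S if y == y2}

def converse(R):
    return {(y, x) for (x, y) in R}

parent_of = {("ann", "bea")}   # ann is a parent of bea
mother_of = {("bea", "carl")}  # bea is the mother of carl

# "first parent_of, then mother_of" = "is maternal grandparent of"
print(compose(parent_of, mother_of))  # {('ann', 'carl')}

# A relation equals its converse exactly when it is symmetric:
print(converse({(1, 2), (2, 1)}) == {(1, 2), (2, 1)})  # True
```

The grandparent example mirrors the one in the text: ann is a parent of bea and bea is the mother of carl, so ann is carl's maternal grandparent.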
For example, = and ≠ are each other's complement, as are ⊆ and ⊈, ⊇ and ⊉, and ∈ and ∉, and, for total orders, also < and ≥, and > and ≤. The complement of the converse relation is the converse of the complement. If X = Y, the complement has the following properties: If a relation is symmetric, then so is the complement. The complement of a reflexive relation is irreflexive—and vice versa. The complement of a strict weak order is a total preorder—and vice versa. Restriction If R is a binary homogeneous relation over a set X and S is a subset of X then is the restriction relation of R to S over X. If R is a binary relation over sets X and Y and if S is a subset of X then is the left-restriction relation of R to S over X and Y. If R is a binary relation over sets X and Y and if S is a subset of Y then is the right-restriction relation of R to S over X and Y. If a relation is reflexive, irreflexive, symmetric, antisymmetric, asymmetric, transitive, total, trichotomous, a partial order, total order, strict weak order, total preorder (weak order), or an equivalence relation, then so too are its restrictions. However, the transitive closure of a restriction is a subset of the restriction of the transitive closure, i.e., in general not equal. For example, restricting the relation "x is parent of y" to females yields the relation "x is mother of the woman y"; its transitive closure doesn't relate a woman with her paternal grandmother. On the other hand, the transitive closure of "is parent of" is "is ancestor of"; its restriction to females does relate a woman with her paternal grandmother. Also, the various concepts of completeness (not to be confused with being "total") do not carry over to restrictions. For example, over the real numbers a property of the relation ≤ is that every non-empty subset with an upper bound in the reals has a least upper bound (also called supremum) in the reals. However, for the rational numbers this supremum is not necessarily rational, so the same property does not hold on the restriction of the relation ≤ to the rational numbers. A binary relation R over | incidence.
Finite and infinite projective and affine planes are included. Jakob Steiner pioneered the cataloguing of configurations with the Steiner systems, which have an n-element set S and a set of k-element subsets called blocks, such that a subset with t elements lies in just one block. A binary relation R over sets X and Y is said to be contained in a relation S over X and Y, written R ⊆ S, if R is a subset of S, that is, for all x ∈ X and y ∈ Y, if xRy, then xSy. If R is contained in S and S is contained in R, then R and S are called equal, written R = S. If R is contained in S but S is not contained in R, then R is said to be smaller than S, written R ⊊ S. For example, on the rational numbers, the relation > is smaller than ≥, and equal to the composition > ∘ >. Matrix representation Binary relations over sets X and Y can be represented algebraically by logical matrices indexed by X and Y with entries in the Boolean semiring (addition corresponds to OR and multiplication to AND) where matrix addition corresponds to union of relations, matrix multiplication corresponds to composition of relations (of a relation over X and Y and a relation over Y and Z), the Hadamard product corresponds to intersection of relations, the zero matrix corresponds to the empty relation, and the matrix of ones corresponds to the universal relation. Homogeneous relations (when X = Y) form a matrix semiring (indeed, a matrix semialgebra over the Boolean semiring) where the identity matrix corresponds to the identity relation. Sets versus classes Certain mathematical "relations", such as "equal to", "subset of", and "member of", cannot be understood to be binary relations as defined above, because their domains and codomains cannot be taken to be sets in the usual systems of axiomatic set theory. For example, to model the general concept of "equality" as a binary relation take the domain and codomain to be the "class of all sets", which is not a set in the usual set theory.
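The logical-matrix representation described under "Matrix representation" can be made concrete for finite sets: a relation becomes a Boolean matrix, and composition of relations becomes Boolean matrix multiplication. A small Python sketch (all names are our own):

```python
# Illustrative sketch: a finite relation as a logical (Boolean) matrix,
# with composition realized as Boolean matrix multiplication.

def to_matrix(R, X, Y):
    # entry [i][j] is True iff (X[i], Y[j]) is in the relation R
    return [[(x, y) in R for y in Y] for x in X]

def bool_matmul(A, B):
    # (A.B)[i][k] = OR over j of (A[i][j] AND B[j][k]),
    # i.e. multiplication in the Boolean semiring
    return [[any(a and b for a, b in zip(row, col))
             for col in zip(*B)] for row in A]

X = [1, 2]; Y = ['a', 'b']; Z = ['p']
R = {(1, 'a'), (2, 'b')}   # relation over X and Y
S = {('a', 'p')}           # relation over Y and Z

M = bool_matmul(to_matrix(R, X, Y), to_matrix(S, Y, Z))
print(M)  # [[True], [False]] -- only 1 is related to 'p' (via 'a')
```

Union and intersection of relations correspond to entrywise OR and AND of these matrices, matching the semiring correspondences listed in the text.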
In most mathematical contexts, references to the relations of equality, membership and subset are harmless because they can be understood implicitly to be restricted to some set in the context. The usual work-around to this problem is to select a "large enough" set A, that contains all the objects of interest, and work with the restriction =A instead of =. Similarly, the "subset of" relation needs to be restricted to have domain and codomain P(A) (the power set of a specific set A): the resulting set relation can be denoted by ⊆A. Also, the "member of" relation needs to be restricted to have domain A and codomain P(A) to obtain a binary relation that is a set. Bertrand Russell has shown that assuming ∈ to be defined over all sets leads to a contradiction in naive set theory. Another solution to this problem is to use a set theory with proper classes, such as NBG or Morse–Kelley set theory, and allow the domain and codomain (and so the graph) to be proper classes: in such a theory, equality, membership, and subset are binary relations without special comment. (A minor modification needs to be made to the concept of the ordered triple (X, Y, G), as normally a proper class cannot be a member of an ordered tuple; or of course one can identify the binary relation with its graph in this context.) With this definition one can for instance define a binary relation over every set and its power set. Homogeneous relation A homogeneous relation over a set X is a binary relation over X and itself, i.e. it is a subset of the Cartesian product X × X. It is also simply called a (binary) relation over X. A homogeneous relation
space () is used for punctuation. Letters a and c, which only use dots in the top row, were shifted two places for the apostrophe and hyphen: . (These are also the decade diacritics, at left in the table below, of the second and third decades.) In addition, there are ten patterns that are based on the first two letters () with their dots shifted to the right; these were assigned to non-French letters (ì ä ò ), or serve non-letter functions: (superscript; in English the accent mark), (currency prefix), (capital, in English the decimal point), (number sign), (emphasis mark), (symbol prefix).

{| class=wikitable
|+ The 64 modern braille cells
!colspan=2|decade|| ||colspan=10|numeric sequence || ||colspan=2|shift right
|- align=center
!1st
| || | | | | | | | | | | || | |
|- align=center
!2nd
| || | | | | | | | | | | || | |
|- align=center
!3rd
| || | | | | | | | | | | || | |
|- align=center
!4th
| || | | | | | | | | | | || | |
|- align=center
!5th
! shiftdown
| | | | | | | | | | | || | |
|}

The first four decades are alike in that the decade dots are applied to the numeric sequence as a logical "inclusive OR" operation, whereas the fifth decade applies a "shift down" operation to the numeric sequence. Originally there had been nine decades. The fifth through ninth decades used dashes as well as dots, but they proved impractical and were soon abandoned. These could be replaced with what we now know as the number sign (), though that only caught on for the digits (old 5th decade → modern 1st decade). The dash occupying the top row of the original sixth decade was simply dropped, producing the modern fifth decade. (See 1829 braille.) 
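The "inclusive OR" and "shift down" operations above can be sketched in Python using the standard Unicode Braille Patterns block (U+2800–U+28FF), where dot n is bit n−1 of the offset; the dot sets for a–j are the standard first-decade cells.

```python
# Sketch: deriving the five decades from the ten first-decade cells (a–j).
# Dots 1–6 are bits 0–5 of the Unicode offset from U+2800.

FIRST_DECADE = {  # letter: set of raised dot positions (top two rows only)
    'a': {1}, 'b': {1, 2}, 'c': {1, 4}, 'd': {1, 4, 5}, 'e': {1, 5},
    'f': {1, 2, 4}, 'g': {1, 2, 4, 5}, 'h': {1, 2, 5}, 'i': {2, 4}, 'j': {2, 4, 5},
}

def cell(dots):
    """Render a set of dot numbers as a Unicode braille character."""
    return chr(0x2800 + sum(1 << (d - 1) for d in dots))

def decade(extra_dots):
    """'Inclusive OR': add the decade dots to each cell of the numeric sequence."""
    return [cell(dots | extra_dots) for dots in FIRST_DECADE.values()]

def shift_down(dots):
    """Fifth decade: move every dot one row down (1→2, 2→3, 4→5, 5→6).
    Adding 1 works because a–j use only dots 1, 2, 4 and 5."""
    return {d + 1 for d in dots}

second = decade({3})       # k–t
third = decade({3, 6})     # u, v, x, y, z and five ligatures
fourth = decade({6})       # ch … w
fifth = [cell(shift_down(d)) for d in FIRST_DECADE.values()]
```

For instance, ORing dot 3 onto a (dot 1) yields k (dots 1-3), and shifting a down one row yields the comma cell (dot 2).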
Assignment Historically, there have been three principles in assigning the values of a linear script (print) to Braille: Using Louis Braille's original French letter values; reassigning the braille letters according to the sort order of the print alphabet being transcribed; and reassigning the letters to improve the efficiency of writing in braille. Under international consensus, most braille alphabets follow the French sorting order for the 26 letters of the basic Latin alphabet, and there have been attempts at unifying the letters beyond these 26 (see international braille), though differences remain, for example in German Braille. This unification avoids the chaos of each nation reordering the braille code to match the sorting order of its print alphabet, as happened in Algerian Braille, where braille codes were numerically reassigned to match the order of the Arabic alphabet and bear little relation to the values used in other countries (compare modern Arabic Braille, which uses the French sorting order), and as happened in an early American version of English Braille, where the letters w, x, y, z were reassigned to match English alphabetical order. A convention sometimes seen for letters beyond the basic 26 is to exploit the physical symmetry of braille patterns iconically, for example, by assigning a reversed n to ñ or an inverted s to sh. (See Hungarian Braille and Bharati Braille, which do this to some extent.) A third principle was to assign braille codes according to frequency, with the simplest patterns (quickest ones to write with a stylus) assigned to the most frequent letters of the alphabet. 
Such frequency-based alphabets were used in Germany and the United States in the 19th century (see American Braille), but with the invention of the braille typewriter their advantage disappeared, and none are attested in modern use – they had the disadvantage that the resulting small number of dots in a text interfered with following the alignment of the letters, and consequently made texts more difficult to read than Braille's more arbitrary letter-assignment. Finally, there are braille scripts that do not order the codes numerically at all, such as Japanese Braille and Korean Braille, which are based on more abstract principles of syllable composition. Texts are sometimes written in a script of eight dots per cell rather than six, enabling them to encode a greater number of symbols. (See Gardner–Salinas braille codes.) Luxembourgish Braille has adopted eight-dot cells for general use; for example, it adds a dot below each letter to derive its capital variant. Form Braille was the first writing system with binary encoding. The system as devised by Braille consists of two parts: a character encoding that maps characters of the French alphabet to tuples of six bits (the dots), and the physical representation of those six-bit characters as raised dots in a braille cell. Within an individual cell, the dot positions are arranged in two columns of three positions. A raised dot can appear in any of the six positions, producing sixty-four (2⁶) possible patterns, including one in which there are no raised dots. For reference purposes, a pattern is commonly described by listing the positions where dots are raised, the positions being universally numbered, from top to bottom, as 1 to 3 on the left and 4 to 6 on the right. For example, dot pattern 1-3-4 describes a cell with three dots raised, at the top and bottom in the left column and at the top of the right column: that is, the letter m. 
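The dot-numbering convention just described maps directly onto the Unicode Braille Patterns block, where dot n is bit n−1 above U+2800; a minimal sketch of the notation in both directions:

```python
# Sketch: converting "1-3-4"-style dot-position notation to and from
# Unicode braille characters (dot n is bit n-1 of the offset from U+2800).

def from_positions(spec):
    """Turn a dash-separated list of raised dot positions into a braille character."""
    return chr(0x2800 + sum(1 << (int(d) - 1) for d in spec.split('-')))

def to_positions(ch):
    """Inverse: list the raised dot positions of a braille character."""
    mask = ord(ch) - 0x2800
    return '-'.join(str(d) for d in range(1, 9) if mask & (1 << (d - 1)))
```

As in the text, pattern 1-3-4 comes out as the letter m, and the two functions are inverses of each other.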
The lines of horizontal Braille text are separated by a space, much like visible printed text, so that the dots of one line can be differentiated from the braille text above and below. Different assignments of braille codes (or code pages) are used to map the character sets of different printed scripts to the six-bit cells. Braille assignments have also been created for mathematical and musical notation. However, because the six-dot braille cell allows only 64 (2⁶) patterns, including space, the characters of a braille script commonly have multiple values, depending on their context. That is, character mapping between print and braille is not one-to-one. For example, the character corresponds in print to both the letter d and the digit 4. In addition to simple encoding, many braille alphabets use contractions to reduce the size of braille texts and to increase reading speed. (See Contracted braille.) Writing braille Braille may be produced by hand using a slate and stylus in which each dot is created from the back of the page, writing in mirror image, or it may be produced on a braille typewriter or Perkins Brailler, or an electronic Brailler or braille notetaker. The different tools that exist for writing braille allow the braille user to select the method that is best for a given task. For example, the slate and stylus is a portable writing tool, much like the pen and paper for the sighted. Errors can be erased using a braille eraser or can be overwritten with all six dots (). Interpoint refers to braille printing that is offset, so that the paper can be embossed on both sides, with the dots on one side appearing between the divots that form the dots on the other. Using a computer or other electronic device, Braille may be produced with a braille embosser (printer) or a refreshable braille display (screen). Eight-dot braille Braille has been extended to an 8-dot code, particularly for use with braille embossers and refreshable braille displays. 
In 8-dot braille the additional dots are added at the bottom of the cell, giving a matrix 4 dots high by 2 dots wide. The additional dots are given the numbers 7 (for the lower-left dot) and 8 (for the lower-right dot). Eight-dot braille has the advantages that the case of an individual letter is directly coded in the cell containing the letter and that all the printable ASCII characters can be represented in a single cell. All 256 (2⁸) possible combinations of 8 dots are encoded by the Unicode standard. Braille with six dots is frequently stored as Braille ASCII. Letters The first 25 braille letters, up through the first half of the 3rd decade, transcribe a–z (skipping w). In English Braille, the rest of that decade is rounded out with the ligatures and, for, of, the, and with. Omitting dot 3 from these forms the 4th decade, the ligatures ch, gh, sh, th, wh, ed, er, ou, ow and the letter w. (See English Braille.) Formatting Various formatting marks affect the values of the letters that follow them. They have no direct equivalent in print. The most important in English Braille are: That is, is read as capital 'A', and as the digit '1'. Punctuation Basic punctuation marks in English Braille include: is both the question mark and the opening quotation mark. Its reading depends on whether it occurs before a word or after. is used for both opening and closing parentheses. Its placement relative to spaces and other characters determines its interpretation. Punctuation varies from language to language. For example, French Braille uses for its question mark and swaps the quotation marks and parentheses (to and ); it uses the period () for the decimal point, as in print, and the decimal point () to mark capitalization. Contractions Braille contractions are words and affixes that are shortened so that they take up fewer cells. In English Braille, for example, the word afternoon is written with just three letters, , much like stenoscript. 
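The eight-dot extension reuses the same bit layout, with dots 7 and 8 as bits 6 and 7, so all 256 combinations fall inside the Unicode block U+2800–U+28FF. A small sketch; the dot-7-marks-a-capital convention shown is one scheme (as in Luxembourgish Braille, per the text above), not universal.

```python
# Sketch: eight-dot cells as bits 0-7 above U+2800 (dots 7 and 8 are bits 6 and 7).

def cell8(dots):
    """Render a set of dot numbers (1-8) as a Unicode braille character."""
    return chr(0x2800 + sum(1 << (d - 1) for d in dots))

# The Unicode standard encodes every one of the 2**8 = 256 combinations.
all_cells = [chr(cp) for cp in range(0x2800, 0x2900)]

# One convention (an assumption here): adding dot 7 below a letter marks a capital.
a_lower = cell8({1})     # the six-dot letter a
a_upper = cell8({1, 7})  # the same cell with dot 7 added
```

Because case fits in the same cell as the letter, no separate capital-sign cell is needed, which is the advantage the text describes.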
There are also several abbreviation marks that create what are effectively logograms. The most common of these is dot 5, which combines with the first letter of words. With the letter m, the resulting word is mother. There are also ligatures ("contracted" letters), which are single letters in braille but correspond to more than one letter in print. The letter and, for example, is used to write words with the sequence a-n-d in them, such as hand. Page dimensions Most braille embossers support between 34 and 40 cells per line, and 25 lines per page. A manually operated Perkins braille typewriter supports a maximum of 42 cells per line (its margins are adjustable), and typical paper allows 25 lines per page. A large interlining Stainsby has 36 cells per line and 18 lines per page. An A4-sized Marburg braille frame, which allows interpoint braille (dots on both sides of the page, offset so they do not interfere with each other), has 30 cells per line and 27 lines per page. Braille writing machine A Braille writing machine is a typewriter with six keys that allows the user to write braille on a regular hard copy page. The first Braille typewriter to gain general acceptance was invented by Frank Haven Hall (Superintendent of the Illinois School for the Blind), and was presented to the public in 1892. The Stainsby Brailler, developed by Henry Stainsby in 1903, is a mechanical writer with a sliding carriage that moves over an aluminium plate as it embosses Braille characters. An improved version was introduced around 1933. In 1951 David Abraham, a woodworking teacher at the Perkins School for the Blind, produced a more advanced Braille typewriter, the Perkins Brailler. Braille printers, or embossers, were first produced in the 1950s. In 1960 Robert Mann, a teacher at MIT, wrote DOTSYS, software that allowed automatic braille translation, and another group created an embossing device called the "M.I.T. Braillemboss". 
The Mitre Corporation team of Robert Gildea, Jonathan Millen, Reid Gerhart and Joseph Sullivan (now president of Duxbury Systems) developed DOTSYS III, the first braille translator written in a portable programming language. DOTSYS III was developed for the Atlanta Public Schools as a public domain program. In 1991 Ernest Bate developed the Mountbatten Brailler, an electronic machine used to type braille on braille paper, giving it a number of additional features such as word processing, audio feedback and embossing. This version was improved in 2008 with a quiet writer that had an erase key. In 2011 David S. Morgan produced the first SMART Brailler machine, with added text to speech function and allowed digital capture of data entered. Braille reading Braille is traditionally read in hardcopy form, such as with paper books written in braille, documents produced in paper braille (such as restaurant menus), and braille labels or public signage. It can also be read on a refreshable braille display either as a stand-alone electronic device or connected to a computer or smartphone. Refreshable braille displays convert what is visually shown on a computer or smartphone screen into braille through a series of pins that rise and fall to form braille symbols. Currently more than 1% of all printed books have been translated into hardcopy braille. The fastest braille readers apply a light touch and read braille with two hands, although reading braille with one hand is also possible. Although the finger can read only one braille character at a time, the brain chunks braille at a higher level, processing words a digraph, root or suffix at a time. The processing largely takes place in the visual cortex. Literacy Children who are blind miss out on fundamental parts of early and advanced education if not provided with the necessary tools, such as access to educational materials in braille. 
Children who are blind or visually impaired can begin learning foundational braille skills from a very young age to become fluent braille readers as they get older. Sighted children are naturally exposed to written language on signs, on TV and in the books they see. Blind children require the same early exposure to literacy, through access to braille-rich environments and opportunities to explore the world around them. Print-braille books, for example, present text in both print and braille and can be read by sighted parents to blind children (and vice versa), allowing blind children to develop an early love for reading even before formal reading instruction begins. Adults who experience vision loss later in life or who did not have the opportunity to learn braille when they were younger can also learn it. In most cases, adults who learn braille were already literate in print before vision loss, and so instruction focuses more on developing the tactile and motor skills needed to read braille. While different countries publish statistics on how many readers in a given organization request braille, these numbers only provide a partial picture of braille literacy statistics. For example, this data does not always survey the entire population of braille readers or include readers who are no longer in the school system (adults) or readers who request electronic braille materials. Regardless of the precise percentage of braille readers, there is consensus that braille should be provided to all those who benefit from it. U.S. braille literacy statistics In 1960, 50% of legally blind, school-age children were able to read braille in the U.S. According to the 2015 Annual Report from the American Printing House for the Blind, there were 61,739 legally blind students registered in the U.S. Of these, 8.6% (5,333) were registered as braille readers, 31% (19,109) as visual readers, 9.4% (5,795) as auditory readers, 17% (10,470) as pre-readers, and 34% (21,032) as non-readers. 
There are numerous factors that influence access to braille literacy, including school budget constraints, technology advancements such as screen-reader software, and different philosophical views over how blind children should be educated. A key turning point for braille literacy was the passage of the Rehabilitation Act of 1973, an act of Congress that moved thousands of children from specialized schools for the blind into mainstream public schools. Because only a small percentage of public schools could afford to train and hire braille-qualified teachers, braille literacy declined after the law took effect. Rates have since improved slightly, in part because of pressure from consumers and advocacy groups that has led 27 states to pass legislation mandating that children who are legally blind be given the opportunity to learn braille. In 1998 there were 57,425 legally blind students registered in the United States, but only 10% (5,461) of them used braille as their primary reading medium. Early Braille education is crucial to literacy for a blind or low-vision child. A study conducted in the state of Washington found that people who learned braille at an early age did just as well, if not better than their sighted peers in several areas, including vocabulary and comprehension. In the preliminary adult study, while evaluating the correlation between adult literacy skills and employment, it was found that 44% of the participants who had learned to read in braille were unemployed, compared to the 77% unemployment rate of those who had learned to read using print. Currently, among the estimated 85,000 blind adults in the United States, 90% of those who are braille-literate are employed. Among adults who do not know braille, only 33% are employed. 
Statistics suggest that braille reading proficiency provides an essential skill set that allows blind or low-vision children to compete with their sighted peers in a school environment and later in life as they enter the workforce. United Kingdom In Britain, out of the reported two million blind and low vision population, it is estimated that only around 18,000–20,000 people use braille. Regardless of the specific percentage, proponents point out the importance of increasing access to braille for all those who can benefit from it. Braille transcription Although it is possible to transcribe print by simply substituting the equivalent braille character for its printed equivalent, in English such a character-by-character transcription (known as uncontracted braille) is typically used by beginners or those who only
The singing and dancing competitions continued, with music composed with traditional instruments such as a nasal flute and ukulele. United Kingdom Within the UK, London has a large French contingent, and celebrates Bastille Day at various locations across the city including Battersea Park, Camden Town and Kentish Town. Live entertainment is performed at Canary Wharf, with weeklong performances of French theatre at the Lion and Unicorn Theatre in Kentish Town. Restaurants feature cabarets and special menus across the city, and other celebrations include garden parties and sports tournaments. There is also a large event at the Bankside and Borough Market, where there is live music, street performers, and traditional French games are played. United States The United States has over 20 cities that conduct annual celebrations of Bastille Day. The different cities celebrate with many French staples such as food, music, games, and sometimes the recreation of famous French landmarks. Northeastern States Baltimore, Maryland, has a large Bastille Day celebration each year at Petit Louis in the Roland Park area of Baltimore City. Boston has a celebration annually, hosted by the French Cultural Center for 40 years. The street festival occurs in Boston's Back Bay neighborhood, near the Cultural Center's headquarters. The celebration includes francophone musical performers, dancing, and French cuisine. New York City has numerous Bastille Day celebrations each July, including Bastille Day on 60th Street hosted by the French Institute Alliance Française between Fifth and Lexington Avenues on the Upper East Side of Manhattan, Bastille Day on Smith Street in Brooklyn, and Bastille Day in Tribeca. There is also the annual Bastille Day Ball, taking place since 1924. Philadelphia's Bastille Day, held at Eastern State Penitentiary, involves Marie Antoinette throwing locally manufactured Tastykakes at the Parisian militia, as well as a re-enactment of the storming of the Bastille. 
(This Philadelphia tradition ended in 2018.) In Newport, Rhode Island, the annual Bastille Day celebration is organized by the local chapter of the Alliance Française. It takes place at King Park in Newport at the monument memorializing the accomplishments of the General Comte de Rochambeau whose 6,000 to 7,000 French forces landed in Newport on 11 July 1780. Their assistance in the defeat of the English in the War of Independence is well documented and is demonstrable proof of the special relationship between France and the United States. In Washington D.C., food, music, and auction events are sponsored by the Embassy of France. There is also a French Festival within the city, where families can meet period entertainment groups set during the time of the French Revolution. Restaurants host parties serving traditional French food. Southern States In Dallas, Texas, the Bastille Day celebration, "Bastille On Bishop", began in 2010 and is held annually in the Bishop Arts District of the North Oak Cliff neighborhood, southwest of downtown just across the Trinity River. Dallas' French roots are tied to the short lived socialist Utopian community La Réunion, formed in 1855 and incorporated into the City of Dallas in 1860. Miami's celebration is organized by "French & Famous" in partnership with the French American Chamber of Commerce, the Union des Français de l'Etranger and many French brands. The event gathers over 1,000 attendees to celebrate "La Fête Nationale". The location and theme change every year. In 2017, the theme was "Guinguette Party" and attracted 1,200 francophiles at The River Yacht Club. New Orleans, Louisiana, has multiple celebrations, the largest in the historic French Quarter. In Austin, Texas, the Alliance Française d’Austin usually conducts a family-friendly Bastille Day party at the French Legation, the home of the French representative to the Republic of Texas from 1841 to 1845. 
Midwestern States Chicago, Illinois, has hosted a variety of Bastille Day celebrations in a number of locations in the city, including Navy Pier and Oz Park. The recent incarnations have been sponsored in part by the Chicago branch of the French-American Chamber of Commerce and by the French Consulate-General in Chicago. Milwaukee's four-day street festival begins with a "Storming of the Bastille" featuring a 43-foot replica of the Eiffel Tower. Minneapolis, Minnesota, has a celebration with wine, French food, pastries, a flea market, circus performers and bands. Also in the Twin Cities area, the local chapter of the Alliance Française has hosted an annual event for years at varying locations with a competition for the "Best Baguette of the Twin Cities." Montgomery, Ohio, has a celebration with wine, beer, local restaurants' fare, pastries, games and bands. St. Louis, Missouri, has annual festivals in the Soulard neighborhood, the former French village of Carondelet, Missouri, and in the Benton Park neighborhood. The Chatillon-DeMenil Mansion in the Benton Park neighborhood holds an annual Bastille Day festival with reenactments of the beheading of Marie Antoinette and Louis XVI, traditional dancing, and artillery demonstrations. Carondelet also began hosting an annual saloon crawl to celebrate Bastille Day in 2017. The Soulard neighborhood in St. Louis, Missouri, celebrates its unique French heritage with special events including a parade, which honors the peasants who rejected the monarchy. The parade includes a 'gathering of the mob,' a walking and golf cart parade, and a mock beheading of the King and Queen. Western States Portland, Oregon, has celebrated Bastille Day with crowds up to 8,000, in public festivals at various public parks, since 2001. The event is coordinated by the Alliance Française of Portland. Seattle's Bastille Day celebration, held at the Seattle Center, involves performances, picnics, wine and shopping. 
Sacramento, California, conducts annual "waiter races" in the midtown restaurant and shopping district, with a street festival. One-time celebrations 1979: A concert with Jean-Michel Jarre on the Place de la Concorde in Paris was the first concert to have one million attendees. 1989: France celebrated the 200th anniversary of the French Revolution, notably with a monumental show on the Champs-Élysées in Paris, directed by French designer Jean-Paul Goude. President François Mitterrand acted as a host for invited world leaders. 1990: A concert with Jarre was held at La Défense near Paris. 1994: The military parade was opened by Eurocorps, a newly created European army unit including German soldiers. This was the first time German troops paraded in France since 1944, as a symbol of Franco-German reconciliation. 1995: A concert with Jarre was held at the Eiffel Tower in Paris. 1998: Two days after the French football team became World Cup champions, huge celebrations took place nationwide. 2004: To commemorate the centenary of the Entente Cordiale, the British led the military parade with the Red Arrows flying overhead. 2007: To commemorate the 50th anniversary of the Treaty of Rome, the military parade was led by troops from the 26 other EU member states, all marching at the French time. 2014: To commemorate the 100th anniversary of the outbreak of the First World War, representatives of 80 countries who fought during this conflict were invited to the ceremony. The military parade was opened by 76 flags representing each of these countries. 2017: To commemorate the 100th anniversary of the United States of America's entry into the First World War, president of France Emmanuel Macron invited then-U.S. president Donald Trump to celebrate a centuries-long transatlantic tie between the two countries. Trump was reported to have admired [...]

[...] government, to have "the Republic adopt 14 July as the day of an annual national festival". 
There were many disputes over which date should be remembered as the national holiday, including 4 August (the commemoration of the end of the feudal system), 5 May (when the Estates-General first assembled), 27 July (the fall of Robespierre), and 21 January (the date of Louis XVI's execution). The government decided that the date of the holiday would be 14 July, but it was still somewhat problematic. The events of 14 July 1789 were illegal under the previous government, which contradicted the Third Republic's need to establish legal legitimacy. French politicians also did not want the sole foundation of their national holiday to be rooted in a day of bloodshed and class hatred, as the day of the storming of the Bastille was. Instead, they established the holiday as a dual celebration of the Fête de la Fédération, a festival celebrating the first anniversary of 14 July 1789, and the storming of the Bastille. The Assembly voted in favor of the proposal on 21 May and 8 June, and the law was approved on 27 and 29 June. The law was made official on 6 July 1880. In the debate leading up to the adoption of the holiday, Senator Henri Martin, who wrote the National Day law, addressed the chamber on 29 June 1880. Bastille Day military parade The Bastille Day military parade is the French military parade that has been held in the morning each year in Paris since 1880. While previously held elsewhere within or near the capital city, since 1918 it has been held on the Champs-Élysées, with the participation of the Allies as represented in the Versailles Peace Conference, and with the exception of the period of German occupation from 1940 to 1944 (when the ceremony took place in London under the command of General Charles de Gaulle) and 2020, when the COVID-19 pandemic forced its cancellation. 
The parade passes down the Champs-Élysées from the Arc de Triomphe to the Place de la Concorde, where the President of the French Republic, his government and foreign ambassadors to France stand. This is a popular event in France, broadcast on French TV, and is the oldest and largest regular military parade in Europe. In some years, invited detachments of foreign troops take part in the parade and foreign statesmen attend as guests. Smaller military parades are held in French garrison towns, including Toulon and Belfort, with local troops. Bastille Day celebrations in other countries Belgium Liège has celebrated Bastille Day each year since the end of the First World War, as the city was decorated with the Légion d'Honneur for its unexpected resistance during the Battle of Liège. The city also hosts a fireworks show outside of Congress Hall. Specifically in Liège, celebrations of Bastille Day have been known to be bigger than the celebrations of the Belgian national holiday. Around 35,000 people gather to celebrate Bastille Day. There is a traditional festival dance of the French consul that draws large crowds, and many unofficial events over the city celebrate the relationship between France and the city of Liège. Canada Vancouver, British Columbia, holds a celebration featuring exhibits, food and entertainment. A Bastille Day festival is also celebrated in Toronto, Ontario. The festival is organized by the French community in Toronto and sponsored by the Consulate General of France. The celebration includes music, performances, sport competitions, and a French Market. At the end of the festival, there is also a traditional French bal populaire. Czech Republic Since 2008, Prague has hosted a French market ("Fourteenth of July Market") offering traditional French food and wine as well as music. The market takes place on Kampa Island, usually between 11 and 14 July. 
It also marks the handover of the EU presidency from France to the Czech Republic. Traditional selections of French produce, including cheese, wine, meat, bread and pastries, are provided by the market. Throughout the event, live music is played in the evenings, with lanterns lighting up the square at night. Denmark The amusement park Tivoli celebrates Bastille Day. Hungary Budapest's two-day celebration is sponsored by the Institut de France. The festival is hosted along the Danube River, with streets filled with music and dancing. There are also local markets dedicated to French foods and wine, mixed with some traditional Hungarian specialties. At the end of the celebration, a fireworks show is held on the river banks. India Bastille Day is celebrated with great festivity in Pondicherry, a former French colony, every year. On the eve of Bastille Day, retired soldiers parade and celebrate the day with the Indian and French national anthems, honoring the French soldiers who were killed in the battles. Throughout the celebration, French and Indian flags fly alongside each other, reflecting the mingling of cultures and heritages. Ireland The Embassy of France in Ireland organizes several events around Dublin, Cork and Limerick for Bastille Day, including evenings of French music and tasting of French food. Many members of the French community in Ireland take part in the festivities. Events in Dublin include live entertainment, speciality menus on French cuisine, and screenings of popular French films. New Zealand The Auckland suburb of Remuera hosts an annual French-themed Bastille Day street festival. Visitors enjoy mimes, dancers, music, as well as French foods and drinks. The budding relationship between the two countries, with the establishment of a Maori garden in France and the exchange of their analyses of cave art, led to an official reception at the Residence of France. 
There is also an event in Wellington for the French community held at the Residence of France. South Africa Franschhoek's weekend festival has been celebrated since 1993. (Franschhoek, or 'French Corner,' is situated in the Western Cape.) As South Africa's gourmet capital, Franschhoek provides French food, wine and other entertainment throughout the festival. The French Consulate in South Africa also celebrates the national holiday with a party for the French community. Activities also include dressing up in different items of French clothing. French Polynesia In the course of its colonial expansion, France annexed a large portion of what is now French Polynesia. Under French rule, Tahitians were permitted to participate in sport, singing, and dancing competitions one day a year: Bastille Day. The single day of celebration evolved into the major Heiva i Tahiti festival in Papeete, Tahiti, where traditional events such as canoe races, tattooing, and fire walks are held. 
Another opinion is that the 448-bit limit is present to ensure that every bit of every subkey depends on every bit of the key, as the last four values of the P-array don't affect every bit of the ciphertext. This point should be taken into consideration for implementations with a different number of rounds, as even though it increases security against an exhaustive attack, it weakens the security guaranteed by the algorithm. And given the slow initialization of the cipher with each change of key, it gains natural protection against brute-force attacks, which doesn't really justify key sizes longer than 448 bits.

Blowfish in pseudocode

uint32_t P[18];
uint32_t S[4][256];

uint32_t f(uint32_t x) {
    uint32_t h = S[0][x >> 24] + S[1][x >> 16 & 0xff];
    return (h ^ S[2][x >> 8 & 0xff]) + S[3][x & 0xff];
}

void blowfish_encrypt(uint32_t *L, uint32_t *R) {
    for (short r = 0; r < 16; r++) {
        *L = *L ^ P[r];
        *R = f(*L) ^ *R;
        swap(L, R);
    }
    swap(L, R);
    *R = *R ^ P[16];
    *L = *L ^ P[17];
}

void blowfish_decrypt(uint32_t *L, uint32_t *R) {
    for (short r = 17; r > 1; r--) {
        *L = *L ^ P[r];
        *R = f(*L) ^ *R;
        swap(L, R);
    }
    swap(L, R);
    *R = *R ^ P[1];
    *L = *L ^ P[0];
}

// ...
// initializing the P-array and S-boxes with values derived from pi; omitted in the example (you can find them below)
// ...

{
    /* initialize P box with key */
    uint32_t k;
    for (short i = 0, p = 0; i < 18; i++) {
        k = 0x00;
        for (short j = 0; j < 4; j++) {
            k = (k << 8) | (uint8_t) key[p];
            p = (p + 1) % key_len;
        }
        P[i] ^= k;
    }
    /* blowfish key expansion (521 iterations) */
    uint32_t l = 0x00, r = 0x00;
    for (short i = 0; i < 18; i += 2) {
        blowfish_encrypt(&l, &r);
        P[i] = l;
        P[i+1] = r;
    }
    for (short i = 0; i < 4; i++) {
        for (short j = 0; j < 256; j += 2) {
            blowfish_encrypt(&l, &r);
            S[i][j] = l;
            S[i][j+1] = r;
        }
    }
}

Blowfish in practice

Blowfish is a fast block cipher, except when changing keys. Each new key requires the pre-processing equivalent of encrypting about 4 kilobytes of text, which is very slow compared to other block ciphers. 
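The encrypt/decrypt pair above relies on the Feistel structure: decryption is the same network with the P-array consumed in reverse order. A minimal sketch of that symmetry, using a toy round function and arbitrary round constants rather than the real pi-derived P-array and S-boxes (all names here are illustrative, not the reference implementation):

```python
MASK = 0xFFFFFFFF

def toy_f(x: int) -> int:
    # Stand-in for Blowfish's S-box-based f(); any fixed 32-bit function
    # works for demonstrating the Feistel symmetry.
    return ((x * 2654435761) ^ (x >> 16)) & MASK

def feistel_encrypt(L: int, R: int, P: list[int]) -> tuple[int, int]:
    # Mirrors blowfish_encrypt: 16 rounds with P[0..15], then output whitening.
    for r in range(16):
        L ^= P[r]
        R ^= toy_f(L)
        L, R = R, L
    L, R = R, L          # undo the extra swap after the last round
    R ^= P[16]
    L ^= P[17]
    return L, R

def feistel_decrypt(L: int, R: int, P: list[int]) -> tuple[int, int]:
    # Identical network, P-array applied in reverse: P[17..2], then P[1], P[0].
    for r in range(17, 1, -1):
        L ^= P[r]
        R ^= toy_f(L)
        L, R = R, L
    L, R = R, L
    R ^= P[1]
    L ^= P[0]
    return L, R
```

Encrypting any 64-bit block (as two 32-bit halves) and then decrypting with the same P-array returns the original pair, which is why Blowfish needs no separate decryption logic beyond the reversed key order.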
This prevents its use in certain applications, but is not a problem in others. In one application Blowfish's slow key changing is actually a benefit: the password-hashing method (crypt $2, i.e. bcrypt) used in OpenBSD uses an algorithm derived from Blowfish that makes use of the slow key schedule; the idea is that the extra computational effort required gives protection against dictionary attacks. See key stretching. Blowfish has a memory footprint of just over 4 kilobytes of RAM. This constraint is not a problem even for older desktop and laptop computers, though it does prevent use in the smallest embedded systems such as early smartcards. Blowfish was one of the first secure block ciphers not subject to any patents and therefore freely available for anyone to use. This benefit has contributed to its popularity in cryptographic software.

[...] all the subkeys, about 4 KB of data is processed. Because the P-array is 576 bits long, and the key bytes are XORed through all these 576 bits during the initialization, many implementations support key sizes up to 576 bits. The reason for that is a discrepancy between the original Blowfish description, which uses 448-bit keys, and its reference implementation, which uses 576-bit keys. The test vectors for verifying third-party implementations were also produced with 576-bit keys. When asked which Blowfish version is the correct one, Bruce Schneier answered: "The test vectors should be used to determine the one true Blowfish". 
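The point about 576-bit keys follows directly from the first loop of the key schedule: key bytes are cycled into the eighteen 32-bit P-entries (18 × 32 = 576 bits), so any key longer than 72 bytes would wrap onto itself. A small sketch of just that folding step (the helper name is hypothetical, and this is not the full key expansion):

```python
def fold_key_into_p(key: bytes, p_array: list[int]) -> list[int]:
    # XOR the key, repeated cyclically byte-by-byte, into 18 32-bit words,
    # exactly as the first loop of the key schedule does.
    out, pos = [], 0
    for entry in p_array:
        k = 0
        for _ in range(4):
            k = ((k << 8) | key[pos]) & 0xFFFFFFFF
            pos = (pos + 1) % len(key)
        out.append(entry ^ k)
    return out
```

A 72-byte (576-bit) key contributes each of its bytes exactly once across the P-array; a longer key would cycle past 72 bytes and overlap its own earlier bytes, which is why implementations cap the key there.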
Seats and students of a classroom In a classroom there are a certain number of seats. A bunch of students enter the room and the instructor asks them to be seated. After a quick look around the room, the instructor declares that there is a bijection between the set of students and the set of seats, where each student is paired with the seat they are sitting in. What the instructor observed in order to reach this conclusion was that: Every student was in a seat (there was no one standing), No student was in more than one seat, Every seat had someone sitting there (there were no empty seats), and No seat had more than one student in it. The instructor was able to conclude that there were just as many seats as there were students, without having to count either set. More mathematical examples and some non-examples For any set X, the identity function 1X: X → X, 1X(x) = x is bijective. The function f: R → R, f(x) = 2x + 1 is bijective, since for each y there is a unique x = (y − 1)/2 such that f(x) = y. More generally, any linear function over the reals, f: R → R, f(x) = ax + b (where a is non-zero) is a bijection. Each real number y is obtained from (or paired with) the real number x = (y − b)/a. The function f: R → (−π/2, π/2), given by f(x) = arctan(x) is bijective, since each real number x is paired with exactly one angle y in the interval (−π/2, π/2) so that tan(y) = x (that is, y = arctan(x)). If the codomain (−π/2, π/2) were made larger to include an integer multiple of π/2, then this function would no longer be onto (surjective), since there is no real number which could be paired with the multiple of π/2 by this arctan function. The exponential function, g: R → R, g(x) = e^x, is not bijective: for instance, there is no x in R such that g(x) = −1, showing that g is not onto (surjective). However, if the codomain is restricted to the positive real numbers, then g would be bijective; its inverse (see below) is the natural logarithm function ln. 
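For finite sets, the four observations the instructor made can be checked mechanically, and a pairing satisfying them can be inverted by turning each pair around. A minimal sketch (the function names are illustrative, not from any library):

```python
def is_bijection(pairs: set, X: set, Y: set) -> bool:
    # A pairing is a set of (x, y) tuples; the four bijection properties:
    xs = [x for x, _ in pairs]
    ys = [y for _, y in pairs]
    return (set(xs) == X                   # every element of X is paired
            and len(xs) == len(set(xs))    # no element of X paired twice
            and set(ys) == Y               # every element of Y is hit
            and len(ys) == len(set(ys)))   # no element of Y hit twice

def invert(pairs: set) -> set:
    # "Turning the arrows around": swap each pair to get the inverse relation.
    return {(y, x) for x, y in pairs}
```

For example, f(x) = 2x + 1 restricted to {0, 1, 2} gives the pairing {(0, 1), (1, 3), (2, 5)}, a bijection onto {1, 3, 5}; its inverse pairs each y with (y − 1)/2.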
The function h: R → R+, h(x) = x^2 is not bijective: for instance, h(−1) = h(1) = 1, showing that h is not one-to-one (injective). However, if the domain is restricted to the nonnegative real numbers, then h would be bijective; its inverse is the positive square root function. By the Cantor–Bernstein–Schröder theorem, given any two sets X and Y, and two injective functions f: X → Y and g: Y → X, there exists a bijective function h: X → Y. Inverses A bijection f with domain X (indicated by f: X → Y in functional notation) also defines a converse relation starting in Y and going to X (by turning the arrows around). The process of "turning the arrows around" for an arbitrary function does not, in general, yield a function, but properties (3) and (4) of a bijection say that this inverse relation is a function with domain Y. Moreover, properties (1) and (2) then say that this inverse function is a surjection and an injection, that is, the inverse function exists and is also a bijection. Functions that have inverse functions are said to be invertible; a function is invertible if and only if it is a bijection. If X and Y are finite sets, then the existence of a bijection means they have the same number of elements. For infinite sets, the picture is more complicated, leading to the concept of cardinal number—a way to distinguish the various sizes of infinite sets. A bijective function from a set to itself is also called a permutation, and the set of all permutations of a set forms the symmetric group. Bijective functions are essential to many areas of mathematics including the definitions of isomorphism, homeomorphism, diffeomorphism, permutation group, and projective map. Definition For a pairing between X and Y (where Y need not be different from X) to be a bijection, four properties must hold: each element of X must be paired with at least one element of Y, no element of X may be paired with more than one element of Y, each element of Y must be paired with at least one element of X, and no element of Y may be paired with more than one element of X. 
Satisfying properties (1) and (2) means that a pairing is a function with domain X. It is more common to see properties (1) and (2) written as a single statement: Every element of X is paired with exactly one element of Y. Functions which satisfy property (3) are said to be "onto Y" and are called surjections (or surjective functions). Functions which satisfy property (4) are said to be "one-to-one functions" and are called injections (or injective functions). With this terminology, a bijection is a function which is both a surjection and an injection, or using other words, a bijection is a function which is both "one-to-one" and "onto". Bijections are sometimes denoted by a two-headed rightwards arrow with tail (⤖), as in f : X ⤖ Y. This symbol is a combination of the two-headed rightwards arrow (↠), sometimes used to denote surjections, and the rightwards arrow with a barbed tail (↣), sometimes used to denote injections. Examples Batting line-up of a baseball or cricket team Consider the batting line-up of a baseball or cricket team (or any list of all the players of any sports team where every player holds a specific spot in a line-up). The set X will be the players on the team (of size nine in the case of baseball) and the set Y will be the positions in the batting order (1st, 2nd, 3rd, etc.) The "pairing" is given by which player is in what position in this order. Property (1) is satisfied since each player is somewhere in the list. Property (2) is satisfied since no player bats in two (or more) positions in the order. Property (3) says that for each position in the order, there is some player batting in that position and property (4) states that two or more players are never batting in the same position in the list. 
[...] binary function. Another example is that of inner products, or more generally functions of the form f(x, y) = x^T M y, where x, y are real-valued vectors of appropriate size and M is a matrix. If M is a positive definite matrix, this yields an inner product. Functions of two real variables Functions whose domain is a subset of R^2 are often also called functions of two variables even if their domain does not form a rectangle and thus the Cartesian product of two sets. Restrictions to ordinary functions In turn, one can also derive ordinary functions of one variable from a binary function. Given any element x of X, there is a function fx, or f(x, ·), from Y to Z, given by fx(y) := f(x, y). Similarly, given any element y of Y, there is a function fy, or f(·, y), from X to Z, given by fy(x) := f(x, y). In computer science, this identification between a function from X × Y to Z and a function from X to Z^Y, where Z^Y is the set of all functions from Y to Z, is called currying. Generalisations The various concepts relating to functions can also be generalised to binary functions. For example, the division example above is surjective (or onto) because every rational number may be expressed as a quotient of an integer and a natural number. This example is injective in each input separately, because the functions fx and fy are always injective. However, it's not injective in both variables simultaneously, because (for example) f(2, 4) = f(1, 2). One can also consider partial binary functions, which may be defined only for certain values of the inputs. For example, the division example above may also be interpreted as a partial binary function from Z and N to Q, where N is the set of all natural numbers, including zero. But this function is undefined when the second input is zero. A binary operation is a binary function where the sets X, Y, and Z are all equal; binary operations are often used to define algebraic structures. 
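Currying and the division example can be sketched directly; `curry` and `divide` below are illustrative names, not library functions:

```python
from fractions import Fraction

def curry(f):
    # Identify a binary function f : X × Y → Z with a function from X
    # into the set of functions Y → Z.
    return lambda x: (lambda y: f(x, y))

def divide(x: int, y: int) -> Fraction:
    # Partial binary function from Z and N to Q: undefined when y = 0.
    if y == 0:
        raise ValueError("undefined for y = 0")
    return Fraction(x, y)
```

Here curry(divide)(2) is the derived one-variable function fx with x = 2 fixed, and divide(2, 4) == divide(1, 2) illustrates why the function fails to be injective in both variables simultaneously.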
In linear algebra, a bilinear transformation is a binary function where the sets X, Y, and Z are all vector spaces and the derived functions fx and fy are all linear transformations. A bilinear transformation, like any binary function, can be interpreted as a function from X × Y to Z, but this function in general won't be linear. However, the bilinear transformation can also be interpreted as a single linear transformation from the tensor product X ⊗ Y to Z. Generalisations to ternary and other functions The concept of binary function generalises to ternary (or 3-ary) function, quaternary (or 4-ary) function, or more generally to n-ary function for any natural number n. A 0-ary function to Z is simply given by an element of Z. One can also define an A-ary function where A is any set; there is one input for each element of A. Category theory In category theory, n-ary functions generalise to n-ary morphisms in a multicategory. 
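The bilinearity claim is easy to check numerically for the inner-product-style maps f(x, y) = x^T M y mentioned earlier: the map is linear in each argument separately, yet not linear as a single function on X × Y. A sketch with a 2×2 matrix (illustrative example, not from the article):

```python
def bilinear(M):
    # f(x, y) = x^T M y, with vectors given as plain lists of numbers.
    def f(x, y):
        return sum(x[i] * M[i][j] * y[j]
                   for i in range(len(x)) for j in range(len(y)))
    return f

def scale(c, v):
    # Scalar multiple of a vector.
    return [c * vi for vi in v]
```

Scaling one argument scales the output (linearity in that slot), but scaling both arguments by 2 multiplies the output by 4, so f is not linear on X × Y; it instead factors through the tensor product.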
Frank and his gang pull him out of the car, and Frank violently kisses him all over his face with red lipstick, before savagely beating him unconscious. Jeffrey awakes the next morning, bruised and bloodied. Visiting the police station, Jeffrey realizes that Detective Williams' partner Tom Gordon is the Yellow Man, who has been murdering Frank's rival drug dealers and stealing confiscated narcotics from the evidence room for Frank to sell. After he and Sandy attend a party during which they profess their love for one another, they are pursued by a car which they assume belongs to Frank. As they arrive at Jeffrey's home, Sandy realizes the car belongs to her now ex-boyfriend Mike Shaw. After Mike threatens to beat Jeffrey for stealing his girlfriend, Dorothy appears on Jeffrey's porch naked, beaten and confused. Mike backs down as Jeffrey and Sandy whisk Dorothy to Sandy's house to summon medical attention. When Dorothy calls Jeffrey "my secret lover", a distraught Sandy slaps him for cheating on her. Jeffrey asks Sandy to tell her father everything, and Detective Williams then leads a police raid on Frank's headquarters, killing his men and crippling his criminal empire. Jeffrey returns alone to Dorothy's apartment, where he discovers her husband dead and the Yellow Man mortally wounded. As Jeffrey leaves the apartment, Frank arrives, sees him on the stairs and chases him back inside. Jeffrey uses the Yellow Man's walkie-talkie to lie about his precise location in the apartment and hides in a closet. When Frank arrives, Jeffrey ambushes and kills him with the Yellow Man's gun, moments before Sandy and Detective Williams arrive. Jeffrey and Sandy continue their relationship and Dorothy is reunited with her son. Cast Production Origin The film's story originated from three ideas that crystallized in the filmmaker's mind over a period of time starting as early as 1973. 
The first idea was only "a feeling" and the title Blue Velvet, Lynch told Cineaste in 1987. The second idea was an image of a severed human ear lying in a field. "I don't know why it had to be an ear. Except it needed to be an opening of a part of the body, a hole into something else ... The ear sits on the head and goes right into the mind so it felt perfect," Lynch remarked in a 1986 interview with The New York Times. The third idea was Bobby Vinton's classic rendition of the song "Blue Velvet" and "the mood that came with that song: a mood, a time, and things that were of that time." The scene in which Dorothy appears naked outside was inspired by a real-life experience Lynch had during childhood when he and his brother saw a naked woman walking down a neighborhood street at night. The experience was so traumatic to the young Lynch that it made him cry, and he had never forgotten it. Lynch eventually spent two years writing two drafts, which, he stated, were not very good. The problem with them, Lynch has said, was that "there was maybe all the unpleasantness in the film but nothing else. A lot was not there. And so it went away for a while." After completing The Elephant Man (1980), Lynch met producer Richard Roth over coffee. Roth had read and enjoyed Lynch's Ronnie Rocket script, but did not think it was something he wanted to produce. He asked Lynch if the filmmaker had any other scripts, but the director only had ideas. "I told him I had always wanted to sneak into a girl's room to watch her into the night and that, maybe, at one point or another, I would see something that would be the clue to a murder mystery. Roth loved the idea and asked me to write a treatment. I went home and thought of the ear in the field." Production was announced in August 1984. Lynch wrote two more drafts before he was satisfied with the script of the film. 
Conditions at this point were ideal for Lynch's film: he had made a deal with Dino De Laurentiis that gave him complete artistic freedom and final cut privileges, with the stipulation that the filmmaker take a cut in his salary and work with a budget of only $6 million. This deal meant that Blue Velvet was the smallest film on De Laurentiis's slate. Consequently, Lynch would be left mostly unsupervised during production. "After Dune I was down so far that anything was up! So it was just a euphoria. And when you work with that kind of feeling, you can take chances. You can experiment." Casting The cast of Blue Velvet included several then-relatively unknown actors. Lynch met Isabella Rossellini at a restaurant, and offered her the role of Dorothy Vallens. Rossellini had gained some exposure before the film for her Lancôme ads in the early 1980s and for being the daughter of actress Ingrid Bergman and Italian film director Roberto Rossellini. After completion of the film, during test screenings, ICM Partners—the agency representing Rossellini—immediately dropped her as a client. Furthermore, the nuns at the school in Rome that Rossellini attended in her youth called to say they were praying for her. Kyle MacLachlan had played the central role in Lynch's critical and commercial failure Dune (1984), a science fiction epic based on the novel of the same name. MacLachlan later became a recurring collaborator with Lynch, who remarked: "Kyle plays innocents who are interested in the mysteries of life. He's the person you trust enough to go into a strange world with." Dennis Hopper was the biggest "name" in the film, having starred in Easy Rider (1969). Hopper—said to be Lynch's third choice (Michael Ironside has stated that Frank was written with him in mind)—accepted the role, reportedly having exclaimed, "I've got to play Frank! I am Frank!" as Hopper confirmed in the Blue Velvet "making-of" documentary The Mysteries of Love, produced for the 2002 special edition.
Harry Dean Stanton and Steven Berkoff both turned down the role of Frank because of the violent content in the film. Laura Dern (then 18 years old) was cast after several already successful actresses, among them Molly Ringwald, had turned the role down. Shooting Principal photography of Blue Velvet began in August 1985 and was completed in November. The film was shot at EUE/Screen Gems studio in Wilmington, North Carolina, which also provided the exterior scenes of Lumberton. The scene with a raped and battered Dorothy proved to be particularly challenging. Several townspeople arrived to watch the filming with picnic baskets and rugs, against the wishes of Rossellini and Lynch. However, filming continued as normal, and by the time Lynch yelled cut, the townspeople had left. As a result, police told Lynch the production was no longer permitted to shoot in any public areas of Wilmington. The Carolina Apartments on 5th and Market St in downtown Wilmington served as the location central to the story, with the adjacent Kenan fountain featured prominently in many shots. The building is also the birthplace and deathplace of noted artist Claude Howell. The apartment building stands today, and the Kenan fountain was refurbished in 2020 after sustaining heavy damage during Hurricane Florence. Editing Lynch's original rough cut ran for approximately four hours. He was contractually obligated to deliver a two-hour movie by De Laurentiis and cut many small subplots and character scenes. He also made cuts at the request of the MPAA. For example, when Frank slaps Dorothy after the first rape scene, the audience was supposed to see Frank actually hitting her. Instead, the film cuts away to Jeffrey in the closet, wincing at what he has just seen. This cut was made to satisfy the MPAA's concerns about violence. Lynch thought that the change only made the scene more disturbing. In 2011, Lynch announced that footage from the deleted scenes, long thought lost, had been discovered.
The material was subsequently included on the Blu-ray Disc release of the film. The final cut of the film runs at just over two hours. Distribution Because the material was completely different from anything that would be considered mainstream at the time, De Laurentiis had to start his own company to distribute it. Interpretations Despite Blue Velvet's initial appearance as a mystery, the film operates on a number of thematic levels. The film owes a large debt to 1950s film noir, containing and exploring such conventions as the femme fatale (Dorothy Vallens), a seemingly unstoppable villain (Frank Booth), and the questionable moral outlook of the hero (Jeffrey Beaumont), as well as its unusual use of shadowy, sometimes dark cinematography. Blue Velvet represents and establishes Lynch's famous "askew vision", and introduces several common elements of Lynch's work, some of which would later become his trademarks, including distorted characters, a polarized world, and debilitating damage to the skull or brain. Perhaps the most significant Lynchian trademark in the film is the depiction of unearthing a dark underbelly in a seemingly idealized small town; Jeffrey even proclaims in the film that he is "seeing something that was always hidden", alluding to the plot's central idea. Lynch's characterization of films, symbols, and motifs has become well known, and his particular style, characterised largely in Blue Velvet for the first time, has been written about extensively using descriptions like "dreamlike", "ultraweird", "dark", and "oddball". Red curtains, which have since become a Lynch trademark, also show up in key scenes, specifically in Dorothy's apartment. The film has been compared to Alfred Hitchcock's Psycho (1960) because of its stark treatment of evil and mental illness. The premise of both films is curiosity, leading to an investigation that draws the lead characters into a hidden, voyeuristic underworld of crime.
The film's thematic framework hearkens back to Edgar Allan Poe, Henry James, and early gothic fiction, as well as films such as Shadow of a Doubt (1943) and The Night of the Hunter (1955) and the entire notion of film noir. Lynch has called it a "film about things that are hidden—within a small city and within people." Feminist psychoanalytic film theorist Laura Mulvey argues that Blue Velvet establishes a metaphorical Oedipal family—"the child", Jeffrey Beaumont, and his "parents", Frank Booth and Dorothy Vallens—through deliberate references to film noir and its underlying Oedipal theme. Michael Atkinson claims that the resulting violence in the film can be read as symbolic of domestic violence within real families. For instance, Frank's violent acts can be seen to reflect the different types of abuse within families, and the control he has over Dorothy might represent the hold an abusive husband has over his wife. He reads Jeffrey as an innocent youth who is both horrified by the violence inflicted by Frank and tempted by it as the means of possessing Dorothy for himself. Atkinson takes a Freudian approach to the film, considering it to be an expression of the traumatised innocence which characterises Lynch's work. He states, "Dorothy represents the sexual force of the mother [figure] because she is forbidden and because she becomes the object of the unhealthy, infantile impulses at work in Jeffrey's subconscious." Symbolism Symbolism is used heavily in Blue Velvet. The most consistent symbolism in the film is an insect motif introduced at the end of the first scene, when the camera zooms in on a well-kept suburban lawn until it unearths a swarming underground nest of bugs. This is generally recognized as a metaphor for the seedy underworld that Jeffrey will soon discover under the surface of his own suburban, Reaganesque paradise. The severed ear he finds is being overrun by black ants.
The bug motif is recurrent throughout the film, most notably in the bug-like gas mask that Frank wears, but also the excuse that Jeffrey uses to gain access to Dorothy's apartment: he claims to be an insect exterminator. One of Frank's sinister accomplices is also consistently identified through the yellow jacket he wears, possibly reminiscent of the name of a type of wasp. Finally, a robin eating a bug on a fence becomes a topic of discussion in the last scene of the film. The severed ear that Jeffrey discovers is also a key symbolic element, leading Jeffrey into danger. Indeed, just as Jeffrey's troubles begin, the audience is treated to a nightmarish sequence in which the camera zooms into the canal of the severed, decomposing ear. Soundtrack The Blue Velvet soundtrack was supervised by Angelo Badalamenti (who makes a brief cameo appearance as the pianist at the Slow Club where Dorothy performs). The soundtrack makes heavy use of vintage pop songs, such as Bobby Vinton's "Blue Velvet" and Roy Orbison's "In Dreams", juxtaposed with an orchestral score inspired by Shostakovich. During filming, Lynch placed speakers on set and in streets and played Shostakovich to set the mood he wanted to convey. The score alludes to Shostakovich's 15th Symphony, which Lynch had been listening to regularly while writing the screenplay. Lynch had originally opted to use "Song to the Siren" by This Mortal Coil during the scene in which Sandy and Jeffrey share a dance; however, he could not obtain the rights for the song at the time. He would go on to use this song in Lost Highway, eleven years later. Entertainment Weekly ranked the Blue Velvet soundtrack at number 100 on its list of the 100 Greatest Film Soundtracks. Critic John Alexander wrote, "the haunting soundtrack accompanies the title credits, then weaves through the narrative, accentuating the noir mood of the film."
Lynch worked with music composer Angelo Badalamenti for the first time in this film and asked him to write a score that had to be "like Shostakovich, be very Russian, but make it the most beautiful thing but make it dark and a little bit scary." Badalamenti's success with Blue Velvet would lead him to contribute to all of Lynch's future full-length films until Inland Empire as well as the cult television program Twin Peaks. Also included in the sound team was long-time Lynch collaborator Alan Splet, a sound editor and designer who had won an Academy Award for his work on The Black Stallion (1979), and been nominated for Never Cry Wolf (1983). Reception Box office Blue Velvet premiered in competition at the Montréal World Film Festival in August 1986, and at the Toronto Festival of Festivals on September 12, 1986, and a few days later in the United States. It debuted commercially in both countries on September 19, 1986, in 98 theatres across the United States. In its opening weekend, the film grossed a total of $789,409. It eventually expanded to another 15 theatres, and in the US and Canada grossed a total of $8,551,228. Blue Velvet was met with uproar during its audience reception, with lines formed around city blocks in New York City and Los Angeles. There were reports of mass walkouts and refund demands during its opening week. At a Chicago screening, a man
More formally, a binary operation is an operation of arity two. More specifically, a binary operation on a set is an operation whose two domains and the codomain are the same set. Examples include the familiar arithmetic operations of addition, subtraction, and multiplication. Other examples are readily found in different areas of mathematics, such as vector addition, matrix multiplication, and conjugation in groups. An operation of arity two that involves several sets is sometimes also called a binary operation. For example, scalar multiplication of vector spaces takes a scalar and a vector to produce a vector, and scalar product takes two vectors to produce a scalar. Such binary operations may be called simply binary functions. Binary operations are the keystone of most algebraic structures that are studied in algebra, in particular in semigroups, monoids, groups, rings, fields, and vector spaces. Terminology More precisely, a binary operation on a set S is a mapping f of the elements of the Cartesian product S × S to S: f : S × S → S. Because the result of performing the operation on a pair of elements of S is again an element of S, the operation is called a closed (or internal) binary operation on S (or sometimes expressed as having the property of closure). If f is not a function, but a partial function, then f is called a partial binary operation. For instance, division of real numbers is a partial binary operation, because one can't divide by zero: a/0 is undefined for every real number a. In both universal algebra and model theory, binary operations are required to be defined on all elements of S × S. Sometimes, especially in computer science, the term binary operation is used for any binary function. Properties and examples Typical examples of binary operations are the addition (+) and multiplication (×) of numbers and matrices as well as composition of functions on a single set.
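The distinction between a closed (total) binary operation and a partial one can be illustrated with a short Python sketch. This is not part of the original article; the function names are illustrative only, and the rationals stand in for the set S:

```python
from fractions import Fraction

# A binary operation on a set S is a map f: S x S -> S.
# On the rationals, addition is closed: the sum of two
# rationals is again a rational, so the result stays in S.
def add(a: Fraction, b: Fraction) -> Fraction:
    return a + b

# Division is only a *partial* binary operation on the
# rationals, because a/0 is undefined for every a.
def divide(a: Fraction, b: Fraction) -> Fraction:
    if b == 0:
        raise ZeroDivisionError("a/0 is undefined; division is partial")
    return a / b

assert add(Fraction(1, 2), Fraction(1, 3)) == Fraction(5, 6)
assert divide(Fraction(3), Fraction(2)) == Fraction(3, 2)
```

Restricting a partial operation to a smaller domain (here, pairs with a nonzero second argument) is the usual way to recover a total binary operation, which is why division is often defined on the nonzero rationals.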
For instance: On the set of real numbers R, f(a, b) = a + b is a binary operation since the sum of two real numbers is a real number. On the set of natural numbers N, f(a, b) = a + b is a binary operation since the sum of two natural numbers is a natural number. This is a different binary operation than the previous one since the sets are different. On the set M(2, R) of matrices with real entries, f(A, B) = A + B is a binary operation since the sum of two such matrices is a matrix. On the set M(2, R) of matrices with real entries, f(A, B) = AB is a binary operation since the product of two such matrices is a matrix. For a given set C, let S be the set of all functions h : C → C. Define f : S × S → S by f(h1, h2) = h1 ∘ h2 for all h1, h2 in S, the composition of the two functions h1 and h2 in S. Then f is a binary operation since the composition of the two functions is again a function on the set C (that is, a member of S). Many binary operations of interest in both algebra and formal logic are commutative, satisfying f(a, b) = f(b, a) for all elements a and b in S, or associative, satisfying f(f(a, b), c) = f(a, f(b, c)) for all a, b, and c in S. Many also have identity elements and inverse elements. The first three examples above are commutative and all of the above examples are associative. On the set of real numbers R, subtraction, that is, f(a, b) = a − b, is a binary operation which is not commutative since, in general, a − b ≠ b − a. It is also not associative, since, in general, (a − b) − c ≠ a − (b − c); for instance, (1 − 2) − 3 = −4 but 1 − (2 − 3) = 2. On the set of natural numbers N, the binary operation exponentiation, f(a, b) = a^b, is not commutative since, in general, a^b ≠ b^a (cf. the equation x^y = y^x), and is also not associative since f(f(a, b), c) ≠ f(a, f(b, c)). For instance, with a = 2, b = 3, and c = 2, f(f(2, 3), 2) = f(8, 2) = 64, but f(2, f(3, 2)) = f(2, 9) = 512. By changing the set N to the set of integers Z, this binary operation becomes a partial binary operation since it is now undefined when a = 0 and b is any negative integer. For either set, this operation has a right identity (which is 1) since f(a, 1) = a for all a in the set, which is not an identity (two sided identity) since f(1, b) ≠ b in general.
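Because these properties are equational, they can be checked mechanically over any finite sample of a set. The Python sketch below (not from the article; the helper names are illustrative) brute-forces commutativity and associativity; a failed check exhibits a genuine counterexample, while a passed check over a finite sample is of course not a proof for an infinite set like N:

```python
from itertools import product

def is_commutative(op, elems):
    # f(a, b) == f(b, a) for every pair in the sample
    return all(op(a, b) == op(b, a) for a, b in product(elems, repeat=2))

def is_associative(op, elems):
    # f(f(a, b), c) == f(a, f(b, c)) for every triple in the sample
    return all(op(op(a, b), c) == op(a, op(b, c))
               for a, b, c in product(elems, repeat=3))

sample = range(8)  # a small finite sample of N, for illustration only

assert is_commutative(lambda a, b: a + b, sample)
assert is_associative(lambda a, b: a + b, sample)

# Subtraction is neither commutative nor associative:
assert not is_commutative(lambda a, b: a - b, sample)
assert not is_associative(lambda a, b: a - b, sample)

# Exponentiation is not associative: (2**3)**2 != 2**(3**2)
assert (2 ** 3) ** 2 == 64
assert 2 ** (3 ** 2) == 512

# 1 is a right identity (a**1 == a) but not a left one (1**a != a):
assert all(a ** 1 == a for a in sample)
assert not all(1 ** a == a for a in sample)
```

The same brute-force pattern extends directly to checking identity and inverse elements on small finite structures such as Cayley tables.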
Division (/), a partial binary operation on the set of real or rational numbers, is not commutative or associative. Tetration (↑↑), as a binary operation on the natural numbers, is not commutative or associative and has no identity element. Notation Binary operations are often written using infix notation such as a ∗ b, a + b, or ab (by juxtaposition with no symbol) rather than by functional notation of the form f(a, b). Powers are usually also written without operator, but with the second argument as superscript. Binary operations are sometimes written using prefix or (more frequently) postfix notation, both of which dispense with parentheses. They are also called, respectively, Polish notation and reverse Polish notation. Pair and tuple A binary operation, ab, depends on the ordered pair (a, b) and so (ab)c (where the parentheses here mean first operate on the ordered pair (a, b) and then operate on the result of that using the ordered pair ((ab), c)) depends in general on the ordered pair ((a, b), c). Thus, for the general, non-associative case, binary operations can be represented with binary trees. However: If the operation is associative, (ab)c = a(bc), then the value of (ab)c depends only on the tuple (a, b, c). If the operation is commutative, ab = ba, then the value of (ab)c depends only on {{a, b}, c}, where braces indicate multisets. If the operation is both associative and commutative then the value of (ab)c depends only on the multiset {a, b, c}. If the operation is associative, commutative, and idempotent, aa = a, then the value of (ab)c depends only on the set {a, b, c}. Binary operations as ternary relations A binary operation f on a set
are the best known examples in the Anglophone world, but people have played bagpipes for centuries throughout large parts of Europe, Northern Africa, Western Asia, around the Persian Gulf and northern parts of South Asia. The term bagpipe is equally correct in the singular or the plural, though pipers usually refer to the bagpipes as "the pipes", "a set of pipes" or "a stand of pipes".

Construction
A set of bagpipes minimally consists of an air supply, a bag, a chanter, and usually at least one drone. Many bagpipes have more than one drone (and, sometimes, more than one chanter) in various combinations, held in place in stocks—sockets that fasten the various pipes to the bag.

Air supply
The most common method of supplying air to the bag is through blowing into a blowpipe or blowstick. In some pipes the player must cover the tip of the blowpipe with their tongue while inhaling, but most blowpipes have a non-return valve that eliminates this need. In recent times, many devices have appeared that assist in creating a clean airflow to the pipes and in collecting condensation. The use of a bellows to supply air is an innovation dating from the 16th or 17th century. In these pipes, sometimes called "cauld wind pipes", air is not heated or moistened by the player's breathing, so bellows-driven bagpipes can use more refined or delicate reeds. Such pipes include the Irish uilleann pipes; the border or Lowland pipes, Scottish smallpipes, Northumbrian smallpipes and pastoral pipes in Britain; the musette de cour, the musette bechonnet and the cabrette in France; and the Dudy wielkopolskie, koziol bialy and koziol czarny in Poland.

Bag
The bag is an airtight reservoir that holds air and regulates its flow via arm pressure, allowing the player to maintain continuous, even sound. The player keeps the bag inflated by blowing air into it through a blowpipe or by pumping air into it with a bellows. Materials used for bags vary widely, but the most common are the skins of local animals such as goats, dogs, sheep, and cows. More recently, bags made of synthetic materials including Gore-Tex have become much more common. Synthetic bags have zips that allow the player to fit a more effective moisture trap to the inside of the bag. However, synthetic bags carry a risk of colonisation by fungal spores, and the associated danger of lung infection, because they require less cleaning than do bags made from natural substances. Bags cut from larger materials are usually saddle-stitched, with an extra strip folded over the seam and stitched (for skin bags) or glued (for synthetic bags) to reduce leaks. Holes are then cut to accommodate the stocks. In the case of bags made from largely intact animal skins, the stocks are typically tied into the points where the limbs and the head joined the body of the whole animal, a construction technique common in Central Europe.

Chanter
The chanter is the melody pipe, played with two hands. All bagpipes have at least one chanter; some pipes have two chanters, particularly those in North Africa, in the Balkans, and in Southwest Asia. A chanter can be bored internally so that the inside walls are parallel (or "cylindrical") for its full length, or it can be bored in a conical shape. The chanter is usually open-ended, so there is no easy way for the player to stop the pipe from sounding. Thus most bagpipes share a constant legato sound with no rests in the music. Primarily because of this inability to stop playing, technical movements are made to break up notes and to create the illusion of articulation and accents. Because of their importance, these embellishments (or "ornaments") are often highly technical systems specific to each bagpipe, and take many years of study to master. A few bagpipes (such as the musette de cour, the uilleann pipes, the Northumbrian smallpipes, the piva and the left chanter of the surdelina) have closed ends or stop the end on the player's leg, so that when the player "closes" (covers all the holes), the chanter becomes silent. A practice chanter is a chanter without bag or drones, allowing a player to practice the instrument quietly and with no variables other than playing the chanter. The term chanter is derived from the Latin cantare, "to sing", much like the modern French word chanteur.

Chanter reed
The note from the chanter is produced by a reed installed at its top. The reed may be a single reed (with one vibrating tongue) or a double reed (of two pieces that vibrate against each other). Double reeds are used with both conical- and parallel-bored chanters, while single reeds are generally (although not exclusively) limited to parallel-bored chanters. In general, double-reed chanters are found in pipes of Western Europe, while single-reed chanters appear in most other regions.

Drone
Most bagpipes have at least one drone, a pipe that generally is not fingered but rather produces a constant harmonizing note throughout play (usually the tonic note of the chanter). Exceptions are generally those pipes that have a double chanter instead. A drone is most commonly a cylindrically bored tube with a single reed, although drones with double reeds exist. The drone is generally designed in two or more parts with a sliding joint so that its pitch can be adjusted. Depending on the type of pipes, the drones may lie over the shoulder, across the arm opposite the bag, or may run parallel to the chanter. Some drones have a tuning screw, which effectively alters the length of the drone by opening a hole, allowing the drone to be tuned to two or more distinct pitches. The tuning screw may also shut off the drone altogether. In most types of pipes with one drone, it is pitched two octaves below the tonic of the chanter. Additional drones often add the octave below and then a drone consonant with the fifth of the chanter.

History
Possible ancient origins
The evidence for bagpipes prior to the 13th century AD is still uncertain, but several textual and visual clues have been suggested. The Oxford History of Music posits that a sculpture of bagpipes has been found on a Hittite slab at Euyuk in Anatolia, dated to 1000 BC. Another interpretation of this sculpture suggests that it instead depicts a pan flute played along with a friction drum. Several authors identify the ancient Greek askaulos (ἀσκός askos – wine-skin, αὐλός aulos – reed pipe) with the bagpipe. In the 2nd century AD, Suetonius described the Roman emperor Nero as a player of the tibia utricularis. Dio Chrysostom wrote in the 1st century of a contemporary sovereign (possibly Nero) who could play a pipe (tibia, Roman reedpipes similar to Greek and Etruscan instruments) with his mouth as well as by tucking a bladder beneath his armpit. Vereno suggests that such instruments, rather than being seen as an independent class, were understood as variants on mouth-blown instruments that used a bag as an alternative blowing aid, and that it was not until drones were added in the European Medieval era that bagpipes were seen as a distinct class.

Spread and development in Europe
In the early part of the second millennium, representations of bagpipes began to appear with frequency in Western European art and iconography. The Cantigas de Santa Maria, written in Galician-Portuguese and compiled in Castile in the mid-13th century, depicts several types of bagpipes. Several illustrations of bagpipes also appear in the Chronique dite de Baudoin d'Avesnes, a 13th-century manuscript of northern French origin. Although evidence of bagpipes in the British Isles prior to the 14th century is contested, they are explicitly mentioned in The Canterbury Tales (written around 1380). Bagpipes were also frequent subjects for carvers of wooden choir stalls in the late 15th and early 16th century throughout Europe, sometimes with animal musicians. Actual specimens of bagpipes from before the 18th century are extremely rare; however, a substantial number of paintings, carvings, engravings and manuscript illuminations survive. These artifacts are clear evidence that bagpipes varied widely throughout Europe, and even within individual regions. Many examples of early folk bagpipes in continental Europe can be found in the paintings of Brueghel, Teniers, Jordaens, and Dürer. The earliest known artifact identified as a part of a bagpipe is a chanter found at Rostock in 1985 that has been dated to the late 14th century or the first quarter of the 15th century. The first clear reference to the use of the Scottish Highland bagpipes is from a French history that mentions their use at the Battle of Pinkie in 1547. George Buchanan (1506–82) claimed that bagpipes had replaced the trumpet on the battlefield. This period saw the creation of the ceòl mór (great music) of the bagpipe, which reflected its martial origins, with battle tunes, marches, gatherings, salutes and laments. The Highlands of the early 17th century saw the development of piping families including the MacCrimmons, MacArthurs, MacGregors and the Mackays of Gairloch. The first probable reference to the Irish bagpipe is from 1544, a mention attributing their use to Irish troops in Henry VIII's siege of Boulogne. Illustrations in the 1581 book The Image of Irelande by John Derricke clearly depict a bagpiper. Derricke's illustrations are considered to be reasonably faithful depictions of the attire and equipment of the English and Irish population of the 16th century. The "Battell" sequence from My Ladye Nevells Booke (1591) by William Byrd, which probably alludes to the Irish wars of 1578, contains a piece entitled The bagpipe: & the drone. In 1760, the first serious study of the Scottish Highland bagpipe and its music was attempted in Joseph MacDonald's Compleat Theory. A manuscript from the 1730s by a William Dixon of Northumberland contains music that fits the border pipes, a nine-note bellows-blown bagpipe with a chanter similar to that of the modern Great Highland bagpipe. However, the music in Dixon's manuscript varied greatly from modern Highland bagpipe tunes, consisting mostly of extended variation sets of common dance tunes. Some of the tunes in the Dixon manuscript correspond to those found in the early 19th-century manuscript sources of Northumbrian smallpipe tunes, notably the rare book of 50 tunes, many with variations, by John Peacock. As Western classical music developed, both in terms of musical sophistication and instrumental technology, bagpipes in many regions fell out of favour because of their limited range and function. This triggered a long, slow decline that continued, in most cases, into the 20th century. Extensive and documented collections of traditional bagpipes may be found at the Metropolitan Museum of Art in New York City, the International Bagpipe Museum in Gijón, Spain, the Pitt Rivers Museum in Oxford, England, the Morpeth Chantry Bagpipe Museum in Northumberland, and the Musical Instrument Museum in Phoenix, Arizona. The International Bagpipe Festival is held every two years in Strakonice, Czech Republic.

Recent history
During the expansion of the British Empire, spearheaded by British military forces that included Highland regiments, the Scottish Great Highland bagpipe became well-known worldwide. This surge in popularity was boosted by large numbers of pipers trained for military service in World War I and World War II. This coincided with a decline in the popularity of many traditional forms of bagpipe throughout Europe, which began to be displaced by instruments from the classical tradition and later by gramophone and radio. In the United Kingdom and Commonwealth nations such as Canada, New Zealand and Australia, the Great Highland bagpipe is commonly used in the military and is often played during formal ceremonies. Foreign militaries patterned after the British army have also adopted the Highland bagpipe, including those of Uganda, Sudan, India, Pakistan, Sri Lanka, Jordan, and Oman. Many police and fire services in Scotland, Canada, Australia, New Zealand, Hong Kong, and the United States have also adopted the tradition of fielding pipe bands. In recent years, often driven by revivals of native folk music and dance, many types of bagpipes have enjoyed a resurgence in popularity and, in many cases, instruments that had fallen into obscurity have become extremely popular.
a long running and successful club night held in Hastings and also at Heaven nightclub, London - both also called Bedrock. Bedrock Records has released many singles from artists such as Astro & Glyde, Brancaccio & Aisher, Steve Lawler, Shmuel Flash, Steve Porter, Sahar Z, Guy J, Henry Saiz, Stelios Vassiloudis, Electric Rescue, The Japanese Popstars and Jerry Bonham. Bedrock is also the name of two sub-labels: one called Bedrock Digital and one called Lost & Found belonging to Guy J. The first Bedrock album compiled and mixed by John Digweed was released in 1999, containing several tracks signed to the Bedrock label. In 2018, Digweed marked the 20th anniversary of the label with the release of Bedrock XX.

See also
List of electronic music record labels
(often referred to as the "vital principle") distinct from any found in non-living matter, and it was thought that only living beings could produce the molecules of life. In 1828, Friedrich Wöhler published a paper on his serendipitous urea synthesis from potassium cyanate and ammonium sulfate; some regarded that as a direct overthrow of vitalism and the establishment of organic chemistry. However, the Wöhler synthesis has sparked controversy, as some reject the death of vitalism at his hands. Since then, biochemistry has advanced, especially since the mid-20th century, with the development of new techniques such as chromatography, X-ray diffraction, dual polarisation interferometry, NMR spectroscopy, radioisotopic labeling, electron microscopy and molecular dynamics simulations. These techniques allowed for the discovery and detailed analysis of many molecules and metabolic pathways of the cell, such as glycolysis and the Krebs cycle (citric acid cycle), and led to an understanding of biochemistry on a molecular level. Another significant historic event in biochemistry is the discovery of the gene and its role in the transfer of information in the cell. In the 1950s, James D. Watson, Francis Crick, Rosalind Franklin and Maurice Wilkins were instrumental in solving DNA structure and suggesting its relationship with the genetic transfer of information. In 1958, George Beadle and Edward Tatum received the Nobel Prize for work in fungi showing that one gene produces one enzyme. In 1988, Colin Pitchfork was the first person convicted of murder with DNA evidence, which led to the growth of forensic science. More recently, Andrew Z. Fire and Craig C. Mello received the 2006 Nobel Prize for discovering the role of RNA interference (RNAi) in the silencing of gene expression.

Starting materials: the chemical elements of life
Around two dozen chemical elements are essential to various kinds of biological life.
Most rare elements on Earth are not needed by life (exceptions being selenium and iodine), while a few common ones (aluminum and titanium) are not used. Most organisms share element needs, but there are a few differences between plants and animals. For example, ocean algae use bromine, but land plants and animals do not seem to need any. All animals require sodium, but some plants do not. Plants need boron and silicon, but animals may not (or may need ultra-small amounts). Just six elements—carbon, hydrogen, nitrogen, oxygen, calcium and phosphorus—make up almost 99% of the mass of living cells, including those in the human body (see composition of the human body for a complete list). In addition to the six major elements that compose most of the human body, humans require smaller amounts of possibly 18 more.

Biomolecules
The four main classes of molecules in biochemistry (often called biomolecules) are carbohydrates, lipids, proteins, and nucleic acids. Many biological molecules are polymers: in this terminology, monomers are relatively small molecules that are linked together to create large macromolecules known as polymers. When monomers are linked together to synthesize a biological polymer, they undergo a process called dehydration synthesis. Different macromolecules can assemble in larger complexes, often needed for biological activity.

Carbohydrates
Two of the main functions of carbohydrates are energy storage and providing structure. One of the common sugars, glucose, is a carbohydrate, but not all carbohydrates are sugars. There are more carbohydrates on Earth than any other known type of biomolecule; they are used to store energy and genetic information, as well as play important roles in cell-to-cell interactions and communications. The simplest type of carbohydrate is a monosaccharide, which among other properties contains carbon, hydrogen, and oxygen, mostly in a ratio of 1:2:1 (generalized formula CnH2nOn, where n is at least 3).
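As a quick illustration of the CnH2nOn pattern, the following short Python sketch (the helper name and the rounded atomic masses are this example's own assumptions, not from the text) computes the molar mass of a simple sugar for a given n:

```python
# Approximate standard atomic masses in g/mol for the three elements of CnH2nOn.
ATOMIC_MASS = {"C": 12.011, "H": 1.008, "O": 15.999}

def monosaccharide_mass(n):
    """Molar mass of a monosaccharide with generalized formula CnH2nOn (n >= 3)."""
    if n < 3:
        raise ValueError("the simplest monosaccharides have at least 3 carbons")
    return (n * ATOMIC_MASS["C"]
            + 2 * n * ATOMIC_MASS["H"]
            + n * ATOMIC_MASS["O"])

# Glucose and fructose are both C6H12O6, so n = 6:
print(round(monosaccharide_mass(6), 2))  # 180.16 (g/mol)
```

The 1:2:1 ratio means a single parameter n fixes the whole empirical formula, which is why glucose and fructose share the same molar mass despite being different molecules.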
Glucose (C6H12O6) is one of the most important carbohydrates; others include fructose (C6H12O6), the sugar commonly associated with the sweet taste of fruits, and deoxyribose (C5H10O4), a component of DNA. A monosaccharide can switch between an acyclic (open-chain) form and a cyclic form. The open-chain form can be turned into a ring of carbon atoms bridged by an oxygen atom created from the carbonyl group of one end and the hydroxyl group of another. The cyclic molecule has a hemiacetal or hemiketal group, depending on whether the linear form was an aldose or a ketose. In these cyclic forms, the ring usually has 5 or 6 atoms. These forms are called furanoses and pyranoses, respectively—by analogy with furan and pyran, the simplest compounds with the same carbon-oxygen ring (although they lack the carbon-carbon double bonds of these two molecules). For example, the aldohexose glucose may form a hemiacetal linkage between the hydroxyl on carbon 1 and the oxygen on carbon 4, yielding a molecule with a 5-membered ring, called glucofuranose. The same reaction can take place between carbons 1 and 5 to form a molecule with a 6-membered ring, called glucopyranose. Cyclic forms with a 7-atom ring are rare. Two monosaccharides can be joined together by a glycosidic bond into a disaccharide through a dehydration reaction, during which a molecule of water is released. The reverse reaction, in which the glycosidic bond of a disaccharide is broken into two monosaccharides, is termed hydrolysis. The best-known disaccharide is sucrose, or ordinary sugar, which consists of a glucose molecule and a fructose molecule joined together. Another important disaccharide is lactose, found in milk, consisting of a glucose molecule and a galactose molecule. Lactose may be hydrolysed by lactase; deficiency in this enzyme results in lactose intolerance. When a few (around three to six) monosaccharides are joined, the product is called an oligosaccharide (oligo- meaning "few").
These molecules tend to be used as markers and signals, as well as having some other uses. Many monosaccharides joined together form a polysaccharide. They can be joined together in one long linear chain, or they may be branched. Two of the most common polysaccharides are cellulose and glycogen, both consisting of repeating glucose monomers. Cellulose is an important structural component of plant cell walls, and glycogen is used as a form of energy storage in animals. Sugars can be characterized by having reducing or non-reducing ends. A reducing end of a carbohydrate is a carbon atom that can be in equilibrium with the open-chain aldehyde (aldose) or keto form (ketose). If the joining of monomers takes place at such a carbon atom, the free hydroxy group of the pyranose or furanose form is exchanged with an OH side-chain of another sugar, yielding a full acetal. This prevents opening of the chain to the aldehyde or keto form and renders the modified residue non-reducing. Lactose contains a reducing end at its glucose moiety, whereas the galactose moiety forms a full acetal with the C4-OH group of glucose. Sucrose does not have a reducing end because of full acetal formation between the aldehyde carbon of glucose (C1) and the keto carbon of fructose (C2).

Lipids
Lipids comprise a diverse range of molecules; to some extent the term is a catchall for relatively water-insoluble or nonpolar compounds of biological origin, including waxes, fatty acids, fatty-acid-derived phospholipids, sphingolipids, glycolipids, and terpenoids (e.g., retinoids and steroids). Some lipids are linear, open-chain aliphatic molecules, while others have ring structures. Some are aromatic (with a cyclic [ring] and planar [flat] structure) while others are not. Some are flexible, while others are rigid. Lipids are usually made from one molecule of glycerol combined with other molecules. In triglycerides, the main group of bulk lipids, there is one molecule of glycerol and three fatty acids.
Fatty acids are considered the monomer in that case, and may be saturated (no double bonds in the carbon chain) or unsaturated (one or more double bonds in the carbon chain). Most lipids have some polar character in addition to being largely nonpolar. In general, the bulk of their structure is nonpolar or hydrophobic ("water-fearing"), meaning that it does not interact well with polar solvents like water. Another part of their structure is polar or hydrophilic ("water-loving") and will tend to associate with polar solvents like water. This makes them amphiphilic molecules (having both hydrophobic and hydrophilic portions). In the case of cholesterol, the polar group is a mere –OH (hydroxyl or alcohol). In the case of phospholipids, the polar groups are considerably larger and more polar, as described below. Lipids are an integral part of our daily diet. Most oils and milk products that we use for cooking and eating, like butter, cheese and ghee, are composed of fats. Vegetable oils are rich in various polyunsaturated fatty acids (PUFA). Lipid-containing foods undergo digestion within the body and are broken into fatty acids and glycerol, which are the final degradation products of fats and lipids. Lipids, especially phospholipids, are also used in various pharmaceutical products, either as co-solubilisers (e.g., in parenteral infusions) or else as drug carrier components (e.g., in a liposome or transfersome).

Proteins
Proteins are very large molecules—macro-biopolymers—made from monomers called amino acids.
An amino acid consists of an alpha carbon atom attached to an amino group, –NH2, a carboxylic acid group, –COOH (although these exist as –NH3+ and –COO− under physiologic conditions), a hydrogen atom, and a side chain commonly denoted as "–R". The side chain "R" is different for each amino acid, of which there are 20 standard ones. It is this "R" group that makes each amino acid different, and the properties of the side-chains greatly influence the overall three-dimensional conformation of a protein. Some amino acids have functions by themselves or in a modified form; for instance, glutamate functions as an important neurotransmitter. Amino acids can be joined via a peptide bond. In this dehydration synthesis, a water molecule is removed and the peptide bond connects the nitrogen of one amino acid's amino group to the carbon of the other's carboxylic acid group. The resulting molecule is called a dipeptide, and short stretches of amino acids (usually fewer than thirty) are called peptides or polypeptides. Longer stretches merit the title proteins. As an example, the important blood serum protein albumin contains 585 amino acid residues. Proteins can have structural and/or functional roles. For instance, movements of the proteins actin and myosin ultimately are responsible for the contraction of skeletal muscle. One property many proteins have is that they specifically bind to a certain molecule or class of molecules—they may be extremely selective in what they bind. Antibodies are an example of proteins that attach to one specific type of molecule. Antibodies are composed of heavy and light chains: two heavy chains are linked to two light chains through disulfide linkages between their amino acids. Antibodies are specific through variation based on differences in the N-terminal domain. The enzyme-linked immunosorbent assay (ELISA), which uses antibodies, is one of the most sensitive tests modern medicine uses to detect various biomolecules.
Probably the most important proteins, however, are the enzymes. Virtually every reaction in a living cell requires an enzyme to lower the activation energy of the reaction. These molecules recognize specific reactant molecules called substrates; they then catalyze the reaction between them. By lowering the activation energy, the enzyme speeds up that reaction by a factor of 10^11 or more; a reaction that would normally take over 3,000 years to complete spontaneously might take less than a second with an enzyme. The enzyme itself is not used up in the process and is free to catalyze the same reaction with a new set of substrates. Using various modifiers, the activity of the enzyme can be regulated, enabling control of the biochemistry of the cell as a whole. The structure of proteins is traditionally described in a hierarchy of four levels. The primary structure of a protein consists of its linear sequence of amino acids; for instance, "alanine-glycine-tryptophan-serine-glutamate-asparagine-glycine-lysine-…". Secondary structure is concerned with local morphology (morphology being the study of structure). Some combinations of amino acids will tend to curl up in a coil called an α-helix or into a sheet called a β-sheet; some α-helices can be seen in the hemoglobin schematic above. Tertiary structure is the entire three-dimensional shape of the protein. This shape is determined by the sequence of amino acids; in fact, a single change can alter the entire structure. The beta chain of hemoglobin contains 146 amino acid residues; substitution of the glutamate residue at position 6 with a valine residue changes the behavior of hemoglobin so much that it results in sickle-cell disease. Finally, quaternary structure is concerned with the structure of a protein with multiple peptide subunits, like hemoglobin with its four subunits. Not all proteins have more than one subunit.
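The scale of that rate enhancement is easy to check with back-of-the-envelope arithmetic; this small Python sketch uses only the two figures quoted above (3,000 years and a factor of 10^11), with the variable names being this example's own:

```python
# How long would the quoted 3,000-year reaction take with a 10^11-fold speedup?
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # about 3.16e7 seconds

uncatalyzed = 3000 * SECONDS_PER_YEAR   # about 9.5e10 seconds without the enzyme
speedup = 1e11                          # enzymatic rate enhancement from the text
catalyzed = uncatalyzed / speedup

print(catalyzed)  # just under one second
```

Dividing roughly 9.5e10 seconds by 10^11 lands just below one second, consistent with the claim that the catalyzed reaction takes less than a second.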
Ingested proteins are usually broken up into single amino acids or dipeptides in the small intestine and then absorbed. They can then be joined to form new proteins. Intermediate products of glycolysis, the citric acid cycle, and the pentose phosphate pathway can be used to form all twenty amino acids, and most bacteria and plants possess all the necessary enzymes to synthesize them. Humans and other mammals, however, can synthesize only half of them. They cannot synthesize isoleucine, leucine, lysine, methionine, phenylalanine, threonine, tryptophan, and valine. Because they must be ingested, these are the essential amino acids. Mammals do possess the enzymes to synthesize alanine, asparagine, aspartate, cysteine, glutamate, glutamine, glycine, proline, serine, and tyrosine, the nonessential amino acids. While they can synthesize arginine and histidine, they cannot produce them in sufficient amounts for young, growing animals, and so these are often considered essential amino acids. If the amino group is removed from an amino acid, it leaves behind a carbon skeleton called an α-keto acid. Enzymes called transaminases can easily transfer the amino group from one amino acid (making it an α-keto acid) to another α-keto acid (making it an amino acid). This is important in the biosynthesis of amino acids, as for many of the pathways, intermediates from other biochemical pathways are converted to the α-keto acid skeleton, and then an amino group is added, often via transamination. The amino acids may then be linked together to form a protein. A similar process is used to break down proteins: the protein is first hydrolyzed into its component amino acids. Free ammonia (NH3), existing as the ammonium ion (NH4+) in blood, is toxic to life forms. A suitable method for excreting it must therefore exist. Different tactics have evolved in different animals, depending on the animals' needs. Unicellular organisms simply release the ammonia into the environment.
Likewise, bony fish can release the ammonia into the water, where it is quickly diluted. In general, mammals convert the ammonia into urea via the urea cycle. In order to determine whether two proteins are related, or in other words to decide whether they are homologous or not, scientists use sequence-comparison methods. Methods like sequence alignments and structural alignments are powerful tools that help scientists identify homologies between related molecules. The relevance of finding homologies among proteins goes beyond forming an evolutionary pattern of protein families. By finding how similar two protein sequences are, we acquire knowledge about their structure and therefore their function.

Nucleic acids
Nucleic acid, so called because of its prevalence in cellular nuclei, is the generic name for this family of biopolymers. Nucleic acids are complex, high-molecular-weight biochemical macromolecules that can convey genetic information in all living cells and viruses. The monomers are called nucleotides, and each consists of three components: a nitrogenous heterocyclic base (either a purine or a pyrimidine), a pentose sugar, and a phosphate group. The most common nucleic acids are deoxyribonucleic acid (DNA) and ribonucleic acid (RNA). The phosphate group and the sugar of each nucleotide bond with each other to form the backbone of the nucleic acid, while the sequence of nitrogenous bases stores the information. The most common nitrogenous bases are adenine, cytosine, guanine, thymine, and uracil. The nitrogenous bases of each strand of a nucleic acid will form hydrogen bonds with certain other nitrogenous bases in a complementary strand of nucleic acid (similar to a zipper). Adenine binds with thymine and uracil, thymine binds only with adenine, and cytosine and guanine can bind only with one another. Adenine–thymine and adenine–uracil pairs form two hydrogen bonds, while cytosine–guanine pairs form three.
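The base-pairing rules above can be expressed as a tiny Python sketch that builds the complementary DNA strand (a toy illustration with made-up names, not a bioinformatics library):

```python
# Watson-Crick pairing for DNA: A pairs with T, C pairs with G.
DNA_PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand):
    """Return the complementary DNA strand, read in the same direction."""
    return "".join(DNA_PAIR[base] for base in strand)

print(complement("ATCG"))  # prints "TAGC"
```

Because each pairing is symmetric, taking the complement twice returns the original strand, mirroring the "zipper" picture in the text.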
Aside from the genetic material of the cell, nucleic acids often play a role as second messengers, as well as forming the base molecule for adenosine triphosphate (ATP), the primary energy-carrier molecule found in all living organisms. Also, the nitrogenous bases possible in the two nucleic acids are different: adenine, cytosine, and guanine occur in both RNA and DNA, while thymine occurs only in DNA and uracil occurs in RNA.

Metabolism
Carbohydrates as energy source
Glucose is an energy source in most life forms. For instance, polysaccharides are broken down into their monomers by enzymes (glycogen phosphorylase removes glucose residues from glycogen, a polysaccharide). Disaccharides like lactose or sucrose are cleaved into their two component monosaccharides.

Glycolysis (anaerobic)
Glucose is mainly metabolized by a very important ten-step pathway called glycolysis, the net result of which is to break down one molecule of glucose into two molecules of pyruvate. This also produces a net two molecules of ATP, the energy currency of cells, along with two reducing equivalents in the form of NADH (that is, NAD+, the oxidized form of nicotinamide adenine dinucleotide, is converted to NADH, its reduced form). This does not require oxygen; if no oxygen is available (or the cell cannot use oxygen), the NAD+ is restored by converting the pyruvate to lactate (lactic acid) (e.g., in humans) or to ethanol plus carbon dioxide (e.g., in yeast). Other monosaccharides like galactose and fructose can be converted into intermediates of the glycolytic pathway.

Aerobic
In aerobic cells with sufficient oxygen, as in most human cells, the pyruvate is further metabolized. It is irreversibly converted to acetyl-CoA, giving off one carbon atom as the waste product carbon dioxide, generating another reducing equivalent as NADH. The two
is a high-drag projectile, with an open conical shape: the cone is formed from sixteen overlapping feathers embedded into a rounded cork base. The cork is covered with thin leather or synthetic material. Synthetic shuttles are often used by recreational players to reduce their costs, as feathered shuttles break easily. These nylon shuttles may be constructed with either a natural cork or synthetic foam base and a plastic skirt. Badminton rules also provide for testing a shuttlecock for the correct speed.

Shoes
Badminton shoes are lightweight, with soles of rubber or similar high-grip, non-marking materials. Compared to running shoes, badminton shoes have little lateral support. High levels of lateral support are useful for activities where lateral motion is undesirable and unexpected. Badminton, however, requires powerful lateral movements. A highly built-up lateral support will not be able to protect the foot in badminton; instead, it will encourage catastrophic collapse at the point where the shoe's support fails, and the player's ankles are not ready for the sudden loading, which can cause sprains. For this reason, players should choose badminton shoes rather than general trainers or running shoes, because proper badminton shoes will have a very thin sole, lower a person's centre of gravity, and therefore result in fewer injuries. Players should also ensure that they learn safe and proper footwork, with the knee and foot in alignment on all lunges. This is more than just a safety concern: proper footwork is also critical in order to move effectively around the court.

Technique
Strokes
Badminton offers a wide variety of basic strokes, and players require a high level of skill to perform all of them effectively. All strokes can be played either forehand or backhand. A player's forehand side is the same side as their playing hand: for a right-handed player, the forehand side is their right side and the backhand side is their left side.
Forehand strokes are hit with the front of the hand leading (like hitting with the palm), whereas backhand strokes are hit with the back of the hand leading (like hitting with the knuckles). Players frequently play certain strokes on the forehand side with a backhand hitting action, and vice versa. In the forecourt and midcourt, most strokes can be played equally effectively on either the forehand or backhand side; but in the rear court, players will attempt to play as many strokes as possible on their forehands, often preferring to play a round-the-head forehand overhead (a forehand "on the backhand side") rather than attempt a backhand overhead. Playing a backhand overhead has two main disadvantages. First, the player must turn their back to their opponents, restricting their view of them and the court. Second, backhand overheads cannot be hit with as much power as forehands: the hitting action is limited by the shoulder joint, which permits a much greater range of movement for a forehand overhead than for a backhand. The backhand clear is considered by most players and coaches to be the most difficult basic stroke in the game, since precise technique is needed in order to muster enough power for the shuttlecock to travel the full length of the court. For the same reason, backhand smashes tend to be weak. Position of the shuttlecock and receiving player The choice of stroke depends on how near the shuttlecock is to the net, whether it is above net height, and where an opponent is currently positioned: players have much better attacking options if they can reach the shuttlecock well above net height, especially if it is also close to the net. In the forecourt, a high shuttlecock will be met with a net kill, hitting it steeply downwards and attempting to win the rally immediately. This is why opponents try to keep their net shots tight, dropping the shuttlecock just over the net.
In the midcourt, a high shuttlecock will usually be met with a powerful smash, also hitting downwards and hoping for an outright winner or a weak reply. Athletic jump smashes, where players jump upwards for a steeper smash angle, are a common and spectacular element of elite men's doubles play. In the rearcourt, players strive to hit the shuttlecock while it is still above them, rather than allowing it to drop lower. This overhead hitting allows them to play smashes, clears (hitting the shuttlecock high and to the back of the opponents' court), and drop shots (hitting the shuttlecock softly so that it falls sharply downwards into the opponents' forecourt). If the shuttlecock has dropped lower, then a smash is impossible and a full-length, high clear is difficult. Vertical position of the shuttlecock When the shuttlecock is well below net height, players have no choice but to hit upwards. Lifts, where the shuttlecock is hit upwards to the back of the opponents' court, can be played from all parts of the court. If a player does not lift, their only remaining option is to push the shuttlecock softly back to the net: in the forecourt, this is called a net shot; in the midcourt or rear court, it is often called a push or block. When the shuttlecock is near to net height, players can hit drives, which travel flat and rapidly over the net into the opponents' rear midcourt and rear court. Pushes may also be hit flatter, placing the shuttlecock into the front midcourt. Drives and pushes may be played from the midcourt or forecourt, and are most often used in doubles: they are an attempt to regain the attack, rather than choosing to lift the shuttlecock and defend against smashes. After a successful drive or push, the opponents will often be forced to lift the shuttlecock. Spin Balls may be spun to alter their bounce (for example, topspin and backspin in tennis) or trajectory, and players may slice the ball (strike it with an angled racquet face) to produce such spin. 
The shuttlecock is not allowed to bounce, but slicing the shuttlecock does have applications in badminton. (See Basic strokes for an explanation of technical terms.) Slicing the shuttlecock from the side may cause it to travel in a different direction from the direction suggested by the player's racquet or body movement. This is used to deceive opponents. Slicing the shuttlecock from the side may cause it to follow a slightly curved path (as seen from above), and the deceleration imparted by the spin causes sliced strokes to slow down more suddenly towards the end of their flight path. This can be used to create drop shots and smashes that dip more steeply after they pass the net. When playing a net shot, slicing underneath the shuttlecock may cause it to turn over itself (tumble) several times as it passes the net. This is called a spinning net shot or tumbling net shot. The opponent will be unwilling to address the shuttlecock until it has corrected its orientation. Due to the way that its feathers overlap, a shuttlecock also has a slight natural spin about its axis of rotational symmetry. The spin is in a counter-clockwise direction as seen from above when dropping a shuttlecock. This natural spin affects certain strokes: a tumbling net shot is more effective if the slicing action is from right to left, rather than from left to right. Biomechanics Badminton biomechanics have not been the subject of extensive scientific study, but some studies confirm the minor role of the wrist in power generation and indicate that the major contributions to power come from internal and external rotations of the upper and lower arm. Recent guides to the sport thus emphasize forearm rotation rather than wrist movements. The feathers impart substantial drag, causing the shuttlecock to decelerate greatly over distance. The shuttlecock is also extremely aerodynamically stable: regardless of initial orientation, it will turn to fly cork-first and remain in the cork-first orientation. 
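The effect of feather drag on flight can be illustrated numerically: a point mass with quadratic air drag rises at its launch angle but falls much more steeply, which is exactly the skewed trajectory a lifted shuttlecock follows. The sketch below is a toy model with assumed parameters (a 5 g mass and a drag constant giving a terminal speed of roughly 7 m/s), not measured shuttlecock data.

```python
import math

# Toy model (assumed parameters, not measured shuttlecock data): a point
# mass with quadratic air drag, F_drag = -k * |v| * v, integrated with a
# simple Euler scheme until the projectile returns to launch height.
def trajectory(v0=30.0, angle_deg=45.0, k=0.001, m=0.005, g=9.81, dt=1e-4):
    """Return a list of (x, y, vx, vy) samples over the flight."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    points = [(x, y, vx, vy)]
    while True:
        speed = math.hypot(vx, vy)
        vx += -(k / m) * speed * vx * dt          # drag opposes motion
        vy += (-g - (k / m) * speed * vy) * dt    # gravity plus drag
        x += vx * dt
        y += vy * dt
        if y < 0.0:
            break
        points.append((x, y, vx, vy))
    return points

pts = trajectory()
ascent = math.degrees(math.atan2(pts[0][3], pts[0][2]))      # launch angle
descent = math.degrees(math.atan2(-pts[-1][3], pts[-1][2]))  # impact angle
print(f"rises at {ascent:.0f} degrees, falls at {descent:.0f} degrees")
# the descent is markedly steeper than the ascent
```

Because drag bleeds off most of the launch speed, the projectile approaches its terminal velocity on the way down and drops nearly vertically, matching the "heavily skewed parabola" described in the text.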
One consequence of the shuttlecock's drag is that it requires considerable power to hit it the full length of the court, which is not the case for most racquet sports. The drag also influences the flight path of a lifted (lobbed) shuttlecock: the parabola of its flight is heavily skewed so that it falls at a steeper angle than it rises. With very high serves, the shuttlecock may even fall vertically. Other factors When defending against a smash, players have three basic options: lift, block, or drive. In singles, a block to the net is the most common reply. In doubles, a lift is the safest option but it usually allows the opponents to continue smashing; blocks and drives are counter-attacking strokes but may be intercepted by the smasher's partner. Many players use a backhand hitting action for returning smashes on both the forehand and backhand sides because backhands are more effective than forehands at covering smashes directed to the body. Hard shots directed towards the body are difficult to defend. The service is restricted by the Laws and presents its own array of stroke choices. Unlike in tennis, the server's racquet must be pointing in a downward direction to deliver the serve so normally the shuttle must be hit upwards to pass over the net. The server can choose a low serve into the forecourt (like a push), or a lift to the back of the service court, or a flat drive serve. Lifted serves may be either high serves, where the shuttlecock is lifted so high that it falls almost vertically at the back of the court, or flick serves, where the shuttlecock is lifted to a lesser height but falls sooner. Deception Once players have mastered these basic strokes, they can hit the shuttlecock from and to any part of the court, powerfully and softly as required. Beyond the basics, however, badminton offers rich potential for advanced stroke skills that provide a competitive advantage. 
Because badminton players have to cover a short distance as quickly as possible, the purpose of many advanced strokes is to deceive the opponent, so that they are either tricked into believing that a different stroke is being played, or forced to delay their movement until they actually see the shuttle's direction. "Deception" in badminton is often used in both of these senses. When a player is genuinely deceived, they will often lose the point immediately because they cannot change their direction quickly enough to reach the shuttlecock. Experienced players will be aware of the trick and cautious not to move too early, but the attempted deception is still useful because it forces the opponent to delay their movement slightly. Against weaker players whose intended strokes are obvious, an experienced player may move before the shuttlecock has been hit, anticipating the stroke to gain an advantage. Slicing and using a shortened hitting action are the two main technical devices that facilitate deception. Slicing involves hitting the shuttlecock with an angled racquet face, causing it to travel in a different direction than suggested by the body or arm movement. Slicing also causes the shuttlecock to travel more slowly than the arm movement suggests. For example, a good crosscourt sliced drop shot will use a hitting action that suggests a straight clear or a smash, deceiving the opponent about both the power and direction of the shuttlecock. A more sophisticated slicing action involves brushing the strings around the shuttlecock during the hit, in order to make the shuttlecock spin. This can be used to improve the shuttle's trajectory, by making it dip more rapidly as it passes the net; for example, a sliced low serve can travel slightly faster than a normal low serve, yet land on the same spot.
Spinning the shuttlecock is also used to create spinning net shots (also called tumbling net shots), in which the shuttlecock turns over itself several times (tumbles) before stabilizing; sometimes the shuttlecock remains inverted instead of tumbling. The main advantage of a spinning net shot is that the opponent will be unwilling to address the shuttlecock until it has stopped tumbling, since hitting the feathers will result in an unpredictable stroke. Spinning net shots are especially important for high-level singles players. The lightness of modern racquets allows players to use a very short hitting action for many strokes, thereby maintaining the option to hit a powerful or a soft stroke until the last possible moment. For example, a singles player may hold their racquet ready for a net shot, but then flick the shuttlecock to the back instead with a shallow lift when they notice the opponent has moved before the actual shot was played. A shallow lift takes less time to reach the ground and, as mentioned above, a rally is over when the shuttlecock touches the ground. This makes the opponent's task of covering the whole court much more difficult than if the lift were hit higher and with a bigger, obvious swing. A short hitting action is not only useful for deception: it also allows the player to hit powerful strokes when they have no time for a big arm swing. A big arm swing is also usually not advised in badminton, because bigger swings make it more difficult to recover for the next shot in fast exchanges. The use of grip tightening is crucial to these techniques, and is often described as finger power. Elite players develop finger power to the extent that they can hit some power strokes, such as net kills, with very little racquet swing. It is also possible to reverse this style of deception, by suggesting a powerful stroke before slowing down the hitting action to play a soft stroke.
In general, this latter style of deception is more common in the rear court (for example, drop shots disguised as smashes), whereas the former style is more common in the forecourt and midcourt (for example, lifts disguised as net shots). Deception is not limited to slicing and short hitting actions. Players may also use double motion, where they make an initial racquet movement in one direction before withdrawing the racquet to hit in another direction. Players will often do this to send opponents in the wrong direction: the initial movement typically suggests a straight stroke, after which the player hits crosscourt, or vice versa. Triple motion is also possible, but this is very rare in actual play. An alternative to double motion is to use a racquet-head fake, where the initial motion is continued but the racquet is turned during the hit. This produces a smaller change in direction but does not require as much time. Strategy To win in badminton, players need to employ a wide variety of strokes in the right situations. These range from powerful jumping smashes to delicate tumbling net returns. Often rallies finish with a smash, but setting up the smash requires subtler strokes. For example, a net shot can force the opponent to lift the shuttlecock, which gives an opportunity to smash. If the net shot is tight and tumbling, then the opponent's lift will not reach the back of the court, which makes the subsequent smash much harder to return. Deception is also important. Expert players prepare for many different strokes that look identical and use slicing to deceive their opponents about the speed or direction of the stroke. If an opponent tries to anticipate the stroke, they may move in the wrong direction and may be unable to change their body momentum in time to reach the shuttlecock.
Singles Since one person needs to cover the entire court, singles tactics are based on forcing the opponent to move as much as possible; this means that singles strokes are normally directed to the corners of the court. Players exploit the length of the court by combining lifts and clears with drop shots and net shots. Smashing tends to be less prominent in singles than in doubles because the smasher has no partner to follow up their effort and is thus vulnerable to a skillfully placed return. Moreover, frequent smashing can be exhausting in singles where the conservation of a player's energy is at a premium. However, players with strong smashes will sometimes use the shot to create openings, and players commonly smash weak returns to try to end rallies. In singles, players will often start the rally with a forehand high serve or with a flick serve. Low serves are also used frequently, either forehand or backhand. Drive serves are rare. At high levels of play, singles demand extraordinary fitness. Singles is a game of patient positional manoeuvring, unlike the all-out aggression of doubles. Doubles Both pairs will try to gain and maintain the attack, smashing downwards when the opportunity arises. Whenever possible, a pair will adopt an ideal attacking formation with one player hitting down from the rear court, and their partner in the midcourt intercepting all smash returns except the lift. If the rear court attacker plays a drop shot, their partner will move into the forecourt to threaten the net reply. If a pair cannot hit downwards, they will use flat strokes in an attempt to gain the attack. If a pair is forced to lift or clear the shuttlecock, then they must defend: they will adopt a side-by-side position in the rear midcourt, to cover the full width of their court against the opponents' smashes. In doubles, players generally smash to the middle ground between two players in order to take advantage of confusion and clashes. 
At high levels of play, the backhand serve has become so popular that forehand serves are now fairly rare. The straight low serve is used most frequently, in an attempt to prevent the opponents gaining the attack immediately. Flick serves are used to prevent the opponent from anticipating the low serve and attacking it decisively. At high levels of play, doubles rallies are extremely fast. Men's doubles are the most aggressive form of badminton, with a high proportion of powerful jump smashes and very quick reflex exchanges. Because of this, spectator interest is sometimes greater for men's doubles than for singles. Mixed doubles In mixed doubles, both pairs typically try to maintain an attacking formation with the woman at the front and the man at the back. This is because the male players are usually substantially stronger and can therefore produce more powerful smashes. As a result, mixed doubles require greater tactical awareness and subtler positional play. Clever opponents will try to reverse the ideal position, by forcing the woman towards the back or the man towards the front. In order to protect against this danger, mixed players must be careful and systematic in their shot selection. At high levels of play, the formations will generally be more flexible: the top women players are capable of playing powerfully from the back-court, and will happily do so if required. When the opportunity arises, however, the pair will switch back to the standard mixed attacking position, with the woman in front and the man at the back. Organization Governing bodies The Badminton World Federation (BWF) is the internationally recognized governing body of the sport, responsible for the regulation of tournaments and the promotion of fair play.
Five regional confederations are associated with the BWF: Asia: Badminton Asia Confederation (BAC) Africa: Badminton Confederation of Africa (BCA) Americas: Badminton Pan Am (North America and South America belong to the same confederation; BPA) Europe: Badminton Europe (BE) Oceania: Badminton Oceania (BO) Competitions The BWF organizes several international competitions, including the Thomas Cup, the premier men's international team event first held in 1948–1949, and the Uber Cup, the women's equivalent first held in 1956–1957. The competitions now take place once every two years. More than 50 national teams compete in qualifying tournaments within continental confederations for a place in the finals. The final tournament involves 12 teams, following an increase from eight teams in 2004. It was further increased to 16 teams in 2012. The Sudirman Cup, a gender-mixed international team event held once every two years, began in 1989. Teams are divided into seven levels based on the performance of each country. To win the tournament, a country must perform well across all five disciplines (men's doubles and singles, women's doubles and singles, and mixed doubles). Like association football (soccer), it features a promotion and relegation system at every level. However, the system was last used in 2009 and teams competing will now be grouped by world rankings. Badminton was a demonstration event at the 1972 and 1988 Summer Olympics. It became an official Summer Olympic sport at the Barcelona Olympics in 1992 and its gold medals now generally rate as the sport's most coveted prizes for individual players. In the BWF World Championships, first held in 1977, currently only the highest-ranked 64 players in the world, and a maximum of four from each country can participate in any category. 
In both the Olympic and BWF World competitions, restrictions on the number of participants from any one country have caused some controversy because they sometimes result in excluding elite world-level players from the strongest badminton nations. The Thomas, Uber, and Sudirman Cups, the Olympics, and the BWF World (and World Junior) Championships are all categorized as level one tournaments. At the start of 2007, the BWF introduced a new tournament structure for the highest-level tournaments aside from those in level one: the BWF Super Series. This level two tournament series, a tour for the world's elite players, stages twelve open tournaments around the world with 32 players (half the previous limit). The players collect points that determine whether they can play in the Super Series Finals held at the year-end. Among the tournaments in this series is the venerable All-England Championships, first held in 1900, which was once considered the unofficial world championships of the sport. Level three tournaments consist of Grand Prix Gold and Grand Prix events. Top players can collect world ranking points that enable them to play in the BWF Super Series open tournaments. These include the regional competitions in Asia (Badminton Asia Championships) and Europe (European Badminton Championships), which produce the world's best players, as well as the Pan America Badminton Championships. The level four tournaments, known as International Challenge, International Series, and Future Series, encourage participation by junior players. Comparison with tennis Badminton is frequently compared to tennis due to several shared qualities. The following is a list of manifest differences: Scoring: In badminton, a match is played best two of three games, with each game played to 21 points.
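The badminton scoring summary above (best of three games, each to 21 points) can be sketched as a small rules check. Note that the two-point-lead requirement and the 30-point cap are standard BWF rules that the text does not spell out, and the function names are illustrative, not from any real library.

```python
# Hedged sketch of rally scoring as summarized in the text (game to 21,
# best of three). The two-point margin and the 30-point sudden-death cap
# follow standard BWF rules and are assumptions beyond the text itself.
def game_winner(points_a: int, points_b: int):
    """Return 'A' or 'B' if the game is over at this score, else None."""
    hi, lo = max(points_a, points_b), min(points_a, points_b)
    leader = 'A' if points_a > points_b else 'B'
    if hi == 30:                   # sudden death: first to 30 at 29-all
        return leader
    if hi >= 21 and hi - lo >= 2:  # normal win: 21+ points, 2-point lead
        return leader
    return None

def match_winner(games):
    """games: list of (a, b) final game scores; first to two games wins."""
    wins = {'A': 0, 'B': 0}
    for a, b in games:
        wins[game_winner(a, b)] += 1
    return 'A' if wins['A'] == 2 else 'B'

print(game_winner(21, 19))  # 'A'
print(game_winner(21, 20))  # None (must lead by two)
print(game_winner(30, 29))  # 'A' (sudden-death cap)
```

A usage example: `match_winner([(21, 15), (18, 21), (21, 23)])` returns `'B'`, since the second and third games go to side B.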
economic resources are available. In fact, early Portuguese Baroque does not lack buildings, because the "plain style" is easily transformed by means of decoration (painting, tiling, etc.), turning empty areas into pompous, elaborate Baroque scenarios. The same could be applied to the exterior. Subsequently, it is easy to adapt the building to the taste of the time and place and to add new features and details; the approach was practical and economical. With more inhabitants and better economic resources, the north, particularly the areas of Porto and Braga, witnessed an architectural renewal, visible in the long list of churches, convents and palaces built by the aristocracy. Porto is the city of the Baroque in Portugal. Its historical centre is part of the UNESCO World Heritage List. Many of the Baroque works in the historical area of the city and beyond belong to Nicolau Nasoni, an Italian architect living in Portugal, who designed original buildings with scenographic placement, such as the church and tower of Clérigos, the loggia of the Porto Cathedral, the church of Misericórdia, the Palace of São João Novo, the Palace of Freixo, and the Episcopal Palace (Portuguese: Paço Episcopal do Porto), along with many others. Russian Baroque The debut of Russian Baroque, or Petrine Baroque, followed a long visit by Peter the Great to western Europe in 1697–98, during which he visited the châteaux of Fontainebleau and Versailles as well as other architectural monuments. He decided, on his return to Russia, to construct similar monuments in St. Petersburg, which became the new capital of Russia in 1712. Early major monuments in the Petrine Baroque include the Peter and Paul Cathedral and the Menshikov Palace. During the reigns of Empress Anna and Elizaveta Petrovna, Russian architecture was dominated by the luxurious Baroque style of the Italian-born Bartolomeo Rastrelli, which developed into Elizabethan Baroque.
Rastrelli's signature buildings include the Winter Palace, the Catherine Palace and the Smolny Cathedral. Other distinctive monuments of the Elizabethan Baroque are the bell tower of the Troitse-Sergiyeva Lavra and the Red Gate. In Moscow, Naryshkin Baroque became widespread, especially in the architecture of Eastern Orthodox churches in the late 17th century. It was a combination of western European Baroque with traditional Russian folk styles. Baroque in the Spanish and Portuguese Colonial Americas Due to the colonization of the Americas by European countries, the Baroque naturally moved to the New World, finding especially favorable ground in the regions dominated by Spain and Portugal, both centralized and staunchly Catholic monarchies and, by extension, subject to Rome and adherents of the most typical Counter-Reformation Baroque. European artists migrated to America, founded schools and, together with the widespread penetration of Catholic missionaries, many of whom were skilled artists, created a multiform Baroque often influenced by popular taste. Criollo and Indigenous craftsmen did much to give this Baroque its unique features. The main centres of American Baroque whose works still stand are (in this order) Mexico, Peru, Brazil, Ecuador, Cuba, Colombia, Bolivia, Guatemala, Panama and Puerto Rico. Of particular note is the so-called "Missionary Baroque", developed within the framework of the Spanish reductions, in areas extending from Mexico and the southwestern portions of the present-day United States as far south as Argentina and Chile: indigenous settlements organized by Spanish Catholic missionaries in order to convert the inhabitants to the Christian faith and acculturate them to Western life. These settlements formed a hybrid Baroque influenced by Native culture, in which Criollos and many Indian artisans and musicians flourished, some literate and of great ability and talent of their own.
Missionaries' accounts often repeat that Western art, especially music, had a hypnotic impact on forest dwellers, and the images of saints were viewed as having great powers. Many Indians were converted, and a new form of devotion was created, of passionate intensity, laden with mysticism, superstition, and theatricality, which delighted in festive masses, sacred concerts, and mysteries. Colonial Baroque architecture in Spanish America is characterized by profuse decoration (the portal of La Profesa Church, Mexico City; facades covered with Puebla-style azulejos, as in the Church of San Francisco Acatepec in San Andrés Cholula and the Convent Church of San Francisco of Puebla), which was intensified in the so-called Churrigueresque style (the facade of the Tabernacle of the Mexico City Cathedral, by Lorenzo Rodríguez; the Church of San Francisco Javier, Tepotzotlán; the Church of Santa Prisca of Taxco). In Peru, the buildings, developed mostly in the cities of Lima, Cusco, Arequipa and Trujillo, show from 1650 onward original characteristics that are advanced even with respect to the European Baroque, such as the use of cushioned walls and Solomonic columns (Church of la Compañía de Jesús, Cusco; Basilica and Convent of San Francisco, Lima). Other countries include: the Metropolitan Cathedral of Sucre in Bolivia; the Cathedral Basilica of Esquipulas in Guatemala; Tegucigalpa Cathedral in Honduras; León Cathedral in Nicaragua; the Church of la Compañía de Jesús in Quito, Ecuador; the Church of San Ignacio in Bogotá, Colombia; the Caracas Cathedral in Venezuela; the Cabildo of Buenos Aires in Argentina; the Church of Santo Domingo in Santiago, Chile; and Havana Cathedral in Cuba. Also worth remembering is the quality of the churches of the Spanish Jesuit Missions in Bolivia, the Spanish Jesuit missions in Paraguay, the Spanish missions in Mexico and the Spanish Franciscan missions in California.
In Brazil, as in the metropole, Portugal, the architecture has a certain Italian influence, usually of a Borrominesque type, as can be seen in the Co-Cathedral of Recife (1784) and the Church of Nossa Senhora da Glória do Outeiro in Rio de Janeiro (1739). In the region of Minas Gerais, the work of Aleijadinho stands out, the author of a group of churches notable for their curved planimetry, facades with concave-convex dynamic effects and a plastic treatment of all architectural elements (Church of São Francisco de Assis in Ouro Preto, 1765–1788). Baroque in Spanish and Portuguese Colonial Asia In the Portuguese colonies of India (Goa, Daman and Diu) an architectural style of Baroque forms mixed with Hindu elements flourished, as at the Goa Cathedral and the Basilica of Bom Jesus of Goa, which houses the tomb of St. Francis Xavier. The set of churches and convents of Goa was declared a World Heritage Site in 1986. In the Philippines, which was part of the Spanish Empire for a long time, a large number of Baroque constructions are preserved; four of the Baroque Churches of the Philippines, along with the Baroque and Neoclassical city of Vigan, are UNESCO World Heritage Sites. The Walled City of Manila (Intramuros) was also very remarkable. Another city with notable preserved Spanish-era Baroque is Tayabas. Painting Baroque painters worked deliberately to set themselves apart from the painters of the Renaissance and of the Mannerist period that followed it. In their palette, they used intense and warm colours, and particularly made use of the primary colours red, blue and yellow, frequently putting all three in close proximity. They avoided the even lighting of Renaissance painting and used strong contrasts of light and darkness on certain parts of the picture to direct attention to the central actions or figures. In their composition, they avoided the tranquil scenes of Renaissance paintings, and chose the moments of the greatest movement and drama.
Unlike the tranquil faces of Renaissance paintings, the faces in Baroque paintings clearly expressed their emotions. They often used asymmetry, with action occurring away from the centre of the picture, and created axes that were neither vertical nor horizontal, but slanting to the left or right, giving a sense of instability and movement. They enhanced this impression of movement by having the costumes of the personages blown by the wind, or moved by their own gestures. The overall impressions were movement, emotion and drama. Another essential element of baroque painting was allegory; every painting told a story and had a message, often encrypted in symbols and allegorical characters, which an educated viewer was expected to know and read. Early evidence of Italian Baroque ideas in painting occurred in Bologna, where Annibale Carracci, Agostino Carracci and Ludovico Carracci sought to return the visual arts to the ordered Classicism of the Renaissance. Their art, however, also incorporated ideas central to the Counter-Reformation; these included intense emotion and religious imagery that appealed more to the heart than to the intellect. Another influential painter of the Baroque era was Michelangelo Merisi da Caravaggio. His realistic approach to the human figure, painted directly from life and dramatically spotlit against a dark background, shocked his contemporaries and opened a new chapter in the history of painting. Other major painters associated closely with the Baroque style include Artemisia Gentileschi, Elisabetta Sirani, Giovanna Garzoni, Guido Reni, Domenichino, Andrea Pozzo, and Paolo de Matteis in Italy; Francisco de Zurbarán and Diego Velázquez in Spain; Adam Elsheimer in Germany; and Nicolas Poussin and Georges de La Tour in France (though Poussin spent most of his working life in Italy). Poussin and La Tour adopted a "classical" Baroque style with less focus on emotion and greater attention to the line of the figures in the painting than to colour.
Peter Paul Rubens was the most important painter of the Flemish Baroque style. Rubens' highly charged compositions reference erudite aspects of classical and Christian history. His unique and immensely popular Baroque style emphasised movement, colour, and sensuality, which followed the immediate, dramatic artistic style promoted in the Counter-Reformation. Rubens specialized in making altarpieces, portraits, landscapes, and history paintings of mythological and allegorical subjects. One important domain of Baroque painting was Quadratura, or paintings in trompe-l'œil, which literally "fooled the eye". These were usually painted on the stucco of ceilings or upper walls and balustrades, and gave those on the ground looking up the impression that they were seeing the heavens populated with crowds of angels, saints and other heavenly figures, set against painted skies and imaginary architecture. In Italy, artists often collaborated with architects on interior decoration; Pietro da Cortona was one of the painters of the 17th century who employed this illusionist way of painting. Among his most important commissions were the frescoes he painted for the Palace of the Barberini family (1633–39), to glorify the reign of Pope Urban VIII. Pietro da Cortona's compositions were the largest decorative frescoes executed in Rome since the work of Michelangelo at the Sistine Chapel. François Boucher was an important figure in the more delicate French Rococo style, which appeared during the late Baroque period. He designed tapestries, carpets and theatre decoration as well as painting. His work was extremely popular with Madame de Pompadour, the mistress of King Louis XV. His paintings featured mythological, romantic, and mildly erotic themes.
Spanish Americas In the Spanish Americas, the first influences were from Sevillan Tenebrism, mainly from Zurbarán —some of whose works are still preserved in Mexico and Peru— as can be seen in the work of the Mexicans José Juárez and Sebastián López de Arteaga, and the Bolivian Melchor Pérez de Holguín. The Cusco School of painting arose after the arrival of the Italian painter Bernardo Bitti in 1583, who introduced Mannerism to the Americas. Notable was the work of Luis de Riaño, a disciple of the Italian Angelino Medoro and the author of the murals of the Church of San Pedro of Andahuaylillas. Also notable were the Indian (Quechua) painters Diego Quispe Tito and Basilio Santa Cruz Pumacallao, as well as Marcos Zapata, author of the fifty large canvases that cover the high arches of the Cathedral of Cusco. In Ecuador, the Quito School was formed, mainly represented by the mestizo Miguel de Santiago and the criollo Nicolás Javier de Goríbar. In the 18th century, sculptural altarpieces began to be replaced by paintings, and Baroque painting in the Americas developed notably. Similarly, the demand for civil works, mainly portraits of the aristocratic classes and the ecclesiastical hierarchy, grew. The main influence was the Murillesque, and in some cases – as in the criollo Cristóbal de Villalpando – that of Valdés Leal. The painting of this era has a more sentimental tone, with sweet and softer shapes. Notable exponents include Gregorio Vásquez de Arce in Colombia, and Juan Rodríguez Juárez and Miguel Cabrera in Mexico. Sculpture The dominant figure in baroque sculpture was Gian Lorenzo Bernini. Under the patronage of Pope Urban VIII, he made a remarkable series of monumental statues of saints and figures whose faces and gestures vividly expressed their emotions, as well as portrait busts of exceptional realism, and highly decorative works for the Vatican such as the imposing Chair of St. Peter beneath the dome in St. Peter's Basilica.
In addition, he designed fountains with monumental groups of sculpture to decorate the major squares of Rome. Baroque sculpture was inspired by ancient Roman statuary, particularly by the famous first century CE statue of Laocoön, which was unearthed in 1506 and put on display in the gallery of the Vatican. When he visited Paris in 1665, Bernini addressed the students at the academy of painting and sculpture. He advised the students to work from classical models, rather than from nature. He told the students, "When I had trouble with my first statue, I consulted the Antinous like an oracle." That Antinous statue is known today as the Hermes of the Museo Pio-Clementino. Notable late French baroque sculptors included Étienne Maurice Falconet and Jean Baptiste Pigalle. Pigalle was commissioned by Frederick the Great to make statues for Frederick's own version of Versailles at Sanssouci in Potsdam, Germany. Falconet also received an important foreign commission, creating the famous statue of Peter the Great on horseback found in St. Petersburg. In Spain, the sculptor Francisco Salzillo worked exclusively on religious themes, using polychromed wood. Some of the finest baroque sculptural craftsmanship was found in the gilded stucco altars of churches of the Spanish colonies of the New World, made by local craftsmen; examples include the Rosary Chapel of the Church of Santo Domingo in Oaxaca (Mexico), 1724–1731. Furniture The main motifs used are: horns of plenty, festoons, baby angels, lion heads holding a metal ring in their mouths, female faces surrounded by garlands, oval cartouches, acanthus leaves, classical columns, caryatids, pediments, and other elements of Classical architecture sculpted on some parts of pieces of furniture, baskets with fruits or flowers, shells, armour and trophies, heads of Apollo or Bacchus, and C-shaped volutes. 
During the first period of the reign of Louis XIV, furniture followed the previous style of Louis XIII: massive, and profusely decorated with sculpture and gilding. After 1680, thanks in large part to the furniture designer André Charles Boulle, a more original and delicate style appeared, sometimes known as Boulle work. It was based on the inlay of ebony and other rare woods, a technique first used in Florence in the 15th century, which was refined and developed by Boulle and others working for Louis XIV. Furniture was inlaid with plaques of ebony, copper, and exotic woods of different colors. New and often enduring types of furniture appeared; the commode, with two to four drawers, replaced the old coffre, or chest. The canapé, or sofa, appeared, in the form of a combination of two or three armchairs. New kinds of armchairs appeared, including the fauteuil en confessionale or "confessional armchair", which had padded cushions on either side of the back of the chair. The console table also made its first appearance; it was designed to be placed against a wall. Another new type of furniture was the table à gibier, a marble-topped table for holding dishes. Early varieties of the desk appeared; the Mazarin desk had a central section set back, placed between two columns of drawers, with four feet on each column. Music The term Baroque is also used to designate the style of music composed during a period that overlaps with that of Baroque art. The first uses of the term 'baroque' for music were criticisms. In an anonymous, satirical review of the première in October 1733 of Rameau's Hippolyte et Aricie, printed in the Mercure de France in May 1734, the critic implied that the novelty of this opera was "du barocque," complaining that the music lacked coherent melody, was filled with unremitting dissonances, constantly changed key and meter, and speedily ran through every compositional device.
Jean-Jacques Rousseau, who was a musician and noted composer as well as philosopher, made a very similar observation in 1768 in the famous Encyclopédie of Denis Diderot: "Baroque music is | orangerie of the palace of the Dukes of Saxony in the 18th century. One of the best examples of a rococo church is the Basilika Vierzehnheiligen, or Basilica of the Fourteen Holy Helpers, a pilgrimage church located near the town of Bad Staffelstein near Bamberg, in Bavaria, southern Germany. The Basilica was designed by Balthasar Neumann and was constructed between 1743 and 1772, its plan a series of interlocking circles around a central oval with the altar placed in the exact centre of the church. The interior of this church illustrates the summit of Rococo decoration. Another notable example of the style is the Pilgrimage Church of Wies. It was designed by the brothers J. B. and Dominikus Zimmermann. It is located in the foothills of the Alps, in the municipality of Steingaden in the Weilheim-Schongau district, Bavaria, Germany. Construction took place between 1745 and 1754, and the interior was decorated with frescoes and with stuccowork in the tradition of the Wessobrunner School. It is now a UNESCO World Heritage Site. Another notable example is the St. Nicholas Church (Malá Strana) in Prague (1704–55), built by Christoph Dientzenhofer and his son Kilian Ignaz Dientzenhofer. Decoration covers all the walls of the interior of the church. The altar is placed in the nave beneath the central dome and is surrounded by chapels; light comes down from the dome above and from the surrounding chapels. The altar is entirely surrounded by arches, columns, curved balustrades and pilasters of coloured stone, which are richly decorated with statuary, creating a deliberate confusion between the real architecture and the decoration. The architecture is transformed into a theatre of light, colour and movement.
In Poland, the Italian-inspired Polish Baroque lasted from the early 17th to the mid-18th century and emphasised richness of detail and colour. The first Baroque building in present-day Poland and probably one of the most recognizable is the Church of St. Peter and Paul in Kraków, designed by Giovanni Battista Trevano. Sigismund's Column in Warsaw, erected in 1644, was the world's first secular Baroque monument built in the form of a column. The palatial residence style was exemplified by the Wilanów Palace, constructed between 1677 and 1696. The most renowned Baroque architect active in Poland was the Dutchman Tylman van Gameren; his notable works include Warsaw's St. Kazimierz Church and Krasiński Palace, St. Anne's in Kraków and the Branicki Palace in Białystok. However, the most celebrated work of Polish Baroque is the Fara Church in Poznań, with details by Pompeo Ferrari. French Baroque or Classicism France largely resisted the ornate Baroque style of Italy, Spain, Vienna and the rest of Europe. The French Baroque style (often termed Grand Classicism or simply Classicism in France) is closely associated with the works built for Louis XIV and Louis XV; it features more geometric order and measure than the Baroque, and less elaborate decoration on the facades and in the interiors. Louis XIV invited the master of Baroque, Bernini, to submit a design for the new wing of the Louvre, but rejected it in favor of a more classical design by Claude Perrault and Louis Le Vau. The principal architects of the style included François Mansart (Château de Balleroy, 1626–1636), Pierre Le Muet (Church of Val-de-Grâce, 1645–1665), Louis Le Vau (Vaux-le-Vicomte, 1657–1661) and especially Jules Hardouin-Mansart and Robert de Cotte, whose work included the Galerie des Glaces and the Grand Trianon at Versailles (1687–1688). Mansart was also responsible for the Baroque classicism of the Place Vendôme (1686–1699).
The major royal project of the period was the expansion of the Palace of Versailles, begun in 1661 by Le Vau with decoration by the painter Charles Le Brun. The gardens were designed by André Le Nôtre specifically to complement and amplify the architecture. The Galerie des Glaces (Hall of Mirrors), the centerpiece of the château, with paintings by Le Brun, was constructed between 1678 and 1686. Mansart completed the Grand Trianon in 1687. The chapel, designed by de Cotte, was finished in 1710. Following the death of Louis XIV, Louis XV added the more intimate Petit Trianon and the highly ornate theatre. The fountains in the gardens were designed to be seen from the interior, and to add to the dramatic effect. The palace was admired and copied by other monarchs of Europe, particularly Peter the Great of Russia, who visited Versailles early in the reign of Louis XV, and built his own version at Peterhof Palace near Saint Petersburg between 1705 and 1725. Portuguese Baroque Baroque architecture in Portugal lasted about two centuries (the late seventeenth century and the eighteenth century). The reigns of John V and Joseph I saw increased imports of gold and diamonds, in a period called Royal Absolutism, which allowed the Portuguese Baroque to flourish. Baroque architecture in Portugal enjoys a special situation and a different timeline from the rest of Europe. It was conditioned by several political, artistic and economic factors that gave rise to several phases and different kinds of outside influence, resulting in a unique blend, often misunderstood by those looking for Italian art, who find instead specific forms and a character that give it a uniquely Portuguese variety. Another key factor is the existence of Jesuit architecture, also called "plain style" (Estilo Chão or Estilo Plano), which, as the name evokes, is plainer and appears somewhat austere.
The buildings are single-room basilicas, deep main chapel, lateral chapels (with small doors for communication), without interior and exterior decoration, very simple portal and windows. It is a very practical building, allowing it to be built throughout the empire with minor adjustments, and prepared to be decorated later or when economic resources are available. In fact, the first Portuguese Baroque does not lack in building because "plain style" is easy to be transformed, by means of decoration (painting, tiling, etc.), turning empty areas into pompous, elaborate baroque scenarios. The same could be applied to the exterior. Subsequently, it is easy to adapt the building to the taste of the time and place and add on new features and details. Practical and economical. With more inhabitants and better economic resources, the north, particularly the areas of Porto and Braga, witnessed an architectural renewal, visible in the large list of churches, convents and palaces built by the aristocracy. Porto is the city of Baroque in Portugal. Its historical centre is part of UNESCO World Heritage List. Many of the Baroque works in the historical area of the city and beyond, belong to Nicolau Nasoni an Italian architect living in Portugal, drawing original buildings with scenographic emplacement such as the church and tower of Clérigos, the logia of the Porto Cathedral, the church of Misericórdia, the Palace of São João Novo, the Palace of Freixo, the Episcopal Palace (Portuguese: Paço Episcopal do Porto) along with many others. Russian Baroque The debut of Russian Baroque, or Petrine Baroque, followed a long visit of Peter the Great to western Europe in 1697–98, where he visited the Chateaux of Fontainebleau and the Versailles as well as other architectural monuments. He decided, on his return to Russia, to construct similar monuments in St. Petersburg, which became the new capital of Russia in 1712. 
Early major monuments in the Petrine Baroque include the Peter and Paul Cathedral and Menshikov Palace. During the reign of Empress Anna and Elizaveta Petrovna, Russian architecture was dominated by the luxurious Baroque style of Italian-born Bartolomeo Rastrelli, which developed into Elizabethan Baroque. Rastrelli's signature buildings include the Winter Palace, the Catherine Palace and the Smolny Cathedral. Other distinctive monuments of the Elizabethan Baroque are the bell tower of the Troitse-Sergiyeva Lavra and the Red Gate. In Moscow, Naryshkin Baroque became widespread, especially in the architecture of Eastern Orthodox churches in the late 17th century. It was a combination of western European Baroque with traditional Russian folk styles. Baroque in the Spanish and Portuguese Colonial Americas Due to the colonization of the Americas by European countries, the Baroque naturally moved to the New World, finding especially favorable ground in the regions dominated by Spain and Portugal, both countries being centralized and irreducibly Catholic monarchies, by extension subject to Rome and adherents of the Baroque Counter-reformist most typical. European artists migrated to America and made school, and along with the widespread penetration of Catholic missionaries, many of whom were skilled artists, created a multiform Baroque often influenced by popular taste. The Criollo and Indidenous craftsmen did much to give this Baroque unique features. The main centres of American Baroque cultivation, that are still standing, are (in this order) Mexico, Peru, Brazil, Ecuador, Cuba, Colombia, Bolivia, Guatemala, Panama and Puerto Rico. 
Of particular note is the so-called "Missionary Baroque", developed in the framework of the Spanish reductions in areas extending from Mexico and southwestern portions of current-day United States to as far south as Argentina and Chile, indigenous settlements organized by Spanish Catholic missionaries in order to convert them to the Christian faith and acculturate them in the Western life, forming a hybrid Baroque influenced by Native culture, where flourished Criollos and many Indian artisans and musicians, even literate, some of great ability and talent of their own. Missionaries' accounts often repeat that Western art, especially music, had a hypnotic impact on foresters, and the images of saints were viewed as having great powers. Many Indians were converted, and a new form of devotion was created, of passionate intensity, laden with mysticism, superstition, and theatricality, which delighted in festive masses, sacred concerts, and mysteries. The Colonial Baroque architecture in the Spanish America is characterized by a profuse decoration (portal of La Profesa Church, Mexico City; facades covered with Puebla-style azulejos, as in the Church of San Francisco Acatepec in San Andrés Cholula and Convent Church of San Francisco of Puebla), which will be exacerbated in the so-called Churrigueresque style (Facade of the Tabernacle of the Mexico City Cathedral, by Lorenzo Rodríguez; Church of San Francisco Javier, Tepotzotlán; Church of Santa Prisca of Taxco). In Peru, the constructions mostly developed in the cities of Lima, Cusco, Arequipa and Trujillo, since 1650 show original characteristics that are advanced even to the European Baroque, as in the use of cushioned walls and solomonic columns (Church of la Compañía de Jesús, Cusco; Basilica and Convent of San Francisco, Lima). 
Other countries include: the Metropolitan Cathedral of Sucre in Bolivia; Cathedral Basilica of Esquipulas in Guatemala; Tegucigalpa Cathedral in Honduras; León Cathedral in Nicaragua; the Church of la Compañía de Jesús in Quito, Ecuador; the Church of San Ignacio in Bogotá, Colombia; the Caracas Cathedral in Venezuela; the Cabildo of Buenos Aires in Argentina; the Church of Santo Domingo in Santiago, Chile; and Havana Cathedral in Cuba. It is also worth remembering the quality of the churches of the Spanish Jesuit Missions in Bolivia, Spanish Jesuit missions in Paraguay, the Spanish missions in Mexico and the Spanish Franciscan missions in California. In Brazil, as in the metropolis, Portugal, the architecture has a certain Italian influence, usually of a Borrominesque type, as can be seen in the Co-Cathedral of Recife (1784) and Church of Nossa Senhora da Glória do Outeiro in Rio de Janeiro (1739). In the region of Minas Gerais, highlighted the work of Aleijadinho, author of a group of churches that stand out for their curved planimetry, facades with concave-convex dynamic effects and a plastic treatment of all architectural elements (Church of São Francisco de Assis in Ouro Preto, 1765–1788). Baroque in the Spanish and Portuguese Colonial Asia In the Portuguese colonies of India (Goa, Daman and Diu) an architectural style of Baroque forms mixed with Hindu elements flourished, such as the Goa Cathedral and the Basilica of Bom Jesus of Goa, which houses the tomb of St. Francis Xavier. The set of churches and convents of Goa was declared a World Heritage Site in 1986. In the Philippines, that was part of the Spanish Empire for a long time, a large number of Baroque constructions are preserved, including the Baroque Churches of the Philippines that four of these, and the Baroque and Neoclassical city of Vigan, are both UNESCO World Heritage Sites. It was also very remarkable the Walled City of Manila (Intramuros). 
Other city with notable preserved Spanish-era Baroque is Tayabas. Painting Baroque painters worked deliberately to set themselves apart from the painters of the Renaissance and the Mannerism period after it. In their palette, they used intense and warm colours, and particularly made use of the primary colours red, blue and yellow, frequently putting all three in close proximity. They avoided the even lighting of Renaissance painting and used strong contrasts of light and darkness on certain parts of the picture to direct attention to the central actions or figures. In their composition, they avoided the tranquil scenes of Renaissance paintings, and chose the moments of the greatest movement and drama. Unlike the tranquil faces of Renaissance paintings, the faces in Baroque paintings clearly expressed their emotions. They often used asymmetry, with action occurring away from the centre of the picture, and created axes that were neither vertical nor horizontal, but slanting to the left or right, giving a sense of instability and movement. They enhanced this impression of movement by having the costumes of the personages blown by the wind, or moved by their own gestures. The overall impressions were movement, emotion and drama. Another essential element of baroque painting was allegory; every painting told a story and had a message, often encrypted in symbols and allegorical characters, which an educated viewer was expected to know and read. Early evidence of Italian Baroque ideas in painting occurred in Bologna, where Annibale Carracci, Agostino Carracci and Ludovico Carracci sought to return the visual arts to the ordered Classicism of the Renaissance. Their art, however, also incorporated ideas central the Counter-Reformation; these included intense emotion and religious imagery that appealed more to the heart than to the intellect. Another influential painter of the Baroque era was Michelangelo Merisi da Caravaggio. 
His realistic approach to the human figure, painted directly from life and dramatically spotlit against a dark background, shocked his contemporaries and opened a new chapter in the history of painting. Other major painters associated closely with the Baroque style include Artemisia Gentileschi, Elisabetta Sirani, Giovanna Garzoni, Guido Reni, Domenichino, Andrea Pozzo, and Paolo de Matteis in Italy; Francisco de Zurbarán and Diego Velázquez in Spain; Adam Elsheimer in Germany; and Nicolas Poussin and Georges de La Tour in France (though Poussin spent most of his working life in Italy). Poussin and La Tour adopted a "classical" Baroque style with less focus on emotion and greater attention to the line of the figures in the painting than to colour. Peter Paul Rubens was the most important painter of the Flemish Baroque style. Rubens' highly charged compositions reference erudite aspects of classical and Christian history. His unique and immensely popular Baroque style emphasised movement, colour, and sensuality, which followed the immediate, dramatic artistic style promoted in the Counter-Reformation. Rubens specialized in making altarpieces, portraits, landscapes, and history paintings of mythological and allegorical subjects. One important domain of Baroque painting was Quadratura, or paintings in trompe-l'œil, which literally "fooled the eye". These were usually painted on the stucco of ceilings or upper walls and balustrades, and gave the impression to those on the ground looking up were that they were seeing the heavens populated with crowds of angels, saints and other heavenly figures, set against painted skies and imaginary architecture. In Italy, artists often collaborated with architects on interior decoration; Pietro da Cortona was one of the painters of the 17th century who employed this illusionist way of painting. 
Among his most important commissions were the frescoes he painted for the Palace of the Barberini family (1633–39), to glorify the reign of Pope Urban VIII. Pietro da Cortona's compositions were the largest decorative frescoes executed in Rome since the work of Michelangelo at the Sistine Chapel. François Boucher was an important figure in the more delicate French Rococo style, which appeared during the late Baroque period. He designed tapestries, carpets and theatre decoration as well as painting. His work was extremely popular with Madame de Pompadour, the mistress of King Louis XV. His paintings featured mythological, romantic, and mildly erotic themes. Spanish Americas In the Spanish Americas, the first influences were from Sevillan Tenebrism, mainly from Zurbarán (some of whose works are still preserved in Mexico and Peru), as can be seen in the work of the Mexicans José Juárez and Sebastián López de Arteaga, and the Bolivian Melchor Pérez de Holguín. The Cusco School of painting arose after the arrival in 1583 of the Italian painter Bernardo Bitti, who introduced Mannerism to the Americas. Notable in it were Luis de Riaño, disciple of the Italian Angelino Medoro and author of the murals of the Church of San Pedro of Andahuaylillas, and the Indian (Quechua) painters Diego Quispe Tito and Basilio Santa Cruz Pumacallao, as well as Marcos Zapata, author of the fifty large canvases that cover the high arches of the Cathedral of Cusco. In Ecuador, the Quito School was formed, mainly represented by the mestizo Miguel de Santiago and the criollo Nicolás Javier de Goríbar. In the 18th century, sculptural altarpieces began to be replaced by paintings, and Baroque painting in the Americas developed notably. Similarly, the demand for civil works, mainly portraits of the aristocratic classes and the ecclesiastical hierarchy, grew. The main influence was the Murillesque, and in some cases, as in the criollo Cristóbal de Villalpando, that of Valdés Leal.
The painting of this era has a more sentimental tone, with sweet and softer shapes. Notable exponents were Gregorio Vásquez de Arce in Colombia, and Juan Rodríguez Juárez and Miguel Cabrera in Mexico. Sculpture The dominant figure in Baroque sculpture was Gian Lorenzo Bernini. Under the patronage of Pope Urban VIII, he made a remarkable series of monumental statues of saints and figures whose faces and gestures vividly expressed their emotions, as well as portrait busts of exceptional realism, and highly decorative works for the Vatican such as the imposing Chair of St. Peter beneath the dome in St. Peter's Basilica. In addition, he designed fountains with monumental groups of sculpture to decorate the major squares of Rome. Baroque sculpture was inspired by ancient Roman statuary, particularly by the famous first-century CE statue of Laocoön, which was unearthed in 1506 and put on display in the gallery of the Vatican. When he visited Paris in 1665, Bernini addressed the students at the academy of painting and sculpture. He advised the students to work from classical models, rather than from nature. He told the students, "When I had trouble with my first statue, I consulted the Antinous like an oracle." That Antinous statue is known today as the Hermes of the Museo Pio-Clementino. Notable late French Baroque sculptors included Étienne Maurice Falconet and Jean Baptiste Pigalle. Pigalle was commissioned by Frederick the Great to make statues for Frederick's own version of Versailles at Sanssouci in Potsdam, Germany. Falconet also received an important foreign commission, creating the famous statue of Peter the Great on horseback found in St. Petersburg. In Spain, the sculptor Francisco Salzillo worked exclusively on religious themes, using polychromed wood.
Some of the finest Baroque sculptural craftsmanship was found in the gilded stucco altars of churches of the Spanish colonies of the New World, made by local craftsmen; examples include the Rosary Chapel of the Church of Santo Domingo in Oaxaca (Mexico), 1724–1731. Furniture The main motifs used are: horns of plenty, festoons, baby angels, lion heads holding a metal ring in their mouths, female faces surrounded by garlands, oval cartouches, acanthus leaves, classical columns, caryatids, pediments, and other elements of Classical architecture sculpted on some parts of pieces of furniture, baskets with fruits or flowers, shells, armour and trophies, heads of Apollo or Bacchus, and C-shaped volutes. During the first period of the reign of Louis XIV, furniture followed the previous style of Louis XIII, and was massive and profusely decorated with sculpture and gilding. After 1680, thanks in large part to the furniture designer André Charles Boulle, a more original and delicate style appeared, sometimes known as Boulle work. It was based on the inlay of ebony and other rare woods, a technique first used in Florence in the 15th century, which was refined and developed by Boulle and others working for Louis XIV. Furniture was inlaid with plaques of ebony, copper, and exotic woods of different colors. New and often enduring types of furniture appeared; the commode, with two to four drawers, replaced the old coffre, or chest. The canapé, or sofa, appeared, in the form of a combination of two or three armchairs. New kinds of armchairs appeared, including the fauteuil en confessionale or "confessional armchair", which had padded cushions on either side of the back of the chair. The console table also made its first appearance; it was designed to be placed against a wall. Another new type of furniture was the table à gibier, a marble-topped table for holding dishes.
Early varieties of the desk appeared; the Mazarin desk had a central section set back, placed between two columns of drawers, with four feet on each column. Music The term Baroque is also used to designate the style of music composed during a period that overlaps with that of Baroque art. The first uses of the term 'baroque' for music were criticisms. In an anonymous, satirical review of the première in October 1733 of Rameau's Hippolyte et Aricie, printed in the Mercure de France in May 1734, the critic implied that the novelty of this opera was "du barocque," complaining that the music lacked coherent melody, was filled with unremitting dissonances, constantly changed key and meter, and speedily ran through every compositional device. Jean-Jacques Rousseau, who was a musician and noted composer as well as philosopher, made a very similar observation in 1768 in the famous Encyclopédie of Denis Diderot: "Baroque music is that in which the harmony is confused, and loaded with modulations and dissonances. The singing is harsh and unnatural, the intonation difficult, and the movement limited. It appears that the term comes from the word 'baroco' used by logicians." Common use of the term for the music of the period began only in 1919, by Curt Sachs, and it was not until 1940 that it was first used in English in an article published by Manfred Bukofzer. The Baroque was a period of musical experimentation and innovation. New forms were invented, including the concerto and sinfonia. Opera was born in Italy at the end of the 16th century (with Jacopo Peri's mostly lost Dafne, produced in Florence in 1598) and soon spread through the rest of Europe. In France, Louis XIV created the first Royal Academy of Music; in 1669, the poet Pierre Perrin opened an academy of opera in Paris, the first opera theatre in France open to the public, and premiered Pomone, the first grand opera in French, with music by Robert Cambert, with five acts, elaborate stage machinery, and a ballet.
Heinrich Schütz in Germany, Jean-Baptiste Lully in France, and Henry Purcell in England all helped to establish their national traditions in the 17th century. Several new instruments, including the piano, were introduced during this period. The invention of the piano is credited to Bartolomeo Cristofori (1655–1731) of Padua, Italy, who was employed by Ferdinando de' Medici, Grand Prince of Tuscany, as the Keeper of the Instruments. Cristofori named the instrument un cimbalo di cipresso di piano e forte ("a keyboard of cypress with soft and loud"), abbreviated over time as pianoforte, fortepiano, and later, simply, piano. Composers and examples
Giovanni Gabrieli (c. 1554/1557–1612), Sonata pian' e forte (1597), In Ecclesiis (from Symphoniae sacrae book 2, 1615)
Giovanni Girolamo Kapsperger (c. 1580–1651), Libro primo di villanelle, 20 (1610)
Claudio Monteverdi (1567–1643), L'Orfeo, favola in musica (1610)
Heinrich Schütz (1585–1672), Musikalische Exequien (1629, 1647, 1650)
Francesco Cavalli (1602–1676), L'Egisto (1643), Ercole amante (1662), Scipione affricano (1664)
Jean-Baptiste Lully (1632–1687), Armide (1686)
Marc-Antoine Charpentier (1643–1704), Te Deum (1688–1698)
Heinrich Ignaz Franz Biber (1644–1704), Mystery Sonatas (1681)
John Blow (1649–1708), Venus and Adonis (1680–1687)
Johann Pachelbel (1653–1706), Canon in D (1680)
Arcangelo Corelli (1653–1713), 12 concerti grossi, Op. 6 (1714)
Marin Marais (1656–1728), Sonnerie de Ste-Geneviève du Mont-de-Paris (1723)
Henry Purcell (1659–1695), Dido and Aeneas (1688)
Alessandro Scarlatti (1660–1725), L'honestà negli amori (1680), Il Pompeo (1683), Mitridate Eupatore (1707)
François Couperin (1668–1733), Les barricades mystérieuses (1717)
Tomaso Albinoni (1671–1751), Didone abbandonata (1724)
Antonio Vivaldi (1678–1741), The Four Seasons (1725)
Jan Dismas Zelenka (1679–1745), Il Serpente di Bronzo (1730), Missa Sanctissimae Trinitatis (1736)
Georg Philipp Telemann (1681–1767), Der Tag des Gerichts (1762)
Johann David Heinichen (1683–1729)
Jean-Philippe Rameau (1683–1764), Dardanus (1739)
George Frideric Handel (1685–1759), Water Music (1717), Messiah (1741)
Domenico Scarlatti (1685–1757), Sonatas for harpsichord
Johann Sebastian Bach (1685–1750), Toccata
The first systematic presentation of Boolean algebra and distributive lattices is owed to the 1890 Vorlesungen of Ernst Schröder. The first extensive treatment of Boolean algebra in English is A. N. Whitehead's 1898 Universal Algebra. Boolean algebra as an axiomatic algebraic structure in the modern axiomatic sense begins with a 1904 paper by Edward V. Huntington. Boolean algebra came of age as serious mathematics with the work of Marshall Stone in the 1930s, and with Garrett Birkhoff's 1940 Lattice Theory. In the 1960s, Paul Cohen, Dana Scott, and others found deep new results in mathematical logic and axiomatic set theory using offshoots of Boolean algebra, namely forcing and Boolean-valued models. Definition A Boolean algebra is a six-tuple consisting of a set A, equipped with two binary operations ∧ (called "meet" or "and"), ∨ (called "join" or "or"), a unary operation ¬ (called "complement" or "not") and two elements 0 and 1 in A (called "bottom" and "top", or "least" and "greatest" element, also denoted by the symbols ⊥ and ⊤, respectively), such that for all elements a, b and c of A, the following axioms hold:

a ∨ (b ∨ c) = (a ∨ b) ∨ c and a ∧ (b ∧ c) = (a ∧ b) ∧ c (associativity)
a ∨ b = b ∨ a and a ∧ b = b ∧ a (commutativity)
a ∨ (a ∧ b) = a and a ∧ (a ∨ b) = a (absorption)
a ∨ 0 = a and a ∧ 1 = a (identity)
a ∨ (b ∧ c) = (a ∨ b) ∧ (a ∨ c) and a ∧ (b ∨ c) = (a ∧ b) ∨ (a ∧ c) (distributivity)
a ∨ ¬a = 1 and a ∧ ¬a = 0 (complements)

Note, however, that the absorption law and even the associativity law can be excluded from the set of axioms as they can be derived from the other axioms (see Proven properties). A Boolean algebra with only one element is called a trivial Boolean algebra or a degenerate Boolean algebra. (In older works, some authors required 0 and 1 to be distinct elements in order to exclude this case.)
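The axioms above can be verified mechanically for a concrete instance. Below is a minimal sketch (not part of the original text; the helper name check_axioms is invented for illustration) that brute-force checks all six axiom pairs for the two-element Boolean algebra {0, 1}, with ∧ as bitwise and, ∨ as bitwise or, and ¬a as 1 − a:

```python
from itertools import product

# Two-element Boolean algebra: meet = bitwise and, join = bitwise or,
# complement(a) = 1 - a, bottom = 0, top = 1. (Illustrative sketch.)
meet = lambda a, b: a & b
join = lambda a, b: a | b
comp = lambda a: 1 - a
BOT, TOP = 0, 1

def check_axioms(elems):
    """Brute-force check of the six Boolean-algebra axiom pairs."""
    for a, b, c in product(elems, repeat=3):
        assert join(a, join(b, c)) == join(join(a, b), c)             # associativity
        assert meet(a, meet(b, c)) == meet(meet(a, b), c)
        assert join(a, b) == join(b, a) and meet(a, b) == meet(b, a)  # commutativity
        assert join(a, meet(a, b)) == a and meet(a, join(a, b)) == a  # absorption
        assert join(a, BOT) == a and meet(a, TOP) == a                # identity
        assert join(a, meet(b, c)) == meet(join(a, b), join(a, c))    # distributivity
        assert meet(a, join(b, c)) == join(meet(a, b), meet(a, c))
        assert join(a, comp(a)) == TOP and meet(a, comp(a)) == BOT    # complements
    return True

print(check_axioms([0, 1]))  # True
```

Because the check runs over all triples of elements, the same function would work unchanged for any finite algebra whose operations are supplied this way.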
It follows from the last three pairs of axioms above (identity, distributivity and complements), or from the absorption axiom, that a = b ∧ a if and only if a ∨ b = b. The relation ≤ defined by a ≤ b if these equivalent conditions hold, is a partial order with least element 0 and greatest element 1. The meet a ∧ b and the join a ∨ b of two elements coincide with their infimum and supremum, respectively, with respect to ≤. The first four pairs of axioms constitute a definition of a bounded lattice. It follows from the first five pairs of axioms that any complement is unique. The set of axioms is self-dual in the sense that if one exchanges ∨ with ∧ and 0 with 1 in an axiom, the result is again an axiom. Therefore, by applying this operation to a Boolean algebra (or Boolean lattice), one obtains another Boolean algebra with the same elements; it is called its dual. Examples The simplest non-trivial Boolean algebra, the two-element Boolean algebra, has only two elements, 0 and 1, and is defined by the rules 0 ∧ 0 = 0 ∧ 1 = 1 ∧ 0 = 0, 1 ∧ 1 = 1, 0 ∨ 0 = 0, 0 ∨ 1 = 1 ∨ 0 = 1 ∨ 1 = 1, ¬0 = 1 and ¬1 = 0. It has applications in logic, interpreting 0 as false, 1 as true, ∧ as and, ∨ as or, and ¬ as not. Expressions involving variables and the Boolean operations represent statement forms, and two such expressions can be shown to be equal using the above axioms if and only if the corresponding statement forms are logically equivalent. The two-element Boolean algebra is also used for circuit design in electrical engineering; here 0 and 1 represent the two different states of one bit in a digital circuit, typically high and low voltage. Circuits are described by expressions containing variables, and two such expressions are equal for all values of the variables if and only if the corresponding circuits have the same input-output behavior. Furthermore, every possible input-output behavior can be modeled by a suitable Boolean expression.
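The claim that a ∧ b = a and a ∨ b = b pick out the same order relation can be checked directly on a small algebra of sets. The sketch below is my own illustration, using the power set of {1, 2, 3} with intersection as meet, union as join, and the subset test as ≤; it confirms that the two conditions and a ⊆ b coincide on every pair of elements:

```python
from itertools import combinations

# Power-set Boolean algebra on S = {1, 2, 3}: meet = intersection (&),
# join = union (|); a <= b is the subset test. (Illustrative sketch.)
S = {1, 2, 3}
subsets = [set(c) for r in range(len(S) + 1) for c in combinations(S, r)]

for a in subsets:
    for b in subsets:
        # a ∧ b = a  iff  a ∨ b = b  iff  a ≤ b
        assert ((a & b) == a) == ((a | b) == b) == (a <= b)

print(f"order conditions agree on all {len(subsets) ** 2} pairs")
```

Here the induced order ≤ is set inclusion, with least element the empty set and greatest element S, matching the text's description of the power-set algebra.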
The two-element Boolean algebra is also important in the general theory of Boolean algebras, because an equation involving several variables is generally true in all Boolean algebras if and only if it is true in the two-element Boolean algebra (which can be checked by a trivial brute force algorithm for small numbers of variables). This can for example be used to show that the following laws (Consensus theorems) are generally valid in all Boolean algebras:

(a ∨ b) ∧ (¬a ∨ c) ∧ (b ∨ c) ≡ (a ∨ b) ∧ (¬a ∨ c)
(a ∧ b) ∨ (¬a ∧ c) ∨ (b ∧ c) ≡ (a ∧ b) ∨ (¬a ∧ c)

The power set (set of all subsets) of any given nonempty set S forms a Boolean algebra, an algebra of sets, with the two operations ∨ := ∪ (union) and ∧ := ∩ (intersection). The smallest element 0 is the empty set and the largest element 1 is the set S itself. After the two-element Boolean algebra, the simplest Boolean algebra is that defined by the power set of two atoms. The set of all subsets of a set X that are either finite or cofinite is a Boolean algebra and an algebra of sets, called the finite–cofinite algebra. If X is infinite, then the set of all cofinite subsets of X, which is called the Fréchet filter, is a free ultrafilter on the finite–cofinite algebra. However, the Fréchet filter is not an ultrafilter on the power set of X. Starting with the propositional calculus with κ sentence symbols, form the Lindenbaum algebra (that is, the set of sentences in the propositional calculus modulo logical equivalence). This construction yields a Boolean algebra. It is in fact the free Boolean algebra on κ generators. A truth assignment in propositional calculus is then a Boolean algebra homomorphism from this algebra to the two-element Boolean algebra. Given any linearly ordered set L with a least element, the interval algebra is the smallest algebra of subsets of L containing all of the half-open intervals [a, b) such that a is in L and b is either in L or equal to ∞. Interval algebras are useful in the study of Lindenbaum–Tarski algebras; every countable Boolean algebra is isomorphic to an interval algebra. Every Boolean algebra gives rise to a Boolean ring, and vice versa, with ring multiplication corresponding to conjunction or meet ∧, and ring addition to exclusive disjunction or symmetric difference (not disjunction ∨).
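The "trivial brute force algorithm" mentioned above fits in a few lines. This sketch (mine, not from the source) verifies both consensus theorems by evaluating each side over all eight assignments of 0 and 1 to a, b, c; by the remark above, the identities then hold in every Boolean algebra:

```python
from itertools import product

# Check an identity in the two-element Boolean algebra, writing ¬a as 1 - a.
for a, b, c in product([0, 1], repeat=3):
    # (a ∨ b) ∧ (¬a ∨ c) ∧ (b ∨ c) ≡ (a ∨ b) ∧ (¬a ∨ c)
    assert (a | b) & ((1 - a) | c) & (b | c) == (a | b) & ((1 - a) | c)
    # (a ∧ b) ∨ (¬a ∧ c) ∨ (b ∧ c) ≡ (a ∧ b) ∨ (¬a ∧ c)
    assert (a & b) | ((1 - a) & c) | (b & c) == (a & b) | ((1 - a) & c)

print("both consensus theorems hold on all 8 assignments")
```

The same loop, widened to more variables, checks any equational law of Boolean algebra, at the cost of 2^n evaluations for n variables.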
However, the theory of Boolean rings has an inherent asymmetry between the two operators, while the axioms and theorems of Boolean algebra express the symmetry of the theory described by the duality principle. History The term "Boolean algebra" honors George Boole (1815–1864), a self-educated English mathematician. He introduced the algebraic system initially in a small pamphlet, The Mathematical Analysis of Logic, published in 1847 in response to an ongoing public controversy between Augustus De Morgan and William Hamilton, and later as a more substantial book, The Laws of Thought, published in 1854. Boole's formulation differs from that described above in some important respects. For example, conjunction and disjunction in Boole were not a dual pair of operations. Boolean algebra emerged in the 1860s, in papers written by William Jevons and Charles Sanders Peirce. Further examples For any natural number n, the set of all positive divisors of n, defining a ≤ b if a divides b, forms a distributive lattice. This lattice is a Boolean algebra if and only if n is square-free. The bottom and the top element of this Boolean algebra are the natural numbers 1 and n, respectively. The complement of a is given by n/a. The meet and the join of a and b are given by the greatest common divisor (gcd) and the least common multiple (lcm) of a and b, respectively. The ring addition a + b is given by lcm(a,b)/gcd(a,b). The picture shows an example for n = 30. As a counter-example, considering the non-square-free n = 60, the greatest common divisor of 30 and its complement 2 would be 2, while it should be the bottom element 1.
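The divisor-lattice example lends itself to a direct numerical check. The following sketch is my own (the helper names divisors and is_boolean are invented for illustration); it tests whether the proposed complement n/a really behaves as a complement for every divisor a, i.e. lcm(a, n/a) = n (the top) and gcd(a, n/a) = 1 (the bottom):

```python
from math import gcd

def divisors(n):
    """All positive divisors of n (simple trial division)."""
    return [d for d in range(1, n + 1) if n % d == 0]

def is_boolean(n):
    """True iff the divisor lattice of n is a Boolean algebra with
    complement a -> n // a (equivalently, iff n is square-free)."""
    lcm = lambda a, b: a * b // gcd(a, b)
    return all(lcm(a, n // a) == n and gcd(a, n // a) == 1
               for a in divisors(n))

print(is_boolean(30))  # True: 30 = 2 * 3 * 5 is square-free
print(is_boolean(60))  # False: gcd(30, 60 // 30) = 2, not the bottom element 1
```

This reproduces the text's counter-example for n = 60 and confirms the square-free case n = 30.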
Other examples of Boolean algebras arise from topological spaces: if X is a topological space, then the collection of all subsets of X which are both open and closed forms a Boolean algebra with the operations ∨ := ∪ (union) and ∧ := ∩ (intersection). If R is an arbitrary ring, then its set of central idempotents, which is the set A = {e ∈ R : e² = e and ex = xe for all x in R}, becomes a Boolean algebra when its operations are defined by e ∨ f := e + f − ef and e ∧ f := ef. Homomorphisms and isomorphisms A homomorphism between two Boolean algebras A and B is a function f :
levy and inflation within the normal limits in the first months, before the black market and ration cards. The situation followed the conflict of interest between the state-entrepreneur and the state-bank, albeit in the name of a higher ideological purpose. In 1938, the government decreed the power to directly appoint presidents and vice-presidents of the board of directors of banks. Beneduce planned to have a public bank take over the long-term credit of large companies, financed with bonds of equal duration for public works, energy, industry. After them, the Central Bank maintained a low-profile monetary policy, consistent with the directives of fascism. IRI operated differently, in agreement with the Italian banks and industries that supported fascism. The banks renounced exercising an option by "converting" the debts into shares (or a law in this regard), preferring not to enter directly into the ownership of the industrial groups. The groups transferred the bank debts to IRI, which became the new owner in exchange for shares (at the book value, not always the same as the market value), until they held control of the property and therefore of management. The debt of the IRI rose to nine and a half billion lire at the time, two thirds of which were paid off during the war, because they were drastically diluted by inflation, which has the effect of lowering the real weight of debts until the accounting entries of issuance are cancelled, but also of halving the purchasing power of small savers. The remaining debt was paid by 1953. The IRI in turn had debts towards the Bank of Italy for five billion lire: the State issued bonds for IRI for one and a half billion, "sterilizing" the debt, which should have been repaid with "annuity" interest accrued until 1971. The change of constitutional order and currency (exchange rate for conversion), and inflation meant that IRI (and industries) paid the Bank of Italy less than a third of the sum.
After the armistice of 8 September, the German authorities demanded the delivery of the gold reserve. 173 tons of gold were first transferred to the Milan office, and then to Fortezza. Traces of it were subsequently lost. In the 1960s, the public debt increased and so did inflation. Governor Guido Carli pursued a credit-crunch policy to stop inflation, particularly in 1964. In general, the Bank of Italy played an important political role under this governorship. Other credit crunches were implemented between 1969 and 1970 due to the flight of capital abroad and in 1974 as a result of the oil crisis. In March 1979 the governor of the Bank of Italy Paolo Baffi and the deputy director in charge of supervision Mario Sarcinelli were accused by the Rome public prosecutor of private interest in official acts and personal aiding and abetting. Sarcinelli was arrested, and released from prison only after being suspended from duties relating to surveillance, while Baffi avoided prison due to his age. In 1981 the two were completely acquitted. Subsequently, the suspicion emerged that the indictment had been sought by the P2 lodge to prevent the Bank of Italy from supervising Roberto Calvi's Banco Ambrosiano. The postwar period The post-war inflation, also due to the Am-lire, was fought with the credit crunch desired by the governor Luigi Einaudi, which was obtained through the compulsory reserve on deposits. In particular, the instrument of compulsory reserves of banks at the central bank was used, introduced in 1926 but never really applied. In 1948 the governor was given the task of regulating the money supply and deciding the discount rate. The universal banks were the ones that had gained the most from war and inflation (under the Authorization Regime of the Interministerial Credit Committee), with the greatest growth in deposits. Along with the recovery, speculative stocks and capital flight abroad appeared.
Credit limits were no longer tied to equity, as equity figures were completely distorted by inflation. The squeeze on lending, the liquidity crisis and the Einaudian deflation pushed operators to finance themselves by placing stocks on the market and returning capital, thus blocking the rise in prices; and by resorting to self-financing (even without distributing profits), aided by the fact that inflation had made it possible to quickly amortize fixed assets whose book value was now nominal. During the years of the Reconstruction, the governor Donato Menichella governed the note issue in a gradual and balanced way: he did not implement expansionary maneuvers to encourage growth, but was careful to avoid the creation of credit crunches. In this he was helped by the low public debt. His monetary policy program was stability for development. A part of the available bank savings was channeled annually to the Treasury to cover the budget deficit (in the current year), while during his tenure the public debt of the state never rose above 1% of GDP, until 1964. In July 1981, a "divorce" between the State (Ministry of the Treasury) and its central bank was initiated by decision of the then Treasury Minister Beniamino Andreatta. From that moment on, the institute was no longer required to purchase the bonds that the government was unable to place on the market, thus ceasing the monetization of the Italian public debt that it had carried out since the Second World War up to that moment. This decision was opposed by the Minister of Finance Rino Formica, who would have liked the Bank of Italy to be required to repay at least a portion of these securities, and from the summer of 1982 there followed a series of intra-government verbal clashes between the two ministers, known as the "wives' quarrel", which led to the fall of the second Spadolini government a few months later.
The divorce between the Ministry of the Treasury and the Bank of Italy is still considered by economic doctrine as a factor of great stabilization of inflation (which went from over 20% in 1980 to less than 5% in the following years) and a central prerequisite for guarantee the full independence of the technical monetary policy body (central bank) from the choices related to fiscal policy (under the responsibility of the government), but also a factor of considerable incidence of growth of the Italian public debt. The law of 7 February 1992 n. 82, proposed by the then Minister of the Treasury Guido Carli, clarifies that the decision on the discount rate is the exclusive competence of the governor and must no longer be agreed in concert with the Minister of the Treasury (the previous decree of the President of the Republic is modified in relation to the new law with the Presidential Decree of 18 July). The Euro and the 2006 | the board of directors of banks. Beneduce planned to have a public bank take over the long-term credit of large companies, financed with bonds of equal duration for public works, energy, industry. After them, the Central Bank maintained a low-profile monetary policy, consistent with the directives of fascism. IRI operated differently, in agreement with the Italian banks and industries that supported fascism. The banks renounced exercising an option by "converting" the debts into shares (or a law in this regard), preferring not to enter directly into the ownership of the industrial groups. The groups transferred the bank debts to IRI, which became the new owner in exchange for shares (at the book value, not always the same as the market value), until they held control of the property and therefore of management. 
The debt of the IRI rose to nine and a half billion lire at the time, two thirds of which were paid within the war, because they were drastically diluted by inflation which has the effect of lowering the real weight of debts until the accounting entries are canceled. of issuance, but also to halve the purchasing power of small savers. The remaining debt was paid by 1953. The IRI in turn had debts towards the Bank of Italy for five billion lire: the State issued bonds for IRI for one and a half billion, "sterilizing" the debt that should have been repaid with "annuity" interest. accrued until 1971. The change of constitutional order and currency (exchange rate for conversion), and inflation meant that IRI (and industries) paid the Bank of Italy less than a third of the sum. After the armistice of 8 September, the German authorities demanded the delivery of the gold reserve. 173 tons of gold were first transferred to the Milan office, and then to Fortezza. Traces of it were subsequently lost. In the 1960s, the public debt increased and so did inflation. Governor Guido Carli made a policy of credit crunch to stop inflation, particularly in 1964. In general, the Bank of Italy played an important political role under this governorship. Other credit crunches were implemented between 1969 and 1970 due to the flight of capital abroad and in 1974 as a result of the oil crisis. In March 1979 the governor of the Bank of Italy Paolo Baffi and the deputy director in charge of supervision Mario Sarcinelli were accused by the Rome public prosecutor of private interest in official acts and personal aiding and abetting. Sarcinelli was arrested, and released from prison only after being suspended from duties relating to surveillance, while Baffi avoided prison due to his age. In 1981 the two will be completely acquitted. Subsequently, the suspicion will emerge that the indictment was wanted by P2 to prevent the Bank of Italy from supervising Roberto Cavali Banco Ambrosiano. 
The postwar period The post-war inflation, due also to the AM-lire, was fought with the credit crunch sought by governor Luigi Einaudi, obtained through the compulsory reserve on deposits. In particular, the instrument of banks' compulsory reserves at the central bank was used, introduced in 1926 but never really applied. In 1948 the governor was given the task of regulating the money supply and deciding the discount rate. The universal banks were the ones that had gained the most from war and inflation (under the authorization regime of the Interministerial Credit Committee), with the greatest growth in deposits. Along with the recovery came speculative stocks and capital flight abroad. Credit limits were no longer tied to equity, since equity figures had been completely distorted by inflation. The squeeze on lending, the liquidity crisis and the Einaudian deflation pushed operators to finance themselves by placing stocks on the market and repatriating capital, thus halting the rise in prices, and by resorting to self-financing (even without distributing profits), aided by the fact that inflation had made it possible to amortize quickly fixed assets whose book value was by then merely nominal. During the years of Reconstruction, governor Donato Menichella managed the currency issue in a gradual and balanced way: he did not implement expansionary manoeuvres to encourage growth, but was careful to avoid creating credit crunches. In this he was helped by the low public debt. His monetary-policy programme was stability for development. A part of available bank savings was channelled annually to the Treasury to cover the budget deficit (in the current year), while during his tenure the state's public debt never rose above 1% of GDP, until 1964. In July 1981, a "divorce" between the State (Ministry of the Treasury) and its central bank was initiated by decision of the then Treasury Minister Beniamino Andreatta. 
From that moment on, the institute was no longer required to purchase the bonds that the government was unable to place on the market, thus ending the monetization of Italian public debt that it had carried out from the Second World War up to that point. The decision was opposed by the Minister of Finance, Rino Formica, who would have liked the Bank of Italy to still be required to take up at least a portion of these securities, and from the summer of 1982 a series of intra-government verbal clashes between the two ministers, known as the "wives' quarrel", ensued, followed a few months later by the fall of the second Spadolini government. The divorce between the Ministry of the Treasury and the Bank of Italy is still regarded in economic doctrine as a major factor in stabilizing inflation (which fell from over 20% in 1980 to less than 5% in the following years) and a central prerequisite for guaranteeing the full independence of the technical monetary-policy body (the central bank) from choices related to fiscal policy (the responsibility of the government), but also as a significant factor in the growth of Italian public debt. Law no. 82 of 7 February 1992, proposed by the then Minister of the Treasury Guido Carli, established that decisions on the discount rate were the exclusive competence of the governor and no longer had to be agreed with the Minister of the Treasury (the earlier decree of the President of the Republic was amended in line with the new law by the Presidential Decree of 18 July). The Euro and the 2006 reform Legislative Decree no. 43 of 10 March 1998 removed the Bank of Italy from management by the Italian government, formalizing its membership of the European System of Central Banks. From that date, therefore, the quantity of currency in circulation was decided autonomously by the central bank. 
With the introduction of the Euro on 1 January 1999, the Bank thus lost the function of presiding over national monetary policy. This function has since been exercised collectively by the Governing Council of the European Central Bank, which also includes the Governor of the Bank of Italy. On 13 June 1999 the Senate of the Republic, during the XIII Legislature, discussed bill no. 4083, “Rules on the ownership of the Bank of Italy and on the criteria for appointing the Board of Governors of the Bank of Italy”. The bill would have had the State acquire all the shares of the institute, but it was never approved. On 4 January 2004 the weekly "Famiglia Cristiana" reported, for the first time in history, the list of participants |
Britishness, the British identity and common culture British English, the English language as spoken and written in the United Kingdom or, more broadly, throughout the British Isles Celtic Britons, an ancient ethno-linguistic group Brittonic languages, a branch of the Insular Celtic language family (formerly called British) Common Brittonic, an ancient language Other uses Brit(ish), a | broadly, throughout the British Isles Celtic Britons, an ancient ethno-linguistic group Brittonic languages, a branch of the Insular Celtic language family (formerly called British) Common Brittonic, an ancient language Other uses Brit(ish), a 2018 memoir by Afua Hirsch |
headlines, such as "SIXTY HORSES WEDGED IN A CHIMNEY", for which the copy in its entirety was "The story to fit this sensational headline has not turned up yet." news reports from around the country. or just anything that the author thought funny at the time. Morton's other interest, France, was occasionally represented by epic tales of his rambling walks through the French countryside. These were not intended as humour. "By the Way" was popular with the readership, and of course, this is one of the reasons it lasted so long. Its style and randomness could be off-putting, however, and it is safe to say the humour could be something of an acquired taste. Oddly, one of the column's greatest opponents was the Express newspaper's owner, Lord Beaverbrook, who had to keep being assured the column was indeed funny. A prominent critic was George Orwell, who frequently referred to him in his essays and diaries as "A Catholic Apologist" and accused him of being "silly-clever", in line with his criticisms of G. K. Chesterton, Hilaire Belloc, Ronald Knox and Wyndham-Lewis. But By the Way was one of the few features kept continuously running in the often seriously reduced Daily Express throughout World War II, when Morton's lampooning of Hitler, including the British invention of bracerot to make the Nazis' trousers fall down at inopportune moments, was regarded as valuable for morale. The column appeared daily until 1965 when it was changed to weekly. It was cancelled in 1975 and revived as a daily piece in the early 1990s. It continues to the present day in much the same format, but is now entitled "Beachcomber", not "By the Way". Recurrent characters Mr. Justice Cocklecarrot: well-meaning but ineffectual High Court judge, plagued by litigation involving the twelve red-bearded dwarfs. Often appears in Private Eye. Mrs. Justice Cocklecarrot: Mr. Cocklecarrot's wife. Very silent, until she observes that "Wivens has fallen down a manhole". 
An enquiry from the judge as to which Wivens that would be elicits the response "E. D. Wivens". After a worrying interval she reveals that E. D. Wivens is a cat. His Lordship observes that cats do not have initials, to which she replies, "This one does". Tinklebury Snapdriver and Honeygander Gooseboote: two counsel. The elbow of one has a mysterious tendency to become jammed in the jaws of the other. Twelve red-bearded dwarfs, with a penchant for farcical litigation. Their names "appear to be" Scorpion de Rooftrouser, Cleveland Zackhouse, Frums Gillygottle, Edeledel Edel, Churm Rincewind, Sophus Barkayo-Tong, Amaninter Axling, Guttergorm Guttergormpton, Badly Oronparser, Listenis Youghaupt, Molonay Tubilderborst and Farjole Merrybody. They admit that these are not genuine names. (Further red-bearded dwarfs, to the number of forty-one, appear in other litigation.) Captain Foulenough: archetypal cad and gatecrasher who impersonates the upper class in order to wreck their social events. Educated at Narkover, a school specializing in card-playing, horse-racing and bribery. His title of "Captain" is probably spurious; but even if it had been a genuine military title, his use of it in civilian life, when at that time only officers who had achieved the rank of Major and above were allowed to do so, gives a subtle hint as to his nature. Mountfalcon Foulenough: the Captain's priggish nephew, who brings havoc to Narkover and "makes virtue seem even more horrifying than usual". Vita Brevis: debutante frequently plagued by, but with a certain reluctant admiration for, Captain Foulenough. Dr. Smart-Allick: genteel, but ludicrous and criminal, headmaster of Narkover. Miss Topsy Turvey: neighbouring headmistress, courted by Smart-Allick. Dr. Strabismus (whom God preserve) of Utrecht: eccentric scientist and inventor. Lord Shortcake: absent-minded peer obsessed by his enormous collection of goldfish. Mrs. McGurgle: seaside landlady. 
Fearsomely British, until she decides to reinvent her house as "Hôtel McGurgle et de l'Univers" to attract the tourists. Ministry of Bubbleblowing: possible ancestor of Monty Python's Ministry of Silly Walks. Charlie Suet: disastrous civil servant. Mimsie Slopcorner: Charlie's on-off girlfriend, an ill-informed and irritating social activist. The Filthistan Trio: Ashura, Kazbulah and Rizamughan, three Persians from "Thurralibad", two of whom play seesaw on a plank laid across the third. They | These could be anything, such as: court reports, often involving Twelve Red-Bearded Dwarfs before Mr Justice Cocklecarrot. angry exchanges of letters between characters such as Florence McGurgle and her dissatisfied boarders. interruptions from "Prodnose", representing the public, who would then be roundly cursed by the author and kicked out. installments of serials that could stop, restart from earlier, be abandoned altogether or change direction abruptly without warning. parodies of poetry or drama, particularly of the extremely "literary" type such as Ibsen. unlikely headlines, such as "SIXTY HORSES WEDGED IN A CHIMNEY", for which the copy in its entirety was "The story to fit this sensational headline has not turned up yet." news reports from around the country. or just anything that the author thought funny at the time. Morton's other interest, France, was occasionally represented by epic tales of his rambling walks through the French countryside. These were not intended as humour. "By the Way" was popular with the readership, and of course, this is one of the reasons it lasted so long. Its style and randomness could be off-putting, however, and it is safe to say the humour could be something of an acquired taste. Oddly, one of the column's greatest opponents was the Express newspaper's owner, Lord Beaverbrook, who had to keep being assured the column was indeed funny. 
A prominent critic was George Orwell, who frequently referred to him in his essays and diaries as "A Catholic Apologist" and accused him of being "silly-clever", in line with his criticisms of G. K. Chesterton, Hilaire Belloc, Ronald Knox and Wyndham-Lewis. But By the Way was one of the few features kept continuously running in the often seriously reduced Daily Express throughout World War II, when Morton's lampooning of Hitler, including the British invention of bracerot to make the Nazis' trousers fall down at inopportune moments, was regarded as valuable for morale. The column appeared daily until 1965 when it was changed to weekly. It was cancelled in 1975 and revived as a daily piece in the early 1990s. It continues to the present day in much the same format, but is now entitled "Beachcomber", not "By the Way". Recurrent characters Mr. Justice Cocklecarrot: well-meaning but ineffectual High Court judge, plagued by litigation involving the twelve red-bearded dwarfs. Often appears in Private Eye. Mrs. Justice Cocklecarrot: Mr. Cocklecarrot's wife. Very silent, until she observes that "Wivens has fallen down a manhole". An enquiry from the judge as to which Wivens that would be elicits the response "E. D. Wivens". After a worrying interval she reveals that E. D. Wivens is a cat. His Lordship observes that cats do not have initials, to which she replies, "This one does". Tinklebury Snapdriver and Honeygander Gooseboote: two counsel. The elbow of one has a mysterious tendency to become jammed in the jaws of the other. Twelve red-bearded dwarfs, with a penchant for
does not have any credentials in the field. He once said, "My method is to look at something that seems like a good idea and assume it's true". In 2011, he was inducted as a Fellow of the Computer History Museum for his work on the Berkeley Software Distribution (BSD) Unix system and the co-founding of Sun Microsystems. Technology concerns In 2000, Joy gained notoriety with the publication of his article in Wired magazine, "Why the Future Doesn't Need Us", in which he declared, in what some have described as a "neo-Luddite" position, that he was convinced that growing advances in genetic engineering and nanotechnology would bring risks to humanity. He argued that intelligent robots would replace humanity, at the very least in intellectual and social dominance, in the relatively near future. He supports and promotes the idea of abandonment of GNR (genetics, nanotechnology, and robotics) technologies, instead of going into an arms race between negative uses of the technology and defense against those negative uses (good nano-machines patrolling and defending against grey-goo "bad" nano-machines). This stance of broad relinquishment was criticized by technologists such as technological-singularity thinker Ray Kurzweil, who instead advocates fine-grained relinquishment and ethical guidelines. Joy was also criticized by The American Spectator, which characterized Joy's essay as a (possibly unwitting) rationale for statism. A bar-room discussion of these technologies with Ray Kurzweil started to set Joy's thinking along this path. He states in his essay that during the conversation, he was surprised that other serious scientists were considering such possibilities likely, and even more astounded at what he felt was a lack of consideration of the contingencies. 
After bringing the subject up with a few more acquaintances, he states that he was further alarmed by what he felt was the fact that although many people considered these futures possible or probable, very few of them shared as serious a concern for the dangers as he seemed to. This concern led to his in-depth examination of the issue and the positions of others in the scientific community on it, and eventually, to his current activities regarding it. Despite this, he is a venture capitalist, investing in technology companies. He has also raised a specialty venture fund to address the dangers of pandemic diseases, such as the H5N1 avian influenza and biological weapons. Joy's law In his 2013 book Makers, author Chris Anderson credited Joy with establishing "Joy's law" based on a quip: "No matter who you are, most of the smartest people work for someone else [other than you]." His argument was that companies use an inefficient process by not hiring the best employees, only those they are able to hire. His "law" was a continuation of Friedrich Hayek's "The Use of Knowledge in Society" and warned that the competition outside of a company would always have the potential to be greater than the company itself. See also Joy's law (computing) References External links An Introduction to Display Editing with Vi Bill Joy, video clips at Big Picture TV Excerpts from a 1999 Linux Magazine interview regarding the development of vi NerdTV interview (video, audio, and | more astounded at what he felt was a lack of consideration of the contingencies. After bringing the subject up with a few more acquaintances, he states that he was further alarmed by what he felt was the fact that although many people considered these futures possible or probable, very few of them shared as serious a concern for the dangers as he seemed to. 
This concern led to his in-depth examination of the issue and the positions of others in the scientific community on it, and eventually, to his current activities regarding it. Despite this, he is a venture capitalist, investing in technology companies. He has also raised a specialty venture fund to address the dangers of pandemic diseases, such as the H5N1 avian influenza and biological weapons. Joy's law In his 2013 book Makers, author Chris Anderson credited Joy with establishing "Joy's law" based on a quip: "No matter who you are, most of the smartest people work for someone else [other than you]." His argument was that companies use an inefficient process by not hiring the best employees, only those they are able to hire. His "law" was a continuation of Friedrich Hayek's "The Use of Knowledge in Society" and warned that the competition outside of a company would always have the potential to be greater than the company itself. See also Joy's law (computing) References External links An Introduction to Display Editing with Vi Bill Joy, video clips at Big Picture TV Excerpts from a 1999 Linux Magazine interview regarding the development of vi NerdTV interview (video, audio, and transcript available) - 30 June 2005 The Six Webs, 10 Years On - speech at MIT Emerging Technologies conference, September 29, 2005 Bill Joy at Dropping Knowledge, his answers to the 100 questions at Dropping Knowledge's Table of Free Voices event in Berlin, 2006. Computer History Museum, Sun Founders Panel, January 11, 2006 |
of the energy of the signal. x dB bandwidth In some contexts, the signal bandwidth in hertz refers to the frequency range in which the signal's spectral density (in W/Hz or V²/Hz) is nonzero or above a small threshold value. The threshold value is often defined relative to the maximum value, and is most commonly the 3 dB point, that is the point where the spectral density is half its maximum value (or the spectral amplitude, in V or V/√Hz, is 70.7% of its maximum). This figure, with a lower threshold value, can be used in calculations of the lowest sampling rate that will satisfy the sampling theorem. The bandwidth is also used to denote system bandwidth, for example in filter or communication channel systems. To say that a system has a certain bandwidth means that the system can process signals with that range of frequencies, or that the system reduces the bandwidth of a white noise input to that bandwidth. The 3 dB bandwidth of an electronic filter or communication channel is the part of the system's frequency response that lies within 3 dB of the response at its peak, which, in the passband filter case, is typically at or near its center frequency, and in the low-pass filter is at or near its cutoff frequency. If the maximum gain is 0 dB, the 3 dB bandwidth is the frequency range where attenuation is less than 3 dB. 3 dB attenuation is also where power is half its maximum. This same half-power gain convention is also used in spectral width, and more generally for the extent of functions as full width at half maximum (FWHM). In electronic filter design, a filter specification may require that within the filter passband, the gain is nominally 0 dB with a small variation, for example within the ±1 dB interval. In the stopband(s), the required attenuation in decibels is above a certain level, for example >100 dB. In a transition band the gain is not specified. In this case, the filter bandwidth corresponds to the passband width, which in this example is the 1 dB-bandwidth. 
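The half-power convention described above can be checked numerically. A minimal sketch, assuming a first-order low-pass response |H(f)| = 1/√(1 + (f/f_c)²); the filter order and the 1 kHz cutoff are illustrative choices, not from the text:

```python
import math

def gain_db(f, f_c):
    """Gain in dB of a first-order low-pass filter with cutoff f_c:
    |H(f)| = 1 / sqrt(1 + (f / f_c)**2), so the gain is 0 dB at DC."""
    return 20 * math.log10(1 / math.sqrt(1 + (f / f_c) ** 2))

f_c = 1000.0  # hypothetical 1 kHz cutoff

# At the cutoff frequency the gain is about -3.01 dB: the half-power (3 dB) point.
print(round(gain_db(f_c, f_c), 2))               # -3.01
# The amplitude there is 70.7% of its maximum, i.e. |H|^2 = 0.5 (half power).
print(round(10 ** (gain_db(f_c, f_c) / 20), 3))  # 0.707
```

For this filter the 3 dB bandwidth therefore runs from DC to f_c, matching the statement that a low-pass filter's 3 dB point lies at or near its cutoff frequency.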
If the filter shows amplitude ripple within the passband, the x dB point refers to the point where the gain is x dB below the nominal passband gain rather than x dB below the maximum gain. In signal processing and control theory the bandwidth is the frequency at which the closed-loop system gain drops 3 dB below peak. In communication systems, in calculations of the Shannon–Hartley channel capacity, bandwidth refers to the 3 dB-bandwidth. In calculations of the maximum symbol rate, the Nyquist sampling rate, and maximum bit rate according to Hartley's law, the bandwidth refers to the frequency range within which the gain is non-zero. The fact that in equivalent baseband models of communication systems, the signal spectrum consists of both negative and positive frequencies, can lead to confusion about bandwidth since they are sometimes referred to only by the positive half, and one will occasionally see expressions such as B = 2W, where B is the total bandwidth (i.e. the maximum passband bandwidth of the carrier-modulated RF signal and the minimum passband bandwidth of the physical passband channel), and W is the positive bandwidth (the baseband bandwidth of the equivalent channel model). For instance, the baseband model of the signal would require a low-pass filter with cutoff frequency of at least W to stay intact, and the physical passband channel would require a passband filter of at least B to stay intact. Relative bandwidth The absolute bandwidth is not always the most appropriate or useful measure of bandwidth. For instance, in the field of antennas the difficulty of constructing an antenna to meet a specified absolute bandwidth is easier at a higher frequency than at a lower frequency. For this reason, bandwidth is often quoted relative to the frequency of operation which gives a better indication of the structure and sophistication needed for the circuit or device under consideration. 
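The Shannon–Hartley capacity mentioned above can be illustrated with a short numeric sketch; the 3 kHz bandwidth and 30 dB signal-to-noise ratio below are hypothetical values chosen for the example, not figures from the text:

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity in bit/s: C = B * log2(1 + S/N),
    where B is the (3 dB) bandwidth and S/N is a linear power ratio."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical telephone-like channel: 3 kHz bandwidth, 30 dB SNR.
snr = 10 ** (30 / 10)  # 30 dB corresponds to a linear power ratio of 1000
print(round(shannon_capacity(3000, snr)))  # 29902 bit/s, roughly 30 kbit/s
```

Note the two roles of bandwidth distinguished in the text: here B is the 3 dB bandwidth, whereas symbol-rate limits use the range of non-zero gain.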
There are two different measures of relative bandwidth in common use: fractional bandwidth (B_F) and ratio bandwidth (B_R). In the following, the absolute bandwidth is defined as B = f_H - f_L, where f_H and f_L are the upper and lower frequency limits respectively of the band in question. Fractional bandwidth Fractional bandwidth is defined as the absolute bandwidth divided by the center frequency (f_c), B_F = (f_H - f_L) / f_c. The center frequency is usually defined | the part of the system's frequency response that lies within 3 dB of the response at its peak, which, in the passband filter case, is typically at or near its center frequency, and in the low-pass filter is at or near its cutoff frequency. If the maximum gain is 0 dB, the 3 dB bandwidth is the frequency range where attenuation is less than 3 dB. 3 dB attenuation is also where power is half its maximum. This same half-power gain convention is also used in spectral width, and more generally for the extent of functions as full width at half maximum (FWHM). In electronic filter design, a filter specification may require that within the filter passband, the gain is nominally 0 dB with a small variation, for example within the ±1 dB interval. In the stopband(s), the required attenuation in decibels is above a certain level, for example >100 dB. In a transition band the gain is not specified. In this case, the filter bandwidth corresponds to the passband width, which in this example is the 1 dB-bandwidth. If the filter shows amplitude ripple within the passband, the x dB point refers to the point where the gain is x dB below the nominal passband gain rather than x dB below the maximum gain. In signal processing and control theory the bandwidth is the frequency at which the closed-loop system gain drops 3 dB below peak. In communication systems, in calculations of the Shannon–Hartley channel capacity, bandwidth refers to the 3 dB-bandwidth. 
In calculations of the maximum symbol rate, the Nyquist sampling rate, and maximum bit rate according to Hartley's law, the bandwidth refers to the frequency range within which the gain is non-zero. The fact that in equivalent baseband models of communication systems, the signal spectrum consists of both negative and positive frequencies, can lead to confusion about bandwidth since they are sometimes referred to only by the positive half, and one will occasionally see expressions such as B = 2W, where B is the total bandwidth (i.e. the maximum passband bandwidth of the carrier-modulated RF signal and the minimum passband bandwidth of the physical passband channel), and W is the positive bandwidth (the baseband bandwidth of the equivalent channel model). For instance, the baseband model of the signal would require a low-pass filter with cutoff frequency of at least W to stay intact, and the physical passband channel would require a passband filter of at least B to stay intact. Relative bandwidth The absolute bandwidth is not always the most appropriate or useful measure of bandwidth. For instance, in the field of antennas the difficulty of constructing an antenna to meet a specified absolute bandwidth is easier at a higher frequency than at a lower frequency. For this reason, bandwidth is often quoted relative to the frequency of operation which gives a better indication of the structure and sophistication needed for the circuit or device under consideration. There are two different measures of relative bandwidth in common use: fractional bandwidth (B_F) and ratio bandwidth (B_R). In the following, the absolute bandwidth is defined as B = f_H - f_L, where f_H and f_L are the upper and lower frequency limits respectively of the band in question. 
Fractional bandwidth Fractional bandwidth is defined as the absolute bandwidth divided by the center frequency (f_c), B_F = (f_H - f_L) / f_c. The center frequency is usually defined as the arithmetic mean of the upper and lower frequencies so that f_c = (f_H + f_L) / 2 and B_F = 2 (f_H - f_L) / (f_H + f_L). However, the center frequency is sometimes defined as the geometric mean of the upper and lower frequencies, f_c = √(f_H f_L), and B_F = (f_H - f_L) / √(f_H f_L). While the geometric mean is more rarely used than the arithmetic mean (and the latter can be assumed if not stated explicitly) the former is considered more mathematically rigorous. It more properly reflects the logarithmic relationship of fractional bandwidth with increasing frequency. For narrowband applications, there is only marginal difference between the two definitions. The geometric mean version is inconsequentially larger. For wideband applications they diverge substantially with the arithmetic mean version approaching 2 in the limit and the geometric mean version approaching infinity. Fractional bandwidth is sometimes expressed as a percentage of the center frequency (percent bandwidth, %B), %B = 100 (f_H - f_L) / f_c. Ratio bandwidth Ratio bandwidth is defined as the ratio of the upper and lower limits of the band, B_R = f_H / f_L. Ratio bandwidth may be notated as B_R:1. The relationship between ratio bandwidth and fractional bandwidth is given by B_R = (2 + B_F) / (2 - B_F) and B_F = 2 (B_R - 1) / (B_R + 1). Percent bandwidth is a less meaningful measure in wideband applications. A percent bandwidth of 100% corresponds to a ratio bandwidth of 3:1. All higher ratios up to infinity are compressed into the range 100–200%. 
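The relationships between fractional and ratio bandwidth can be sketched in a few lines of Python. The 1–3 GHz band below is a hypothetical example, chosen so that the statement that 100% fractional bandwidth corresponds to a 3:1 ratio bandwidth can be verified directly:

```python
import math

def fractional_bw(f_h, f_l):
    """Fractional bandwidth with the arithmetic-mean center frequency:
    B_F = (f_h - f_l) / ((f_h + f_l) / 2)."""
    return 2 * (f_h - f_l) / (f_h + f_l)

def ratio_bw(f_h, f_l):
    """Ratio bandwidth: B_R = f_h / f_l."""
    return f_h / f_l

def ratio_from_fractional(b_f):
    """Conversion given in the text: B_R = (2 + B_F) / (2 - B_F)."""
    return (2 + b_f) / (2 - b_f)

# Hypothetical band from 1 GHz to 3 GHz:
f_l, f_h = 1e9, 3e9
print(fractional_bw(f_h, f_l))     # 1.0, i.e. 100% percent bandwidth
print(ratio_bw(f_h, f_l))          # 3.0, i.e. a 3:1 ratio bandwidth
print(ratio_from_fractional(1.0))  # 3.0, consistent with the direct ratio
# The geometric-mean center frequency is slightly lower than the arithmetic
# mean, so the geometric-mean fractional bandwidth is slightly larger:
print((f_h - f_l) / math.sqrt(f_h * f_l) > fractional_bw(f_h, f_l))  # True
```

The same functions also reproduce the limiting behaviour noted in the text: as f_l approaches 0 the arithmetic-mean fractional bandwidth approaches 2 while the ratio bandwidth grows without bound.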
until full Buddhahood is attained (at which point one ceases to be reborn, which is the classical view of nirvāṇa). This view is promoted in some sutras like the Pañcavimsatisahasrika-prajñaparamita-sutra. The second theory is the idea that there are two kinds of nirvāṇa, the nirvāṇa of an arhat and a superior type of nirvāṇa called apratiṣṭhita (non-abiding) that allows a Buddha to remain engaged in the world. This doctrine developed in Yogacara. As noted by Paul Williams, the idea of apratiṣṭhita nirvāṇa may have taken some time to develop and is not obvious in some of the early Mahāyāna literature, therefore while earlier sutras may sometimes speak of "postponement", later texts saw no need to postpone the "superior" apratiṣṭhita nirvāṇa. In this Yogacara model, the bodhisattva definitely rejects and avoids the liberation of the śravaka and pratyekabuddha, described in Mahāyāna literature as either inferior or "Hina" (as in Asaṅga's fourth century Yogācārabhūmi) or as ultimately false or illusory (as in the Lotus Sūtra). That a bodhisattva has the option to pursue such a lesser path, but instead chooses the long path towards Buddhahood is one of the five criteria for one to be considered a bodhisattva. The other four are: being human, being a man, making a vow to become a Buddha in the presence of a previous Buddha, and receiving a prophecy from that Buddha. Over time, a more varied analysis of bodhisattva careers developed focused on one's motivation. This can be seen in the Tibetan Buddhist teaching on three types of motivation for generating bodhicitta. According to Patrul Rinpoche's 19th century Words of My Perfect Teacher (Kun bzang bla ma'i gzhal lung), a bodhisattva might be motivated in one of three ways. They are: King-like bodhicitta – To aspire to become a Buddha first in order to then help sentient beings. Boatman-like bodhicitta – To aspire to become a Buddha at the same time as other sentient beings. 
Shepherd-like bodhicitta – To aspire to become a Buddha only after all other sentient beings have done so. These three are not types of people, but rather types of motivation. According to Patrul Rinpoche, the third quality of intention is most noble though the mode by which Buddhahood occurs is the first; that is, it is only possible to teach others the path to enlightenment once one has attained enlightenment oneself. The ritualized formulation of the bodhisattva vow also reflects this order (becoming a Buddha so that one can then teach others to do the same). A bodhisattva vow ritual text attributed to Nāgārjuna, of the second-third century CE, states the vow as follows: "Just as the past tathāgata arhat samyaksambuddhas, when engaging in the behavior of a bodhisattva, generated the aspiration to unsurpassed complete enlightenment so that all beings be liberated, all beings be freed, all beings be relieved, all beings attain complete nirvana, all beings be placed in omniscient wisdom, in the same way, I whose name is so-and-so, from this time forward, generate the aspiration to unsurpassed complete enlightenment so that all beings be liberated, all beings be freed, all beings be relieved, all beings attain complete nirvana, all beings be placed in omniscient wisdom." The six perfections that constitute bodhisattva practice should not be confused with the acts of benefiting beings that the bodhisattva vows to accomplish once he or she is a Buddha. The six perfections are a mental transformation and need not benefit anyone. This is seen in the story of Vessantara, an incarnation of Śākyamuni Buddha while he was still a bodhisattva, who commits the ultimate act of generosity by giving away his children to an evil man who mistreats them. Vessantara's generous act causes indirect harm, however, the merit from the perfection of his generosity fructifies when he attains complete enlightenment as Śākyamuni Buddha. 
Bodhisattva grounds or levels According to many traditions within Mahāyāna Buddhism, on the way to becoming a Buddha, a bodhisattva proceeds through ten, or sometimes fourteen, grounds or bhūmis. Below is the list of the ten bhūmis and their descriptions according to the Avataṃsaka Sūtra and The Jewel Ornament of Liberation, a treatise by Gampopa, an influential teacher of the Tibetan Kagyu school. (Other schools give slightly variant descriptions.) Before a bodhisattva arrives at the first ground, he or she first must travel the first two of five paths: the path of accumulation the path of preparation The ten grounds of the bodhisattva then can be grouped into the next three paths: bhūmi 1 the path of insight bhūmis 2–7 the path of meditation bhūmis 8–10 the path of no more learning The chapter of ten grounds in the Avataṃsaka Sūtra refers to 52 stages. The 10 grounds are: Great Joy: It is said that being close to enlightenment and seeing the benefit for all sentient beings, one achieves great joy, hence the name. In this bhūmi the bodhisattvas practice all perfections (pāramitās), but especially emphasizing generosity (dāna). Stainless: In accomplishing the second bhūmi, the bodhisattva is free from the stains of immorality, therefore, this bhūmi is named "stainless". The emphasized perfection is moral discipline (śīla). Luminous: The light of Dharma is said to radiate for others from the bodhisattva who accomplishes the third bhūmi. The emphasized perfection is patience (kṣānti). Radiant: This bhūmi is said to be like a radiating light that fully burns that which opposes enlightenment. The emphasized perfection is vigor (vīrya). Very difficult to train: Bodhisattvas who attain this ground strive to help sentient beings attain maturity, and do not become emotionally involved when such beings respond negatively, both of which are difficult to do. The emphasized perfection is meditative concentration (dhyāna). 
Obviously Transcendent: By depending on the perfection of wisdom, [the bodhisattva] does not abide in either saṃsāra or nirvāṇa, so this state is "obviously transcendent". The emphasized perfection is wisdom (prajñā).
Gone afar: Particular emphasis is on the perfection of skillful means (upāya), to help others.
Immovable: The emphasized virtue is aspiration. This "immovable" bhūmi is where one becomes able to choose one's place of rebirth.
Good Discriminating Wisdom: The emphasized virtue is the understanding of self and non-self.
Cloud of Dharma: The emphasized virtue is the practice of primordial wisdom.

After the ten bhūmis, according to Mahāyāna Buddhism, one attains complete enlightenment and becomes a Buddha. With the 52 stages, the Śūraṅgama Sūtra recognizes 57 stages. With the 10 grounds, various Vajrayāna schools recognize 3–10 additional grounds, mostly 6 more grounds with variant descriptions. A bodhisattva above the 7th ground is called a mahāsattva. Some bodhisattvas such as Samantabhadra are also said to have already attained Buddhahood.

Important Bodhisattvas

Eight Main Bodhisattvas of Shakyamuni Buddha

In the Tibetan tradition, the following bodhisattvas are known as the "Eight Great Bodhisattvas", or "Eight Close Sons" (Skt. aṣṭa utaputra; Tib. nyewé sé gyé), and are seen as the main bodhisattvas of Shakyamuni Buddha:

Mañjuśrī ("Gentle Glory") Kumarabhuta (Young Prince)
Avalokiteśvara ("Lord who gazes down at the world")
Vajrapāṇi ("Vajra in hand")
Maitreya ("Friendly One")
Kṣitigarbha ("Earth Source")
Ākāśagarbha ("Space Source"), also known as Gaganagañja
Sarvanivāraṇaviṣkambhin ("He who blocks the hindrances")
Samantabhadra ("Universal Worthy", or "All Good")

Female Bodhisattvas

Numerous Mahayana sutras feature female bodhisattvas as main characters and discuss their lives, teachings and future Buddhahood. 
These include The Questions of the Girl Vimalaśraddhā (Tohoku Kangyur - Toh number 84), The Questions of Vimaladattā (Toh 77), The Lion’s Roar of Śrīmālādevī (Toh 92), The Inquiry of Lokadhara (Toh 174), The Sūtra of Aśokadattā’s Prophecy (Toh 76), The Questions of Vimalaprabhā (Toh 168), The Sūtra of Kṣemavatī’s Prophecy (Toh 192), The Questions of the Girl Sumati (Toh 74), The Questions of Gaṅgottara (Toh 75), The Questions of an Old Lady (Toh 171), The Miraculous Play of Mañjuśrī (Toh 96), and The Sūtra of the Girl Candrottarā’s Prophecy (Toh 191). Others Other important bodhisattvas in Mahayana include: Tara, a major female bodhisattva in Tibetan Buddhism Prajñāpāramitā, a female personification of the perfection of wisdom Vajrasattva Vimalakirti the famous lay bodhisattva of the Vimalakīrti Nirdeśa Akṣayamati, the main character in the influential Akṣayamatinirdeśa Sūtra Sadāprarudita, a major bodhisattva in the Prajñāpāramitā sutras Sudhana, the main character of the Gaṇḍavyūha Sutra The Four Bodhisattvas of the Earth from the Lotus Sutra Bhaiṣajyarāja or "Medicine King" Candraprabha ("Moon Light") and Sūryaprabha ("Solar Light") Cintāmaṇicakra Cundī, an important female | young man in his current life in the period during which he was working towards his own liberation. During his discourses, to recount his experiences as a young aspirant he regularly uses the phrase "When I was an unenlightened bodhisatta..." The term therefore connotes a being who is "bound for enlightenment", in other words, a person whose aim is to become fully enlightened. In the Pāli canon, the bodhisatta (bodhisattva) is also described as someone who is still subject to birth, illness, death, sorrow, defilement, and delusion. Some of the previous lives of the Buddha as a bodhisattva are featured in the Jataka tales. 
According to the Theravāda monk Bhikkhu Bodhi, the bodhisattva path is not taught in the earliest strata of Buddhist texts such as the Pali Nikayas (and their counterparts such as the Chinese Āgamas), which instead focus on the ideal of the Arahant. The oldest known story about how Gautama Buddha becomes a bodhisattva is the story of his encounter with the previous Buddha, Dīpankara. During this encounter, a previous incarnation of Gautama, variously named Sumedha, Megha, or Sumati, offers five blue lotuses and spreads out his hair or entire body for Dīpankara to walk on, resolving to one day become a Buddha. Dīpankara then confirms that they will attain Buddhahood. Early Buddhist authors saw this story as indicating that the making of a resolution (abhinīhāra) in the presence of a living Buddha and his prediction/confirmation of one's future Buddhahood was necessary to become a bodhisattva. According to Drewes, "all known models of the path to Buddhahood developed from this basic understanding." The path is explained differently by the various Nikaya schools. In the Theravāda Buddhavaṃsa (1st–2nd century BCE), after receiving the prediction, Gautama took four asaṃkheyyas ("incalculable aeons") and a hundred thousand shorter kalpas (aeons) to reach Buddhahood. The Sarvāstivāda school had similar models about how the Buddha Gautama became a bodhisattva. They held it took him three asaṃkhyeyas and ninety-one kalpas (aeons) to become a Buddha after his resolution (praṇidhāna) in front of a past Buddha. During the first asaṃkhyeya he is said to have encountered and served 75,000 Buddhas, and 76,000 in the second, after which he received his first prediction (vyākaraṇa) of future Buddhahood from Dīpankara, meaning that he could no longer fall back from the path to Buddhahood. Thus, the presence of a living Buddha is also necessary for Sarvāstivāda. 
The Mahāvibhāṣā explains that its discussion of the bodhisattva path is partly meant "to stop those who are in fact not bodhisattvas from giving rise to the self-conceit that they are." The Mahāvastu of the Mahāsāṃghika-Lokottaravādins presents four stages of the bodhisattva path without giving specific time frames (though it is said to take various asaṃkhyeya kalpas):

Natural (prakṛti): one first plants the roots of merit in front of a Buddha to attain Buddhahood.
Resolution (praṇidhāna): one makes one's first resolution to attain Buddhahood in the presence of a Buddha.
Continuing (anuloma): one continues to practice until one meets a Buddha who confirms one's future Buddhahood.
Irreversible (anivartana): at this stage, one cannot fall back.

Later Theravāda

The Sri Lankan commentator Dhammapala, in his commentary on the Cariyāpiṭaka, a text which focuses on the bodhisattva path, notes that to become a bodhisattva one must make a valid resolution in front of a living Buddha, which confirms that one is irreversible (anivattana) from the attainment of Buddhahood. The Nidānakathā, as well as the Buddhavaṃsa and Cariyāpiṭaka commentaries, makes this explicit by stating that one cannot use a substitute (such as a Bodhi tree, Buddha statue or Stupa) for the presence of a living Buddha, since only a Buddha has the knowledge for making a reliable prediction. This is the generally accepted view maintained in orthodox Theravada today. The idea is that any resolution to attain Buddhahood may easily be forgotten or abandoned during the aeons ahead. The Burmese monk Ledi Sayadaw (1846–1923) explains that though it is easy to make vows for future Buddhahood by oneself, it is very difficult to maintain the necessary conduct and views during periods when the Dharma has disappeared from the world. One will easily fall back during such periods, and this is why one is not truly a full bodhisattva until one receives recognition from a living Buddha. 
Because of this, it was and remains a common practice in Theravada to attempt to establish the necessary conditions to meet the future Buddha Maitreya and thus receive a prediction from him. Medieval Theravada literature and inscriptions report the aspirations of monks, kings and ministers to meet Maitreya for this purpose. Modern figures such as Anagarika Dharmapala (1864–1933) and U Nu (1907–1995) both sought to receive a prediction from a Buddha in the future and believed meritorious actions done for the good of Buddhism would help in their endeavor to become bodhisattvas in the future. Over time the term came to be applied to other figures besides Gautama Buddha in Theravada lands, possibly due to the influence of Mahayana. The Theravada Abhayagiri tradition of Sri Lanka practiced Mahayana Buddhism and was very influential until the 12th century. Kings of Sri Lanka were often described as bodhisattvas, starting at least as early as Sirisanghabodhi (r. 247–249), who was renowned for his compassion, took vows for the welfare of the citizens, and was regarded as a mahāsatta (Sanskrit mahāsattva), an epithet used almost exclusively in Mahayana Buddhism. Many other Sri Lankan kings from the 3rd until the 15th century were also described as bodhisattvas, and their royal duties were sometimes clearly associated with the practice of the Ten Pāramitās. In some cases, they explicitly claimed to have received predictions of Buddhahood in past lives. Theravadin bhikkhu and scholar Walpola Rahula stated that the bodhisattva ideal has traditionally been held to be higher than the state of a śrāvaka not only in Mahayana but also in Theravada Buddhism. He also quotes the 10th-century king of Sri Lanka, Mahinda IV (956–972 CE), who had the words inscribed "none but the bodhisattvas will become kings of a prosperous Lanka," among other examples. 
Jeffrey Samuels echoes this perspective, noting that while in Mahayana Buddhism the bodhisattva path is held to be universal and for everyone, in Theravada it is "reserved for and appropriated by certain exceptional people." Paul Williams writes that some modern Theravada meditation masters in Thailand are popularly regarded as bodhisattvas.

In Mahāyāna Buddhism

Early Mahāyāna

Mahāyāna Buddhism (often also called Bodhisattvayāna, or the "Bodhisattva Vehicle") is based principally upon the path of a bodhisattva. This path was seen as nobler than becoming an arhat or a solitary Buddha. According to David Drewes, "Mahayana sutras unanimously depict the path beginning with the first arising of the thought of becoming a Buddha (prathamacittotpāda), or the initial arising of bodhicitta, typically aeons before one first receives a Buddha’s prediction, and apply the term bodhisattva from this point." The Aṣṭasāhasrikā Prajñāpāramitā Sūtra, one of the earliest known Mahayana texts, contains a simple and brief definition for the term bodhisattva, which is also the earliest known Mahāyāna definition. This definition is given as the following: "Because he has bodhi as his aim, a bodhisattva-mahāsattva is so called." The Aṣṭasāhasrikā also divides the path into three stages. The first stage is that of bodhisattvas who “first set out in the vehicle” (prathamayānasaṃprasthita), then there is the “irreversible” (avinivartanīya) stage, and finally the third, “bound by one more birth” (ekajātipratibaddha), that is, destined to become a Buddha in the next life. Drewes also notes that: When Mahāyāna sūtras present stories of Buddhas and bodhisattvas’ first arising of the thought of attaining Buddhahood, they invariably depict it as taking place in the presence of a Buddha, suggesting that they shared with all known nikāya traditions the understanding that this is a necessary condition for entering the path. 
In addition, though this key fact is often obscured in scholarship, they apparently never encourage anyone to become a bodhisattva or present any ritual or other means of doing so. Like nikāya texts, they also regard the status of new or recent bodhisattvas as largely meaningless. The Aṣṭasāhasrikā, for instance, states that as many bodhisattvas as there are grains of sand in the Ganges turn back from the pursuit of Buddhahood and that out of innumerable beings who give rise to bodhicitta and progress toward Buddhahood, only one or two will reach the point of becoming irreversible. Drewes also adds that early texts like the Aṣṭasāhasrikā treat bodhisattvas who are beginners (ādikarmika) or "not long set out in the [great] vehicle" with scorn, describing them as "blind", "unintelligent", "lazy" and "weak". Early Mahayana works identify them with those who reject Mahayana or who abandon Mahayana, and they are seen as likely to become śrāvakas (those on the arhat path). Rather than encouraging them to become bodhisattvas, what early Mahayana sutras like the Aṣṭa do is to help individuals determine if they have already received a prediction in a past life, or if they are close to this point. The Aṣṭa provides a variety of methods, including forms of ritual or divination, methods dealing with dreams and various tests, especially tests based on one's reaction to the hearing of the content in the Aṣṭasāhasrikā itself. The text states that encountering and accepting its teachings mean one is close to being given a prediction and that if one does not "shrink back, cower or despair" from the text, but "firmly believes it", one is irreversible. Many other Mahayana sutras such as the Akṣobhyavyūha and the Śūraṃgamasamādhi Sūtra present textual approaches to determine one's status as an advanced bodhisattva. These mainly consist in one's attitude towards listening to, believing, preaching, proclaiming, copying or memorizing and reciting the sutra. 
According to Drewes, this claim that merely having faith in Mahāyāna sūtras meant that one was an advanced bodhisattva was a departure from previous Nikaya views about bodhisattvas. It created new groups of Buddhists who accepted each other's bodhisattva status. Some early depictions of the Bodhisattva path in texts such as the Ugraparipṛcchā Sūtra describe it as an arduous, difficult monastic path suited only for the few, which is nevertheless the most glorious path one can take. Three kinds of bodhisattvas are mentioned: the forest, city, and monastery bodhisattvas—with forest dwelling being promoted as a superior, even necessary, path in sutras such as the Ugraparipṛcchā and the Samadhiraja sutras. The early Rastrapalapariprccha sutra also promotes a solitary life of meditation in the forests, far away from the distractions of the householder life. The Rastrapala is also highly critical of monks living in monasteries and in cities, who are seen as not practicing meditation and morality. The Ratnagunasamcayagatha also says the bodhisattva should undertake ascetic practices (dhutanga), "wander freely without a home", practice the paramitas and train under a guru in order to perfect his meditation practice and realization of prajñaparamita. Some scholars have used these texts to argue for "the forest hypothesis", the theory that the initial Bodhisattva ideal was associated with a strict forest asceticism. But other scholars point out that many other Mahayana sutras do not promote this ideal, focusing instead on sutra-based practices. Some Mahayana sutras promoted another revolutionary doctrinal turn, claiming that the three vehicles of the Śrāvakayāna, Pratyekabuddhayāna and the Bodhisattvayāna were really just one vehicle (ekayana). This is most famously promoted in the Lotus Sūtra, which claims that the very idea of three separate vehicles is just an upaya, a skillful device invented by the Buddha to get beings of various abilities on the path. 
But ultimately, it will be revealed to them that there is only one vehicle, the ekayana, which ends in Buddhahood.

Mature Mahāyāna

Over time, Mahayana Buddhists developed mature systematized doctrines about the bodhisattva path. The authors of the various Madhyamaka shastras (treatises) often presented the view of the ekayana. The texts and sutras associated with the Yogacara school developed a different theory of three separate gotras or lineages that inherently predisposed a person to either the vehicle of the arhat, pratyekabuddha or samyak-saṃbuddha (fully self-awakened one). However, the term was also used in a broader sense. According to the eighth-century Mahāyāna philosopher Haribhadra, the term "bodhisattva" can refer to those who follow any of the three vehicles, since all are working towards bodhi (awakening). Therefore, the specific term for a Mahāyāna bodhisattva is a mahāsattva (great being) bodhisattva. According to Atiśa's 11th-century Bodhipathapradīpa, the central defining feature of a Mahāyāna bodhisattva is the universal aspiration to end suffering for all sentient beings, which is termed bodhicitta (the heart set on awakening). Later Sanskrit Mahayana Buddhists also developed specific rituals and devotional acts for the arising of this absolutely central quality of bodhicitta, such as the "seven-part worship" (Saptāṅgapūjā or Saptavidhā Anuttarapūjā). This ritual form is visible in the works of Shantideva (8th century) and includes:

Vandana (obeisance, bowing down)
Puja (worship of the Buddhas)
Sarana-gamana (going for refuge)
Papadesana (confession of bad deeds)
Punyanumodana (rejoicing in the merit of the good deeds of oneself and others)
Adhyesana (prayer, entreaty) and yacana (supplication) – request to Buddhas and Bodhisattvas to continue preaching Dharma
Atmabhavadi-parityagah (surrender)

Contemporary Mahāyāna Buddhism follows this model and encourages everyone to give rise to bodhicitta and ceremonially take bodhisattva vows. 
With these vows, one makes the promise to work for the |
for the decoration of the Blue Drawing Room. This room, long, previously known as the South Drawing Room, has a ceiling designed by Nash, coffered with huge gilt console brackets. In 1938, the north-west pavilion, designed by Nash as a conservatory, was converted into a swimming pool. Second World War During the Second World War, which broke out in 1939, the palace was bombed nine times. The most serious and publicised incident destroyed the palace chapel in 1940. This event was shown in cinemas throughout the United Kingdom to show the common suffering of rich and poor. One bomb fell in the palace quadrangle while George VI and Queen Elizabeth (the future Queen Mother) were in the palace, and many windows were blown in and the chapel destroyed. War-time coverage of such incidents was severely restricted, however. The King and Queen were filmed inspecting their bombed home; it was at this time the Queen famously declared: "I'm glad we have been bombed. Now I can look the East End in the face". The royal family were seen as sharing their subjects' hardship, as The Sunday Graphic reported: On 15 September 1940, known as Battle of Britain Day, an RAF pilot, Ray Holmes of No. 504 Squadron RAF rammed a German Dornier Do 17 bomber he believed was going to bomb the Palace. Holmes had run out of ammunition and made the quick decision to ram it. Holmes bailed out and the aircraft crashed into the forecourt of London Victoria station. The bomber's engine was later exhibited at the Imperial War Museum in London. The British pilot became a King's Messenger after the war and died at the age of 90 in 2005. On VE Day—8 May 1945—the palace was the centre of British celebrations. The King, the Queen, Princess Elizabeth (the future Queen) and Princess Margaret appeared on the balcony, with the palace's blacked-out windows behind them, to cheers from a vast crowd in The Mall. The damaged Palace was carefully restored after the war by John Mowlem & Co. 
Mid 20th century to present day Many of the palace's contents are part of the Royal Collection, held in trust by Elizabeth II; they can, on occasion, be viewed by the public at the Queen's Gallery, near the Royal Mews. The purpose-built gallery opened in 1962 and displays a changing selection of items from the collection. It occupies the site of the chapel that was destroyed in the Second World War. The palace was designated a Grade I listed building in 1970. Its state rooms have been open to the public during August and September and on some dates throughout the year since 1993. The money raised in entry fees was originally put towards the rebuilding of Windsor Castle after the 1992 fire devastated many of its state rooms. In the year to 31 March 2017, 580,000 people visited the palace, and 154,000 visited the gallery. The palace, like Windsor Castle, is owned by the reigning monarch in right of the Crown. Occupied royal palaces are not part of the Crown Estate, nor are they the monarch's personal property, unlike Sandringham House and Balmoral Castle. The Government of the United Kingdom is responsible for maintaining the palace in exchange for the profits made by the Crown Estate. In 2015, the State Dining Room was closed for a year and a half because its ceiling had become potentially dangerous. A 10-year schedule of maintenance work, including new plumbing, wiring, boilers and radiators, and the installation of solar panels on the roof, has been estimated to cost £369 million and was approved by the prime minister in November 2016. It will be funded by a temporary increase in the Sovereign Grant paid from the income of the Crown Estate and is intended to extend the building's working life by at least 50 years. In 2017, the House of Commons backed funding for the project by 464 votes to 56. Buckingham Palace is a symbol and home of the British monarchy, an art gallery and a tourist attraction. 
Behind the gilded railings and gates that were completed by the Bromsgrove Guild in 1911 and Webb's famous façade, which has been described in a book published by the Royal Collection Trust as looking "like everybody's idea of a palace", is not only a weekday home of Elizabeth II, but also the London residence of the Duke of York and the Earl and Countess of Wessex. The palace also houses their offices, as well as those of the Princess Royal and Princess Alexandra, and is the workplace of more than 800 people. Every year, some 50,000 invited guests are entertained at garden parties, receptions, audiences and banquets. Three garden parties are held in the summer, usually in July. The forecourt of Buckingham Palace is used for the Changing of the Guard, a major ceremony and tourist attraction (daily from April to July; every other day in other months). Interior The front of the palace measures across, by deep, by high and contains over of floorspace. There are 775 rooms, including 188 staff bedrooms, 92 offices, 78 bathrooms, 52 principal bedrooms and 19 state rooms. It also has a post office, cinema, swimming pool, doctor's surgery and jeweller's workshop. The principal rooms are contained on the piano nobile behind the west-facing garden façade at the rear of the palace. The centre of this ornate suite of state rooms is the Music Room, its large bow the dominant feature of the façade. Flanking the Music Room are the Blue and the White Drawing Rooms. At the centre of the suite, serving as a corridor to link the state rooms, is the Picture Gallery, which is top-lit and long. The Gallery is hung with numerous works including some by Rembrandt, van Dyck, Rubens and Vermeer; other rooms leading from the Picture Gallery are the Throne Room and the Green Drawing Room. The Green Drawing Room serves as a huge anteroom to the Throne Room, and is part of the ceremonial route to the throne from the Guard Room at the top of the Grand Staircase. 
The Guard Room contains white marble statues of Queen Victoria and Prince Albert, in Roman costume, set in a tribune lined with tapestries. These very formal rooms are used only for ceremonial and official entertaining but are open to the public every summer. Directly underneath the State Apartments are the less grand semi-state apartments. Opening from the Marble Hall, these rooms are used for less formal entertaining, such as luncheon parties and private audiences. Some of the rooms are named and decorated for particular visitors, such as the 1844 Room, decorated in that year for the state visit of Tsar Nicholas I of Russia, and the 1855 Room, in honour of the visit of Emperor Napoleon III of France. At the centre of this suite is the Bow Room, through which thousands of guests pass annually to the Queen's garden parties. The Queen uses a smaller suite of rooms in the north wing. Between 1847 and 1850, when Blore was building the new east wing, the Brighton Pavilion was once again plundered of its fittings. As a result, many of the rooms in the new wing have a distinctly oriental atmosphere. The red and blue Chinese Luncheon Room is made up from parts of the Brighton Banqueting and Music Rooms with a large oriental chimney piece designed by Robert Jones and sculpted by Richard Westmacott. It was formerly in the Music Room at the Brighton Pavilion. The ornate clock, known as the Kylin Clock, was made in Jingdezhen, Jiangxi Province, China, in the second half of the 18th century; it has a later movement by Benjamin Vulliamy circa 1820. The Yellow Drawing Room has wallpaper supplied in 1817 for the Brighton Saloon, and a chimney piece which is a European vision of how the Chinese chimney piece may appear. It has nodding mandarins in niches and fearsome winged dragons, designed by Robert Jones. At the centre of this wing is the famous balcony with the Centre Room behind its glass doors. 
This is a Chinese-style saloon enhanced by Queen Mary, who, working with the designer Sir Charles Allom, created a more "binding" Chinese theme in the late 1920s, although the lacquer doors were brought from Brighton in 1873. Running the length of the piano nobile of the east wing is the Great Gallery, modestly known as the Principal Corridor, along the eastern side of the quadrangle. It has mirrored doors and mirrored cross walls reflecting porcelain pagodas and other oriental furniture from Brighton. The Chinese Luncheon Room and Yellow Drawing Room are situated at each end of this gallery, with the Centre Room in between. When paying a state visit to Britain, foreign heads of state are usually entertained by the Queen at Buckingham Palace. They are allocated an extensive suite of rooms known as the Belgian Suite, situated at the foot of the Minister's Staircase, on the ground floor of the north-facing Garden Wing. It contains the 1844 Room, a sitting room that also serves as an audience room and is often used for personal investitures. Narrow corridors link the rooms of the suite; one of them is given extra height and perspective by saucer domes designed by Nash in the style of Soane. A second corridor in the suite has Gothic-influenced cross-over vaulting. The Belgian Rooms themselves were decorated in their present style and named after King Leopold I of Belgium, uncle of Queen Victoria and Prince Albert. In 1936, the suite briefly became the private apartments of the palace when King Edward VIII occupied them. The original early-19th-century interior designs, many of which still survive, included widespread use of brightly coloured scagliola and blue and pink lapis, on the advice of Sir Charles Long. King Edward VII oversaw a partial redecoration in a Belle Époque cream and gold colour scheme. 
Court ceremonies Investitures, which include the conferring of knighthoods by dubbing with a sword, and other awards take place in the palace's Ballroom, built in 1854. At long, wide and high, it is the largest room in the palace. It has replaced the throne room in importance and use. During investitures, the Queen stands | palace has 775 rooms, and the garden is the largest private garden in London. The state rooms, used for official and state entertaining, are open to the public each year for most of August and September and on some days in winter and spring. History Pre-1624 In the Middle Ages, the site of the future palace formed part of the Manor of Ebury (also called Eia). The marshy ground was watered by the river Tyburn, which still flows below the courtyard and south wing of the palace. Where the river was fordable (at Cow Ford), the village of Eye Cross grew. Ownership of the site changed hands many times; owners included Edward the Confessor and his queen consort Edith of Wessex in late Saxon times, and, after the Norman Conquest, William the Conqueror. William gave the site to Geoffrey de Mandeville, who bequeathed it to the monks of Westminster Abbey. In 1531, Henry VIII acquired the Hospital of St James, which became St James's Palace, from Eton College, and in 1536 he took the Manor of Ebury from Westminster Abbey. These transfers brought the site of Buckingham Palace back into royal hands for the first time since William the Conqueror had given it away almost 500 years earlier. Various owners leased it from royal landlords, and the freehold was the subject of frenzied speculation during the 17th century. By then, the old village of Eye Cross had long since fallen into decay, and the area was mostly wasteland. Needing money, James I sold off part of the Crown freehold but retained part of the site on which he established a mulberry garden for the production of silk. (This is at the north-west corner of today's palace.) 
Clement Walker in Anarchia Anglicana (1649) refers to "new-erected sodoms and spintries at the Mulberry Garden at S. James's"; this suggests it may have been a place of debauchery. Eventually, in the late 17th century, the freehold was inherited from the property tycoon Sir Hugh Audley by the great heiress Mary Davies. First houses on the site (1624–1761) Possibly the first house erected within the site was that of a Sir William Blake, around 1624. The next owner was Lord Goring, who from 1633 extended Blake's house, which came to be known as Goring House, and developed much of today's garden, then known as Goring Great Garden. He did not, however, obtain the freehold interest in the mulberry garden. Unbeknown to Goring, in 1640 the document "failed to pass the Great Seal before Charles I fled London, which it needed to do for legal execution". It was this critical omission that would help the British royal family regain the freehold under George III. When the improvident Goring defaulted on his rents, Henry Bennet, 1st Earl of Arlington was able to purchase the lease of Goring House and he was occupying it when it burned down in 1674, following which he constructed Arlington House on the site—the location of the southern wing of today's palace—the next year. In 1698, John Sheffield acquired the lease. He later became the first Duke of Buckingham and Normanby. Buckingham House was built for Sheffield in 1703 to the design of William Winde. The style chosen was of a large, three-floored central block with two smaller flanking service wings. It was eventually sold by Buckingham's illegitimate son, Sir Charles Sheffield, in 1761 to George III for £21,000. Sheffield's leasehold on the mulberry garden site, the freehold of which was still owned by the royal family, was due to expire in 1774. 
From Queen's House to palace (1761–1837) Under the new royal ownership, the building was originally intended as a private retreat for George III's wife, Queen Charlotte, and was accordingly known as The Queen's House. Remodelling of the structure began in 1762. In 1775, an Act of Parliament settled the property on Queen Charlotte, in exchange for her rights to nearby Somerset House, and 14 of her 15 children were born there. Some furnishings were transferred from Carlton House and others had been bought in France after the French Revolution of 1789. While St James's Palace remained the official and ceremonial royal residence, the name "Buckingham-palace" was used from at least 1791. After his accession to the throne in 1820, George IV continued the renovation intending to create a small, comfortable home. However, in 1826, while the work was in progress, the King decided to modify the house into a palace with the help of his architect John Nash. The external façade was designed, keeping in mind the French neoclassical influence preferred by George IV. The cost of the renovations grew dramatically, and by 1829 the extravagance of Nash's designs resulted in his removal as the architect. On the death of George IV in 1830, his younger brother William IV hired Edward Blore to finish the work. William never moved into the palace. After the Palace of Westminster was destroyed by fire in 1834, he offered to convert Buckingham Palace into a new Houses of Parliament, but his offer was declined. Queen Victoria (1837–1901) Buckingham Palace became the principal royal residence in 1837, on the accession of Queen Victoria, who was the first monarch to reside there; her predecessor William IV had died before its completion. While the state rooms were a riot of gilt and colour, the necessities of the new palace were somewhat less luxurious. It was reported the chimneys smoked so much that the fires had to be allowed to die down, and consequently the palace was often cold. 
Ventilation was so bad that the interior smelled, and when it was decided to install gas lamps, there was a serious worry about the build-up of gas on the lower floors. It was also said that staff were lax and lazy and the palace was dirty. Following the Queen's marriage in 1840, her husband, Prince Albert, concerned himself with a reorganisation of the household offices and staff, and with addressing the design faults of the palace. By the end of 1840, all the problems had been rectified. However, the builders were to return within the decade. By 1847, the couple had found the palace too small for court life and their growing family and a new wing, designed by Edward Blore, was built by Thomas Cubitt, enclosing the central quadrangle. The large East Front, facing The Mall, is today the "public face" of Buckingham Palace, and contains the balcony from which the royal family acknowledge the crowds on momentous occasions and after the annual Trooping the Colour. The ballroom wing and a further suite of state rooms were also built in this period, designed by Nash's student Sir James Pennethorne. Before Prince Albert's death, the palace was frequently the scene of musical entertainments, and the most celebrated contemporary musicians entertained at Buckingham Palace. The composer Felix Mendelssohn is known to have played there on three occasions. Johann Strauss II and his orchestra played there when in England. Under Victoria, Buckingham Palace was frequently the scene of lavish costume balls, in addition to the usual royal ceremonies, investitures and presentations. Widowed in 1861, the grief-stricken Queen withdrew from public life and left Buckingham Palace to live at Windsor Castle, Balmoral Castle and Osborne House. For many years the palace was seldom used, even neglected. In 1864, a note was found pinned to the fence of Buckingham Palace, saying: "These commanding premises to be let or sold, in consequence of the late occupant's declining business." 
Eventually, public opinion persuaded the Queen to return to London, though even then she preferred to live elsewhere whenever possible. Court functions were still held at Windsor Castle, presided over by the sombre Queen habitually dressed in mourning black, while Buckingham Palace remained shuttered for most of the year. Early 20th century (1901–1945) In 1901, the new king, Edward VII, began redecorating the palace. The King and his wife, Queen Alexandra, had always been at the forefront of London high society, and their friends, known as "the Marlborough House Set", were considered to be the most eminent and fashionable of the age. Buckingham Palace once again became a setting for entertaining on a majestic scale: the Ballroom, Grand Entrance, Marble Hall, Grand Staircase, vestibules and galleries were redecorated in the Belle Époque cream and gold colour scheme they retain today, though some felt Edward's heavy redecorations were at odds with Nash's original work. The last major building work took place during the reign of George V when, in 1913, Sir Aston Webb redesigned Blore's 1850 East Front to resemble in part Giacomo Leoni's Lyme Park in Cheshire. This new, refaced principal façade (of Portland stone) was designed to be the backdrop to the Victoria Memorial, a large memorial statue of Queen Victoria created by sculptor Sir Thomas Brock, erected outside the main gates on a surround constructed by architect Sir Aston Webb. George V, who had succeeded Edward VII in 1910, had a more serious personality than his father; greater emphasis was now placed on official entertaining and royal duties than on lavish parties.
He arranged a series of command performances featuring jazz musicians such as the Original Dixieland Jazz Band (1919; the first jazz performance for a head of state), Sidney Bechet and Louis Armstrong (1932), which earned the palace a nomination in 2009 for a (Kind of) Blue Plaque by the Brecon Jazz Festival as one of the venues making the greatest contribution to jazz music in the United Kingdom. During the First World War, which lasted from 1914 until 1918, the palace escaped unscathed. Its more valuable contents were evacuated to Windsor, but the royal family remained in residence. The King imposed rationing at the palace, much to the dismay of his guests and household. To the King's later regret, David Lloyd George persuaded him to go further and ostentatiously lock the wine cellars and refrain from alcohol, to set a good example to the supposedly inebriated working class. The workers continued to imbibe, and the King was left unhappy at his enforced abstinence. George V's wife, Queen Mary, was a connoisseur of the arts, and took a keen interest in the Royal Collection of furniture and art, both restoring and adding to it. Queen Mary also had many new fixtures and fittings installed, such as the pair of marble Empire-style chimneypieces by Benjamin Vulliamy, dating from 1810, which the Queen had installed in the ground floor Bow Room, the huge low room at the centre of the garden façade. Queen Mary was also responsible for the decoration of the Blue Drawing Room. This room, previously known as the South Drawing Room, has a ceiling designed by Nash, coffered with huge gilt console brackets. In 1938, the north-west pavilion, designed by Nash as a conservatory, was converted into a swimming pool. Second World War During the Second World War, which broke out in 1939, the palace was bombed nine times. The most serious and publicised incident destroyed the palace chapel in 1940.
This event was shown in cinemas throughout the United Kingdom to show the common suffering of rich and poor. One bomb fell in the palace quadrangle while George VI and Queen Elizabeth (the future Queen Mother) were in the palace; many windows were blown in and the chapel destroyed. War-time coverage of such incidents was severely restricted, however. The King and Queen were filmed inspecting their bombed home; it was at this time the Queen famously declared: "I'm glad we have been bombed. Now I can look the East End in the face". The royal family were seen as sharing their subjects' hardship, as The Sunday Graphic reported. On 15 September 1940, known as Battle of Britain Day, an RAF pilot, Ray Holmes of No. 504 Squadron RAF, rammed a German Dornier Do 17 bomber he believed was going to bomb the palace. Holmes had run out of ammunition and made the quick decision to ram it. Holmes bailed out, and the aircraft crashed into the forecourt of London Victoria station. The bomber's engine was later exhibited at the Imperial War Museum in London. The British pilot became a King's Messenger after the war and died at the age of 90 in 2005. On VE Day—8 May 1945—the palace was the centre of British celebrations. The King, the Queen, Princess Elizabeth (the future Queen) and Princess Margaret appeared on the balcony, with the palace's blacked-out windows behind them, to cheers from a vast crowd in The Mall. The damaged palace was carefully restored after the war by John Mowlem & Co. Mid 20th century to present day Many of the palace's contents are part of the Royal Collection, held in trust by Elizabeth II; they can, on occasion, be viewed by the public at the Queen's Gallery, near the Royal Mews. The purpose-built gallery opened in 1962 and displays a changing selection of items from the collection. It occupies the site of the chapel that was destroyed in the Second World War. The palace was designated a Grade I listed building in 1970.
Its state rooms have been open to the public during August and September, and on some dates throughout the year, since 1993. The money raised in entry fees was originally put towards the rebuilding of Windsor Castle.
In October 2020, it was announced that Sean Doyle, then chief executive of Aer Lingus (also part of the IAG airline group), would succeed Álex Cruz as CEO. Corporate affairs Operations British Airways is the largest airline based in the United Kingdom in terms of fleet size, international flights, and international destinations, and was, until 2008, the largest airline by passenger numbers. The airline carried 34.6 million passengers in 2008, but rival carrier easyJet transported 44.5 million passengers that year, passing British Airways for the first time. British Airways holds a United Kingdom Civil Aviation Authority Type A Operating Licence, which permits it to carry passengers, cargo, and mail on aircraft with 20 or more seats. The airline's head office, Waterside, stands in Harmondsworth, a village near Heathrow Airport. Waterside was completed in June 1998 to replace British Airways' previous head office, Speedbird House, located in Technical Block C on the grounds of Heathrow. British Airways' main base is at Heathrow Airport, but it also has a major presence at Gatwick Airport. It also has a base at London City Airport, where its subsidiary BA Cityflyer is the largest operator. BA had previously operated a significant hub at Manchester Airport. Manchester to New York (JFK) services were withdrawn, and later all international services outside London ceased when the subsidiary BA Connect was sold. Passengers wishing to travel internationally with BA either to or from regional UK destinations must now transfer in London. Heathrow Airport is dominated by British Airways, which owns 40% of the slots available at the airport. The majority of BA services operate from Terminal 5, with the exception of some flights at Terminal 3 owing to insufficient capacity at Terminal 5. In August 2014, Willie Walsh advised that the airline would continue to use flight paths over Iraq despite the hostilities there. A few days earlier, Qantas had announced it would avoid Iraqi airspace, while other airlines did likewise.
The issue arose following the downing of Malaysia Airlines Flight 17 over Ukraine, and a temporary suspension of flights to and from Ben Gurion Airport during the 2014 Israel–Gaza conflict. Subsidiaries and shareholdings BA CityFlyer, a wholly owned subsidiary, offers flights from its base at London City Airport to 23 destinations throughout Europe. It operates 22 Embraer E-190 aircraft. The airline focuses on serving the financial market, though it has recently expanded into the leisure market, offering routes to Ibiza, Palma and Venice. In March 2015, Qatar Airways purchased a 10% stake in International Airlines Group, the parent of British Airways and Iberia, for €1.2 billion (US$1.26 billion). By early 2020, this had increased to 25%, costing a further US$600 million. BEA Helicopters was renamed British Airways Helicopters in 1974 and operated passenger and offshore oil support services until it was sold in 1986. Other former subsidiaries include the German airline Deutsche BA from 1997 until 2003 and the French airline Air Liberté from 1997 to 2001. British Airways also owned Airways Aero Association, the operator of the British Airways flying club based at Wycombe Air Park in High Wycombe, until it was sold to Surinder Arora in 2007. South Africa's Comair and Denmark's Sun Air of Scandinavia have been franchisees of British Airways since 1996. British Airways obtained a 15% stake in UK regional airline Flybe from the sale of BA Connect in March 2007. It sold the stake in 2014. BA also owned a 10% stake in InterCapital and Regional Rail (ICRR), the company that managed the operations of Eurostar (UK) Ltd from 1998 to 2010, when the management of Eurostar was restructured. With the creation of an Open Skies agreement between Europe and the United States in March 2008, British Airways started a new subsidiary airline called OpenSkies (previously known as "Project Lauren"). The airline started operations in June 2008, flying directly from Paris-Orly to Newark.
However, it ceased operations on 2 September 2018, when it was replaced with Level flights on that route. British Airways Limited was established in 2012 to take over the operation of the premium service between London City Airport and New York-JFK. BA began the service in September 2009, using two Airbus A318s fitted with 32 lie-flat beds in an all-business-class cabin. Flights operate under the numbers previously reserved for Concorde: BA001–BA004. The flights returned to direct operation by British Airways plc in 2015. British Airways provides cargo services under the British Airways World Cargo brand. The division has been part of IAG Cargo since 2012 and is the world's twelfth-largest cargo airline based on total freight tonne-kilometres flown. BA World Cargo operates using the main BA fleet. Until the end of March 2014, it also operated three Boeing 747-8 freighter aircraft providing dedicated long-haul services under a wet-lease arrangement from Global Supply Systems. The division operates an automated cargo centre at Heathrow Airport and handles freight at Gatwick and Stansted airports. Business trends The key trends for the British Airways PLC Group are shown below. On the merger with Iberia, the accounting reference date was changed from 31 March to 31 December; figures below are therefore for the years to 31 March up to 2010, for the nine months to 31 December 2010, and for the years to 31 December thereafter: In 2020, due to the crisis caused by the COVID-19 pandemic, British Airways had to reduce its 42,000-strong workforce by 12,000 jobs. According to an estimate by IAG, its parent company, it will take the air travel industry several years to return to previous performance and profitability levels.
Industrial relations Staff working for British Airways are represented by a number of trade unions: pilots are represented by the British Air Line Pilots' Association, cabin crew by the British Airlines Stewards and Stewardesses Association (a branch of Unite the Union), while other branches of Unite the Union and the GMB Union represent other employees. Bob Ayling's management faced strike action by cabin crew over a £1 billion cost-cutting drive to return BA to profitability in 1997; this was the last time BA cabin crew would strike until 2009, although staff morale has reportedly been unstable since that incident. In an effort to increase interaction between management, employees, and the unions, various conferences and workshops have taken place, often with thousands in attendance. In 2005, wildcat action was taken by union members over a decision by Gate Gourmet not to renew the contracts of 670 workers and to replace them with agency staff; it is estimated that the strike cost British Airways £30 million and caused disruption to 100,000 passengers. In October 2006, BA became involved in a civil rights dispute when a Christian employee was forbidden to wear a necklace bearing the cross, a religious symbol. BA's practice of forbidding such symbols has been publicly questioned by British politicians such as the former Home Secretary John Reid and the former Foreign Secretary Jack Straw. Relations between BA and Unite have been turbulent. In 2007, cabin crew threatened strike action over salary changes to be imposed by BA management. The strike was called off at the last minute, with British Airways losing £80 million. In December 2009, a ballot for strike action over Christmas received a high level of support, but the action was blocked by a court injunction that deemed the ballot illegal. Negotiations failed to stop strike action in March, and BA withdrew perks from strike participants.
The Guardian newspaper alleged that BA had consulted outside firms on methods to undermine the unions; the story was later withdrawn. A strike was announced for May 2010, and British Airways again sought an injunction. Members of the Socialist Workers Party disrupted negotiations between BA management and Unite to prevent industrial action. Further disruption struck when Derek Simpson, a Unite co-leader, was discovered to have leaked details of confidential negotiations online via Twitter. Industrial action re-emerged in 2017, this time by BA's Mixed Fleet flight attendants, who were employed on much less favourable pay and terms and conditions than cabin staff who had joined prior to 2010. A ballot for industrial action was distributed to Mixed Fleet crew in November 2016 and resulted in an overwhelming majority in favour of industrial action. Unite described Mixed Fleet crew as on "poverty pay", with many Mixed Fleet flight attendants sleeping in their cars between shifts because they cannot afford the fuel to drive home, or operating while sick because they cannot afford to call in sick and lose their pay for the shift. Unite also accused BA of removing staff travel concessions, bonus payments and other benefits from all cabin crew who undertook industrial action, as well as of strike-breaking tactics such as wet-leasing aircraft from other airlines and offering financial incentives for cabin crew not to strike. The first dates of strikes during Christmas 2016 were cancelled due to pay negotiations. Industrial action by Mixed Fleet commenced in January 2017 after crews rejected a pay offer. Strike action continued throughout 2017 in numerous discontinuous periods, resulting in one of the longest-running disputes in aviation history. On 31 October 2017, after 85 days of discontinuous industrial action, Mixed Fleet accepted a new pay deal from BA, which ended the dispute.
Senior Leadership Chairman: Sean Doyle (since April 2021) Chief Executive: Sean Doyle (since October 2020) List of Former Chairmen Sir David Nicolson (1972–1975) Lord McFadzean of Kelvinside (1976–1979) Sir Ross Stainton (1979–1980) Lord King of Wartnaby (1981–1993) Lord Marshall of Knightsbridge (1993–2004) Sir Martin Broughton (2004–2013) Keith Williams (2013–2016) Álex Cruz (2016–2021) List of Former Chief Executives The position was formed in 1977. Sir Ross Stainton (1977–1979) Sir Roy Watts (1979–1983) Lord Marshall of Knightsbridge (1983–1995) Bob Ayling (1996–2000) Sir Rod Eddington (2000–2005) Willie Walsh (2005–2010) Keith Williams (2011–2016) Álex Cruz (2016–2020) Destinations British Airways serves over 160 destinations, including eight domestic and 24 in the United States. Alliances British Airways is a member and one of the founders of Oneworld, an airline alliance. Codeshare agreements British Airways codeshares with the following airlines: Aer Lingus airBaltic Alaska Airlines American Airlines Bangkok Airways Cathay Pacific China Eastern Airlines China Southern Airlines Finnair Iberia Japan Airlines Kenya Airways LATAM Brasil LATAM Chile Loganair Malaysia Airlines Qantas Qatar Airways Royal Jordanian S7 Airlines TAAG Angola Airlines Vueling Fleet British Airways operates a fleet of 253 aircraft, with 47 on order. BA operates a mix of Airbus narrow- and wide-body aircraft, and Boeing wide-body aircraft, specifically the 777 and 787. In October 2020, British Airways retired its fleet of 747-400 aircraft. It was one of the largest operators of the 747, having previously operated the -100, -200, and -400 variants from 1974 (1969 with BOAC). British Airways Engineering The airline has its own engineering branch to maintain its aircraft fleet; this includes line maintenance at over 70 airports around the world. As well as hangar facilities at Heathrow and Gatwick airports, it has two major maintenance centres at Glasgow and Cardiff airports.
Marketing Branding The musical theme predominantly used in British Airways advertising is "The Flower Duet" by Léo Delibes. This, and the slogan "The World's Favourite Airline", were introduced in 1989 with the launch of the iconic "Face" advertisement. The slogan was dropped in 2001 after Lufthansa overtook BA in terms of passenger numbers. "Flower Duet" is still used by the airline, and has been through several different arrangements since 1989. The most recent version of the melody was introduced in 2007 with a new slogan: "Upgrade to British Airways". Other advertising slogans have included "The World's Best Airline", "We'll Take More Care of You", and "Fly the Flag". BA held an account for 23 years with Saatchi & Saatchi, an agency that created many of its most famous advertisements, including the influential "Face" campaign. Saatchi & Saatchi later imitated this advert for Silverjet, a rival of BA, after BA had ended its business with the agency. Since 2007, BA has used Bartle Bogle Hegarty as its advertising agency. British Airways purchased the internet domain ba.com in 2002 from its previous owner Bell Atlantic, "BA" being the company's acronym and its IATA airline code. British Airways is the official airline of the Wimbledon Championships tennis tournament, and was the official airline and tier one partner of the 2012 Summer Olympics and Paralympics. British Airways was also the official airline of England's bid to host the 2018 Football World Cup. High Life, founded in 1973, is the official in-flight magazine of the airline. Safety video The airline used a cartoon safety video from circa 2005 until 2017. Beginning on 1 September 2017, the airline introduced a new Comic Relief live-action safety video hosted by Chabuddy G, with appearances by British celebrities Gillian Anderson, Rowan Atkinson, Jim Broadbent, Rob Brydon, Warwick Davis, Chiwetel Ejiofor, Ian McKellen, Thandie Newton, and Gordon Ramsay.
A "sequel" video, also hosted by Chabuddy G, was released in 2018, with Michael Caine, Olivia Colman, Jourdan Dunn, Naomie Harris, Joanna Lumley, and David Walliams. The two videos are part of Comic Relief's charity programme. Liveries, logos, and tail fins The aeroplanes that British Airways inherited from the four-way merger between BOAC, BEA, Cambrian, and Northeast were temporarily given the text logo "British airways" but retained the original airline's livery. With its formation in 1974, British Airways' aeroplanes were given a new white, blue, and red colour scheme with a cropped Union Jack painted on their tail fins, designed by Negus & Negus. In 1984, a new livery designed by Landor Associates updated the airline's look as it prepared for privatization.
In 1997, there was a controversial change to a new Project Utopia livery; all aircraft used the corporate colours consistently on the fuselage, but tailfins bore one of multiple designs. Several people spoke out against the change, including the former prime minister Margaret Thatcher, who famously covered the tail of a model 747 with a handkerchief at an event to show her displeasure. BA's traditional rival, Virgin Atlantic, took advantage of the negative press coverage by applying the Union Flag to the winglets of its aircraft along with the slogan "Britain's national flag carrier". In 1999, the CEO of British Airways, Bob Ayling, announced that all BA planes would adopt the Chatham Dockyard Union Flag tailfin design, based on the Union Flag and originally intended to be used only on Concorde. All BA aircraft have since borne the Chatham Dockyard Union Flag variant of the Project Utopia livery, except for four retro-liveried aircraft: to celebrate its centenary, BA announced four retro liveries, three on Boeing 747-400 aircraft (one each in the BOAC, Negus & Negus, and Landor Associates liveries) and one on an A319 in the BEA livery. Loyalty programmes British Airways' tiered loyalty programme, called the Executive Club, includes access to special lounges and dedicated "fast" queues. BA also invites its top corporate accounts to join a "Premier" incentive programme. British Airways operates airside lounges for passengers travelling in premium cabins, and these are available to certain tiers of Executive Club members. First class passengers, as well as Gold Executive Club members, are entitled to use First Class lounges. Business class passengers (called Club World or Club Europe in BA terms) as well as Silver Executive Club members may use Business lounges. At airports where BA does not operate a departure lounge, a third-party lounge is often provided for premium or status passengers.
In 2011, following the merger with Iberia, British Airways announced changes to the Executive Club to maximise integration between the airlines. This included the combination and rebranding of Air Miles, BA Miles, and Iberia Plus points as Avios, the IAG-operated loyalty programme. Inflight magazines High Life is British Airways' complimentary inflight magazine. It is available to all customers across all cabins and aircraft types. High Life Shop is British Airways' inflight shopping magazine. It is available to all customers on all aircraft where the inflight shopping range can be carried. First Life is a complimentary magazine offered to all customers travelling in the First cabin. It has a range of articles, including fashion, trends, and technology, aimed at an upmarket audience. Business Life is a complimentary magazine targeted at business travellers and frequent flyers. The magazine can be found in all short-haul aircraft seat pockets, in the magazine selection for Club World customers, and in lounges operated by British Airways. Cabins and services Short haul Economy class Euro Traveller is British Airways' economy class cabin on all short-haul flights within Europe, including domestic flights within the UK. Heathrow- and Gatwick-based flights are operated by Airbus A320-series aircraft. Standard seat pitch varies from 29" to 34" depending on aircraft type and location of the seat. All flights from Heathrow and Gatwick have a buy-on-board system with a range of food designed by Tom Kerridge. Food can be pre-ordered through the British Airways mobile application. Alternatively, a limited selection can be purchased on board using a credit or debit card or Frequent Flyer Avios points. British Airways is rolling out Wi-Fi across its fleet, with 90% of aircraft expected to be Wi-Fi enabled by 2020. Scheduled services operated by BA Cityflyer currently offer complimentary onboard catering. The service will switch to buy on board in the future.
Business class Club Europe is the short-haul business class available on all short-haul flights. This class allows access to business lounges at most airports and complimentary onboard catering. The middle seat of the standard Airbus-configured cabin is left free; on refurbished aircraft, a cocktail table folds up from under the middle seat. Pillows and blankets are available on longer flights. In-flight Mid-haul and long haul First class First is offered on all British Airways Airbus A380s, Boeing 777-300ERs, and Boeing 787-9/10s, and on some of its Boeing 777-200ERs. There are between eight and fourteen private suites depending on the aircraft type. Each First suite comes with a bed, a wide entertainment screen, and in-seat power. Dedicated British Airways 'Galleries First' lounges are available at some airports. The exclusive 'Concorde Room' lounges at Heathrow Terminal 5 and New York JFK airports offer pre-flight dining with waiter service and a more intimate space. Business lounges are used where these are not available. Club World Club World is the mid-haul and long-haul business class cabin. It is offered on all Boeing 777, Boeing 787, and Airbus A380 aircraft, and on selected Airbus A321 aircraft. The cabin features fully convertible flat-bed seats. In 2006, British Airways launched Next Generation New Club World, featuring larger seats. The Club World cabins are all configured in a similar design on widebody aircraft, with aisle seats facing forwards while middle and window seats face backwards (British Airways is one of only five carriers with backwards-facing business-class seats; American Airlines, Etihad Airways, United Airlines, and Qatar Airways are the others). In March 2019, BA unveiled its new business-class seat on the new A350 aircraft, which features a suite with a door. This will be retrofitted to most of BA's wide-body fleet over the coming years.
World Traveller Plus World Traveller Plus is the premium economy cabin provided on all BA long-haul aircraft. This cabin offers wider seats, extended legroom, and additional seat comforts such as a larger IFE screen (on most aircraft), a footrest, and power sockets. A complimentary 'World Traveller' bar is offered along with an upgraded main meal. World Traveller World Traveller is the mid-haul and long-haul economy class cabin. It offers seat-back entertainment, complimentary food and drink, pillows, and blankets. AVOD personal TV screens are available on all A321s, A350s, A380s, B777s, and B787s. AC power outlets and USB plug-in points are offered in every seat row on the Airbus A350, A380, Boeing 787, and 777-300ER, and on refurbished 777-200ER aircraft. The outlets accept both UK and US plugs. Incidents and accidents British Airways has a strong reputation for safety and has consistently been ranked among the top 20 safest airlines globally by Business Insider and AirlineRatings.com. Since BA's inception in 1974, it has been involved in three hull-loss incidents (British Airways Flight 149 was destroyed on the ground at Kuwait International Airport as a result of military action during the First Gulf War, with no one on board) and two hijacking attempts. To date, the only fatal accident experienced by a BA aircraft occurred in 1976 with British Airways Flight 476, which was involved in a midair collision later attributed to an error made by air traffic control. On 22 November 1974, British Airways Flight 870 was hijacked shortly after take-off from Dubai International Airport for London Heathrow. The Vickers VC10 landed at Tripoli for refuelling before flying on to Tunis. The captain, Jim Futcher, returned to
combined center of mass of a bicycle and its rider must lean into a turn to successfully navigate it. This lean is induced by a method known as countersteering, which can be performed by the rider turning the handlebars directly with the hands or indirectly by leaning the bicycle. Short-wheelbase or tall bicycles, when braking, can generate enough stopping force at the front wheel to flip longitudinally. The act of purposefully using this force to lift the rear wheel and balance on the front without tipping over is a trick known as a stoppie, endo, or front wheelie. Performance The bicycle is extraordinarily efficient in both biological and mechanical terms. The bicycle is the most efficient human-powered means of transportation in terms of energy a person must expend to travel a given distance. From a mechanical viewpoint, up to 99% of the energy delivered by the rider into the pedals is transmitted to the wheels, although the use of gearing mechanisms may reduce this by 10–15%. In terms of the ratio of cargo weight a bicycle can carry to total weight, it is also an efficient means of cargo transportation. A human traveling on a bicycle at low to medium speeds of around uses only the power required to walk. Air drag, which is proportional to the square of speed, requires dramatically higher power outputs as speeds increase. If the rider is sitting upright, the rider's body creates about 75% of the total drag of the bicycle/rider combination. Drag can be reduced by seating the rider in a more aerodynamically streamlined position. Drag can also be reduced by covering the bicycle with an aerodynamic fairing. The fastest recorded unpaced speed on a flat surface is . In addition, the carbon dioxide generated in the production and transportation of the food required by the bicyclist, per mile traveled, is less than that generated by energy efficient motorcars. 
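The speed and power relationship described above can be made concrete with a short calculation: since drag force grows with the square of speed, the power to overcome it (force times speed) grows with the cube. This is an illustrative sketch with assumed round-number coefficients, not figures from the text.

```python
# Illustrative sketch: power needed to overcome air drag and rolling
# resistance on a flat road. All coefficients are assumed round numbers
# for an upright rider, not values taken from the text.

RHO = 1.2    # air density in kg/m^3 (assumed)
CDA = 0.5    # drag coefficient x frontal area in m^2 (assumed, upright rider)
CRR = 0.005  # rolling-resistance coefficient (assumed)
MASS = 90.0  # combined rider + bicycle mass in kg (assumed)
G = 9.81     # gravitational acceleration in m/s^2

def drag_power(v_ms):
    """Aerodynamic drag power in watts: drag force (~v^2) times speed, so ~v^3."""
    return 0.5 * RHO * CDA * v_ms ** 3

def rolling_power(v_ms):
    """Rolling-resistance power in watts, linear in speed."""
    return CRR * MASS * G * v_ms

for kmh in (10, 20, 30, 40):
    v = kmh / 3.6  # km/h to m/s
    print(f"{kmh} km/h: drag {drag_power(v):6.1f} W, rolling {rolling_power(v):5.1f} W")
```

Doubling the speed multiplies the drag term by eight, which is why the rider's aerodynamic position dominates effort at higher speeds while rolling resistance dominates at walking pace.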
Parts Frame The great majority of modern bicycles have a frame with upright seating that looks much like the first chain-driven bike. These upright bicycles almost always feature the diamond frame, a truss consisting of two triangles: the front triangle and the rear triangle. The front triangle consists of the head tube, top tube, down tube, and seat tube. The head tube contains the headset, the set of bearings that allows the fork to turn smoothly for steering and balance. The top tube connects the head tube to the seat tube at the top, and the down tube connects the head tube to the bottom bracket. The rear triangle consists of the seat tube and paired chain stays and seat stays. The chain stays run parallel to the chain, connecting the bottom bracket to the rear dropout, where the axle for the rear wheel is held. The seat stays connect the top of the seat tube (at or near the same point as the top tube) to the rear fork ends. Historically, women's bicycle frames had a top tube that connected in the middle of the seat tube instead of the top, resulting in a lower standover height at the expense of compromised structural integrity, since this places a strong bending load in the seat tube, and bicycle frame members are typically weak in bending. This design, referred to as a step-through frame or as an open frame, allows the rider to mount and dismount in a dignified way while wearing a skirt or dress. While some women's bicycles continue to use this frame style, there is also a variation, the mixte, which splits the top tube laterally into two thinner top tubes that bypass the seat tube on each side and connect to the rear fork ends. The ease of stepping through is also appreciated by those with limited flexibility or other joint problems. Because of its persistent image as a "women's" bicycle, step-through frames are not common for larger frames. Step-throughs were popular partly for practical reasons and partly for social mores of the day. 
For most of the history of bicycles' popularity women have worn long skirts, and the lower frame accommodated these better than a high top tube. Furthermore, it was considered "unladylike" for women to open their legs to mount and dismount; in more conservative times, women who rode bicycles at all were vilified as immoral or immodest. These attitudes were akin to the older practice of riding a horse sidesaddle. Another style is the recumbent bicycle. These are inherently more aerodynamic than upright versions, as the rider may lean back onto a support and operate pedals that are on about the same level as the seat. The world's fastest bicycle is a recumbent bicycle, but this type was banned from competition in 1934 by the Union Cycliste Internationale. Historically, materials used in bicycles have followed a pattern similar to that in aircraft, the goal being high strength and low weight. Since the late 1930s, alloy steels have been used for frame and fork tubes in higher-quality machines. By the 1980s, aluminum welding techniques had improved to the point that aluminum tube could safely be used in place of steel. Since then, aluminum alloy frames and other components have become popular due to their light weight, and most mid-range bikes are now principally aluminum alloy of some kind. More expensive bikes use carbon fibre due to its significantly lighter weight and profiling ability, allowing designers to make a bike both stiff and compliant by manipulating the lay-up. Virtually all professional racing bicycles now use carbon fibre frames, as they have the best strength-to-weight ratio. A typical modern carbon fibre frame can weigh less than . Other exotic frame materials include titanium and advanced alloys. Bamboo, a natural composite material with a high strength-to-weight ratio and stiffness, has been used for bicycles since 1894. Recent versions use bamboo for the primary frame with glued metal connections and parts, priced as exotic models.
Drivetrain and gearing The drivetrain begins with the pedals, which rotate the cranks, which in turn are supported on the bottom bracket axle. Most bicycles use a chain to transmit power to the rear wheel. A very small number of bicycles use a shaft drive or special belts to transmit power. Hydraulic bicycle transmissions have been built, but they are currently inefficient and complex. Since cyclists' legs are most efficient over a narrow range of pedalling speeds, or cadence, a variable gear ratio helps a cyclist to maintain an optimum pedalling speed while covering varied terrain. Some, mainly utility, bicycles use hub gears with between 3 and 14 ratios, but most use the generally more efficient dérailleur system, by which the chain is moved between different cogs, called chainrings and sprockets, to select a ratio. A dérailleur system normally has two dérailleurs, or mechs: one at the front to select the chainring and another at the back to select the sprocket. Most bikes have two or three chainrings and from 5 to 11 sprockets on the back, with the number of theoretical gears calculated by multiplying front by back. In reality, many gears overlap or require the chain to run diagonally, so the number of usable gears is fewer. An alternative to chain drive is a synchronous belt. These are toothed and work much the same as a chain. Popular with commuters and long-distance cyclists, they require little maintenance. They cannot be shifted across a cassette of sprockets, and are used either as a single speed or with a hub gear. Different gears and ranges of gears are appropriate for different people and styles of cycling. Multi-speed bicycles allow gear selection to suit the circumstances: a cyclist could use a high gear when cycling downhill, a medium gear when cycling on a flat road, and a low gear when cycling uphill. In a lower gear, every turn of the pedals leads to fewer rotations of the rear wheel.
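The gear arithmetic in this passage can be sketched in a few lines; the chainring and sprocket tooth counts below are hypothetical examples, not a specific drivetrain from the text.

```python
# Illustrative sketch of derailleur gearing arithmetic. The tooth counts
# below are hypothetical examples, not a specific drivetrain.

chainrings = [34, 50]                          # front chainring teeth
sprockets = [11, 13, 15, 17, 19, 21, 24, 28]   # rear cassette teeth

# Theoretical gear count: front positions multiplied by rear positions.
theoretical_gears = len(chainrings) * len(sprockets)

# Gear ratio = chainring teeth / sprocket teeth, i.e. rear-wheel
# revolutions per crank revolution. A lower ratio is an easier (lower) gear.
ratios = sorted(
    round(front / rear, 2) for front in chainrings for rear in sprockets
)

print(theoretical_gears)       # 16 theoretical combinations
print(ratios[0], ratios[-1])   # easiest and hardest ratios
```

In practice several of these combinations duplicate one another or force the chain to run diagonally, so the usable count is lower. The smallest ratio here turns the rear wheel barely more than once per crank revolution, which is exactly the low-gear behaviour of fewer wheel rotations per pedal turn.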
This allows the energy required to move the same distance to be distributed over more pedal turns, reducing fatigue when riding uphill, with a heavy load, or against strong winds. A higher gear allows a cyclist to make fewer pedal turns to maintain a given speed, but with more effort per turn of the pedals. With a chain drive transmission, a chainring attached to a crank drives the chain, which in turn rotates the rear wheel via the rear sprocket(s) (cassette or freewheel). There are four gearing options: a two-speed hub gear integrated with the chainring; up to three chainrings; up to 11 sprockets; or a hub gear built into the rear wheel (3-speed to 14-speed). The most common options are either a rear hub gear or multiple chainrings combined with multiple sprockets (other combinations are possible but less common). Steering The handlebars connect to the stem, which connects to the fork, which connects to the front wheel; the whole assembly connects to the bike and rotates about the steering axis via the headset bearings. Three styles of handlebar are common. Upright handlebars, the norm in Europe and elsewhere until the 1970s, curve gently back toward the rider, offering a natural grip and a comfortable upright position. Drop handlebars "drop" as they curve forward and down, offering the cyclist the best braking power from a more aerodynamic "crouched" position, as well as more upright positions in which the hands grip the brake lever mounts, the forward curves, or the upper flat sections for increasingly upright postures. Mountain bikes generally feature a 'straight handlebar' or 'riser bar' with varying degrees of backward sweep and upward rise, as well as wider widths, which can provide better handling due to increased leverage against the wheel. Seating Saddles also vary with rider preference, from the cushioned ones favored by short-distance riders to narrower saddles which allow more room for leg swings. Comfort depends on riding position.
With comfort bikes and hybrids, cyclists sit high over the seat, their weight directed down onto the saddle, such that a wider and more cushioned saddle is preferable. For racing bikes, where the rider is bent over, weight is more evenly distributed between the handlebars and saddle, the hips are flexed, and a narrower and harder saddle is more efficient. Differing saddle designs exist for male and female cyclists, accommodating the genders' differing anatomies and sit bone width measurements, although bikes typically are sold with saddles most appropriate for men. Suspension seat posts and seat springs provide comfort by absorbing shock, but can add to the overall weight of the bicycle. A recumbent bicycle has a reclined chair-like seat that some riders find more comfortable than a saddle, especially riders who suffer from certain types of seat, back, neck, shoulder, or wrist pain. Recumbent bicycles may have either under-seat or over-seat steering. Brakes Bicycle brakes may be rim brakes, in which friction pads are compressed against the wheel rims; hub brakes, where the mechanism is contained within the wheel hub; or disc brakes, where pads act on a rotor attached to the hub. Most road bicycles use rim brakes, but some use disc brakes. Disc brakes are more common on mountain bikes, tandems, and recumbent bicycles than on other types of bicycles, due to their increased power, coupled with increased weight and complexity. With hand-operated brakes, force is applied to brake levers mounted on the handlebars and transmitted via Bowden cables or hydraulic lines to the friction pads, which apply pressure to the braking surface, causing friction which slows the bicycle down. A rear hub brake may be either hand-operated or pedal-actuated, as in the back-pedal coaster brakes which were popular in North America until the 1960s. Track bicycles do not have brakes, because all riders ride in the same direction around a track, which does not necessitate sharp deceleration.
Track riders are still able to slow down because all track bicycles are fixed-gear, meaning that there is no freewheel. Without a freewheel, coasting is impossible, so when the rear wheel is moving, the cranks are moving. To slow down, the rider applies resistance to the pedals, acting as a braking system which can be as effective as a conventional rear wheel brake, but not as effective as a front wheel brake. Suspension Bicycle suspension refers to the system or systems used to suspend the rider and all or part of the bicycle. This serves two purposes: to keep the wheels in continuous contact with the ground, improving control, and to isolate the rider and luggage from jarring due to rough surfaces, improving comfort. Bicycle suspensions are used primarily on mountain bicycles, but are also common on hybrid bicycles, as they can help deal with problematic vibration from poor surfaces. Suspension is especially important on recumbent bicycles, since while an upright bicycle rider can stand on the pedals to achieve some of the benefits of suspension, a recumbent rider cannot. Basic mountain bicycles and hybrids usually have front suspension only, whilst more sophisticated ones also have rear suspension. Road bicycles tend to have no suspension. Wheels and tires The wheel axle fits into fork ends in the frame and fork.
A pair of wheels may be called a wheelset, especially in the context of ready-built "off the shelf", performance-oriented wheels. Tires vary enormously depending on their intended purpose. Road bicycles use tires 18 to 25 millimeters wide, most often completely smooth, or slick, and inflated to high pressure in order to roll fast on smooth surfaces. Off-road tires are usually between wide, and have treads for gripping in muddy conditions or metal studs for ice. Groupset Groupset generally refers to all of the components that make up a bicycle excluding the frame, fork, stem, wheels, tires, and rider contact points such as the saddle and handlebars. Accessories Some components, which are often optional accessories on sports bicycles, are standard features on utility bicycles to enhance their usefulness, comfort, safety, and visibility. Fenders with spoilers (mudflaps) protect the cyclist and moving parts from spray when riding through wet areas. In some countries (e.g. Germany and the UK), fenders are called mudguards. Chainguards protect clothes from oil on the chain while preventing clothing from being caught between the chain and crankset teeth. Kickstands keep bicycles upright when parked, and bike locks deter theft. Front-mounted baskets, front or rear luggage carriers or racks, and panniers mounted above either or both wheels can be used to carry equipment or cargo. Pegs can be fastened to one or both of the wheel hubs, either to help the rider perform certain tricks or to provide a place for extra riders to stand or rest. Parents sometimes add rear-mounted child seats, an auxiliary saddle fitted to the crossbar, or both to transport children. Bicycles can also be fitted with a hitch to tow a trailer for carrying cargo, a child, or both. Toe clips and straps and clipless pedals help keep the foot locked in the proper pedal position and enable cyclists to pull and push the pedals.
Technical accessories include cyclocomputers for measuring speed, distance, heart rate, GPS data, etc. Other accessories include lights, reflectors, mirrors, racks, trailers, bags, water bottles and cages, and bells. Bicycle lights, reflectors, and helmets are required by law in some geographic regions, depending on the legal code. It is more common to see bicycles with bottle generators, dynamos, lights, fenders, racks, and bells in Europe. Bicyclists also wear specialized form-fitting and high-visibility clothing. Children's bicycles may be outfitted with cosmetic enhancements such as bike horns, streamers, and spoke beads. Training wheels are sometimes used when learning to ride. Bicycle helmets can reduce injury in the event of a collision or accident, and a suitable helmet is legally required of riders in many jurisdictions. Helmets may be classified as an accessory or as an item of clothing. Bike trainers enable cyclists to cycle while the bike remains stationary; they are frequently used to warm up before races, or indoors when riding conditions are unfavorable.

Standards

A number of formal and industry standards exist for bicycle components to help make spare parts exchangeable and to maintain a minimum product safety. The International Organization for Standardization (ISO) has a special technical committee for cycles, TC149, whose scope is "Standardization in the field of cycles, their components and accessories with particular reference to terminology, testing methods and requirements for performance and safety, and interchangeability". The European Committee for Standardization (CEN) also has a specific technical committee, TC333, that defines European standards for cycles. Its mandate states that EN cycle standards shall harmonize with ISO standards. Some CEN cycle standards were developed before ISO published their standards, leading to strong European influences in this area.
European cycle standards tend to describe minimum safety requirements, while ISO standards have historically harmonized parts geometry.

Maintenance and repair

Like all devices with mechanical moving parts, bicycles require a certain amount of regular maintenance and replacement of worn parts. A bicycle is relatively simple compared with a car, so some cyclists choose to do at least part of the maintenance themselves. Some components are easy to handle using relatively simple tools, while other components may require specialist manufacturer-dependent tools. Many bicycle components are available at several different price/quality points; manufacturers generally try to keep all components on any particular bike at about the same quality level, though at the very cheap end of the market there may be some skimping on less obvious components (e.g. the bottom bracket). There exist several hundred assisted-service community bicycle organizations worldwide, where laypeople bring in bicycles needing repair or maintenance and volunteers teach them how to do the required steps. Full service is available from bicycle mechanics at a local bike shop. In areas where it is available, some cyclists purchase roadside assistance from companies such as the Better World Club or the American Automobile Association.

Maintenance

The most basic maintenance item is keeping the tires correctly inflated; this can make a noticeable difference in how the bike feels to ride. Bicycle tires usually have a marking on the sidewall indicating the appropriate pressure for that tire. Bicycles use much higher pressures than cars: car tires are normally in the range of 30 to 40 pounds per square inch, while bicycle tires are normally in the range of 60 to 100 pounds per square inch. Another basic maintenance item is regular lubrication of the chain and of the pivot points for derailleurs and brakes.
Most of the bearings on a modern bike are sealed and grease-filled and require little or no attention; such bearings will usually last for 10,000 miles or more. The chain and the brake blocks are the components that wear out most quickly, so these need to be checked from time to time (typically every 500 miles or so). Most local bike shops will do such checks for free. When a chain becomes badly worn it will also wear out the rear cogs/cassette and eventually the chainring(s), so replacing a chain when it is only moderately worn will prolong the life of other components. Over the longer term, tires do wear out (after 2,000 to 5,000 miles); a rash of punctures is often the most visible sign of a worn tire.

Repair

Very few bicycle components can actually be repaired; replacement of the failing component is the normal practice. The most common roadside problem is a puncture. After removing the offending nail, tack, thorn, glass shard, etc., there are two approaches: either mend the puncture by the roadside, or replace the inner tube and then mend the puncture in the comfort of home. Some brands of tires are much more puncture-resistant than others, often incorporating one or more layers of Kevlar; the downside of such tires is that they may be heavier and/or more difficult to fit and remove.

Tools
Nanofibers and microfibers can be added to the polymer matrix to increase the mechanical properties of starch, improving elasticity and strength. Without the fibers, starch has poor mechanical properties due to its sensitivity to moisture. Because it is biodegradable and renewable, starch is used for many applications, including plastics and pharmaceutical tablets.

Cellulose: Cellulose is highly structured, with stacked chains that result in stability and strength. The strength and stability come from the straighter shape of cellulose, caused by glucose monomers joined together by glycosidic bonds. The straight shape allows the molecules to pack closely. Cellulose is very common in applications due to its abundant supply, its biocompatibility, and its environmental friendliness. Cellulose is widely used in the form of nano-fibrils called nano-cellulose. Nano-cellulose at low concentrations produces a transparent gel material, which can be used for biodegradable, homogeneous, dense films that are very useful in the biomedical field.

Alginate: Alginate is the most abundant marine natural polymer, derived from brown seaweed. Alginate biopolymer applications range from the packaging, textile, and food industries to biomedical and chemical engineering. The first application of alginate was as a wound dressing, where its gel-like and absorbent properties were discovered. When applied to wounds, alginate produces a protective gel layer that is optimal for healing and tissue regeneration and keeps a stable temperature environment. Additionally, there have been developments with alginate as a drug delivery medium, as the drug release rate can easily be manipulated thanks to the variety of alginate densities and fibrous compositions.

Biopolymer applications

The applications of biopolymers fall under two main fields: biomedical and industrial.
Biomedical

Because one of the main purposes of biomedical engineering is to mimic body parts in order to sustain normal body functions, biopolymers, with their biocompatible properties, are used extensively in tissue engineering, medical devices, and the pharmaceutical industry. Many biopolymers can be used for regenerative medicine, tissue engineering, drug delivery, and medical applications generally because of their mechanical properties. They offer characteristics such as wound healing, catalysis of bioactivity, and non-toxicity. Compared to synthetic polymers, which can present various disadvantages such as immunogenic rejection and toxicity after degradation, many biopolymers integrate better with the body, as they also possess more complex structures, similar to those of the human body. More specifically, polypeptides such as collagen and silk are biocompatible materials being used in groundbreaking research, as they are inexpensive and easily attainable. Gelatin polymer is often used on wound dressings, where it acts as an adhesive. Scaffolds and films with gelatin can hold drugs and other nutrients that can be supplied to a wound for healing. As collagen is one of the more popular biopolymers used in biomedical science, here are some examples of its use:

Collagen-based drug delivery systems: Collagen films act like a barrier membrane and are used to treat tissue infections such as infected corneal tissue or liver cancer. Collagen films have also been used as gene delivery carriers, which can promote bone formation.

Collagen sponges: Collagen sponges are used as a dressing to treat burn victims and other serious wounds. Collagen-based implants are used for cultured skin cells or as drug carriers for burn wounds and skin replacement.

Collagen as haemostat: When collagen interacts with platelets it causes a rapid coagulation of blood.
This rapid coagulation produces a temporary framework so that the fibrous stroma can be regenerated by host cells. Collagen-based haemostats reduce blood loss in tissues and help manage bleeding in cellular organs such as the liver and spleen.

Chitosan is another popular biopolymer in biomedical research. Chitosan is derived from chitin, the main component of the exoskeletons of crustaceans and insects and the second most abundant biopolymer in the world. Chitosan has many excellent characteristics for biomedical science: it is biocompatible; it is highly bioactive, meaning it stimulates a beneficial response from the body; it can biodegrade, which can eliminate a second surgery in implant applications; it can form gels and films; and it is selectively permeable. These properties allow for various biomedical applications of chitosan.

Chitosan as drug delivery: Chitosan is used mainly for drug targeting because it has the potential to improve drug absorption and stability. In addition, chitosan conjugated with anticancer agents can produce better anticancer effects by causing the gradual release of free drug into cancerous tissue.

Chitosan as an anti-microbial agent: Chitosan is used to stop the growth of microorganisms. It performs antimicrobial functions against microorganisms such as algae, fungi, bacteria, and gram-positive bacteria of different yeast species.

Chitosan composites for tissue engineering: Chitosan blended with alginate is used to form functional wound dressings. These dressings create a moist environment which aids the healing process. They are also very biocompatible and biodegradable and have porous structures that allow cells to grow into the dressing.

Industrial

Food: Biopolymers are being used in the food industry for things like packaging, edible encapsulation films, and food coatings. Polylactic acid (PLA) is very common in the food industry due to its clear color and resistance to water.
However, most polymers have a hydrophilic nature and start deteriorating when exposed to moisture. Biopolymers are also being used as edible films that encapsulate foods. These films can carry things like antioxidants, enzymes, probiotics, minerals, and vitamins, which the encapsulating biopolymer film supplies to the body when the food is consumed.

Packaging: The most common biopolymers used in packaging are polyhydroxyalkanoate (PHA), polylactic acid (PLA), and starch. Starch and PLA are commercially available and biodegradable, making them a common choice for packaging. However, their barrier and thermal properties are not ideal. Hydrophilic polymers are not water resistant and allow water through the packaging, which can affect the contents of the package. Polyglycolic acid (PGA) is a biopolymer with great barrier characteristics that is now being used to correct the barrier shortcomings of PLA and starch.

Water purification: Chitosan has been used for water purification. It is used as a flocculant that takes only a few weeks or months, rather than years, to degrade in the environment. Chitosan purifies water by chelation, the process in which binding sites along the polymer chain bind with metal ions in the water, forming chelates. Chitosan has been shown to be an excellent candidate for use in storm and waste water treatment.

As materials

Some biopolymers, such as PLA, naturally occurring zein, and poly-3-hydroxybutyrate, can be used as plastics, replacing the need for polystyrene- or polyethylene-based plastics. Some plastics are now referred to as being 'degradable', 'oxy-degradable' or 'UV-degradable'. This means that they break down when exposed to light or air, but these plastics are still primarily (as much as 98 per cent) oil-based and are not currently certified as 'biodegradable' under the European Union directive on Packaging and Packaging Waste (94/62/EC).
Biopolymers will break down, and some are suitable for domestic composting. Biopolymers (also called renewable polymers) are produced from biomass for use in the packaging industry. Biomass comes from crops such as sugar beet, potatoes, or wheat; when used to produce biopolymers, these are classified as non-food crops. They can be converted along the following pathways:

Sugar beet > Glyconic acid > Polyglyconic acid
Starch > (fermentation) > Lactic acid > Polylactic acid (PLA)
Biomass > (fermentation) > Bioethanol > Ethene > Polyethylene

Many types of packaging can be made from biopolymers: food trays, blown starch pellets for shipping fragile goods, and thin films for wrapping.

Environmental impacts

Biopolymers can be sustainable and carbon neutral, and they are always renewable, because they are made from plant materials which can be grown.

There are three main classes of biopolymers, classified according to the monomers used and the structure of the biopolymer formed: polynucleotides, polypeptides, and polysaccharides. Polynucleotides, such as RNA and DNA, are long polymers composed of 13 or more nucleotide monomers. Polypeptides and proteins are polymers of amino acids; some major examples include collagen, actin, and fibrin. Polysaccharides are linear or branched polymeric carbohydrates; examples include starch, cellulose, and alginate. Other examples of biopolymers include natural rubbers (polymers of isoprene), suberin and lignin (complex polyphenolic polymers), cutin and cutan (complex polymers of long-chain fatty acids), and melanin. Biopolymers have applications in many fields, including the food industry, manufacturing, packaging, and biomedical engineering.

Biopolymers versus synthetic polymers

A major defining difference between biopolymers and synthetic polymers can be found in their structures. All polymers are made of repetitive units called monomers.
Biopolymers often have a well-defined structure, though this is not a defining characteristic (example: lignocellulose). The exact chemical composition and the sequence in which the units are arranged is called the primary structure (in the case of proteins). Many biopolymers spontaneously fold into characteristic compact shapes (see also "protein folding" as well as secondary structure and tertiary structure), which determine their biological functions and depend in a complicated way on their primary structures. Structural biology is the study of the structural properties of biopolymers. In contrast, most synthetic polymers have much simpler and more random (or stochastic) structures. This leads to a molecular mass distribution that is missing in biopolymers. In fact, as their synthesis is controlled by a template-directed process in most in vivo systems, biopolymers of a given type (say, one specific protein) are all alike: they contain similar sequences and numbers of monomers and thus all have the same mass. This phenomenon is called monodispersity, in contrast to the polydispersity encountered in synthetic polymers. As a result, biopolymers have a dispersity of 1.

Conventions and nomenclature

Polypeptides

The convention for a polypeptide is to list its constituent amino acid residues as they occur from the amino terminus to the carboxylic acid terminus. The amino acid residues are always joined by peptide bonds. Protein, though used colloquially to refer to any polypeptide, refers to larger or fully functional forms and can consist of several polypeptide chains as well as single chains. Proteins can also be modified to include non-peptide components, such as saccharide chains and lipids.
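The monodispersity/polydispersity contrast can be made concrete: dispersity is the ratio of the weight-average molar mass (Mw) to the number-average molar mass (Mn). A minimal Python sketch of that calculation follows; the `dispersity` helper and the sample chain-length distributions are illustrative assumptions, not taken from the text.

```python
def dispersity(chains):
    """Compute the dispersity (D = Mw/Mn) of a polymer sample.

    chains: list of (molar_mass, count) pairs describing how many
    chains of each molar mass the sample contains (illustrative data).
    """
    n_total = sum(count for _, count in chains)           # total number of chains
    mass_total = sum(count * m for m, count in chains)    # total sample mass
    mn = mass_total / n_total                             # number-average molar mass
    mw = sum(count * m * m for m, count in chains) / mass_total  # weight-average
    return mw / mn

# Template-directed biopolymer: every chain identical -> monodisperse
print(dispersity([(50_000, 1000)]))  # 1.0

# Synthetic polymer: a spread of chain lengths -> polydisperse, D > 1
print(dispersity([(20_000, 400), (50_000, 400), (100_000, 200)]))  # about 1.37
```

A sample in which every chain has the same mass gives Mw = Mn and hence a dispersity of exactly 1, matching the statement above; any spread of chain lengths pushes Mw above Mn.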
Nucleic acids

The convention for a nucleic acid sequence is to list the nucleotides as they occur from the 5' end to the 3' end of the polymer chain, where 5' and 3' refer to the numbering of the carbons around the ribose ring which participate in forming the phosphate diester linkages of the chain. Such a sequence is called the primary structure of the biopolymer.

Sugar

Sugar polymers can be linear or branched and are typically joined by glycosidic bonds. The exact placement of the linkage can vary, and the orientation of the linking functional groups is also important, resulting in α- and β-glycosidic bonds, with numbering definitive of the linking carbons' location in the ring. In addition, many saccharide units can undergo various chemical modifications, such as amination, and can even form parts of other molecules, such as glycoproteins.

Structural characterization

There are a number of biophysical techniques for determining sequence information. Protein sequences can be determined by Edman degradation, in which the N-terminal residues are hydrolyzed from the chain one at a time, derivatized, and then identified. Mass spectrometry techniques can also be used. Nucleic acid sequences can be determined using gel electrophoresis and capillary electrophoresis. Lastly, the mechanical properties of these biopolymers can often be measured using optical tweezers or atomic force microscopy. Dual-polarization interferometry can be used to measure the conformational changes or self-assembly of these materials when stimulated by pH, temperature, ionic strength, or other binding partners.

Common biopolymers

Collagen: Collagen is the primary structural protein of vertebrates and the most abundant protein in mammals. Because of this, collagen is one of the most easily attainable biopolymers and is used for many research purposes. Because of its mechanical structure, collagen has high tensile strength and is a non-toxic, easily absorbable, biodegradable, and biocompatible material.
Therefore, it has been used for many medical applications, such as in the treatment of tissue infection, drug delivery systems, and gene therapy.

Silk fibroin: Silk fibroin (SF) is another protein-rich biopolymer that can be obtained from different silkworm species, such as the mulberry worm Bombyx mori. In contrast to collagen, SF has a lower tensile strength but has strong adhesive properties due to its insoluble and fibrous protein composition. In recent studies, silk fibroin has been found to possess anticoagulation properties and platelet adhesion. Silk fibroin has additionally been found to support stem cell proliferation in vitro.

Gelatin: Gelatin is obtained from type I collagen consisting of cysteine, and is produced by the partial hydrolysis of collagen from the bones, tissues, and skin of animals. There are two types of gelatin, Type A and Type B. Type A is derived by acid hydrolysis of collagen and has 18.5% nitrogen. Type B is derived by alkaline hydrolysis and contains 18% nitrogen and no amide groups. Elevated temperatures cause the gelatin to melt and exist as coils, whereas lower temperatures result in a coil-to-helix transformation. Gelatin contains many functional groups, like NH2, SH, and COOH, which allow gelatin to be modified using nanoparticles and biomolecules. Gelatin is an extracellular matrix protein, which allows it to be applied in applications such as wound dressings, drug delivery, and gene transfection.

Starch: Starch is an inexpensive, biodegradable biopolymer that is copious in supply. Nanofibers and microfibers can be added to the polymer matrix to increase the mechanical properties of starch, improving elasticity and strength.
The result was effectively a repeat of the 1997 general election, with Labour losing only six seats overall and the Conservatives making a net gain of one seat (gaining nine seats but losing eight). The Conservatives gained a seat in Scotland, which ended the party's status in the prior parliament as an "England-only" party, but they again failed to win any seats in Wales. Although they did not gain many seats, three of the few new MPs elected were future Conservative Prime Ministers David Cameron and Boris Johnson and future Conservative Chancellor of the Exchequer George Osborne; Osborne would serve in the same Cabinet as Cameron from 2010 to 2016. The Liberal Democrats made a net gain of six seats. The 2001 general election is the last to date in which any government has held an overall majority of more than 100 seats in the House of Commons, and the second of only two since the Second World War (the other being 1997) in which a single party won over 400 MPs. Notable departing MPs included former Prime Ministers Edward Heath (also Father of the House) and John Major; former Deputy Prime Minister Michael Heseltine; former Liberal Democrat leader Paddy Ashdown; former Cabinet ministers Tony Benn, Tom King, John Morris, Mo Mowlam, John MacGregor and Peter Brooke; Teresa Gorman; and then Mayor of London Ken Livingstone. Change was seen in Northern Ireland, with the moderate unionist Ulster Unionist Party (UUP) losing four seats to the more hardline Democratic Unionist Party (DUP). A similar transition appeared in the nationalist community, with the moderate Social Democratic and Labour Party (SDLP) losing votes to the more staunchly republican and abstentionist Sinn Féin. Exceptionally low voter turnout, which fell below 60% for the first (and so far, only) time since 1918, also marked this election. The election was broadcast live on the BBC and presented by David Dimbleby, Jeremy Paxman, Andrew Marr, Peter Snow, and Tony King.
The 2001 general election was notable for being the first in which pictures of the party logos appeared on the ballot paper. Prior to this, the ballot paper had displayed only the candidate's name, address, and party name.

Overview

The election had been expected on 3 May, to coincide with local elections, but on 2 April 2001 both were postponed to 7 June because of rural movement restrictions imposed in response to the foot-and-mouth outbreak that had started in February. The elections were marked by voter apathy, with turnout falling to 59.4%, the lowest (and the first below 60%) since the Coupon Election of 1918. Throughout the election the Labour Party maintained a significant lead in the opinion polls, and the result was deemed so certain that some bookmakers paid out for a Labour majority before election day. The polls the previous autumn, however, had shown the first Tory lead in eight years (though only by a narrow margin), as the party benefited from public anger towards the government over the fuel protests, which had led to a severe shortage of motor fuel. By the end of 2000, however, the dispute had been resolved and Labour was firmly back in the lead in the opinion polls. In total, a mere 29 parliamentary seats changed hands at the 2001 election. 2001 also saw the rare election of an independent: Richard Taylor of Independent Kidderminster Hospital and Health Concern (now usually known simply as "Health Concern") unseated a government MP, David Lock, in Wyre Forest. There was also a high vote for British National Party leader Nick Griffin in Oldham West and Royton, in the wake of recent race riots in the town of Oldham. In Northern Ireland, the election was far more dramatic and marked a move by unionists away from support for the Good Friday Agreement, with the moderate unionist Ulster Unionist Party (UUP) losing to the more hardline Democratic Unionist Party (DUP).
This polarisation was also seen in the nationalist community, with the Social Democratic and Labour Party (SDLP) vote losing out to the more left-wing and republican Sinn Féin. The election also saw a tightening of the parties, as the small UK Unionist Party lost its only seat.

Campaign

For Labour, the previous four years had run relatively smoothly. The party had successfully defended all of its by-election seats, and many suspected a Labour win was inevitable from the start. Many in the party, however, were afraid of voter apathy, a fear epitomised in a poster of "Hague with Lady Thatcher's hair", captioned "Get out and vote. Or they get in." Despite recessions in mainland Europe and the United States following the bursting of the global tech bubble, Britain was notably unaffected, and Labour could rely on a strong economy as unemployment continued to decline toward election day, putting to rest any fears that a Labour government would put the economic situation at risk. For William Hague, however, the Conservative Party had still not fully recovered from the loss of 1997. The party was still divided over Europe, and talk of a referendum on joining the Eurozone was rife. As Labour remained at the political centre, the Tories moved to the right. A policy gaffe by Oliver Letwin over public spending cuts left the party with an own goal that Labour soon exploited. Margaret Thatcher also added to Hague's troubles when she spoke out strongly against the Euro, to applause. Hague himself, although a witty performer at Prime Minister's Questions, was dogged in the press and reminded of his speech, given at the age of 16, at the 1977 Conservative Conference. The Sun newspaper only added to the Conservatives' woes by backing Labour for a second consecutive election, calling Hague a "dead parrot" during the Conservative Party's conference in October 1998. The Tories campaigned on a strongly right-wing platform, emphasising the issues of Europe, immigration and tax, the fabled "Tebbit Trinity". They also released a poster showing a heavily pregnant Tony Blair, stating "Four years of Labour and he still hasn't delivered".
However, Labour countered by asking where the proposed tax cuts were going to come from, and decried the Tory policy as "cut here, cut there, cut everywhere", in reference to the widespread belief that the Conservatives would make major cuts to public services in order to fund tax cuts. Charles Kennedy contested his first election as leader of the Liberal Democrats.

Controversy

During the election, Sharron Storer, a resident of Birmingham, criticised Prime Minister Tony Blair in front of television cameras about conditions in the National Health Service. The widely televised incident happened on 16 May during a campaign visit by Blair to the Queen Elizabeth Hospital in Birmingham. Sharron Storer's partner, Keith Sedgewick, a cancer patient with non-Hodgkin's lymphoma and therefore highly susceptible to infection, was being treated at the time in the bone marrow unit, but no bed could be found for him and he was transferred to the casualty unit for his first 24 hours. On the evening of the same day, Deputy Prime Minister John Prescott punched a protester after being hit by an egg on his way to an election rally in Rhyl, North Wales.

Endorsements

Labour received endorsements from The Sun, The Daily Express, The Times (for the first time in its history), The Daily Mirror, and The Guardian. The Independent endorsed Labour and/or the Liberal Democrats. The Conservatives were endorsed by the Daily Mail and The Daily Telegraph.

Opinion polling

Results

The election result was effectively a repeat of 1997, as the Labour Party retained an overwhelming majority, with the BBC announcing the victory at 02:58 in the early morning of 8 June. Labour had presided over relatively serene political, economic, and social conditions, the feeling of prosperity in the United Kingdom had been maintained into the new millennium, and the party would have a free hand to assert its ideals in the subsequent parliament.
Despite the victory, voter apathy was a major issue, as turnout fell below 60%, 12 percentage points down on 1997. All three of the main parties saw their total votes fall: Labour's total vote dropped by 2.8 million compared with 1997, the Conservatives' by 1.3 million, and the Liberal Democrats' by 428,000. Some suggested this dramatic fall was a sign of general acceptance of the status quo and of the likelihood of Labour's majority remaining unassailable. For the Conservatives, the huge loss they had sustained in 1997 was repeated. Despite gaining nine seats, the Tories lost seven to the Liberal Democrats, and one even to Labour. William Hague was quick to announce his resignation, doing so at 07:44
Rather than treating the fall of Adam and Eve as a negative development for humanity, the Book of Mormon instead portrays the fall as a foreordained step in God's plan of salvation, necessary to securing human agency, joy, growth, and eventual righteousness. This positive interpretation of the Adam and Eve story contributes to the Book of Mormon's emphasis "on the importance of human freedom and responsibility" to choose salvation.

Dialogic revelation

In the Book of Mormon, revelation from God typically manifests as "personalized, dialogic exchange" between God and persons, "rooted in a radically anthropomorphic theology" that personifies deity as a being who hears prayers and provides direct answers to questions. Multiple narratives in the book portray revelation as a dialogue in which petitioners and deity engage one another in a mutual exchange in which God's contributions originate from outside the mortal recipient. The Book of Mormon also emphasizes regular prayer as a significant component of devotional life, depicting it as a central means through which such dialogic revelation can take place. Distinctively, the Book of Mormon's portrayal democratizes revelation by extending it beyond the "Old Testament paradigms" of prophetic authority. In the Book of Mormon, dialogic revelation from God is not the purview of prophets alone but is instead the right of every person. Figures such as Nephi and Ammon receive visions and revelatory direction prior to or without ever becoming prophets, and Laman and Lemuel are rebuked for hesitating to pray for revelation. In the Book of Mormon, God and the divine are directly knowable through revelation and spiritual experience. Also in contrast with traditional Christian conceptions of revelation is the Book of Mormon's broader range of revelatory content. In the Book of Mormon, revelatory topics include not only the expected "exegesis of existence" but also questions that are "pragmatic, and at times almost banal in their mundane specificity".
Figures petition God for revelatory answers to doctrinal questions and ecclesiastical crises as well as for inspiration to guide hunts, military campaigns, and sociopolitical decisions, and the Book of Mormon portrays God providing answers to these inquiries. The Book of Mormon depicts revelation as an active and sometimes laborious experience. For example, the Book of Mormon's Brother of Jared learns to act not merely as a petitioner with questions but as an interlocutor with "a specific proposal" for God to consider as part of a guided process of miraculous assistance. Also in the Book of Mormon, Enos describes his revelatory experience as a "wrestle which I had before God" that spanned hours of intense prayer. Religious significance Joseph Smith Like many other early adherents of the Latter Day Saint movement, Smith referenced Book of Mormon scriptures in his preaching relatively infrequently and cited the Bible more often, likely because he was more familiar with the Bible, which he had grown up with. In 1832, Smith dictated a revelation that condemned the "whole church" for treating the Book of Mormon lightly, although even after doing so Smith still referenced the Book of Mormon less often than the Bible. Nevertheless, in 1841 Joseph Smith characterized the Book of Mormon as "the most correct of any book on earth, and the keystone of [the] religion". Although Smith quoted the book infrequently, he was "absorbed into the world of the Book of Mormon" through its narrative content and conceived of his prophetic identity within the framework of the Book of Mormon's portrayal of a world history full of sacred records of God's dealings with humanity and description of him as a revelatory translator. While they were held in Carthage Jail together, shortly before being killed in a mob attack, Joseph's brother Hyrum Smith read aloud from the Book of Mormon, and Joseph told the jail guards present that the Book of Mormon was divinely authentic. 
The Church of Jesus Christ of Latter-day Saints The Book of Mormon is one of the four sacred texts accepted by Latter-day Saints, who call this scriptural canon the standard works. Church leaders and publications have "strongly affirm[ed]" Smith's claims of the book's significance to the faith. According to the church's "Articles of Faith"—a document written by Joseph Smith in 1842 and canonized by the church as scripture in 1880—members "believe the Bible to be the word of God as far as it is translated correctly," and they "believe the Book of Mormon to be the word of God," without the translation qualification. Up through the mid-twentieth century, the Book of Mormon's significance to Latter-day Saints came more from its "status as a sign" than its specific content. Church leaders and missionaries emphasized it as part of a causal chain which held that if the Book of Mormon was "verifiably true revelation of God," then it justified Smith's claims to prophetic authority to restore the New Testament church. In addition to signifying Smith's prophetic calling, the Book of Mormon also signaled the "restoration of all things", ending what was believed to have been an apostasy from true Christianity. Early Latter-day Saints additionally tended to interpret the Book of Mormon through a millenarian lens and consequently believed the book portended Christ's imminent Second Coming. Latter-day Saints have also long believed the Book of Mormon's contents confirm and fulfill biblical prophecies. For example, "many Latter-day Saints" consider the biblical patriarch Jacob's description of his son Joseph as "a fruitful bough... whose branches run over a wall" a prophecy of Lehi's posterity—described as descendants of Joseph—overflowing into the New World. Latter-day Saints also believe the Bible prophesies of the Book of Mormon as an additional testament to God's dealings with humanity, such as in their interpretation of Ezekiel 37's injunction to "take thee one stick... 
For Judah, and... take another stick... For Joseph" as referring to the Bible as the "stick of Judah" and the Book of Mormon as "the stick of Joseph". In the 1980s, the church placed greater emphasis on the Book of Mormon as a central text of the faith and on studying and reading it as a means for devotional communion with Jesus Christ. In 1982, it added the subtitle "Another Testament of Jesus Christ" to its official editions of the Book of Mormon. Ezra Taft Benson, the church's thirteenth president (1985–1994), especially emphasized the Book of Mormon. Referencing Smith's 1832 revelation, Benson said the church remained under condemnation for treating the Book of Mormon lightly. Since the late 1980s, Latter-day Saint leaders have encouraged church members to read from the Book of Mormon daily. In an August 2005 message, church president Gordon B. Hinckley challenged each member of the church to re-read the Book of Mormon before the year's end, and by 2016, "Increasing numbers of Latter-day Saints use[d] the [Book of Mormon] for private and family devotions." The Book of Mormon is "the principal scriptural focus" of the church and "absolutely central" to Latter-day Saint worship, including in weekly services, Sunday School, youth seminaries, and more. The church encourages those considering joining the faith to follow the suggestion in the Book of Mormon's final chapter to study the book, ponder it, and pray to God about it. Latter-day Saints believe that sincerely doing so will provide the reader with a spiritual witness confirming it as true scripture. The relevant passage in the chapter is sometimes referred to as "Moroni's Promise." Approximately 90 to 95% of all Book of Mormon printings have been affiliated with the church. As of October 2020, it has published more than 192 million copies of the Book of Mormon. 
Community of Christ The Community of Christ (formerly the Reorganized Church of Jesus Christ of Latter Day Saints or RLDS Church) views the Book of Mormon as scripture which provides an additional witness of Jesus Christ in support of the Bible. The Community of Christ publishes two versions of the book. The first is the Authorized Edition, first published by the then-RLDS Church in 1908, whose text is based on comparing the original printer's manuscript and the 1837 Second Edition (or "Kirtland Edition") of the Book of Mormon. Its content is similar to the Latter-day Saint edition of the Book of Mormon, but the versification is different. The Community of Christ also publishes a "New Authorized Version" (also called a "reader's edition"), first released in 1966, which attempts to modernize the language of the text by removing archaisms and standardizing punctuation. Use of the Book of Mormon varies among members of the Community of Christ. The church describes it as scripture and includes references to the Book of Mormon in its official lectionary. In 2010, representatives told the National Council of Churches that "the Book of Mormon is in our DNA". At the same time, its use in North American congregations declined between the mid-twentieth and twenty-first centuries. Also during this time, the Community of Christ moved away from emphasizing the Book of Mormon as a historically authentic text. Community of Christ president W. Grant McMurray "opened the door to considering the book more myth than history" in the late-twentieth century, and in 2001 he reflected, "The proper use of the Book of Mormon as sacred scripture has been under wide discussion in the 1970s and beyond, in part because of long-standing questions about its historical authenticity and in part because of perceived theological inadequacies, including matters of race and ethnicity." At the 2007 Community of Christ World Conference, church president Stephen M. 
Veazey ruled a resolution to "reaffirm the Book of Mormon as a divinely inspired record" out of order. He stated that "while the Church affirms the Book of Mormon as scripture, and makes it available for study and use in various languages, we do not attempt to mandate the degree of belief or use. This position is in keeping with our longstanding tradition that belief in the Book of Mormon is not to be used as a test of fellowship or membership in the church." In keeping with this approach, there are "tens of thousands" of members in some congregations outside North America, such as Haiti and Africa, who "have never used the Book of Mormon". Some Community of Christ members with "more traditional-thinking" on the Book of Mormon have in turn "either left the church or doubled their efforts to bring the Book of Mormon back to the center of the theological and scriptural life of the church." Greater Latter Day Saint movement Since the death of Joseph Smith in 1844, there have been approximately seventy different churches that have been part of the Latter Day Saint movement, fifty of which were extant as of 2012. Religious studies scholar Paul Gutjahr explains that "each of these sects developed its own special relationship with the Book of Mormon". For example, James Strang, who led a denomination in the nineteenth century, reenacted Smith's production of the Book of Mormon by claiming in the 1840s and 1850s to receive and translate new scriptures engraved on metal plates, which became the Voree Plates and the Book of the Law of the Lord. William Bickerton led another denomination, The Church of Jesus Christ of Latter Day Saints (today called The Church of Jesus Christ), which accepted the Book of Mormon as scripture alongside the Bible, although it did not canonize other Latter Day Saint religious texts like the Doctrine and Covenants and Pearl of Great Price. 
The contemporary Church of Jesus Christ continues to consider the "Bible and Book of Mormon together" to be "the foundation of [their] faith and the building blocks of" their church. Separate editions of the Book of Mormon have been published by a number of churches in the Latter Day Saint movement, along with private individuals and organizations not endorsed by any specific denomination. Views on historical authenticity Mainstream archaeological, historical and scientific communities do not consider the Book of Mormon an ancient record of actual historical events. Principally, the content of the Book of Mormon does not correlate with archaeological, paleontological, and historical evidence about the past of the Americas. For example, there is no correlation between locations described in the Book of Mormon and known American archaeological sites. There is also no evidence in Mesoamerican societies of cultural influence from anything described in the Book of Mormon. Additionally, the Book of Mormon's narrative refers to the presence of animals, plants, metals, and technologies that archaeological and scientific studies have found little or no evidence of in post-Pleistocene, pre-Columbian America. Such anachronistic references include crops such as barley, wheat, and silk; livestock like sheep and horses; and metals and technology such as brass, steel, the wheel, and chariots. Furthermore, until the late-twentieth century, most adherents of the Latter Day Saint movement who affirmed Book of Mormon historicity believed the people described in the Book of Mormon text were the exclusive ancestors of all indigenous peoples in the Americas. However, linguistics and genetics proved that impossible. 
There are no widely accepted linguistic connections between any Native American languages and Near Eastern languages, and "the diversity of Native American languages could not have developed from a single origin in the time frame" that would be necessary to validate such a view of Book of Mormon historicity. Finally, there is no DNA evidence linking any Native American group to ancestry from the ancient Near East as a belief in Book of Mormon peoples as the exclusive ancestors of indigenous Americans would require. Instead, geneticists find that indigenous Americans' ancestry traces back to Asia. Despite this, most adherents of the Latter Day Saint movement consider the Book of Mormon to generally be historically authentic. Within the Latter Day Saint movement there are several apologetic groups and scholars that seek to answer challenges to Book of Mormon historicity in various ways. Most Book of Mormon apologetics is done by Latter-day Saints, and the most active and well-known apologetic groups have been the Foundation for Ancient Research and Mormon Studies (FARMS; now defunct) and FAIR (Faithful Answers, Informed Response; formerly FairMormon), both founded and operated by lay Latter-day Saints. Some apologetics aim to reconcile, refute, or dismiss criticisms of Book of Mormon historicity. For example, in response to linguistics and genetics rendering long-popular hemispheric models of Book of Mormon geography impossible, many apologists posit Book of Mormon peoples could have dwelled in a limited geographical region, usually either Mesoamerica or eastern North America, while indigenous peoples of other descents occupied the rest of the Americas. To account for anachronisms, apologists often suggest Smith's translation assigned familiar terms to unfamiliar ideas. 
Other apologetics strive to "affirmatively advocat[e]" historicity by identifying parallels between the Book of Mormon and antiquity, such as the presence of several complex chiasmi, a literary form used in ancient Hebrew poetry and in the Old Testament. Despite the popularity and influence of literature promoting Book of Mormon historicity, not all Mormons who affirm historicity are persuaded by apologetic work, and some claim historicity more modestly, such as Richard Bushman's statement that "I read the Book of Mormon as informed Christians read the Bible. As I read, I know the arguments against the book's historicity, but I can't help feeling that the words are true and the events happened. I believe it in the face of many questions." Although there is a "lack of specific response to" elements of the Book of Mormon that some Latter Day Saints consider evidence of ancient origins, when mainstream scholars do examine such elements, they typically deem them "chance based upon only superficial similarities". One critic has dubbed alleged parallels an example of parallelomania. In response to challenges to the Book of Mormon's historicity, some denominations and adherents of the Latter Day Saint movement consider the Book of Mormon a work of inspired fiction akin to pseudepigrapha or biblical midrash that constitutes scripture by revealing true doctrine about God, similar to a common interpretation of the biblical Book of Job. Many in Community of Christ hold this view, and the leadership takes no official position on Book of Mormon historicity while "Opinions about the Book of Mormon range from both ends of the spectrum" among members. Some Latter-day Saints consider the Book of Mormon fictional, although this view is marginal in the community at large. 
Church leaders and apologists frequently contend that "what is most fundamentally at stake in historicity is not the book's status as scripture but Joseph Smith's claims to prophetic authority." A few scholars propose considering the Book of Mormon an ancient and translated source text appended with modern pseudepigraphic expansions from Smith. Proponents hold that this model can simultaneously account for ancient literary artifacts and nineteenth-century influence in the Book of Mormon. However, the interpretation faces criticism "on multiple fronts" for either conceding too much to skepticism or for being more convoluted than straightforward historicism or unhistoricism. Influenced by continental philosophy, a handful of academics argue for "rethink[ing] the terms of the historicity debates" by understanding the Book of Mormon not as historical or unhistorical (either factual or fictional) but as nonhistorical (existing outside history). Most prominently, James E. Faulconer contends that both skeptical and affirmative approaches to Book of Mormon historicity make the same Enlightenment-derived assumptions about scriptures being representations of external reality, and he argues a more appropriate approach might adopt a premodern understanding of scripture as capable of divinely ordering, rather than simply depicting, reality. Historical context American Indian origins In the 1800s, most early European Americans had a biblical worldview, and numerous attempts were made to explain the origin of the Native Americans biblically. From the sixteenth century through the early-nineteenth, a common belief was that the Jews, particularly the Lost Ten Tribes, were the ancestors of Native Americans. One of the first books to suggest that Native Americans were descended from Jews was written by Jewish-Dutch rabbi and scholar Manasseh ben Israel in 1650. 
The Book of Mormon provided theological backing to this proposition, and suggested the lost Tribes of Israel would be found in other locations throughout the world as well. The idea was especially popular in the nineteenth century, when the Book of Mormon was published; archaeologist Stephen Williams notes that "the idea of relating the American Indians to the Lost Tribes of Israel was supported by many at this time." Additionally, European settlers viewed the impressive earthworks left behind by the Mound Builder cultures and had some difficulty believing that the Native Americans, whose numbers had been decimated over the previous centuries, could have produced them. A common theory was that a more technologically advanced people had built them, but were overrun and destroyed by a more savage, numerous group. Some observers have suggested the Book of Mormon parallels works within the "mound-builder" genre pervasive in the nineteenth century. Historian Curtis Dahl wrote, "Undoubtedly the most famous and certainly the most influential of all Mound-Builder literature is the Book of Mormon (1830). Whether one wishes to accept it as divinely inspired or the work of Joseph Smith, it fits exactly into the tradition." Others have argued the Book of Mormon does not comfortably fit the genre, such as historian Richard Bushman who wrote, "When other writers delved into Indian origins, they were explicit about recognizable Indian practices", such as Abner Cole, who dressed characters in moccasins in his parody of the book. Meanwhile, the "Book of Mormon deposited its people on some unknown shore—not even definitely identified as America—and had them live out their history in a remote place in a distant time, using names that had no connections to modern Indians" and without including stereotypical Indian terms, practices, or tropes. Critique of the United States The Book of Mormon can be read as a critique of the United States during Smith's lifetime. 
Historian of religion Nathan O. Hatch called the Book of Mormon "a document of profound social protest", and Bushman "found the book thundering no to the state of the world in Joseph Smith's time." In the Jacksonian era of antebellum America, class inequality was a major concern as fiscal downturns and the economy's transition from guild-based artisanship to private business sharpened socioeconomic disparity. Poll taxes in New York limited access to the vote, and the culture of civil discourse and mores surrounding liberty allowed social elites to ignore and delegitimize populist participation in public discourse. Ethnic injustice was also prominent, as Americans typically stereotyped American Indians as ferocious, lazy, and uncivilized. Meanwhile, antebellum disestablishment and denominational proliferation could be seen as undermining religious authority through ubiquity as "the different sects understood the same passages of scripture so differently", producing sectarian confusion that, for some, only obfuscated the path to spiritual security. Against the backdrop of these trends, the Book of Mormon "condemned social inequalities, moral abominations, rejection of revelations and miracles, disrespect for Israel (including the Jews), subjection of the Indians, and the abuse of the continent by interloping European migrants." The book's narratives critique the "Nationalist puffery" of "bourgeois public sphere[s]" where rules of civil democracy silence the demands of common people. The Book of Mormon also "advocates the cause of the poor" "[a]gainst increasing wealth and inequality", condemning acquisitiveness as antithetical to righteousness. The book's Lamanites, whom readers generally identified with American Indians, at times were overwhelmingly righteous, even producing a prophet who preached to backsliding Nephites. 
The Book of Mormon declared natives to be the rightful inheritors and leaders of the American continent, relegating European migrants to the role of "Gentiles... com[ing] onstage as interlopers". According to the book, implicitly European Gentiles had an obligation to serve the native people and join their remnant of covenant Israel or else face a violent downfall like the Nephites of the text. And although a "classic version of America's past... makes a cameo appearance" in the Book of Mormon through a vision of Nephi, the Book of Mormon's doctrine "contests the amalgam of Enlightenment, republican, Protestant, capitalist, and nationalist values that constituted American culture." The Book of Mormon's message can be read as rejecting American denominational pluralism, religious rationalism, capitalist individualism, and nationalist identity, calling instead for ecclesiastical unity, miraculous religion, communitarian economics, and universal society under God's authority. Manuscripts The Book of Mormon was dictated by Joseph Smith to several scribes over a period of 13 months, resulting in three manuscripts. Although 13 months elapsed, the translation itself took fewer than 65 days. The 116 lost pages contained the first portion of the Book of Lehi; the pages were lost after Smith loaned the original, uncopied manuscript to Martin Harris. The first completed manuscript, called the original manuscript, was written by a variety of scribes. Portions of the original manuscript were also used for typesetting. In October 1841, the entire original manuscript was placed into the cornerstone of the Nauvoo House, and sealed up until nearly forty years later when the cornerstone was reopened. It was then discovered that much of the original manuscript had been destroyed by water seepage and mold. Surviving manuscript pages were handed out to various families and individuals in the 1880s. 
Only 28 percent of the original manuscript now survives, including a remarkable find of fragments from 58 pages in 1991. The majority of what remains of the original manuscript is now kept in the LDS Church's archives. The second completed manuscript, called the printer's manuscript, was a copy of the original manuscript produced by Oliver Cowdery and two other scribes. It is at this point that initial copyediting of the Book of Mormon was completed. Observations of the original manuscript show little evidence of corrections to the text. Shortly before his death in 1850, Cowdery gave the printer's manuscript to David Whitmer. Teachings Much of the Book of Mormon consists of sermons and orations by various speakers, making up just over 40 percent of the text. These passages contain doctrinal and philosophical teachings on a wide range of topics, from basic themes of Christianity and Judaism to political and ideological teachings. Some of the teachings found in the Book of Mormon reiterate themes common to nineteenth-century American Christianity, such as describing the Bible as scripture and affirming covenantal theology. Other teachings are unique and distinctive, such as its descriptions of Jesus and the Atonement, rejection of original sin doctrine, and depiction of dialogic revelation. Jesus As stated on the title page, the Book of Mormon's central purpose is for the "convincing of the Jew and Gentile that Jesus is the Christ, the Eternal God, manifesting himself unto all nations." Jesus is mentioned every 1.7 verses on average and is referred to by one hundred different names. Although much of the Book of Mormon's internal chronology takes place prior to the birth of Jesus, prophets in the book frequently see him in vision and preach about him, and the people in the book worship Jesus as "pre-Christian Christians." 
For example, the book's first narrator, Nephi, describes having a vision, said to have taken place nearly 600 years prior to Jesus' birth, of the birth, ministry, and death of Jesus, and late in the book the narrator refers to converted peoples as "children of Christ". By depicting ancient prophets and peoples as familiar with Jesus as a Savior, the Book of Mormon universalizes Christian salvation as being the same in all times and places, and it implies that even more ancient peoples were familiar with Jesus. In the Book of Mormon, Jesus visits some early inhabitants of the Americas after his resurrection, and this event is often described as the climax of the book. During this ministry, he reiterates many teachings from the New Testament, re-emphasizes salvific baptism, and introduces the ritual consumption of bread and water "in remembrance of [his] body", a teaching that became the basis for modern Latter-day Saints' "memorialist" view of their sacrament ordinance (analogous to communion). Jesus's ministry in the Book of Mormon has been compared to Jesus's portrayal in the Gospel of John, as Jesus similarly teaches without parables and preaches faith and obedience as a central message. The Book of Mormon depicts Jesus with "a twist" on Christian trinitarianism. Jesus in the Book of Mormon is distinct from God the Father, much as he is in the New Testament, as he prays to God during a post-resurrection visit with the Nephites. However, the Book of Mormon also emphasizes Jesus and God have "divine unity," and other parts of the book call Jesus "the Father and the Son" or describe the Father, the Son, and the Holy Ghost as "one." As a result, beliefs among the churches of the Latter Day Saint movement range between social trinitarianism (such as among Latter-day Saints) and traditional trinitarianism (such as in Community of Christ). 
Distinctively, the Book of Mormon describes Jesus as having, prior to his birth, a spiritual "body" "without flesh and blood" that looked similar to how he would appear during his physical life. According to the book, the Brother of Jared lived before Jesus and saw him manifest in this spiritual "body" thousands of years prior to his birth. Plan of salvation The Christian concept of God's plan of salvation for humanity is a frequently recurring theme of the Book of Mormon. While the Bible does not directly outline a plan of salvation, the Book of Mormon explicitly refers to the concept thirty times, using a variety of terms such as plan of salvation, plan of happiness, and plan of redemption. The Book of Mormon's plan of salvation doctrine describes life as a probationary time for people to learn the gospel of Christ through revelation given to prophets and have the opportunity to choose whether or not to obey God. Jesus' atonement then makes repentance possible, enabling the righteous to enter a heavenly state after a final judgment. Although most of Christianity traditionally considers the fall of man a negative development for humanity, the Book of Mormon instead portrays the fall as a foreordained step in God's plan of salvation, necessary to securing human agency, joy, growth, and eventual righteousness. This positive interpretation of the Adam and Eve story contributes to the Book of Mormon's emphasis "on the importance of human freedom and responsibility" to choose salvation. Dialogic revelation In the Book of Mormon, revelation from God typically manifests as "personalized, dialogic exchange" between God and persons, "rooted in a radically anthropomorphic theology" that personifies deity as a being who hears prayers and provides direct answers to questions. 
Multiple narratives in the book portray revelation as a dialogue in which petitioners and deity engage one another in a mutual exchange in which God's contributions originate from outside the mortal recipient. The Book of Mormon also emphasizes regular prayer as a significant component of devotional life, depicting it as a central means through which such dialogic revelation can take place. Distinctively, the Book of Mormon's portrayal democratizes revelation by extending it beyond the "Old Testament paradigms" of prophetic authority. In the Book of Mormon, dialogic revelation from God is not the purview of prophets alone but is instead the right of every person. Figures such as Nephi and Ammon receive visions and revelatory direction prior to or without ever becoming prophets, and Laman and Lemuel are rebuked for hesitating to pray for revelation. In the Book of Mormon, God and the divine are directly knowable through revelation and spiritual experience. Also in contrast with traditional Christian conceptions of revelations is the Book of Mormon's broader range of revelatory content. In the Book of Mormon, revelatory topics include not only the expected "exegesis of existence" but also questions that are "pragmatic, and at times almost banal in their mundane specificity". Figures petition God for revelatory answers to doctrinal questions and ecclesiastical crises as well as for inspiration to guide hunts, military campaigns, and sociopolitical decisions, and the Book of Mormon portrays God providing answers to these inquiries. The Book of Mormon depicts revelation as an active and sometimes laborious experience. For example, the Book of Mormon's Brother of Jared learns to act not merely as a petitioner with questions but moreover as an interlocutor with "a specific proposal" for God to consider as part of a guided process of miraculous assistance. 
Also in the Book of Mormon, Enos describes his revelatory experience as a "wrestle which I had before God" that spanned hours of intense prayer. Religious significance Joseph Smith Like many other early adherents of the Latter Day Saint movement, Smith referenced Book of Mormon scriptures in his preaching relatively infrequently and cited the Bible more often, likely because he was more familiar with the Bible, which he had grown up with. In 1832, Smith dictated a revelation that condemned the "whole church" for treating the Book of Mormon lightly, although even after doing so Smith still referenced the Book of Mormon less often than the Bible. Nevertheless, in 1841 Joseph Smith characterized the Book of Mormon as "the most correct of any book on earth, and the keystone of [the] religion". Although Smith quoted the book infrequently, he was "absorbed into the world of the Book of Mormon" through its narrative content and conceived of his prophetic identity within the framework of the Book of Mormon's portrayal of a world history full of sacred records of God's dealings with humanity and description of him as a revelatory translator. While they were held in Carthage Jail together, shortly before being killed in a mob attack, Joseph's brother Hyrum Smith read aloud from the Book of Mormon, and Joseph told the jail guards present that the Book of Mormon was divinely authentic. The Church of Jesus Christ of Latter-day Saints The Book of Mormon is one of the four sacred texts accepted by Latter-day Saints, who call this scriptural canon the standard works. Church leaders and publications have "strongly affirm[ed]" Smith's claims of the book's significance to the faith. 
According to the church's "Articles of Faith"—a document written by Joseph Smith in 1842 and canonized by the church as scripture in 1880—members "believe the Bible to be the word of God as far as it is translated correctly," and they "believe the Book of Mormon to be the word of God," without the translation qualification. Up through the mid-twentieth century, the Book of Mormon's significance to Latter-day Saints came more from its "status as a sign" than its specific content. Church leaders and missionaries emphasized it as part of a causal chain which held that if the Book of Mormon was "verifiably true revelation of God," then it justified Smith's claims to prophetic authority to restore the New Testament church. In addition to signifying Smith's prophetic calling, the Book of Mormon also signaled the "restoration of all things", ending what was believed to have been an apostasy from true Christianity. Early Latter-day Saints additionally tended to interpret the Book of Mormon through a millenarian lens and consequently believed the book portended Christ's imminent Second Coming. Latter-day Saints have also long believed the Book of Mormon's contents confirm and fulfill biblical prophecies. For example, "many Latter-day Saints" consider the biblical patriarch Jacob's description of his son Joseph as "a fruitful bough... whose branches run over a wall" a prophecy of Lehi's posterity—described as descendants of Joseph—overflowing into the New World. Latter-day Saints also believe the Bible prophesies of the Book of Mormon as an additional testament to God's dealings with humanity, such as in their interpretation of Ezekiel 37's injunction to "take thee one stick... For Judah, and... take another stick... For Joseph" as referring to the Bible as the "stick of Judah" and the Book of Mormon as "the stick of Joseph". 
In the 1980s, the church placed greater emphasis on the Book of Mormon as a central text of the faith and on studying and reading it as a means for devotional communion with Jesus Christ. In 1982, it added the subtitle "Another Testament of Jesus Christ" to its official editions of the Book of Mormon. Ezra Taft Benson, the church's thirteenth president (1985–1994), especially emphasized the Book of Mormon. Referencing Smith's 1832 revelation, Benson said the church remained under condemnation for treating the Book of Mormon lightly. Since the late 1980s, Latter-day Saint leaders have encouraged church members to read from the Book of Mormon daily. In an August 2005 message, church president Gordon B. Hinckley challenged each member of the church to re-read the Book of Mormon before the year's end, and by 2016, "Increasing numbers of Latter-day Saints use[d] the [Book of Mormon] for private and family devotions." The Book of Mormon is "the principal scriptural focus" of the church and "absolutely central" to Latter-day Saint worship, including in weekly services, Sunday School, youth seminaries, and more. The church encourages those considering joining the faith to follow the suggestion in the Book of Mormon's final chapter to study the book, ponder it, and pray to God about it. Latter-day Saints believe that sincerely doing so will provide the reader with a spiritual witness confirming it as true scripture. The relevant passage in the chapter is sometimes referred to as "Moroni's Promise." Approximately 90 to 95% of all Book of Mormon printings have been affiliated with the church. As of October 2020, it has published more than 192 million copies of the Book of Mormon. Community of Christ The Community of Christ (formerly the Reorganized Church of Jesus Christ of Latter Day Saints or RLDS Church) views the Book of Mormon as scripture which provides an additional witness of Jesus Christ in support of the Bible. 
The Community of Christ publishes two versions of the book. The first is the Authorized Edition, first published by the then-RLDS Church in 1908, whose text is based on comparing the original printer's manuscript and the 1837 Second Edition (or "Kirtland Edition") of the Book of Mormon. Its content is similar to the Latter-day Saint edition of the Book of Mormon, but the versification is different. The Community of Christ also publishes a "New Authorized Version" (also called a "reader's edition"), first released in 1966, which attempts to modernize the language of the text by removing archaisms and standardizing punctuation. Use of the Book of Mormon varies among members of the Community of Christ. The church describes it as scripture and includes references to the Book of Mormon in its official lectionary. In 2010, representatives told the National Council of Churches that "the Book of Mormon is in our DNA". At the same time, its use in North American congregations declined between the mid-twentieth and twenty-first centuries. Also during this time, the Community of Christ moved away from emphasizing the Book of Mormon as a historically authentic text. Community of Christ president W. Grant McMurray "opened the door to considering the book more myth than history" in the late-twentieth century, and in 2001 he reflected, "The proper use of the Book of Mormon as sacred scripture has been under wide discussion in the 1970s and beyond, in part because of long-standing questions about its historical authenticity and in part because of perceived theological inadequacies, including matters of race and ethnicity." At the 2007 Community of Christ World Conference, church president Stephen M. Veazey ruled out of order a resolution to "reaffirm the Book of Mormon as a divinely inspired record." 
He stated that "while the Church affirms the Book of Mormon as scripture, and makes it available for study and use in various languages, we do not attempt to mandate the degree of belief or use. This position is in keeping with our longstanding tradition that belief in the Book of Mormon is not to be used as a test of fellowship or membership in the church." In keeping with this approach, there are "tens of thousands" of members in some congregations outside North America, such as Haiti and Africa, who "have never used the Book of Mormon". Some Community of Christ members with "more traditional-thinking" on the Book of Mormon have in turn "either left the church or doubled their efforts to bring the Book of Mormon back to the center of the theological and scriptural life of the church." Greater Latter Day Saint movement Since the death of Joseph Smith in 1844, there have been approximately seventy different churches that have been part of the Latter Day Saint movement, fifty of which were extant as of 2012. Religious studies scholar Paul Gutjahr explains that "each of these sects developed its own special relationship with the Book of Mormon". For example, James Strang, who led a denomination in the nineteenth century, reenacted Smith's production of the Book of Mormon by claiming in the 1840s and 1850s to receive and translate new scriptures engraved on metal plates, which became the Voree Plates and the Book of the Law of the Lord. William Bickerton led another denomination, The Church of Jesus Christ of Latter Day Saints (today called The Church of Jesus Christ), which accepted the Book of Mormon as scripture alongside the Bible, although it did not canonize other Latter Day Saint religious texts like the Doctrine and Covenants and Pearl of Great Price. The contemporary Church of Jesus Christ continues to consider the "Bible and Book of Mormon together" to be "the foundation of [their] faith and the building blocks of" their church. 
Separate editions of the Book of Mormon have been published by a number of churches in the Latter Day Saint movement, along with private individuals and organizations not endorsed by any specific denomination. Views on historical authenticity Mainstream archaeological, historical and scientific communities do not consider the Book of Mormon an ancient record of actual historical events. Principally, the content of the Book of Mormon does not correlate with archaeological, paleontological, and historical evidence about the past of the Americas. For example, there is no correlation between locations described in the Book of Mormon and known American archaeological sites. There is also no evidence in Mesoamerican societies of cultural influence from anything described in the Book of Mormon. Additionally, the Book of Mormon's narrative refers to the presence of animals, plants, metals, and technologies that archaeological and scientific studies have found little or no evidence of in post-Pleistocene, pre-Columbian America. Such anachronistic references include crops such as barley, wheat, and silk; livestock like sheep and horses; and metals and technology such as brass, steel, the wheel, and chariots. Furthermore, until the late-twentieth century, most adherents of the Latter Day Saint movement who affirmed Book of Mormon historicity believed the people described in the Book of Mormon text were the exclusive ancestors of all indigenous peoples in the Americas. However, linguistics and genetics proved that impossible. There are no widely accepted linguistic connections between any Native American languages and Near Eastern languages, and "the diversity of Native American languages could not have developed from a single origin in the time frame" that would be necessary to validate such a view of Book of Mormon historicity. 
Finally, there is no DNA evidence linking any Native American group to ancestry from the ancient Near East as a belief in Book of Mormon peoples as the exclusive ancestors of indigenous Americans would require. Instead, geneticists find that indigenous Americans' ancestry traces back to Asia. Despite this, most adherents of the Latter Day Saint movement consider the Book of Mormon to generally be historically authentic. Within the Latter Day Saint movement there are several apologetic groups and scholars that seek to answer challenges to Book of Mormon historicity in various ways. Most Book of Mormon apologetics is done by Latter-day Saints, and the most active and well-known apologetic groups have been the Foundation for Ancient Research and Mormon Studies (FARMS; now defunct) and FAIR (Faithful Answers, Informed Response; formerly FairMormon), both founded and operated by lay Latter-day Saints. Some apologetics aim to reconcile, refute, or dismiss criticisms of Book of Mormon historicity. For example, in response to linguistics and genetics rendering long-popular hemispheric models of Book of Mormon geography impossible, many apologists posit Book of Mormon peoples could have dwelled in a limited geographical region, usually either Mesoamerica or eastern North America, while indigenous peoples of other descents occupied the rest of the Americas. To account for anachronisms, apologists often suggest Smith's translation assigned familiar terms to unfamiliar ideas. Other apologetics strive to "affirmatively advocat[e]" historicity by identifying parallels between the Book of Mormon and antiquity, such as the presence of several complex chiasmi, a literary form used in ancient Hebrew poetry and in the Old Testament. 
Despite the popularity and influence of literature promoting Book of Mormon historicity among Latter-day Saints, not all Mormons who affirm Book of Mormon historicity are persuaded by apologetic work, and some claim historicity more modestly, such as Richard Bushman's statement that "I read the Book of Mormon as informed Christians read the Bible. As I read, I know the arguments against the book's historicity, but I can't help feeling that the words are true and the events happened. I believe it in the face of many questions." Although there is a "lack of specific response to" elements of the Book of Mormon that some Latter Day Saints consider evidence of
century, estimates are that there were between 100,000 and 300,000 Baptists in Ukraine. An independent All-Ukrainian Baptist Union of Ukraine was established during the brief period of Ukraine's independence in the early 20th century, and once again after the fall of the Soviet Union; the largest such union is currently known as the Evangelical Baptist Union of Ukraine. Missionary organizations Missionary organizations favored the development of the movement on all continents. In England, the Baptist Missionary Society was founded at Kettering in 1792. In the United States, International Ministries was founded in 1814 and the International Mission Board in 1845. Baptist affiliations Many churches are members of a national and international denomination for a cooperative missionary, humanitarian and theological relationship. There also are a substantial number of cooperative groups. In 1905, the Baptist World Alliance (BWA) was formed by 24 Baptist denominations from various countries. The BWA's goals include caring for the needy, leading in world evangelism and defending human rights and religious freedom. Finally, there are Independent Baptist churches that choose to remain independent of any denomination, organization, or association. Membership Statistics According to a census released in 2020 by the Baptist World Alliance, the largest Baptist association in the world, the alliance comprises 245 member Baptist denominations in 128 countries, with 173,000 churches and 49,000,000 baptized members. In 2010, 100 million Christians identified themselves as Baptist or belonged to Baptist-type churches. In 2020, according to the researcher Sébastien Fath of the CNRS, the movement had around 170 million believers in the world. 
Among the censuses carried out by the Baptist denominations in 2021, those claiming the most members on each continent were: In Africa, the Nigerian Baptist Convention with 13,654 churches and 8,000,637 members, the Baptist Convention of Tanzania with 1,300 churches and 2,660,000 members, the Baptist Community of the Congo River with 2,668 churches and 1,760,634 members. In North America, the Southern Baptist Convention with 47,530 churches and 14,525,579 members, the National Baptist Convention, USA with 21,145 churches and 8,415,100 members. In South America, the Brazilian Baptist Convention with 9,018 churches and 1,790,227 members, the Evangelical Baptist Convention of Argentina with 670 churches and 85,000 members. In Asia, the Myanmar Baptist Convention with 5,319 churches and 1,710,441 members, the Nagaland Baptist Church Council with 1,615 churches and 610,825 members, the Boro Baptist Church Association with 219 churches and 40,000 members, the Boro Baptist Convention with 353 churches and over 52,000 members, the Garo Baptist Convention with 2,619 churches and 333,908 members, the Convention of Philippine Baptist Churches with 2,668 churches and 600,000 members. In Europe, the All-Ukrainian Union of Churches of Evangelical Christian Baptists with 2,272 churches and 113,000 members, the Baptist Union of Great Britain with 1,895 churches and 111,208 members, the Union of Evangelical Free Churches in Germany with 801 churches and 80,195 members. In Oceania, the Baptist Union of Papua New Guinea with 489 churches and 84,000 members, the Australian Baptist Ministries with 1,021 churches and 76,046 members. Qualification for membership Membership policies vary due to the autonomy of churches, but generally an individual becomes a member of a church through believer's baptism (which is a public profession of faith in Jesus, followed by immersion baptism). 
Most Baptists do not believe that baptism is a requirement for salvation, but rather a public expression of one's inner repentance and faith. Therefore, some churches will admit into membership persons who make a profession without believer's baptism. In general, Baptist churches do not have a stated age restriction on membership, but believer's baptism requires that an individual be able to freely and earnestly profess their faith. (See Age of Accountability) Baptist beliefs Since the early days of the Baptist movement, various denominations have adopted common confessions of faith as the basis for cooperative work among churches. Each church has a particular confession of faith and a common confession of faith if it is a member of a denomination. Some historically significant Baptist doctrinal documents include the 1689 London Baptist Confession of Faith, the 1742 Philadelphia Baptist Confession, the 1833 New Hampshire Baptist Confession of Faith, and written church covenants which some individual Baptist churches adopt as a statement of their faith and beliefs. Baptist theology is an evangelical theology. It is based on believers' Church doctrine. Baptists, like other Christians, are defined by schools of thought—some of it common to all orthodox and evangelical groups and a portion of it distinctive to Baptists. Through the years, different Baptist groups have issued confessions of faith—without considering them to be creeds—to express their particular doctrinal distinctions in comparison to other Christians as well as in comparison to other Baptists. Baptist denominations are traditionally seen as belonging to two parties: General Baptists, who uphold Arminian theology, and Particular Baptists, who uphold Reformed theology. 
During the holiness movement, some General Baptists accepted the teaching of a second work of grace and formed denominations that emphasized this belief, such as the Ohio Valley Association of the Christian Baptist Churches of God and the Holiness Baptist Association. Most Baptists are evangelical in doctrine, but Baptist beliefs can vary due to the congregational governance system that gives autonomy to individual local Baptist churches. Historically, Baptists have played a key role in encouraging religious freedom and separation of church and state. Shared doctrines include beliefs about one God; the virgin birth; miracles; atonement for sins through the death, burial, and bodily resurrection of Jesus; the Trinity; the need for salvation (through belief in Jesus Christ as the Son of God, his death and resurrection); grace; the Kingdom of God; last things (eschatology) (Jesus Christ will return personally and visibly in glory to the earth, the dead will be raised, and Christ will judge everyone in righteousness); and evangelism and missions. Most Baptists hold that no church or ecclesiastical organization has inherent authority over a Baptist church. Churches can properly relate to each other under this polity only through voluntary cooperation, never by any sort of coercion. Furthermore, this Baptist polity calls for freedom from governmental control. Exceptions to this form of local governance include a few churches that submit to the leadership of a body of elders, as well as the Episcopal Baptists that have an Episcopal system. Baptists generally believe in the literal Second Coming of Christ. Beliefs among Baptists regarding the "end times" include amillennialism, dispensationalism, and historic premillennialism, with views such as postmillennialism and preterism receiving some support. Some additional distinctive principles held by many Baptists: The supremacy of the canonical Scriptures as a norm of faith and practice. 
For something to become a matter of faith and practice, it is not sufficient for it to be merely consistent with and not contrary to scriptural principles. It must be something explicitly ordained through command or example in the Bible. For instance, this is why Baptists do not practice infant baptism—they say the Bible neither commands nor exemplifies infant baptism as a Christian practice. More than any other Baptist principle, this one, when applied to infant baptism, is said to separate Baptists from other evangelical Christians. Baptists believe that faith is a matter between God and the individual (religious freedom). To them it means the advocacy of absolute liberty of conscience. Insistence on believer's baptism by immersion as the only mode of baptism. Baptists do not believe that baptism is necessary for salvation. Therefore, for Baptists, baptism is an ordinance, not a sacrament, since, in their view, it imparts no saving grace. Beliefs that vary among Baptists Since there is no hierarchical authority and each Baptist church is autonomous, there is no official set of Baptist theological beliefs. These differences exist both among associations and even among churches within the associations. Some doctrinal issues on which there is widespread difference among Baptists are: Eschatology Arminianism versus Calvinism (General Baptists uphold Arminian theology while Particular Baptists teach Calvinist theology). The doctrine of separation from "the world" and whether to associate with those who are "of the world" Belief in a second work of grace, i.e. 
entire sanctification (held by General Baptists in the Holiness tradition) Speaking-in-tongues and the operation of other charismatic gifts of the Holy Spirit in the charismatic churches How the Bible should be interpreted (hermeneutics) The extent to which missionary boards should be used to support missionaries The extent to which non-members may participate in the Lord's Supper services Which translation of Scripture to use (see King-James-Only movement in the English-speaking world) Dispensationalism versus Covenant theology The role of women in marriage. The ordination of women as deacons or pastors. Attitudes to and involvement in the ecumenical movement. Excommunication is used as a last resort by denominations and churches for members who do not want to repent of beliefs or behavior at odds with the confession of faith of the community. Worship In Baptist churches, the worship service is part of the life of the Church and includes praise (Christian music), prayers to God, a sermon based on the Bible, an offering, and periodically the Lord's Supper. In many churches, there are services adapted for children and teenagers. Prayer meetings are also held during the week. Places of worship The architecture is sober, and the Latin cross is one of the only spiritual symbols that can usually be seen on the building of a Baptist church and that identifies the place where it belongs. Education Baptist churches established elementary and secondary schools, Bible colleges, colleges and universities as early as the 1680s in England, before continuing in various countries. Sexuality In matters of sexuality, several Baptist churches promote the virginity pledge to young Baptist Christians, who are invited to commit, in a public ceremony, to sexual abstinence until Christian marriage. This pact is often symbolized by a purity ring. Programs like True Love Waits, founded in 1993 by the Southern Baptist Convention, have been developed to support these commitments. 
In some Baptist churches, young adults and unmarried couples are encouraged to marry early in order to live out their sexuality according to the will of God. Some books specialize in the subject, such as The Act of Marriage: The Beauty of Sexual Love, published in 1976 by Baptist pastor Tim LaHaye and his wife Beverly LaHaye, who was a pioneer in the teaching of Christian sexuality as a gift from God and part of a flourishing Christian marriage. Controversies that have shaped Baptists Baptists have faced many controversies in their 400-year history, controversies that rose to the level of crises. Baptist historian Walter Shurden says the word crisis comes from the Greek word meaning 'to decide.' Shurden writes that contrary to the presumed negative view of crises, some controversies that reach a crisis level may actually be "positive and highly productive." He claims that even schism, though never ideal, has often produced positive results. In his opinion, each crisis among Baptists became a decision-moment that shaped their future. Some controversies that have shaped Baptists include the "missions crisis", the "slavery crisis", the "landmark crisis", and the "modernist crisis". Missions crisis Early in the 19th century, the rise of the modern missions movement, and the backlash against it, led to widespread and bitter controversy among the American Baptists. During this era, the American Baptists were split into missionary and anti-missionary factions. A substantial secession of Baptists went into the movement led by Alexander Campbell to return to a more fundamental church. Slavery crisis United States Leading up to the American Civil War, Baptists became embroiled in the controversy over slavery in the United States. Whereas in the First Great Awakening Methodist and Baptist preachers had opposed slavery and urged manumission, over the decades they made more of an accommodation with the institution. 
They worked with slaveholders in the South to urge a paternalistic institution. Both denominations made direct appeals to slaves and free Blacks for conversion. The Baptists particularly allowed them active roles in congregations. By the mid-19th century, northern Baptists tended to oppose slavery. As tensions increased, in 1844 the Home Mission Society refused to appoint as a missionary a slaveholder who had been proposed by Georgia. It noted that missionaries could not take servants with them, and also that the board did not want to appear to condone slavery. In 1845, a group of churches in favor of slavery and in disagreement with the abolitionism of the Triennial Convention (now American Baptist Churches USA) left to form the Southern Baptist Convention. They believed that the Bible sanctioned slavery and that it was acceptable for Christians to own slaves. They believed slavery was a human institution which Baptist teaching could make less harsh. By this time many planters were part of Baptist congregations, and some of the denomination's prominent preachers, such as the Rev. Basil Manly, Sr., president of the University of Alabama, were also planters who owned slaves. As early as the late 18th century, Black Baptists began to organize separate churches, associations and mission agencies. Blacks set up some independent Baptist congregations in the South before the American Civil War. White Baptist associations maintained some oversight of these churches. In the postwar years, freedmen quickly left the white congregations and associations, setting up their own churches. In 1866 the Consolidated American Baptist Convention, formed from Black Baptists of the South and West, helped southern associations set up Black state conventions, which they did in Alabama, Arkansas, Virginia, North Carolina, and Kentucky. In 1880 Black state conventions united in the national Foreign Mission Convention to support Black Baptist missionary work. 
Two other national Black conventions were formed, and in 1895 they united as the National Baptist Convention. This organization later went through its own changes, spinning off other conventions. It is the largest Black religious organization and the second-largest Baptist organization in the world. Baptists are numerically most dominant in the Southeast. In 2007, the Pew Research Center's Religious Landscape Survey found that 45% of all African Americans identify with Baptist denominations, with the vast majority of those being within the historically Black tradition. In the American South, the interpretation of the American Civil War, abolition of slavery and postwar period has differed sharply by race since those years. Americans have often interpreted great events in religious terms. Historian Wilson Fallin contrasts the interpretation of the Civil War and Reconstruction in white versus Black memory by analyzing Baptist sermons documented in Alabama. Soon after the Civil War, most Black Baptists in the South left the Southern Baptist Convention, reducing its numbers by hundreds of thousands or more. They quickly organized their own congregations and developed their own regional and state associations and, by the end of the 19th century, a national convention. White preachers in Alabama after Reconstruction expressed one view of these events, while Black preachers interpreted the Civil War, Emancipation and Reconstruction as "God's gift of freedom." They had a gospel of liberation, having long identified with the Book of Exodus from slavery in the Old Testament. They took opportunities to exercise their independence, to worship in their own way, to affirm their worth and dignity, and to proclaim the fatherhood of God and the brotherhood of man. Most of all, they quickly formed their own churches, associations, and conventions to operate freely without white supervision. 
These institutions offered self-help and racial uplift, a place to develop and use leadership, and places for proclamation of the gospel of liberation. As a result, Black preachers said that God would protect and help God's people; God would be their rock in a stormy land. The Southern Baptist Convention supported white supremacy and its results: disenfranchising most Blacks and many poor whites at the turn of the 20th century by raising barriers to voter registration, and passage of racial segregation laws that enforced the system of Jim Crow. Its members largely resisted the civil rights movement in the South, which sought to enforce their constitutional rights for public access and voting, and enforcement of midcentury federal civil rights laws. In 1995, the Southern Baptist Convention passed a resolution that recognized the failure of their ancestors to protect the civil rights of African Americans. More than 20,000 Southern Baptists registered for the meeting in Atlanta. The resolution declared that messengers, as SBC delegates are called, "unwaveringly denounce racism, in all its forms, as deplorable sin" and "lament and repudiate historic acts of evil such as slavery from which we continue to reap a bitter harvest." It offered an apology to all African Americans for "condoning and/or perpetuating individual and systemic racism in our lifetime" and repentance for "racism of which we have been guilty, whether consciously or unconsciously." Although Southern Baptists had condemned racism in the past, this was the first time the convention, predominantly white since the Reconstruction era, had specifically addressed the issue of slavery. The statement sought forgiveness "from our African-American brothers and sisters" and pledged to "eradicate racism in all its forms from Southern Baptist life and ministry." In 1995 about 500,000 members of the 15.6-million-member denomination were African Americans and another 300,000 were ethnic minorities. 
The resolution marked the denomination's first formal acknowledgment that racism played a role in its founding. Caribbean islands Elsewhere in the Americas, in the Caribbean in particular, Baptist missionaries and members took an active role in the anti-slavery movement. In Jamaica, for example,

Protestant Reformation. There also were Christians who were disappointed that the Church of England had not made corrections of what some considered to be errors and abuses. Of those most critical of the Church's direction, some chose to stay and try to make constructive changes from within the Anglican Church. They became known as "Puritans" and are described by Gourley as cousins of the English Separatists. Others decided they must leave the Church because of their dissatisfaction and became known as the Separatists. In 1579, Faustus Socinus founded the Unitarians in Poland, which was a tolerant country. The Unitarians taught baptism by immersion. When Poland ceased to be tolerant, they fled to Holland. In Holland, the Unitarians introduced immersion baptism to the Dutch Mennonites. Baptist churches have their origins in a movement started by the English John Smyth and Thomas Helwys in Amsterdam. Due to their shared beliefs with the Puritans and Congregationalists, they went into exile to Holland in 1607 with other believers who held the same biblical positions. They believed that the Bible was to be the only guide and that believer's baptism was what the scriptures required. In 1609, the year considered to be the foundation of the movement, they baptized believers and founded the first Baptist church. In 1609, while still there, Smyth wrote a tract titled "The Character of the Beast," or "The False Constitution of the Church." In it he expressed two propositions: first, infants are not to be baptized; and second, "Antichristians converted are to be admitted into the true Church by baptism." 
Hence, his conviction was that a scriptural church should consist only of regenerate believers who have been baptized on a personal confession of faith. He rejected the Separatist movement's doctrine of infant baptism (paedobaptism). Shortly thereafter, Smyth left the group. Ultimately, Smyth became committed to believers' baptism as the only biblical baptism. He was convinced on the basis of his interpretation of Scripture that infants would not be damned should they die in infancy. Smyth, convinced that his self-baptism was invalid, applied to the Mennonites for membership. He died while waiting for membership, and some of his followers became Mennonites. Thomas Helwys and others kept their baptism and their Baptist commitments. The modern Baptist denomination is an outgrowth of Smyth's movement. Baptists rejected the name Anabaptist when they were called that by opponents in derision. McBeth writes that as late as the 18th century, many Baptists referred to themselves as "the Christians commonly—though falsely—called Anabaptists." Thomas Helwys took over the leadership, led the church back to England in 1611, and published the first Baptist confession of faith, "A Declaration of Faith of English People". He founded the first General Baptist Church in Spitalfields, east London, England in 1612. Another milestone in the early development of Baptist doctrine was in 1638 with John Spilsbury, a Calvinistic minister who helped to promote the strict practice of believer's baptism by immersion (as opposed to affusion or aspersion). 
According to Tom Nettles, professor of historical theology at Southern Baptist Theological Seminary, "Spilsbury's cogent arguments for a gathered, disciplined congregation of believers baptized by immersion as constituting the New Testament church gave expression to and built on insights that had emerged within separatism, advanced in the life of John Smyth and the suffering congregation of Thomas Helwys, and matured in Particular Baptists." Anabaptist influence view A minority view is that early-17th-century Baptists were influenced by (but not directly connected to) continental Anabaptists. According to this view, the General Baptists shared similarities with Dutch Waterlander Mennonites (one of many Anabaptist groups) including believer's baptism only, religious liberty, separation of church and state, and Arminian views of salvation, predestination and original sin. Representative writers including A.C. Underwood and William R. Estep. Gourley wrote that among some contemporary Baptist scholars who emphasize the faith of the community over soul liberty, the Anabaptist influence theory is making a comeback. However, the relations between Baptists and Anabaptists were early strained. In 1624, the then five existing Baptist churches of London issued a condemnation of the Anabaptists. Furthermore, the original group associated with Smyth and popularly believed to be the first Baptists broke with the Waterlander Mennonite Anabaptists after a brief period of association in the Netherlands. Perpetuity and succession view Traditional Baptist historians write from the perspective that Baptists had existed since the time of Christ. Proponents of the Baptist successionist or perpetuity view consider the Baptist movement to have existed independently from Roman Catholicism and prior to the Protestant Reformation. The perpetuity view is often identified with The Trail of Blood, a booklet of five lectures by J.M. Carrol published in 1931. 
Other Baptist writers who advocate the successionist theory of Baptist origins are John T. Christian, Thomas Crosby, G. H. Orchard, J. M. Cramp, William Cathcart, Adam Taylor and D. B. Ray This view was also held by English Baptist preacher, Charles Spurgeon as well as Jesse Mercer, the namesake of Mercer University. In 1898 William Whitsitt was pressured to resign his presidency of the Southern Baptist Theological Seminary for denying Baptist successionism. Baptist origins in the United Kingdom In 1612, Thomas Helwys established a Baptist congregation in London, consisting of congregants from Smyth's church. A number of other Baptist churches sprang up, and they became known as the General Baptists. The Particular Baptists were established when a group of Calvinist Separatists adopted believers' Baptism. The Particular Baptists consisted of seven churches by 1644 and had created a confession of faith called the First London Confession of Faith. Baptist origins in North America Both Roger Williams and John Clarke, his compatriot and coworker for religious freedom, are variously credited as founding the earliest Baptist church in North America. In 1639, Williams established a Baptist church in Providence, Rhode Island, and Clarke began a Baptist church in Newport, Rhode Island. According to a Baptist historian who has researched the matter extensively, "There is much debate over the centuries as to whether the Providence or Newport church deserved the place of 'first' Baptist congregation in America. Exact records for both congregations are lacking." The Great Awakening energized the Baptist movement, and the Baptist community experienced spectacular growth. Baptists became the largest Christian community in many southern states, including among the enslaved Black population. Baptist missionary work in Canada began in the British colony of Nova Scotia (present day Nova Scotia and New Brunswick) in the 1760s. 
The first official record of a Baptist church in Canada was that of the Horton Baptist Church (now Wolfville) in Wolfville, Nova Scotia on 29 October 1778. The church was established with the assistance of the New Light evangelist Henry Alline. Many of Alline's followers, after his death, would convert and strengthen the Baptist presence in the Atlantic region. Two major groups of Baptists formed the basis of the churches in the Maritimes. These were referred to as Regular Baptist (Calvinistic in their doctrine) and Free Will Baptists (Arminian in their doctrine). In May 1845, the Baptist congregations in the United States split over slavery and missions. The Home Mission Society prevented slaveholders from being appointed as missionaries. The split created the Southern Baptist Convention, while the northern congregations formed their own umbrella organization now called the American Baptist Churches USA (ABC-USA). The Methodist Episcopal Church, South had recently separated over the issue of slavery, and southern Presbyterians would do so shortly thereafter. In 2015, Baptists in the U.S. number 50 million people and constitute roughly one-third of American Protestants. Baptist origins in Ukraine The Baptist churches in Ukraine were preceded by the German Anabaptist and Mennonite communities, who had been living in the south of Ukraine since the 16th century, and who practiced adult believers baptism. The first Baptist baptism (adult baptism by full immersion) in Ukraine took place in 1864 on the river Inhul in the Yelizavetgrad region (now Kropyvnytskyi region), in a German settlement. In 1867, the first Baptist communities were organized in that area. From there, the Baptist movement spread across the south of Ukraine and then to other regions as well. One of the first Baptist communities was registered in Kyiv in 1907, and in 1908 the First All-Russian Convention of Baptists was held there, as Ukraine was still controlled by the Russian Empire. 
The All-Russian Union of Baptists was established in the town of Yekaterinoslav (now Dnipro) in Southern Ukraine. At the end of the 19th century, estimates are that there were between 100,000 and 300,000 Baptists in Ukraine. An independent All-Ukrainian Baptist Union of Ukraine was established during the brief period of Ukraine's independence in early 20th-century, and once again after the fall of the Soviet Union, the largest of which is currently known as the Evangelical Baptist Union of Ukraine. Missionary organizations Missionary organizations favored the development of the movement on all continents. In England there was the founding of the Baptist Missionary Society in 1792 at Kettering, England. In United States, there was the founding of International Ministries in 1814 and International Mission Board in 1845. Baptist affiliations Many churches are members of a national and international denomination for a cooperative missionary, humanitarian and theological relationship. There also are a substantial number of cooperative groups. In 1905, the Baptist World Alliance (BWA) was formed by 24 Baptist denominations from various countries. The BWA's goals include caring for the needy, leading in world evangelism and defending human rights and religious freedom. Finally, there are Independent Baptist churches that choose to remain independent of any denomination, organization, or association. Membership Statistics According to a Baptist World Alliance census released in 2020, the largest Baptist denomination in the world, it would regroup 245 Baptist denominations members in 128 countries, 173,000 churches and 49,000,000 baptized members. In 2010, 100 million Christians identify themselves as Baptist or belong to Baptist-type churches. In 2020, according to the researcher Sébastien Fath of the CNRS, the movement would have around 170 million believers in the world. 
Among the censuses carried out by the Baptist denominations in 2021, those which claimed the most members were on each continent: In Africa, the Nigerian Baptist Convention with 13,654 churches and 8,000,637 members, the Baptist Convention of Tanzania with 1,300 churches and 2,660,000 members, the Baptist Community of the Congo River with 2,668 churches and 1,760,634 members. In North America, the Southern Baptist Convention with 47,530 churches and 14,525,579 members, the National Baptist Convention, USA with 21,145 churches and 8,415,100 members. In South America, the Brazilian Baptist Convention with 9,018 churches and 1,790,227 members, the Evangelical Baptist Convention of Argentina with 670 churches and 85,000 members. In Asia, the Myanmar Baptist Convention with 5,319 churches and 1,710,441 members, the Nagaland Baptist Church Council with 1,615 churches and 610,825 members, the Boro Baptist Church Association with 219 churches and 40,000 members, the Boro Baptist Convention with 353 churches and over 52,000 members, the Garo Baptist Convention with 2,619 and 333,908 members, the Convention of Philippine Baptist Churches with 2,668 churches and 600,000 members. In Europe, the All-Ukrainian Union of Churches of Evangelical Christian Baptists with 2,272 churches and 113,000 members, the Baptist Union of Great Britain with 1,895 churches and 111, 208 members, the Union of Evangelical Free Churches in Germany with 801 churches and 80,195 members. In Oceania, the Baptist Union of Papua New Guinea with 489 churches and 84,000 members, the Australian Baptist Ministries with 1,021 churches and 76,046 members. Qualification for membership Membership policies vary due to the autonomy of churches, but generally an individual becomes a member of a church through believer's baptism (which is a public profession of faith in Jesus, followed by immersion baptism). 
Most baptists do not believe that baptism is a requirement for salvation, but rather a public expression of one's inner repentance and faith. Therefore, some churches will admit into membership persons who make a profession without believer's baptism. In general, Baptist churches do not have a stated age restriction on membership, but believer's baptism requires that an individual be able to freely and earnestly profess their faith. (See Age of Accountability) Baptist beliefs Since the early days of the Baptist movement, various denominations have adopted common confessions of faith as the basis for cooperative work among churches. Each church has a particular confession of faith and a common confession of faith if it is a member of a denomination. Some historically significant Baptist doctrinal documents include the 1689 London Baptist Confession of Faith, 1742 Philadelphia Baptist Confession, the 1833 New Hampshire Baptist Confession of Faith, and written church covenants which some individual Baptist churches adopt as a statement of their faith and beliefs. Baptist theology is an evangelical theology.It is based on believers' Church doctrine. Baptists, like other Christians, are defined by school of thought—some of it common to all orthodox and evangelical groups and a portion of it distinctive to Baptists. Through the years, different Baptist groups have issued confessions of faith—without considering them to be creeds—to express their particular doctrinal distinctions in comparison to other Christians as well as in comparison to other Baptists. Baptist denominations are traditionally seen as belonging to two parties, General Baptists who uphold Arminian theology and Particular Baptists who uphold Reformed theology. 
During the holiness movement, some General Baptists accepted the teaching of a second work of grace and formed denominations that emphasized this belief, such as the Ohio Valley Association of the Christian Baptist Churches of God and the Holiness Baptist Association. Most Baptists are evangelical in doctrine, but Baptist beliefs can vary due to the congregational governance system that gives autonomy to individual local Baptist churches. Historically, Baptists have played a key role in encouraging religious freedom and separation of church and state. Shared doctrines would include beliefs about one God; the virgin birth; miracles; atonement for sins through the death, burial, and bodily resurrection of Jesus; the Trinity; the need for salvation (through belief in Jesus Christ as the Son of God, his death and resurrection); grace; the Kingdom of God; last things (eschatology) (Jesus Christ will return personally and visibly in glory to the earth, the dead will be raised, and Christ will judge everyone in righteousness); and evangelism and missions. Most Baptists hold that no church or ecclesiastical organization has inherent authority over a Baptist church. Churches can properly relate to each other under this polity only through voluntary cooperation, never by any sort of coercion. Furthermore, this Baptist polity calls for freedom from governmental control. Exceptions to this local form of local governance include a few churches that submit to the leadership of a body of elders, as well as the Episcopal Baptists that have an Episcopal system. Baptists generally believe in the literal Second Coming of Christ.Beliefs among Baptists regarding the "end times" include amillennialism, dispensationalism, and historic premillennialism, with views such as postmillennialism and preterism receiving some support. Some additional distinctive Baptist principles held by many Baptists: The supremacy of the canonical Scriptures as a norm of faith and practice. 
For something to become a matter of faith and practice, it is not sufficient for it to be merely consistent with and not contrary to scriptural principles. It must be something explicitly ordained through command or example in the Bible. For instance, this is why Baptists do not practice infant baptism—they say the Bible neither commands nor exemplifies infant baptism as a Christian practice. More than any other Baptist principle, this one when applied to infant baptism is said to separate Baptists from other evangelical Christians. Baptists believe that faith is a matter between God and the individual (religious freedom). To them it means the advocacy of absolute liberty of conscience. Insistence on immersion believer's baptism as the only mode of baptism. Baptists do not believe that baptism is necessary for salvation. Therefore, for Baptists, baptism is an ordinance, not a sacrament, since, in their view, it imparts no saving grace. Beliefs that vary among Baptists Since there is no hierarchical authority and each Baptist church is autonomous, there is no official set of Baptist theological beliefs. These differences exist both among associations, and even among churches within the associations. Some doctrinal issues on which there is widespread difference among Baptists are: Eschatology Arminianism versus Calvinism (General Baptists uphold Arminian theology while Particular Baptists teach Calvinist theology). The doctrine of separation from "the world" and whether to associate with those who are "of the world" Belief in a second work of grace, i.e. 
entire sanctification (held by General Baptists in the Holiness tradition) Speaking-in-tongues and the operation of other charismatic gifts of the Holy Spirit in the charismatic churches How the Bible should be interpreted (hermeneutics) The extent to which missionary boards should be used to support missionaries The extent to which non-members may participate in the Lord's Supper services Which translation of Scripture to use (see King-James-Only movement in the English-speaking world) Dispensationalism versus Covenant theology The role |
their far right ("third base"). Each box gets an initial hand of two cards visible to the people playing on it. The dealer's hand gets its first card face up, and, in "hole card" games, immediately gets a second card face down (the hole card), which the dealer peeks at but only reveals when it makes the dealer's hand a blackjack. Hole card games are sometimes played on tables with a small mirror or electronic sensor used to peek securely at the hole card. In European casinos, "no hole card" games are prevalent; the dealer's second card is not drawn until the players have played their hands. Dealers deal the cards from one or two handheld decks, from a dealer's shoe, or from a shuffling machine. Single cards are dealt to each wagered-on position clockwise from the dealer's left, followed by a single card to the dealer, followed by an additional card to each of the positions in play. The players' initial cards may be dealt face up or face down (more common in single-deck games). The object of the game is to win money by creating card totals higher than those of the dealer's hand but not exceeding 21, or by stopping at a total in the hope that the dealer will bust. On their turn, players choose to "hit" (take a card), "stand" (end their turn and stop without taking a card), "double" (double their wager, take a single card, and finish), "split" (if the two cards have the same value, separate them to make two hands), or "surrender" (give up a half-bet and retire from the game). Number cards count as their number, the jack, queen, and king ("face cards" or "pictures") count as 10, and aces count as either 1 or 11 according to the player's choice. If the total exceeds 21 points, it busts, and all bets on it immediately lose.
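The valuation rules just described (number cards at face value, pictures as 10, aces as 1 or 11) reduce to a small function. A minimal sketch in Python; the name `hand_value` and the rank encoding are illustrative, not from any standard library:

```python
def hand_value(cards):
    """Best blackjack total for a list of ranks like ['A', 'K', '7'].

    Faces count 10; each ace counts 11 unless that would bust, else 1.
    """
    total = 0
    aces = 0
    for card in cards:
        if card == 'A':
            aces += 1
            total += 11          # count aces as 11 first
        elif card in ('J', 'Q', 'K'):
            total += 10          # face cards ("pictures") count as 10
        else:
            total += int(card)   # number cards count as their number
    while total > 21 and aces:   # demote aces from 11 to 1 as needed
        total -= 10
        aces -= 1
    return total

print(hand_value(['A', 'K']))        # 21 (a "natural")
print(hand_value(['A', '9', '5']))   # 15 (ace demoted to 1)
print(hand_value(['K', 'Q', '5']))   # 25 (bust)
```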
After the boxes have finished playing, the dealer's hand is resolved by drawing cards until the hand achieves a total of 17 or higher (a dealer total of 17 including an ace valued as 11, also known as a "soft 17", must be drawn to in some games and must stand in others). The dealer never doubles, splits, or surrenders. If the dealer busts, all remaining player hands win. If the dealer does not bust, each remaining bet wins if its hand is higher than the dealer's and loses if it is lower. A player total of 21 on the first two cards is a "natural" or "blackjack," and the player wins immediately unless the dealer also has one, in which case the hand ties. In the case of a tie ("push" or "standoff"), bets are returned without adjustment. But a blackjack beats any hand that is not a blackjack, even one with a value of 21. Wins are paid out at even money, except for player blackjacks, which are traditionally paid out at 3 to 2 odds. Many casinos today pay blackjacks at less than 3:2. This is common in single-deck blackjack games. Blackjack games usually offer a side bet called insurance, which may be placed when the dealer's face up card is an ace. Additional side bets, such as "Dealer Match" which pays when the player's cards match the dealer's up card, are also sometimes available. Player decisions After the initial two cards, the player has up to five options: "hit", "stand", "double down", "split", or "surrender". Each option has a corresponding hand signal. Hit: Take another card. Signal: Scrape cards against table (in handheld games); tap the table with finger or wave hand toward body (in games dealt face up). Stand: Take no more cards; also known as "stand pat", "sit", "stick", or "stay". Signal: Slide cards under chips (in handheld games); wave hand horizontally (in games dealt face up). Double down: Increase the initial bet by 100% and take exactly one more card. The additional bet is placed next to the original bet.
Some games permit the player to increase the bet by amounts smaller than 100%. Non-controlling players may or may not double their wager, but they still only take one card. Signal: Place additional chips beside the original bet outside the betting box and point with one finger. Split: Create two hands from a starting hand where both cards are the same value. Each new hand gets another card so that the player has two starting hands. This requires an additional bet on the second hand. The two hands are played out independently, and the wager on each hand is won or lost independently. In the case of cards worth 10 points, some casinos only allow splitting when the cards are the same rank. For example, 10-10 could be split, but K-10 could not. Doubling and re-splitting after splitting are often restricted. A split hand consisting of a 10-valued card and an ace is usually not considered a blackjack. Hitting split aces is often not allowed. Non-controlling players can opt to put up a second bet or not. If they do not, they only get paid or lose on one of the two post-split hands. Signal: Place additional chips next to the original bet outside the betting box and point with two fingers spread into a V formation. Surrender: Forfeit half the bet and end the hand immediately. This option is only available at some tables in some casinos, and the option is only available as the first decision. Signal: Spoken; there are no standard signals. Hand signals help the "eye in the sky" make a video recording of the table, which resolves disputes and identifies dealer mistakes. The recording is also used to protect the casino against dealers who steal chips or players who cheat. Recordings can also identify advantage players. When a player's hand signal disagrees with their words, the hand signal takes precedence. A hand can "hit" as many times as desired until the total is 21 or more. Players must stand on a total of 21.
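The dealer's side has no such decisions: as described earlier, the dealer simply draws to 17 or higher, with the soft-17 treatment varying by house rules. A sketch under illustrative assumptions (the rank encoding and the `shoe` list of upcoming cards are hypothetical, not a real casino API):

```python
def dealer_play(hand, shoe, hit_soft_17=False):
    """Resolve the dealer's hand: draw until 17 or higher, with an
    optional rule to also hit a "soft 17" (the H17/S17 variants
    discussed under rule variations). `hand` and `shoe` hold ranks
    like 'A', 'K', '9'; `shoe` is consumed from the end.
    """
    def value(cards):
        total, aces = 0, 0
        for c in cards:
            if c == 'A':
                total, aces = total + 11, aces + 1
            elif c in ('J', 'Q', 'K'):
                total += 10
            else:
                total += int(c)
        while total > 21 and aces:       # demote aces from 11 to 1
            total, aces = total - 10, aces - 1
        return total, aces > 0           # soft if an ace still counts as 11

    while True:
        total, soft = value(hand)
        if total > 21:
            return total                 # dealer busts
        if total > 17 or (total == 17 and not (soft and hit_soft_17)):
            return total                 # dealer stands
        hand.append(shoe.pop())          # dealer must hit

# A-6 is a soft 17: an S17 dealer stands, an H17 dealer keeps drawing.
print(dealer_play(['A', '6'], ['9', '5'], hit_soft_17=False))  # 17
print(dealer_play(['A', '6'], ['9', '5'], hit_soft_17=True))   # 21 (draws 5, then 9)
```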
After a bust or a stand, play proceeds to the next hand clockwise around the table. After the last hand is played, the dealer reveals the hole card and stands or draws according to the game's rules. When the outcome of the dealer's hand is established, any hands with bets remaining on the table are resolved (usually in counterclockwise order); bets on losing hands are forfeited, the bet on a push is left on the table, and winners are paid out. Insurance If the dealer shows an ace, an "insurance" bet is allowed. Insurance is a side bet that the dealer has a blackjack. The dealer asks for insurance bets before the first player plays. Insurance bets of up to half the player's current bet are placed on the "insurance bar" above the player's cards. If the dealer has a blackjack, insurance pays 2 to 1. In most casinos, the dealer looks at the down card and pays off or takes the insurance bet immediately. In other casinos, the payoff waits until the end of the play. In face-down games, if a player has more than one hand, they are allowed to look at all their hands before deciding. This is the only condition where a player can look at multiple hands. Players with blackjack can also take insurance. Insurance bets lose money in the long run. The dealer has a blackjack less than one-third of the time. In some games, players can also take insurance when a 10-valued card shows, but the dealer has an ace in the hole less than one-tenth of the time. The insurance bet is susceptible to advantage play. It is advantageous to make an insurance bet whenever the hole card has more than a one in three chance of being a ten. Card counting techniques can identify such situations. Rule variations and effects on house edge Note: where changes in the house edge due to changes in the rules are stated in percentage terms, the difference is usually stated here in percentage points, not percentage.
For example, if an edge of 10% is reduced to 9%, it is reduced by one percentage point, not reduced by ten percent. Blackjack rules are generally set by regulations which establish permissible rule variations at the casino's discretion. Blackjack comes with a "house edge"; the casino's statistical advantage is built into the game. Most of the house's edge comes from the fact that the player loses when both the player and dealer bust. Blackjack players using basic strategy lose less than an average of 1% of their action over the long run, giving blackjack one of the lowest edges in the casino. The house edge for games where blackjack pays 6 to 5 instead of 3 to 2 increases by about 1.4%, though. Player deviations from basic strategy also increase the house edge. Dealer hits soft 17 Each game has a rule about whether the dealer must hit or stand on soft 17, which is generally printed on the table surface. The variation where the dealer must hit soft 17 is abbreviated "H17" in blackjack literature, with "S17" used for the stand-on-soft-17 variation. Substituting an "H17" rule with an "S17" rule in a game benefits the player, decreasing the house edge by about 0.2%. Number of decks All things being equal, using fewer decks decreases the house edge. This mainly reflects an increased likelihood of player blackjack, since if the player draws a ten on their first card, the subsequent probability of drawing an ace is higher with fewer decks. It also reflects a decreased likelihood of a blackjack-blackjack push in a game with fewer decks. Casinos generally compensate by tightening other rules in games with fewer decks, in order to preserve the house edge or discourage play altogether. When offering single-deck blackjack games, casinos are more likely to disallow doubling on soft hands or after splitting, to restrict resplitting, to require higher minimum bets, and to pay the player less than 3:2 for a winning blackjack.
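The deck-count effect on blackjack frequency can be made concrete with a little combinatorics: a natural off the top of a fresh shoe is an ace and a ten-value card in either order. A quick sketch (the function name is illustrative):

```python
from fractions import Fraction

def p_blackjack(decks):
    """Chance that two cards off the top of a fresh shoe form a
    natural: an ace and a ten-value card, in either order."""
    cards = 52 * decks
    aces, tens = 4 * decks, 16 * decks
    return 2 * Fraction(aces, cards) * Fraction(tens, cards - 1)

for d in (1, 2, 4, 8):
    print(f"{d} deck(s): {float(p_blackjack(d)):.4%}")
# Single deck ≈ 4.83%; eight decks ≈ 4.75% — fewer decks, more naturals.
```

The single-deck figure also matches the "approximately 4.8% of hands" cited later in the discussion of altered blackjack payouts.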
The following table illustrates the mathematical effect on the house edge of the number of decks, by considering games with various deck counts under the following ruleset: double after split allowed, resplit to four hands allowed, no hitting split aces, no surrender, double on any two cards, original bets only lost on dealer blackjack, dealer hits soft 17, and cut-card used. The increase in house edge per unit increase in the number of decks is most dramatic when comparing the single deck game to the two-deck game, and becomes progressively smaller as more decks are added. Late/early surrender Surrender, for those games that allow it, is usually not permitted against a dealer blackjack; if the dealer's first card is an ace or ten, the hole card is checked to make sure there is no blackjack before surrender is offered. This rule protocol is consequently known as "late" surrender. The alternative, "early" surrender, gives the player the option to surrender before the dealer checks for blackjack, or in a no hole card game. Early surrender is much more favorable to the player than late surrender. For late surrender, however, while it is tempting to opt for surrender on any hand which will probably lose, the correct strategy is to only surrender on the very worst hands, because having even a one in four chance of winning the full bet is better than losing half the bet and pushing the other half, as entailed by surrendering. Resplitting If the cards of a post-split hand have the same value, most games allow the player to split again, or "resplit". The player places a further wager and the dealer separates the new pair dealing a further card to each as before. Some games allow unlimited resplitting, while others may limit it to a certain number of hands, such as four hands (for example, "resplit to 4").
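The "one in four" surrender threshold above follows from a simplified expected-value model that ignores blackjack bonuses: playing on returns +1 on a win, 0 on a push, and -1 on a loss, while surrendering is a flat -0.5. A sketch under those assumptions (names are illustrative):

```python
def ev_play(p_win, p_push=0.0):
    """EV per unit bet of playing the hand out: +1 on a win, 0 on a
    push, -1 on a loss. Simplified model: no blackjack bonus."""
    return p_win - (1 - p_win - p_push)

EV_SURRENDER = -0.5   # forfeit half the bet, keep the other half

# With pushes ignored, playing on beats surrendering exactly when the
# chance of winning exceeds one in four:
for p in (0.20, 0.25, 0.30):
    print(p, round(ev_play(p), 2), ev_play(p) > EV_SURRENDER)
```

At a 25% win probability the two choices break even; only hands worse than that justify late surrender.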
Hit/resplit split aces After splitting aces, the common rule is that only one card will be dealt to each ace; the player cannot split, double, or take another hit on either hand. Rule variants include allowing resplitting aces or allowing the player to hit split aces. Games allowing aces to be resplit are not uncommon, but those allowing the player to hit split aces are extremely rare. Allowing the player to hit hands resulting from split aces reduces the house edge by about 0.13%; allowing resplitting of aces reduces the house edge by about 0.03%. Note that a ten-value card dealt on a split ace (or vice versa) will not be counted as a blackjack, but as a soft 21. No double after split After a split, most games allow doubling down on the new two-card hands. Disallowing doubling after a split increases the house edge by about 0.12%. Double on 9/10/11 or 10/11 only Under the "Reno rule", double down is only permitted on hard totals of 9, 10, or 11 (under a similar European rule, only 10 or 11). Basic strategy would otherwise call for some doubling down with hard 9 and soft 13–18, and advanced players can identify situations where doubling on soft 19–20 and hard 8, 7 and even 6 is advantageous. The Reno rule prevents the player from taking advantage of double down in these situations and thereby increases the player's expected loss. The Reno rule increases the house edge by around 1 in 1,000, and its European version by around 1 in 500. No hole card and OBO In most non-U.S. casinos, a "no hole card" game is played, meaning that the dealer does not draw or consult their second card until after all players have finished making decisions. With no hole card, it is almost never correct basic strategy to double or split against a dealer ten or ace, since a dealer blackjack will result in the loss of the split and double bets; the only exception is with a pair of aces against a dealer 10, where it is still correct to split.
In all other cases, a stand, hit or surrender is called for. For instance, holding 11 against a dealer 10, the correct strategy is to double in a hole card game (where the player knows the dealer's second card is not an ace), but to hit in a no hole card game. The no hole card rule adds approximately 0.11% to the house edge. The "original bets only" rule variation appearing in certain no hole card games states that if the player's hand loses to a dealer blackjack, only the mandatory initial bet ("original") is forfeited, and all optional bets, meaning doubles and splits, are pushed. "Original bets only" is also known by the acronym OBO; it has the same effect on basic strategy and house edge as reverting to a hole card game. Altered payout for a winning blackjack In many casinos, a blackjack pays only 6:5 or even 1:1 instead of the usual 3:2. This is most common at tables with lower table minimums. Although this payoff was originally limited to single-deck games, it has spread to double-deck and shoe games. Among common rule variations in the U.S., these altered payouts for blackjack are the most damaging to the player, causing the greatest increase in house edge. Since blackjack occurs in approximately 4.8% of hands, the 1:1 game increases the house edge by 2.3%, while the 6:5 game adds 1.4% to the house edge. Video blackjack machines generally pay 1:1 for a blackjack. Dealer wins ties The rule that bets on tied hands are lost rather than pushed is catastrophic to the player. Though rarely used in standard blackjack, it is sometimes seen in "blackjack-like" games such as in some charity casinos. Blackjack strategy Basic strategy Each blackjack game has a basic strategy, the optimal method of playing any hand. When using basic strategy, the long-term house advantage (the expected loss of the player) is minimized.
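The payout figures quoted above follow from simple arithmetic: the cost of a reduced blackjack bonus is the payout shortfall times the roughly 4.8% frequency of a natural. A sketch of that calculation (illustrative, not from the source):

```python
# House-edge increase from paying a winning blackjack less than 3:2 (1.5x the bet).
BLACKJACK_FREQUENCY = 0.048  # a natural occurs in about 4.8% of hands (per the text)

def edge_increase(new_payout: float, base_payout: float = 1.5) -> float:
    """Extra house edge when a blackjack pays new_payout instead of base_payout."""
    return (base_payout - new_payout) * BLACKJACK_FREQUENCY

print(f"6:5 game: +{edge_increase(1.2):.2%}")  # +1.44%, matching the ~1.4% above
print(f"1:1 game: +{edge_increase(1.0):.2%}")  # +2.40%, close to the ~2.3% above
```

The small gap between 2.40% here and the 2.3% in the text comes from rounding the natural frequency to 4.8%.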
An example of a basic strategy is shown in the table below, which applies to a game with the following specifications: Four to eight decks The dealer hits on a soft 17 A double is allowed after a split Only original bets are lost on dealer blackjack Key: S = Stand H = Hit Dh = Double (if not allowed, then hit) Ds = Double (if not allowed, then stand) SP = Split Uh = Surrender (if not allowed, then hit) Us = Surrender (if not allowed, then stand) Usp = Surrender (if not allowed, then split) Most basic strategy decisions are the same for all blackjack games. Rule variations call for changes in only a few situations. For example, to use the table above on a game with the stand on soft 17 rule (which favors the player, and is typically found only at higher-limit tables today) only 6 cells would need to be changed: hit on 11 vs. A, hit on 15 vs. A,
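A chart of this kind is naturally stored as a lookup keyed by the player's hand and the dealer's upcard. The sketch below is hypothetical: it encodes only a few uncontroversial hard-hand rules (split aces and eights, stand on hard 17 or more, hit hard 8 or less), not the full table, and soft hands are not modeled.

```python
# Minimal basic-strategy lookup sketch (hard hands only; illustrative, not complete).
# Action codes mirror the legend above: S = stand, H = hit, SP = split.
from typing import Optional

def basic_action(hard_total: int, pair_rank: Optional[str], dealer_upcard: int) -> str:
    if pair_rank in ("A", "8"):   # always split aces and eights
        return "SP"
    if hard_total >= 17:          # stand on hard 17 or more
        return "S"
    if hard_total <= 8:           # always hit hard 8 or less
        return "H"
    return "H"                    # placeholder: the nuanced 9-16 cells depend on the upcard

print(basic_action(12, "A", 6))   # SP
print(basic_action(18, None, 10)) # S
print(basic_action(7, None, 2))   # H
```

A full implementation would replace the placeholder branch with a per-upcard table covering doubles and surrenders as well.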
The bicarbonate ion (HCO3−) is both the conjugate base of carbonic acid (H2CO3) and the conjugate acid of CO3^2−, the carbonate ion, as shown by these equilibrium reactions: CO3^2− + 2 H2O ⇌ HCO3− + H2O + OH− ⇌ H2CO3 + 2 OH−, and H2CO3 + 2 H2O ⇌ HCO3− + H3O+ + H2O ⇌ CO3^2− + 2 H3O+. A bicarbonate salt forms when a positively charged ion attaches to the negatively charged oxygen atoms of the ion, forming an ionic compound. Many bicarbonates are soluble in water at standard temperature and pressure; in particular, sodium bicarbonate contributes to total dissolved solids, a common parameter for assessing water quality. Physiological role Bicarbonate (HCO3−) is a vital component of the pH buffering system of the human body (maintaining acid–base homeostasis). 70%–75% of CO2 in the body is converted into carbonic acid (H2CO3), which is the conjugate acid of HCO3− and can quickly turn into it. With carbonic acid as the central intermediate species, bicarbonate – in conjunction with water, hydrogen ions, and carbon dioxide – forms this buffering system, which is maintained at the volatile equilibrium required to provide prompt resistance to pH changes in both the acidic and basic directions. This is especially important for protecting tissues of the central nervous system, where pH changes too far outside of the normal range in either direction could prove disastrous (see acidosis or alkalosis). Additionally, bicarbonate plays a key role in the digestive system. It raises the internal pH of the stomach, after highly acidic digestive juices have finished in their digestion of food. Bicarbonate also acts to regulate pH in the small intestine. It is released from the pancreas in response to the hormone secretin to neutralize the acidic chyme entering the duodenum from the stomach. Bicarbonate in the environment Bicarbonate is the dominant form of dissolved inorganic carbon in sea water, and in most fresh waters. As such it is an important sink in the carbon cycle.
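The buffering behaviour described here is commonly quantified with the Henderson–Hasselbalch equation, pH = pKa + log10([HCO3−]/[H2CO3]). That relation, and the pKa of about 6.1 for the CO2/bicarbonate system in plasma, are standard chemistry rather than figures from this text; the sketch below applies them to typical arterial values.

```python
import math

# Henderson-Hasselbalch estimate of pH for the bicarbonate buffer system.
PKA_BICARBONATE = 6.1  # conventional pKa for CO2/HCO3- in plasma (assumed, not from the text)

def buffer_ph(bicarbonate_mmol: float, carbonic_acid_mmol: float) -> float:
    return PKA_BICARBONATE + math.log10(bicarbonate_mmol / carbonic_acid_mmol)

# Typical arterial chemistry: ~24 mmol/L bicarbonate against ~1.2 mmol/L
# dissolved CO2/carbonic acid, a 20:1 ratio that lands near normal blood pH.
print(round(buffer_ph(24, 1.2), 2))  # 7.4
```

The 20:1 ratio illustrates why the system resists change in both directions: added acid consumes HCO3−, added base consumes H2CO3, and the log term moves only slowly.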
In freshwater ecology, strong photosynthetic activity by freshwater plants in daylight releases gaseous oxygen into the water and at the same time produces bicarbonate ions. These shift the pH upward until
mid-season to play 31 games with St. Louis. He scored three hat tricks in those 31 games. In the 1978–79 NHL season, Federko developed into a bona fide star, as he scored 95 points. Federko scored 100 points in a season four times, and was a consistent and underrated performer for the Blues. Federko scored at least 90 points in seven of the eight seasons between 1978 and 1986, and became the first player in NHL history to record at least 50 assists in 10 consecutive seasons. However, in an era when Wayne Gretzky was scoring 200 points a season, Federko never got the attention many felt he deserved. In 1986, in a poll conducted by GOAL magazine, he was named the most overlooked talent in hockey. His General Manager Ron Caron said he was "A great playmaker. He makes the average or above average player look like a star at times. He's such an unselfish player." On March 19, 1988, Federko became the 22nd NHL player to record 1000 career points. After a poor season for Federko in 1988–89, he was traded to the Detroit Red Wings with Tony McKegney for future Blues star Adam Oates and Paul MacLean. In Detroit, Federko re-united with former Blues head coach Jacques Demers, but he had to play behind Steve Yzerman and did not get his desired ice time. After his lowest point output since his rookie season, Federko decided to retire after the 1989–90 season, having played exactly 1,000 NHL games with his final game on April 1, 1990. Post-NHL career Less than a year after retiring as a player, the Blues retired number 24 in his honor on March 16, 1991. Federko was eventually inducted into the Hockey Hall of Fame in 2002, the first Hall of Famer to earn his credentials primarily as a Blue. Currently, Federko is a television color commentator for Bally Sports Midwest during Blues broadcasts. Federko was the head coach/general manager of the St. Louis Vipers roller hockey team of the Roller Hockey International for the 1993 and 1994 seasons. Awards Bob Brownridge Memorial Trophy (WCHL leading scorer) - 1976 Named
mixed land use received an award at the 2019 Congress for the New Urbanism conference. Climate Buffalo has a humid continental climate (Köppen Dfb bordering on Dfa, common in the Great Lakes region), and temperatures have been warming with the rest of the US. Lake-effect snow is characteristic of Buffalo winters, with snow bands (producing intense snowfall in the city and surrounding area) depending on wind direction off Lake Erie. However, Buffalo is rarely the snowiest city in the state. The Blizzard of 1977 resulted from a combination of high winds and snow which accumulated on land and on the frozen Lake Erie. Although snow does not typically impair the city's operation, it can cause significant damage in autumn (as the October 2006 storm did). In November 2014 (called "Snowvember"), the region had a record-breaking storm which produced over of snow. Buffalo's lowest recorded temperature was , which occurred twice: on February 9, 1934, and February 2, 1961. Although the city's summers are drier and sunnier than other cities in the northeastern United States, its vegetation receives enough precipitation to remain hydrated. Buffalo summers are characterized by abundant sunshine, with moderate humidity and temperatures; the city benefits from cool, southwestern Lake Erie summer breezes which temper warmer temperatures. Temperatures rise above an average of three times a year. No official recording of or more has occurred to date, with a maximum temperature of 99 °F reached on August 27, 1948. Rainfall is moderate, typically falling at night, and cooler lake temperatures hinder storm development in July. August is usually rainier and muggier, as the warmer lake loses its temperature-controlling ability. Demographics Several hundred Seneca, Tuscarora and other Iroquois tribal peoples were the primary residents of the Buffalo area before 1800, concentrated along Buffalo Creek.
After the Revolutionary War, settlers from New England and eastern New York began to move into the area. From the 1830s to the 1850s, they were joined by Irish and German immigrants from Europe, both peasants and working class, who settled in enclaves on the city's south and east sides. At the turn of the 20th century, Polish immigrants replaced Germans on the East Side, who moved to newer housing; Italian immigrant families settled throughout the city, primarily on the lower West Side. During the 1830s, Buffalo residents were generally intolerant of the small groups of Black Americans who began settling on the city's East Side. In the 20th century, wartime and manufacturing jobs attracted Black Americans from the South during the First and Second Great Migrations. In the World War II and postwar years from 1940 to 1970, the city's Black population rose by 433 percent. They replaced most of the Polish community on the East Side, who were moving out to suburbs. However, the effects of redlining, steering, social inequality, blockbusting, white flight and other racial policies resulted in the city (and region) becoming one of the most segregated in the U.S. During the 1940s and 1950s, Puerto Rican migrants arrived en masse, also seeking industrial jobs, settling on the East Side and moving westward. In the 21st century, Buffalo is classified as a majority minority city, with a plurality of residents who are Black and Latino. Buffalo has mitigated the effects of urban decay since the 1970s, including population losses to the suburbs and Sun Belt states, and job losses from deindustrialization. The city's population peaked at 580,132 in 1950, when Buffalo was the 15th-largest city in the United States, down from the eighth-largest city in 1900, after its growth rate slowed during the 1920s. Buffalo's population began declining in the second half of the 20th century, due to suburbanization and loss of industrial jobs, and began stabilizing during the 2010s.
The city had a population of 261,310 in the 2010 census which increased to 278,349 residents in the 2020 census, making it the 76th-largest city in the United States. Its metropolitan area had 1.1 million residents in 2020, the country's 49th-largest. Compared to other major US metropolitan areas, the number of foreign-born immigrants to Buffalo is low. New immigrants are primarily resettled refugees (especially from war- or disaster-afflicted nations) and refugees who had previously settled in other U.S. cities. During the early 2000s, most immigrants came from Canada and Yemen; this shifted in the 2010s to Burmese (Karen) refugees and Indian immigrants. Between 2008 and 2016, Burmese, Somali, Bhutanese, and Iraqi Americans were the four largest ethnic immigrant groups in Erie County. Poverty has remained an issue for the city; in 2019, it was estimated that 30.1 percent of individuals and 24.8 percent of families lived below the federal poverty line. Per capita income was $24,400 and household income was $37,354: much less than the national average. A 2008 report noted that although food deserts were seen in larger cities and not in Buffalo, the city's neighborhoods of color have access only to smaller grocery stores and lack the supermarkets more typical of newer, white neighborhoods. A 2018 report noted that over fifty city blocks on Buffalo's East Side lacked adequate access to a supermarket. Health disparities exist compared to the rest of the state: Erie County's average 2019 lifespan was three years lower (78.4 years); its 17-percent smoking and 30-percent obesity rates were slightly higher than the state average. According to the Partnership for the Public Good, educational achievement in the city is lower than in the surrounding area; city residents are almost twice as likely as adults in the metropolitan area to lack a high-school diploma. 
Religion During the early 19th century, Presbyterian missionaries tried to convert the Seneca people on the Buffalo Creek Reservation to Christianity. Initially resistant, some tribal members set aside their traditions and practices to form their own sect. Later, European immigrants added other faiths. Christianity is the predominant religion in Buffalo and Western New York. Catholicism (primarily the Latin Church) has a significant presence in the region, with 161 parishes and over 570,000 adherents in the Diocese of Buffalo. A Jewish community began developing in the city with immigrants from the mid-1800s; about one thousand German and Lithuanian Jews settled in Buffalo before 1880. Buffalo's first synagogue, Temple Beth El, was established in 1847. The city's Temple Beth Zion is the region's largest synagogue. With changing demographics and an increased number of refugees from other areas on the city's East Side, Islam and Buddhism have expanded their presence. In this area, new residents have converted empty churches into mosques and temples. Hinduism maintains a small, active presence in the area, including the town of Amherst. A 2016 American Bible Society survey reported that Buffalo is the fifth-least "Bible-minded" city in the United States; 13 percent of its residents associate with the Bible. Economy The Erie Canal was the impetus for Buffalo's economic growth as a transshipment hub for grain and other agricultural products headed east from the Midwest. Later, manufacturing of steel and automotive parts became central to the city's economy. When these industries downsized in the region, Buffalo's economy became service-based. Its primary sectors include health care, business services (banking, accounting, and insurance), retail, tourism and logistics, especially with Canada. Despite the loss of large-scale manufacturing, some manufacturing of metals, chemicals, machinery, food products, and electronics remains in the region. 
Advanced manufacturing has increased, with an emphasis on research and development (R&D) and automation. In 2019, the U.S. Bureau of Economic Analysis valued the gross domestic product (GDP) of the Buffalo–Niagara Falls MSA at $53 billion. The civic sector is a major source of employment in the Buffalo area, and includes public, non-profit, healthcare and educational institutions. New York State, with over 19,000 employees, is the region's largest employer. In the private sector, top employers include the Kaleida Health and Catholic Health hospital networks and M&T Bank, the sole Fortune 500 company headquartered in the city. Most have been the top employers in the region for several decades. Buffalo is home to the headquarters of Rich Products, Delaware North and New Era Cap Company; the aerospace manufacturer Moog Inc. is based in nearby East Aurora. Buffalo weathered the Great Recession of 2007–09 well in comparison with other U.S. cities, exemplified by increased home prices during this time. The region's economy began to improve in the early 2010s, adding over 25,000 jobs from 2009 to 2017. With state aid, Tesla, Inc.'s Giga New York plant opened in South Buffalo in 2017. The effects of the COVID-19 pandemic in the United States, however, increased the local unemployment rate to 7.5 percent by December 2020. The local unemployment rate had been 4.2 percent in 2019, higher than the national average of 3.5 percent. The Buffalo area has a larger pay disparity than the rest of the U.S. The average salary ($43,580) was six percent less than the national average in 2017, with the pay gap increasing to ten percent with increased career specialization. Workforce productivity is higher and turnover lower than other regions. Culture Performing arts and music Buffalo is home to over 20 theater companies, with many centered in the downtown Theatre District. Shea's Performing Arts Center is the city's largest theater.
Designed by Louis Comfort Tiffany and built in 1926, the theater presents Broadway musicals and concerts. Other venues include Shea's 710 Theatre, Alleyway Theatre, Theater of Youth and Canalside, where major acts draw about seven thousand concertgoers. The Buffalo Philharmonic Orchestra was formed in 1935 and performs at Kleinhans Music Hall, whose acoustics have been praised. Although the orchestra nearly disbanded during the late 1990s due to a lack of funding, philanthropic contributions and state aid stabilized it. Under the direction of JoAnn Falletta, the orchestra has received a number of Grammy Award nominations and won the Grammy Award for Best Contemporary Classical Composition in 2009. Rick James was born and raised in Buffalo and later lived on a ranch in the nearby Town of Aurora. James formed his Stone City Band in Buffalo, and had national appeal with several crossover singles in the R&B, disco and funk genres in the late 1970s and early 1980s. Around the same time, the jazz fusion band Spyro Gyra and jazz saxophonist Grover Washington Jr. also got their start in the city. Buffalo's Colored Musicians Club, an extension of what was a separate musicians'-union chapter, maintains jazz history. The Goo Goo Dolls, an alternative rock group which formed in 1986, had 19 top-ten singles. Singer-songwriter and activist Ani DiFranco has released over 20 folk and indie rock albums on Righteous Babe Records, her Buffalo-based label. Underground hip-hop acts in the city partner with Buffalo-based Griselda Records, whose artists include Westside Gunn and Conway the Machine and occasionally refer to Buffalo culture in their lyrics. Cuisine The city's cuisine encompasses a variety of cultures and ethnicities. In 2015, the National Geographic Society ranked Buffalo third on its "World's Top Ten Food Cities" list. Teressa Bellissimo first prepared Buffalo wings (seasoned chicken wings) at the Anchor Bar in 1964. 
The Anchor Bar has a crosstown rivalry with Duff's Famous Wings, but Buffalo wings are served at many bars and restaurants throughout the city (some with unique cooking styles and flavor profiles). Buffalo wings are traditionally served with blue cheese and celery. In 2003, the Anchor Bar received a James Beard Foundation Award in the America's Classics category. Buffalo-style pizza has elements of Chicago-style pizza and New York-style pizza. The Buffalo area has over 600 pizzerias, by some estimates more per capita than New York City. Several craft breweries began opening in the 1990s, and the city's last call is 4 am. Other mainstays of Buffalo cuisine include beef on weck and Polish butter lambs, kielbasa and pierogis; sponge candy, and the fish fry (popular during Lent). With an influx of refugees and other immigrants to Buffalo, its number of ethnic restaurants (including the West Side Bazaar kitchen incubator) has increased. Some restaurants use food trucks to serve customers, and nearly fifty food trucks appeared at Larkin Square in 2019. Museums and tourism Buffalo was ranked the seventh-best city in the United States to visit in 2021 by Travel + Leisure, which noted the growth and potential of the city's cultural institutions. The Albright–Knox Art Gallery is a modern and contemporary art museum with a collection of more than 8,000 works, of which only two percent are on display. With a donation from Jeffrey Gundlach, a three-story addition designed by the Dutch architectural firm OMA is under construction and scheduled to open in 2022. Across the street, the Burchfield Penney Art Center contains paintings by Charles E. Burchfield and is operated by Buffalo State College. Buffalo is home to the Freedom Wall, a 2017 art installation commemorating civil-rights activists throughout history.
Near both museums is the Buffalo History Museum, featuring artwork, literature and exhibits related to the city's history and major events, and the Buffalo Museum of Science is on the city's East Side. Canalside, Buffalo's historic business district and harbor, attracts more than 1.5 million visitors annually. It includes the Explore & More Children's Museum, the Buffalo and Erie County Naval & Military Park, LECOM Harborcenter, and a number of shops and restaurants. A restored 1924 carousel (now solar-powered) and a replica boathouse were added to Canalside in 2021. Other city attractions include the Theodore Roosevelt Inaugural National Historic Site, the Michigan Street Baptist Church, Buffalo RiverWorks, Seneca Buffalo Creek Casino, Buffalo Transportation Pierce-Arrow Museum, and the Nash House Museum. The National Buffalo Wing Festival is held every Labor Day at Sahlen Field. Since 2002, it has served over 4.8 million Buffalo wings and has had a total attendance of 865,000. The Taste of Buffalo is a two-day food festival held in July at Niagara Square, attracting 450,000 visitors annually. Other events include the Allentown Art Festival, the Polish-American Dyngus Day, the Elmwood Avenue Festival of the Arts, Juneteenth in Martin Luther King Jr. Park, and the World's Largest Disco in October. Sports Buffalo has two major professional sports teams: the Buffalo Sabres (National Hockey League) and the Buffalo Bills (National Football League). The Bills were a founding member of the American Football League in 1960, and have played at Highmark Stadium in Orchard Park since they moved from War Memorial Stadium in 1973. They are the only NFL team based in New York State. Before the Super Bowl era, the Bills won the American Football League Championship in 1964 and 1965. With mixed success throughout their history, the Bills had a close loss in Super Bowl XXV and returned to consecutive Super Bowls after the 1991, 1992, and 1993 seasons (losing each time). 
The Sabres, an expansion team in 1970, share KeyBank Center with the Buffalo Bandits of the National Lacrosse League. The Bandits are the most successful of the city's three major-league teams, with four championships. The Bills, Sabres and Bandits are owned by Pegula Sports and Entertainment. Several colleges and universities in the area field intercollegiate sports teams; the Buffalo Bulls and the Canisius Golden Griffins compete in NCAA Division I. The Bulls have 16 varsity sports in the Mid-American Conference (MAC); the Golden Griffins field 15 teams in the Metro Atlantic Athletic Conference (MAAC), with the men's hockey team part of the Atlantic Hockey Association (AHA). The Bulls participate in the Football Bowl Subdivision, the highest level of college football. Buffalo's minor-league teams include the Buffalo Bisons (Triple-A baseball), who play at Sahlen Field, and the Buffalo Beauts (National Women's Hockey League). Parks and recreation Frederick Law Olmsted described Buffalo as being "the best planned city [...] in the United States, if not the world". With encouragement from city stakeholders, he and Calvert Vaux augmented the city's grid plan by drawing inspiration from Paris and introducing landscape architecture with aspects of the countryside. Their plan would introduce a system of interconnected parks, parkways and trails, unlike the singular Central Park in New York City. The largest would be Delaware Park, laid out across from Forest Lawn Cemetery to amplify the amount of open space. With construction of the system finishing in 1876, it is regarded as the country's oldest; however, some of Olmsted's plans were never fully realized. Some parks later diminished and succumbed to diseases, highway construction, and weather events such as Lake Storm Aphid in 2006. The non-profit Buffalo Olmsted Park Conservancy was created in 2004 to help preserve the of parkland.
Olmsted's work in Buffalo inspired similar efforts in cities such as San Francisco, Chicago, and Boston. The city's Division of Parks and Recreation manages over 180 parks and facilities, seven recreational centers, twenty-one pools and splash pads, and three ice rinks. Delaware Park features the Buffalo Zoo, Hoyt Lake, a golf course, and playing fields. Buffalo collaborated with its sister city Kanazawa to create the park's Japanese Garden in 1970, where cherry blossoms bloom in the spring. Shakespeare in Delaware Park has been held every year since 1976, attracting over forty thousand visitors from across the U.S. Opening in 1976, Tifft Nature Preserve in South Buffalo is on of remediated industrial land. The preserve is an Important Bird Area, including a meadow with trails for hiking and cross-country skiing, marshland and fishing. The Olmsted-designed Cazenovia and South Parks, the latter home to the Buffalo and Erie County Botanical Gardens, are also in South Buffalo. According to the Trust for Public Land, Buffalo had high marks for access to parks, with 89 percent of city residents living within a ten-minute walk of a park. The city ranked lower in acreage, however; nine percent of city land is devoted to parks, compared with the national median of about fifteen percent. Efforts to convert Buffalo's former industrial waterfront into recreational space have attracted national attention, with some writers comparing its appeal to that of Niagara Falls. Redevelopment of the waterfront began in the early 2000s, with the reconstruction of historically-aligned canals on the site of the former Buffalo Memorial Auditorium. Placemaking initiatives would lead to the area's popularity. After the Civil War, canal traffic began to drop as railroads expanded into Buffalo. Unionization began to take hold in the late 19th century, highlighted by railroad strikes in 1877 and 1892.
Steel, challenges and the modern era At the start of the 20th century, Buffalo was the world's leading grain port and a national flour-milling hub. Local mills were among the first to benefit from hydroelectricity generated by the Niagara River. Buffalo hosted the 1901 Pan-American Exposition after the Spanish–American War, showcasing the nation's advances in art, architecture, and electricity. Its centerpiece was the Electric Tower, with over two million light bulbs, but some exhibits were jingoistic and racially charged. At the exposition, President William McKinley was assassinated by anarchist Leon Czolgosz. When McKinley died, Theodore Roosevelt was sworn in at the Wilcox Mansion in Buffalo. Attorney John Milburn and local industrialists convinced the Lackawanna Iron and Steel Company to relocate from Scranton, Pennsylvania to the town of West Seneca in 1904. Employment was competitive, with many Eastern Europeans and Scrantonians vying for jobs. From the late 19th century to the 1920s, mergers and acquisitions led to distant ownership of local companies; this had a negative effect on the city's economy. Examples include the acquisition of Lackawanna Steel by Bethlehem Steel and, later, the relocation of Curtiss-Wright in the 1940s. The Great Depression saw severe unemployment, especially among the working class. New Deal relief programs operated in full force, and the city became a stronghold of labor unions and the Democratic Party. During World War II, Buffalo regained its manufacturing strength as military contracts enabled the city to manufacture steel, chemicals, aircraft, trucks and ammunition. In 1950, when Buffalo was the 15th-most-populous US city, its economy relied almost entirely on manufacturing; eighty percent of area jobs were in the sector. The city also had over a dozen railway terminals, as railroads remained a significant industry. The St.
Lawrence Seaway was proposed in the 19th century as a faster shipping route to Europe, and later as part of a bi-national hydroelectric project with Canada. Its combination with an expanded Welland Canal led to a grim outlook for Buffalo's economy. After its 1959 opening, the city's port and barge canal became largely irrelevant. Shipbuilding in Buffalo wound down in the 1960s due to reduced waterfront activity, ending an industry which had been part of the city's economy since 1812. Downsizing of the steel mills was attributed to the threat of higher wages and unionization efforts. Racial tensions culminated in riots in 1967. Suburbanization led to the selection of the town of Amherst for the new University at Buffalo campus by 1970. Unwilling to modernize its plant, Bethlehem Steel began cutting thousands of jobs in Lackawanna during the mid-1970s before closing it in 1983. The region lost at least 70,000 jobs between 1970 and 1984. Like much of the Rust Belt, Buffalo has focused on recovering from the effects of late-20th-century deindustrialization. Geography Topography Buffalo is on the eastern end of Lake Erie opposite Fort Erie, Ontario. It is at the head of the Niagara River, which flows north over Niagara Falls into Lake Ontario. The Buffalo metropolitan area is on the Erie/Ontario Lake Plain of the Eastern Great Lakes Lowlands, a narrow plain extending east to Utica, New York. The city is generally flat, except for elevation changes in the University Heights and Fruit Belt neighborhoods. The Southtowns are hillier, leading to the Cattaraugus Hills in the Appalachian Upland. Several types of shale, limestone and lagerstätten are prevalent in Buffalo and its surrounding area, lining their stream beds. Although the city has not experienced any recent or significant earthquakes, Buffalo is in the Southern Great Lakes Seismic Zone (part of the Great Lakes tectonic zone). 
Buffalo has four channels within its boundaries: the Niagara River, Buffalo River (and Creek), Scajaquada Creek, and the Black Rock Canal, adjacent to the Niagara River. The city's Bureau of Forestry maintains a database of over seventy thousand trees. According to the United States Census Bureau, 22.66 percent of the city's total area is water, and the rest is land. In 2010, its population density was 6,470.6 inhabitants per square mile. Cityscape Buffalo's architecture is diverse, with a collection of 19th- and 20th-century buildings. Downtown Buffalo landmarks include Louis Sullivan's Guaranty Building, an early skyscraper; the Ellicott Square Building, once one of the largest of its kind in the world; the Art Deco Buffalo City Hall; the McKinley Monument; and the Electric Tower. Beyond downtown, the Buffalo Central Terminal was built in the Broadway-Fillmore neighborhood in 1929; the Richardson Olmsted Complex, built in 1881, was an insane asylum until its closure in the 1970s. Urban renewal from the 1950s to the 1970s spawned the Brutalist-style Buffalo City Court Building and Seneca One Tower, the city's tallest building. In the city's Parkside neighborhood, the Darwin D. Martin House was designed by Frank Lloyd Wright in his Prairie School style. Since 2016, Washington, D.C., real estate developer Douglas Jemal has been acquiring and redeveloping iconic properties throughout the city. Neighborhoods According to Mark Goldman, the city has a "tradition of separate and independent settlements." The boundaries of Buffalo's neighborhoods have changed over time. The city is divided into five districts, each containing several neighborhoods, for a total of thirty-five neighborhoods. Main Street divides Buffalo's east and west sides, and the west side was fully developed earlier.
This division is seen in architectural styles, street names, neighborhood and district boundaries, demographics, and socioeconomic conditions; Buffalo's West Side is generally more affluent than its East Side. Several neighborhoods in Buffalo have seen increased investment since the 1990s, beginning with the Elmwood Village. The 2002 redevelopment of the Larkin Terminal Warehouse led to the creation of Larkinville, home to several mixed-use projects and anchored by corporate offices. Downtown Buffalo and its central business district (CBD) had a 10.6-percent increase in residents from 2010 to 2017, as over 1,061 housing units became available; the Seneca One Tower was redeveloped in 2020. Other revitalized areas include Chandler Street, in the Grant-Amherst neighborhood, and Hertel Avenue in Parkside. The Buffalo Common Council adopted its Green Code in 2017, replacing zoning regulations which were over sixty years old. Its emphasis on regulations promoting pedestrian safety and mixed land use received an award at the 2019 Congress for the New Urbanism conference. Climate Buffalo has a humid continental climate (Köppen Dfb bordering on Dfa, common in the Great Lakes region), and temperatures have been warming with the rest of the US. Lake-effect snow is characteristic of Buffalo winters, with snow bands (producing intense snowfall in the city and surrounding area) depending on wind direction off Lake Erie. However, Buffalo is rarely the snowiest city in the state. The Blizzard of 1977 resulted from a combination of high winds and snow which had accumulated on land and on the frozen Lake Erie. Although snow does not typically impair the city's operation, it can cause significant damage in autumn (as the October 2006 storm did). In November 2014, a storm nicknamed "Snowvember" brought record-breaking snowfall to the region. Buffalo's lowest recorded temperature occurred twice: on February 9, 1934, and February 2, 1961.
Although the city's summers are drier and sunnier than those of other cities in the northeastern United States, its vegetation receives enough precipitation to remain hydrated. Buffalo summers are characterized by abundant sunshine, with moderate humidity and temperatures; the city benefits from cool, southwestern Lake Erie summer breezes which temper warmer temperatures. Unusually hot temperatures occur an average of only three times a year. The city's official maximum temperature of 99 °F was reached on August 27, 1948. Rainfall is moderate, typically falling at night, and cooler lake temperatures hinder storm development in July. August is usually rainier and muggier, as the warmer lake loses its temperature-controlling ability. Demographics Several hundred Seneca, Tuscarora and other Iroquois tribal peoples were the primary residents of the Buffalo area before 1800, concentrated along Buffalo Creek. After the Revolutionary War, settlers from New England and eastern New York began to move into the area. From the 1830s to the 1850s, they were joined by Irish and German immigrants from Europe, both peasants and working class, who settled in enclaves on the city's south and east sides. At the turn of the 20th century, Polish immigrants replaced the Germans, who moved to newer housing, on the East Side; Italian immigrant families settled throughout the city, primarily on the lower West Side. During the 1830s, Buffalo residents were generally intolerant of the small groups of Black Americans who began settling on the city's East Side. In the 20th century, wartime and manufacturing jobs attracted Black Americans from the South during the First and Second Great Migrations. In the World War II and postwar years from 1940 to 1970, the city's Black population rose by 433 percent. They replaced most of the Polish community on the East Side, who were moving out to the suburbs.
However, the effects of redlining, steering, social inequality, blockbusting, white flight and other racial policies resulted in the city (and region) becoming one of the most segregated in the U.S. During the 1940s and 1950s, Puerto Rican migrants arrived en masse, also seeking industrial jobs, settling on the East Side and moving westward. In the 21st century, Buffalo is classified as a majority-minority city, with a plurality of residents who are Black and Latino. Buffalo has mitigated the effects of urban decay since the 1970s, including population losses to the suburbs and Sun Belt states, and job losses from deindustrialization. The city's population peaked at 580,132 in 1950, when Buffalo was the 15th-largest city in the United States (down from the eighth-largest in 1900), after its growth rate slowed during the 1920s. Buffalo's population began declining in the second half of the 20th century, due to suburbanization and loss of industrial jobs, and began stabilizing during the 2010s. The city had a population of 261,310 in the 2010 census, which increased to 278,349 residents in the 2020 census, making it the 76th-largest city in the United States. Its metropolitan area had 1.1 million residents in 2020, the country's 49th-largest. Compared to other major US metropolitan areas, the number of foreign-born immigrants to Buffalo is low. New immigrants are primarily resettled refugees (especially from war- or disaster-afflicted nations) and refugees who had previously settled in other U.S. cities. During the early 2000s, most immigrants came from Canada and Yemen; this shifted in the 2010s to Burmese (Karen) refugees and Indian immigrants. Between 2008 and 2016, Burmese, Somali, Bhutanese, and Iraqi Americans were the four largest ethnic immigrant groups in Erie County. Poverty has remained an issue for the city; in 2019, an estimated 30.1 percent of individuals and 24.8 percent of families lived below the federal poverty line.
Per capita income was $24,400 and household income was $37,354, both much lower than the national averages. A 2008 report noted that although Buffalo lacked the food deserts seen in larger cities, the city's neighborhoods of color had access only to smaller grocery stores and lacked the supermarkets more typical of newer, white neighborhoods. A 2018 report noted that over fifty city blocks on Buffalo's East Side lacked adequate access to a supermarket. Health disparities exist compared to the rest of the state: Erie County's average 2019 lifespan of 78.4 years was three years lower, and its 17-percent smoking and 30-percent obesity rates were slightly higher than the state averages. According to the Partnership for the Public Good, educational achievement in the city is lower than in the surrounding area; city residents are almost twice as likely as adults in the metropolitan area to lack a high-school diploma. Religion During the early 19th century, Presbyterian missionaries tried to convert the Seneca people on the Buffalo Creek Reservation to Christianity. Initially resistant, some tribal members set aside their traditions and practices to form their own sect. Later, European immigrants added other faiths. Christianity is the predominant religion in Buffalo and Western New York. Catholicism (primarily the Latin Church) has a significant presence in the region, with 161 parishes and over 570,000 adherents in the Diocese of Buffalo. A Jewish community began developing in the city with immigrants from the mid-1800s; about one thousand German and Lithuanian Jews settled in Buffalo before 1880. Buffalo's first synagogue, Temple Beth El, was established in 1847. The city's Temple Beth Zion is the region's largest synagogue. With changing demographics and an increased number of refugees from other areas on the city's East Side, Islam and Buddhism have expanded their presence. In this area, new residents have converted empty churches into mosques and temples.
Hinduism maintains a small, active presence in the area, including the town of Amherst. A 2016 American Bible Society survey reported that Buffalo is the fifth-least "Bible-minded" city in the United States; 13 percent of its residents associate with the Bible. Economy The Erie Canal was the impetus for Buffalo's economic growth as a transshipment hub for grain and other agricultural products headed east from the Midwest. Later, manufacturing of steel and automotive parts became central to the city's economy. When these industries downsized in the region, Buffalo's economy became service-based. Its primary sectors include health care, business services (banking, accounting, and insurance), retail, tourism and logistics, especially with Canada. Despite the loss of large-scale manufacturing, some manufacturing of metals, chemicals, machinery, food products, and electronics remains in the region. Advanced manufacturing has increased, with an emphasis on research and development (R&D) and automation. In 2019, the U.S. Bureau of Economic Analysis valued the gross domestic product (GDP) of the Buffalo–Niagara Falls MSA at $53 billion. The civic sector is a major source of employment in the Buffalo area, and includes public, non-profit, healthcare and educational institutions. New York State, with over 19,000 employees, is the region's largest employer. In the private sector, top employers include the Kaleida Health and Catholic Health hospital networks and M&T Bank, the sole Fortune 500 company headquartered in the city. Most have been the top employers in the region for several decades. Buffalo is home to the headquarters of Rich Products, Delaware North and New Era Cap Company; the aerospace manufacturer Moog Inc. is based in nearby East Aurora. Buffalo weathered the Great Recession of 2007–09 well in comparison with other U.S. cities, exemplified by increased home prices during this time. The region's economy subsequently began to improve.
Franklin printed a new currency for New Jersey based on innovative anti-counterfeiting techniques he had devised. Throughout his career, Franklin was an advocate for paper money, publishing A Modest Enquiry into the Nature and Necessity of a Paper Currency in 1729, and his printing shop printed the money. He was influential in the more restrained and thus successful monetary experiments in the Middle Colonies, which stopped deflation without causing excessive inflation. In 1766, he made a case for paper money to the British House of Commons. As he matured, Franklin began to concern himself more with public affairs. In 1743, he first devised a scheme for the Academy, Charity School, and College of Philadelphia. However, the person he had in mind to run the academy, Rev. Richard Peters, refused, and Franklin put his ideas away until 1749, when he printed his own pamphlet, Proposals Relating to the Education of Youth in Pensilvania. He was appointed president of the Academy on November 13, 1749; the academy and the charity school opened in 1751. In 1743, Franklin founded the American Philosophical Society to help scientific men discuss their discoveries and theories. He began the electrical research that, along with other scientific inquiries, would occupy him for the rest of his life, in between bouts of politics and moneymaking. During King George's War, Franklin raised a militia called the Association for General Defense, because the legislators of the city decided to take no action to defend Philadelphia "either by erecting fortifications or building Ships of War". He raised money to create earthwork defenses and buy artillery. The largest of these was the "Association Battery" or "Grand Battery" of 50 guns. In 1747, Franklin (already a very wealthy man) retired from printing and went into other businesses. He created a partnership with his foreman, David Hall, which provided Franklin with half of the shop's profits for 18 years.
This lucrative business arrangement provided leisure time for study, and in a few years he had made many new discoveries. Franklin became involved in Philadelphia politics and rapidly progressed. In October 1748, he was selected as a councilman; in June 1749, he became a justice of the peace for Philadelphia; and in 1751, he was elected to the Pennsylvania Assembly. On August 10, 1753, Franklin was appointed deputy postmaster-general of British North America. His most notable service in domestic politics was his reform of the postal system, with mail sent out every week. In 1751, Franklin and Thomas Bond obtained a charter from the Pennsylvania legislature to establish a hospital. Pennsylvania Hospital was the first hospital in the colonies. In 1752, Franklin organized the Philadelphia Contributionship, the Colonies' first homeowner's insurance company. Between 1750 and 1753, the "educational triumvirate" of Benjamin Franklin, Samuel Johnson of Stratford, Connecticut, and schoolteacher William Smith built on Franklin's initial scheme and created what Bishop James Madison, president of the College of William & Mary, called a "new-model" plan or style of American college. Franklin solicited, printed in 1752, and promoted an American textbook of moral philosophy by Samuel Johnson, titled Elementa Philosophica, to be taught in the new colleges. In June 1753, Johnson, Franklin, and Smith met in Stratford. They decided the new-model college would focus on the professions, with classes taught in English instead of Latin, have subject matter experts as professors instead of one tutor leading a class for four years, and there would be no religious test for admission. Johnson went on to found King's College (now Columbia University) in New York City in 1754, while Franklin hired Smith as provost of the College of Philadelphia, which opened in 1755. At its first commencement, on May 17, 1757, seven men graduated; six with a Bachelor of Arts and one as Master of Arts. 
It was later merged with the University of the State of Pennsylvania to become the University of Pennsylvania. The college was to become influential in guiding the founding documents of the United States: in the Continental Congress, for example, over one-third of the men who contributed to the Declaration of Independence between September 4, 1774, and July 4, 1776, were affiliated with the college. In 1754, he headed the Pennsylvania delegation to the Albany Congress. This meeting of several colonies had been requested by the Board of Trade in England to improve relations with the Indians and defense against the French. Franklin proposed a broad Plan of Union for the colonies. While the plan was not adopted, elements of it found their way into the Articles of Confederation and the Constitution. In 1753, both Harvard and Yale awarded him honorary master of arts degrees. In 1756, Franklin received an honorary Master of Arts degree from the College of William & Mary. Later in 1756, Franklin organized the Pennsylvania Militia. He used Tun Tavern as a gathering place to recruit a regiment of soldiers to go into battle against the Native American uprisings that beset the American colonies. Postmaster Well known as a printer and publisher, Franklin was appointed postmaster of Philadelphia in 1737, holding the office until 1753, when he and publisher William Hunter were named deputy postmasters-general of British North America, the first to hold the office. (Joint appointments were standard at the time, for political reasons.) Franklin was responsible for the British colonies from Pennsylvania north and east, as far as the island of Newfoundland. A post office for local and outgoing mail had been established in Halifax, Nova Scotia, by local stationer Benjamin Leigh, on April 23, 1754, but service was irregular. Franklin opened the first post office to offer regular, monthly mail in Halifax on December 9, 1755.
Meantime, Hunter became postal administrator in Williamsburg, Virginia, and oversaw areas south of Annapolis, Maryland. Franklin reorganized the service's accounting system and improved speed of delivery between Philadelphia, New York and Boston. By 1761, efficiencies led to the first profits for the colonial post office. When the lands of New France were ceded to the British under the Treaty of Paris in 1763, the British province of Quebec was created among them, and Franklin saw mail service expanded between Montreal, Trois-Rivières, Quebec City, and New York. For the greater part of his appointment, Franklin lived in England (from 1757 to 1762, and again from 1764 to 1774)—about three-quarters of his term. Eventually, his sympathies for the rebel cause in the American Revolution led to his dismissal on January 31, 1774. On July 26, 1775, the Second Continental Congress established the United States Post Office and named Franklin as the first United States postmaster general. Franklin had been a postmaster for decades and was a natural choice for the position. He had just returned from England and was appointed chairman of a Committee of Investigation to establish a postal system. The report of the committee, providing for the appointment of a postmaster general for the 13 American colonies, was considered by the Continental Congress on July 25 and 26. On July 26, 1775, Franklin was appointed postmaster general, the first appointed under the Continental Congress. His apprentice, William Goddard, felt that his ideas were mostly responsible for shaping the postal system and that the appointment should have gone to him, but he graciously conceded it to Franklin, 36 years his senior. Franklin, however, appointed Goddard as Surveyor of the Posts, issued him a signed pass, and directed him to investigate and inspect the various post offices and mail routes as he saw fit. 
The newly established postal system became the United States Post Office, a system that continues to operate today. Decades in London From the mid-1750s to the mid-1770s, Franklin spent much of his time in London. Political work In 1757, he was sent to England by the Pennsylvania Assembly as a colonial agent to protest against the political influence of the Penn family, the proprietors of the colony. He remained there for five years, striving to end the proprietors' prerogative to overturn legislation from the elected Assembly and their exemption from paying taxes on their land. His lack of influential allies in Whitehall led to the failure of this mission. At this time, many members of the Pennsylvania Assembly were feuding with William Penn's heirs who controlled the colony as proprietors. After his return to the colony, Franklin led the "anti-proprietary party" in the struggle against the Penn family and was elected Speaker of the Pennsylvania House in May 1764. His call for a change from proprietary to royal government was a rare political miscalculation, however: Pennsylvanians worried that such a move would endanger their political and religious freedoms. Because of these fears and because of political attacks on his character, Franklin lost his seat in the October 1764 Assembly elections. The anti-proprietary party dispatched Franklin to England again to continue the struggle against the Penn family proprietorship. During this trip, events drastically changed the nature of his mission. In London, Franklin opposed the 1765 Stamp Act. Unable to prevent its passage, he made another political miscalculation and recommended a friend to the post of stamp distributor for Pennsylvania. Pennsylvanians were outraged, believing that he had supported the measure all along, and threatened to destroy his home in Philadelphia. 
Franklin soon learned of the extent of colonial resistance to the Stamp Act, and he testified during the House of Commons proceedings that led to its repeal. With this, Franklin suddenly emerged as the leading spokesman for American interests in England. He wrote popular essays on behalf of the colonies. Georgia, New Jersey, and Massachusetts also appointed him as their agent to the Crown. During his lengthy missions to London between 1757 and 1775, Franklin lodged in a house on Craven Street, just off The Strand in central London. During his stays there, he developed a close friendship with his landlady, Margaret Stevenson, and her circle of friends and relations, in particular, her daughter Mary, who was more often known as Polly. The house is now operating as a museum known as the Benjamin Franklin House. Whilst in London, Franklin became involved in radical politics. He belonged to a gentlemen's club (which he called "the honest Whigs"), which held stated meetings, and included members such as Richard Price, the minister of Newington Green Unitarian Church who ignited the Revolution controversy, and Andrew Kippis. Scientific work In 1756, Franklin had become a member of the Society for the Encouragement of Arts, Manufactures & Commerce (now the Royal Society of Arts), which had been founded in 1754. After his return to the United States in 1775, Franklin became the Society's Corresponding Member, continuing a close connection. The Royal Society of Arts instituted a Benjamin Franklin Medal in 1956 to commemorate the 250th anniversary of his birth and the 200th anniversary of his membership of the RSA. The study of natural philosophy (referred to today as science in general) drew him into overlapping circles of acquaintance. Franklin was, for example, a corresponding member of the Lunar Society of Birmingham. In 1759, the University of St Andrews awarded Franklin an honorary doctorate in recognition of his accomplishments.
In October 1759, he was granted Freedom of the Borough of St Andrews. He was also awarded an honorary doctorate by Oxford University in 1762. Because of these honors, Franklin was often addressed as "Doctor Franklin." While living in London in 1768, he developed a phonetic alphabet in A Scheme for a new Alphabet and a Reformed Mode of Spelling. This reformed alphabet discarded six letters Franklin regarded as redundant (c, j, q, w, x, and y), and substituted six new letters for sounds he felt lacked letters of their own. This alphabet never caught on, and he eventually lost interest. Travels around Europe Franklin used London as a base to travel. In 1771, he made short journeys through different parts of England, staying with Joseph Priestley at Leeds, Thomas Percival at Manchester and Erasmus Darwin at Lichfield. In Scotland, he spent five days with Lord Kames near Stirling and stayed for three weeks with David Hume in Edinburgh. In 1759, he visited Edinburgh with his son and later reported that he considered his six weeks in Scotland "six weeks of the densest happiness I have met with in any part of my life". In Ireland, he stayed with Lord Hillsborough. Franklin noted of him that "all the plausible behaviour I have described is meant only, by patting and stroking the horse, to make him more patient, while the reins are drawn tighter, and the spurs set deeper into his sides." In Dublin, Franklin was invited to sit with the members of the Irish Parliament rather than in the gallery. He was the first American to receive this honor. While touring Ireland, he was deeply moved by the level of poverty he witnessed. The economy of the Kingdom of Ireland was affected by the same trade regulations and laws that governed the Thirteen Colonies. Franklin feared that the American colonies could eventually come to the same level of poverty if the regulations and laws continued to apply to them.
Franklin spent two months in German lands in 1766, but his connections to the country stretched across a lifetime. He declared a debt of gratitude to German scientist Otto von Guericke for his early studies of electricity. Franklin also co-authored the first treaty of friendship between Prussia and America in 1785. In September 1767, Franklin visited Paris with his usual traveling partner, Sir John Pringle, 1st Baronet. News of his electrical discoveries was widespread in France. His reputation meant that he was introduced to many influential scientists and politicians, and also to King Louis XV. Defending the American cause One line of argument in Parliament was that Americans should pay a share of the costs of the French and Indian War and therefore taxes should be levied on them. Franklin became the American spokesman in highly publicized testimony in Parliament in 1766. He stated that Americans already contributed heavily to the defense of the Empire. He said local governments had raised, outfitted and paid 25,000 soldiers to fight France—as many as Britain itself sent—and spent many millions from American treasuries doing so in the French and Indian War alone. In 1772, Franklin obtained private letters of Thomas Hutchinson and Andrew Oliver, governor and lieutenant governor of the Province of Massachusetts Bay, proving that they had encouraged the Crown to crack down on Bostonians. Franklin sent them to America, where they escalated the tensions. The letters were finally leaked to the public in the Boston Gazette in mid-June 1773, causing a political firestorm in Massachusetts and raising significant questions in England. The British began to regard him as the fomenter of serious trouble. Hopes for a peaceful solution ended as he was systematically ridiculed and humiliated by Solicitor-General Alexander Wedderburn, before the Privy Council on January 29, 1774. He returned to Philadelphia in March 1775, and abandoned his accommodationist stance. 
In 1773, Franklin published two of his most celebrated pro-American satirical essays: "Rules by Which a Great Empire May Be Reduced to a Small One", and "An Edict by the King of Prussia". Alleged British agent and Hellfire Club membership Franklin is known to have occasionally attended the Hellfire Club's meetings during 1758 as a non-member during his time in England. However, some authors and historians argue that Franklin was in fact a British spy. Because the club's records were burned in 1774, many of its members can only be assumed or linked through letters they sent to each other. One early proponent of the claim that Franklin was a member of the Hellfire Club and a double agent was the historian Donald McCormick, who has a history of making controversial claims. Coming of revolution In 1763, soon after Franklin returned to Pennsylvania from England for the first time, the western frontier was engulfed in a bitter war known as Pontiac's Rebellion. The Paxton Boys, a group of settlers convinced that the Pennsylvania government was not doing enough to protect them from American Indian raids, murdered a group of peaceful Susquehannock Indians and marched on Philadelphia. Franklin helped to organize a local militia to defend the capital against the mob. He met with the Paxton leaders and persuaded them to disperse. Franklin wrote a scathing attack against the racial prejudice of the Paxton Boys. "If an Indian injures me", he asked, "does it follow that I may revenge that Injury on all Indians?" He provided an early response to British surveillance through his own network of counter-surveillance and manipulation. "He waged a public relations campaign, secured secret aid, played a role in privateering expeditions, and churned out effective and inflammatory propaganda."
Declaration of Independence By the time Franklin arrived in Philadelphia on May 5, 1775, after his second mission to Great Britain, the American Revolution had begun—with skirmishes breaking out between colonials and British at Lexington and Concord. The New England militia had forced the main British army to remain inside Boston. The Pennsylvania Assembly unanimously chose Franklin as their delegate to the Second Continental Congress. In June 1776, Franklin was appointed a member of the Committee of Five that drafted the Declaration of Independence. Although he was temporarily disabled by gout and unable to attend most meetings of the committee, Franklin made several "small but important" changes to the draft sent to him by Thomas Jefferson. At the signing, he is quoted as having replied to a comment by John Hancock that they must all hang together: "Yes, we must, indeed, all hang together, or most assuredly we shall all hang separately." Ambassador to France (1776–1785) In December 1776, Franklin was dispatched to France as commissioner for the United States. He took with him as secretary his 16-year-old grandson, William Temple Franklin. They lived in a home in the Parisian suburb of Passy, donated by Jacques-Donatien Le Ray de Chaumont, who supported the United States. Franklin remained in France until 1785. He conducted the affairs of his country toward the French nation with great success, which included securing a critical military alliance in 1778 and signing the 1783 Treaty of Paris. Among his associates in France was Honoré Gabriel Riqueti, comte de Mirabeau—a French Revolutionary writer, orator and statesman who in 1791 was elected president of the National Assembly. In July 1784, Franklin met with Mirabeau and contributed anonymous materials that the Frenchman used in his first signed work: Considerations sur l'ordre de Cincinnatus. The publication was critical of the Society of the Cincinnati, established in the United States. 
Franklin and Mirabeau thought of it as a "noble order", inconsistent with the egalitarian ideals of the new republic. During his stay in France, Franklin was active as a Freemason, serving as venerable master of the lodge Les Neuf Sœurs from 1779 until 1781. In 1784, when Franz Mesmer began to publicize his theory of "animal magnetism", which was considered offensive by many, Louis XVI appointed a commission to investigate it. Its members included the chemist Antoine Lavoisier, the physician Joseph-Ignace Guillotin, the astronomer Jean Sylvain Bailly, and Franklin. Through blind trials, the commission concluded that mesmerism seemed to work only when the subjects expected it to; this discredited mesmerism and became the first major demonstration of the placebo effect, which was described at that time as "imagination." In 1781, he was elected a fellow of the American Academy of Arts and Sciences. Franklin's advocacy for religious tolerance in France contributed to arguments made by French philosophers and politicians that resulted in Louis XVI's signing of the Edict of Versailles in November 1787. This edict effectively nullified the Edict of Fontainebleau, which had denied non-Catholics civil status and the right to openly practice their faith. Franklin also served as American minister to Sweden, although he never visited that country. He negotiated a treaty that was signed in April 1783. On August 27, 1783, in Paris, Franklin witnessed the world's first hydrogen balloon flight. Le Globe, created by professor Jacques Charles and Les Frères Robert, was watched by a vast crowd as it rose from the Champ de Mars (now the site of the Eiffel Tower). Franklin became so enthusiastic that he subscribed financially to the next project to build a manned hydrogen balloon. On December 1, 1783, Franklin was seated in the special enclosure for honored guests when La Charlière took off from the Jardin des Tuileries, piloted by Charles and Nicolas-Louis Robert.
Return to America When he returned home in 1785, Franklin occupied a position second only to that of George Washington as the champion of American independence. Franklin returned from France with an unexplained shortage of 100,000 pounds in Congressional funds. In response to a question from a member of Congress about this, Franklin, quoting the Bible, quipped: "Muzzle not the ox that treadeth out his master's grain." The missing funds were never again mentioned in Congress. In 1787, Franklin served as a delegate to the Philadelphia Convention. He held an honorary position and seldom engaged in debate. Le Ray honored him with a commissioned portrait painted by Joseph Duplessis, which now hangs in the National Portrait Gallery of the Smithsonian Institution in Washington, D.C. After his return, Franklin became an abolitionist and freed his two slaves. He eventually became president of the Pennsylvania Abolition Society. Special balloting conducted October 18, 1785, unanimously elected Franklin the sixth president of the Supreme Executive Council of Pennsylvania, replacing John Dickinson. The office was practically that of governor. Franklin held that office for slightly over three years, longer than any other holder, and served the constitutional limit of three full terms. Shortly after his initial election, he was re-elected to a full term on October 29, 1785, and again in the fall of 1786 and on October 31, 1787. In that capacity he served as host to the Constitutional Convention of 1787 in Philadelphia. Death Franklin suffered from obesity throughout his middle-aged and later years, which resulted in multiple health problems, particularly gout, which worsened as he aged. In poor health during the signing of the US Constitution in 1787, he was rarely seen in public from then until his death. Benjamin Franklin died from a pleuritic attack at his home in Philadelphia on April 17, 1790, at the age of 84.
His last words were reportedly "a dying man can do nothing easy", spoken to his daughter after she suggested that he change position in bed and lie on his side so he could breathe more easily. Franklin's death is described in the book The Life of Benjamin Franklin, which quotes from the account of John Jones. Approximately 20,000 people attended his funeral. He was interred in Christ Church Burial Ground in Philadelphia. Upon learning of his death, the Constitutional Assembly in Revolutionary France entered into a state of mourning for a period of three days, and memorial services were conducted in honor of Franklin throughout the country. In 1728, aged 22, Franklin wrote what he hoped would be his own epitaph. Franklin's actual grave, however, as he specified in his final will, simply reads "Benjamin and Deborah Franklin". Inventions and scientific inquiries Franklin was a prodigious inventor. Among his many creations were the lightning rod, Franklin stove, bifocal glasses and the flexible urinary catheter. Franklin never patented his inventions; in his autobiography he wrote, "... as we enjoy great advantages from the inventions of others, we should be glad of an opportunity to serve others by any invention of ours; and this we should do freely and generously." Electricity Franklin started exploring the phenomenon of electricity in 1746 when he saw some of Archibald Spencer's lectures using static electricity for illustrations. Franklin proposed that "vitreous" and "resinous" electricity were not different types of "electrical fluid" (as electricity was called then), but the same "fluid" under different pressures. (The same proposal was made independently that same year by William Watson.) Franklin was the first to label them as positive and negative respectively, and he was the first to discover the principle of conservation of charge.
In 1748, he constructed a multiple plate capacitor, which he called an "electrical battery" (not a true battery like Volta's pile), by placing eleven panes of glass sandwiched between lead plates, suspended with silk cords and connected by wires. In pursuit of more pragmatic uses for electricity, remarking in spring 1749 that he felt "chagrin'd a little" that his experiments had heretofore resulted in "Nothing in this Way of Use to Mankind," Franklin planned a practical demonstration. He proposed a dinner party where a turkey was to be killed with electric shock and roasted on an electrical spit. After having prepared several turkeys this way, Franklin noted that "the birds kill'd in this manner eat uncommonly tender." Franklin recounted that in the process of one of these experiments, he was shocked by a pair of Leyden jars, resulting in numbness in his arms that persisted for one evening, noting "I am Ashamed to have been Guilty of so Notorious a Blunder." Franklin briefly investigated electrotherapy, including the use of the electric bath. This work led to the field becoming widely known. In recognition of his work with electricity, Franklin received the Royal Society's Copley Medal in 1753, and in 1756, he became one of the few 18th-century Americans elected as a fellow of the Society. The CGS unit of electric charge has been named after him: one franklin (Fr) is equal to one statcoulomb. Franklin advised Harvard University in its acquisition of new electrical laboratory apparatus after the complete loss of its original collection in a fire that destroyed the original Harvard Hall in 1764. The collection he assembled later became part of the Harvard Collection of Historical Scientific Instruments, now on public display in its Science Center. Kite experiment and lightning rod Franklin published a proposal for an experiment to prove that lightning is electricity by flying a kite in a storm.
On May 10, 1752, Thomas-François Dalibard of France conducted Franklin's experiment using an iron rod instead of a kite, and he extracted electrical sparks from a cloud. On June 15, 1752, Franklin may have conducted his well-known kite experiment in Philadelphia, successfully extracting sparks from a cloud. Franklin described the experiment in his newspaper, The Pennsylvania Gazette, on October 19, 1752, without mentioning that he himself had performed it. This account was read to the Royal Society on December 21 and printed as such in the Philosophical Transactions. Joseph Priestley published an account with additional details in his 1767 History and Present Status of Electricity. Franklin was careful to stand on an insulator, keeping dry under a roof to avoid the danger of electric shock. Others, such as Georg Wilhelm Richmann in Russia, were indeed electrocuted in performing lightning experiments during the months immediately following Franklin's experiment. In his writings, Franklin indicates that he was aware of the dangers and offered alternative ways to demonstrate that lightning was electrical, as shown by his use of the concept of electrical ground. Franklin did not perform this experiment in the way that is often pictured in popular literature, flying the kite and waiting to be struck by lightning, as that would have been dangerous. Instead he used the kite to collect some electric charge from a storm cloud, showing that lightning was electrical. On October 19, 1752, Franklin also sent a letter to England with directions for repeating the experiment. Franklin's electrical experiments led to his invention of the lightning rod. He said that conductors with a sharp rather than a smooth point could discharge silently and at a far greater distance.
He surmised that this could help protect buildings from lightning by attaching "upright Rods of Iron, made sharp as a Needle and gilt to prevent Rusting, and from the Foot of those Rods a Wire down the outside of the Building into the Ground; ... Would not these pointed Rods probably draw the Electrical Fire silently out of a Cloud before it came nigh enough to strike, and thereby secure us from that most sudden and terrible Mischief!" Following a series of experiments on Franklin's own house, lightning rods were installed on the Academy of Philadelphia (later the University of Pennsylvania) and the Pennsylvania State House (later Independence Hall) in 1752. Population studies Franklin had a major influence on the emerging science of demography, or population studies. In the 1730s and 1740s, Franklin began taking notes on population growth, finding that the American population had the fastest growth rate on Earth. Emphasizing that population growth depended on food supplies, he stressed the abundance of food and available farmland in America. He calculated that America's population was doubling every 20 years and would surpass that of England in a century. In 1751, he drafted Observations concerning the Increase of Mankind, Peopling of Countries, etc. Four years later, it was anonymously printed in Boston, and it was quickly reproduced in Britain, where it influenced the economist Adam Smith and later the demographer Thomas Malthus, who credited Franklin for discovering a rule of population growth. Franklin's prediction that British mercantilism was unsustainable alarmed British leaders who did not want to be surpassed by the colonies, so they became more willing to impose restrictions on the colonial economy.
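Franklin's estimate amounts to simple exponential growth: doubling every 20 years implies five doublings, and thus a 32-fold increase, over a century. A minimal sketch of the arithmetic (the function name and sample figures are illustrative, not Franklin's):

```python
def projected_population(initial, years, doubling_period=20.0):
    """Exponential growth with a fixed doubling period,
    following Franklin's doubling-every-20-years estimate."""
    return initial * 2 ** (years / doubling_period)

# Five doublings in a century give a 32-fold increase.
print(projected_population(1, 100))  # 32.0
```

Under this rule, a population growing on a 20-year doubling period eventually overtakes any larger but slower-growing one, which is the arithmetic behind Franklin's century forecast.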
Kammen (1990) and Drake (2011) say Franklin's Observations concerning the Increase of Mankind (1755) stands alongside Ezra Stiles' "Discourse on Christian Union" (1760) as the leading works of 18th-century Anglo-American demography; Drake credits Franklin's "wide readership and prophetic insight." Franklin was also a pioneer in the study of slave demography, as shown in his 1755 essay. Franklin, in his capacity as a farmer, wrote at least one critique about the negative consequences of price controls, trade restrictions, and subsidy of the poor. This is succinctly preserved in his letter to the London Chronicle published November 29, 1766, titled 'On the Price of Corn, and Management of the poor'. Oceanography As deputy postmaster, Franklin became interested in North Atlantic Ocean circulation patterns. While in England in 1768, he heard a complaint from the Colonial Board of Customs: Why did it take British packet ships carrying mail several weeks longer to reach New York than it took an average merchant ship to reach Newport, Rhode Island? The merchantmen had a longer and more complex voyage because they left from London, while the packets left from Falmouth in Cornwall. Franklin put the question to his cousin Timothy Folger, a Nantucket whaler captain, who told him that merchant ships routinely avoided a strong eastbound mid-ocean current. The mail packet captains sailed dead into it, thus fighting the strong adverse current. Franklin worked with Folger and other experienced ship captains, learning enough to chart the current and name it the Gulf Stream, by which it is still known today. Franklin published his Gulf Stream chart in 1770 in England, where it was ignored. Subsequent versions were printed in France in 1778 and the U.S. in 1786.
The British edition of the chart, which was the original, was so thoroughly ignored that everyone assumed it was lost forever until Phil Richardson, a Woods Hole oceanographer and Gulf Stream expert, discovered it in the Bibliothèque Nationale in Paris in 1980. This find received front-page coverage in The New York Times. It took many years for British sea captains to adopt Franklin's advice on navigating the current; once they did, they were able to trim two weeks from their sailing time. In 1853, the oceanographer and cartographer Matthew Fontaine Maury noted that while Franklin charted and codified the Gulf Stream, he did not discover it. An aging Franklin accumulated all his oceanographic findings in Maritime Observations, published in the Philosophical Society's Transactions in 1786. It contained ideas for sea anchors, catamaran hulls, watertight compartments, shipboard lightning rods and a soup bowl designed to stay stable in stormy weather. Theories and experiments Franklin and his contemporary Leonhard Euler were the only major scientists who supported Christiaan Huygens's wave theory of light, which was basically ignored by the rest of the scientific community. In the 18th century, Isaac Newton's corpuscular theory was held to be true; only after Thomas Young's well-known slit experiment in 1803 were most scientists persuaded to believe Huygens's theory. On October 21, 1743, according to the popular myth, a storm moving from the southwest denied Franklin the opportunity of witnessing a lunar eclipse. Franklin was said to have noted that the prevailing winds were actually from the northeast, contrary to what he had expected. In correspondence with his brother, Franklin learned that the same storm had not reached Boston until after the eclipse, despite the fact that Boston is to the northeast of Philadelphia. He deduced that storms do not always travel in the direction of the prevailing wind, a concept that greatly influenced meteorology.
After the Icelandic volcanic eruption of Laki in 1783, and the subsequent harsh European winter of 1784, Franklin made observations suggesting a causal connection between these two separate events. He wrote about them in a lecture series. Though Franklin is most noted for using kites in his lightning experiments, he has also been noted by many for using kites to pull humans and ships across waterways. George Pocock, in the book A Treatise on The Aeropleustic Art, or Navigation in the Air, by means of Kites, or Buoyant Sails, noted being inspired by Benjamin Franklin's traction of his body by kite power across a waterway. Franklin noted a principle of refrigeration by observing that on a very hot day, he stayed cooler in a wet shirt in a breeze than he did in a dry one. To understand this phenomenon more clearly, Franklin conducted experiments. In 1758, on a warm day in Cambridge, England, Franklin and fellow scientist John Hadley experimented by continually wetting the ball of a mercury thermometer with ether and using bellows to evaporate the ether. With each subsequent evaporation, the thermometer read a lower temperature, eventually falling well below the freezing point of water, while another thermometer showed that the room temperature remained constant. In his letter Cooling by Evaporation, Franklin noted that "One may see the possibility of freezing a man to death on a warm summer's day." According to Michael Faraday, Franklin's experiments on the non-conduction of ice are worth mentioning, although the law of the general effect of liquefaction on electrolytes is not attributed to Franklin. However, as reported in 1836 by Prof. A. D. Bache of the University of Pennsylvania, the law of the effect of heat on the conduction of bodies otherwise non-conductors, for example, glass, could be attributed to Franklin. Franklin writes, "... A certain quantity of heat will make some bodies good conductors, that will not otherwise conduct ..." and again, "...
And water, though naturally a good conductor, will not conduct well when frozen into ice." While traveling on a ship, Franklin had observed that the wake of a ship was diminished when the cooks scuttled their greasy water. He studied the effects on a large pond in Clapham Common, London. "I fetched out a cruet of oil and dropt a little of it on the water ... though not more than a teaspoon full, produced an instant calm over a space of several yards square." He later used the trick to "calm the waters" by carrying "a little oil in the hollow joint of my cane". Decision-making In a 1772 letter to Joseph Priestley, Franklin lays out the earliest known description of the Pro & Con list, a common decision-making technique, now sometimes called a decisional balance sheet. Political, social, and religious views Like the other advocates of republicanism, Franklin emphasized that the new republic could survive only if the people were virtuous. All his life he explored the role of civic and personal virtue, as expressed in Poor Richard's aphorisms. Franklin felt that organized religion was necessary to keep men good to their fellow men, but rarely attended religious services himself. When Franklin met Voltaire in Paris and asked his fellow member of the Enlightenment vanguard to bless his grandson, Voltaire said in English, "God and Liberty", and added, "this is the only appropriate benediction for the grandson of Monsieur Franklin." Franklin's parents were both pious Puritans. The family attended the Old South Church, the most liberal Puritan congregation in Boston, where Benjamin Franklin was baptized in 1706. Franklin's father, a poor chandler, owned a copy of a book, Bonifacius: Essays to Do Good, by the Puritan preacher and family friend Cotton Mather, which Franklin often cited as a key influence on his life. "If I have been", Franklin wrote to Cotton Mather's son seventy years later, "a useful citizen, the public owes the advantage of it to that book."
Franklin's first pen name, Silence Dogood, paid homage both to the book and to a widely known sermon by Mather. The book preached the importance of forming voluntary associations to benefit society. Franklin learned about forming do-good associations from Cotton Mather, but his organizational skills made him the most influential force in making voluntarism an enduring part of the American ethos. Franklin formulated a presentation of his beliefs and published it in 1728. It did not mention many of the Puritan ideas regarding salvation, the divinity of Jesus, or indeed much religious dogma. He identified himself as a deist in his 1771 autobiography, although he still considered himself a Christian. He retained a strong faith in a God as the wellspring of morality and goodness in man, and as a Providential actor in history responsible for American independence. At a critical impasse during the Constitutional Convention in June 1787, Franklin attempted to introduce the practice of daily common prayer. The motion met with resistance and was never brought to a vote. Franklin was an enthusiastic supporter of the evangelical minister George Whitefield during the First Great Awakening. Franklin did not subscribe to Whitefield's theology, but he admired Whitefield for exhorting people to worship God through good works. Franklin published all of Whitefield's sermons and journals, thereby earning considerable money and boosting the Great Awakening. Franklin explained in his autobiography why he had stopped attending church. Franklin retained a lifelong commitment to the Puritan virtues and political values he had grown up with, and through his civic work and publishing, he succeeded in passing these values into the American culture permanently. He had a "passion for virtue". These Puritan values included his devotion to egalitarianism, education, industry, thrift, honesty, temperance, charity and community spirit.
The classical authors read in the Enlightenment period taught an abstract ideal of republican government based on hierarchical social orders of king, aristocracy and commoners. It was widely believed that English liberties relied on their balance of power, but also on hierarchical deference to the privileged class. "Puritanism ... and the epidemic evangelism of the mid-eighteenth century, had created challenges to the traditional notions of social stratification" by preaching that the Bible taught all men are equal, that the true value of a man lies in his moral behavior, not his class, and that all men can be saved. Franklin, steeped in Puritanism and an enthusiastic supporter of the evangelical movement, rejected the salvation dogma but embraced the radical notion of egalitarian democracy. Franklin's commitment to teach these values was itself something he gained from his Puritan upbringing, with its stress on "inculcating virtue and character in themselves and their communities." These Puritan values, and the desire to pass them on, were one of Franklin's quintessentially American characteristics and helped shape the character of the nation. Max Weber considered Franklin's ethical writings a culmination of the Protestant ethic, an ethic which in his view created the social conditions necessary for the birth of capitalism. One of Franklin's notable characteristics was his respect, tolerance and promotion of all churches. Referring to his experience in Philadelphia, he wrote in his autobiography, "new Places of worship were continually wanted, and generally erected by voluntary Contribution, my Mite for such purpose, whatever might be the Sect, was never refused." "He helped create a new type of nation that would draw strength from its religious pluralism."
The evangelical revivalists who were active mid-century, such as Whitefield, were the greatest advocates of religious freedom, "claiming liberty of conscience to be an 'inalienable right of every rational creature.'" Whitefield's supporters in Philadelphia, including Franklin, erected "a large, new hall, that ... could provide a pulpit to anyone of any belief." Franklin's rejection of dogma and doctrine and his stress on the God of ethics and morality and civic virtue made him the "prophet of tolerance." Franklin composed "A Parable Against Persecution", an apocryphal 51st chapter of Genesis in which God teaches Abraham the duty of tolerance. While he was living in London in 1774, he was present at the birth of British Unitarianism, attending the inaugural session of the first avowedly Unitarian congregation in England. Deborah Read While Franklin was in London, his trip was extended, and there were problems with the governor's promises of support. Perhaps because of the circumstances of this delay, Deborah married a man named John Rodgers. This proved to be a regrettable decision. Rodgers soon fled to Barbados with her dowry to avoid his debts and prosecution, leaving Deborah behind. Rodgers's fate was unknown, and because of bigamy laws, Deborah was not free to remarry. Franklin established a common-law marriage with Read on September 1, 1730. They took in Franklin's recently acknowledged young illegitimate son and raised him in their household. They had two children together. Their son, Francis Folger Franklin, was born in October 1732 and died of smallpox in 1736. Their daughter, Sarah "Sally" Franklin, was born in 1743 and grew up to marry Richard Bache. Deborah's fear of the sea meant that she never accompanied Franklin on any of his extended trips to Europe; another possible reason why they spent much time apart is that he may have blamed her for possibly preventing their son Francis from being inoculated against the disease that subsequently killed him.
Deborah wrote to him in November 1769, saying she was ill due to "dissatisfied distress" from his prolonged absence, but he did not return until his business was done. Deborah Read Franklin died of a stroke on December 14, 1774, while Franklin was on an extended mission to Great Britain; he returned in 1775. William Franklin In 1730, 24-year-old Franklin publicly acknowledged the existence of his son William, who was deemed "illegitimate," as he was born out of wedlock, and raised him in his household. William was born February 22, 1730, and his mother's identity is unknown. He was educated in Philadelphia and, beginning at about age 30, studied law in London in the early 1760s. William fathered an illegitimate son, William Temple Franklin, born on the same date, February 22, 1760. The boy's mother was never identified, and he was placed in foster care. In 1762, the elder William Franklin married Elizabeth Downes, daughter of a planter from Barbados, in London. In 1763, he was appointed as the last royal governor of New Jersey. A Loyalist to the king, William Franklin eventually broke relations with his father Benjamin over their differences about the American Revolutionary War, as Benjamin Franklin could never accept William's position. Deposed in 1776 by the revolutionary government of New Jersey, William was placed under house arrest at his home in Perth Amboy for six months. After the Declaration of Independence, William was formally taken into custody by order of the Provincial Congress of New Jersey, an entity which he refused to recognize, regarding it as an "illegal assembly." He was incarcerated in Connecticut for two years, in Wallingford and Middletown, and, after being caught surreptitiously recruiting Americans to support the Loyalist cause, was held in solitary confinement at Litchfield for eight months. When finally released in a prisoner exchange in 1778, he moved to New York City, which was occupied by the British at the time.
While in New York City, he became leader of the Board of Associated Loyalists, a quasi-military organization chartered by King George III and headquartered in New York City. They initiated guerrilla forays into New Jersey, southern Connecticut, and New York counties north of the city. When British troops evacuated from New York, William Franklin left with them and sailed to England. He settled in London, never to return to North America. In the preliminary peace talks in 1782 with Britain, "... Benjamin Franklin insisted that loyalists who had borne arms against the United States would be excluded from this plea (that they be given a general pardon). He was undoubtedly thinking of William Franklin." Success as an author In 1733, Franklin began to publish the noted Poor Richard's Almanack (with content both original and borrowed) under the pseudonym Richard Saunders, on which much of his popular reputation is based. Franklin frequently wrote under pseudonyms. He had developed a distinct, signature style that was plain, pragmatic and had a sly, soft but self-deprecating tone with declarative sentences. Although it was no secret that Franklin was the author, his Richard Saunders character repeatedly denied it. "Poor Richard's Proverbs", adages from this almanac, such as "A penny saved is twopence dear" (often misquoted as "A penny saved is a penny earned") and "Fish and visitors stink in three days", remain common quotations in the modern world. Wisdom in folk society meant the ability to provide an apt adage for any occasion, and Franklin's readers became well prepared. He sold about ten thousand copies per year—it became an institution. In 1741, Franklin began publishing The General Magazine and Historical Chronicle for all the British Plantations in America. He used the cipher of the Prince of Wales as the cover illustration. In 1758, the year he ceased writing for the Almanack, he printed Father Abraham's Sermon, also known as The Way to Wealth. 
Franklin's autobiography, begun in 1771 but published after his death, has become one of the classics of the genre. "Advice to a Friend on Choosing a Mistress" is a letter written by Benjamin Franklin, dated June 25, 1745, in which Franklin gives advice to a young man about channeling sexual urges. Due to its licentious nature, the letter was not published in collections of Franklin's papers during the nineteenth century. Federal court decisions from the mid-to-late twentieth century cited the document as a reason for overturning obscenity laws, using it to make a case against censorship. Public life Early steps in Pennsylvania In 1736, Franklin created the Union Fire Company, one of the first volunteer firefighting companies in America. In the same year, he printed a new currency for New Jersey based on innovative anti-counterfeiting techniques he had devised. Throughout his career, Franklin was an advocate for paper money, publishing A Modest Enquiry into the Nature and Necessity of a Paper Currency in 1729, and his press printed money. He was influential in the more restrained and thus successful monetary experiments in the Middle Colonies, which stopped deflation without causing excessive inflation. In 1766, he made a case for paper money to the British House of Commons. As he matured, Franklin began to concern himself more with public affairs. In 1743, he first devised a scheme for the Academy, Charity School, and College of Philadelphia. However, the person he had in mind to run the academy, Rev. Richard Peters, refused, and Franklin put his ideas away until 1749, when he printed his own pamphlet, Proposals Relating to the Education of Youth in Pensilvania. He was appointed president of the Academy on November 13, 1749; the academy and the charity school opened in 1751. In 1743, Franklin founded the American Philosophical Society to help scientific men discuss their discoveries and theories.
He began the electrical research that, along with other scientific inquiries, would occupy him for the rest of his life, in between bouts of politics and moneymaking. During King George's War, Franklin raised a militia called the Association for General Defense, because the legislators of the city decided to take no action to defend Philadelphia "either by erecting fortifications or building Ships of War". He raised money to create earthwork defenses and buy artillery. The largest of these was the "Association Battery" or "Grand Battery" of 50 guns. In 1747, Franklin (already a very wealthy man) retired from printing and went into other businesses. He created a partnership with his foreman, David Hall, which provided Franklin with half of the shop's profits for 18 years. This lucrative business arrangement provided leisure time for study, and in a few years he had made many new discoveries. Franklin became involved in Philadelphia politics and rapidly progressed. In October 1748, he was selected as a councilman; in June 1749, he became a justice of the peace for Philadelphia; and in 1751, he was elected to the Pennsylvania Assembly. On August 10, 1753, Franklin was appointed deputy postmaster-general of British North America. His most notable service in domestic politics was his reform of the postal system, with mail sent out every week. In 1751, Franklin and Thomas Bond obtained a charter from the Pennsylvania legislature to establish a hospital. Pennsylvania Hospital was the first hospital in the colonies. In 1752, Franklin organized the Philadelphia Contributionship, the Colonies' first homeowner's insurance company. Between 1750 and 1753, the "educational triumvirate" of Benjamin Franklin, Samuel Johnson of Stratford, Connecticut, and schoolteacher William Smith built on Franklin's initial scheme and created what Bishop James Madison, president of the College of William & Mary, called a "new-model" plan or style of American college. 
Franklin solicited, printed in 1752, and promoted an American textbook of moral philosophy by Samuel Johnson, titled Elementa Philosophica, to be taught in the new colleges. In June 1753, Johnson, Franklin, and Smith met in Stratford. They decided the new-model college would focus on the professions, teach classes in English instead of Latin, employ subject-matter experts as professors rather than a single tutor leading a class for four years, and impose no religious test for admission. Johnson went on to found King's College (now Columbia University) in New York City in 1754, while Franklin hired Smith as provost of the College of Philadelphia, which opened in 1755. At its first commencement, on May 17, 1757, seven men graduated; six with a Bachelor of Arts and one as Master of Arts. It was later merged with the University of the State of Pennsylvania to become the University of Pennsylvania. The college was to become influential in guiding the founding documents of the United States: in the Continental Congress, for example, over one-third of the men who contributed to the Declaration of Independence between September 4, 1774, and July 4, 1776, were affiliated with the college. In 1754, he headed the Pennsylvania delegation to the Albany Congress. This meeting of several colonies had been requested by the Board of Trade in England to improve relations with the Indians and defense against the French. Franklin proposed a broad Plan of Union for the colonies. While the plan was not adopted, elements of it found their way into the Articles of Confederation and the Constitution. In 1753, both Harvard and Yale awarded him honorary master of arts degrees. In 1756, Franklin received an honorary Master of Arts degree from the College of William & Mary. Later in 1756, Franklin organized the Pennsylvania Militia.
He used Tun Tavern as a gathering place to recruit a regiment of soldiers to go into battle against the Native American uprisings that beset the American colonies. Postmaster Well known as a printer and publisher, Franklin was appointed postmaster of Philadelphia in 1737, holding the office until 1753, when he and publisher William Hunter were named deputy postmasters–general of British North America, the first to hold the office. (Joint appointments were standard at the time, for political reasons.) Franklin was responsible for the British colonies from Pennsylvania north and east, as far as the island of Newfoundland. A post office for local and outgoing mail had been established in Halifax, Nova Scotia, by local stationer Benjamin Leigh, on April 23, 1754, but service was irregular. Franklin opened the first post office to offer regular, monthly mail in Halifax on December 9, 1755. Meantime, Hunter became postal administrator in Williamsburg, Virginia, and oversaw areas south of Annapolis, Maryland. Franklin reorganized the service's accounting system and improved speed of delivery between Philadelphia, New York and Boston. By 1761, efficiencies led to the first profits for the colonial post office. When the lands of New France were ceded to the British under the Treaty of Paris in 1763, the British province of Quebec was created among them, and Franklin saw mail service expanded between Montreal, Trois-Rivières, Quebec City, and New York. For the greater part of his appointment, Franklin lived in England (from 1757 to 1762, and again from 1764 to 1774)—about three-quarters of his term. Eventually, his sympathies for the rebel cause in the American Revolution led to his dismissal on January 31, 1774. On July 26, 1775, the Second Continental Congress established the United States Post Office and named Franklin as the first United States postmaster general. Franklin had been a postmaster for decades and was a natural choice for the position. 
He had just returned from England and was appointed chairman of a Committee of Investigation to establish a postal system. The report of the committee, providing for the appointment of a postmaster general for the 13 American colonies, was considered by the Continental Congress on July 25 and 26. On July 26, 1775, Franklin was appointed postmaster general, the first appointed under the Continental Congress. His apprentice, William Goddard, felt that his ideas were mostly responsible for shaping the postal system and that the appointment should have gone to him, but he graciously conceded it to Franklin, 36 years his senior. Franklin, however, appointed Goddard as Surveyor of the Posts, issued him a signed pass, and directed him to investigate and inspect the various post offices and mail routes as he saw fit. The newly established postal system became the United States Post Office, a system that continues to operate today. Decades in London From the mid-1750s to the mid-1770s, Franklin spent much of his time in London. Political work In 1757, he was sent to England by the Pennsylvania Assembly as a colonial agent to protest against the political influence of the Penn family, the proprietors of the colony. He remained there for five years, striving to end the proprietors' prerogative to overturn legislation from the elected Assembly and their exemption from paying taxes on their land. His lack of influential allies in Whitehall led to the failure of this mission. At this time, many members of the Pennsylvania Assembly were feuding with William Penn's heirs who controlled the colony as proprietors. After his return to the colony, Franklin led the "anti-proprietary party" in the struggle against the Penn family and was elected Speaker of the Pennsylvania House in May 1764. 
His call for a change from proprietary to royal government was a rare political miscalculation, however: Pennsylvanians worried that such a move would endanger their political and religious freedoms. Because of these fears and because of political attacks on his character, Franklin lost his seat in the October 1764 Assembly elections. The anti-proprietary party dispatched Franklin to England again to continue the struggle against the Penn family proprietorship. During this trip, events drastically changed the nature of his mission. In London, Franklin opposed the 1765 Stamp Act. Unable to prevent its passage, he made another political miscalculation and recommended a friend to the post of stamp distributor for Pennsylvania. Pennsylvanians were outraged, believing that he had supported the measure all along, and threatened to destroy his home in Philadelphia. Franklin soon learned of the extent of colonial resistance to the Stamp Act, and he testified during the House of Commons proceedings that led to its repeal. With this, Franklin suddenly emerged as the leading spokesman for American interests in England. He wrote popular essays on behalf of the colonies. Georgia, New Jersey, and Massachusetts also appointed him as their agent to the Crown. During his lengthy missions to London between 1757 and 1775, Franklin lodged in a house on Craven Street, just off The Strand in central London. During his stays there, he developed a close friendship with his landlady, Margaret Stevenson, and her circle of friends and relations, in particular, her daughter Mary, who was more often known as Polly. The house is now operating as a museum known as the Benjamin Franklin House. Whilst in London, Franklin became involved in radical politics. 
He belonged to a gentleman's club (which he called "the honest Whigs"), which held stated meetings, and included members such as Richard Price, the minister of Newington Green Unitarian Church who ignited the Revolution controversy, and Andrew Kippis. Scientific work In 1756, Franklin had become a member of the Society for the Encouragement of Arts, Manufactures & Commerce (now the Royal Society of Arts), which had been founded in 1754. After his return to the United States in 1775, Franklin became the Society's Corresponding Member, continuing a close connection. The Royal Society of Arts instituted a Benjamin Franklin Medal in 1956 to commemorate the 250th anniversary of his birth and the 200th anniversary of his membership of the RSA. The study of natural philosophy (referred to today as science in general) drew him into overlapping circles of acquaintance. Franklin was, for example, a corresponding member of the Lunar Society of Birmingham. In 1759, the University of St Andrews awarded Franklin an honorary doctorate in recognition of his accomplishments. In October 1759, he was granted Freedom of the Borough of St Andrews. He was also awarded an honorary doctorate by Oxford University in 1762. Because of these honors, Franklin was often addressed as "Doctor Franklin." While living in London in 1768, he developed a phonetic alphabet in A Scheme for a new Alphabet and a Reformed Mode of Spelling. This reformed alphabet discarded six letters Franklin regarded as redundant (c, j, q, w, x, and y), and substituted six new letters for sounds he felt lacked letters of their own. This alphabet never caught on, and he eventually lost interest. Travels around Europe Franklin used London as a base to travel. In 1771, he made short journeys through different parts of England, staying with Joseph Priestley at Leeds, Thomas Percival at Manchester and Erasmus Darwin at Lichfield.
In Scotland, he spent five days with Lord Kames near Stirling and stayed for three weeks with David Hume in Edinburgh. In 1759, he visited Edinburgh with his son and later reported that he considered his six weeks in Scotland "six weeks of the densest happiness I have met with in any part of my life". In Ireland, he stayed with Lord Hillsborough. Franklin noted of him that "all the plausible behaviour I have described is meant only, by patting and stroking the horse, to make him more patient, while the reins are drawn tighter, and the spurs set deeper into his sides." In Dublin, Franklin was invited to sit with the members of the Irish Parliament rather than in the gallery. He was the first American to receive this honor. While touring Ireland, he was deeply moved by the level of poverty he witnessed. The economy of the Kingdom of Ireland was affected by the same trade regulations and laws that governed the Thirteen Colonies. Franklin feared that the American colonies could eventually come to the same level of poverty if the regulations and laws continued to apply to them. Franklin spent two months in German lands in 1766, but his connections to the country stretched across a lifetime. He declared a debt of gratitude to German scientist Otto von Guericke for his early studies of electricity. Franklin also co-authored the first treaty of friendship between Prussia and America in 1785. In September 1767, Franklin visited Paris with his usual traveling partner, Sir John Pringle, 1st Baronet. News of his electrical discoveries was widespread in France. His reputation meant that he was introduced to many influential scientists and politicians, and also to King Louis XV. Defending the American cause One line of argument in Parliament was that Americans should pay a share of the costs of the French and Indian War and therefore taxes should be levied on them. Franklin became the American spokesman in highly publicized testimony in Parliament in 1766. 
He stated that Americans already contributed heavily to the defense of the Empire. He said local governments had raised, outfitted and paid 25,000 soldiers to fight France—as many as Britain itself sent—and spent many millions from American treasuries doing so in the French and Indian War alone. In 1772, Franklin obtained private letters of Thomas Hutchinson and Andrew Oliver, governor and lieutenant governor of the Province of Massachusetts Bay, proving that they had encouraged the Crown to crack down on Bostonians. Franklin sent them to America, where they escalated the tensions. The letters were finally leaked to the public in the Boston Gazette in mid-June 1773, causing a political firestorm in Massachusetts and raising significant questions in England. The British began to regard him as the fomenter of serious trouble. Hopes for a peaceful solution ended as he was systematically ridiculed and humiliated by Solicitor-General Alexander Wedderburn, before the Privy Council on January 29, 1774. He returned to Philadelphia in March 1775, and abandoned his accommodationist stance. In 1773, Franklin published two of his most celebrated pro-American satirical essays: "Rules by Which a Great Empire May Be Reduced to a Small One", and "An Edict by the King of Prussia". Alleged British agent and Hellfire Club membership Franklin is known to have occasionally attended the Hellfire Club's meetings as a non-member during 1758, while he was in England. However, some authors and historians argue that Benjamin Franklin was in fact a British spy. As no records survive (they were burned in 1774), many of the club's members are merely assumed, or linked through letters sent to each other. One early proponent of the claim that Franklin was a member of the Hellfire Club and a double agent was the historian Donald McCormick, who has a history of making controversial claims.
Coming of revolution In 1763, soon after Franklin returned to Pennsylvania from England for the first time, the western frontier was engulfed in a bitter war known as Pontiac's Rebellion. The Paxton Boys, a group of settlers convinced that the Pennsylvania government was not doing enough to protect them from American Indian raids, murdered a group of peaceful Susquehannock Indians and marched on Philadelphia. Franklin helped to organize a local militia to defend the capital against the mob. He met with the Paxton leaders and persuaded them to disperse. Franklin wrote a scathing attack against the racial prejudice of the Paxton Boys. "If an Indian injures me", he asked, "does it follow that I may revenge that Injury on all Indians?" He provided an early response to British surveillance through his own network of counter-surveillance and manipulation. "He waged a public relations campaign, secured secret aid, played a role in privateering expeditions, and churned out effective and inflammatory propaganda." Declaration of Independence By the time Franklin arrived in Philadelphia on May 5, 1775, after his second mission to Great Britain, the American Revolution had begun—with skirmishes breaking out between colonials and British at Lexington and Concord. The New England militia had forced the main British army to remain inside Boston. The Pennsylvania Assembly unanimously chose Franklin as their delegate to the Second Continental Congress. In June 1776, Franklin was appointed a member of the Committee of Five that drafted the Declaration of Independence. Although he was temporarily disabled by gout and unable to attend most meetings of the committee, Franklin made several "small but important" changes to the draft sent to him by Thomas Jefferson. At the signing, he is quoted as having replied to a comment by John Hancock that they must all hang together: "Yes, we must, indeed, all hang together, or most assuredly we shall all hang separately." 
Ambassador to France (1776–1785) In December 1776, Franklin was dispatched to France as commissioner for the United States. He took with him as secretary his 16-year-old grandson, William Temple Franklin. They lived in a home in the Parisian suburb of Passy, donated by Jacques-Donatien Le Ray de Chaumont, who supported the United States. Franklin remained in France until 1785. He conducted the affairs of his country toward the French nation with great success, which included securing a critical military alliance in 1778 and signing the 1783 Treaty of Paris. Among his associates in France was Honoré Gabriel Riqueti, comte de Mirabeau—a French Revolutionary writer, orator and statesman who in 1791 was elected president of the National Assembly. In July 1784, Franklin met with Mirabeau and contributed anonymous materials that the Frenchman used in his first signed work: Considerations sur l'ordre de Cincinnatus. The publication was critical of the Society of the Cincinnati, established in the United States. Franklin and Mirabeau thought of it as a "noble order", inconsistent with the egalitarian ideals of the new republic. During his stay in France, Franklin was active as a Freemason, serving as venerable master of the lodge Les Neuf Sœurs from 1779 until 1781. In 1784, when Franz Mesmer began to publicize his theory of "animal magnetism", which was considered offensive by many, Louis XVI appointed a commission to investigate it. Its members included the chemist Antoine Lavoisier, the physician Joseph-Ignace Guillotin, the astronomer Jean Sylvain Bailly, and Franklin. Through blind trials, the commission concluded that mesmerism only seemed to work when the subjects expected it to; this discredited mesmerism and provided the first major demonstration of the placebo effect, described at that time as "imagination." In 1781, he was elected a fellow of the American Academy of Arts and Sciences.
Franklin's advocacy for religious tolerance in France contributed to arguments made by French philosophers and politicians that resulted in Louis XVI's signing of the Edict of Versailles in November 1787. This edict effectively nullified the Edict of Fontainebleau, which had denied non-Catholics civil status and the right to openly practice their faith. Franklin also served as American minister to Sweden, although he never visited that country. He negotiated a treaty that was signed in April 1783. On August 27, 1783, in Paris, Franklin witnessed the world's first hydrogen balloon flight. Le Globe, created by professor Jacques Charles and Les Frères Robert, was watched by a vast crowd as it rose from the Champ de Mars (now the site of the Eiffel Tower). Franklin became so enthusiastic that he subscribed financially to the next project to build a manned hydrogen balloon. On December 1, 1783, Franklin was seated in the special enclosure for honored guests when La Charlière took off from the Jardin des Tuileries, piloted by Charles and Nicolas-Louis Robert. Return to America When he returned home in 1785, Franklin occupied a position second only to that of George Washington as the champion of American independence. Franklin returned from France with an unexplained shortage of 100,000 pounds in Congressional funds. In response to a question from a member of Congress about this, Franklin, quoting the Bible, quipped: "Muzzle not the ox that treadeth out his master's grain." The missing funds were never again mentioned in Congress. In 1787, Franklin served as a delegate to the Philadelphia Convention. He held an honorary position and seldom engaged in debate. Le Ray honored him with a commissioned portrait painted by Joseph Duplessis, which now hangs in the National Portrait Gallery of the Smithsonian Institution in Washington, D.C. After his return, Franklin became an abolitionist and freed his two slaves.
He eventually became president of the Pennsylvania Abolition Society. Special balloting conducted October 18, 1785, unanimously elected Franklin the sixth president of the Supreme Executive Council of Pennsylvania, replacing John Dickinson. The office was practically that of governor. Franklin held that office for slightly over three years, longer than any other, and served the constitutional limit of three full terms. Shortly after his initial election, he was re-elected to a full term on October 29, 1785, and again in the fall of 1786 and on October 31, 1787. In that capacity he served as host to the Constitutional Convention of 1787 in Philadelphia. Death Franklin suffered from obesity throughout his middle-aged and later years, which resulted in multiple health problems, particularly gout, which worsened as he aged. In poor health during the signing of the US Constitution in 1787, he was rarely seen in public from then until his death. Benjamin Franklin died from a pleuritic attack at his home in Philadelphia on April 17, 1790. He was aged 84 at the time of his death. His last words were reportedly, "a dying man can do nothing easy", to his daughter after she suggested that he change position in bed and lie on his side so he could breathe more easily. Franklin's death is described in the book The Life of Benjamin Franklin, quoting from the account of John Jones: Approximately 20,000 people attended his funeral. He was interred in Christ Church Burial Ground in Philadelphia. Upon learning of his death, the Constitutional Assembly in Revolutionary France entered into a state of mourning for a period of three days, and memorial services were conducted in honor of Franklin throughout the country. In 1728, aged 22, Franklin wrote what he hoped would be his own epitaph: Franklin's actual grave, however, as he specified in his final will, simply reads "Benjamin and Deborah Franklin". Inventions and scientific inquiries Franklin was a prodigious inventor.
Among his many creations were the lightning rod, Franklin stove, bifocal glasses and the flexible urinary catheter. Franklin never patented his inventions; in his autobiography he wrote, "... as we enjoy great advantages from the inventions of others, we should be glad of an opportunity to serve others by any invention of ours; and this we should do freely and generously." Electricity Franklin started exploring the phenomenon of electricity in 1746 when he saw some of Archibald Spencer's lectures using static electricity for illustrations. Franklin proposed that "vitreous" and "resinous" electricity were not different types of "electrical fluid" (as electricity was called then), but the same "fluid" under different pressures. (The same proposal was made independently that same year by William Watson.) Franklin was the first to label them as positive and negative respectively, and he was the first to discover the principle of conservation of charge. In 1748, he constructed a multiple-plate capacitor, which he called an "electrical battery" (not a true battery like Volta's pile), by placing eleven panes of glass sandwiched between lead plates, suspended with silk cords and connected by wires. In pursuit of more pragmatic uses for electricity, remarking in spring 1749 that he felt "chagrin'd a little" that his experiments had heretofore resulted in "Nothing in this Way of Use to Mankind," Franklin planned a practical demonstration. He proposed a dinner party where a turkey was to be killed with electric shock and roasted on an electrical spit. After having prepared several turkeys this way, Franklin noted that "the birds kill'd in this manner eat uncommonly tender." Franklin recounted that in the process of one of these experiments, he was shocked by a pair of Leyden jars, resulting in numbness in his arms that persisted for one evening, noting "I am Ashamed to have been Guilty of so Notorious a Blunder."
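Franklin's single-fluid picture maps directly onto the sign convention still in use. As a restatement in modern notation (Franklin himself wrote nothing like this), his labeling and his principle of conservation of charge amount to:

```latex
% Modern restatement of Franklin's single-fluid model (not his own notation):
% an excess of the fluid ("vitreous" electricity) is positive charge,
% a deficit ("resinous" electricity) is negative charge,
% and the total charge of an isolated system never changes.
q_{\text{vitreous}} = +q, \qquad q_{\text{resinous}} = -q, \qquad
\sum_i q_i = \text{constant}.
```

In this view, rubbing two bodies together merely transfers fluid from one to the other, so the gains and losses always cancel, which is exactly the conservation principle credited to Franklin.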
Franklin briefly investigated electrotherapy, including the use of the electric bath. This work led to the field becoming widely known. In recognition of his work with electricity, Franklin received the Royal Society's Copley Medal in 1753, and in 1756, he became one of the few 18th-century Americans elected as a fellow of the Society. The CGS unit of electric charge has been named after him: one franklin (Fr) is equal to one statcoulomb. Franklin advised Harvard University in its acquisition of new electrical laboratory apparatus after the complete loss of its original collection, in a fire that destroyed the original Harvard Hall in 1764. The collection he assembled later became part of the Harvard Collection of Historical Scientific Instruments, now on public display in its Science Center. Kite experiment and lightning rod Franklin published a proposal for an experiment to prove that lightning is electricity by flying a kite in a storm. On May 10, 1752, Thomas-François Dalibard of France conducted Franklin's experiment using an iron rod instead of a kite, and he extracted electrical sparks from a cloud. On June 15, 1752, Franklin may have conducted his well-known kite experiment in Philadelphia, successfully extracting sparks from a cloud. Franklin described the experiment in his newspaper, The Pennsylvania Gazette, on October 19, 1752, without mentioning that he himself had performed it. This account was read to the Royal Society on December 21 and printed as such in the Philosophical Transactions. Joseph Priestley published an account with additional details in his 1767 History and Present Status of Electricity. Franklin was careful to stand on an insulator, keeping dry under a roof to avoid the danger of electric shock. Others, such as Georg Wilhelm Richmann in Russia, were indeed electrocuted in performing lightning experiments during the months immediately following Franklin's experiment.
In his writings, Franklin indicates that he was aware of the dangers and offered alternative ways to demonstrate that lightning was electrical, as shown by his use of the concept of electrical ground. Franklin did not perform this experiment in the way that is often pictured in popular literature, flying the kite and waiting to be struck by lightning, as it would have been dangerous. Instead he used the kite to collect some electric charge from a storm cloud, showing that lightning was electrical. On October 19, 1752, in a letter to England with directions for repeating the experiment, Franklin wrote: Franklin's electrical experiments led to his invention of the lightning rod. He said that conductors with a sharp rather than a smooth point could discharge silently and at a far greater distance. He surmised that this could help protect buildings from lightning by attaching "upright Rods of Iron, made sharp as a Needle and gilt to prevent Rusting, and from the Foot of those Rods a Wire down the outside of the Building into the Ground; ... Would not these pointed Rods probably draw the Electrical Fire silently out of a Cloud before it came nigh enough to strike, and thereby secure us from that most sudden and terrible Mischief!" Following a series of experiments on Franklin's own house, lightning rods were installed on the Academy of Philadelphia (later the University of Pennsylvania) and the Pennsylvania State House (later Independence Hall) in 1752. Population studies Franklin had a major influence on the emerging science of demography, or population studies. In the 1730s and 1740s, Franklin began taking notes on population growth, finding that the American population had the fastest growth rate on Earth. Emphasizing that population growth depended on food supplies, Franklin emphasized the abundance of food and available farmland in America. He calculated that America's population was doubling every 20 years and would surpass that of England in a century. 
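Franklin's demographic estimate is simple exponential growth: a population that doubles every 20 years has multiplied 32-fold after a century. A minimal sketch of that arithmetic (the function name and the starting value of 1.0 are illustrative; only the 20-year doubling period comes from the text):

```python
def population_after(initial, years, doubling_time=20.0):
    """Project a population forward, assuming it doubles every `doubling_time` years."""
    return initial * 2 ** (years / doubling_time)

# Five doublings in a century: 2**5 = 32, so the population grows 32-fold,
# which is how Franklin could project the colonies overtaking England.
print(population_after(1.0, 100))  # → 32.0
```

The same compounding logic underlies his alarm to British leaders: even a much smaller starting population overtakes a slowly growing one within a few doubling periods.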
In 1751, he drafted Observations concerning the Increase of Mankind, Peopling of Countries, etc. Four years later, it was anonymously printed in Boston, and it was quickly reproduced in Britain, where it influenced the economist Adam Smith and later the demographer Thomas Malthus, who credited Franklin for discovering a rule of population growth. Franklin's prediction that British mercantilism was unsustainable alarmed British leaders who did not want to be surpassed by the colonies, so they became more willing to impose restrictions on the colonial economy. Kammen (1990) and Drake (2011) say Franklin's Observations concerning the Increase of Mankind (1755) stands alongside Ezra Stiles' "Discourse on Christian Union" (1760) as the leading works of 18th-century Anglo-American demography; Drake credits Franklin's "wide readership and prophetic insight." Franklin was also a pioneer in the study of slave demography, as shown in his 1755 essay. Franklin, in his capacity as a farmer, wrote at least one critique about the negative consequences of price controls, trade restrictions, and subsidy of the poor. This is succinctly preserved in his letter to the London Chronicle published November 29, 1766, titled 'On the Price of Corn, and Management of the poor'. Oceanography As deputy postmaster, Franklin became interested in the North Atlantic Ocean circulation patterns. While in England in 1768, he heard a complaint from the Colonial Board of Customs: Why did it take British packet ships carrying mail several weeks longer to reach New York than it took an average merchant ship to reach Newport, Rhode Island? The merchantmen had a longer and more complex voyage because they left from London, while the packets left from Falmouth in Cornwall. Franklin put the question to his cousin Timothy Folger, a Nantucket whaler captain, who told him that merchant ships routinely avoided a strong eastbound mid-ocean current.
The mail packet captains sailed dead into it, fighting an adverse current. Franklin worked with Folger and other experienced ship captains, learning enough to chart the current and name it the Gulf Stream, by which it is still known today. Franklin published his Gulf Stream chart in 1770 in England, where it was ignored. Subsequent versions were printed in France in 1778 and the U.S. in 1786. The British edition of the chart, which was the original, was so thoroughly ignored that everyone assumed it was lost forever until Phil Richardson, a Woods Hole oceanographer and Gulf Stream expert, discovered it in the Bibliothèque Nationale in Paris in 1980. This find received front-page coverage in The New York Times. It took many years for British sea captains to adopt Franklin's advice on navigating the current; once they did, they were able to trim two weeks from their sailing time. In 1853, the oceanographer and cartographer Matthew Fontaine Maury noted that while Franklin charted and codified the Gulf Stream, he did not discover it. An aging Franklin accumulated all his oceanographic findings in Maritime Observations, published by the Philosophical Society's transactions in 1786. It contained ideas for sea anchors, catamaran hulls, watertight compartments, shipboard lightning rods and a soup bowl designed to stay stable in stormy weather. Theories and experiments Franklin was, along with his contemporary Leonhard Euler, the only major scientist who supported Christiaan Huygens's wave theory of light, which was basically ignored by the rest of the scientific community. In the 18th century, Isaac Newton's corpuscular theory was held to be true; only after Thomas Young's well-known slit experiment in 1803 were most scientists persuaded to believe Huygens's theory. On October 21, 1743, according to the popular myth, a storm moving from the southwest denied Franklin the opportunity of witnessing a lunar eclipse.
Franklin was said to have noted that the prevailing winds were actually from the northeast, contrary to what he had expected. In correspondence with his brother, Franklin learned that the same storm had not reached Boston until after the eclipse, despite the fact that Boston is to the northeast of Philadelphia. He deduced that storms do not always travel in the direction of the prevailing wind, a concept that greatly influenced meteorology. After the Icelandic volcanic eruption of Laki in 1783, and the subsequent harsh European winter of 1784, Franklin made observations connecting the causal nature of these two separate events. He wrote about them in a lecture series. Though Franklin is best known, where kites are concerned, for his lightning experiments, he also used kites to pull humans and ships across waterways. George Pocock in the book A TREATISE on The Aeropleustic Art, or Navigation in the Air, by means of Kites, or Buoyant Sails noted being inspired by