# The Boat Race 2018
The Boat Race 2018 (also known as The Cancer Research UK Boat Race for the purposes of sponsorship) took place on 24 March 2018. Held annually, The Boat Race is a side-by-side rowing race between crews from the universities of Oxford and Cambridge along a 4.2-mile (6.8 km) tidal stretch of the River Thames in south-west London. For the third time in the history of the event, the men's, women's and both reserves' races were all held on the Tideway on the same day.
The women's race was the first event of the day and saw Cambridge lead from the start, eventually winning by a considerable margin to record their second consecutive victory and take the overall record in the Women's Boat Race to 43–30 in their favour. The men's race was the final event of the day and completed a Cambridge whitewash: the Light Blues recorded their second victory in three years and took the overall record to 83–80 in their favour. In the women's reserve race, Cambridge's Blondie defeated Oxford's Osiris by nine lengths, their third consecutive victory. The men's reserve race was won by Cambridge's Goldie, who defeated Oxford's Isis by a margin of four lengths.
The races were watched by around a quarter of a million spectators live, and were broadcast around the world by a variety of broadcasters. The two main races were also available for the second time as a live stream using YouTube.
## Background
The Boat Race is a side-by-side rowing competition between the University of Oxford (sometimes referred to as the "Dark Blues") and the University of Cambridge (sometimes referred to as the "Light Blues"). First held in 1829, the race takes place on the 4.2-mile (6.8 km) Championship Course, between Putney and Mortlake on the River Thames in south-west London. The rivalry is a major point of honour between the two universities; it is followed throughout the United Kingdom and broadcast worldwide. Oxford went into the race as champions, having won the 2017 race by a margin of one and a quarter lengths, with Cambridge leading overall with 82 victories to Oxford's 80 (excluding the 1877 race, officially a dead heat though claimed as a victory by the Oxford crew).
It was the third time in the history of The Boat Race that all four senior races – the men's, women's, men's reserves' and women's reserves' – were held on the same day and on the same course along the Tideway. Prior to 2015, the women's race, which first took place in 1927, was usually held at the Henley Boat Races along the 2,000-metre (2,200 yd) course. However, on at least two occasions in the interwar period, the women competed on the Thames between Chiswick and Kew. Cambridge's women went into the race as reigning champions, having won the 2017 race by 11 lengths, and led 42–30 overall.
For the sixth year, the men's race was sponsored by BNY Mellon while the women's race had BNY Mellon's subsidiary Newton Investment Management as sponsors. In January 2016, it was announced that the sponsors would be donating the title sponsorship to Cancer Research UK and that the 2016 event onwards would be retitled "The Cancer Research UK Boat Races". There is no monetary award for winning the race, as the journalist Roger Alton notes: "It's the last great amateur event: seven months of pain for no prize money".
The autumn reception was held at the Guildhall in London on 10 November 2017. As Cambridge's women had won the previous year's race, it was Oxford's responsibility to offer the traditional challenge to the Cambridge University Women's Boat Club (CUWBC). To that end, Katherine Erickson, President of Oxford University Women's Boat Club (OUWBC), challenged Daphne Martschenko, her Cambridge counterpart. Oxford's victory in the men's race meant that Hugo Ramambason, President of Cambridge University Boat Club (CUBC), challenged Iain Mandale, President of Oxford University Boat Club (OUBC).
The men's race was umpired by the former Light Blue rower John Garrett, who represented Great Britain at the 1984, 1988 and 1992 Summer Olympics. He had umpired the men's race twice previously, in 2008 and 2012, and rowed for Lady Margaret Boat Club in The Boat Race in 1984 and 1985. The 73rd women's race was umpired by the multiple Olympic gold-medallist Matthew Pinsent. As well as rowing for Oxford in the 1990, 1991 and 1993 races, he was assistant umpire in the 2012 race before umpiring the 2013 race. The women's reserve race was presided over by the former Dark Blue Matt Smith, who rowed for Oxford in the 2001, 2002 and 2003 races. Richard Phelps, who rowed for Cambridge in the 1993, 1994 and 1995 races, oversaw the men's reserve race.
The event was broadcast live in the United Kingdom on the BBC. Numerous broadcasters worldwide also showed the main races, including SuperSport across Africa and the EBU across Europe. It was also streamed live on BBC Online. For the second time, the men's and women's races were streamed live on YouTube.
## Coaches
The Cambridge men's crew coaching team was led by their chief coach Steve Trapmore, a gold medal-winning member of the men's eight at the 2000 Summer Olympics, who was appointed to the post in 2010. He was assisted by Richard Chambers, silver medallist in the men's lightweight coxless four at the 2012 Summer Olympics. Donald Legget, who rowed for the Light Blues in the 1963 and 1964 races, acted as a supporting coach, along with coxing coach Henry Fieldman (who steered Cambridge in the 2013 race) and the medical officer Simon Owens. Sean Bowden was chief coach for Oxford, having been responsible for the senior men's crew since 1997 and having won 12 of his 18 races. He is a former Great Britain Olympic coach and coached the Light Blues in the 1993 and 1994 Boat Races.
OUWBC's chief coach was the former OUBC assistant coach Andy Nelder who previously worked with Bowden for eleven years. He was assisted by Jamie Kirkwood. Cambridge's women were coached by former Goldie coach Rob Baker who was assisted by Paddy Ryan.
## Trials
Dates for the trials, where crews are able to simulate the race proper on the Championship Course, were announced on 17 November 2017.
### Women
Cambridge's women's trial took place on the Championship Course on 5 December, between the Harry Potter-themed boats Expecto Patronum and Wingardium Leviosa. The CUWBC president Daphne Martschenko was unable to participate in the race because of illness; the race was umpired by Matthew Pinsent. Wingardium Leviosa took the early lead but the crews were level by Barn Elms boathouse, before Leviosa pulled half a length ahead by Craven Cottage. Level once again as the crews passed Harrods Furniture Depository, Expecto Patronum made a push at Hammersmith Bridge, handling the rough conditions better than their opponents. With a clear water advantage by Chiswick Eyot, Expecto Patronum passed the finish line two lengths ahead.
Oxford's trial race was conducted on 21 January 2018, delayed from December because of illness among the rowers. The race was held in windy and wet conditions on the Tideway between Great Typhoon and Coursing River, umpired by Pinsent. Coursing River made the better start from the Surrey station before Great Typhoon drew level and then pulled ahead, taking advantage of the curve of the river. An oar clash followed, but a series of pushes from Great Typhoon saw them take the lead and push away under Barnes Bridge to win by half a length.
### Men
Cambridge's men's trial took place on the Championship Course on 5 December, between the boats Goblins and Goons. Goblins, starting from the Surrey station, took an early lead which they held until Goons drew level, and then began to pull away, as the crews passed below Hammersmith Bridge. Goblins responded, restored parity and then took the lead at the Bandstand. Despite each crew making a series of pushes, Goblins held a half-length lead under Barnes Bridge and maintained the advantage to the finish line.
The Oxford trial boats were named Strong and Stable, in reference to the Tory manifesto for the 2017 general election. They raced against one another along the Championship Course on 6 December 2017, umpired by John Garrett. Strong, starting from the Middlesex station, took an early lead and held a half-length advantage by the time the crews passed Craven Cottage. Garrett repeatedly warned both crews as they each infringed the racing line, and Strong capitalised on the advantage of the bend to be almost a length ahead. Stable fought back and were nearly level by the time they passed Harrods. Strong reacted to pull half a length ahead by Chiswick Eyot, extending to clear water by the Bandstand, and a final push at Barnes Bridge ensured them a two-length victory.
## Build-up
### Women
CUWBC faced a crew from University of London Boat Club (ULBC) in two races on the Tideway, umpired by Judith Packer, on 17 February 2018. The first segment, from Putney Bridge to Hammersmith Bridge, was an easy victory for the Light Blues, who won by around five lengths. The second segment, from Chiswick Steps to the finish line, saw Cambridge quickly overcome their starting one-length deficit to take a clear water advantage under Barnes Bridge before ending as "clear winners".
OUWBC went up against Oxford Brookes University Boat Club (OBUBC) in a two-piece race on the Championship Course on 24 February 2018. Despite a strong start from OBUBC in the first segment, OUWBC held a lead of around a length by Craven Cottage and continued to pull away to a three-length victory at Chiswick Eyot. In the second segment, OUWBC took a slight early lead but OBUBC remained in contention, taking advantage of Middlesex bend, but could not catch the Dark Blues who passed the finish line with a lead of a couple of seats.
On 4 March 2018, OUWBC took on a crew from Molesey Boat Club in a race along a section of the Championship Course from the start to Chiswick Steps, umpired by Sarah Winckless. The Dark Blues held a three-seat advantage by the time the crews had passed the boathouses, and despite under-rating Molesey, continued to pull away to hold a clear water advantage and a three-length lead by Hammersmith Bridge which they extended to a five-length lead by Chiswick Steps.
### Men
CUBC faced a ULBC crew in a three-piece race along the Tideway, umpired by Rob Clegg, on 18 February 2018. The first section of the race was strongly contested, with clashes in the early stages; ULBC took the lead, only for the Light Blues to draw level and then lead past Craven Cottage. The section concluded as Cambridge passed Harrods with a three-length lead. The second segment, from Harrods to the Bandstand, saw Cambridge lead all the way to win by several lengths. The final section of the race, from Chiswick Eyot to the finish line, saw further oar clashes, but Cambridge controlled the situation, winning by more than two lengths.
On 4 March 2018, CUBC took part in a two-piece race against OBUBC. The first section, from the start line to Chiswick Steps, was won by one length by CUBC who led from the start. The second race, from Chiswick Eyot to the finishing line, was more robustly contested. CUBC took an early lead in difficult conditions, only to be overhauled by OBUBC who took a lead of a length from Barnes Bridge to the finish.
OUBC took on an OBUBC crew in two stages along the Championship Course on 24 February 2018, umpired by John Garrett. OBUBC made the better start in the first section, but OUBC drew level at the Town Buoy. Oxford Brookes started to pull away and held a length's advantage as the crews passed the Mile Post and into the headwind. OUBC coped with the conditions well and were just ahead by Hammersmith Bridge but Oxford Brookes took advantage of the stream to win by a length as the crews passed St Paul's School. The second section of the race saw early clashes from which OBUBC emerged with a length advantage. Although OUBC pushed to close the gap, OBUBC responded and pulled away to win by just over one length.
On 3 March 2018, OUBC faced a ULBC crew along a section of the Championship Course from the start line to Chiswick Steps, in a race umpired by Richard Phelps. Starting from the Middlesex station, the Dark Blues took an early lead and held a length's advantage by the time the crews passed Harrods. In an early attempt to claim the racing line, OUBC moved into ULBC's water and both crews were warned by the umpire to return to their station. Despite this, OUBC extended their lead and were several lengths clear by Hammersmith Bridge, and were able to take advantage of clear water, winning the race by at least four lengths.
## Crews
The official weigh-in for the crews took place at City Hall, London, on 26 February 2018.
### Women
The Cambridge crew weighed an average of 73.0 kilograms (160.9 lb), 2.1 kilograms (4.6 lb) per rower more than their opponents, and the Light Blues were on average 4 centimetres (1.6 in) taller than Oxford. The Dark Blues featured one returning crew member, number four Alice Roberts, who rowed in the unsuccessful 2017 Oxford boat. The Cambridge crew included some experienced Boat Race rowers: Thea Zabell, Imogen Grant, Alice White and Myriam Goudet-Boukhatmi all rowed in the previous year's race. The Light Blues also featured the 2015 World Rowing Championships quad sculls gold medallist Olivia Coffey.
### Men
The Cambridge crew weighed an average of 89.8 kilograms (198 lb), 4.2 kilograms (9.3 lb) per rower more than their opponents, and were on average 6 centimetres (2.4 in) taller than Oxford. The Light Blues' number four, James Letten, was the tallest individual ever to have competed in The Boat Race. One member of the Oxford crew had previous Boat Race experience: stroke Vassilis Ragoussis featured in the successful 2017 Dark Blue boat. On 20 March 2018, it was announced that as a result of illness, number six Joshua Bugajski would withdraw from the race and be replaced by the Isis rower Benedict Aldous. It was later revealed that Bugajski's departure was also related to disagreements with the Dark Blue coach Sean Bowden. Claas Mertens, the Oxford bow man, won gold at the 2015 World Rowing Championships with the German lightweight men's eight. Cambridge's crew contained four individuals who had featured in the Boat Race before: Hugo Ramambason, Freddie Davidson and Letten participated in 2017, while Charles Fisher rowed in the 2016 race.
## Pre-race
Two days before the race, both the Blues boats and the reserve boats practised their starts from the stakeboats on the Championship Course. On the same day, two BBC television cameras located on Putney Bridge and Barnes Bridge were targeted by thieves; their attempts at Putney were thwarted by an off-duty policeman and a Royal National Lifeboat Institution crew, but the gang escaped with one of the cameras on Barnes Bridge.
Cambridge were pre-race favourites to win both the men's and women's senior races.
The Queen's barge Gloriana led a procession of traditional craft along the course. These included the waterman's cutters used for the Oxbridge Waterman's Challenge.
## Races
The races were held on 24 March 2018. The weather was overcast, with light winds.
### Reserves
Blondie won the women's reserve race, held after the conclusion of the Women's Boat Race, by nine lengths. Already six seconds ahead at the Mile Post, Blondie continued to pull away, leading by twelve seconds at Hammersmith Bridge, before passing the finishing post in 19 minutes 45 seconds, 27 seconds ahead of Osiris. It was Blondie's third consecutive victory, and took the overall tally (since 1968) to 24–20 in Cambridge's favour.
Goldie won the men's reserve race, which was held after the women's reserve race and before the men's race. Five seconds ahead at the Mile Post, the Light Blue reserves were warned after a clash of oars, and Isis reduced the gap to three seconds by Hammersmith Bridge. Goldie were clear of Isis by Barnes Bridge with a seven-second lead, and maintained that advantage as they crossed the finish line in a time of 18 minutes 12 seconds. It was Goldie's first victory since 2010 and took the overall tally in the event to 30–24 in their favour.
### Women's
The women's race started at 4:31 p.m. Greenwich Mean Time (GMT). CUWBC won the toss and elected to start from the Surrey side of the river, handing the Middlesex side to Oxford. Cambridge made the better start, taking an early lead, and were around half a length ahead after the first minute of the race. By Craven Cottage, and in spite of Oxford having the advantage of the bend in the river, the Light Blues were ahead by a length. At the Mile Post, Cambridge held a clear water advantage, two lengths ahead. The Light Blues passed under Hammersmith Bridge with a three-length lead. At Chiswick Steps, Oxford were fifteen seconds behind, and a further five seconds down at Barnes Bridge. Cambridge passed the finishing post in a time of 19 minutes 6 seconds, around seven lengths ahead of Oxford. It was Cambridge's second consecutive victory but only their third win in eleven years, and took the overall record in the event to 43–30 in their favour.
### Men's
The men's race started at 5:33 p.m. GMT in very "gloomy" conditions. The Light Blues won the toss and elected to start from the Surrey side of the river. Oxford made the better start and were quickly a canvas ahead, but Cambridge restored parity within 40 seconds and went on to take a third-of-a-length lead themselves. Cambridge received several warnings from the umpire John Garrett for encroaching into Oxford's water, forcing them to move back towards their station, but were still over a length ahead by Craven Cottage. The Light Blues passed the Mile Post five seconds ahead and shot Hammersmith Bridge with a lead of four lengths. They held a 12-second lead as they passed Chiswick Steps. The Dark Blues reduced the Cambridge lead to eleven seconds by Barnes Bridge, but Cambridge passed the finishing line in 17 minutes 51 seconds, three lengths ahead of Oxford. It was Cambridge's second victory in the last three years, and took the overall record in the event to 83–80 in their favour.
## Reactions
CUWBC's cox Sophie Shapter said "We just knew we had to go out there and do a job", while OUWBC's president Katherine Erickson explained that she was proud of her crew, many of whom had learnt to row at Oxford. James Letten remarked that his Cambridge crew were "on the money" and had "stepped up and delivered". Steve Trapmore, overseeing his final Boat Race as CUBC chief coach before moving to Team GB Olympic Rowing as a high-performance coach, agreed that "the boys really stepped up and delivered".
As the men's senior crews passed below Hammersmith Bridge, a banner was unfurled by the Cambridge Zero Carbon Society and smoke flares were let off, to protest against investment in fossil fuel companies by the two universities. Although the banner was not clear to viewers during the live race coverage on the BBC, commentator Andrew Cotter remarked "flares at the boat race, whatever next?"
# Tawny owl
The tawny owl (Strix aluco), also called the brown owl, is a stocky, medium-sized owl in the family Strigidae. It is commonly found in woodlands across Europe, as well as western Siberia, and has seven recognized subspecies. The tawny owl's underparts are pale with dark streaks, whilst its upper body may be either brown or grey (in several subspecies, individuals may be of both colours). The tawny owl typically makes its nest in a tree hole where it can protect its eggs and young against potential predators. It is non-migratory and highly territorial: as a result, when young birds grow up and leave the parental nest, if they cannot find a vacant territory to claim as their own, they will often starve.
The tawny owl is a nocturnal bird of prey. It is able to hunt successfully at night because of its vision, hearing adaptations and its ability to fly silently. It usually hunts by dropping suddenly from a perch and seizing its prey, which it swallows whole. It mainly hunts rodents, although in urbanized areas its diet includes a higher proportion of birds. It also sometimes catches smaller owls, and is itself sometimes hunted by the eagle owl and the Eurasian goshawk.
Its retina is no more sensitive than a human's. Its directional hearing skill is more important to its hunting success: its ears are asymmetrically placed, which enables it to more precisely pinpoint the location from which a sound originates.
The tawny owl holds a place in human folklore: because it is active at night and has what many humans experience as a haunting call, people have traditionally associated it with bad omens and death. Many people think that all owl species make a hooting sound, but that is an overgeneralization based on the call of this particular species. In addition, the double hoot, which many people think is the tawny owl's prototypical call, is actually a call and response between a male and a female.
## Description
The tawny owl is a robust bird, 37–46 cm (15–18 in) in length, with an 81–105 cm (32–41 in) wingspan. Weight can range from 385 to 800 g (0.849 to 1.764 lb). Its large rounded head lacks ear tufts, and the facial disc surrounding the dark brown eyes is usually rather plain. The nominate race has two morphs which differ in their plumage colour, one form having rufous brown upperparts and the other greyish brown, although intermediates also occur. The underparts of both morphs are whitish and streaked with brown. Feathers are moulted gradually between June and December. This species is sexually dimorphic; the female is much larger than the male, 5% longer and more than 25% heavier.
The tawny owl flies with long glides on rounded wings, less undulating and with fewer wingbeats than other Eurasian owls, and typically at a greater height. The flight of the tawny owl is rather heavy and slow, particularly at takeoff, though the bird can attain a top flight speed of around 50 mph (80 km/h). As with most owls, its flight is silent because of its feathers' soft, furry upper surfaces and a fringe on the leading edge of the outer primaries. Its size, squat shape and broad wings distinguish it from other owls found within its range; the great grey owl (Strix nebulosa), Eurasian eagle-owl (Bubo bubo) and Ural owl (Strix uralensis) are similar in shape, but much larger.
An owl's eyes are placed at the front of the head and have a field overlap of 50–70%, giving it better binocular vision than diurnal birds of prey (overlap 30–50%). The tawny owl's retina has about 56,000 light-sensitive rod cells per square millimetre (36 million per square inch); although earlier claims that it could see in the infrared part of the spectrum have been dismissed, it is still often said to have eyesight 10 to 100 times better than that of humans in low-light conditions. However, the experimental basis for this claim is probably inaccurate by at least a factor of 10. The owl's actual visual acuity is only slightly greater than that of humans, and any increased sensitivity is due to optical factors rather than to greater retinal sensitivity; both humans and owls have reached the limit of resolution for the retinas of terrestrial vertebrates.
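The two rod-density figures quoted above are the same measurement expressed in different units; a minimal Python sketch, using only the value given in the text and the standard inch-to-millimetre conversion, confirms that they agree:

```python
# Sanity-check the rod-density conversion quoted above.
rods_per_mm2 = 56_000            # value given in the text
mm2_per_in2 = 25.4 ** 2          # 1 in = 25.4 mm, so 1 sq in = 645.16 sq mm
rods_per_in2 = rods_per_mm2 * mm2_per_in2

print(f"{rods_per_in2:,.0f} rods per square inch")  # 36,128,960, i.e. about 36 million
```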
Adaptations to night vision include the large size of the eye, its tubular shape, large numbers of closely packed retinal rods, and an absence of cone cells, since rod cells have superior light sensitivity. There are few coloured oil drops, which would reduce the light intensity. Unlike diurnal birds of prey, owls normally have only one fovea, and that is poorly developed except in daytime hunters such as the short-eared owl.
Hearing is important for a nocturnal bird of prey, and as with other owls, the tawny owl's two ear openings differ in structure and are asymmetrically placed to improve directional hearing. A passage through the skull links the eardrums, and small differences in the time of arrival of a sound at each ear enable its source to be pinpointed. The left ear opening is higher on the head than the larger right ear and tilts downward, improving sensitivity to sounds from below. Both ear openings are hidden under the facial disk feathers, which are structurally specialized to be transparent to sound, and are supported by a movable fold of skin (the pre-aural flap).
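To illustrate how a small difference in arrival time translates into direction, the sketch below uses an idealised far-field, two-receiver model. It is not a measurement of the tawny owl: the ear spacing and the example delay are assumed values chosen purely for illustration.

```python
import math

def bearing_from_delay(delay_s: float, ear_spacing_m: float = 0.06,
                       speed_of_sound: float = 343.0) -> float:
    """Estimate the horizontal angle of a sound source (degrees off centre)
    from the difference in arrival time at two ears.

    Idealised far-field model: path difference = ear spacing * sin(angle).
    """
    path_difference = delay_s * speed_of_sound
    # Clamp to the physically possible range before taking the arcsine.
    ratio = max(-1.0, min(1.0, path_difference / ear_spacing_m))
    return math.degrees(math.asin(ratio))

# Example: a 100-microsecond delay with an assumed 6 cm ear spacing places
# the source roughly 35 degrees off the midline.
print(round(bearing_from_delay(100e-6), 1))
```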
The internal structure of the ear, which has large numbers of auditory neurons, gives an improved ability to detect low-frequency sounds at a distance, which could include rustling made by prey moving in vegetation. The tawny owl's hearing is ten times better than a human's, and it can hunt using this sense alone in the dark of a woodland on an overcast night, but the patter of raindrops makes it difficult to detect faint sounds, and prolonged wet weather can lead to starvation if the owl cannot hunt effectively.
The commonly heard female contact call is a shrill kew-wick, but the male has a quavering advertising song hoo...ho, ho, hoo-hoo-hoo-hoo. William Shakespeare used this owl's song in Love's Labour's Lost (Act 5, Scene 2) as "Then nightly sings the staring owl, Tu-whit; Tu-who, a merry note, While greasy Joan doth keel the pot", but this stereotypical call is actually a duet, with the female making the kew-wick sound and the male responding hooo. The call is easily imitated by blowing into cupped hands through slightly parted thumbs, and a study in Cambridgeshire found that this mimicry produced a response from the owl within 30 minutes in 94% of trials. A male's response to a broadcast song appears to be indicative of his health and vigour; owls with higher blood parasite loads use fewer high frequencies and a more limited range of frequencies in their responses to an apparent intruder. The vocal activity of tawny owls depends on sex, stage of the annual cycle and weather; males are more vocal than females year-round, with vocal activity peaking during incubation and after breeding.
### Geographical variation
Although both colour morphs occur in much of the European range, brown birds predominate in the more humid climate of western Europe, with the grey morph becoming more common further east; in the northernmost regions, all the owls are a cold-grey colour. The Siberian and Scandinavian subspecies are 12% larger and 40% heavier, and have 13% longer wings than western European birds, in accordance with Bergmann's rule which predicts that northern forms will typically be bigger than their southern counterparts.
The plumage colour is genetically controlled, and studies in Finland and Italy indicate that grey-morph tawny owls have more reproductive success, better immune resistance, and fewer parasites than brown birds. Although this might suggest that eventually the brown morph could disappear, the owls show no colour preference when choosing a mate, so the selection pressure in favour of the grey morph is reduced. There are also environmental factors involved. The Italian study showed that brown-morph birds were found in denser woodland, and in Finland, Gloger's rule would suggest that paler birds would in any case predominate in the colder climate.
## Taxonomy
The species was given its current scientific name Strix aluco by Carl Linnaeus in the tenth edition of his Systema Naturae in 1758. The binomial derives from the Greek strix "owl" and Italian allocco "tawny owl" (which in turn comes from the Latin ulucus "screech-owl").
The tawny owl is a member of the wood-owl genus Strix, part of the typical owl family Strigidae, which contains all species of owl other than the barn owls. Within its genus, the tawny owl's closest relatives are Hume's owl, Strix butleri, (formerly considered to be conspecific), the Himalayan owl, Strix nivicolum, (sometimes considered conspecific), its larger northern neighbour, the Ural owl, S. uralensis, and the North American barred owl, S. varia. The Early–Middle Pleistocene Strix intermedia is sometimes considered a paleosubspecies of the tawny owl, which would make it that species' immediate ancestor.
The tawny owl subspecies are often poorly differentiated, and may be at a flexible stage of subspecies formation with features related to the ambient temperature, the colour tone of the local habitat, and the size of available prey. Consequently, various authors have historically described between 10 and 15 subspecies. The seven currently recognised subspecies are listed below.
## Distribution and habitat
The tawny owl is non-migratory and has a distribution stretching discontinuously across temperate Europe, from Great Britain and the Iberian Peninsula eastwards to western Siberia. It is absent from Ireland, probably because of competition from the long-eared owl (Asio otus), and is only a rare vagrant to the Balearic and Canary Islands. In the Himalayas and East Asia it is replaced by the Himalayan owl (Strix nivicolum), and in northwest Africa by the closely related Maghreb owl (Strix mauritanica).
This species is found in deciduous and mixed forests, and sometimes mature conifer plantations, preferring locations with access to water. Cemeteries, gardens and parks have allowed it to spread into urban areas, including central London. Although tawny owls occur in urban environments, especially those with natural forests and wooded habitat patches, they are less likely to occur at sites with high noise levels at night. The tawny owl is mainly a lowland bird in the colder parts of its range, but breeds to 550 metres (1,800 ft) in Scotland, 1,600 m (5,200 ft) in the Alps, 2,350 m (7,710 ft) in Turkey, and up to 2,800 m (9,200 ft) in Myanmar.
The tawny owl has a geographical range of at least 10 million km² (3.8 million sq mi) and a large population, including an estimated 970,000–2,000,000 individuals in Europe alone. Population trends have not been quantified, but there is evidence of an overall increase. This owl is not believed to meet the IUCN Red List criterion of declining more than 30% in ten years or three generations and is therefore evaluated as being of least concern. In the UK it is on the RSPB Amber List of Concern. This species has expanded its range in Belgium, the Netherlands, Norway and Ukraine, and populations are stable or increasing in most European countries. Declines have occurred in Finland, Estonia, Italy and Albania. Tawny owls are listed in Appendix II of the Convention on International Trade in Endangered Species (CITES), meaning that international trade (including in parts and derivatives) is regulated.
## Behaviour
### Breeding
Tawny owls pair off from the age of one year, and stay together in a usually monogamous relationship for life. An established pair's territory is defended year-round and maintained with little, if any, boundary change from year to year. The pair sit in cover on a branch close to a tree trunk during the day, and usually roost separately from July to October. Roosting owls may be discovered and "mobbed" by small birds during the day, but they normally ignore the disturbance. Tawny owls are very territorial, and will indicate the location of their chosen territory by their vocalisations, which occur at their greatest frequency during the night, though some owls will continue to call during the day. The owl's home range is determined in early autumn, and the territory is defended throughout the winter and into spring when the breeding season begins.
The tawny owl typically nests in a hole in a tree, but will also use old European magpie nests, squirrel dreys or holes in buildings, and readily takes to nest boxes. It nests from February onwards in the south of its range, but rarely before mid-March in Scandinavia. The glossy white eggs are 48 mm × 39 mm (1.9 in × 1.5 in) in size and weigh 39.0 g (1.38 oz), of which 7% is shell. The typical clutch of two or three eggs is incubated for 30 days to hatching, and the altricial, downy chicks fledge in a further 35–39 days. Incubation is usually undertaken by the female alone, although the male has rarely been observed to assist. The young usually leave the nest up to ten days before fledging, and hide on nearby branches.
This species is fearless in defence of its nest and young, and, like other Strix owls, strikes for the intruder's head with its sharp talons. Because its flight is silent, it may not be detected until it is too late to avoid the danger. Dogs, cats and humans may be assaulted, sometimes without provocation. Perhaps the best-known victim of the tawny owl's fierce attack was the renowned bird photographer Eric Hosking, who lost his left eye when struck by a bird he was attempting to photograph near its nest in 1937. He later called his autobiography An Eye for a Bird.
The parents care for young birds for two or three months after they fledge, but from August to November the juveniles disperse to find a territory of their own to occupy. If they fail to find a vacant territory, they usually starve. The juvenile survival rate is unknown, but the annual survival rate for adults is 76.8%. The typical lifespan is five years, but an age of over 18 years has been recorded for a wild tawny owl, and of over 27 years for a captive bird.
Predators of the tawny owl include large birds such as Ural owls, eagle owls, Eurasian goshawks, golden eagles, and common buzzards. Pine martens may raid nests, especially where artificial nest boxes make the owls easy to find, and several instances have been recorded of Eurasian jackdaws building nests on top of a brooding female tawny owl leading to the death of the adult and chicks. A Danish study showed that predation by mammals, especially red foxes, was an important cause of mortality in newly fledged young, with 36% dying between fledging and independence. The mortality risk increased with fledging date from 14% in April to more than 58% in June, and increasing predation of late broods may be an important selective agent for early breeding in this species.
This species is increasingly affected by avian malaria, the incidence of which has tripled in the last 70 years, in parallel with increasing global temperatures. An increase of one degree Celsius produces a two- to three-fold increase in the rate of malaria. In 2010, the incidence in British tawny owls was 60%, compared to 2–3% in 1996.
### Feeding
The tawny owl hunts almost entirely at night, watching from a perch before dropping or gliding silently down to its victim, but very occasionally it will hunt in daylight when it has young to feed. This species takes a wide range of prey, mainly woodland rodents, but also other mammals up to the size of a young rabbit, and birds, earthworms and beetles. In urban areas, birds make up a larger proportion of the diet, and species as unlikely as mallard and kittiwake have been killed and eaten.
Prey is typically swallowed whole, with indigestible parts regurgitated as pellets. These are medium-sized and grey, consisting mainly of rodent fur and often with bones protruding, and are found in groups under trees used for roosting or nesting.
Less powerful woodland owls such as the little owl and the long-eared owl cannot usually co-exist with the stronger tawny owls, which may take them as food items, and are found in different habitats; in Ireland the absence of the tawny owl allowed the long-eared owl to become the dominant owl. Similarly, where the tawny owl has moved into built-up areas, it tends to displace barn owls from their traditional nesting sites in buildings.
## In culture
The tawny owl, like its relatives, has often been seen as an omen of bad luck; William Shakespeare used it as such in Julius Caesar (Act 1 Scene 3): "And yesterday the bird of night did sit/ Even at noon-day upon the market-place/ Hooting and shrieking." John Ruskin is quoted as saying "Whatever wise people may say of them, I at least have found the owl's cry always prophetic of mischief to me".
Wordsworth described the technique for calling an owl in his poem "There Was a Boy".
> And there, with fingers interwoven, both hands
> Pressed closely palm to palm and to his mouth
> Uplifted, he, as through an instrument,
> Blew mimic hootings to the silent owls,
> That they might answer him.—And they would shout
> Across the watery vale, and shout again,
> Responsive to his call,—with quivering peals,
> And long halloos, and screams, and echoes loud
> Redoubled and redoubled; concourse wild
> Of jocund din!
# Metacomet Ridge
The Metacomet Ridge, Metacomet Ridge Mountains, or Metacomet Range of southern New England is a narrow and steep fault-block mountain ridge known for its extensive cliff faces, scenic vistas, microclimate ecosystems, and rare or endangered plants. The ridge is an important recreation resource located within 10 miles (16 km) of more than 1.5 million people, offering four long-distance hiking trails and over a dozen parks and recreation areas, including several historic sites. It has been the focus of ongoing conservation efforts because of its natural, historic, and recreational value, involving municipal, state, and national agencies and nearly two dozen non-profit organizations.
The Metacomet Ridge extends from Branford, Connecticut, on Long Island Sound, through the Connecticut River Valley region of Massachusetts, to northern Franklin County, Massachusetts, 2 miles (3 km) short of the Vermont and New Hampshire borders for a distance of 100 miles (160 km). It is geologically distinct from the nearby Appalachian Mountains and surrounding uplands, and is composed of volcanic basalt (also known as trap rock) and sedimentary rock in faulted and tilted layers many hundreds of feet thick. In most cases, the basalt layers are dominant, prevalent, and exposed. The ridge rises dramatically from much lower valley elevations, although only 1,200 feet (370 m) above sea level at its highest, with an average summit elevation of 725 feet (221 m).
## Geographic definitions
Visually, the Metacomet Ridge exists as one continuous landscape feature from Long Island Sound at Branford, Connecticut, to the end of the Mount Holyoke Range in Belchertown, Massachusetts, a distance of 71 miles (114 km), broken only by the river gorges of the Farmington River in northern Connecticut and the Westfield and Connecticut Rivers in Massachusetts. It was first identified in 1985 as a single geologic feature consisting of trap rock by the State Geological and Natural History Survey of Connecticut. A 2004 report conducted for the National Park Service extends that definition to include the entire traprock ridge from Long Island Sound to the Pocumtuck Range in Greenfield, Massachusetts. Further complicating the matter is the fact that traprock only accounts for the highest surface layers of rock strata on the southern three–fourths of the range; an underlying geology of related sedimentary rock is also a part of the structure of the ridge; in north central Massachusetts it becomes the dominant strata and extends the range geologically from the Holyoke Range another 35 miles (56 km) through Greenfield to nearly the Vermont border.
## Nomenclature
Until January 2008, the United States Board on Geographic Names (USBGN) did not recognize Metacomet Ridge, Traprock Ridge or any other name, although several sub-ranges were identified. Geologists usually refer to the overall range generically as "the traprock ridge" or "the traprock mountains" or refer to it with regard to its prehistoric geologic significance in technical terms. The Sierra Club has referred to the entire range in Connecticut as "The Traprock Ridge". The name Metacomet Ridge was first applied in 1985 in a book published by the Connecticut State Geological Survey, adopting the name from the existing Metacomet Trail along a large portion of the range in central Connecticut.
The name "Metacomet" or "Metacom" is borrowed from the 17th century sachem of the Wampanoag Tribe of southern New England who led his people during King Philip's War in the mid–17th century. Metacomet was also known as King Philip by early New England colonists. A number of features associated with the Metacomet Ridge are named after the sachem, including the Metacomet Trail, the Metacomet-Monadnock Trail, King Philip's Cave, King Philip Mountain, and Sachem Head. According to legend, Metacomet orchestrated the burning of Simsbury, Connecticut, and watched the conflagration from Talcott Mountain near the cave now named after him. The names Metacomet and King Philip have been applied to at least sixteen landscape features and over seventy-five businesses, schools, and civic organizations throughout southern New England.
## Geography
Beginning at Long Island Sound, the Metacomet Ridge commences as two parallel ridges with related sub-ridges and outcrops in between; the latter include the high butte–like cliffs of East Rock and the isolated peak of Peter's Rock. The western ridgeline of the Metacomet Ridge begins in New Haven, Connecticut, as West Rock Ridge and continues as Sleeping Giant, Mount Sanford, Peck Mountain, and Prospect Ridge, for a distance of 16 miles (26 km) before diminishing into a series of low profile outcrops just short of Southington, Connecticut, 2.75 miles (4.4 km) west of the Hanging Hills in Meriden.
To the east, beginning on the rocky prominence of Beacon Hill, 130 feet (40 m), in Branford, Connecticut, overlooking the East Haven River estuary, the Metacomet Ridge continues as a traprock ridge 60 miles (97 km) north to Mount Tom in Holyoke, Massachusetts; it then breaks east across the Connecticut River to form the Holyoke Range, which continues for 10 miles (16 km) before terminating in Belchertown, Massachusetts. Several scattered parallel ridges flank it; the most prominent of these are the hills of Rocky Hill, Connecticut, and the Barn Door Hills of Granby, Connecticut.
North of Mount Tom and the Holyoke Range, the apparent crest of the Metacomet Ridge is broken by a discontinuity in the once dominant traprock strata. Underlying sedimentary layers remain but lack the same profile. Between the Holyoke Range and the Pocumtuck Ridge, a stretch of 9 miles (14 km), the Metacomet Ridge exists only as a series of mostly nondescript rises set among flat plains of sedimentary bedrock. Mount Warner, 512 feet (156 m), in Hadley, Massachusetts, the only significant peak in the area, is a geologically unrelated metamorphic rock landform that extends west into the sedimentary strata.
The Metacomet Ridge picks up elevation again with the Pocumtuck Ridge, beginning on Sugarloaf Mountain and the parallel massif of Mount Toby, 1,269 feet (387 m), the high point of the Metacomet Ridge geography. Both Sugarloaf Mountain and Mount Toby are composed of erosion-resistant sedimentary rock. North of Mount Sugarloaf, the Pocumtuck Ridge continues as alternating sedimentary and traprock dominated strata to Greenfield, Massachusetts. From Greenfield north to Northfield, Massachusetts, 2 miles (3 km) short of the Vermont–New Hampshire–Massachusetts tri-border, the profile of the Metacomet Ridge diminishes into a series of nondescript hills and low, wooded mountain peaks composed of sedimentary rock with dwindling traprock outcrops.
In Connecticut, the high point of the Metacomet Ridge is West Peak of the Hanging Hills at 1,024 feet (312 m); in Massachusetts, the highest traprock peak is Mount Tom, 1,202 feet (366 m), although Mount Toby, made of sedimentary rock, is higher. Visually, the Metacomet Ridge is narrowest at Provin Mountain and East Mountain in Massachusetts where it is less than 0.5 miles (1 km) wide; it is widest at Totoket Mountain, over 4 miles (6 km). However, low parallel hills and related strata along much of the range often make the actual geologic breadth of the Metacomet Ridge wider than the more noticeable ridgeline crests, up to 10 miles (16 km) across in some areas. Significant river drainages of the Metacomet Ridge include the Connecticut River and tributaries (Falls River, Deerfield River, Westfield River, Farmington River, Coginchaug River); and, in southern Connecticut, the Quinnipiac River.
The Metacomet Ridge is surrounded by rural wooded, agricultural, and suburban landscapes, and is no more than 6 miles (10 km) from a number of urban hubs such as New Haven, Meriden, New Britain, Hartford, and Springfield. Small city centers abutting the ridge include Greenfield, Northampton, Amherst, Holyoke, West Hartford, Farmington, Wallingford, and Hamden.
## Geology
The Metacomet Ridge is the result of continental rifting processes that took place 200 million years ago during the Triassic and Jurassic periods. The basalt (also called traprock) crest of the Metacomet Ridge is the product of a series of massive lava flows hundreds of feet thick that welled up in faults created by the rifting apart of the North American continent from Eurasia and Africa. Essentially, the area now occupied by the Metacomet Ridge is a prehistoric rift valley which was once a branch of (or a parallel of) the major rift to the east that became the Atlantic Ocean.
Basalt is a dark colored extrusive volcanic rock. The weathering of iron-bearing minerals within it results in a rusty brown color when exposed to air and water, lending it a distinct reddish or purple–red hue. Basalt frequently breaks into octagonal and pentagonal columns, creating a unique "postpile" appearance. Extensive slopes made of fractured basalt talus are visible at the base of many of the cliffs along the Metacomet Ridge.
The basalt floods of lava that now form much of the Metacomet Ridge took place over a span of 20 million years. Erosion and deposition occurring between the eruptions deposited layers of sediment between the lava flows which eventually lithified into sedimentary rock layers within the basalt. The resulting "layer cake" of basalt and sedimentary rock eventually faulted and tilted upward (see fault-block). Subsequent erosion wore away many of the weaker sedimentary layers at a faster rate than the basalt layers, leaving the abruptly tilted edges of the basalt sheets exposed, creating the distinct linear ridge and dramatic cliff faces visible today on the western and northern sides of the ridge. Evidence of this layer-cake structure is visible on Mount Norwottuck of the Holyoke Range in Massachusetts. The summit of Norwottuck is made of basalt; directly beneath the summit are the Horse Caves, a deep overhang where the weaker sedimentary layer has worn away at a more rapid rate than the basalt layer above it. Mount Sugarloaf, Pocumtuck Ridge, and Mount Toby, also in Massachusetts, together present a larger "layer cake" example. The bottom layer is composed of arkose sandstone, visible on Mount Sugarloaf. The middle layer is composed of volcanic traprock, most visible on the Pocumtuck Ridge. The top layer is composed of a sedimentary conglomerate known as Mount Toby Conglomerate. Faulting and earthquakes during the period of continental rifting tilted the layers diagonally; subsequent erosion and glacial activity exposed the tilted layers of sandstone, basalt, and conglomerate visible today as three distinct mountain masses. Although Mount Toby and Mount Sugarloaf are not composed of traprock, they are part of the Metacomet Ridge by virtue of their origin via the same rifting and uplift processes.
Of all the summits that make up the Metacomet Ridge, West Rock, in New Haven, Connecticut, bears special mention because it was not formed by the volcanic flooding that created most of the traprock ridges. Rather, it is the remains of an enormous volcanic dike through which the basalt lava floods found access to the surface.
While the traprock cliffs remain the most obvious evidence of the prehistoric geologic processes of the Metacomet Ridge, the sedimentary rock of the ridge and surrounding terrain has produced equally significant evidence of prehistoric life in the form of Triassic and Jurassic fossils; in particular, dinosaur tracks. At a state park in Rocky Hill, Connecticut, more than 2,000 well preserved early Jurassic prints have been excavated. Other sites in Holyoke and Greenfield have likewise produced significant finds.
## Ecosystem
The Metacomet Ridge hosts a combination of microclimates unusual to the region. Dry, hot upper ridges support oak savannas, often dominated by chestnut oak and a variety of understory grasses and ferns. Eastern red-cedar, a dry-loving species, clings to the barren edges of cliffs. Backslope plant communities tend to be similar to the adjacent upland plateaus and nearby Appalachians, containing species common to the northern hardwood and oak-hickory forest ecosystem types. Eastern hemlock crowds narrow ravines, blocking sunlight and creating damp, cooler growing conditions with associated cooler climate plant species. Talus slopes are especially rich in nutrients and support a number of calcium-loving plants uncommon in the region. Miles of high cliffs make ideal raptor habitat, and the Metacomet Ridge is a seasonal raptor migration corridor.
Because the topography of the ridge offers such varied terrain, many species reach the northern or southern limit of their range on the Metacomet Ridge; others are considered rare nationally or globally. Examples of rare species that live on the ridge include the prickly pear cactus, peregrine falcon, northern copperhead, showy lady's slipper, yellow corydalis, ram's–head lady's slipper, basil mountain mint, and devil's bit lily.
The Metacomet Ridge is also an important aquifer. It provides municipalities and towns with public drinking water; reservoirs are located on Talcott Mountain, Totoket Mountain, Saltonstall Mountain, Bradley Mountain, Ragged Mountain, and the Hanging Hills in Connecticut. Reservoirs that supply metropolitan Springfield, Massachusetts, are located on Provin Mountain and East Mountain.
## History
### Pre-colonial era
Native Americans occupied the river valleys surrounding the Metacomet Ridge for at least 10,000 years. Major tribal groups active in the area included the Quinnipiac, Niantic, Pequot, Pocomtuc, and Mohegan. Traprock was used to make tools and arrowheads. Natives hunted game, gathered plants and fruits, and fished in local bodies of water around the Metacomet Ridge. Tracts of woodland in the river bottoms surrounding the ridges were sometimes burned to facilitate the cultivation of crops such as corn, squash, tobacco, and beans. Natives incorporated the natural features of the ridgeline and surrounding geography into their spiritual belief systems. Many Native American stories were in turn incorporated into regional colonial folklore. The giant stone spirit Hobbomock (or Hobomock), a prominent figure in many stories, was credited with diverting the course of the Connecticut River where it suddenly swings east in Middletown, Connecticut, after several hundred miles of running due south. Hobbomock is also credited with slaying a giant human-eating beaver who lived in a great lake that existed in the Connecticut River Valley of Massachusetts. According to native beliefs as retold by European settlers, the corpse of the beaver remains visible as the Pocumtuck Ridge portion of the Metacomet Ridge. Later, after Hobbomock diverted the course of the Connecticut River, he was condemned to sleep forever as the prominent man-like form of the Sleeping Giant, part of the Metacomet Ridge in southern Connecticut. There seems to be an element of scientific truth in some of these tales. For instance, the great lake that the giant beaver was said to have inhabited may very well have been the post-glacial Lake Hitchcock, extant 10,000 years ago; the giant beaver may have been an actual prehistoric species of bear–sized beaver, Castoroides ohioensis, that lived at that time. Many features of the Metacomet Ridge region still bear names with Native American origins: Besek, Pistapaug, Coginchaug, Mattabesett, Metacomet, Totoket, Norwottuck, Hockanum, Nonotuck, Pocumtuck, and others.
### Colonization, agricultural transformation, and industrialization
Europeans began settling the river valleys around the Metacomet Ridge in the mid–17th century. Forests were cut down or burned to make room for agriculture, resulting in the near complete denuding of the once contiguous forests of southern New England by the 19th century. Steep terrain like the Metacomet Ridge, while not suitable for planting crops, was widely harvested of timber as a result of the expanding charcoal industry that boomed before the mining of coal from the mid–Appalachian regions replaced it as a source of fuel. In other cases, ridgetop forests burned when lower elevation land was set afire, and some uplands were used for pasturing. Traprock was harvested from talus slopes of the Metacomet Ridge to build house foundations; copper ore was discovered at the base of Peak Mountain in northern Connecticut and was mined by prisoners incarcerated at Old Newgate Prison located there.
With the advent of industrialization in the 19th century, riverways beneath the Metacomet Ridge were dammed to provide power as the labor force expanded in nearby cities and towns. Logging to provide additional fuel for mills further denuded the ridges. Traprock and sandstone were quarried from the ridge for paving stones and architectural brownstone, either used locally or shipped via rail, barge, and boat.
### Transcendentalism
Increased urbanization and industrialization in Europe and North America resulted in an opposing aesthetic transcendentalist movement characterized in New England by the art of Thomas Cole, Frederic Edwin Church, and other Hudson River School painters, the work of landscape architects such as Frederick Law Olmsted, and the writings of philosophers such as Henry David Thoreau and Ralph Waldo Emerson. As was true of other scenic areas of New England, the philosophical, artistic, and environmental movement of transcendentalism transformed the Metacomet Ridge from a commercial resource to a recreational resource. Hotels, parks, and summer estates were built on the mountains from the mid-1800s to the early 20th century. Notable structures included summit hotels and inns on Mount Holyoke, Mount Tom, Sugarloaf Mountain, and Mount Nonotuck. Parks and park structures such as Poet's Seat in Greenfield, Massachusetts, and Hubbard Park (designed with the help of Frederick Law Olmsted) of the Hanging Hills of Meriden, Connecticut, were intended as respites from the urban areas they closely abutted. Estates such as Hill-Stead and Heublein Tower were built as mountain home retreats by local industrialists and commercial investors. Although public attention gradually shifted to more remote and less developed destinations with the advent of modern transportation and the westward expansion of the United States, the physical, cultural, and historic legacy of that early recreational interest in the Metacomet Ridge still supports modern conservation efforts. Estates became museums; old hotels and the lands they occupied, frequently subject to damaging fires, became state and municipal parkland through philanthropic donation, purchase, or confiscation for unpaid taxes. Nostalgia among former guests of hotels and estates contributed to the aesthetic of conservation.
### Trailbuilding
Interest in mountains as places to build recreational footpaths took root in New England with organizations such as the Appalachian Mountain Club, the Green Mountain Club, the Appalachian Trail Conference, and the Connecticut Forest and Park Association. Following the pioneering effort of the Green Mountain Club in the inauguration of Vermont's Long Trail in 1918, the Connecticut Forest and Park Association, spearheaded by Edgar Laing Heermance, created the 23-mile (37 km) Quinnipiac Trail on the Metacomet Ridge in southern Connecticut in 1928 and soon followed it up with the 51-mile (82 km) Metacomet Trail along the Metacomet Ridge in central and northern Connecticut. More than 700 miles (1,100 km) of "blue blaze trails" in Connecticut were completed by the association by the end of the 20th century. While the focus of the Appalachian Mountain Club was geared primarily toward the White Mountains of New Hampshire in its early years, as club membership broadened, so did interest in the areas closer to club members' homes. In the late 1950s, the 110-mile (180 km) Metacomet-Monadnock Trail was laid out by the Berkshire Chapter of the Appalachian Mountain Club under the leadership of Professor Walter M. Banfield of the University of Massachusetts Amherst. The trail follows the Metacomet Ridge for the first one–third of its length. Overall, trailbuilding had a pro-active effect on conservation awareness by thrusting portions of the Metacomet Ridge into the public consciousness.
### Suburbanization and land conservation
Although the Metacomet Ridge has abutted significant urban areas for nearly two hundred years, because of its rugged, steep, and rocky terrain, the ridge was long considered an undesirable place to build a home except for those wealthy enough to afford such a luxury. However, suburbanization through urban exodus and automobile culture, and modern construction techniques and equipment have created a demand for homes on and around the once undeveloped Metacomet Ridge and its surrounding exurban communities. As of 2007, the metropolitan areas bordering the range—New Haven, Meriden, New Britain, Hartford, Springfield and Greenfield—had a combined population of more than 2.5 million people. Populations in exurban towns around the range in Connecticut increased 7.6 percent between the mid-1990s and 2000, and building permits increased 26 percent in the same period. Considered an attractive place to build homes because of its views and proximity to urban centers and highways, the Metacomet Ridge has become a target for both developers and advocates of land conservation. Quarrying, supported by the increased need for stone in local and regional construction projects, has been especially damaging to the ecosystem, public access, and visual landscape of the ridge. At the same time, the boom in interest in outdoor recreation in the latter 20th century has made the Metacomet Ridge an attractive "active leisure" resource. In response to public interest in the ridge and its surrounding landscapes, more than twenty local non-profit organizations have become involved in conservation efforts on and around the ridge and surrounding region. Most of these organizations came into being between 1970 and 2000, and nearly all of them have evidenced a marked increase in conservation activity since 1990. Several international and national organizations have also become interested in the Metacomet Ridge, including The Nature Conservancy, the Sierra Club, and the Trust for Public Land.
## Recreation
Steepness, long cliff-top views, and proximity to urban areas make the Metacomet Ridge a significant regional outdoor recreation resource. The ridge is traversed by more than 200 miles (320 km) of long-distance and shorter hiking trails. Noteworthy trails in Connecticut include the 51-mile (82 km) Metacomet Trail, the 50-mile (80 km) Mattabesett Trail, the 23-mile (37 km) Quinnipiac Trail, and the 6-mile (10 km) Regicides Trail. Massachusetts trails include the 110-mile (177 km) Metacomet-Monadnock Trail, the 47-mile (76 km) Robert Frost Trail, and the 15-mile (24 km) Pocumtuck Ridge Trail. Site-specific activities enjoyed on the ridge include rock climbing, bouldering, fishing, boating, hunting, swimming, backcountry skiing, cross-country skiing, trail running, bicycling, and mountain biking. Trails on the ridge are open to snowshoeing, birdwatching, and picnicking as well. The Metacomet Ridge hosts more than a dozen state parks, reservations, and municipal parks, and more than three dozen nature preserves and conservation properties. Seasonal automobile roads with scenic vistas are located at Poet's Seat Park, Mount Sugarloaf State Reservation, J.A. Skinner State Park, the Mount Tom State Reservation, Hubbard Park, and West Rock Ridge State Park; these roads are also used for bicycling and cross-country skiing. Camping and campfires are discouraged on most of the Metacomet Ridge, especially in Connecticut. Museums, historic sites, interpretive centers, and other attractions can be found on or near the Metacomet Ridge; some offer outdoor concerts, celebrations, and festivals.
## Conservation
Because of its narrowness, proximity to urban areas, and fragile ecosystems, the Metacomet Ridge is most endangered by encroaching suburban sprawl. Quarry operations, also a threat, have obliterated several square miles of traprock ridgeline in both Massachusetts and Connecticut. Ridges and mountains affected include Trimountain, Bradley Mountain, Totoket Mountain, Chauncey Peak, Rattlesnake Mountain, East Mountain, Pocumtuck Ridge, and the former Round Mountain of the Holyoke Range. The gigantic man-like profile of the Sleeping Giant, a traprock massif visible for more than 30 miles (50 km) in south central Connecticut, bears quarrying scars on its "head". Mining there was halted by the efforts of local citizens and the Sleeping Giant Park Association.
Development and quarrying threats to the Metacomet Ridge have resulted in public open space acquisition efforts through collective purchasing and fundraising, active solicitation of land donations, securing of conservation easements, protective and restrictive legislation agreements limiting development, and, in a few cases, land taking by eminent domain. Recent conservation milestones include the acquisition of a defunct ski area on Mount Tom, the purchase of the ledges and summits of Ragged Mountain through the efforts of a local rock climbing club and the Nature Conservancy, and the inclusion of the ridgeline from North Branford, Connecticut, to Belchertown, Massachusetts, in a study by the National Park Service for a new National Scenic Trail now tentatively called the New England National Scenic Trail. In Connecticut, a focus on ridgeline protection began with the creation of the Metacomet Ridge Conservation Compact, ratified on Earth Day, April 22, 1998. The compact serves as a guide for local land use decision-makers when discussing land use issues in Metacomet or trap rock ridgeline areas within the state. Ultimately signed by eighteen of the nineteen ridgeline communities, the agreement committed local conservation commissions to strive to protect these ridges. The 18 towns whose conservation commissions signed the pact are: Avon, Berlin, Bloomfield, Branford, Durham, East Granby, East Haven, Farmington, Guilford, Meriden, Middlefield, North Branford, Plainville, Simsbury, Suffield, Wallingford, West Hartford, and West Haven.
## See also
- List of Metacomet Ridge summits
- List of subranges of the Appalachian Mountains
- Traprock mountains in other parts of the world
# Gilbert du Motier, Marquis de Lafayette
Marie-Joseph Paul Yves Roch Gilbert du Motier de La Fayette, Marquis de La Fayette (6 September 1757 – 20 May 1834), known in the United States as Lafayette (/ˌlɑːfiˈɛt, ˌlæf-/ LA(H)F-ee-ET), was a French nobleman and military officer who volunteered to join the Continental Army, led by General George Washington, in the American Revolutionary War. Lafayette was ultimately permitted to command Continental Army troops in the decisive Siege of Yorktown in 1781, the Revolutionary War's final major battle that secured American independence. After returning to France, Lafayette became a key figure in the French Revolution of 1789 and the July Revolution of 1830, and he continues to be celebrated as a hero in both France and the United States.
Lafayette was born into a wealthy land-owning family in Chavaniac in the province of Auvergne in south-central France. He followed the family's martial tradition and was commissioned an officer at age 13. He became convinced that the American revolutionary cause was noble, and he traveled to the New World seeking glory in it. He was made a major general at age 19, but he was initially not given American troops to command. He fought with the Continental Army at the Battle of Brandywine near Chadds Ford, Pennsylvania, where he was wounded but managed to organize an orderly retreat, and he served with distinction in the Battle of Rhode Island. In the middle of the war, he returned home to France to lobby for an increase in French support for the American Revolution. He returned to America in 1780, and was given senior positions in the Continental Army. In 1781, troops under his command in Virginia blocked a British army led by Lord Cornwallis until other American and French forces could position themselves for the decisive siege of Yorktown.
Lafayette returned to France and was appointed to the Assembly of Notables in 1787, convened in response to the fiscal crisis. He was elected a member of the Estates General of 1789, which brought together representatives from the three traditional orders of French society: the clergy, the nobility, and the commoners. After the National Constituent Assembly was formed, he helped to write the Declaration of the Rights of Man and of the Citizen with Thomas Jefferson's assistance. This document was inspired by the United States Declaration of Independence, which was authored primarily by Jefferson, and invoked natural law to establish basic principles of the democratic nation-state. He also advocated for the abolition of slavery, in keeping with the philosophy of natural rights. After the storming of the Bastille, he was appointed commander-in-chief of France's National Guard and tried to steer a middle course through the years of revolution. In August 1792, radical factions ordered his arrest, and he fled to the Austrian Netherlands. He was captured by Austrian troops and spent more than five years in prison.
Lafayette returned to France after Napoleon Bonaparte secured his release in 1797, though he refused to participate in Napoleon's government. After the Bourbon Restoration of 1814, he became a liberal member of the Chamber of Deputies, a position which he held for most of the remainder of his life. In 1824, President James Monroe invited him to the United States as the nation's guest, where he visited all 24 states in the union and met a rapturous reception. During France's July Revolution of 1830, he declined an offer to become the French dictator. Instead, he supported Louis-Philippe as king, but turned against him when the monarch became autocratic. He died on 20 May 1834 and is buried in Picpus Cemetery in Paris, under soil from Bunker Hill. He is sometimes known as "The Hero of the Two Worlds" for his accomplishments in the service of both France and the United States.
## Early life
Lafayette was born on 6 September 1757 to Michel Louis Christophe Roch Gilbert Paulette du Motier, Marquis de La Fayette, colonel of grenadiers, and Marie Louise Jolie de La Rivière, at the Château de Chavaniac, in Chavaniac-Lafayette, near Le Puy-en-Velay, in the province of Auvergne (now Haute-Loire).
Lafayette's lineage was likely one of the oldest and most distinguished in Auvergne and, perhaps, in all of France. Males of the Lafayette family enjoyed a reputation for courage and chivalry and were noted for their contempt for danger. One of Lafayette's early ancestors, Gilbert de Lafayette III, a Marshal of France, had been a companion-at-arms of Joan of Arc's army during the Siege of Orléans in 1429. According to legend, another ancestor acquired the crown of thorns during the Sixth Crusade.
His non-Lafayette ancestors are also notable; his great-grandfather (his mother's maternal grandfather) was the Comte de La Rivière, until his death in 1770 commander of the Mousquetaires du Roi, or "Black Musketeers", King Louis XV's personal horse guard. Lafayette's paternal uncle Jacques-Roch died on 18 January 1734 while fighting the Austrians at Milan in the War of the Polish Succession; upon his death, the title of marquis passed to his brother Michel.
Lafayette's father likewise died on the battlefield. On 1 August 1759, Michel de Lafayette was struck by a cannonball while fighting an Anglo-German army at the Battle of Minden in Westphalia. Lafayette became marquis and Lord of Chavaniac, but the estate went to his mother. Perhaps devastated by the loss of her husband, she went to live in Paris with her father and grandfather, leaving Lafayette to be raised in Chavaniac-Lafayette by his paternal grandmother, Mme de Chavaniac, who had brought the château into the family with her dowry.
In 1768, when Lafayette was 11, he was summoned to Paris to live with his mother and great-grandfather at the comte's apartments in Luxembourg Palace. The boy was sent to school at the Collège du Plessis, part of the University of Paris, and it was decided that he would carry on the family martial tradition. The comte, the boy's great-grandfather, enrolled the boy in a program to train future Musketeers. Lafayette's mother and grandfather died, on 3 and 24 April 1770 respectively, leaving Lafayette an income of 25,000 livres. Upon the death of an uncle, the 12-year-old Lafayette inherited a yearly income of 120,000 livres.
In May 1771, aged less than 14, Lafayette was commissioned an officer in the Musketeers, with the rank of sous-lieutenant. His duties, which included marching in military parades and presenting himself to King Louis, were mostly ceremonial and he continued his studies as usual.
At this time, Jean-Paul-François de Noailles, Duc d'Ayen was looking to marry off some of his five daughters. The young Lafayette, aged 14, seemed a good match for his 12-year-old daughter, Marie Adrienne Françoise, and the duke spoke to the boy's guardian (Lafayette's uncle, the new comte) to negotiate a deal. However, the arranged marriage was opposed by the duke's wife, who felt the couple, and especially her daughter, were too young. The matter was settled by agreeing not to mention the marriage plans for two years, during which time the two spouses-to-be would meet from time to time in casual settings and get to know each other better. The scheme worked; the two fell in love, and were happy together from the time of their marriage in 1774 until her death in 1807.
## Departure from France
### Finding a cause
After the marriage contract was signed in 1773, Lafayette lived with his young wife in his father-in-law's house in Versailles. He continued his education, both at the riding school of Versailles, where his fellow students included the future Charles X, and at the prestigious Académie de Versailles. He was given a commission as a lieutenant in the Noailles Dragoons in April 1773, the transfer from the royal regiment being done at the request of Lafayette's father-in-law.
In 1775, Lafayette took part in his unit's annual training in Metz, where he met Charles François de Broglie, Marquis of Ruffec, the Army of the East's commander. At dinner, both men discussed the ongoing revolt against British rule in the Thirteen Colonies. Historian Marc Leepson argued that Lafayette, "who had grown up loathing the British for killing his father", felt that an American victory in the conflict would diminish Britain's stature internationally. Another historian noted that Lafayette had recently become a Freemason, and news of the revolt "fired his chivalric—and now Masonic—imagination with descriptions of Americans as 'people fighting for liberty'". A third historian, James R. Gaines, recalls Lafayette attending a dinner at this time with the Duke of Gloucester (brother of British King George III), who complained about the Americans who had objected to British rule and mocked their beliefs in equality and the right of the people to rule themselves. Lafayette later recalled the dinner as a turning point in his thinking, a conviction that deepened when he learned that Washington was seeking recruits for the Continental Army.
In September 1775, when Lafayette turned 18, he returned to Paris and received the captaincy in the Dragoons he had been promised as a wedding present. In December, his first child, Henriette, was born. During these months, Lafayette became convinced that the American Revolution reflected his own beliefs, saying "My heart was dedicated." The year 1776 saw delicate negotiations between American agents, including Silas Deane, and Louis XVI and his foreign minister, Comte Charles de Vergennes. The king and his minister hoped that by supplying the Americans with arms and officers, they might restore French influence in North America, and exact revenge against Britain for France's defeat in the Seven Years' War. When Lafayette heard that French officers were being sent to America, he demanded to be among them. He met Deane, and gained inclusion despite his youth. On 7 December 1776, Deane enlisted Lafayette as a major general.
The plan to send French officers (as well as other forms of aid) to America came to nothing when the British heard of it and threatened war. Lafayette's father-in-law, de Noailles, scolded the young man and told him to go to London and visit the Marquis de Noailles, the French ambassador to Britain and Lafayette's uncle by marriage, which he did in February 1777. In the interim, he did not abandon his plans to go to America. Lafayette was presented to George III, and spent three weeks in London. On his return to France, he went into hiding from his father-in-law (and superior officer), writing to him that he was planning to go to America. De Noailles was furious, and convinced Louis to issue a decree forbidding French officers from serving in America, specifically naming Lafayette. Vergennes may have persuaded the king to order Lafayette's arrest, though this is uncertain.
### Departure for America
Lafayette learned that the Continental Congress lacked funds for his voyage, so he bought the sailing ship Victoire with his own money for 112,000 pounds. He journeyed to Bordeaux, where Victoire was being prepared for her trip, and he sent word asking for information on his family's reaction. The response, which included letters from his wife and other relatives, threw him into emotional turmoil. Soon after departure, he ordered the ship turned around and returned to Bordeaux, to the frustration of the officers traveling with him. The army commander there ordered Lafayette to report to his father-in-law's regiment in Marseilles. De Broglie hoped to become a military and political leader in America, and he met Lafayette in Bordeaux and convinced him that the government actually wanted him to go. This was not true, though there was considerable public support for Lafayette in Paris, where the American cause was popular. Lafayette wanted to believe it, and pretended to comply with the order to report to Marseilles, going only a few kilometres east before turning around and returning to his ship. Victoire set sail from Pauillac on the shores of the Gironde on 25 March 1777. To avoid being identified by British spies or agents of the French Crown, however, Lafayette was not on board; the vessel moored in Pasaia on the Basque coast, where it was supplied with 5,000 rifles and ammunition from the factories in Gipuzkoa. Lafayette joined Victoire there, and she departed for America on 26 April 1777. The two-month journey to the New World was marked by seasickness and boredom. The ship's captain, Lebourcier, intended to stop in the West Indies to sell cargo, but Lafayette was fearful of arrest, so he bought the cargo to avoid docking at the islands. He landed on North Island near Georgetown, South Carolina, on 13 June 1777.
## American Revolution
Upon his arrival, Lafayette met Major Benjamin Huger, a wealthy landowner, and stayed with him for two weeks before departing for the revolutionary capital of Philadelphia. The Second Continental Congress, convened in Philadelphia, had been overwhelmed by French officers recruited by Deane, many of whom could not speak English or lacked military experience. Lafayette had learned some English en route and became fluent within a year of his arrival, and his Masonic membership opened many doors in Philadelphia. After Lafayette offered to serve without pay, Congress commissioned him a major general on 31 July 1777. Lafayette's advocates included the recently arrived American envoy to France, Benjamin Franklin, who by letter urged Congress to accommodate the young Frenchman.
General George Washington, commander in chief of the Continental Army, came to Philadelphia to brief Congress on military affairs. Lafayette met him at a dinner on 5 August 1777; according to Leepson, "the two men bonded almost immediately." Washington was impressed by the young man's enthusiasm and was inclined to think well of a fellow Mason; Lafayette was simply in awe of the commanding general. General Washington took the Frenchman to view his military camp; when Washington expressed embarrassment at its state and that of the troops, Lafayette responded, "I am here to learn, not to teach." He became a member of Washington's staff, although confusion existed regarding his status. Congress regarded his commission as honorary, while he considered himself a full-fledged commander who would be given control of a division when Washington deemed him prepared. Washington told Lafayette that a division would not be possible as he was of foreign birth, but that he would be happy to hold him in confidence as "friend and father".
### Brandywine, Valley Forge, and Albany
Lafayette first saw combat at the Battle of Brandywine near Chadds Ford, Pennsylvania, on 11 September 1777. British commander Sir William Howe made plans to occupy Philadelphia by moving troops south by ship to Chesapeake Bay rather than approaching the city through the Delaware Bay, where there was a heavy Continental Army presence, and then bringing British troops over land to the city. After the British outflanked the Americans, Washington sent Lafayette to join General John Sullivan. Upon his arrival, Lafayette went with the Third Pennsylvania Brigade, under Brigadier Thomas Conway, and attempted to rally the unit to face the attack. British and Hessian troops continued to advance with their superior numbers, and Lafayette was shot in the leg. During the American retreat, Lafayette rallied the troops, allowing a more orderly pullback, before being treated for his wound. After the battle, Washington cited him for "bravery and military ardour" and recommended him for the command of a division in a letter to Congress, which was hastily evacuating, as the British occupied Philadelphia later that month.
Lafayette returned to the field in November after two months of recuperation in the Moravian settlement of Bethlehem, and received command of the division previously led by Major General Adam Stephen. He assisted General Nathanael Greene in reconnaissance of British positions in New Jersey; with 300 soldiers, he defeated a numerically superior Hessian force in Gloucester, on 24 November 1777.
Lafayette stayed at Washington's encampment at Valley Forge in the winter of 1777–78, and shared the hardship of his troops. There, the Board of War, led by Horatio Gates, asked Lafayette to prepare an invasion of Quebec from Albany, New York. When Lafayette arrived in Albany, he found too few men to mount an invasion. He wrote to Washington of the situation, and made plans to return to Valley Forge. Before departing, he recruited the Oneida tribe to the American side. The Oneida referred to Lafayette as Kayewla (fearsome horseman). In Valley Forge, he criticized the board's decision to attempt an invasion of Quebec in winter. The Continental Congress agreed, and Gates left the board. Meanwhile, treaties signed by America and France were made public in March 1778, and France formally recognized American independence.
### Barren Hill, Monmouth, and Rhode Island
Faced with the prospect of French intervention, the British sought to concentrate their land and naval forces in New York City, and they began to evacuate Philadelphia in May 1778. Washington dispatched Lafayette with a 2,200-man force on 18 May to reconnoiter near Barren Hill, Pennsylvania. The next day, the British heard that he had made camp nearby and sent 5,000 men to capture him. General Howe led a further 6,000 soldiers on 20 May and ordered an attack on Lafayette's left flank. The flank scattered, and Lafayette organized a retreat while the British remained indecisive. To feign numerical superiority, Lafayette ordered men to appear from the woods on an outcropping (now Lafayette Hill, Pennsylvania) and to fire upon the British periodically. His troops simultaneously escaped via a sunken road, and he was then able to cross Matson's Ford with the remainder of his force.
The British then marched from Philadelphia toward New York. The Continental Army followed and finally attacked at Monmouth Courthouse in central New Jersey. Washington appointed General Charles Lee to lead the attacking force at the Battle of Monmouth, and Lee moved against the British flank on 28 June. However, he gave conflicting orders soon after fighting began, causing chaos in the American ranks. Lafayette sent a message to Washington to urge him to the front; upon his arrival, he found Lee's men in retreat. Washington relieved Lee, took command, and rallied the American force. After suffering significant casualties at Monmouth, the British withdrew in the night and successfully reached New York.
The French fleet arrived at Delaware Bay on 8 July 1778 under Admiral d'Estaing, with whom General Washington planned to attack Newport, Rhode Island, the other major British base in the north. Lafayette and General Greene were sent with a 3,000-man force to participate in the attack. Lafayette wanted to control a joint Franco-American force but was rebuffed by the admiral. On 9 August, the American land force attacked the British without consulting d'Estaing. The Americans asked d'Estaing to place his ships in Narragansett Bay, but he refused and sought to defeat the British Royal Navy at sea. The fighting was inconclusive as a storm scattered and damaged both fleets.
D'Estaing moved his ships north to Boston for repairs, where the fleet faced an angry demonstration from Bostonians who considered the French departure from Newport to be a desertion. John Hancock and Lafayette were dispatched to calm the situation, and Lafayette then returned to Rhode Island to prepare the retreat made necessary by d'Estaing's departure. For these actions, he was cited by the Continental Congress for "gallantry, skill, and prudence". He wanted to expand the war to fight the British elsewhere in America and even in Europe under the French flag, but he found little interest in his proposals. In October 1778, he requested permission from Washington and Congress to go home on leave. They agreed, with Congress voting to give him a ceremonial sword to be presented to him in France. His departure was delayed by illness, and he sailed for France in January 1779.
### Return to France
Lafayette reached Paris in February 1779 where he was placed under house arrest for eight days for disobeying the king by going to America. This was merely face-saving by Louis XVI; Lafayette was given a hero's welcome and was soon invited to hunt with the king. The American envoy was ill, so Benjamin Franklin's grandson William Temple Franklin presented Lafayette with the gold-encrusted sword commissioned by the Continental Congress.
Lafayette pushed for an invasion of Britain, with himself to have a major command in the French forces. Spain was now France's ally against Britain and sent ships to the English Channel in support. The Spanish ships did not arrive until August 1779 and were met by a faster squadron of British ships that the combined French and Spanish fleet could not catch. In September, the invasion was abandoned, and Lafayette turned his hopes toward returning to America. In December 1779, Adrienne gave birth to Georges Washington Lafayette.
Lafayette worked with Benjamin Franklin to secure the promise of 6,000 soldiers to be sent to America, commanded by General Jean-Baptiste de Rochambeau. Lafayette would resume his position as a major general of American forces, serving as liaison between Rochambeau and Washington, who would be in command of both nations' forces. In March 1780, he departed from Rochefort for America aboard the frigate Hermione, arriving in Boston on 27 April 1780.
### Second voyage to America
On his return, Lafayette found the American cause at a low ebb, rocked by several military defeats, especially in the south. Lafayette was greeted in Boston with enthusiasm, seen as "a knight in shining armor from the chivalric past, come to save the nation". He journeyed southwest and on 10 May 1780 had a joyous reunion with Washington at Morristown, New Jersey. The general and his officers were delighted to hear that the large French force promised to Lafayette would be coming to their aid. Washington, aware of Lafayette's popularity, had him write (with Alexander Hamilton to correct his spelling) to state officials to urge them to provide more troops and provisions to the Continental Army. This bore fruit in the coming months, as Lafayette awaited the arrival of the French fleet. However, when the fleet arrived, there were fewer men and supplies than expected, and Rochambeau decided to wait for reinforcements before seeking battle with the British. This was unsatisfactory to Lafayette, who proposed grandiose schemes for the taking of New York City and other areas, and Rochambeau briefly refused to receive Lafayette until the young man apologized. Washington counseled the marquis to be patient.
That summer Washington placed Lafayette in charge of a division of troops. The marquis spent lavishly on his command, which patrolled North Jersey and adjacent New York state. Lafayette saw no significant action, and in November, Washington disbanded the division, sending the soldiers back to their state regiments. The war continued badly for the Americans, with most battles in the south going against them, and General Benedict Arnold abandoning them for the British side.
Lafayette spent the first part of the winter of 1780–81 in Philadelphia, where the American Philosophical Society elected him its first foreign member. Congress asked him to return to France to lobby for more men and supplies, but Lafayette refused, sending letters instead.
After the Continental victory at the Battle of Cowpens in the Province of South Carolina, in January 1781, Washington ordered Lafayette to re-form his force in Philadelphia and go south to the Colony of Virginia to link up with troops commanded by Baron von Steuben. The combined force was to try to trap British forces commanded by Benedict Arnold, with French ships preventing his escape by sea. If Lafayette was successful, Arnold was to be summarily hanged. British command of the seas prevented the plan, though Lafayette and a small part of his force were able to reach von Steuben in Yorktown, Virginia. Von Steuben sent a plan to Washington, proposing to use land forces and French ships to trap the main British force under Lord Cornwallis. When he received no new orders from Washington, Lafayette began to move his troops north toward Philadelphia, only to be ordered to Virginia to assume military command there. An outraged Lafayette assumed he was being abandoned in a backwater while decisive battles took place elsewhere, and objected to his orders in vain. He also sent letters to the Chevalier de la Luzerne, French ambassador in Philadelphia, describing how ill-supplied his troops were. As Lafayette hoped, la Luzerne sent his letter on to France with a recommendation of massive French aid, which, after being approved by the king, would play a crucial part in the battles to come. Washington, fearing a letter might be captured by the British, could not tell Lafayette that he planned to trap Cornwallis in a decisive campaign.
### Virginia and Yorktown
Lafayette evaded Cornwallis' attempts to capture him in Richmond. In June 1781, Cornwallis received orders from London to proceed to the Chesapeake Bay and to oversee construction of a port, in preparation for an overland attack on Philadelphia. As the British column traveled, Lafayette sent small squads that would appear unexpectedly, attacking the rearguard or foraging parties, and giving the impression that his forces were larger than they were.
On 4 July, the British left Williamsburg and prepared to cross the James River. Cornwallis sent only an advance guard to the south side of the river, hiding many of his other troops in the forest on the north side, hoping to ambush Lafayette. On 6 July, Lafayette ordered General "Mad" Anthony Wayne to strike British troops on the north side with roughly 800 soldiers. Wayne found himself vastly outnumbered, and, instead of retreating, led a bayonet charge. The charge bought time for the Americans, and the British did not pursue. The Battle of Green Spring was a victory for Cornwallis, but the American army was bolstered by the display of courage by the men.
By August, Cornwallis had established the British at Yorktown, and Lafayette took up position on Malvern Hill, stationing artillery around the British, who were positioned close to the York River with orders to construct fortifications to protect the British ships in Hampton Roads. Lafayette's containment trapped the British when the French fleet arrived and won the Battle of the Virginia Capes, depriving Cornwallis of naval protection. On 14 September 1781, Washington's forces joined Lafayette's. On 28 September, with the French fleet blockading the British, the combined forces laid siege to Yorktown. On 14 October, Lafayette's 400 men on the American right took Redoubt 9 after Alexander Hamilton's forces had charged Redoubt 10 in hand-to-hand combat. These two redoubts were key to breaking the British defenses. After a failed British counter-attack, Cornwallis surrendered on 19 October 1781.
## Hero of two worlds
Yorktown was the last major land battle of the American Revolution, but the British still held several major port cities. Lafayette wanted to lead expeditions to capture them, but Washington felt that he would be more useful seeking additional naval support from France. Congress appointed him its advisor to America's envoys in Europe, Benjamin Franklin in Paris, John Jay in Madrid, and John Adams in The Hague, instructing them "to communicate and agree on everything with him". Congress also sent Louis XVI an official letter of commendation on the marquis's behalf.
Lafayette left Boston for France on 18 December 1781, where he was welcomed as a hero, and he was received at the Palace of Versailles on 22 January 1782. He witnessed the birth of his daughter, whom he named Marie-Antoinette Virginie upon Thomas Jefferson's recommendation. He was promoted to maréchal de camp, skipping numerous ranks, and he was made a Knight of the Order of Saint Louis. He worked on a combined French and Spanish expedition against the British West Indies in 1782, as no formal peace treaty had yet been signed. The Treaty of Paris was signed between Great Britain and the United States in 1783, which made the expedition unnecessary; Lafayette took part in those negotiations.
Lafayette worked with Jefferson to establish trade agreements between the United States and France which aimed to reduce America's debt to France. He joined the Society of the Friends of the Blacks, a French abolitionist group which advocated for the abolition of the Atlantic slave trade and equal rights for free people of color. He urged the emancipation of American slaves and their establishment as tenant farmers in a 1783 letter to Washington, who was a slave owner. Washington declined to free his slaves, though he expressed interest in the young man's ideas, and Lafayette purchased three slave plantations in Cayenne, French Guiana to house the project.
Lafayette visited America in 1784–1785 where he enjoyed an enthusiastic welcome, visiting all the states. The trip included a visit to Washington's farm at Mount Vernon on 17 August. He addressed the Virginia House of Delegates where he called for "liberty of all mankind" and urged the abolition of slavery, and he urged the Pennsylvania Legislature to help form a federal union (the states were then bound by the Articles of Confederation). He visited the Mohawk Valley in New York to participate in peace negotiations with the Iroquois, some of whom he had met in 1778. He received an honorary degree from Harvard University, a portrait of Washington from the city of Boston, and a bust from the state of Virginia. Maryland's legislature honored him by making him and his male heirs "natural born Citizens" of the state, which made him a natural-born citizen of the United States after the 1789 ratification of the Constitution. Lafayette later boasted that he had become an American citizen before the concept of French citizenship existed. Connecticut, Massachusetts, and Virginia also granted him citizenship.
Lafayette made the Hôtel de La Fayette in Paris's rue de Bourbon an important meeting place for Americans there. Benjamin Franklin, John and Sarah Jay, and John and Abigail Adams met there every Monday and dined in company with Lafayette's family and the liberal nobility, including Clermont-Tonnerre and Madame de Staël. Lafayette continued to work on lowering trade barriers in France to American goods, and on assisting Franklin and Jefferson in seeking treaties of amity and commerce with European nations. He also sought to correct the injustices that Huguenots in France had endured since the revocation of the Edict of Nantes a century before.
## French Revolution
### Assembly of Notables and Estates-General
On 29 December 1786, King Louis XVI called an Assembly of Notables, in response to France's fiscal crisis. The king appointed Lafayette to the body, which convened on 22 February 1787. In speeches, Lafayette decried those with connections at court who had profited from advance knowledge of government land purchases; he advocated reform. He called for a "truly national assembly", which represented the whole of France. Instead, the king chose to summon an Estates General, to convene in 1789. Lafayette was elected as a representative of the nobility (the Second Estate) from Riom. The Estates General, traditionally, cast one vote for each of the three Estates: clergy, nobility, and commons, meaning the much larger commons was generally outvoted.
The Estates General convened on 5 May 1789; debate began on whether the delegates should vote by head or by Estate. If by Estate, then the nobility and clergy would be able to outvote the commons; if by head, then the larger Third Estate could dominate. Before the meeting, as a member of the "Committee of Thirty", Lafayette agitated for voting by head, rather than estate. He could not get a majority of his own Estate to agree, but the clergy was willing to join with the commons, and on 17 June, the group declared itself the National Assembly. The loyalist response was to lock out the group, including Lafayette, while those who had not supported the Assembly met inside. This action led to the Tennis Court Oath, where the excluded members swore not to separate until a constitution was established. The Assembly continued to meet, and on 11 July 1789, Lafayette presented a draft of the "Declaration of the Rights of Man and of the Citizen" to the Assembly, written by himself in consultation with Jefferson. The next day, after the dismissal of Finance Minister Jacques Necker (who was seen as a reformer), lawyer Camille Desmoulins assembled between 700 and 1,000 armed insurgents. The king had the royal army under the duc de Broglie surround Paris. On 14 July, the fortress known as the Bastille was stormed by the insurgents.
### National Guard, Versailles, and Day of Daggers
On 15 July, Lafayette was acclaimed commander-in-chief of the Parisian National Guard, an armed force established to maintain order under the control of the Assembly, with responsibilities that included military service as well as policing, traffic control, sanitation, and lighting, among other matters of local administration. Lafayette proposed the name and the symbol of the group: a blue, white, and red cockade. This combined the red and blue colors of the city of Paris with the royal white, and originated the French tricolor. He faced a difficult task as head of the Guard; the king and many loyalists considered him and his supporters to be little better than revolutionaries, whereas many commoners felt that he was helping the king to keep power via this position.
The National Assembly approved the Declaration on 26 August, but the king rejected it on 2 October. Three days later, a Parisian crowd led by women fishmongers marched to Versailles in response to the scarcity of bread. Members of the National Guard followed the march, with Lafayette reluctantly leading them. At Versailles, the king accepted the Assembly's votes on the Declaration, but refused requests to go to Paris, and the crowd broke into the palace at dawn. Lafayette took the royal family onto the palace balcony and attempted to restore order, but the crowd insisted that the king and his family move to Paris and the Tuileries Palace. The king came onto the balcony and the crowd started chanting "Vive le Roi!" Marie Antoinette then appeared with her children, but she was told to send the children back in. She returned alone and people shouted to shoot her, but she stood her ground and no one opened fire. Lafayette kissed her hand, leading to cheers from the crowd.
Lafayette later initiated an investigation within the National Assembly into what became known as the October Days, which led to the production of the Procédure Criminelle by Jean-Baptiste-Charles Chabroud, a 688-page document accumulating evidence and analysis on the exact events and procedures of the March on Versailles, in the hope of condemning those who had incited the mob (in his mind, Mirabeau and the Duc d'Orléans). However, the National Assembly thought condemning two significant revolutionaries would hurt the progress and public reception of the revolutionary administration.
As leader of the National Guard, Lafayette attempted to maintain order and steer a middle ground, even as the radicals gained increasing influence. He and Paris' mayor Jean Sylvain Bailly instituted a political club on 12 May 1790 called the Society of 1789, whose intention was to provide a counterbalance to the influence of the radical Jacobins.
Lafayette helped organize and lead the assembly at the Fête de la Fédération on 14 July 1790, where he, alongside the National Guard and the king, took the civic oath on the Champ de Mars, vowing to "be ever faithful to the nation, to the law, and to the king; to support with our utmost power the constitution decreed by the National Assembly, and accepted by the king." In the eyes of the royalist factions, Lafayette took a large risk in assembling a largely undisciplined group on the Champ de Mars, fearing for the safety of the king; for the Jacobins, the event confirmed Lafayette's royalist tendencies and his encouragement of the common people's support of the monarchy.
Lafayette continued to work for order in the coming months. He and part of the National Guard left the Tuileries on 28 February 1791 to handle a conflict in Vincennes, and hundreds of armed nobles arrived at the Tuileries to defend the king while he was gone. However, there were rumors that these nobles had come to take the king away and place him at the head of a counter-revolution. Lafayette quickly returned to the Tuileries and disarmed the nobles after a brief standoff. The event came to be known as the Day of Daggers, and it boosted Lafayette's popularity with the French people for his quick actions to protect the king. Nonetheless, the royal family were increasingly prisoners in their palace. The National Guard disobeyed Lafayette on 18 April and prevented the king from leaving for Saint-Cloud where he planned to attend Mass.
### Flight to Varennes
A plot known as the Flight to Varennes almost enabled the king to escape from France on 20 June 1791. The king and queen had escaped from the Tuileries Palace, essentially under the watch of Lafayette and the National Guard. On being notified of their escape, Lafayette sent the Guard out in many directions to retrieve the fleeing monarchs. Five days later, Lafayette and the National Guard led the royal carriage back into Paris amid a mob calling for the heads of the monarchs as well as Lafayette's. Lafayette had been responsible for the royal family's custody as leader of the National Guard, and he was thus blamed by extremists such as Georges Danton, who declared in a speech directed at Lafayette: "You swore that the king would not leave. Either you sold out your country or you are stupid for having made a promise for a person whom you could not trust.... France can be free without you." He was further called a traitor to the people by Maximilien Robespierre. These accusations made Lafayette appear a royalist, damaged his reputation in the eyes of the public, and strengthened the hands of the Jacobins and other radicals in opposition to him. He continued to urge the constitutional rule of law, but he was drowned out by the mob and its leaders.
### Champs de Mars massacre
Lafayette's public standing continued to decline through the latter half of 1791. The radical Cordeliers organized an event at the Champ de Mars on 17 July to gather signatures on a petition to the National Assembly that it either abolish the monarchy or allow its fate to be decided in a referendum. The assembled crowd was estimated to be anywhere from 10,000 to 50,000 people. The protesters found two men hiding under an altar at the event, accused them of being spies or of planting explosives, and eventually hanged them from lampposts and placed their heads on the ends of pikes. Lafayette rode into the Champ de Mars at the head of his troops to restore order, but they were met with stones thrown from the crowd. An assassination attempt was made on Lafayette, but the gunman's pistol misfired at close range. The soldiers first fired above the crowd to intimidate and disperse them, which only led to retaliation and eventually the death of two volunteer chasseurs. The National Guard was then ordered to fire on the crowd, wounding and killing an unknown number of people. Accounts from those close to Lafayette claim that around ten citizens were killed in the event, whereas other accounts propose fifty-four, and the sensational newspaper publisher Jean-Paul Marat claimed that over four hundred bodies had been disposed of in the river later that night.
Martial law was declared, and leaders of the mob, such as Danton and Marat, fled and went into hiding. Lafayette's reputation among many political clubs declined dramatically, especially after articles in the press such as the Revolutions de Paris described the event at the Champ de Mars with the words "Men, Women, and Children were massacred on the altar of the nation on the Field of the Federation". Immediately after the massacre, a crowd of rioters attacked Lafayette's home and attempted to harm his wife. The Assembly finalized a constitution in September, and Lafayette resigned from the National Guard in early October, with a semblance of constitutional law restored.
### Conflict and exile
Lafayette returned to his home province of Auvergne in October 1791. France declared war on Austria on 20 April 1792, and preparations to invade the Austrian Netherlands (today's Belgium) began. Lafayette, who had been promoted to Lieutenant General on 30 June 1791, received command of one of the three armies, the Army of the Centre, based at Metz, on 14 December 1791. Lafayette did his best to mold inductees and National Guardsmen into a cohesive fighting force, but found that many of his troops were Jacobin sympathizers who hated their superior officers. Such hostility was common in the army, as demonstrated after the Battle of Marquain, when the routed French troops dragged their commander Théobald Dillon to Lille, where he was torn to pieces by the mob. On 23 April 1792, Robespierre demanded that Lafayette step down. One of the army commanders, Rochambeau, resigned. Lafayette, along with the third commander, Nicolas Luckner, asked the Assembly to begin peace talks, concerned at what might happen if the troops saw another battle.
In June 1792, Lafayette criticized the growing influence of the radicals through a letter to the Assembly from his field post, and ended his letter by calling for their parties to be "closed down by force". He misjudged his timing, for the radicals were in full control in Paris. Lafayette went there, and on 28 June delivered a fiery speech before the Assembly denouncing the Jacobins and other radical groups. He was instead accused of deserting his troops. Lafayette called for volunteers to counteract the Jacobins; when only a few people showed up, he understood the public mood and hastily left Paris. Robespierre called him a traitor and the mob burned him in effigy. He was transferred to command of the Army of the North on 12 July 1792.
The 25 July Brunswick Manifesto, which warned that Paris would be destroyed by the Austrians and Prussians if the king was harmed, led to the downfall of Lafayette, and of the royal family. A mob attacked the Tuileries on 10 August, and the king and queen were imprisoned at the Assembly, then taken to the Temple. The Assembly abolished the monarchy—the king and queen would be beheaded in the coming months. On 14 August, the minister of justice, Danton, put out a warrant for Lafayette's arrest. Hoping to travel to the United States, Lafayette entered the Austrian Netherlands.
## Prisoner
Lafayette was taken prisoner by the Austrians near Rochefort when another former French officer, Jean-Xavier Bureau de Pusy, asked for rights of transit through Austrian territory on behalf of a group of French officers. This was initially granted, as it had been for others fleeing France, but was revoked when the famous Lafayette was recognized. Frederick William II of Prussia, Austria's ally against France, had once received Lafayette, but that was before the French Revolution—the king now saw him as a dangerous fomenter of rebellion, to be interned to prevent him from overthrowing other monarchies.
Lafayette was held at Nivelles, then transferred to Luxembourg where a coalition military tribunal declared him, de Pusy, and two others to be prisoners of state for their roles in the Revolution. The tribunal ordered them held until a restored French king could render final judgment on them. On 12 September 1792, pursuant to the tribunal's order, the prisoners were transferred to Prussian custody. The party traveled to the Prussian fortress-city of Wesel, where the Frenchmen remained in verminous individual cells in the central citadel from 19 September to 22 December 1792. When victorious French revolutionary troops began to threaten the Rhineland, King Frederick William II transferred the prisoners east to the citadel at Magdeburg, where they remained an entire year, from 4 January 1793 to 4 January 1794.
Frederick William decided that he could gain little by continuing to battle the unexpectedly successful French forces, and that there were easier pickings for his army in the Polish-Lithuanian Commonwealth. Accordingly, he stopped armed hostilities with the Republic and turned the state prisoners back over to his erstwhile coalition partner, the Habsburg Austrian monarch Francis II, Holy Roman Emperor. Lafayette and his companions were initially sent to Neisse (today Nysa, Poland) in Silesia. On 17 May 1794, they were taken across the Austrian border, where a military unit was waiting to receive them. The next day, the Austrians delivered their captives to a barracks-prison, formerly a college of the Jesuits, in the fortress-city of Olmütz, Moravia (today Olomouc in the Czech Republic).
Lafayette, when captured, had tried to use the American citizenship he had been granted to secure his release, and contacted William Short, United States minister in The Hague. Although Short and other U.S. envoys very much wanted to succor Lafayette for his services to their country, they knew that his status as a French officer took precedence over any claim to American citizenship. Washington, who was by then president, had instructed the envoys to avoid actions that entangled the country in European affairs, and the U.S. did not have diplomatic relations with either Prussia or Austria. They did send money for the use of Lafayette, and for his wife, whom the French had imprisoned. Secretary of State Jefferson found a loophole allowing Lafayette to be paid, with interest, for his services as a major general from 1777 to 1783. An act was rushed through Congress and signed by President Washington. These funds allowed both Lafayettes privileges in their captivity.
A more direct means of aiding the former general was an escape attempt sponsored by Alexander Hamilton's sister-in-law Angelica Schuyler Church and her husband John Barker Church, a British Member of Parliament who had served in the Continental Army. They hired as agent a young Hanoverian physician, Justus Erich Bollmann, who acquired an assistant, a South Carolinian medical student named Francis Kinloch Huger. Huger was the son of Benjamin Huger, with whom Lafayette had stayed upon his first arrival in America. With their help, Lafayette managed to escape from an escorted carriage drive in the countryside outside Olmütz, but he lost his way and was recaptured.
Once Adrienne was released from prison in France, she, with the help of U.S. Minister to France James Monroe, obtained passports for her and her daughters from Connecticut, which had granted the entire Lafayette family citizenship. Her son Georges Washington had been smuggled out of France and taken to the United States. Adrienne and her two daughters journeyed to Vienna for an audience with Emperor Francis, who granted permission for the three women to live with Lafayette in captivity. Lafayette, who had endured harsh solitary confinement since his escape attempt a year before, was astounded when soldiers opened his prison door to usher in his wife and daughters on 15 October 1795. The family spent the next two years in confinement together.
Through diplomacy, the press, and personal appeals, Lafayette's sympathizers on both sides of the Atlantic made their influence felt, most importantly on the post-Reign of Terror French government. A young, victorious general, Napoleon Bonaparte, negotiated the release of the state prisoners at Olmütz, as a result of the Treaty of Campo Formio. Lafayette's captivity of over five years thus came to an end. The Lafayette family and their comrades in captivity left Olmütz under Austrian escort early on the morning of 19 September 1797, crossed the Bohemian-Saxon border north of Prague, and were officially turned over to the American consul in Hamburg on 4 October.
From Hamburg, Lafayette sent a note of thanks to General Bonaparte. The French government, the Directory, was unwilling to have Lafayette return unless he swore allegiance, which he was not willing to do, as he believed it had come to power by unconstitutional means. As revenge, it had his remaining properties sold, leaving him a pauper. The family, soon joined by Georges Washington, who had returned from America, recuperated on a property near Hamburg belonging to Adrienne's aunt. Due to conflict between the United States and France, Lafayette could not go to America as he had hoped, making him a man without a country.
Adrienne was able to go to Paris, and attempted to secure her husband's repatriation, flattering Bonaparte, who had returned to France after more victories. After Bonaparte's coup d'état of 18 Brumaire (9 November 1799), Lafayette used the confusion caused by the change of regime to slip into France with a passport in the name of "Motier". Bonaparte expressed rage, but Adrienne was convinced he was simply posing, and proposed to him that Lafayette would pledge his support, then would retire from public life to a property she had reclaimed, La Grange. France's new ruler allowed Lafayette to remain, though originally without citizenship and subject to summary arrest if he engaged in politics, with the promise of eventual restoration of civil rights. Lafayette remained quietly at La Grange, and when Bonaparte held a memorial service in Paris for Washington, who had died in December 1799, Lafayette, though he had expected to be asked to deliver the eulogy, was not invited, nor was his name mentioned.
## Retreat from politics
Bonaparte restored Lafayette's citizenship on 1 March 1800, and he was able to recover some of his properties. After Marengo, the First Consul offered him the post of French minister to the United States, but Lafayette declined, saying he was too attached to America to act in relation to it as a foreign envoy. In 1802, he was part of the tiny minority that voted no in the referendum that made Bonaparte consul for life. A seat in the Senate and the Legion of Honor were repeatedly offered by Bonaparte, but Lafayette again declined, though he stated that he would gladly have accepted such honors from a democratic government.
In 1804, Bonaparte was crowned the Emperor Napoleon after a plebiscite in which Lafayette did not participate. The retired general remained relatively quiet, although he made Bastille Day addresses. After the Louisiana Purchase, President Jefferson asked him if he would be interested in the governorship, but Lafayette declined, citing personal problems and his desire to work for liberty in France.
During a trip to Auvergne in 1807, Adrienne became ill, suffering from complications stemming from her time in prison. She became delirious but recovered enough on Christmas Eve to gather the family around her bed and to say to Lafayette: "Je suis toute à vous" ("I am all yours"). She died the next day. In the years after her death, Lafayette mostly remained quietly at La Grange, as Napoleon's power in Europe waxed and then waned. Many influential people and members of the public visited him, especially Americans. He wrote many letters, especially to Jefferson, and exchanged gifts as he had once done with Washington.
## Bourbon restoration
In 1814, the coalition that opposed Napoleon invaded France and restored the monarchy; the comte de Provence (brother of the executed Louis XVI) took the throne as Louis XVIII. Lafayette was received by the new king, but the staunch republican opposed the new, highly restrictive franchise for the Chamber of Deputies that granted the vote to only 90,000 men in a nation of 25 million. Lafayette did not stand for election in 1814, remaining at La Grange.
There was discontent in France among demobilized soldiers and others. Napoleon had been exiled only as far as Elba, an island in the Tuscan archipelago; seeing an opportunity, he landed at Cannes on 1 March 1815 with a few hundred followers. Frenchmen flocked to his banner, and he took Paris later that month, causing Louis to flee to Ghent. Lafayette refused Napoleon's call to serve in the new government, but accepted election to the new Chamber of Representatives under the Charter of 1815. There, after Napoleon's defeat at the Battle of Waterloo, Lafayette called for his abdication. Responding to the emperor's brother Lucien, Lafayette argued:
> By what right do you dare accuse the nation of ... want of perseverance in the emperor's interest? The nation has followed him on the fields of Italy, across the sands of Egypt and the plains of Germany, across the frozen deserts of Russia. ... The nation has followed him in fifty battles, in his defeats and in his victories, and in doing so we have to mourn the blood of three million Frenchmen.
On 22 June 1815, four days after Waterloo, Napoleon abdicated. Lafayette arranged for the former emperor's passage to America, but the British prevented this, and Napoleon ended his days on the island of Saint Helena. The Chamber of Representatives, before it dissolved, appointed Lafayette to a peace commission that was ignored by the victorious allies who occupied much of France, with the Prussians taking over La Grange as a headquarters. Once the Prussians left in late 1815, Lafayette returned to his house, a private citizen again.
Lafayette's homes, both in Paris and at La Grange, were open to any Americans who wished to meet the hero of their Revolution, and to many other people besides. Among those whom Irish novelist Sydney, Lady Morgan met at table during her month-long stay at La Grange in 1818 were the Dutch painter Ary Scheffer and the historian Augustin Thierry, who sat alongside American tourists. Others who visited included philosopher Jeremy Bentham, American scholar George Ticknor, and writer Fanny Wright.
During the first decade of the Bourbon Restoration, Lafayette lent his support to a number of conspiracies in France and other European countries, all of which came to nothing. He was involved in the various Charbonnier plots, and agreed to go to the city of Belfort, where there was a garrison of French troops, and assume a major role in the revolutionary government. Warned that the royal government had found out about the conspiracy, he turned back on the road to Belfort, avoiding overt involvement. More successfully, he supported the Greek Revolution beginning in 1821, and by letter attempted to persuade American officials to ally with the Greeks. Louis' government considered arresting both Lafayette and his son Georges Washington, who was also involved in the Greek efforts, but was wary of the political ramifications of doing so. Lafayette remained a member of the restored Chamber of Deputies until 1823, when new plural voting rules helped defeat his bid for re-election.
## Grand tour of the United States
President James Monroe and Congress invited Lafayette to visit the United States in 1824, in part to celebrate the nation's upcoming 50th anniversary. Monroe intended to have Lafayette travel on an American warship, but Lafayette felt that having such a vessel as transport was undemocratic and booked passage on the merchant packet Cadmus. Louis XVIII did not approve of the trip and had troops disperse the crowd that gathered at Le Havre to see him off.
Lafayette arrived at New York on 15 August 1824, accompanied by his son Georges Washington and his secretary Auguste Levasseur. He was greeted by a group of Revolutionary War veterans who had fought alongside him many years before. New York erupted for four continuous days and nights of celebration. He then departed for what he thought would be a restful trip to Boston but instead found the route lined by cheering citizens, with welcomes organized in every town along the way. According to Unger, "It was a mystical experience they would relate to their heirs through generations to come. Lafayette had materialized from a distant age, the last leader and hero at the nation's defining moment. They knew they and the world would never see his kind again."
New York, Boston, and Philadelphia did their best to outdo one another in the celebrations honoring Lafayette. Philadelphia renovated the Old State House (today Independence Hall), which might otherwise have been torn down, to provide a venue for his reception. Until that point, it had not been usual in the United States to build monuments, but Lafayette's visit set off a wave of construction—usually with him laying the cornerstone himself, in his capacity as mason. The arts benefited from his visit as well: many cities commissioned portraits for their civic buildings, and his likeness appeared on innumerable souvenirs. Lafayette had intended to visit only the original 13 states during a four-month visit, but the stay stretched to 16 months as he visited all 24 states.
The towns and cities that he visited gave him enthusiastic welcomes, including Fayetteville, North Carolina, the first city named in his honor. He visited the capital in Washington City, and was surprised by the simple clothing worn by President Monroe and the lack of any guards around the White House. He went to Mount Vernon in Virginia as he had 40 years before, this time viewing Washington's grave. He was at Yorktown on 19 October 1824 for the anniversary of Cornwallis's surrender, then journeyed to Monticello to meet with his old friend Jefferson—and Jefferson's successor James Madison, who arrived unexpectedly. He had also dined with 89-year-old John Adams, the other living former president, at Peacefield, his home near Boston.
With the roads becoming impassable, Lafayette stayed in Washington City for the winter of 1824–25, and thus was there for the climax of the hotly contested 1824 election in which no presidential candidate was able to secure a majority of the Electoral College, throwing the decision to the House of Representatives. On 9 February 1825, the House selected Secretary of State John Quincy Adams as president; that evening, runner-up General Andrew Jackson shook hands with Adams at the White House as Lafayette looked on.
In March 1825, Lafayette began to tour the southern and western states. The general pattern of the trip was that he would be escorted between cities by the state militia, and he would enter each town through specially constructed arches to be welcomed by local politicians or dignitaries, all eager to be seen with him. There would be special events, visits to battlefields and historic sites, celebratory dinners, and time set aside for the public to meet the legendary hero of the Revolution.
Lafayette visited General Jackson at his home, The Hermitage, in Tennessee. He was traveling up the Ohio River by steamboat when the vessel sank beneath him; his son and secretary put him in a lifeboat, and he was taken to the Kentucky shore and picked up by another steamboat heading in the opposite direction, whose captain insisted on turning around and taking Lafayette to Louisville, Kentucky. From there, he went generally northeast, viewing Niagara Falls and taking the Erie Canal, then considered a modern marvel, to Albany. He laid the cornerstone of the Bunker Hill Monument in Massachusetts in June 1825 after hearing an oration by Daniel Webster. He also took some soil from Bunker Hill to be sprinkled on his grave.
After Bunker Hill, Lafayette went to Maine and Vermont, thus visiting all of the states. He met again with John Adams, then went back to New York and then to Brooklyn, where he laid the cornerstone for its public library. He celebrated his 68th birthday on 6 September at a reception with President John Quincy Adams at the White House, and departed the next day. He took gifts with him, besides the soil to be placed on his grave. At President Monroe's request, Congress had voted him $200,000 in gratitude for his services to the country, along with a large tract of public lands in Florida. He returned to France aboard a ship that was originally called the Susquehanna but was renamed the USS Brandywine in honor of the battle where he shed his blood for the United States.
## Revolution of 1830
When Lafayette arrived in France, Louis XVIII had been dead about a year and Charles X was on the throne. As king, Charles intended to restore the absolute rule of the monarch, and his decrees had already prompted protest by the time Lafayette arrived. Lafayette was the most prominent of those who opposed the king. In the elections of 1827, the 70-year-old Lafayette was elected to the Chamber of Deputies again. Unhappy at the outcome, Charles dissolved the Chamber, and ordered a new election: Lafayette again won his seat.
Lafayette remained outspoken against Charles' restrictions on civil liberties and the newly introduced censorship of the press. He made fiery speeches in the Chamber, denouncing the new decrees and advocating American-style representative government. He hosted dinners at La Grange, for Americans, Frenchmen, and others; all came to hear his speeches on politics, freedom, rights, and liberty. He was popular enough that Charles felt he could not be safely arrested, but Charles' spies were thorough: one government agent noted "his [Lafayette's] seditious toasts ... in honor of American liberty".
On 25 July 1830, the king signed the Ordinances of Saint-Cloud, removing the franchise from the middle class and dissolving the Chamber of Deputies. The decrees were published the following day. On 27 July, Parisians erected barricades throughout the city, and riots erupted. In defiance, the Chamber continued to meet. When Lafayette, who was at La Grange, heard what was going on, he raced into the city, and was acclaimed as a leader of the revolution. When his fellow deputies were indecisive, Lafayette went to the barricades, and soon the royalist troops were routed. Fearful that the excesses of the 1789 revolution were about to be repeated, deputies made Lafayette head of a restored National Guard, and charged him with keeping order. The Chamber was willing to proclaim him as ruler, but he refused a grant of power he deemed unconstitutional. He also refused to deal with Charles, who abdicated on 2 August. Many young revolutionaries sought a republic, but Lafayette felt this would lead to civil war, and chose to offer the throne to the duc d'Orleans, Louis-Philippe, who had lived in America and had far more of a common touch than did Charles. Lafayette secured the agreement of Louis-Philippe, who accepted the throne, to various reforms. The general remained as commander of the National Guard. This did not last long—the brief concord at the king's accession soon faded, and the conservative majority in the Chamber voted to abolish Lafayette's National Guard post on 24 December 1830. Lafayette went back into retirement, expressing his willingness to do so.
## Final years and death
Lafayette grew increasingly disillusioned with Louis-Philippe, who backtracked on reforms and reneged on his promises to enact them. The retired general angrily broke with his king, a breach which widened when the government used force to suppress a strike in Lyon. Lafayette used his seat in the Chamber to promote liberal proposals, and his neighbors elected him mayor of the village of La Grange and to the council of the département of Seine-et-Marne in 1831. The following year, he served as a pallbearer and spoke at the funeral of General Jean Maximilien Lamarque, another opponent of Louis-Philippe. He pleaded for calm, but there were riots in the streets and a barricade was erected at the Place de la Bastille. The king forcefully crushed this June Rebellion, to Lafayette's outrage. He returned to La Grange until the Chamber met in November 1832, when he condemned Louis-Philippe for introducing censorship, as Charles X had.
Lafayette spoke publicly for the last time in the Chamber of Deputies on 3 January 1834. The next month, he collapsed at a funeral from pneumonia. He recovered, but the following May was wet, and he became bedridden after being caught in a thunderstorm. He died at age 76 on 20 May 1834 at 6 rue d'Anjou-Saint-Honoré in Paris (now 8 rue d'Anjou in the 8th arrondissement of Paris). He was buried next to his wife at the Picpus Cemetery under soil from Bunker Hill, which his son Georges Washington sprinkled upon him. King Louis-Philippe ordered a military funeral in order to keep the public from attending, and crowds formed to protest their exclusion.
In the United States, President Jackson ordered that Lafayette receive the same memorial honors that had been bestowed on Washington at his death in December 1799. Both Houses of Congress were draped in black bunting for 30 days, and members wore mourning badges. Congress urged Americans to follow similar mourning practices. Later that year, former president John Quincy Adams gave a eulogy of Lafayette that lasted three hours, calling him "high on the list of the pure and disinterested benefactors of mankind".
## Beliefs
Lafayette was a firm believer in a constitutional monarchy. He believed that traditional and revolutionary ideals could be melded by having a democratic National Assembly work with a monarch, the form of rule France had always known. His close relationships with American Founding Fathers such as George Washington and Thomas Jefferson gave him an opportunity to witness the implementation of a democratic system. His views on potential government structures for France were directly influenced by the American form of government, which was in turn influenced by the British form of government. For example, Lafayette believed in a bicameral legislature, as the United States had. The Jacobins detested the idea of a monarchy in France, which led the National Assembly to vote against it. Lafayette's advocacy of constitutional monarchy contributed to his fall from favor, especially once Maximilien Robespierre took power.
Lafayette was the author of the Declaration of the Rights of Man and of the Citizen in 1789 and a staunch opponent of slavery. The Declaration never specifically mentioned slavery, but he made his position on the controversial topic clear in letters to friends and colleagues such as Washington and Jefferson. He proposed that slaves not be owned but rather work as free tenants on the land of plantation owners, and he bought three slave plantations in Cayenne in 1785 and 1786 to put his ideas into practice, ordering that none of the 70 slaves on the plantations be bought or sold. He never freed his slaves, and when the French authorities confiscated his properties in 1795, the 63 remaining slaves on the three plantations were sold by colonial officials in Cayenne. He remained an abolitionist throughout his life, proposing that slaves be emancipated gradually because he recognized the crucial role that slavery played in many economies. Lafayette hoped that his ideas would be adopted by Washington to free slaves in the United States and spread from there. Washington eventually began implementing those practices on his own plantation at Mount Vernon, but he continued to own slaves until the day he died. In a letter to Matthew Clarkson, the mayor of Philadelphia, Lafayette wrote, "I would never have drawn my sword in the cause of America, if I could have conceived that thereby I was founding a land of Slavery."
## Assessment
Throughout his life, Lafayette was an exponent of the ideals of the Age of Enlightenment, especially on human rights and civic nationalism, and his views were taken very seriously by intellectuals and others on both sides of the Atlantic. His image in the United States was derived from his "disinterestedness" in fighting without pay for the freedom of a country that was not his own. Samuel Adams praised him for "foregoing the pleasures of Enjoyment of domestick Life and exposing himself to the Hardship and Dangers" of war when he fought "in the glorious cause of freedom". This view was shared by many contemporaries, establishing an image of Lafayette as seeking to advance the freedom of all mankind rather than the interests of just one nation. During the French Revolution, Americans viewed him as an advocate for American ideals, seeking to transport them from the New World to the Old. This was reinforced by his position as surrogate son and disciple of George Washington, who was deemed the Father of His Country and the embodiment of American ideals. Novelist James Fenimore Cooper befriended Lafayette during his time in Paris in the 1820s, admired his patrician liberalism, and eulogized him as a man who "dedicated youth, person, and fortune to the principles of liberty."
Lafayette became an American icon in part because he was not associated with any particular region of the country; he was of foreign birth, did not live in America, and had fought in New England, the mid-Atlantic states, and the South, making him a unifying figure. His role in the French Revolution enhanced this popularity, as Americans saw him steering a middle course. Americans were naturally sympathetic to a republican cause, but also remembered Louis XVI as an early friend of the United States. When Lafayette fell from power in 1792, Americans tended to blame factionalism for the ouster of a man who was above such things in their eyes.
In 1824, Lafayette returned to the United States at a time when Americans were questioning the success of the republic in view of the disastrous economic Panic of 1819 and the sectional conflict resulting in the Missouri Compromise. Lafayette's hosts considered him a judge of how successful independence had become. According to cultural historian Lloyd Kramer, Lafayette "provided foreign confirmations of the self-image that shaped America's national identity in the early nineteenth century and that has remained a dominant theme in the national ideology ever since: the belief that America's Founding Fathers, institutions, and freedom created the most democratic, egalitarian, and prosperous society in the world".
Historian Gilbert Chinard wrote in 1936: "Lafayette became a legendary figure and a symbol so early in his life, and successive generations have so willingly accepted the myth, that any attempt to deprive the young hero of his republican halo will probably be considered as little short of iconoclastic and sacrilegious." That legend has been used politically; the name and image of Lafayette were repeatedly invoked in 1917 to gain popular support for America's entry into World War I, culminating with Charles E. Stanton's famous statement "Lafayette, we are here". This occurred at some cost to Lafayette's image in America; veterans returned from the front singing "We've paid our debt to Lafayette, who the hell do we owe now?" According to Anne C. Loveland, "Lafayette no longer served as a national hero-symbol" by the end of the war. In 2002, however, Congress voted to grant him honorary citizenship.
Lafayette's reputation in France is more problematic. Thomas Gaines notes that the response to Lafayette's death was far more muted in France than in America, and suggests that this may have been because Lafayette was the last surviving hero of America's only revolution, whereas the changes in the French government had been far more chaotic. Lafayette's varied roles, especially during the French Revolution, have produced a more nuanced picture of him in French historiography. The 19th-century historian Jules Michelet described him as a "mediocre idol", lifted by the mob far beyond what his talents deserved. Jean Tulard, Jean-François Fayard, and Alfred Fierro note Napoleon's deathbed comment about Lafayette in their Histoire et dictionnaire de la Révolution française: that "the king would still be sitting on his throne" had Napoleon been in Lafayette's place during the French Revolution. They deemed Lafayette "an empty-headed political dwarf" and "one of the people most responsible for the destruction of the French monarchy". Gaines disagrees, noting that liberal and Marxist historians have also dissented from that view. Lloyd Kramer relates that, in a survey taken just before the Revolution's bicentennial in 1989, 57 percent of the French deemed Lafayette the figure from the Revolution whom they most admired. Lafayette "clearly had more French supporters in the early 1990s than he could muster in the early 1790s".
Marc Leepson concluded his study of Lafayette's life:
> The Marquis de Lafayette was far from perfect. He was sometimes vain, naive, immature, and egocentric. But he consistently stuck to his ideals, even when doing so endangered his life and fortune. Those ideals proved to be the founding principles of two of the world's most enduring nations, the United States and France. That is a legacy that few military leaders, politicians, or statesmen can match.
## See also
- List of places named for the Marquis de Lafayette
- LaFayette Motors
- Hermione (2014), a replica of the Hermione of 1779, currently in service
- Hero of Two Worlds: The Marquis de Lafayette in the Age of Revolution, a 2021 biography
# Banksia caleyi
Banksia caleyi, commonly known as Caley's banksia or red lantern banksia, is a species of woody shrub of the family Proteaceae native to Western Australia. It generally grows as a dense shrub up to 2 m (7 ft) tall, has serrated leaves and red, pendent (hanging) inflorescences which are generally hidden in the foliage. First described by Scottish naturalist Robert Brown in 1830, Banksia caleyi was named in honour of the English botanist George Caley. No subspecies are recognised. It is one of three or four related species with hanging inflorescences, which is an unusual feature within the genus.
Found south and east of the Stirling Ranges through to the vicinity of Jerramungup, Banksia caleyi grows in a habitat marked by periodic bushfires. Plants are killed by fire and regenerate by seed afterwards. The species was classified as "Not Threatened" under the Wildlife Conservation Act of Western Australia. In contrast to most other Western Australian banksias, it appears to have some resistance to dieback from the soil-borne water mould Phytophthora cinnamomi, and is comparatively easy to grow in cultivation.
## Description
Banksia caleyi grows as a many-branched bushy shrub to 2 m (7 ft) in height, with crumbly grey bark. Rarely, plants of up to 4 m (13 ft) have been found. The new growth is hairy, and generally occurs in summer. The branchlets become smooth after around two years. The stiff leaves are narrowly wedge-shaped (cuneate) and measure 5–14 cm (2–5½ in) in length by 1.3–2.4 cm (½–1 in) wide. The leaf margins are serrated, with many teeth measuring 0.4–0.6 cm (⅛–¼ in) each.
Flowering takes place between September and January. The inflorescences hang down from the ends of three- to five-year-old branchlets deep within the shrub and measure 5–9 cm (2–3½ in) in length and roughly 7 cm (2¾ in) in diameter. The flowers are cream at the base and deep pink to red in the upper half; they are brightest before anthesis and gradually fade with age. The inflorescences eventually turn grey, the old flowers remaining as up to 25 large woody follicles develop. Oval in shape and covered with fine hair, the follicles can reach 4 cm (1⅝ in) long, 2.5 cm (1 in) high, and 2.5 cm (1 in) wide.
The obovate seed is 4.3–4.7 cm (1¾–1⅞ in) long and fairly flattened, and is composed of the wedge-shaped seed body proper, measuring 1.4–1.5 cm (½–⅝ in) long and 1.6–1.7 cm (⅝ in) wide, and a papery wing. One side, termed the outer surface, is dark brown and wrinkled, while the other is black and smooth. Both surfaces sparkle slightly. The seeds are separated by a sturdy dark brown seed separator that is roughly the same shape as the seeds, with a depression where the seed body sits adjacent to it in the follicle. Seedlings have cuneate cotyledons which measure 1.1–1.3 cm (⅜–½ in) long and 1.3–1.4 cm (½ in) wide. These are dull green with three veins, and the margin of the wedge may be red and crenulated (lined with small teeth). The hypocotyl is red and measures 1.5–2 cm (⅝–¾ in) high. Seedlings have hairy stems and oppositely arranged leaves (arising from the stem in pairs) that are obovate with triangular-lobed serrate margins.
## Taxonomy
Robert Brown formally described Banksia caleyi in his 1830 work Supplementum primum Prodromi florae Novae Hollandiae, naming it in honour of the English botanist George Caley. The type specimen was collected by William Baxter, inland from King George Sound on Western Australia's south coast, in 1829.
Carl Meissner placed B. caleyi in series Quercinae in his 1856 arrangement of the genus on account of its strongly dentate, cuneate to obovate leaves. As they were defined on leaf characters alone, all of Meissner's series were highly heterogeneous. Meissner also described B. caleyi variety sinuosa from material collected by James Drummond, which was reviewed by Alex George and found to be no different from the other collections of B. caleyi. Drummond also collected material identified as B. caleyi that was named as a distinct species—Banksia aculeata—in 1981. No subspecies of B. caleyi itself are recognised.
George Bentham published a thorough revision of Banksia in his landmark publication Flora Australiensis in 1870. In Bentham's arrangement, the number of recognised Banksia species was reduced from 60 to 46. Bentham defined four sections based on leaf, style and pollen-presenter characters. Banksia caleyi was placed in section Orthostylis. In 1891, Otto Kuntze, in his Revisio Generum Plantarum, rejected the generic name Banksia L.f., on the grounds that the name Banksia had previously been published in 1776 as Banksia J.R.Forst & G.Forst, referring to the genus now known as Pimelea. Kuntze proposed Sirmuellera as an alternative, referring to this species as Sirmuellera caleyi. This application of the principle of priority was largely ignored by Kuntze's contemporaries, and Banksia L.f. was formally conserved and Sirmuellera rejected in 1940.
In his 1981 monograph The genus Banksia L.f. (Proteaceae), Alex George placed B. caleyi in B. subg. Banksia because its inflorescence is a typical Banksia flower spike shape; in B. sect. Banksia because of its straight styles; and B. ser. Tetragonae because of its pendulous inflorescences. He considered its closest relative to be B. aculeata, which has narrower leaves with fewer, larger lobes; longer perianths, which grade from red to cream rather than from cream to red; shorter pistils; and also differences in the follicles, seeds and flowering time.
In 1996, Kevin Thiele and Pauline Ladiges published the results of a cladistic analysis of morphological characters of Banksia. They retained George's subgenera and many of his series, but discarded his sections. George's B. ser. Tetragonae was found to be monophyletic, and therefore retained; and their analysis of the relationships within the series supported the placement of B. caleyi alongside B. aculeata.
B. caleyi's placement in Thiele and Ladiges' arrangement may be summarised as follows:
- Banksia
  - B. subg. Isostylis (3 species)
  - B. elegans (incertae sedis)
  - B. subg. Banksia
    - B. ser. Tetragonae
      - B. elderiana
      - B. lemanniana
      - B. caleyi
      - B. aculeata
The arrangement of Thiele and Ladiges was not accepted by George, and was discarded in his 1999 revision. Under George's 1999 arrangement, B. caleyi's placement was as follows:
- Banksia
  - B. subg. Banksia
    - B. sect. Banksia
      - B. ser. Salicinae (11 species, 7 subspecies)
      - B. ser. Grandes (2 species)
      - B. ser. Banksia (8 species)
      - B. ser. Crocinae (4 species)
      - B. ser. Prostratae (6 species, 3 varieties)
      - B. ser. Cyrtostylis (13 species)
      - B. ser. Tetragonae
        - B. lemanniana
        - B. caleyi
        - B. aculeata
Since 1998, Austin Mast has been publishing results of ongoing cladistic analyses of DNA sequence data for the subtribe Banksiinae. His analyses suggest a phylogeny that is rather different from previous taxonomic arrangements, but support the placement of B. aculeata alongside B. caleyi in a clade corresponding closely with B. ser. Tetragonae.
Early in 2007, Mast and Thiele initiated a rearrangement by transferring Dryandra to Banksia, and publishing B. subg. Spathulatae for the species having spoon-shaped cotyledons; in this way, they also redefined the autonym B. subg. Banksia. They have refrained from publishing a full arrangement of Banksia until DNA sampling of Dryandra is complete; in the meantime, if Mast and Thiele's nomenclatural changes are taken as an interim arrangement, then B. caleyi is placed in B. subg. Banksia.
## Distribution and habitat
Banksia caleyi is found near the southern coast of Western Australia, from South Stirling to the West River and northeast to Pingrup. Some of its population lies within Fitzgerald River National Park. Often locally abundant, it is found in mallee woodland on white sand, gravel, and sandy clay, generally on flat or slightly undulating land. The annual rainfall is 550–600 mm (22–24 in). Banksia caleyi is classified as Not Threatened under the Wildlife Conservation Act of Western Australia.
## Ecology
Like many plants in south-west Western Australia, Banksia caleyi is adapted to an environment in which bushfire events are relatively frequent. Most Banksia species can be placed in one of two broad groups according to their response to fire: Reseeders are killed by fire, but fire triggers the release of their canopy seed bank, thus promoting recruitment of the next generation. Resprouters survive fire, resprouting from a lignotuber or, more rarely, epicormic buds protected by thick bark. B. caleyi belongs to the reseeder group. In the wild, seedlings take at least three to four years to reach flowering after bushfire. Non-patchy fires occurring at intervals of less than seven years may wipe out local populations of reseeders.
Banksia caleyi has been shown to have a low susceptibility to dieback from the soil-borne water mould Phytophthora cinnamomi, unlike most Western Australian banksias. The fungal pathogen Botryosphaeria ribis has been recovered from B. caleyi. The caterpillar of the dryandra moth (Carthaea saturnioides) feeds on the leaves, though it prefers to eat those of dryandra species that grow alongside it.
The upside-down flower spikes drip nectar onto the ground or lower leaves, suggesting pollination by nonflying mammals which are attracted to the scent. Supporting this hypothesis, the spiky leaves also appear to prevent access by foragers that are not at ground level. Furthermore, the individual flower structure is similar to that of Banksia attenuata, for which the honey possum (Tarsipes rostratus) is a major pollinator.
## Cultivation
Seeds do not require any treatment, and take 23 to 50 days to germinate in cultivation. Banksia caleyi is a medium to slow-growing plant, taking four to five years to flower from seed. The flowers are attractive but are obscured by the foliage. This species can grow in a range of soil types so long as they provide good drainage; the nominal soil pH range is 6 to 7.5. It grows in full sun or part shade, and tolerates light pruning. Unlike many other Western Australian banksias, Banksia caleyi has been grown with some success in more humid areas, such as Australia's east coast. It attracts pygmy and honey possums in the garden.
# Australian Air Corps
The Australian Air Corps (AAC) was a temporary formation of the Australian military that existed in the period between the disbandment of the Australian Flying Corps (AFC) of World War I and the establishment of the Royal Australian Air Force (RAAF) in March 1921. Raised in January 1920, the AAC was commanded by Major William Anderson, a former AFC pilot. Many of the AAC's members were also from the AFC and would go on to join the RAAF. Although part of the Australian Army, for most of its existence the AAC was overseen by a board of senior officers that included members of the Royal Australian Navy.
Following the disbandment of the AFC, the AAC was a stop-gap measure intended to remain in place until the formation of a permanent and independent Australian air force. The corps' primary purpose was to maintain assets of the Central Flying School at Point Cook, Victoria, but several pioneering activities also took place under its auspices: AAC personnel set an Australian altitude record that stood for a decade, made the first non-stop flight between Sydney and Melbourne, and undertook the country's initial steps in the field of aviation medicine. The AAC operated fighters, bombers and training aircraft, including some of the first examples of Britain's Imperial Gift to arrive in Australia. As well as personnel, the RAAF inherited Point Cook and most of its initial equipment from the AAC.
## Establishment and control
In December 1919, the remnants of the wartime Australian Flying Corps (AFC) were disbanded, and replaced on 1 January 1920 by the Australian Air Corps (AAC), which was, like the AFC, part of the Australian Army. Australia's senior airman, Lieutenant Colonel Richard Williams, was overseas, and Major William Anderson was appointed commander of the AAC, a position that also put him in charge of the Central Flying School (CFS) at Point Cook, Victoria. As Anderson was on sick leave at the time of the appointment, Major Rolf Brown temporarily assumed command; Anderson took over on 19 February. CFS remained the AAC's sole unit, and Point Cook its only air base.
The AAC was an interim organisation intended to exist until the establishment of a permanent Australian air service. The decision to create such a service had been made in January 1919, amid competing proposals by the Army and the Royal Australian Navy for separate forces under their respective jurisdictions. Budgetary constraints and arguments over administration and control led to ongoing delays in the formation of an independent air force.
By direction of the Chief of the General Staff, Major General Gordon Legge, in November 1919, the AAC's prime purpose was to ensure existing aviation assets were maintained; Legge later added that it should also perform suitable tasks such as surveying air routes. The Chief of the Naval Staff, Rear Admiral Sir Percy Grant, objected to the AAC's being under Army control, and argued that an air board should be formed to oversee the AAC and the proposed Australian air force. A temporary air board first met on 29 January 1920, the Army being represented by Williams and Brigadier General Thomas Blamey, and the Navy by Captain Wilfred Nunn and Lieutenant Colonel Stanley Goble, a former member of Britain's Royal Naval Air Service (RNAS) then seconded to the Navy Office. Williams was given responsibility for administering the AAC on behalf of the board. A permanent Air Board overseen by an Air Council was formed on 9 November 1920; these bodies were made responsible for administering the AAC from 22 November.
## Personnel
Most members of the AAC were former AFC personnel. In August 1919, several senior AFC pilots, including Lieutenant Colonel Oswald Watt, Major Anderson, and Captain Roy Phillipps, were appointed to serve on a committee examining applications for the AAC. Some of the staffing decisions were controversial. At least three officers at the CFS, including the commanding officer, were not offered appointments in the new service. Roy King, the AFC's second highest-scoring fighter ace after Harry Cobby, refused an appointment in the AAC because it had not yet offered a commission to Victoria Cross recipient Frank McNamara. In a letter dated 30 January 1920, King wrote, "I feel I must forfeit my place in favor (sic) of this very good and gallant officer"; McNamara received a commission in the AAC that April. Other former AFC members who took up appointments in the AAC included Captains Adrian Cole, Henry Wrigley, Frank Lukis, and Lawrence Wackett. Captain Hippolyte "Kanga" De La Rue, an Australian who flew with the RNAS during the war, was granted a commission in the AAC because a specialist seaplane pilot was required for naval cooperation work.
The corps' initial establishment was nine officers—commanding officer, adjutant, workshop commander, test pilot, four other pilots, and medical officer—and seventy other ranks. In March 1920, to cope with the imminent arrival of new aircraft and other equipment, approval was given to increase this complement by a further seven officers and thirty-six other ranks. The following month the establishment was increased by fifty-four to make a total of 160 other ranks. An advertising campaign was employed to garner applicants. According to The Age, applicants needed to be aged between eighteen and forty-five, and returned soldiers were preferred; all positions were "temporary" and salaries, including uniform allowance and rations, ranged from £194 to £450. As the AAC was an interim formation, no unique uniform was designed for its members. Within three weeks of the AAC being raised, a directive came down from CFS that the organisation's former AFC staff should wear out their existing uniforms, and that any personnel requiring new uniforms should acquire "AIF pattern, as worn by the AFC".
The AAC suffered two fatalities. On 23 September 1920, two Airco DH.9A bombers recently delivered from Britain undertook a search for the schooner Amelia J., which had disappeared on a voyage from Newcastle to Hobart. Anderson and Sergeant Herbert Chester flew one of the DH.9As, and Captain Billy Stutt and Sergeant Abner Dalzell the other. Anderson's aircraft landed near Hobart in the evening, having failed to locate the lost schooner, but Stutt and Dalzell were missing; their DH.9A was last sighted flying through cloud over Bass Strait. A court of inquiry determined the aircraft had crashed, and that the DH.9As may not have had adequate preparation time for their task, which it attributed to the low staffing levels at CFS. The court proposed compensation of £550 for Stutt's family and £248 for Dalzell's—the maximum amounts payable under government regulations—as the men had been on duty at the time of their deaths; Federal Cabinet increased these payments three-fold. Wreckage that may have belonged to the Amelia J. was found at Flinders Island the following year.
## Equipment
The AAC's initial complement of aircraft included twenty Avro 504K trainers and twelve Sopwith Pup fighters that had been delivered to CFS in 1919, as well as a Royal Aircraft Factory B.E.2 and F.E.2, and a Bristol Scout. Seven of the 504Ks and one of the Pups were written off during the AAC's existence, leaving thirteen and eleven on strength, respectively. The B.E.2 had been piloted by Wrigley and Arthur Murphy in 1919 on the first flight from Melbourne to Darwin, and was allotted to what became the Australian War Memorial in August 1920; the F.E.2 was sold in November 1920, while the Scout remained on strength and was still being flown by the Royal Australian Air Force (RAAF) in 1923. In February 1920, the Vickers Vimy bomber recently piloted by Ross and Keith Smith on the first flight from England to Australia was flown to Point Cook, where it joined the strength of the AAC.
In March 1920, Australia began receiving 128 aircraft with associated spares and other equipment as part of Britain's Imperial Gift to Dominions seeking to establish their own post-war air services. The aircraft included Royal Aircraft Factory S.E.5 fighters, Airco DH.9 and DH.9A bombers, and Avro 504s. Most remained crated for eventual use by the yet-to-be formed RAAF, but several of each type were assembled and employed by the AAC. One of the DH.9As was lost with the disappearance of Stutt and Dalzell in September 1920.
## Notable flights
On 17 June 1920, Cole, accompanied by De La Rue, flew a DH.9A to an altitude of 27,000 feet (8,200 m), setting an Australian record that stood for more than ten years. The effects of hypoxia exhibited by Cole and De La Rue intrigued the medical officer, Captain Arthur Lawrence, who subsequently made observations during his own high-altitude flight piloted by Anderson; this activity has been credited as marking the start of aviation medicine in Australia. Later that month, flying an Avro 504L floatplane, De La Rue became the first person to land an aircraft on the Yarra River in Victoria. On 22 July, Williams, accompanied by Warrant Officer Les Carter, used a DH.9A to make the first non-stop flight from Sydney to Melbourne. A few days earlier, Williams and Wackett had flown two DH.9As to the Royal Military College, Duntroon, to investigate the possibility of taking some of the school's graduates into the air corps, a plan that came to fruition after the formation of the RAAF.
Between July and November 1920, trials of the Avro 504L took place on the Navy's flagship, HMAS Australia, and later aboard the light cruiser HMAS Melbourne. The trials on Melbourne, which operated in the waters off New Guinea and northern Australia, demonstrated that the Avro was not suited to tropical conditions as its engine lacked the necessary power and its skin deteriorated rapidly; Williams recommended that activity cease until Australia acquired a purpose-designed seaplane.
The AAC performed several tasks in connection with the Prince of Wales' tour of Australia in 1920. In May, the AAC was required to escort the Prince's ship, HMS Renown, into Port Melbourne, and then to fly over the royal procession along St Kilda Road. The AAC had more aircraft than pilots available, so Williams gained permission from the Minister for Defence to augment AAC aircrew with former AFC pilots who volunteered their services for the events. In August, the AAC was called upon at the last minute to fly the Prince's mail from Port Augusta, South Australia, to Sydney before he boarded Renown for the voyage back to Britain.
During the Second Peace Loan, which commenced in August 1920, the AAC undertook a cross-country program of tours and exhibition flying to promote the sale of government bonds. Again Williams enlisted the services of former AFC personnel to make up for a shortfall in the number of AAC pilots and mechanics available to prepare and fly the nineteen aircraft allotted to the program. Activities included flyovers at sporting events, leaflet drops over Melbourne, and what may have been Australia's first aerial derby—at Serpentine, Victoria, on 27 August. Poor weather hindered some of the program, and four aircraft were lost in accidents, though no aircrew were killed. The Second Peace Loan gave AAC personnel experience in a variety of flying conditions, and the air service gained greater exposure to the Australian public.
## Disbandment and legacy
On 15 March 1921, the Brisbane Courier reported that the AAC would disband on 30 March, and be succeeded by a new air force. The Australian Air Force was formed on 31 March, inheriting Point Cook and most of its initial personnel and equipment from the AAC. The adjective "Royal" was added to "Australian Air Force" that August. Several officers associated with the AAC, including Williams, Anderson, Wrigley and McNamara, went on to achieve high rank in the Air Force. According to the RAAF's Pathfinder bulletin, the AAC "kept valuable aviation skills alive" until a permanent air force could be established. The corps was, further, "technically separate from the Army and Navy; its director answered to the Minister for Defence, through the Air Council. In effect, the AAC was Australia's first independent air force, albeit an interim one."
# Mother and Child Reunion (Degrassi: The Next Generation)
"Mother and Child Reunion" is the two-part pilot episode of the Canadian teen drama television series Degrassi: The Next Generation, which premiered on October 14, 2001 on the CTV Television Network. The episode was written by story editor Aaron Martin and series co-creator/creative consultant Yan Moore, and directed by Bruce McDonald. As with the majority of Degrassi: The Next Generation episodes, "Mother and Child Reunion" takes its title from a pop song, "Mother and Child Reunion", written and performed by Paul Simon.
Degrassi: The Next Generation is the fourth series in the fictional Degrassi universe created in 1979. The preceding series, Degrassi High, ended in 1991, although a television movie, School's Out, aired in 1992 and wrapped up the storylines of the characters. "Mother and Child Reunion" reunited some of those characters in a ten-year high school reunion, while also introducing a new generation of Degrassi Community School students: Emma Nelson, Manny Santos, J.T. Yorke and Toby Isaacs.
The episode received mixed reviews from the mass media, with the Ottawa Citizen saying that it offers "nothing new to viewers familiar with the groundbreaking preceding series", and The Seattle Times saying it "soft-pedals through the issues", although the acting from the new generation of children was lauded as "stellar ... solid [and] believable" by Canoe.ca's AllPop. It was nominated for two Gemini Awards and two Directors Guild of Canada Awards, winning in the "Outstanding Achievement in a Television Series – Children's" category.
## Plot
### Part One
Archie "Snake" Simpson (played by Stefan Brogren), a former student of Degrassi High, and now teacher at Degrassi Community School, has arranged a mixed reunion for the classes of 1990 and 1991. Spike Nelson (Amanda Stepto), Caitlin Ryan (Stacie Mistysyn), and Lucy Fernandez (Anais Granofsky), who also attended Degrassi High, plan on attending and try to persuade Joey Jeremiah (Pat Mastroianni) to join them. Joey, however, is reticent as he is still dealing with his grief over the death of his wife. Along with Caitlin's fiancé Keith (Don McKellar), the five friends go out to a bar for the night, reminiscing about the past and discussing their present lives.
Spike's daughter, Emma (played by Miriam McDonald) is told by her online boyfriend, Jordan, that he is coming to Toronto for a school field trip, and asks her if she would like to meet him for the first time. Her friends, Manny Santos (Cassie Steele), J.T. Yorke (Ryan Cooley) and Toby Isaacs (Jake Goldsbie) warn her of the potential dangers of meeting somebody she only knows from the Internet, and tell her that he could be an Internet stalker, pointing out that schools do not take field trips in the middle of summer. However, Emma is undeterred, convinced that Jordan is just a normal boy with whom she shares the same interests.
### Part Two
At the reunion party, Joey and Caitlin meet Alison Hunter (Sara Holmes), another Degrassi High attendee. As the evening progresses, Joey overhears Keith and Alison flirting with each other, and Keith reveals he does not want to marry Caitlin. When Joey confronts Keith, their argument turns into a physical altercation, and Alison has to tell Caitlin about Keith's hesitance over getting married. Joey and Caitlin have a heart-to-heart discussion about their past and their relationships and, after ten years, finally make amends when she forgives him for his affair with Tessa Campanelli. Meanwhile, Wheels (Neil Hope) apologizes to Lucy for crippling her while driving drunk nine years earlier; she forgives him, having realized what he has been through since the death of his adoptive parents.
While her mother attends the reunion, Emma visits Jordan at his hotel, where she meets his teacher, Mr. Nystrom (Jeff Gruich). He takes her up to Jordan's hotel room, but as they enter, Emma sees that it is completely empty except for a video camera which has been set up. She immediately becomes suspicious and tries to escape, but Nystrom blocks her access to the bedroom door. She locks herself in the bathroom, and comes to the startling realization that Nystrom is Jordan. Nystrom apologizes and tells Emma he will let her go, but when she comes out of the bathroom, he grabs and restrains her.
Unable to get in contact with Emma, Manny tells Toby and J. T. that she is afraid that Emma may have gone to meet Jordan. They hack into Emma's email account and realize that Jordan has told her a number of lies. After discovering which hotel Emma is meeting Jordan at, they rush to the school to inform Spike. As Nystrom attempts to rape Emma, Spike and Snake arrive just in time to save her. Emma manages to break free from Nystrom and rushes out of the hotel room. Snake restrains Nystrom until the police arrive to take him away.
## Production
Linda Schuyler had co-created The Kids of Degrassi Street in 1979 with Kit Hood, and Yan Moore was a writer on that series. As the children grew up, the Degrassi franchise developed into Degrassi Junior High and Degrassi High. In 1999, two episodes of Jonovision, a CBC Television talk show aimed at teenagers, reunited some of the cast members from the series. At the same time, Schuyler and Moore were developing a new television drama. When the Jonovision reunion episode proved to be popular, Schuyler and Moore wondered about reuniting the characters, too. As the months passed, they began thinking about what might have happened to the characters of Degrassi High and realized that the character Emma Nelson, born at the end of Degrassi Junior High's second season, would soon be entering junior high school. Stephen Stohn, Schuyler's husband, suggested Degrassi: The Next Generation as the name for the new sequel series, borrowing the concept from Star Trek: The Next Generation, of which he was a fan.
The new series was offered a place on a number of television networks, with CTV and CBC (the franchise's former network) vying as the top contenders. CTV won out, offering $10 million for a fifteen-episode season. The project was greenlit in May 2000, with the originally planned reunion episode serving as the pilot to the new series. CTV announced the new series at its annual press conference in June 2001, and said the pilot would air in the fall.
The earliest Degrassi series were filmed on and around De Grassi Street in Toronto, Ontario; Degrassi Junior High used Vincent Massey Public School, then known as Daisy Avenue Public School, as its primary filming location, and Centennial College was used for Degrassi High. In contrast, Degrassi: The Next Generation is filmed at Epitome Pictures' studios in Toronto. A 100,000-square-foot (9,300 m²) former printing factory was converted in 1997 for Epitome, consisting of four soundstages and a backlot. The exterior of Degrassi Community School was located on the studio's backlot, and used the same colours and glass pattern as Degrassi High's Centennial College.
Production on "Mother and Child Reunion" began earlier than expected, as CTV initially planned to launch the series in January 2002. At the eleventh hour the broadcaster decided to bring it forward to October 2001 to coincide with the back-to-school season. The episode was written by series co-creator Moore, also credited as creative consultant, and script editor Aaron Martin. Co-creator Schuyler, with her husband and Epitome Pictures partner Stohn, served as executive producers. The line producer was David Lowe. "Mother and Child Reunion" was directed by Bruce McDonald, who had previously directed the films Roadkill (1989), Highway 61 (1991), Hard Core Logo (1996), and the television series Queer as Folk (2001–2005). Epitome Pictures sought funding from the Government of Canada, through its two Crown corporations, Telefilm Canada and the Canadian Television Fund, which provide financial support to Canadian audiovisual productions. Filming began on July 3, before Epitome Pictures could finalize their contracts with Telefilm and the Television Fund. Other financial contributors included Royal Bank of Canada, Cogeco, Shaw Communications, and Bell Canada.
To appeal to Degrassi's established audience, a number of references to events which occurred in Degrassi Junior High, Degrassi High and School's Out were written into the episode. Throughout those two series, Joey would frequently wear a fedora which became that character's trademark prop. The fedora made a reappearance in this episode, and was worn by Manny, Spike, Snake, Caitlin and Lucy, but not Joey, and it appeared in every scene which featured a character from the old series.
Prior to the episode airing, a website was created with a "virtual school" in which fans could "enroll" in order to receive regular emails from their character "classmates" and discuss ongoing plots, in an effort to provide a complete viewing experience for the audience. As the broadcast date of the episode neared, more content was added to the website to make it appear as if it were a true school reunion website. The website was actually seen on screen when the characters Spike and Caitlin were reminiscing about their high school days.
"Mother and Child Reunion" aired on the terrestrial television network CTV on October 14, 2001, and was advertised as a television special. In the United States, it was broadcast on July 1, 2002, on the Noggin cable channel during its programming block for teenagers, The N; it served as the final episode of season one. In Australia, the episode aired on the terrestrial network ABC TV on August 1, 2002. The episode has been released on DVD as part of the complete first season DVD boxset, which was released in Canada on October 19, 2004, in the U.S. on September 28, 2004, and in Australia on September 8, 2010. The episode is also available at iTunes Stores to download and watch on home computers and certain iPod models, and at Zune Marketplace for the Xbox 360 and Zune media players.
## References to the original series
The two-part episode also follows up on plotlines from the original series. The accident in the telemovie School's Out, in which Wheels hit Lucy while driving drunk, is discussed twice: first at the bar when the characters are discussing their lives, and a second time when Wheels comes to the reunion to apologize to Lucy. Finally, Joey and Caitlin make amends; their relationship had ended when he cheated on her with Tessa Campanelli in School's Out. Snake, who had been dating a girl named Pam by the end of School's Out, is still angry at Wheels for the accident. Lucy's eyesight is restored (whether in one or both eyes is unclear) and, having completed extensive physical therapy, she is able to walk well with the use of a cane; she has also completed an honours bachelor's degree, a master's degree and most of the work for a Ph.D. Joey and Caitlin have gone their separate ways and not seen each other in many years (though it is unclear if Alexa and Simon's wedding was the last time). Joey is now a used car dealer and a widowed father of a young daughter. His prediction for Caitlin, however, has come true: she lives in Los Angeles and is famous as the host of an environmental-political television series. At the reunion, Caitlin breaks up with her new fiancé Keith; as she takes off her ring while sitting alone with Joey, she asks him, "Bring back any memories?" Alexa and Simon are still married, as evidenced by the characters sitting together and her actress Irene Courakos being credited as "Alexa Dexter" rather than "Alexa Pappadopoulos".
An interview of original cast members done in character was included as a bonus feature on the Season 1 DVD. In the interviews, Alison Hunter says she is a waitress and wants to become a star in Hollywood. Dwayne Myers says that while he is HIV positive, he is AIDS-free, and hopes to live ten more years. Kathleen Mead is shown to be jealous of Caitlin's success after high school; she says she is pitching an idea for a new television show called "Mead in Canada". Diana Economopoulos is now Diana Platt; she and her husband are accountants. She has two children and brags about her website. Yick Yu says that he and Arthur Kobalewsky are co-owners of a web design company. Alexa and Simon Dexter live around Ottawa with their two children, and Alexa, now a quintessential soccer mom, is pregnant with their third. Ricky is a lawyer and is helping write copyright laws for the internet. Rainbow is a documentary filmmaker. Liz O'Rourke says she is now working with the mentally disabled (although she is later revealed to be a midwife in the episode "Father Figure"). Trish Skye says she had problems but has been clean and sober for two years; she works at a funeral home and is a freelance writer. In the same video, Spike mentions that she bought out her mother's hair salon and comments on the "weird déjà vu" of Emma now attending Degrassi; Snake also appears and notes that he is a teacher at the titular school, and still single.
## Cast
Producers were able to bring back a number of actors from Degrassi High to guest star as their characters for the reunion storyline. Stefan Brogren, Pat Mastroianni, Stacie Mistysyn and Amanda Stepto agreed early on to return for the episode to play their characters Archie "Snake" Simpson, Joey Jeremiah, Caitlin Ryan and Christine "Spike" Nelson respectively, and appeared at the CTV press conference in June to publicize the new series. Brogren and Stepto signed contracts to appear in the entire season. Dan Woods also returned to play Dan Raditch, now principal of Degrassi Community School. By the time "Mother and Child Reunion" began to shoot, twelve more former Degrassi High cast members had agreed to appear: Danah Jean Brown (Trish Skye), Darrin Brown (Dwayne Myers), Michael Carry (Simon Dexter), Irene Courakos (Alexa Pappadopoulos), Chrissa Erodotou (Diana Economopoulos), Anais Granofsky (Lucy Fernandez), Rebecca Haines (Kathleen Mead), Sara Holmes (Alison Hunter), Neil Hope (Derek "Wheels" Wheeler), Kyra Levy (Maya Goldberg), Cathy Keenan (Liz O'Rourke), and Siluck Saysanasy (Yick Yu) all reprised their roles.
For the new generation of students, the producers chose from six hundred auditionees, all of whom were children, in an attempt to give the target audience of teenagers a group of characters they could relate to, rather than actors in their twenties pretending to be teenagers, as other shows of the same period and target audience, such as Buffy the Vampire Slayer and Dawson's Creek, were doing. Miriam McDonald first auditioned in October 2000 to play Emma Nelson, Spike's daughter, and was selected for the role after a callback and three screen tests. Ryan Cooley appeared as J.T. Yorke, Jake Goldsbie as Toby Isaacs, and Cassie Steele as Manny Santos; all signed their contracts just days before appearing at the CTV press conference. Christina Schmidt appeared briefly as Terri McGreggor, and Melissa McIntyre appeared in a single scene as Ashley Kerwin, with no spoken lines in this episode. Cassie Steele's sister, Alex Steele, made her first appearance as Angela Jeremiah, Joey's six-year-old daughter. She and Mastroianni returned to the series at the beginning of season two in more permanent roles.
Film director Kevin Smith, who had been a fan of Degrassi since the early 1990s, when he worked at a convenience store in Leonardo, New Jersey, has paid homage to the franchise by referencing it in several of his films. He named a Clerks character Caitlin Bree after Caitlin Ryan, his favorite Degrassi character, wrote Shannen Doherty's Mallrats character Rene as wearing a Degrassi jacket throughout the film, and had Jason Lee's character in Chasing Amy mention Degrassi Junior High as the television show he would rather be watching instead of going out. At the press conference for the new series, Schuyler announced that Smith would appear in "Mother and Child Reunion" as Caitlin's boyfriend, but due to scheduling conflicts he was unable to film the role, which was passed on to Don McKellar. Smith and his View Askewniverse sidekick Jason Mewes later guest starred in Degrassi: The Next Generation for three episodes of season four, two episodes of season five, and four episodes of season eight.
## Reception
"Mother and Child Reunion" received mixed reviews from the media. Stephanie McGrath of Canoe.ca's AllPop acknowledged Miriam McDonald's portrayal of Emma Nelson as "stellar acting abilities in a super creepy storyline ... high on tension, low on cheese [and] top-notch", and continued, "The young actors actually showed up their classic Degrassi counter-parts in the pilot episode. Their acting was solid, believable and age-appropriate, while some of the older crowd's dialogue sounded a bit stilted and over-rehearsed. Slightly wooden acting aside, it was still good to see Joey, Caitlin and the gang together again. Emma's story-line demonstrates that the creative forces behind The Next Generation haven't lost touch with teens yet, showing that one instalment of Degrassi: The Next Generation is worth 20 episodes of Dawson's Creek."
Tony Atherton of the Ottawa Citizen had mixed feelings about the new incarnation, saying it "has a cleaner, more polished look, has lost its edge [and offers] nothing new to viewers familiar with the groundbreaking preceding series, nor to anyone else who has watched the deluge of teen dramas since", adding that because there is "little ground left to break in teen drama there is a sense of déjà vu with regards to the plots and characters". He did, however, praise the show for having "the same simple narrative told from a kid's viewpoint, and the same regard for unvarnished reality [as Degrassi Junior High and Degrassi High]. It is light years from far-fetched high-school melodramas like Boston Public and Dawson's Creek ... is every bit as good as its beloved predecessor. In fact, in some respects it is even better".
When the series began in the U.S., The Seattle Times' Melanie McFarland was unsure whether its success and popularity in Canada would continue across the border: "As popular as Degrassi was, it was still a mere cult hit in the United States; the crowd that had access to it initially on PBS might not be able to tune into Noggin. Soft-pedaling through the issues might work for today's family of viewers, but what's gentle enough for Mom and Dad's peace of mind might not be enough to hook Junior or the original Degrassi's older fans". She was, however, "happy Noggin chose Degrassi students to navigate teen perils instead of digging up Screech and the gang [characters from Saved by the Bell] for another nauseating go-round".
"Mother and Child Reunion" was nominated for two Directors Guild of Canada Awards, winning in the "Outstanding Achievement in a Television Series – Children's" category, and picked up two Gemini Award nominations in the categories for "Best Photography in a Dramatic Program or Series" and "Best Short Dramatic Program". |
# Slug (song)
"Slug" is a song by Passengers, a side project of rock band U2 and musician Brian Eno. It is the second track on Passengers' only release, the 1995 album Original Soundtracks 1. The track was originally titled "Seibu" and was almost left off the album before it was rediscovered later during the recording sessions. Though Eno made most of the creative decisions during the recording sessions, "Slug" was one of the few tracks that the members from U2 tried to craft themselves.
Lyrically, it is a portrait of a desolate soul during a time of celebration. Because Passengers were writing songs for fictional soundtracks, they treated the visual suggestion of the music as more important than the story told by the lyrics. In "Slug", the instrumentation is intended to represent the lights turning on in a city at night. The group drew inspiration for the song primarily from U2's experiences in Tokyo at the conclusion of the Zoo TV Tour. "Slug" was praised by critics from various publications as one of the best songs on the album.
## Background and recording
U2 and musician Brian Eno intended to record a soundtrack for Peter Greenaway's 1996 film The Pillow Book. Although the plan did not come to fruition, Eno suggested they continue recording music suitable for film soundtracks, as Eno did with his Music for Films album series. The result was Original Soundtracks 1, an experimental album of ambient and electronica music, created as a side project between U2 and Eno under the pseudonym "Passengers". Vocalist Bono felt the visual suggestion from the music was more important than the story told by the lyrics, so the band tried to create visual music when recording. U2 spent time in Shinjuku, Tokyo, at the end of the Zoo TV Tour in 1993, and their experience in the city influenced the recording sessions. The vivid colours of the street signs and billboards reminded them of the set of the 1982 science-fiction film Blade Runner. Bono has said that Original Soundtracks 1 evoked the setting of a "bullet train in Tokyo".
Recording sessions for Original Soundtracks 1 began with a two-week session in November 1994 at Westside Studios in London, and continued for another five weeks in mid-1995 at Hanover Quay Studios in Dublin. "Slug" was originally titled "Seibu", after the Japanese department store of the same name. The song was written to create the visual of lights turning on at dusk in a city like Tokyo, beginning with "tinkling" opening notes resembling Christmas lights, and a gradually rising and falling synthesizer rhythm throughout the song. After recording "Seibu", the band set it aside, and the piece was forgotten as the sessions progressed. It was almost left off the album, until guitarist the Edge rediscovered the track while looking through the session's discarded songs. Recognizing its potential to become a great song, the Edge brought "Seibu" to Eno's attention, and in early June 1995, Eno listed "Seibu" as a late entry to be considered for the album.
As producer, Eno had most of the artistic control during the sessions, limiting U2's creative input on the recordings, which prompted the Edge to ensure extra work was put into arranging the song. He has said that, along with "Miss Sarajevo" and "Your Blue Room", "Seibu" was one of only three tracks on the album that U2 "really dug in [their] heels and did more work on and tried to craft". By early July 1995, the band had renamed the song "Seibu/Slug", and Eno noted that the piece was starting to sound better, describing it as a "lovely song". During the final editing of the track, Eno became angry with U2 because they seemed unfocused and he felt he was doing all the work. Bono decided to completely deconstruct the mix of the song; Eno initially disapproved, but was satisfied after hearing the changes. The editing of the track was finalised on 10 July, and the Edge later said he felt his effort to put extra work into the song "paid off". It was released under the title "Slug" on 7 November 1995 as the second track on the Passengers album Original Soundtracks 1; of the album's fourteen tracks, it is one of six to feature vocals. Details of the song's recording sessions were documented in Eno's 1996 book, A Year with Swollen Appendices.
As the compositions on Original Soundtracks 1 were written as film soundtrack music, each track is associated with a specific film in the album's liner notes, which were written by Eno. Four of the fourteen tracks are associated with real films, while "Slug" is credited as having been written for a fictional German film of the same name, directed by "Peter von Heineken" (a reference to U2 manager Paul McGuinness). The liner notes describe the plot of Slug as the story of a young car mechanic who hopes to attract the attention of a cashier by staging a robbery and pretending to be the hero. However, the "robbers" abandon the scheme and commit an actual robbery, leading to a shootout in which the cashier accidentally shoots a security guard and is arrested, and the mechanic must find a way to get her released from prison.
## Composition and lyrics
"Slug" runs for 4:41 and features a synthesizer rhythm and guitar harmonics laid over a drum track. Jon Pareles of The New York Times described the song's sound as a mix of "shimmering echoed guitars with swampy electronic rhythms". Vocals are sung by Bono in a murmured voice, which begin 1:45 into the song. The lyrics form a laundry list song, and consist of 19 lines, most of which begin with the words "Don't want"; the song's title is included in the lyrics "Don't want to be a slug". The line "Don't want what I deserve" was written by Bono with a sense of "ironic, self-deprecatory humour". The end result is a depiction of celebration set against the thoughts of a desolate soul, as echoed in the closing verse "Don't want to change the frame / Don't want to be a pain / Don't want to stay the same", with an undercurrent of confusion regarding the differences between love and faith.
The lyrics were written in five minutes and derive from U2's experience in Shinjuku. Bono has compared them to those of U2's 1991 song "Tryin' to Throw Your Arms Around the World", as both depict the nightlife of a city. The lyrics were also inspired by the presence of the yakuza in Shinjuku; the group saw gang members whose fingers had been amputated as punishment for misbehaviour, which Bono has described as a "very, very surreal" experience. He has said that "Slug" was about avoiding harmful mistakes, stating "we all play with things we shouldn't play with".
## Reception
"Slug" received positive feedback from critics and was praised as one of the best tracks from the album. Shortly following its release, Tony Fletcher wrote in Newsweek that it is one of the album's "instantly rewarding songs" and that Bono's vocals show "genuine tenderness". The Orange County Register listed "Slug" as one of the best songs on the album, describing it as a "dreamy" track, and The Age and The Dominion both stated that the song features Bono providing his best vocals. Jim DeRogatis of Rolling Stone described "Slug" as one of the album's most engaging tracks, commenting that it could have been an outtake from Zooropa because of Bono's "minimal crooning over skeletal backing tracks".
In retrospective reviews, Pitchfork wrote that "Slug" is the high point of the album, featuring a "beautiful, slow-motion groove", and Slate praised the experimental nature of the song, calling it "lovely and melodic". Uncut reviewer Alastair McKay described the melody as "clockwork" while noting that Eno's "yen for melodic simplicity" was evident. In an otherwise critical review of Original Soundtracks 1, Irvin Tan of Sputnikmusic commented that "Slug" is one of several "strangely beautiful numbers" from the album, and that its "attempt at creating an overarching time/place set actually comes off quite well". Hot Press editor Niall Stokes said "the song has a genuinely reflective quality and it underlines the fact that, some 15 years on since the release of their debut album Boy, U2 are still running." The song was featured on Stereogum's list of "The 31 Best U2 Non-Album Tracks", which claims the song is unlike any other track that U2 has recorded, describing it as a "hauntingly beautiful entry in U2's canon".
## Personnel
- Passengers
- Bono – vocals
- The Edge – guitar
- Adam Clayton – bass guitar
- Larry Mullen Jr. – drums, percussion
- Brian Eno – synthesizers
- Technical
- Brian Eno – mixing, sequencing
- Danton Supple – audio engineering
- Rob Kirwan – audio engineering (assistance)
# Christian Bale
Christian Charles Philip Bale (born 30 January 1974) is an English actor. Known for his versatility and physical transformations for his roles, he has been a leading man in films of several genres. He has received various accolades, including an Academy Award and two Golden Globe Awards. Forbes magazine ranked him as one of the highest-paid actors in 2014.
Born in Wales to English parents, Bale had his breakthrough role at age 13 in Steven Spielberg's 1987 war film Empire of the Sun. After more than a decade of performing in leading and supporting roles in films, he gained wider recognition for his portrayals of serial killer Patrick Bateman in the black comedy American Psycho (2000) and the title role in the psychological thriller The Machinist (2004). In 2005, he played superhero Batman in Batman Begins and again in The Dark Knight (2008) and The Dark Knight Rises (2012), garnering acclaim for his performance in the trilogy, which is one of the highest-grossing film franchises.
Bale continued in starring roles in a range of films outside his work as Batman, including the period drama The Prestige (2006), the action film Terminator Salvation (2009), the crime drama Public Enemies (2009), the epic film Exodus: Gods and Kings (2014) and the superhero film Thor: Love and Thunder (2022). For his portrayal of boxer Dicky Eklund in the 2010 biographical film The Fighter, he won an Academy Award and a Golden Globe Award. Further Academy Award and Golden Globe Award nominations came for his work in the black comedy American Hustle (2013) and the biographical dramedies The Big Short (2015) and Vice (2018). His performances as politician Dick Cheney in Vice and race car driver Ken Miles in the sports drama Ford v Ferrari (2019) earned him a second win and a fifth nomination respectively at the Golden Globe Awards.
## Early life
Christian Charles Philip Bale was born on 30 January 1974 in Haverfordwest, Pembrokeshire, to English parents—Jenny James, a circus performer, and David Bale, an entrepreneur and activist. Bale has remarked, "I was born in Wales but I'm not Welsh—I'm English." He has two elder sisters, Sharon and Louise, and a half-sister from his father's first marriage, Erin. One of his grandfathers was a comedian while the other was a stand-in for John Wayne. Bale and his family left Wales when he was two years old, and after living in Portugal and Oxfordshire, England, they settled in Bournemouth. Bale has said that the family had lived in 15 towns by the time he was 15; he described the frequent relocation as driven by "necessity rather than choice" and acknowledged that it had a major influence on his choice of career. He attended Bournemouth School, later saying he left school at age 16. Bale's parents divorced in 1991, and at age 17, he moved with his sister Louise and their father to Los Angeles.
Bale trained in ballet as a child. His first acting role came at eight years old in a commercial for the fabric softener Lenor. He also appeared in a Pac-Man cereal commercial. After his sister was cast in a West End musical, Bale considered taking up acting professionally. He said later he did not find acting appealing but pursued it at the request of those around him because he had no reason not to do so. After participating in school plays, Bale performed opposite Rowan Atkinson in the play The Nerd in the West End in 1984. He did not undergo any formal acting training.
## Career
### Early roles and breakthrough (1986–1999)
After deciding to become an actor at age ten, Bale secured a minor role in the 1986 television film Anastasia: The Mystery of Anna. Its star, Amy Irving, who was married to director Steven Spielberg, subsequently recommended Bale for Spielberg's 1987 film Empire of the Sun. At age 13, Bale was chosen from over 4,000 actors to portray a British boy in a World War II Japanese internment camp. For the film, he spoke with an upper-class cadence without the help of a dialogue coach. The role propelled Bale to fame, and his work earned him acclaim and the inaugural Best Performance by a Juvenile Actor Award from the National Board of Review of Motion Pictures. Earlier in the same year, he starred in the fantasy film Mio in the Land of Faraway, based on the novel Mio, My Son by Astrid Lindgren. The fame from Empire of the Sun led to Bale being bullied in school and finding the pressures of working as an actor unbearable. He grew distrustful of the acting profession because of media attention but said that he felt obligated at a young age to continue to act for financial reasons. Around this time, actor and filmmaker Kenneth Branagh persuaded Bale to appear in his film Henry V in 1989, which drew him back into acting. The following year, Bale played Jim Hawkins opposite Charlton Heston as Long John Silver in Treasure Island, a television film adaptation of Robert Louis Stevenson's book of the same name.
Bale starred in the 1992 Disney musical film Newsies, which was unsuccessful both at the box office and with critics. Rebecca Milzoff of Vulture revisited the film in 2012 and found the cracks in Bale's voice during his performance of the song "Santa Fe" charming and apt even though he was not a great singer. In 1993, he appeared in Swing Kids, a film about teenagers who secretly listen to forbidden jazz during the rise of Nazi Germany. In Gillian Armstrong's 1994 film Little Women, Bale played Theodore "Laurie" Laurence following a recommendation from Winona Ryder, who starred as Jo March. The film achieved critical and commercial success. Of Bale's performance, Ryder said he captured the complex nature of the role. He next voiced Thomas, a young compatriot of Captain John Smith, in the 1995 Disney animated film Pocahontas, which attracted a mixed critical reception. Bale played a small part in the 1996 film The Portrait of a Lady, based on the Henry James novel of the same name, and appeared in the 1998 musical film Velvet Goldmine, set in the 1970s during the glam rock era. In 1999, he was part of an ensemble cast, which included Kevin Kline and Michelle Pfeiffer, portraying Demetrius in A Midsummer Night's Dream, a film adaptation of William Shakespeare's play of the same name.
### Rise to prominence and commercial decline (2000–2004)
Bale played Patrick Bateman, an investment banker and serial killer, in American Psycho, a film adaptation of Bret Easton Ellis's novel of the same name, directed by Mary Harron. Although Harron had chosen Bale for the part, the film's production and distribution company, Lionsgate, originally disagreed and hired Leonardo DiCaprio to play Bateman, with Oliver Stone to direct. Bale and Harron were brought back after DiCaprio and Stone left the project. Bale exercised and tanned for months to achieve Bateman's chiseled physique and had his teeth capped to suit the character's narcissistic nature. American Psycho premiered at the 2000 Sundance Film Festival, where, according to Harron, critic Roger Ebert named it the most hated film of the event. Of Bale's work, Ebert wrote that he "is heroic in the way he allows the character to leap joyfully into despicability; there is no instinct for self-preservation here, and that is one mark of a good actor." The film was released in April 2000, becoming a commercial and critical success and later developing a cult following; the role established Bale as a leading man.
In the four years that followed American Psycho, Bale's career experienced critical and commercial failure. He next played a villainous real estate heir in John Singleton's action film Shaft and appeared in John Madden's film adaptation of the Louis de Bernières novel Captain Corelli's Mandolin as Mandras, a Greek fisherman who vies with Nicolas Cage's title character for the affections of Pelagia, played by Penélope Cruz. Bale said he found it refreshing to play Mandras, who is emotionally humane, after working on American Psycho and Shaft. In 2002, he appeared in three films: Laurel Canyon, Reign of Fire and Equilibrium. Reviewing Laurel Canyon for Entertainment Weekly, Lisa Schwarzbaum called Bale's performance "fussy". After having reservations about joining the post-apocalyptic Reign of Fire, which involved computer-generated imagery, Bale professed his enjoyment of making films that could go awry and cited director Rob Bowman as a reason for his involvement. In Equilibrium, he plays a police officer in a futuristic society and performs gun kata, a fictional martial art that incorporates gunfighting. IGN's Jeff Otto characterised Reign of Fire as "poorly received" and Equilibrium as "highly underrated", while The Independent's Stephen Applebaum described the two films along with Shaft and Captain Corelli's Mandolin as "mediocre fare".
Bale starred as the insomnia-ridden, emotionally dysfunctional title character in the psychological thriller The Machinist. To prepare for the role, he initially consumed only cigarettes and whiskey; his diet later expanded to include black coffee, an apple and a can of tuna per day. Bale lost 63 pounds (29 kg), weighing 121 pounds (55 kg) to play the character, who was written in the script as "a walking skeleton". His weight loss prompted comparisons with Robert De Niro's weight gain to play Jake LaMotta in the 1980 film Raging Bull. Describing his transformation as mentally calming, Bale said he had stopped working for a while because he had not come across scripts that piqued his interest, and that the film's script drew him to lose weight for the part. The Machinist was released in October 2004 and performed poorly at the box office. Roger Moore of the Orlando Sentinel regarded it as one of the best films of the year, and Todd McCarthy of Variety wrote that Bale's "haunted, aggressive and finally wrenching performance" gave it a "strong anchor".
### Batman and dramatic roles (2005–2008)
Bale portrayed American billionaire Bruce Wayne and his superhero alias Batman in Christopher Nolan's Batman Begins, a reboot of the Batman film series. Nolan cast Bale, who was still fairly unknown at the time, because he had "exactly the balance of darkness and light" that Nolan sought. For the part, Bale regained the weight he had lost for The Machinist and built muscle, weighing 220 pounds (100 kg). He trained in weapons, Wing Chun Kung Fu and the Keysi Fighting Method. Acknowledging the story's peculiar premise of a character "who thinks he can run around in a batsuit in the middle of the night", Bale said he and Nolan had deliberately approached it with "as realistic a motivation as possible", referencing the murder of Wayne's parents. Bale voiced Wayne and Batman differently, employing a gravelly tone for Batman, which Nolan believed reinforced the character's visual appearance. Batman Begins was released in the US in June 2005. Tim Grierson and Will Leitch of Vulture complimented Bale's "sensitive, intelligent portrayal of a spoiled, wayward Bruce who finally grows up (and fights crime)." The performance earned Bale the MTV Movie Award for Best Hero.
In the same year, Bale voiced the titular Howl, a wizard, in the English-language dub of Hayao Miyazaki's Howl's Moving Castle, a Japanese animated film adaptation of Diana Wynne Jones's novel of the same name. He committed to voicing the role after watching Miyazaki's animated film Spirited Away. Later that year, he starred as a US war veteran dealing with post-traumatic stress disorder in the David Ayer-helmed crime drama Harsh Times, which premiered at the Toronto International Film Festival. He also portrayed colonist John Rolfe in The New World, a historical drama inspired by the stories of Pocahontas, directed by Terrence Malick and released on 25 December 2005. The following year saw the premiere of Rescue Dawn, by German filmmaker Werner Herzog, in which Bale portrayed US fighter pilot Dieter Dengler, who fights for his life after being shot down on a mission during the Vietnam War. After the two worked together, Herzog stated that he had considered Bale among his generation's greatest talents long before he played Batman. The Austin Chronicle's Marjorie Baumgarten viewed Bale's work as a continuation of his "masterful command of yet another American personality type."
For the 2006 film The Prestige, Bale reunited with Batman Begins director Nolan, who said that Bale was cast after offering himself for the part. It is based on the novel of the same name by Christopher Priest about a rivalry between two Victorian era magicians, whom Bale and Hugh Jackman play in the film. While it attracted acclaim from critics, the film performed more modestly during its run in theatres, earning $110 million against a $40 million budget. In his review for The New York Times, A. O. Scott highlighted Bale's "fierce inwardness" and called his performance "something to savor". Bale next starred in the 2007 drama films I'm Not There, portraying two incarnations of singer-songwriter Bob Dylan, and in 3:10 to Yuma, playing a justice-seeking cattleman. He characterised his Dylan incarnations as "two men on a real quest for truth" and attributed his interest in 3:10 to Yuma to his affinity for films where he gets to "just be dirty and crawling in the mud". Bale reprised the role of Batman in Nolan's Batman Begins sequel The Dark Knight, which received acclaim and became the fourth film to gross more than $1 billion worldwide upon its July 2008 release. He did many of his own stunts, including one that involved standing on the roof of the Sears Tower in Chicago. The Dark Knight has been regarded by critics as the best superhero film.
### The Dark Knight trilogy completion and acclaim (2009–2012)
In February 2008, Warner Bros. announced that Bale would star as rebellion leader John Connor in the post-apocalyptic action film Terminator Salvation, directed by McG, who cited Bale as "the most credible actor of his generation". In February 2009, an audio recording of a tirade on the film's set in July 2008 involving Bale was released. It captured him directing profanities towards and threatening to attack the film's cinematographer Shane Hurlbut, who walked onto the set during the filming of a scene acted by Bale and Bryce Dallas Howard, and culminated in Bale threatening to quit the film if Hurlbut was not fired. Several colleagues in the film industry defended Bale, attributing the incident to his dedication to acting. Bale publicly apologised in February 2009, calling the outburst "inexcusable" and his behaviour "way out of order" and affirming to have made amends with Hurlbut. Terminator Salvation was released in May 2009 to tepid reviews. Claudia Puig of USA Today considered Bale's work to be "surprisingly one-dimensional", while The Age's Jake Wilson wrote he gave one of his least compelling performances. Bale later admitted he knew during production that the film would not revitalise the Terminator franchise as he had wished. He asserted he would not work with McG again.
Bale portrayed FBI agent Melvin Purvis opposite Johnny Depp as gangster John Dillinger in Michael Mann's crime drama Public Enemies. Released in July 2009, it earned critical praise and had a commercially successful theatrical run. Dan Zak of The Washington Post was unsatisfied with the casting of Bale and Depp, believing their characters' rivalry lacked electricity, while The New Republic's Christopher Orr found Bale's "characteristically closed off" performance "nonetheless effective". The following year, Bale starred as Dicky Eklund, a professional boxer whose career had ended because of his drug addiction, in David O. Russell's drama film The Fighter. It chronicles the relationship between Eklund and his brother and boxing trainee, Micky Ward, played by Mark Wahlberg. To balance Eklund's tragic condition, Bale incorporated humor into his characterisation. The portrayal, for which he lost 30 pounds (14 kg), was acclaimed, with the San Francisco Chronicle's Mick LaSalle describing it as "shrewdly observed, physically precise and psychologically acute". Bale won the Academy Award for Best Supporting Actor and the Golden Globe Award for Best Supporting Actor – Motion Picture for his performance. In 2011, he starred in Zhang Yimou's historical drama film The Flowers of War, which was the highest-grossing Chinese film of the year. Critics described it as "nationalistic", "anti-Japanese" and "too long, too melodramatic, too lightweight".
Bale played Batman again under Christopher Nolan's direction in the sequel The Dark Knight Rises, released in July 2012. He described Batman in the film as a remorseful recluse in poor mental and physical health, who has surrendered following the events of The Dark Knight. Following the shooting at a midnight showing of the film in Aurora, Colorado, Bale and his wife visited survivors, doctors and first responders at The Medical Center of Aurora as well as a memorial to victims. The Dark Knight Rises was the 11th film to gross more than $1 billion worldwide, surpassing The Dark Knight. Nolan's Batman film series, dubbed the Dark Knight trilogy, is one of the highest-grossing film franchises. It is also regarded as one of the best comic book film franchises. Bale's performance in the three films garnered universal acclaim, with The Guardian, The Indian Express, MovieWeb, NME and a poll conducted by the Radio Times ranking it as the best portrayal of Batman on film. Bale later revealed his dissatisfaction with his work throughout the trilogy, saying he "didn't quite nail" his part and that he "didn't quite manage" what he had hoped he would as Batman.
### Continued critical success (2013–2019)
In 2013, Bale played a steel mill worker in Scott Cooper's thriller Out of the Furnace. Cooper rewrote the film's script with Bale in mind before the two even met and would not proceed with the project without the actor's involvement. Critics commended the film and deemed it an excellent beginning of the next phase in Bale's career after playing Batman, with Kristopher Tapley of Variety noting his work in the film was his best. That same year, he starred in American Hustle, which reunited him with David O. Russell after their work on The Fighter. To play con artist Irving Rosenfeld, Bale studied footage of interviews with real-life con artist Mel Weinberg, who served as inspiration for the character. He gained 43 pounds (20 kg), shaved part of his head and adopted a slouched posture, which reduced his height by 3 inches (7.6 cm) and caused him to suffer a herniated disc. Russell indicated that Robert De Niro, who appeared in an uncredited role, did not recognise Bale when they were first introduced. Writing for the New York Daily News, Joe Neumaier found Bale's performance to be "sad, funny and riveting". He was nominated for an Academy Award and a Golden Globe Award for his work.
Bale portrayed Moses in Ridley Scott's epic film Exodus: Gods and Kings. Released in December 2014, the film faced accusations of whitewashing over the casting of Caucasian actors in Middle Eastern roles. Scott justified the casting decisions by citing financing needs, while Bale said that Scott had been forthright in getting the film made. Its critical response varied between negative and mixed, and the St. Louis Post-Dispatch's Joe Williams called Bale's performance the most apathetic of his career. Bale also appeared in Terrence Malick's drama Knight of Cups, which The Atlantic critic David Sims dubbed a "noble failure". During its premiere at the 65th Berlin International Film Festival in February 2015, Bale said he had filmed the project without learning any dialogue and that Malick had given him only a character description. Later that year, he starred as Michael Burry, an antisocial hedge fund manager, in Adam McKay's The Big Short, a biographical comedy-drama film about the financial crisis of 2007–08. He used an ocular prosthesis in the film. The Wall Street Journal's Joe Morgenstern found his portrayal "scarily hilarious—or in one-liners and quick takes, deftly edited". The role earned Bale Academy Award and Golden Globe Award nominations for Best Supporting Actor.
In the 2016 historical drama The Promise, set during the Armenian Genocide, he played an American journalist who becomes involved in a love triangle with a woman, played by Charlotte Le Bon, and an Armenian medical student, played by Oscar Isaac. Critics disapproved of the film, which accrued a $102 million loss. Reviewing it for The New York Times, Jeannette Catsoulis wrote that Bale appeared "muffled and indistinct". In Cooper's 2017 film Hostiles, Bale starred as a US Army officer escorting a gravely ill Cheyenne war chief and his family back to their home in Montana. He called the film "a western with brutal, modern-day resonance" and his character "a bigoted and hate-filled man". Bale learned the Cheyenne language while working on the film. Empire critic Dan Jolin considered his performance striking and one of the strongest of his career. In 2018, Bale voiced Bagheera in the adventure film Mowgli: Legend of the Jungle. Rolling Stone's David Fear wrote that his voice work and that of Andy Serkis, who directed the film, "bring the soul as well as sound and fury".
For the 2018 biographical comedy-drama Vice, written and directed by Adam McKay, Bale underwent another major physical transformation, gaining over 40 pounds (18 kg) and shaving his head to portray US Vice President Dick Cheney. He described Cheney, who is widely reckoned the most influential and most loathed vice president in US history, as "quiet and secretive". The film reunited Bale with Amy Adams, with whom he had co-starred in The Fighter and American Hustle. It received positive reviews, and The Guardian's Peter Bradshaw commended Bale's "terrifically and in fact rather scarily plausible" Cheney impersonation. The lauded performance won Bale the Golden Globe Award for Best Actor – Motion Picture Musical or Comedy and garnered him an Academy Award nomination. During his acceptance speech at the 76th Golden Globe Awards, Bale thanked Satan for inspiring his Cheney portrayal, eliciting a response from Cheney's daughter, US Representative Liz Cheney, who said Bale had ruined his opportunity to play "a real superhero".
Bale portrayed sports car racing driver Ken Miles in the 2019 sports drama Ford v Ferrari, for which he lost 70 pounds (32 kg) after playing Cheney. Directed by James Mangold, the film follows Miles and automotive designer Carroll Shelby, played by Matt Damon, in events surrounding the 1966 24 Hours of Le Mans race. The role earned Bale a fifth Golden Globe Award nomination. While promoting the film, he said he would no longer go through weight fluctuations for roles.
### Limited work (2020–present)
Bale played Gorr the God Butcher, a villain, in the Marvel Cinematic Universe superhero film Thor: Love and Thunder, which was released in July 2022. He cited a character in the music video for the Aphex Twin song "Come to Daddy" as an inspiration for his characterisation of Gorr. Bale's portrayal drew praise from critics, who deemed it "grounded and non-campy". He produced and appeared in David O. Russell's period film Amsterdam and Scott Cooper's thriller The Pale Blue Eye, reuniting with both directors for the third time. Amsterdam was released in October 2022, receiving dire reviews and failing at the box office. The Pale Blue Eye, adapted from the novel of the same name by Louis Bayard, was released in December 2022 to mixed reviews from critics. With the 2023 Japanese animated film The Boy and the Heron, Bale voiced a character in an English-language dub of a Hayao Miyazaki film for the second time.
Bale will next play Frankenstein's monster in Maggie Gyllenhaal's fantasy period film The Bride! It is scheduled for release in 2025.
## Artistry and public image
Bale is known for his exhaustive dedication to the weight fluctuations that his parts demand as well as "the intensity with which he completely inhabits his roles", with The Washington Post's Ann Hornaday rating him among the most physically gifted actors of his generation. Max O'Connell of RogerEbert.com deemed Bale's commitment to altering his physical appearance "an anchoring facet to a depiction of obsession" in his performances, while the Los Angeles Times's Hugh Hart likened the urgency that drives Bale's acting style to method acting, adding that it "convincingly animates even his most extreme physical transformations." Bale has said that he does not practise method acting and does not use a particular technique. He named Rowan Atkinson as his template as an actor, adding that he was mesmerised by Atkinson when they worked together. He also studied the work of Gary Oldman, crediting him as the reason he pursued acting.
Bale has been recognised for his versatility; Martha Ross of The Mercury News named him one of his generation's most versatile actors. Known to be very private about his personal life, Bale has said that his objective is to embody characters without showing any aspect of himself. He explained that "letting people know who you are" does not help create different characters, viewing anonymity as "what's giving you the ability to play those characters". During interviews to promote films in which he put on an accent, Bale would continue speaking in that accent. Bale has also been noted for portraying roles with an American accent, with The Atlantic's Joe Reid listing him among those who "work least in their native accents"; in real life, Bale speaks in an "emphatic, non-posh" English accent.
Bale was ranked at number eight on Forbes magazine's list of the highest-paid actors of 2014, earning $35 million. He has been described as a sex symbol.
## Personal life
Bale has lived in Los Angeles since the 1990s and holds US citizenship. He married Sandra "Sibi" Blažić, an American former model of Serbian descent, in Las Vegas on 29 January 2000; they have a daughter and a son. In 2000, Bale became feminist Gloria Steinem's stepson following her marriage to his father, who died in 2003 of brain lymphoma.
Bale became a vegetarian at seven years old but said in 2009 that he was "in and out of the vegetarianism now". He had stopped eating red meat after reading the children's book Charlotte's Web. An animal rights activist, he supports the organisations Greenpeace, the World Wide Fund for Nature, the Doris Day Animal League, the Dian Fossey Gorilla Fund International and Redwings Horse Sanctuary. While promoting The Flowers of War in December 2011, Bale, with a crew from the television network CNN, attempted to visit Chen Guangcheng, a blind, self-taught "barefoot lawyer" then confined to his village in eastern China. He was forced to retreat after scuffling with guards at a checkpoint. Bale finally met Chen the following year at a dinner held by the nonprofit Human Rights First, during which he presented Chen with an award. Bale voiced Chen's story in Amnesty International's podcast, In Their Own Words. He co-founded California Together, an organisation aiming to construct a village in Palmdale, California, to help siblings in foster care remain together.
On 22 July 2008, Bale was arrested in London after his mother and his sister Sharon reported him to the police for an alleged assault at a hotel. He was released on bail. Bale denied the allegations and later called the incident "a deeply personal matter". On 14 August, the Crown Prosecution Service declared they would take no further action against him because of "insufficient evidence to afford a realistic prospect of conviction".
## Acting credits and accolades
According to the review aggregation website Rotten Tomatoes, which assigns films scores based on the number of positive critical reviews they receive, some of Bale's highest-scoring films are The Dark Knight (2008), Ford v Ferrari (2019), American Hustle (2013), Little Women (1994), The Fighter (2010), Rescue Dawn (2007), 3:10 to Yuma (2007), The Big Short (2015), Howl's Moving Castle (2005) and The Dark Knight Rises (2012); the first, third and last of which are also listed by the data website The Numbers as his highest-grossing films, alongside Terminator Salvation (2009), Batman Begins (2005), Pocahontas (1995), Thor: Love and Thunder (2022) and Exodus: Gods and Kings (2014).
Bale has garnered four Academy Award nominations, including two in the Best Actor category for his work in American Hustle and Vice (2018) as well as two in the Best Supporting Actor category for his work in The Fighter (2010) and The Big Short; he won one for The Fighter. He has earned two Golden Globe Awards, including Best Supporting Actor – Motion Picture for his role in The Fighter and Best Actor – Motion Picture Musical or Comedy for his role in Vice, and received two nominations for Best Actor – Motion Picture Musical or Comedy for his performances in American Hustle and The Big Short and a nomination for Best Actor – Motion Picture Drama for his performance in Ford v Ferrari. Bale has also been nominated for eight Screen Actors Guild Awards, winning in the Outstanding Performance by a Male Actor in a Supporting Role category for The Fighter and the Outstanding Performance by a Cast in a Motion Picture category as part of the American Hustle cast.
# Political career of John C. Breckinridge
The political career of John C. Breckinridge included service in the state government of Kentucky, the federal government of the United States, and the government of the Confederate States of America. In 1857, at the age of 36, he was inaugurated as Vice President of the United States under James Buchanan; he remains the youngest person ever to hold the office. Four years later, he ran as the presidential candidate of a dissident group of Southern Democrats but lost the election to the Republican candidate, Abraham Lincoln.
A member of the Breckinridge political family, John C. Breckinridge became the first Democrat to represent Fayette County in the Kentucky House of Representatives, and in 1851, he was the first Democrat to represent Kentucky's 8th congressional district in over 20 years. A champion of strict constructionism, states' rights, and popular sovereignty, he supported Stephen A. Douglas's Kansas–Nebraska Act as a means of addressing slavery in the territories acquired by the U.S. in the Mexican–American War. Considering his re-election to the House of Representatives unlikely in 1854, he returned to private life and his legal practice. He was nominated for vice president at the 1856 Democratic National Convention, and although he and Buchanan won the election, he enjoyed little influence in Buchanan's administration.
In 1859, the Kentucky General Assembly elected Breckinridge to a U.S. Senate term that would begin in 1861. In the 1860 United States presidential election, Breckinridge captured the electoral votes of most of the Southern states, but finished a distant second among four candidates. Lincoln's election as president prompted the secession of the Southern states to form the Confederate States of America. Though Breckinridge sympathized with the Southern cause, in the Senate he worked futilely to reunite the states peacefully. After the Confederates fired on Fort Sumter, beginning the Civil War, he opposed allocating resources for Lincoln to fight the Confederacy. Fearing arrest after Kentucky sided with the Union, he fled to the Confederacy, joined the Confederate States Army, and was subsequently expelled from the Senate. He served in the Confederate Army from October 1861 to February 1865, when Confederate President Jefferson Davis appointed him Confederate States Secretary of War. Then, concluding that the Confederate cause was hopeless, he encouraged Davis to negotiate a national surrender. Davis's capture on May 10, 1865, effectively ended the war, and Breckinridge fled to Cuba, then Great Britain, and finally Canada, remaining in exile until President Andrew Johnson's offer of amnesty in 1868. Returning to Kentucky, he refused all requests to resume his political career and died of complications related to war injuries in 1875.
## Formative years
Historian James C. Klotter has speculated that, had John C. Breckinridge's father, Cabell, lived, he would have steered his son to the Whig Party and the Union, rather than the Democratic Party and the Confederacy, but the Kentucky Secretary of State and former Speaker of the Kentucky House of Representatives died of a fever on September 1, 1823, months before his son's third birthday. Burdened with her husband's debts, widow Mary Breckinridge and her children moved to her in-laws' home near Lexington, Kentucky, where John C. Breckinridge's grandmother taught him the political philosophies of his late grandfather, U.S. Attorney General John Breckinridge. John Breckinridge believed the federal government was created by, and subject to, the co-equal governments of the states. As a state representative, he introduced the Kentucky Resolutions of 1798 and 1799, which denounced the Alien and Sedition Acts and asserted that states could nullify them and other federal laws that they deemed unconstitutional. A strict constructionist, he held that the federal government could only exercise powers explicitly given to it in the Constitution.
Most of the Breckinridges were Whigs, but John Breckinridge's posthumous influence inclined his grandson toward the Democratic Party. Additionally, John C. Breckinridge's friend and law partner, Thomas W. Bullock, was from a Democratic family. In 1842, Bullock told Breckinridge that by the time they opened their practice in Burlington, Iowa, "you were two-thirds of a Democrat"; living in heavily Democratic Iowa Territory further distanced him from Whiggery. He wrote weekly editorials in the Democratic Iowa Territorial Gazette and Advisor, and in February 1843, he was named to the Des Moines County Democratic committee. A letter from Breckinridge's brother-in-law related that, when Breckinridge's uncle William learned that his nephew had "become loco-foco", he said, "I felt as I would have done if I had heard that my daughter had been dishonored." On a visit to Kentucky in 1843, Breckinridge met and married Mary Cyrene Burch, ending his time in Iowa.
## Views on slavery
Slavery issues dominated Breckinridge's political career, although historians disagree about Breckinridge's views. In Breckinridge: Statesman, Soldier, Symbol, William C. Davis argues that, by adulthood, Breckinridge regarded slavery as evil; his entry in the 2002 Encyclopedia of World Biography records that he advocated voluntary emancipation. In Proud Kentuckian: John C. Breckinridge 1821–1875, Frank Heck disagrees, citing Breckinridge's consistent advocacy for slavery protections, beginning with his opposition to emancipationist candidates—including his uncle, Robert Jefferson Breckinridge—in the state elections of 1849.
### Early influences
Breckinridge's grandfather, John, owned slaves, believing slavery was a necessary evil in an agrarian economy. He hoped for gradual emancipation but did not believe the federal government was empowered to effect it; Davis wrote that this became "family doctrine". As a U.S. Senator, John Breckinridge insisted that decisions about slavery in the Louisiana Territory be left to its future inhabitants, essentially the "popular sovereignty" advocated by John C. Breckinridge prior to the Civil War. John C. Breckinridge's father, Cabell, embraced gradual emancipation and opposed government interference with slavery, but Cabell's brother Robert, a Presbyterian minister, became an abolitionist, concluding that slavery was morally wrong. Davis recorded that all the Breckinridges were pleased when the General Assembly upheld the ban on importing slaves to Kentucky in 1833.
John C. Breckinridge encountered conflicting influences as an undergraduate at Centre College and in law school at Transylvania University. Centre President John C. Young, Breckinridge's brother-in-law, believed in states' rights and gradual emancipation, as did George Robertson, one of Breckinridge's instructors at Transylvania, but James G. Birney, father of Breckinridge's friend and Centre classmate William Birney, was an abolitionist. In an 1841 letter to Robert Breckinridge, who became his surrogate father after Cabell Breckinridge's death, John C. Breckinridge wrote that only "ignorant, foolish men" feared abolition. In an Independence Day address in Frankfort later that year, he decried the "unlawful dominion over the bodies ... of men". An acquaintance believed that Breckinridge's move to Iowa Territory was motivated, in part, by the fact that it was a free territory under the Missouri Compromise.
After returning to Kentucky, Breckinridge became friends with abolitionists Cassius Marcellus Clay, Garrett Davis, and Orville H. Browning. He represented freedmen in court and loaned them money. He was a Freemason and member of the First Presbyterian Church, both of which opposed slavery. Nevertheless, because blacks were educationally and socially disadvantaged in the South, Breckinridge concluded that "the interests of both races in the Commonwealth would be promoted by the continuance of their present relations". He supported the new state constitution adopted in 1850, which forbade the immigration of freedmen to Kentucky and required emancipated slaves to be expelled from the state. Believing it was best to relocate freedmen to the African colony of Liberia, he supported the Kentucky branch of the American Colonization Society. The 1850 Census showed that Breckinridge owned five slaves, aged 11 to 36. Heck recorded that his slaves were well-treated but noted that this was not unusual and proved nothing about his views on slavery.
### Moderate reputation
Because Breckinridge defended both the Union and slavery in the General Assembly, he was considered a moderate early in his political career. In June 1864, Pennsylvania's John W. Forney opined that Breckinridge had been "in no sense an extremist" when elected to Congress in 1851. Of his early encounters with Breckinridge, Forney wrote: "If he had a conscientious feeling, it was hatred of slavery, and both of us, 'Democrats' as we were, frequently confessed that it was a sinful and an anti-Democratic institution, and that the day would come when it must be peaceably or forcibly removed." Heck discounts this statement, pointing out that Forney was editor of a pro-Union newspaper and Breckinridge a Confederate general at the time it was published. As late as the 1856 presidential election, some alleged that Breckinridge was an abolitionist.
By the time he began his political career, Breckinridge had concluded that slavery was more a constitutional issue than a moral one. Slaves were property, and the Constitution did not empower the federal government to interfere with property rights. From Breckinridge's constructionist viewpoint, allowing Congress to legislate emancipation without constitutional sanction would lead to "unlimited dominion over the territories, excluding the people of the slave states from emigrating thither with their property". As a private citizen, he supported the slavery protections in the Kentucky Constitution of 1850 and denounced the Wilmot Proviso, which would have forbidden slavery in territory acquired in the Mexican–American War. As a state legislator, he declared slavery a "wholly local and domestic" matter, to be decided separately by the residents of each state and territory. Because Washington, D.C., was a federal entity and the federal government could not interfere with property rights, he concluded that forced emancipation there was unconstitutional. As a congressman, he insisted on Congress's "perfect non-intervention" with slavery in the territories. Debating the 1854 Kansas–Nebraska Act, he explained, "The right to establish [slavery in a territory by government sanction] involves the correlative right to prohibit; and, denying both, I would vote for neither."
### Later views
Davis notes that Breckinridge's December 21, 1859, address to the state legislature marked a change in his public statements about slavery. He decried the Republicans' desire for "negro equality", his first public indication that he may have believed blacks were biologically inferior to whites. He declared that the Dred Scott decision showed that federal courts afforded adequate protection for slave property, but advocated a federal slave code if future courts failed to enforce those protections; this marked a departure from his previous doctrine of "perfect non-interference". Asserting that John Brown's raid on Harpers Ferry proved Republicans intended to force abolition on the South, he predicted "resistance [to the Republican agenda] in some form is inevitable". He still urged the Assembly against secession—"God forbid that the step shall ever be taken!"—but his discussion of growing sectional conflict bothered some, including his uncle Robert.
Klotter wrote that Breckinridge's sale of a female slave and her six-week-old child in November 1857 probably ended his days as a slaveholder. Slaves were not listed among his assets in the 1860 Census, but Heck noted that he had little need for slaves at that time, since he was living in Lexington's Phoenix Hotel after returning to Kentucky from his term as vice president. Some slavery advocates refused to support him in the 1860 presidential race because he was not a slaveholder. Klotter noted that Breckinridge fared better in rural areas of the South, where there were fewer slaveholders; in urban areas where the slave population was higher, he lost to Constitutional Unionist candidate John Bell, who owned 166 slaves. William C. Davis recorded that, in most of the South, the combined votes for Bell and Illinois Senator Stephen Douglas exceeded those cast for Breckinridge.
After losing the election to Abraham Lincoln, Breckinridge worked for adoption of the Crittenden Compromise—authored by fellow Kentuckian John J. Crittenden—as a means of preserving the Union. Breckinridge believed the Crittenden proposal—restoring the Missouri Compromise line as the separator between slave and free territory in exchange for stricter enforcement of the Fugitive Slave Act of 1850 and federal non-interference with slavery in the territories and Washington, D.C.—was the most extreme proposal to which the South would agree. Ultimately, the compromise was rejected and the Civil War soon followed.
## Early political career
A supporter of the annexation of Texas and "manifest destiny", Breckinridge campaigned for James K. Polk in the 1844 presidential election, prompting a relative to observe that he was "making himself very conspicuous here by making flaming loco foco speeches at the Barbecues". He decided against running for Scott County clerk after his law partner complained that he spent too much time in politics. In 1845, he declined to seek election to the U.S. House of Representatives from the eighth district but campaigned for Alexander Keith Marshall, his party's unsuccessful nominee. He supported Zachary Taylor for the presidency in mid-1847 but endorsed the Democratic ticket of Lewis Cass and William O. Butler after Taylor became a Whig in 1848.
### Kentucky House of Representatives
In October 1849, Kentucky voters called for a constitutional convention. Emancipationists, including Breckinridge's uncles William and Robert, his brother-in-law John C. Young, and his friend Cassius Marcellus Clay, nominated "friends of emancipation" to seek election to the convention and the state legislature. In response, Breckinridge, who opposed "impairing [slavery protections] in any form", was nominated by a bipartisan pro-slavery convention for one of Fayette County's two seats in the Kentucky House of Representatives. With 1,481 votes, 400 more than any of his opponents, Breckinridge became the first Democrat elected to the state legislature from Fayette County, which was heavily Whig.
When the House convened in December 1849, a member from Mercer County nominated Breckinridge for Speaker against two Whigs. After receiving 39 votes—8 short of a majority—on the first three ballots, he withdrew, and the position went to Whig Thomas Reilly. Assigned to the committees on the Judiciary and Federal Relations, Breckinridge functioned as the Democratic floor leader during the session. Davis wrote that his most important work during the session was bank reform.
Breckinridge's first speech favored allowing the Kentucky Colonization Society to use the House chamber; later, he advocated directing Congress to establish an African freedmen colony, and to meet the costs of transporting settlers there. Funding internal improvements was traditionally a Whig stance, but Breckinridge advocated conducting a state geologic survey, making the Kentucky River more navigable, chartering a turnpike, incorporating a steamboat company, and funding the Kentucky Lunatic Asylum. As a reward for supporting these projects, he presided over the approval of the Louisville and Bowling Green Railroad's charter and was appointed director of the asylum.
Resolutions outlining Kentucky's views on the proposed Compromise of 1850 were referred to the Committee on Federal Relations. The committee's Whig majority favored one calling the compromise a "fair, equitable, and just basis" for dealing with slavery in the territories and urging Congress not to interfere with slavery there or in Washington, D.C. Feeling this left open the issue of Congress's ability to legislate emancipation, Breckinridge asserted in a competing resolution that Congress could not establish or abolish slavery in states or territories. Both resolutions, and several passed by the state Senate, were laid on the table without being adopted.
Breckinridge left the session on March 4, 1850, three days before its adjournment, to tend to John Milton Breckinridge, his infant son who had fallen ill; the boy died on March 18. To distract from his grief, he campaigned for ratification of the new constitution, objecting only to its difficult amendment process. He declined renomination, citing concerns "of a private and imperative nature". Davis wrote that the problem was money, since his absence from Lexington had hurt his legal practice, but his son's death was also a factor.
## U.S. House of Representatives
At an October 17, 1850, barbecue celebrating the Compromise of 1850, Breckinridge toasted its author, Whig Party founder Henry Clay. Clay reciprocated by praising Breckinridge's grandfather and father, expressing hope that Breckinridge would use his talents to serve his country, then embracing him. Some observers believed that Clay was endorsing Breckinridge for higher office, and Whig newspapers began referring to him as "a sort of half-way Whig" and implying that he voted for Taylor in 1848.
### First term (1851–1853)
Delegates to the Democrats' January 1851 state convention nominated Breckinridge to represent Kentucky's eighth district in the U.S. House of Representatives. The district, called the "Ashland district" because it contained Clay's Ashland estate and much of the area he once represented, was typically carried by Whigs by 600 to 1,000 votes. A Democrat had not represented it since 1828, and in the previous election no Democrat had sought the office. Breckinridge's opponent, Leslie Combs, was a popular War of 1812 veteran and former state legislator. As they campaigned together, Breckinridge's eloquence contrasted with Combs' plainspoken style. Holding that "free thought needed free trade", Breckinridge opposed Whig protective tariffs. He favored federal funding only for internal improvements "of a national character". Carrying only three of seven counties, but bolstered by a two-to-one margin in Owen County, Breckinridge garnered 54% of the vote, winning the election by a margin of 537 votes.
Considered for Speaker of the House, Breckinridge believed his election unlikely and refused to run against fellow Kentuckian Linn Boyd. Boyd was elected, and despite Breckinridge's gesture, assigned him to the lightly regarded Foreign Affairs Committee. Breckinridge resisted United States Democratic Review editor George Nicholas Sanders' efforts to recruit him to the Young America movement. Like Young Americans, Breckinridge favored westward expansion and free trade, but he disagreed with the movement's support of European revolutions and its disdain for older statesmen. On March 4, 1852, Breckinridge made his first speech in the House, defending presidential aspirant William Butler against charges by Florida's Edward Carrington Cabell, a Young American and distant cousin, that Butler secretly sympathized with the Free Soilers. He denounced Sanders for his vitriolic attacks on Butler and for calling all likely Democratic presidential candidates except Stephen Douglas "old fogies".
The speech made Breckinridge a target of Whigs, Young Americans, and Douglas supporters. Humphrey Marshall, a Kentucky Whig who supported incumbent President Millard Fillmore, attacked Breckinridge for claiming Fillmore had not fully disclosed his views on slavery. Illinois' William Alexander Richardson, a Douglas backer, tried to distance Douglas from Sanders' attacks on Butler, but Breckinridge showed that Douglas endorsed the Democratic Review a month after it printed its first anti-Butler article. Finally, Breckinridge's cousin, California's Edward C. Marshall, charged that Butler would name Breckinridge Attorney General in exchange for his support and revived the charge that Breckinridge broke party ranks, supporting Zachary Taylor for president. Breckinridge ably defended himself, but Sanders continued to attack him and Butler, claiming Butler would name Breckinridge as his running mate, even though Breckinridge was too young to qualify as vice president.
After his maiden speech, Breckinridge took a more active role in the House. In debate with Ohio's Joshua Reed Giddings, he defended the Fugitive Slave Law's constitutionality and criticized Giddings for hindering the return of fugitive slaves. He opposed Tennessee Congressman Andrew Johnson's Homestead Bill, fearing it would create more territories that excluded slavery. Although generally opposed to funding local improvements, he supported the repair of two Potomac River bridges to avoid higher costs later. Other minor stands included supporting measures to benefit his district's hemp farmers, voting against giving the president ten more appointments to the U.S. Naval Academy, and opposing funds for a sculpture of George Washington because the sculptor proposed depicting Washington in a toga.
Beginning in April, Breckinridge made daily visits to an ailing Henry Clay. Clay died June 29, 1852, and Breckinridge garnered nationwide praise and enhanced popularity in Kentucky after eulogizing Clay in the House. Days later, he spoke in opposition to increasing a subsidy to the Collins Line for carrying trans-Atlantic mail, noting that Collins profited by carrying passengers and cargo on mail ships. In wartime, the government could commandeer and retrofit Collins's steamboats as warships, but Breckinridge cited Commodore Matthew C. Perry's opinion that they would be useless in war. Finally, he showed Cornelius Vanderbilt's written statement promising to build a fleet of mail ships at his expense and carry the mail for $4 million less than Collins. Despite this, the House approved the subsidy increase.
### Second term (1853–1855)
With Butler's chances for the presidential nomination waning, Breckinridge convinced the Kentucky delegation to the 1852 Democratic National Convention not to nominate Butler until later balloting when he might become a compromise candidate. He urged restraint when Lewis Cass's support dropped sharply on the twentieth ballot, but Kentucky's delegates would wait no longer; on the next ballot, they nominated Butler, but he failed to gain support. After Franklin Pierce, Breckinridge's second choice, was nominated, Breckinridge tried, unsuccessfully, to recruit Douglas to Pierce's cause. Pierce lost by 3,200 votes in Kentucky—one of four states won by Winfield Scott—but was elected to the presidency, and appointed Breckinridge governor of Washington Territory in recognition of his efforts. Unsure of his re-election chances in Kentucky, Breckinridge had sought the appointment, but after John J. Crittenden, rumored to be his challenger, was elected to the Senate in 1853, he decided to decline it and run for re-election.
#### Election
The Whigs chose Attorney General James Harlan to oppose Breckinridge, but he withdrew in March when some party factions opposed him. Robert P. Letcher, a former governor who had not lost in 14 elections, was the Whigs' second choice. Letcher was an able campaigner who combined oratory and anecdotes to entertain and energize an audience. Breckinridge focused on issues in their first debate, comparing the Whig Tariff of 1842 to the Democrats' lower Walker tariff, which increased trade and yielded more tax revenue. Instead of answering Breckinridge's points, Letcher appealed to party loyalty, claiming Breckinridge would misrepresent the district "because he is a Democrat". Letcher appealed to Whigs "to protect the grave of Mr. [Henry] Clay from the impious tread of Democracy", but Breckinridge pointed to his friendly relations with Clay, remarking that Clay's will did not mandate that "his ashes be exhumed" and "thrown into the scale to influence the result of the present Congressional contest".
Cassius Clay, Letcher's political enemy, backed Breckinridge despite their differences on slavery. Citing Clay's support and the abolitionism of Breckinridge's uncle Robert, Letcher charged that Breckinridge was an abolitionist. In answer, Breckinridge quoted newspaper accounts and sworn testimony, provided by John L. Robinson, of a speech Letcher made in Indiana for Zachary Taylor in 1848. In the speech, made alongside Thomas Metcalfe, another former Whig governor of Kentucky, Letcher predicted that the Kentucky Constitution then being drafted would provide for gradual emancipation, declaring, "It is only the ultra men in the extreme South who desire the extension of slavery."
When Letcher confessed doubts about his election chances, Whigs began fundraising outside the district, using the money to buy votes or pay Breckinridge supporters not to vote. Breckinridge estimated that the donations, which came from as far away as New York and included contributions from the Collins Line, totaled $30,000; Whig George Robertson believed it closer to $100,000. Washington, D.C., banker William Wilson Corcoran contributed $1,000 to Breckinridge, who raised a few thousand dollars. Out of 12,538 votes cast, Breckinridge won by 526. He received 71% of the vote in Owen County, which recorded 123 more votes than registered voters. Grateful for the county's support, he nicknamed his son, John Witherspoon Breckinridge, "Owen".
#### Service
Of 234 representatives in the House, Breckinridge was one of 80 re-elected to the Thirty-third Congress. His relative seniority, and Pierce's election, increased his influence. He was rumored to have Pierce's backing for Speaker of the House, but he again deferred to Boyd; Maryland's Augustus R. Sollers spoiled Boyd's unanimous election by voting for Breckinridge. Still not given a committee chairmanship, he was assigned to the Ways and Means Committee, where he secured passage of a bill to cover overspending in fiscal year 1853–1854; it was the only time in his career that he solely managed a bill. His attempts to increase Kentucky's allocation in a rivers and harbors bill were unsuccessful but popular with his Whig constituents.
In January 1854, Douglas introduced the Kansas–Nebraska Act to organize the Nebraska Territory. Southerners had thwarted his previous attempts to organize the territory because Nebraska lay north of parallel 36°30' north, the line separating slave and free territory under the Missouri Compromise. They feared that the territory would be organized into new free states that would vote against the South on slavery issues. The Kansas–Nebraska Act allowed the territory's settlers to decide whether or not to permit slavery, an implicit repeal of the Missouri Compromise. Kentucky Senator Archibald Dixon's amendment to make the repeal explicit angered northern Democrats, but Breckinridge believed it would move the slavery issue from national to local politics, and he urged Pierce to support it. Breckinridge wrote to his uncle Robert that he "had more to do than any man here, in putting [the Act] in its present shape", but Heck notes that few extant records support this claim. The repeal amendment made the act more palatable to the South; only 9 of 58 Southern congressmen voted against it. No Northern Whigs voted for the measure, but 44 of 86 Northern Democrats voted in the affirmative, enough to pass it. The Senate quickly concurred, and Pierce signed the act into law on May 30, 1854.
During the debate on the bill, New York's Francis B. Cutting demanded that Breckinridge retract or explain a statement he had made, which Breckinridge understood as a challenge to duel. Under the code duello, the challenged party selected the weapons and the distance between combatants; Breckinridge chose rifles at 60 paces and suggested the duel be held in Silver Spring, Maryland, on the estate of his friend, Francis Preston Blair. Cutting had not meant his remark as a challenge, but insisted that he was now challenged and selected pistols at 10 paces. While their representatives tried to clarify matters, Breckinridge and Cutting made amends, averting the duel. Had it taken place, Breckinridge could have been removed from the House; the 1850 Kentucky Constitution prevented duelers from holding office.
In the second session of the 33rd Congress, Breckinridge acted as spokesman for Ways and Means Committee bills, including a bill to assume and pay the debts Texas incurred prior to its annexation. Breckinridge's friends, W. W. Corcoran and Jesse D. Bright, were two of Texas's major creditors. The bill, which was approved, paid only those debts related to powers Texas surrendered to Congress upon annexation. Breckinridge was disappointed that the House defeated a measure to pay the Sioux $12,000 owed them for the 1839 purchase of an island in the Mississippi River; the debt was never paid. Another increase in the subsidy to the Collins Line passed over his opposition, but Pierce vetoed it.
### Retirement from the House
In February 1854, the General Assembly's Whig majority gerrymandered the eighth district, removing over 500 Democratic voters and replacing them with several hundred Whig voters by removing Owen and Jessamine counties from the district and adding Harrison and Nicholas counties to it. The cooperation of the Know Nothing Party—a relatively new nativist political entity—with the faltering Whigs further hindered Breckinridge's re-election chances. With his family again in financial straits, his wife wanted him to retire from national politics.
Pierre Soulé, the U.S. Minister to Spain, resigned in December 1854 after being unable to negotiate the annexation of Cuba and angering the Spanish by drafting the Ostend Manifesto, which called for the U.S. to take Cuba by force. Pierce nominated Breckinridge to fill the vacancy, but did not tell him until just before the Senate's January 16 confirmation vote. After consulting Secretary of State William L. Marcy, Breckinridge concluded that the salary was insufficient and Soulé had so damaged Spanish relations that he would be unable to accomplish anything significant. In a letter to Pierce on February 8, 1855, he cited reasons "of a private and domestic nature" for declining the nomination. On March 17, 1855, he announced he would retire from the House.
Breckinridge and Minnesota Territory's Henry Mower Rice were among the speculators who invested in land near present-day Superior, Wisconsin. Rice disliked Minnesota's territorial governor, Willis A. Gorman, and petitioned Pierce to replace him with Breckinridge. Pierce twice investigated Gorman, but found no grounds to remove him from office. Breckinridge fell ill when traveling to view his investments in mid-1855 and was unable to campaign in the state elections. Know Nothings captured every state office and six congressional districts—including the eighth district—and Breckinridge sent regrets to friends in Washington, D.C., promising to take a more active role in the 1856 campaigns.
## U.S. vice president
Two Kentuckians—Breckinridge's friend, Governor Lazarus W. Powell and his enemy, Linn Boyd—were potential Democratic presidential nominees in 1856. Breckinridge—a delegate to the national convention and designated as a presidential elector—favored Pierce's re-election but convinced the state Democratic convention to leave the delegates free to support any candidate the party coalesced behind. To a New Yorker who proposed that Breckinridge's nomination could unite the party, he replied "Humbug".
### Election
Pierce was unable to secure the nomination at the national convention, so Breckinridge switched his support to Stephen Douglas, but the combination of Pierce and Douglas supporters did not prevent James Buchanan's nomination. After Douglas's floor manager, William Richardson, suggested that nominating Breckinridge for vice president would help Buchanan secure the support of erstwhile Douglas backers in the general election, Louisiana's J. L. Lewis nominated him. Breckinridge declined in deference to Linn Boyd but received 51 votes on the first ballot, behind Mississippi's John A. Quitman with 59, but ahead of third-place Boyd, who garnered 33. On the second ballot, Breckinridge received overwhelming support, and opposition delegates changed their votes to make his nomination unanimous.
The election was between Buchanan and Republican John C. Frémont in the north and between Buchanan and Millard Fillmore, nominated by a pro-slavery faction of the Know Nothings, in the South. Tennessee Governor Andrew Johnson and Congressional Globe editor John C. Rives promoted the possibility that Douglas and Pierce supporters would back Fillmore in the Southern states, denying Buchanan a majority in the Electoral College and throwing the election to the House of Representatives. There, Buchanan's opponents would prevent a vote, and the Senate's choice for vice president—certain to be Breckinridge—would become president. There is no evidence that Breckinridge countenanced this scheme. Defying contemporary political convention, Breckinridge spoke frequently during the campaign, stressing Democratic fidelity to the constitution and charging that the Republican emancipationist agenda would tear the country apart. His appearances in the critical state of Pennsylvania helped allay Buchanan's fears that Breckinridge desired to throw the election to the House. "Buck and Breck" won the election with 174 electoral votes to Frémont's 114 and Fillmore's 8, and Democrats carried Kentucky for the first time since 1828. Thirty-six at the time of his inauguration on March 4, 1857, Breckinridge remains the youngest vice president in U.S. history. The Constitution requires the president and vice-president to be at least thirty-five years old.
### Service
When Breckinridge asked to meet with Buchanan shortly after the inauguration, Buchanan told him to come to the White House and ask to see the hostess, Harriet Lane. Offended, Breckinridge refused to do so; Buchanan's friends later explained that asking to see Lane was a secret instruction to take a guest to the president. Buchanan apologized for the misunderstanding, but the event portended a poor relationship between the two men. Resentful of Breckinridge's support for both Pierce and Douglas, Buchanan allowed him little influence in the administration. Breckinridge's recommendation that former Whigs and Kentuckians—Powell, in particular—be included in Buchanan's cabinet went unheeded. Kentuckians James B. Clay and Cassius M. Clay were offered diplomatic missions to Berlin and Peru, respectively, but both declined. Buchanan often asked Breckinridge to receive and entertain foreign dignitaries, but in 1858, Breckinridge declined Buchanan's request that he resign and take the again-vacant position as U.S. Minister to Spain. The only private meeting between the two occurred near the end of Buchanan's term, when the president summoned Breckinridge to get his advice on whether to issue a proclamation declaring a day of "Humiliation and Prayer" over the divided state of the nation; Breckinridge affirmed that Buchanan should make the proclamation.
As vice president, Breckinridge was tasked with presiding over the debates of the Senate. In an early address to that body, he promised, "It shall be my constant aim, gentlemen of the Senate, to exhibit at all times, to every member of this body, the courtesy and impartiality which are due to the representatives of equal States." Historian Lowell H. Harrison wrote that, while Breckinridge fulfilled his promise to the satisfaction of most, acting as moderator limited his participation in debate. Five tie-breaking votes provided a means of expressing his views. Economic motivations explained two—forcing an immediate vote on a codfishing tariff and limiting military pensions to $50 per month. A third cleared the floor for a vote on Douglas's motion to admit Oregon to the Union, and a fourth defeated Johnson's Homestead Bill. The final vote effected a wording change in a resolution forbidding constitutional amendments that empowered Congress to interfere with property rights. The Senate's move from the Old Senate Chamber to a more spacious one on January 4, 1859, provided another opportunity. Afforded the chance to make the last address in the old chamber, Breckinridge encouraged compromise and unity among the states to resolve sectional conflicts.
Despite irregularities in the approval of the Lecompton Constitution by Kansas voters, Breckinridge agreed with Buchanan that it was legitimate, but he kept his position secret, and some believed he agreed with his friend, Stephen Douglas, that Lecompton was invalid. Breckinridge's absence from the Senate during debate on admitting Kansas to the Union under Lecompton seemed to confirm this, but his leave—to take his wife from Baton Rouge, Louisiana, where she was recovering from an illness, to Washington, D.C.—had been planned for months. The death of his grandmother, Polly Breckinridge, prompted him to leave earlier than planned. During his absence, both houses of Congress voted to re-submit the Lecompton Constitution to Kansas voters for approval. On resubmission, it was overwhelmingly rejected.
By January 1859, friends knew Breckinridge desired the U.S. Senate seat of John J. Crittenden, whose term expired on March 3, 1861. The General Assembly would elect Crittenden's successor in December 1859, so Breckinridge's election would not affect any presidential aspirations he might harbor. Democrats chose Breckinridge's friend Beriah Magoffin over Linn Boyd as their gubernatorial nominee, bolstering Breckinridge's chances for the senatorship, the presidency, or both. Boyd was expected to be Breckinridge's chief opponent for the Senate, but he withdrew on November 28, citing ill health, and died three weeks later. The Democratic majority in the General Assembly elected Breckinridge to succeed Crittenden by a vote of 81 to 53 over Joshua Fry Bell, whom Magoffin had defeated for the governorship in August.
After Minnesota's admission to the Union in May 1858, opponents accused Breckinridge of rigging a random draw so that his friend, Henry Rice, would get the longer of the state's two Senate terms. Senate Secretary Asbury Dickins blunted the charges, averring that he alone handled the instruments used in the drawing. Republican Senator Solomon Foot closed a special session of the Thirty-sixth Congress in March 1859 by offering a resolution praising Breckinridge for his impartiality; after the session, the Republican-leaning New York Times noted that while the star of the Buchanan administration "falls lower every hour in prestige and political consequence, the star of the Vice President rises higher".
## Presidential election of 1860
Breckinridge's lukewarm support for Douglas in his 1858 senatorial re-election bid against Abraham Lincoln convinced Douglas that Breckinridge would seek the Democratic presidential nomination, but in a January 1860 letter to his uncle, Breckinridge averred he was "firmly resolved not to". Douglas's political enemies supported Breckinridge, and Buchanan reluctantly dispensed patronage to Breckinridge allies, further alienating Douglas. After Breckinridge left open the possibility of supporting a federal slave code in 1859, Douglas wrote to Robert Toombs that he would support his enemy and fellow Georgian Alexander H. Stephens for the nomination over Breckinridge, although he would vote for Breckinridge over any Republican in the general election.
### Nomination
Breckinridge asked James Clay to protect his interests at the 1860 Democratic National Convention in Charleston, South Carolina. Clay, Lazarus Powell, William Preston, Henry Cornelius Burnett, and James B. Beck desired to nominate Breckinridge for president, but in a compromise with Kentucky's Douglas backers, the delegation went to Charleston committed to former Treasury Secretary James Guthrie of Louisville. Fifty Southern Democrats, upset at the convention's refusal to include slavery protection in the party's platform, walked out of the convention; the remaining delegates decided that nominations required a two-thirds majority of the original 303 delegates. For 35 ballots, Douglas ran well ahead of Guthrie but short of the needed majority. Arkansas's lone remaining delegate nominated Breckinridge, but Beck asked that the nomination be withdrawn because Breckinridge refused to compete with Guthrie. Twenty-one more ballots were cast, but the convention remained deadlocked. On May 3, the convention adjourned until June 18 in Baltimore, Maryland.
Breckinridge's communication with his supporters between the meetings indicated greater willingness to become a candidate, but he instructed Clay to nominate him only if his support exceeded Guthrie's. Many believed that Buchanan supported Breckinridge, but Breckinridge wrote to Beck that "The President is not for me except as a last necessity, that is to say not until his help will not be worth a damn." After a majority of the delegates, most of them Douglas supporters, voted to replace Alabama and Louisiana's walk-out delegates with new, pro-Douglas men in Baltimore, Virginia's delegation led another walk-out of Southern Democrats and Buchanan-controlled delegates from the northeast and Pacific coast; 105 delegates, including 10 of Kentucky's 24, left, and the remainder nominated Douglas. The walk-outs held a rival nominating convention, styled the National Democratic Convention, at the Maryland Institute in Baltimore. At that convention on June 23, Massachusetts' George B. Loring nominated Breckinridge for president, and he received 81 of the 105 votes cast, the remainder going to Daniel S. Dickinson of New York. Oregon's Joseph Lane was nominated for vice-president.
Breckinridge told Beck he would not accept the nomination because it would split the Democrats and ensure the election of Republican Abraham Lincoln. On June 25, Mississippi Senator Jefferson Davis proposed that Breckinridge should accept the nomination; his strength in the South would convince Douglas that his own candidacy was futile. Breckinridge, Douglas, and Constitutional Unionist John Bell would withdraw, and Democrats could nominate a compromise candidate. Breckinridge accepted the nomination, but maintained that he had not sought it and that he had been nominated "against my expressed wishes". Davis's compromise plan failed when Douglas refused to withdraw, believing his supporters would vote for Lincoln rather than a compromise candidate.
### Election
The election effectively pitted Lincoln against Douglas in the North and Breckinridge against Bell in the South. Far from expectant of victory, Breckinridge told Davis's wife, Varina, "I trust I have the courage to lead a forlorn hope." Caleb Cushing oversaw the publication of several Breckinridge campaign documents, including a campaign biography and copies of his speeches on the occasion of the Senate's move to a new chamber and his election to the Senate. After making a few short speeches during stops between Washington, D.C. and Lexington, Breckinridge stated that, consistent with contemporary custom, he would make no more speeches until after the election, but the results of an August 1860 special election to replace the deceased clerk of the Kentucky Court of Appeals convinced him that his candidacy could be faltering. He had expressed confidence that the Democratic candidate for the clerkship would win, and "nothing short of a defeat by 6,000 or 8,000 would alarm me for November". Constitutional Unionist Leslie Combs won by 23,000 votes, prompting Breckinridge to make a full-length campaign speech in Lexington on September 5, 1860.
Breckinridge's three-hour speech was primarily defensive; his moderate tone was designed to win votes in the north but risked losing Southern support to Bell. He denied charges that he had supported Zachary Taylor over Lewis Cass in 1848, that he had sided with abolitionists in 1849, and that he had sought John Brown's pardon for the Harpers Ferry raid. Reminding the audience that Douglas wanted the Supreme Court to decide the issue of slavery in the territories, he pointed out that Douglas then denounced the Dred Scott ruling and laid out a means for territorial legislatures to circumvent it. Breckinridge supported the legitimacy of secession but insisted it was not the solution to the country's sectional disagreements. In answer to Douglas's charge that there was not "a disunionist in America who is not a Breckinridge man", he challenged the assembled crowd "to point out an act, to disclose an utterance, to reveal a thought of mine hostile to the constitution and union of the States". He warned that Lincoln's insistence on emancipation made him the real disunionist.
Breckinridge finished third in the popular vote with 849,781 votes to Lincoln's 1,866,452, Douglas's 1,379,957, and Bell's 588,879. He carried 12 of the 15 Southern states and the border states of Maryland, Delaware and North Carolina but lost his home state to Bell. His greatest support in the Deep South came from areas that opposed secession. Davis pointed out that only Breckinridge garnered nearly equal support from the Deep South, the border states, and the free states of the North. His 72 electoral votes bested Bell's 59 and Douglas's 12, but Lincoln received 180, enough to win the election.
### Aftermath
Three weeks after the election, Breckinridge returned to Washington, D.C., to preside over the Senate's lame duck session. Lazarus Powell, now a senator, proposed a resolution creating a committee of thirteen members to respond to the portion of Buchanan's address regarding the disturbed condition of the country. Breckinridge appointed the members of the committee, which, in Heck's opinion, formed "an able committee, representing every major faction." John J. Crittenden proposed a compromise by which slavery would be forbidden in territories north of parallel 36°30′ north—the demarcation line used in the Missouri Compromise—and permitted south of it, but the committee's five Republicans rejected the proposal. On December 31, the committee reported that it could come to no agreement. Writing to Magoffin on January 6, Breckinridge complained that the Republicans were "rejecting everything, proposing nothing" and "pursuing a policy which ... threatens to plunge the country into ... civil war".
One of Breckinridge's final acts as vice-president was announcing the vote of the Electoral College to a joint session of Congress on February 13, 1861. Rumors abounded that he would tamper with the vote to prevent Lincoln's election. Knowing that some legislators planned to attend the session armed, Breckinridge asked Winfield Scott to post guards in and around the chambers. One legislator raised a point of order, requesting that the guards be ejected, but Breckinridge refused to sustain it; the electoral vote proceeded, and Breckinridge announced Lincoln's election as president. After Lincoln's arrival in Washington, D.C., on February 24, Breckinridge visited him at the Willard Hotel. After making a valedictory address on March 4, he swore in Hannibal Hamlin as his successor as vice president; Hamlin then swore in Breckinridge and the other incoming senators.
## U.S. Senate
Because Republicans controlled neither house of Congress nor the Supreme Court, Breckinridge did not believe Lincoln's election was a mandate for secession. Ignoring James Murray Mason's contention that no Southerner should serve in Lincoln's cabinet, Breckinridge supported the appointment of Kentucky-born Montgomery Blair as Postmaster General. He also voted against a resolution to remove the names of the senators from seceded states from the Senate roll.
Working for a compromise that might yet save the Union, Breckinridge opposed a proposal by Ohio's Clement Vallandigham that the border states unite to form a "middle confederacy" to serve as a buffer between the U.S. and the seceded states. Nor did he wish to see Kentucky as the southernmost state in a northern confederacy; its position south of the Ohio River left it too vulnerable to the southern confederacy should war occur. Urging that federal troops be withdrawn from the seceded states, he insisted "their presence can accomplish no good, but will certainly produce incalculable mischief". He warned that, unless Republicans made some concessions, Kentucky and the other border states would also secede.
When the legislative session ended on March 28, Breckinridge returned to Kentucky and addressed the state legislature on April 2, 1861. He urged the General Assembly to push for federal adoption of the Crittenden Compromise and advocated calling a border states convention, which would draft a compromise proposal and submit it to the Northern and Southern states for adoption. Asserting that the states were coequal and free to choose their own course, he maintained that, if the border states convention failed, Kentucky should call a sovereignty convention and join the Confederacy as a last resort.
The Battle of Fort Sumter, which began the Civil War, occurred days later, before the border states convention could be held. Magoffin called a special legislative session on May 6, and the legislature authorized creation of a six-man commission to decide the state's course in the war. Breckinridge, Magoffin, and Richard Hawes were the states' rights delegates to the conference, while Crittenden, Archibald Dixon, and Samuel S. Nicholas represented the Unionist position. The delegates were only able to agree on a policy of armed neutrality, which Breckinridge believed impractical and ultimately untenable, but preferable to more drastic actions. In special elections held June 20, 1861, Unionists won nine of Kentucky's ten House seats, and in the August 5 state elections, Unionists gained majorities in both houses of the state legislature.
When the Senate convened for a special session on July 4, 1861, Breckinridge stood almost alone in opposition to the war. Labeled a traitor, he was removed from the Committee on Military Affairs. He demanded to know what authority Lincoln had to blockade Southern ports or suspend the writ of habeas corpus. He reminded his fellow senators that Congress had not approved a declaration of war and maintained that Lincoln's enlistment of men and expenditure of funds for the war effort were unconstitutional. If the Union could be persuaded not to attack the Confederacy, he predicted that "all those sentiments of common interest and feeling ... might lead to a political reunion founded upon consent". On August 1, he declared that if Kentucky supported Lincoln's prosecution of the war, "she will be represented by some other man on the floor of this Senate." Asked by Oregon's Edward Dickinson Baker how he would handle the secession crisis, he responded, "I would prefer to see these States all reunited upon true constitutional principles to any other object that could be offered me in life ... But I infinitely prefer to see a peaceful separation of these States, than to see endless, aimless, devastating war, at the end of which I see the grave of public liberty and of personal freedom."
In early September, Confederate and Union forces entered Kentucky, ending her neutrality. On September 18, Unionists shut down the pro-Southern Louisville Courier newspaper and arrested former governor Charles S. Morehead, who was suspected of having Confederate sympathies. Learning that Colonel Thomas E. Bramlette was under orders to arrest him, Breckinridge fled to Prestonsburg, Kentucky, where he was joined by Confederate sympathizers George W. Johnson, George Baird Hodge, William E. Simms, and William Preston. The group continued to Abingdon, Virginia, where they took a train to Confederate-held Bowling Green, Kentucky.
On October 2, 1861, the Kentucky General Assembly passed a resolution declaring that neither of the state's U.S. Senators—Breckinridge and Powell—represented the will of the state's citizens and requesting that both resign. Governor Magoffin refused to endorse the resolution, preventing its enforcement. Writing from Bowling Green on October 8, Breckinridge declared, "I exchange with proud satisfaction a term of six years in the Senate of the United States for the musket of a soldier." Later that month, he was part of a convention in Confederate-controlled Russellville, Kentucky, that denounced the Unionist legislature as not representing the will of most Kentuckians and called for a sovereignty convention to be held in that city on November 18. Breckinridge, George W. Johnson, and Humphrey Marshall were named to the planning committee, but Breckinridge did not attend the convention, which created a provisional Confederate government for Kentucky. On November 6, Breckinridge was indicted for treason in a federal court in Frankfort. The Senate passed a resolution formally expelling him on December 2, 1861; Powell was the only member to vote against the resolution, claiming that Breckinridge's statement of October 8 amounted to a resignation, rendering the resolution unnecessary.
## Confederate Secretary of War
Breckinridge served in the Confederate Army from November 2, 1861, until early 1865. In mid-January 1865, Confederate President Jefferson Davis summoned Breckinridge to the Confederate capital at Richmond, Virginia, and rumors followed that Davis would appoint Breckinridge Confederate States Secretary of War, replacing James A. Seddon. Breckinridge arrived in Richmond on January 17, and some time in the next two weeks, Davis offered him the appointment. Breckinridge made his acceptance conditional upon the removal of Lucius B. Northrop from his office as Confederate Commissary General. Most Confederate officers regarded Northrop as inept, but Davis had long defended him. Davis relented on January 30, allowing Seddon to replace Northrop with Breckinridge's friend, Eli Metcalfe Bruce, on an interim basis; Breckinridge accepted Davis's appointment the next day.
Some Confederate congressmen were believed to oppose Breckinridge because he had waited so long to join the Confederacy, but his nomination was confirmed unanimously on February 6, 1865. At 44 years old, he was the youngest person to serve in the Confederate president's cabinet. Klotter called Breckinridge "perhaps the most effective of those who held that office", but Harrison wrote that "no one could have done much with the War Department at that late date". While his predecessors had largely served Davis's interests, Breckinridge functioned independently, assigning officers, recommending promotions, and consulting on strategy with Confederate generals.
Breckinridge's first act as secretary was to meet with assistant secretary John Archibald Campbell, who had opposed Breckinridge's nomination, believing he would focus on a select few of the department's bureaus and ignore the rest. During their conference, Campbell expressed his desire to retain his post, and Breckinridge agreed, delegating many of the day-to-day details of the department's operation to him. Breckinridge recommended that Davis appoint Isaac M. St. John, head of the Confederate Nitre and Mining Bureau, as permanent commissary general. Davis made the appointment on February 15, and the flow of supplies to Confederate armies improved under St. John. With Confederate ranks plagued by desertion, Breckinridge instituted a draft; when this proved ineffective, he negotiated the resumption of prisoner exchanges with the Union in order to replenish the Confederates' depleted manpower.
By late February, Breckinridge had concluded that the Confederate cause was hopeless. He opposed the use of guerrilla warfare by Confederate forces and urged a national surrender. Meeting with Confederate senators from Virginia, Kentucky, Missouri, and Texas, he urged, "This has been a magnificent epic. In God's name let it not terminate in a farce." In April, with Union forces approaching Richmond, Breckinridge organized the escape of the other cabinet officials to Danville, Virginia. Afterward, he ordered the burning of the bridges over the James River and ensured the destruction of buildings and supplies that might aid the enemy. During the surrender of the city, he helped preserve the Confederate government and military records housed there.
After a brief rendezvous with Robert E. Lee's retreating forces at Farmville, Virginia, Breckinridge moved south to Greensboro, North Carolina, where he, Naval Secretary Stephen Mallory, and Postmaster General John Henninger Reagan joined Generals Joseph E. Johnston and P. G. T. Beauregard to urge surrender. Davis and Secretary of State Judah P. Benjamin initially resisted, but eventually asked Major General William T. Sherman to parley. Johnston and Breckinridge negotiated terms with Sherman, but President Andrew Johnson (who had assumed the presidency on Lincoln's assassination on April 15) rejected them as too generous. On Davis' orders, Breckinridge told Johnston to meet Richard Taylor in Alabama, but Johnston, believing his men would refuse to fight any longer, surrendered to Sherman on similar terms to those offered to Lee at Appomattox.
After the failed negotiations, Confederate Attorney General George Davis and Confederate Treasury Secretary George Trenholm resigned. The rest of the Confederate cabinet—escorted by over 2,000 cavalrymen under Basil W. Duke and Breckinridge's cousin William Campbell Preston Breckinridge—traveled southwest to meet Taylor at Mobile. Believing that the Confederate cause was not yet lost, Davis convened a council of war on May 2 in Abbeville, South Carolina, but the cavalry commanders told him that the only cause for which their men would fight was to aid Davis's escape from the country. Informed that gold and silver coins and bullion from the Confederate treasury were at the train depot in Abbeville, Breckinridge ordered Duke to load it onto wagons and guard it as they continued southward. En route to Washington, Georgia, some members of the cabinet's escort threatened to take their back salaries by force. Breckinridge had intended to wait until their arrival to make the payments, but to avoid mutiny, he dispersed some of the funds immediately. Two brigades deserted immediately after being paid; the rest continued to Washington, where the remaining funds were deposited in a local bank.
Discharging most of the remaining escort, Breckinridge left Washington with a small party on May 5, hoping to distract federal forces from the fleeing Confederate president. Between Washington and Woodstock, the party was overtaken by Union forces under Lieutenant Colonel Andrew K. Campbell; Breckinridge ordered his nephew to surrender while he, his sons Cabell and Clifton, James B. Clay Jr., and a few others fled into the nearby woods. At Sandersville, he sent Clay and Clifton home, announcing that he and the rest of his companions would proceed to Madison, Florida. On May 11, they reached Milltown, Georgia, where Breckinridge expected to rendezvous with Davis, but on May 14, he learned of Davis's capture days earlier.
## Later life
Besides marking the end of the Confederacy and the war, Davis's capture left Breckinridge as the highest-ranking former Confederate still at large. Fearing arrest, he fled to Cuba, Great Britain, and Canada, where he lived in exile. Andrew Johnson issued a proclamation of amnesty for all former Confederates in December 1868, and Breckinridge returned home the following March. Friends and government officials, including President Ulysses S. Grant, urged him to return to politics, but he declared himself "an extinct volcano" and never sought public office again. He died of complications from war-related injuries on May 17, 1875.
# Bob Dylan
Bob Dylan (legally Robert Dylan; born Robert Allen Zimmerman, May 24, 1941) is an American singer-songwriter. Often considered one of the greatest songwriters of all time, Dylan has been a major figure in popular culture over his 60-year career. He rose to prominence in the 1960s, when songs such as "The Times They Are a-Changin'" (1964) became anthems for the civil rights and antiwar movements. Initially modeling his style on Woody Guthrie's folk songs, Robert Johnson's blues and what he called the "architectural forms" of Hank Williams's country songs, Dylan added increasingly sophisticated lyrical techniques to the folk music of the early 1960s, infusing it "with the intellectualism of classic literature and poetry". His lyrics incorporated political, social and philosophical influences, defying pop music conventions and appealing to the burgeoning counterculture.
Dylan was born and raised in St. Louis County, Minnesota. Following his self-titled debut album of traditional folk songs in 1962, he made his breakthrough with The Freewheelin' Bob Dylan (1963). The album featured "Blowin' in the Wind" and "A Hard Rain's a-Gonna Fall", which adapted the tunes and phrasing of older folk songs. He released the politically charged The Times They Are a-Changin' and the more lyrically abstract and introspective Another Side of Bob Dylan in 1964. In 1965 and 1966, Dylan drew controversy among folk purists when he adopted electrically amplified rock instrumentation, and in the space of 15 months recorded three of the most influential rock albums of the 1960s: Bringing It All Back Home, Highway 61 Revisited and Blonde on Blonde. When Dylan made his move from acoustic folk and blues music to rock, the mix became more complex. His six-minute single "Like a Rolling Stone" (1965) expanded commercial and creative boundaries in popular music.
In July 1966, a motorcycle accident led to Dylan's withdrawal from touring. During this period, he recorded a large body of songs with members of the Band, who had previously backed him on tour. These recordings were later released as The Basement Tapes in 1975. In the late 1960s and early 1970s, Dylan explored country music and rural themes on John Wesley Harding (1967), Nashville Skyline (1969) and New Morning (1970). In 1975, he released Blood on the Tracks, which many saw as a return to form. In the late 1970s, he became a born-again Christian and released three albums of contemporary gospel music before returning to his more familiar rock-based idiom in the early 1980s. Dylan's Time Out of Mind (1997) marked the beginning of a career renaissance. He has released five critically acclaimed albums of original material since, most recently Rough and Rowdy Ways (2020). He also recorded a trilogy of albums covering the Great American Songbook, especially songs sung by Frank Sinatra, and an album smoothing his early rock material into a mellower Americana sensibility, Shadow Kingdom (2023). Dylan has toured continuously since the late 1980s on what has become known as the Never Ending Tour.
Since 1994, Dylan has published nine books of paintings and drawings, and his work has been exhibited in major art galleries. He has sold more than 125 million records, making him one of the best-selling musicians ever. He has received numerous awards, including the Presidential Medal of Freedom, ten Grammy Awards, a Golden Globe Award and an Academy Award. Dylan has been inducted into the Rock and Roll Hall of Fame, Nashville Songwriters Hall of Fame and the Songwriters Hall of Fame. In 2008, the Pulitzer Prize Board awarded him a special citation for "his profound impact on popular music and American culture, marked by lyrical compositions of extraordinary poetic power." In 2016, Dylan was awarded the Nobel Prize in Literature.
## Life and career
### 1941–1959: Origins and musical beginnings
Bob Dylan was born Robert Allen Zimmerman (Hebrew name Shabtai Zisl ben Avraham) in St. Mary's Hospital on May 24, 1941, in Duluth, Minnesota, and raised in Hibbing, Minnesota, on the Mesabi Range west of Lake Superior. Dylan's paternal grandparents, Anna Kirghiz and Zigman Zimmerman, emigrated from Odessa in the Russian Empire (now Odesa, Ukraine) to the United States, following the pogroms against Jews of 1905. His maternal grandparents, Florence and Ben Stone, were Lithuanian Jews who had arrived in the United States in 1902. Dylan wrote that his paternal grandmother's family was originally from the Kağızman district of Kars Province in northeastern Turkey.
Dylan's father Abram Zimmerman and his mother Beatrice "Beatty" Stone were part of a small, close-knit Jewish community. They lived in Duluth until Dylan was six, when his father contracted polio and the family returned to his mother's hometown of Hibbing, where they lived for the rest of Dylan's childhood, and his father and paternal uncles ran a furniture and appliance store.
In the early 1950s Dylan listened to the Grand Ole Opry radio show and heard the songs of Hank Williams. He later wrote: "The sound of his voice went through me like an electric rod." Dylan was also impressed by the delivery of Johnnie Ray: "He was the first singer whose voice and style, I guess, I totally fell in love with... I loved his style, wanted to dress like him too." As a teenager, Dylan heard rock and roll on radio stations broadcasting from Shreveport and Little Rock.
Dylan formed several bands while attending Hibbing High School. In the Golden Chords, he performed covers of songs by Little Richard and Elvis Presley. Their performance of Danny & the Juniors' "Rock and Roll Is Here to Stay" at their high school talent show was so loud that the principal cut the microphone. In 1959, Dylan's high school yearbook carried the caption "Robert Zimmerman: to join 'Little Richard'". That year, as Elston Gunnn, he performed two dates with Bobby Vee, playing piano and clapping. In September 1959, Dylan enrolled at the University of Minnesota. Living at the Jewish-centric fraternity Sigma Alpha Mu house, Dylan began to perform at the Ten O'Clock Scholar, a coffeehouse a few blocks from campus, and became involved in the Dinkytown folk music circuit. His focus on rock and roll gave way to American folk music, as he explained in a 1985 interview:
> The thing about rock'n'roll is that for me anyway it wasn't enough ... There were great catch-phrases and driving pulse rhythms ... but the songs weren't serious or didn't reflect life in a realistic way. I knew that when I got into folk music, it was more of a serious type of thing. The songs are filled with more despair, more sadness, more triumph, more faith in the supernatural, much deeper feelings.
During this period, he began to introduce himself as "Bob Dylan". In his memoir, he wrote that he considered adopting the surname Dillon before unexpectedly seeing poems by Dylan Thomas, and deciding upon the given name spelling. In a 2004 interview, he said, "You're born, you know, the wrong names, wrong parents. I mean, that happens. You call yourself what you want to call yourself. This is the land of the free."
### 1960s
#### Relocation to New York and record deal
In May 1960, Dylan dropped out of college at the end of his first year. In January 1961, he traveled to New York City to perform and visit his musical idol Woody Guthrie at Greystone Park Psychiatric Hospital. Guthrie had been a revelation to Dylan and influenced his early performances. He wrote of Guthrie's impact: "The songs themselves had the infinite sweep of humanity in them... [He] was the true voice of the American spirit. I said to myself I was going to be Guthrie's greatest disciple". In addition to visiting Guthrie, Dylan befriended his protégé Ramblin' Jack Elliott.
From February 1961, Dylan played at clubs around Greenwich Village, befriending and picking up material from folk singers, including Dave Van Ronk, Fred Neil, Odetta, the New Lost City Ramblers and Irish musicians the Clancy Brothers and Tommy Makem. In September, The New York Times critic Robert Shelton boosted Dylan's career with a very enthusiastic review of his performance at Gerde's Folk City: "Bob Dylan: A Distinctive Folk-Song Stylist". That month, Dylan played harmonica on folk singer Carolyn Hester's third album, bringing him to the attention of the album's producer John Hammond, who signed Dylan to Columbia Records. Dylan's debut album, Bob Dylan, released March 19, 1962, consisted of traditional folk, blues and gospel material with just two original compositions, "Talkin' New York" and "Song to Woody". The album sold 5,000 copies in its first year, just breaking even.
In August 1962, Dylan legally changed his name to Robert Dylan and signed a management contract with Albert Grossman. Grossman remained Dylan's manager until 1970, and was known for his sometimes confrontational personality and protective loyalty. Dylan said, "He was kind of like a Colonel Tom Parker figure ... you could smell him coming." Tension between Grossman and John Hammond led to the latter suggesting Dylan work with the jazz producer Tom Wilson, who produced several tracks for the second album without formal credit. Wilson produced the next three albums Dylan recorded.
Dylan made his first trip to the United Kingdom from December 1962 to January 1963. He had been invited by television director Philip Saville to appear in Madhouse on Castle Street, which Saville was directing for BBC Television. At the end of the play, Dylan performed "Blowin' in the Wind", one of its first public performances. While in London, Dylan performed at London folk clubs, including the Troubadour, Les Cousins, and Bunjies. He also learned material from UK performers, including Martin Carthy.
By the release of Dylan's second album, The Freewheelin' Bob Dylan, in May 1963, he had begun to make his name as a singer-songwriter. Many songs on the album were labeled protest songs, inspired partly by Guthrie and influenced by Pete Seeger's topical songs. "Oxford Town" was an account of James Meredith's ordeal as the first Black student to enroll at the University of Mississippi. The first song on the album, "Blowin' in the Wind", partly derived its melody from the traditional slave song "No More Auction Block", while its lyrics questioned the social and political status quo. The song was widely recorded by other artists and became a hit for Peter, Paul and Mary. "A Hard Rain's a-Gonna Fall" was based on the folk ballad "Lord Randall". With its apocalyptic premonitions, the song gained resonance when the Cuban Missile Crisis developed a few weeks after Dylan began performing it. Both songs marked a new direction in songwriting, blending a stream-of-consciousness, imagist lyrical attack with traditional folk form.
Dylan's topical songs led to his being viewed as more than just a songwriter. Janet Maslin wrote of Freewheelin':
> These were the songs that established [Dylan] as the voice of his generation—someone who implicitly understood how concerned young Americans felt about nuclear disarmament and the growing Civil Rights Movement: his mixture of moral authority and nonconformity was perhaps the most timely of his attributes.
Freewheelin' also included love songs and surreal talking blues. Humor was an important part of Dylan's persona, and the range of material on the album impressed listeners, including the Beatles. George Harrison said of the album: "We just played it, just wore it out. The content of the song lyrics and just the attitude—it was incredibly original and wonderful".
The rough edge of Dylan's singing unsettled some but attracted others. Author Joyce Carol Oates wrote: "When we first heard this raw, very young, and seemingly untrained voice, frankly nasal, as if sandpaper could sing, the effect was dramatic and electrifying". Many early songs reached the public through more palatable versions by other performers, such as Joan Baez, who became Dylan's advocate and lover. Baez was influential in bringing Dylan to prominence by recording several of his early songs and inviting him on stage during her concerts. Others who had hits with Dylan's songs in the early 1960s included the Byrds, Sonny & Cher, the Hollies, the Association, Manfred Mann and the Turtles.
"Mixed-Up Confusion", recorded during the Freewheelin"' sessions with a backing band, was released as Dylan's first single in December 1962, but then swiftly withdrawn. In contrast to the mostly solo acoustic performances on the album, the single showed a willingness to experiment with a rockabilly sound. Cameron Crowe described it as "a fascinating look at a folk artist with his mind wandering towards Elvis Presley and Sun Records".
#### Protest and Another Side
In May 1963, Dylan's political profile rose when he walked out of The Ed Sullivan Show. During rehearsals, Dylan had been told by CBS television's head of program practices that "Talkin' John Birch Paranoid Blues" was potentially libelous to the John Birch Society. Rather than comply with censorship, Dylan refused to appear.
Dylan and Baez were prominent in the civil rights movement, singing together at the March on Washington on August 28, 1963. Dylan performed "Only a Pawn in Their Game" and "When the Ship Comes In".
Dylan's third album, The Times They Are a-Changin', reflected a more politicized Dylan. The songs often took as their subject matter contemporary stories, with "Only a Pawn in Their Game" addressing the murder of civil rights worker Medgar Evers, and the Brechtian "The Lonesome Death of Hattie Carroll" the death of Black hotel barmaid Hattie Carroll at the hands of young White socialite William Zantzinger. "Ballad of Hollis Brown" and "North Country Blues" addressed despair engendered by the breakdown of farming and mining communities.
The final track on the album contained Dylan's angry response to a hostile profile of the singer that had appeared in Newsweek magazine. As biographer Clinton Heylin puts it, the Newsweek journalist wrote a story about "the way the Bar Mitzvah boy from Hibbing, Minnesota, had reinvented himself as the prince of protest", emphasizing his birth name Robert Zimmerman, his attendance at the University of Minnesota, and his close relationship with the parents from whom he had claimed to be estranged. The day after the article appeared, Dylan returned to the studio to record "Restless Farewell", which ends with his vow to "make my stand/ And remain as I am/ And bid farewell and not give a damn".
By the end of 1963, Dylan felt manipulated and constrained by the folk and protest movements. Accepting the "Tom Paine Award" from the Emergency Civil Liberties Committee shortly after the assassination of John F. Kennedy, an intoxicated Dylan questioned the role of the committee, characterized the members as old and balding, and claimed to see something of himself and of every man in Kennedy's assassin, Lee Harvey Oswald.
Another Side of Bob Dylan, recorded in a single evening on June 9, 1964, had a lighter mood. The humorous Dylan reemerged on "I Shall Be Free No. 10" and "Motorpsycho Nightmare". "Spanish Harlem Incident" and "To Ramona" are passionate love songs, while "Black Crow Blues" and "I Don't Believe You (She Acts Like We Never Have Met)" suggest the rock and roll soon to dominate Dylan's music. "It Ain't Me Babe", on the surface a song about spurned love, has been described as a rejection of the role of political spokesman thrust upon him. His new direction was signaled by two lengthy songs: the impressionistic "Chimes of Freedom", which sets social commentary against a metaphorical landscape in a style characterized by Allen Ginsberg as "chains of flashing images," and "My Back Pages", which attacks the simplistic and arch seriousness of his own earlier topical songs and seems to predict the backlash he was about to encounter from his former champions.
In the latter half of 1964 and into 1965, Dylan moved from folk songwriter to folk-rock pop-music star. His jeans and work shirts were replaced by a Carnaby Street wardrobe, sunglasses day or night, and pointed "Beatle boots". A London reporter noted "Hair that would set the teeth of a comb on edge. A loud shirt that would dim the neon lights of Leicester Square. He looks like an undernourished cockatoo." Dylan began to spar with interviewers. Asked about a movie he planned while on Les Crane's television show, he told Crane it would be a "cowboy horror movie." Asked if he played the cowboy, Dylan replied, "No, I play my mother."
#### Going electric
Dylan's late March 1965 album Bringing It All Back Home was another leap, featuring his first recordings with electric instruments, under producer Tom Wilson's guidance. The first single, "Subterranean Homesick Blues", owed much to Chuck Berry's "Too Much Monkey Business"; its free-association lyrics were described as harking back to the energy of beat poetry and as a forerunner of rap and hip-hop. The song was given an early music video, which opened D. A. Pennebaker's cinéma vérité presentation of Dylan's 1965 British tour, Dont Look Back. Instead of miming, Dylan illustrated the lyrics by throwing cue cards containing key words on the ground. Pennebaker said the sequence was Dylan's idea, and it has been imitated in music videos and advertisements.
The second side of Bringing It All Back Home contained four long songs on which Dylan accompanied himself on acoustic guitar and harmonica. "Mr. Tambourine Man" became one of his best-known songs when The Byrds recorded an electric version that reached number one in the US and UK. "It's All Over Now, Baby Blue" and "It's Alright Ma (I'm Only Bleeding)" were two of Dylan's most important compositions.
In 1965, headlining the Newport Folk Festival, Dylan performed his first electric set since high school with a pickup group featuring Mike Bloomfield on guitar and Al Kooper on organ. Dylan had appeared at Newport in 1963 and 1964, but in 1965 was met with cheering and booing and left the stage after three songs. One version has it that the boos were from folk fans whom Dylan had alienated by appearing, unexpectedly, with an electric guitar. Murray Lerner, who filmed the performance, said: "I absolutely think that they were booing Dylan going electric." An alternative account claims audience members were upset by poor sound and a short set.
Dylan's performance provoked a hostile response from the folk music establishment. In the September issue of Sing Out!, Ewan MacColl wrote: "Our traditional songs and ballads are the creations of extraordinarily talented artists working inside disciplines formulated over time ...'But what of Bobby Dylan?' scream the outraged teenagers ... Only a completely non-critical audience, nourished on the watery pap of pop music, could have fallen for such tenth-rate drivel". On July 29, four days after Newport, Dylan was back in the studio in New York, recording "Positively 4th Street". The lyrics contained images of vengeance and paranoia, and have been interpreted as Dylan's put-down of former friends from the folk community he had known in clubs along West 4th Street.
#### Highway 61 Revisited and Blonde on Blonde
In July 1965, Dylan's six-minute single "Like a Rolling Stone" peaked at number two in the US chart. In 2004 and in 2011, Rolling Stone listed it as number one on "The 500 Greatest Songs of All Time". Bruce Springsteen recalled first hearing the song: "that snare shot sounded like somebody'd kicked open the door to your mind." The song opened Dylan's next album, Highway 61 Revisited, named after the road that led from Dylan's Minnesota to the musical hotbed of New Orleans. The songs were in the same vein as the hit single, flavored by Mike Bloomfield's blues guitar and Al Kooper's organ riffs. "Desolation Row", backed by acoustic guitar and understated bass, offers the sole exception, with Dylan alluding to figures in Western culture in a song described by Andy Gill as "an 11-minute epic of entropy, which takes the form of a Fellini-esque parade of grotesques and oddities featuring a huge cast of celebrated characters". Poet Philip Larkin, who also reviewed jazz for The Daily Telegraph, wrote "I'm afraid I poached Bob Dylan's Highway 61 Revisited (CBS) out of curiosity and found myself well rewarded."
In support of the album, Dylan was booked for two US concerts with Al Kooper and Harvey Brooks from his studio crew and Robbie Robertson and Levon Helm, former members of Ronnie Hawkins's backing band the Hawks. On August 28 at Forest Hills Tennis Stadium, the group was heckled by an audience still annoyed by Dylan's electric sound. The band's reception on September 3 at the Hollywood Bowl was more favorable.
From September 24, 1965, in Austin, Texas, Dylan toured the US and Canada for six months, backed by the five musicians from the Hawks who became known as The Band. While Dylan and the Hawks met increasingly receptive audiences, their studio efforts foundered. Producer Bob Johnston persuaded Dylan to record in Nashville in February 1966, and surrounded him with top-notch session men. At Dylan's insistence, Robertson and Kooper came from New York City to play on the sessions. The Nashville sessions produced the double album Blonde on Blonde (1966), featuring what Dylan called "that thin wild mercury sound". Kooper described it as "taking two cultures and smashing them together with a huge explosion": the musical worlds of Nashville and of the "quintessential New York hipster" Bob Dylan.
On November 22, 1965, Dylan quietly married 25-year-old former model Sara Lownds. Some of Dylan's friends, including Ramblin' Jack Elliott, say that, immediately after the event, Dylan denied he was married. Writer Nora Ephron made the news public in the New York Post in February 1966 with the headline "Hush! Bob Dylan is wed".
Dylan toured Australia and Europe in April and May 1966. Each show was split in two. Dylan performed solo during the first half, accompanying himself on acoustic guitar and harmonica. In the second, backed by the Hawks, he played electrically amplified music. This contrast provoked many fans, who jeered and slow clapped. The tour culminated in a raucous confrontation between Dylan and his audience at the Manchester Free Trade Hall in England on May 17, 1966. A recording of this concert was released in 1998: The Bootleg Series Vol. 4: Bob Dylan Live 1966. At the climax of the evening, a member of the audience, angered by Dylan's electric backing, shouted: "Judas!", to which Dylan responded, "I don't believe you ... You're a liar!" Dylan turned to his band and said, "Play it fucking loud!"
During his 1966 tour, Dylan was described as exhausted and acting "as if on a death trip". D. A. Pennebaker, the filmmaker accompanying the tour, described Dylan as "taking a lot of amphetamine and who-knows-what-else". In a 1969 interview with Jann Wenner, Dylan said, "I was on the road for almost five years. It wore me down. I was on drugs, a lot of things ... just to keep going, you know?"
#### Motorcycle accident and reclusion
On July 29, 1966, Dylan crashed his motorcycle, a Triumph Tiger 100, near his home in Woodstock, New York. Dylan said he broke several vertebrae in his neck. The circumstances of the accident are unclear since no ambulance was called to the scene and Dylan was not hospitalized. Dylan's biographers have written that the crash offered him the chance to escape the pressures around him. Dylan concurred: "I had been in a motorcycle accident and I'd been hurt, but I recovered. Truth was that I wanted to get out of the rat race." He made very few public appearances, and did not tour again for almost eight years.
Once Dylan was well enough to resume creative work, he began to edit D. A. Pennebaker's film of his 1966 tour. A rough cut was shown to ABC Television, but they rejected it as incomprehensible to mainstream audiences. The film, titled Eat the Document on bootleg copies, has since been screened at a few film festivals. Secluded from public gaze, Dylan recorded over 100 songs during 1967 at his Woodstock home and in the basement of the Hawks' nearby house, "Big Pink". These songs were initially offered as demos for other artists to record and were hits for Julie Driscoll, the Byrds, and Manfred Mann. The public heard these recordings when Great White Wonder, the first "bootleg recording", appeared in West Coast shops in July 1969, containing Dylan material recorded in Minneapolis in 1961 and seven Basement Tapes songs. This record gave birth to a minor industry in the illicit release of recordings by Dylan and other major rock artists. Columbia released a Basement selection in 1975 as The Basement Tapes.
In late 1967, Dylan returned to studio recording in Nashville, accompanied by Charlie McCoy on bass, Kenny Buttrey on drums and Pete Drake on steel guitar. The result was John Wesley Harding, a record of short songs thematically drawing on the American West and the Bible. The sparse structure and instrumentation, with lyrics that took the Judeo-Christian tradition seriously, was a departure from Dylan's previous work. It included "All Along the Watchtower", famously covered by Jimi Hendrix. Woody Guthrie died in October 1967, and Dylan made his first live appearance in twenty months at a memorial concert held at Carnegie Hall on January 20, 1968, where he was backed by the Band.
Nashville Skyline (1969) featured Nashville musicians, a mellow-voiced Dylan, a duet with Johnny Cash and the single "Lay Lady Lay". Variety wrote, "Dylan is definitely doing something that can be called singing. Somehow he has managed to add an octave to his range." During one recording session, Dylan and Cash recorded a series of duets, but only their version of "Girl from the North Country" appeared on the album. The album influenced the nascent genre of country rock.
In 1969, Dylan was asked to write songs for Scratch, Archibald MacLeish's musical adaptation of "The Devil and Daniel Webster". MacLeish initially praised Dylan's contributions, writing to him "Those songs of yours have been haunting me—and exciting me," but creative differences led to Dylan leaving the project. Some of the songs were later recorded by Dylan in a revised form. In May 1969, Dylan appeared on the first episode of The Johnny Cash Show where he sang a duet with Cash on "Girl from the North Country" and played solos of "Living the Blues" and "I Threw It All Away". Dylan traveled to England to top the bill at the Isle of Wight Festival on August 31, 1969, after rejecting overtures to appear at the Woodstock Festival closer to home.
### 1970s
In the early 1970s, critics charged that Dylan's output was varied and unpredictable. Greil Marcus asked "What is this shit?" upon first hearing Self Portrait, released in June 1970. It was a double LP including few original songs and was poorly received. In October 1970, Dylan released New Morning, considered a return to form. The title track was from Dylan's ill-fated collaboration with MacLeish, and "Day of the Locusts" was his account of receiving an honorary degree from Princeton University on June 9, 1970. In November 1968, Dylan co-wrote "I'd Have You Anytime" with George Harrison; Harrison recorded that song and Dylan's "If Not for You" for his album All Things Must Pass. Olivia Newton-John covered "If Not For You" on her debut album and "The Man in Me" was prominently featured in the film The Big Lebowski (1998).
Tarantula, a freeform book of prose-poetry, had been written by Dylan during a creative burst in 1964–65. Dylan shelved the book for several years, apparently uncertain of its status, until he suddenly informed Macmillan at the end of 1970 that the time had come to publish it. The book attracted negative reviews, but later critics have suggested affinities with Finnegans Wake and A Season in Hell.
Between March 16 and 19, 1971, Dylan recorded with Leon Russell at Blue Rock, a small studio in Greenwich Village. These sessions resulted in "Watching the River Flow" and a new recording of "When I Paint My Masterpiece". On November 4, 1971, Dylan recorded "George Jackson", which he released a week later. For many, the single was a surprising return to protest material, mourning the killing of Black Panther George Jackson in San Quentin State Prison. Dylan's surprise appearance at Harrison's Concert for Bangladesh on August 1, 1971, attracted media coverage as his live appearances had become rare.
In 1972, Dylan joined Sam Peckinpah's film Pat Garrett and Billy the Kid, providing the soundtrack and playing "Alias", a member of Billy's gang. Despite the film's failure at the box office, "Knockin' on Heaven's Door" became one of Dylan's most covered songs. That same year, Dylan protested the move to deport John Lennon and Yoko Ono, who had been convicted of marijuana possession, by sending a letter to the US Immigration Service which read in part: "Hurray for John & Yoko. Let them stay and live here and breathe. The country's got plenty of room and space. Let John and Yoko stay!"
#### Return to touring
Dylan began 1973 by signing with a new label, David Geffen's Asylum Records, when his contract with Columbia Records expired. His next album, Planet Waves, was recorded in the fall of 1973, using the Band as his backing group as they rehearsed for a major tour. The album included two versions of "Forever Young", which became one of his most popular songs. As one critic described it, the song projected "something hymnal and heartfelt that spoke of the father in Dylan", and Dylan said "I wrote it thinking about one of my boys and not wanting to be too sentimental". Columbia Records simultaneously released Dylan, a collection of studio outtakes, widely interpreted as a churlish response to Dylan's signing with a rival record label.
In January 1974, Dylan, backed by the Band, embarked on a North American tour of 40 concerts—his first tour for seven years. A live double album, Before the Flood, was released on Asylum Records. Soon, according to Clive Davis, Columbia Records sent word they "will spare nothing to bring Dylan back into the fold". Dylan had second thoughts about Asylum, unhappy that Geffen had sold only 600,000 copies of Planet Waves despite millions of unfulfilled ticket requests for the 1974 tour; he returned to Columbia Records, which reissued his two Asylum albums.
After the tour, Dylan and his wife became estranged. He filled three small notebooks with songs about relationships and ruptures, and recorded the album Blood on the Tracks in September 1974. Dylan delayed the album's release and re-recorded half the songs at Sound 80 Studios in Minneapolis with production assistance from his brother, David Zimmerman. Released in early 1975, Blood on the Tracks received mixed reviews. In NME, Nick Kent described the "accompaniments" as "often so trashy they sound like mere practice takes". In Rolling Stone, Jon Landau wrote that "the record has been made with typical shoddiness". Over the years critics came to see it as one of Dylan's masterpieces. In Salon, journalist Bill Wyman wrote:
> Blood on the Tracks is his only flawless album and his best produced; the songs, each of them, are constructed in disciplined fashion. It is his kindest album and most dismayed, and seems in hindsight to have achieved a sublime balance between the logorrhea-plagued excesses of his mid-1960s output and the self-consciously simple compositions of his post-accident years.
In the middle of 1975, Dylan championed boxer Rubin "Hurricane" Carter, imprisoned for triple murder, with his ballad "Hurricane" making the case for Carter's innocence. Despite its length—over eight minutes—the song was released as a single, peaking at 33 on the US Billboard chart, and was performed at every 1975 date of Dylan's next tour, the Rolling Thunder Revue. Running through late 1975 and again through early 1976, the tour featured about one hundred performers and supporters from the Greenwich Village folk scene, among them Ramblin' Jack Elliott, T-Bone Burnett, Joni Mitchell, David Mansfield, Roger McGuinn, Mick Ronson, Ronee Blakley, Joan Baez and Scarlet Rivera, whom Dylan discovered walking down the street, her violin case on her back. The tour encompassed the January 1976 release of the album Desire. Many of Desire's songs feature a travelogue-like narrative style, influenced by Dylan's new collaborator, playwright Jacques Levy. The 1976 half of the tour was documented by a TV concert special, Hard Rain, and the LP Hard Rain.
The 1975 tour with the Revue provided the backdrop to Dylan's film Renaldo and Clara, a sprawling narrative mixed with concert footage and reminiscences. Actor and playwright Sam Shepard accompanied the Revue and was to serve as screenwriter, but much of the film was improvised. Released in 1978, it received negative, sometimes scathing, reviews. Later in the year, a two-hour edit, dominated by the concert performances, was more widely released. In November 1976, Dylan appeared at the Band's farewell concert with Eric Clapton, Muddy Waters, Van Morrison, Neil Young and Joni Mitchell. Martin Scorsese's 1978 film of the concert, The Last Waltz, included most of Dylan's set.
In 1978, Dylan embarked on a year-long world tour, performing 114 shows in Japan, the Far East, Europe and North America, to a total audience of two million. Dylan assembled an eight-piece band and three backing singers. Concerts in Tokyo in February and March were released as the live double album Bob Dylan at Budokan. Reviews were mixed. Robert Christgau awarded the album a C+ rating, while Janet Maslin defended it: "These latest live versions of his old songs have the effect of liberating Bob Dylan from the originals". When Dylan brought the tour to the US in September 1978, the press described the look and sound as a "Las Vegas Tour". The 1978 tour grossed more than $20 million, and Dylan told the Los Angeles Times that he had debts because "I had a couple of bad years. I put a lot of money into the movie, built a big house ... and it costs a lot to get divorced in California." In April and May 1978, Dylan took the same band and vocalists into Rundown Studios in Santa Monica, California, to record an album of new material, Street-Legal. It was described by Michael Gray as "after Blood On The Tracks, arguably Dylan's best record of the 1970s: a crucial album documenting a crucial period in Dylan's own life". However, it had poor sound and mixing (attributed to Dylan's studio practices), muddying the instrumental detail until a remastered CD release in 1999 restored some of the songs' strengths.
#### Christian period
In the late 1970s, Dylan converted to Evangelical Christianity, undertaking a three-month discipleship course run by the Association of Vineyard Churches. He released three albums of contemporary gospel music. Slow Train Coming (1979) featured Dire Straits guitarist Mark Knopfler and was produced by veteran R&B producer Jerry Wexler. Wexler said that Dylan had tried to evangelize him during the recording. He replied: "Bob, you're dealing with a 62-year-old Jewish atheist. Let's just make an album." Dylan won the Grammy Award for Best Male Rock Vocal Performance for the song "Gotta Serve Somebody". When touring in late 1979 and early 1980, Dylan would not play his older, secular works, and he delivered declarations of his faith from the stage, such as:
> Years ago they ... said I was a prophet. I used to say, "No I'm not a prophet", they say "Yes you are, you're a prophet." I said, "No it's not me." They used to say "You sure are a prophet." They used to convince me I was a prophet. Now I come out and say Jesus Christ is the answer. They say, "Bob Dylan's no prophet." They just can't handle it.
Dylan's Christianity was unpopular with some fans and musicians. John Lennon, shortly before being murdered, recorded "Serve Yourself" in response to "Gotta Serve Somebody". In 1981, Stephen Holden wrote in The New York Times that "neither age (he's now 40) nor his much-publicized conversion to born-again Christianity has altered his essentially iconoclastic temperament".
### 1980s
In late 1980, Dylan briefly played concerts billed as "A Musical Retrospective", restoring popular 1960s songs to the repertoire. His second Christian album, Saved (1980), received mixed reviews, described by Michael Gray as "the nearest thing to a follow-up album Dylan has ever made, Slow Train Coming II and inferior". His third Christian album was Shot of Love (1981). The album featured his first secular compositions in more than two years, mixed with Christian songs. The lyrics of "Every Grain of Sand" recall William Blake's "Auguries of Innocence". Elvis Costello wrote that "Shot of Love may not be your favorite Bob Dylan record, but it might contain his best song: 'Every Grain of Sand'."
Reception of Dylan's 1980s recordings varied. Gray criticized Dylan's 1980s albums for carelessness in the studio and for failing to release his best songs. Infidels (1983) employed Knopfler again as lead guitarist and also as producer; the sessions resulted in several songs that Dylan left off the album. Best regarded of these were "Blind Willie McTell", which was both a tribute to the eponymous blues musician and an evocation of African American history, "Foot of Pride" and "Lord Protect My Child". These three songs were later released on The Bootleg Series Volumes 1–3 (Rare & Unreleased) 1961–1991.
Between July 1984 and March 1985, Dylan recorded Empire Burlesque. Arthur Baker, who had remixed hits for Bruce Springsteen and Cyndi Lauper, was asked to engineer and mix the album. Baker said he felt he was hired to make Dylan's album sound "a little bit more contemporary". In 1985 Dylan sang on USA for Africa's famine relief single "We Are the World". He also joined Artists United Against Apartheid, providing vocals for their single "Sun City". On July 13, 1985, he appeared at the Live Aid concert at JFK Stadium, Philadelphia. Backed by Keith Richards and Ronnie Wood, he performed a ragged version of "Ballad of Hollis Brown", a tale of rural poverty, and then said to the worldwide audience: "I hope that some of the money ... maybe they can just take a little bit of it, maybe ... one or two million, maybe ... and use it to pay the mortgages on some of the farms and, the farmers here, owe to the banks". His remarks were widely criticized as inappropriate, but inspired Willie Nelson to organize a concert, Farm Aid, to benefit debt-ridden American farmers.
In October 1985, Dylan released Biograph, a box set featuring 53 tracks, 18 of them previously unreleased. Stephen Thomas Erlewine wrote: "Historically, Biograph is significant not for what it did for Dylan's career, but for establishing the box set, complete with hits and rarities, as a viable part of rock history." Biograph also contained liner notes by Cameron Crowe in which Dylan discussed the origins of some of his songs.
In April 1986, Dylan made a foray into rap when he added vocals to the opening verse of "Street Rock" on Kurtis Blow's album Kingdom Blow. Dylan's next studio album, Knocked Out Loaded (1986), contained three covers (by Junior Parker, Kris Kristofferson and the gospel hymn "Precious Memories"), plus three collaborations (with Tom Petty, Sam Shepard and Carole Bayer Sager), and two solo compositions by Dylan. A reviewer wrote that "the record follows too many detours to be consistently compelling, and some of those detours wind down roads that are indisputably dead ends. By 1986, such uneven records weren't entirely unexpected from Dylan, but that didn't make them any less frustrating." It was the first Dylan album since his 1962 debut to fail to make the Top 50. Some critics have called the song Dylan co-wrote with Shepard, "Brownsville Girl", a masterpiece.
In 1986 and 1987, Dylan toured with Tom Petty and the Heartbreakers, sharing vocals with Petty on several songs each night. Dylan also toured with the Grateful Dead in 1987, resulting in the live album Dylan & The Dead, which received negative reviews; Erlewine said it was "quite possibly the worst album by either Bob Dylan or the Grateful Dead". Dylan initiated what came to be called the Never Ending Tour on June 7, 1988, performing with a back-up band featuring guitarist G. E. Smith. Dylan would continue to tour with a small, changing band for the next 30 years. In 1987, Dylan starred in Richard Marquand's movie Hearts of Fire, in which he played Billy Parker, a washed-up rock star turned chicken farmer whose teenage lover (Fiona) leaves him for a jaded English synth-pop sensation (Rupert Everett). Dylan also contributed two original songs to the soundtrack—"Night After Night", and "Had a Dream About You, Baby", as well as a cover of John Hiatt's "The Usual". The film was a critical and commercial flop.
Dylan was inducted into the Rock and Roll Hall of Fame in January 1988. Bruce Springsteen, in his introduction, declared, "Bob freed your mind the way Elvis freed your body. He showed us that just because music was innately physical did not mean that it was anti-intellectual". Down in the Groove (1988) sold even more poorly than Knocked Out Loaded. Gray wrote: "The very title undercuts any idea that inspired work may lie within. Here was a further devaluing of the notion of a new Bob Dylan album as something significant." The critical and commercial disappointment of that album was swiftly followed by the success of the Traveling Wilburys, a supergroup Dylan co-founded with George Harrison, Jeff Lynne, Roy Orbison and Tom Petty. In late 1988, their Traveling Wilburys Vol. 1 reached number three on the US albums chart, featuring songs described as Dylan's most accessible compositions in years. Despite Orbison's death in December 1988, the remaining four recorded a second album in May 1990, Traveling Wilburys Vol. 3.
Dylan finished the decade on a critical high note with Oh Mercy, produced by Daniel Lanois. Gray praised the album as "Attentively written, vocally distinctive, musically warm, and uncompromisingly professional, this cohesive whole is the nearest thing to a great Bob Dylan album in the 1980s." "Most of the Time", a lost-love composition, was prominently featured in the film High Fidelity (2000), while "What Was It You Wanted" has been interpreted both as a catechism and a wry comment on the expectations of critics and fans. The religious imagery of "Ring Them Bells" struck some critics as a re-affirmation of faith.
### 1990s
Dylan's 1990s began with Under the Red Sky (1990), an about-face from the serious Oh Mercy. It contained several apparently simple songs, including "Under the Red Sky" and "Wiggle Wiggle". The album was dedicated to "Gabby Goo Goo", a nickname for the daughter of Dylan and Carolyn Dennis, Desiree Gabrielle Dennis-Dylan, who was four. Musicians on the album included George Harrison, Slash, David Crosby, Bruce Hornsby, Stevie Ray Vaughan, and Elton John. The record received negative reviews and sold poorly. In 1990 and 1991 Dylan was described by his biographers as drinking heavily, impairing his performances on stage. In an interview with Rolling Stone, Dylan dismissed allegations that drinking was interfering with his music: "That's completely inaccurate. I can drink or not drink. I don't know why people would associate drinking with anything I do, really".
Defilement and remorse were themes Dylan addressed when he received a Grammy Lifetime Achievement Award from Jack Nicholson in February 1991. The event coincided with the start of the Gulf War and Dylan played "Masters of War"; Rolling Stone called his performance "almost unintelligible". He made a short speech: "My daddy once said to me, he said, 'Son, it is possible for you to become so defiled in this world that your own mother and father will abandon you. If that happens, God will believe in your ability to mend your own ways'". This was a paraphrase of 19th-century Orthodox Rabbi Samson Raphael Hirsch's commentary on Psalm 27. On October 16, 1992, the thirtieth anniversary of Dylan's debut album was celebrated with a concert at Madison Square Garden, christened "Bobfest" by Neil Young and featuring John Mellencamp, Stevie Wonder, Lou Reed, Eddie Vedder, Dylan and others. It was recorded as the live album The 30th Anniversary Concert Celebration.
Over the next few years Dylan returned to his roots with two albums covering traditional folk and blues songs: Good as I Been to You (1992) and World Gone Wrong (1993), backed solely by his acoustic guitar. Many critics and fans noted the quiet beauty of the song "Lone Pilgrim", written by a 19th-century teacher. In August 1994, he played at Woodstock '94; Rolling Stone called his performance "triumphant". In November, Dylan recorded two live shows for MTV Unplugged. He said his wish to perform traditional songs was overruled by Sony executives who insisted on hits. The resulting album, MTV Unplugged, included "John Brown", an unreleased 1962 song about how enthusiasm for war ends in mutilation and disillusionment.
With a collection of songs reportedly written while snowed in on his Minnesota ranch, Dylan booked recording time with Daniel Lanois at Miami's Criteria Studios in January 1997. The subsequent recording sessions were, by some accounts, fraught with musical tension. Before the album's release Dylan was hospitalized with life-threatening pericarditis, brought on by histoplasmosis. His scheduled European tour was canceled, but Dylan made a speedy recovery and left the hospital saying, "I really thought I'd be seeing Elvis soon". He was back on the road by mid-year, and performed before Pope John Paul II at the World Eucharistic Congress in Bologna, Italy. The Pope treated the audience of 200,000 to a homily based on Dylan's "Blowin' in the Wind".
In September, Dylan released the new Lanois-produced album, Time Out of Mind. With its bitter assessments of love and morbid ruminations, Dylan's first collection of original songs in seven years was highly acclaimed. Alex Ross called it "a thrilling return to form." "Cold Irons Bound" won Dylan another Grammy for Best Male Rock Vocal Performance, and the album won him his first Grammy Award for Album of the Year. The album's first single, "Not Dark Yet", has been called one of Dylan's best songs, and "Make You Feel My Love" was covered by Billy Joel, Garth Brooks, Adele and others. Elvis Costello said "I think it might be the best record he's made."
### 2000s
In 2001, Dylan won an Academy Award for Best Original Song for "Things Have Changed", written for the film Wonder Boys. "Love and Theft" was released on September 11, 2001. Recorded with his touring band, Dylan produced the album under the alias Jack Frost. Critics noted that Dylan was widening his musical palette to include rockabilly, Western swing, jazz and lounge music. The album won the Grammy Award for Best Contemporary Folk Album. Controversy ensued when The Wall Street Journal pointed out similarities between the album's lyrics and Junichi Saga's book Confessions of a Yakuza. Saga was not familiar with Dylan's work, but said he was flattered. Upon hearing the album, Saga said of Dylan: "His lines flow from one image to the next and don't always make sense. But they have a great atmosphere."
In 2003, Dylan revisited the evangelical songs from his Christian period and participated in the project Gotta Serve Somebody: The Gospel Songs of Bob Dylan. That year, Dylan released Masked & Anonymous, which he co-wrote with director Larry Charles under the alias Sergei Petrov. Dylan starred as Jack Fate, alongside a cast that included Jeff Bridges, Penélope Cruz and John Goodman. The film polarized critics: in The New York Times, A. O. Scott called it an "incoherent mess", while a few treated it as a serious work of art.
In 2004, Dylan published the first part of his memoir, Chronicles: Volume One. Confounding expectations, Dylan devoted three chapters to his first year in New York City in 1961–1962, virtually ignoring the mid-1960s when his fame was at its height, while devoting chapters to the albums New Morning (1970) and Oh Mercy (1989). The book reached number two on The New York Times' Hardcover Non-Fiction bestseller list in December 2004 and was nominated for a National Book Award.
Critics noted that Chronicles contained many examples of pastiche and borrowing; sources included Time magazine and the novels of Jack London. Biographer Clinton Heylin queried the veracity of Dylan's autobiography, noting "Not a single checkable story held water; not one anecdote couldn't be shot full of holes by any half-decent researcher."
Martin Scorsese's Dylan documentary No Direction Home was broadcast on September 26–27, 2005, on BBC Two in the UK and as part of American Masters on PBS in the US. It covers the period from Dylan's arrival in New York in 1961 to his motorcycle crash in 1966, featuring interviews with Suze Rotolo, Liam Clancy, Joan Baez, Allen Ginsberg, Pete Seeger, Mavis Staples and Dylan himself. The film earned a Peabody Award and a Columbia-duPont Award. The accompanying soundtrack featured unreleased songs from Dylan's early years.
#### Modern Times
Dylan's career as a radio presenter began on May 3, 2006, with his weekly program, Theme Time Radio Hour, on XM Satellite Radio. He played songs with a common theme, such as "Weather", "Weddings", "Dance" and "Dreams". Dylan's records ranged from Muddy Waters to Prince, L.L. Cool J to the Streets. Dylan's show was praised for the breadth of his musical selections and for his jokes, stories and eclectic references. In April 2009, Dylan broadcast the 100th show in his radio series; the theme was "Goodbye" and he signed off with Woody Guthrie's "So Long, It's Been Good to Know Yuh".
Dylan released Modern Times in August 2006. Despite some coarsening of Dylan's voice (a critic for The Guardian characterized his singing on the album as "a catarrhal death rattle") most reviewers praised the album, and many described it as the final installment of a successful trilogy, encompassing Time Out of Mind and "Love and Theft". Modern Times entered the US charts at number one, making it Dylan's first album to reach that position since 1976's Desire. The New York Times published an article exploring similarities between some of Dylan's lyrics in Modern Times and the work of the Civil War poet Henry Timrod. Modern Times won the Grammy Award for Best Contemporary Folk Album and Dylan won Best Solo Rock Vocal Performance for "Someday Baby". Modern Times was named Album of the Year by Rolling Stone and Uncut. On the same day that Modern Times was released, the iTunes Music Store released Bob Dylan: The Collection, a digital box set containing all of his albums (773 tracks), along with 42 rare and unreleased tracks.
On October 1, 2007, Columbia Records released the triple CD retrospective Dylan, anthologizing his entire career under the Dylan 07 logo. The sophistication of the Dylan 07 marketing campaign was a reminder that Dylan's commercial profile had risen considerably since the 1990s. This became evident in 2004, when Dylan appeared in a TV advertisement for Victoria's Secret. In October 2007, he participated in a multi-media campaign for the 2008 Cadillac Escalade. In 2009 he gave the highest profile endorsement of his career to date, appearing with rapper will.i.am in a Pepsi ad that debuted during Super Bowl XLIII. The ad opened with Dylan singing the first verse of "Forever Young" followed by will.i.am doing a hip hop version of the song's third and final verse.
The Bootleg Series Vol. 8 – Tell Tale Signs was released in October 2008, as both a two-CD set and a three-CD version with a 150-page hardcover book. The set contains live performances and outtakes from selected studio albums from Oh Mercy to Modern Times, as well as soundtrack contributions and collaborations with David Bromberg and Ralph Stanley. The pricing of the album—the two-CD set went on sale for $18.99 and the three-CD version for $129.99—led to complaints about "rip-off packaging". The release was widely acclaimed by critics. The abundance of alternative takes and unreleased material suggested to one reviewer that this volume of old outtakes "feels like a new Bob Dylan record, not only for the astonishing freshness of the material, but also for the incredible sound quality and organic feeling of everything here".
#### Together Through Life and Christmas in the Heart
Dylan released Together Through Life on April 28, 2009. In a conversation with music journalist Bill Flanagan, Dylan explained it originated when French director Olivier Dahan asked him to supply a song for his movie My Own Love Song. He initially intended to record a single track, "Life Is Hard", but "the record sort of took its own direction". Nine of the album's ten songs are credited as co-written by Dylan and Robert Hunter. The album received largely favorable reviews, although several critics described it as a minor addition to Dylan's canon. In its first week of release, the album reached number one on the Billboard 200 chart in the US, making Dylan, at 67 years of age, the oldest artist to ever debut at number one on that chart.
Dylan's Christmas in the Heart was released in October 2009, comprising such Christmas standards as "Little Drummer Boy", "Winter Wonderland" and "Here Comes Santa Claus". Edna Gundersen wrote that Dylan was "revisiting yuletide styles popularized by Nat King Cole, Mel Tormé, and the Ray Conniff Singers". Dylan's royalties from the album were donated to the charities Feeding America in the US, Crisis in the UK, and the World Food Programme. The album received generally favorable reviews. In an interview published in The Big Issue, Flanagan asked Dylan why he had performed the songs in a straightforward style, and he replied: "There wasn't any other way to play it. These songs are part of my life, just like folk songs. You have to play them straight too."
### 2010s
#### Tempest
Volume 9 of Dylan's Bootleg Series, The Witmark Demos, was issued on October 18, 2010. It comprised 47 demo recordings of songs taped between 1962 and 1964 for Dylan's earliest music publishers: Leeds Music in 1962, and Witmark Music from 1962 to 1964. One reviewer described the set as "a hearty glimpse of young Bob Dylan changing the music business, and the world, one note at a time." On the critical aggregator Metacritic, the album has a score of 86, indicating "universal acclaim". In the same week, Sony Legacy released Bob Dylan: The Original Mono Recordings, a box set that presented Dylan's eight earliest albums, from Bob Dylan (1962) to John Wesley Harding (1967), in their original mono mix on CD for the first time. The set was accompanied by a booklet featuring an essay by Greil Marcus.
On April 12, 2011, Legacy Recordings released Bob Dylan in Concert – Brandeis University 1963, taped at Brandeis University on May 10, 1963, two weeks before the release of The Freewheelin' Bob Dylan. The tape was discovered in the archive of music writer Ralph J. Gleason, and the recording carries liner notes by Michael Gray, who says it captures Dylan "from way back when Kennedy was President and the Beatles hadn't yet reached America. It reveals him not at any Big Moment but giving a performance like his folk club sets of the period ... This is the last live performance we have of Bob Dylan before he becomes a star."
On Dylan's 70th birthday, three universities organized symposia on his work: the University of Mainz, the University of Vienna, and the University of Bristol invited literary critics and cultural historians to give papers on aspects of Dylan's work. Other events, including tribute bands, discussions and simple singalongs, took place around the world, as reported in The Guardian: "From Moscow to Madrid, Norway to Northampton and Malaysia to his home state of Minnesota, self-confessed 'Bobcats' will gather today to celebrate the 70th birthday of a giant of popular music."
Dylan's 35th studio album, Tempest, was released on September 11, 2012. The album features a tribute to John Lennon, "Roll On John", and the title track is a 14-minute song about the sinking of the Titanic. In Rolling Stone, Will Hermes gave Tempest five out of five stars, writing: "Lyrically, Dylan is at the top of his game, joking around, dropping wordplay and allegories that evade pat readings and quoting other folks' words like a freestyle rapper on fire".
Volume 10 of Dylan's Bootleg Series, Another Self Portrait (1969–1971), was released in August 2013. The album contained 35 previously unreleased tracks, including alternative takes and demos from Dylan's 1969–1971 recording sessions during the making of the Self Portrait and New Morning albums. The box set also included a live recording of Dylan's performance with the Band at the Isle of Wight Festival in 1969. Thom Jurek wrote, "For fans, this is more than a curiosity, it's an indispensable addition to the catalog." Columbia Records released a boxed set containing all 35 Dylan studio albums, six albums of live recordings and a collection of non-album material (Sidetracks) as Bob Dylan: Complete Album Collection: Vol. One, in November 2013. To publicize the box set, an innovative video of "Like a Rolling Stone" was released on Dylan's website. The interactive video, created by director Vania Heymann, allowed viewers to switch between 16 simulated TV channels, all featuring characters who are lip-synching the lyrics.
Dylan appeared in a commercial for the Chrysler 200 car which aired during the 2014 Super Bowl. In it, he says that "Detroit made cars and cars made America... So let Germany brew your beer, let Switzerland make your watch, let Asia assemble your phone. We will build your car." Dylan's ad was criticized for its protectionist implications, and people wondered whether he had "sold out".

The Lyrics: Since 1962 was published by Simon & Schuster in the fall of 2014. The book was edited by literary critic Christopher Ricks, Julie Nemrow and Lisa Nemrow and offered variant versions of Dylan's songs, sourced from out-takes and live performances. A limited edition of 50 books, signed by Dylan, was priced at $5,000. "It's the biggest, most expensive book we've ever published, as far as I know", said Jonathan Karp, Simon & Schuster's president and publisher.

A comprehensive edition of the Basement Tapes, songs recorded by Dylan and the Band in 1967, was released as The Bootleg Series Vol. 11: The Basement Tapes Complete in November 2014. The album included 138 tracks in a six-CD box; the 1975 album The Basement Tapes contained just 24 tracks from the material which Dylan and the Band had recorded at their homes in Woodstock, New York in 1967. Subsequently, over 100 recordings and alternate takes had circulated on bootleg records. The sleeve notes are by author Sid Griffin. The Basement Tapes Complete won the Grammy Award for Best Historical Album. The box set earned a score of 99 on Metacritic.
#### Shadows in the Night, Fallen Angels and Triplicate
In February 2015, Dylan released Shadows in the Night, featuring ten songs written between 1923 and 1963, which have been described as part of the Great American Songbook. All of the songs had been recorded by Frank Sinatra, but both critics and Dylan himself cautioned against seeing the record as a collection of "Sinatra covers". Dylan explained: "I don't see myself as covering these songs in any way. They've been covered enough. Buried, as a matter a fact. What me and my band are basically doing is uncovering them. Lifting them out of the grave and bringing them into the light of day". Critics praised the restrained instrumental backings and the quality of Dylan's singing. The album debuted at number one in the UK Albums Chart in its first week of release. The Bootleg Series Vol. 12: The Cutting Edge 1965–1966, consisting of previously unreleased material from the three albums Dylan recorded between January 1965 and March 1966 (Bringing It All Back Home, Highway 61 Revisited and Blonde on Blonde) was released in November 2015. The set was released in three formats: a 2-CD "Best Of" version, a 6-CD "Deluxe edition", and an 18-CD limited "Collector's Edition". On Dylan's website the "Collector's Edition" was described as containing "every single note recorded by Bob Dylan in the studio in 1965/1966". The Best of the Cutting Edge entered the Billboard Top Rock Albums chart at number one on November 18, based on its first-week sales.
Dylan released Fallen Angels, described as "a direct continuation of the work of 'uncovering' the Great Songbook that he began on Shadows In the Night", in May 2016. The album contained twelve songs by classic songwriters such as Harold Arlen, Sammy Cahn and Johnny Mercer, eleven of which had been recorded by Sinatra. Jim Farber wrote in Entertainment Weekly: "Tellingly, [Dylan] delivers these songs of love lost and cherished not with a burning passion but with the wistfulness of experience. They're memory songs now, intoned with a present sense of commitment. Released just four days ahead of his 75th birthday, they couldn't be more age-appropriate". The 1966 Live Recordings, including every known recording of Dylan's 1966 concert tour, was released in November 2016. The recordings commence with the concert in White Plains, New York, on February 5, 1966, and end with the Royal Albert Hall concert in London on May 27. The New York Times reported most of the concerts had "never been heard in any form", and described the set as "a monumental addition to the corpus".
In March 2017, Dylan released a triple album of 30 more recordings of classic American songs, Triplicate. Dylan's 38th studio album was recorded in Hollywood's Capitol Studios and features his touring band. Dylan posted a long interview on his website to promote the album, and was asked if this material was an exercise in nostalgia.
> Nostalgic? No I wouldn't say that. It's not taking a trip down memory lane or longing and yearning for the good old days or fond memories of what's no more. A song like 'Sentimental Journey' is not a way back when song, it doesn't emulate the past, it's attainable and down to earth, it's in the here and now.
Critics praised the thoroughness of Dylan's exploration of the Great American Songbook, though, in the opinion of Uncut, "For all its easy charms, Triplicate labours its point to the brink of overkill. After five albums' worth of croon toons, this feels like a fat full stop on a fascinating chapter."
The next volume of Dylan's Bootleg Series revisited his "Born Again" Christian period of 1979 to 1981, described by Rolling Stone as "an intense, wildly controversial time that produced three albums and some of the most confrontational concerts of his long career". Reviewing the box set The Bootleg Series Vol. 13: Trouble No More 1979–1981, comprising 8 CDs and 1 DVD, Jon Pareles wrote in The New York Times:
> Decades later, what comes through these recordings above all is Mr. Dylan's unmistakable fervor, his sense of mission. The studio albums are subdued, even tentative, compared with what the songs became on the road. Mr. Dylan's voice is clear, cutting and ever improvisational; working the crowds, he was emphatic, committed, sometimes teasingly combative. And the band tears into the music.
Trouble No More includes a DVD of a film directed by Jennifer Lebeau consisting of live footage of Dylan's gospel performances interspersed with sermons delivered by actor Michael Shannon.
In April 2018, Dylan made a contribution to the compilation EP Universal Love, a collection of reimagined wedding songs for the LGBT community. The album was funded by MGM Resorts International and the songs are intended to function as "wedding anthems for same-sex couples". Dylan recorded the 1929 song "She's Funny That Way", changing the pronoun so that it became "He's Funny That Way". The song was previously recorded by Billie Holiday and Frank Sinatra. That same month, The New York Times reported that Dylan was launching Heaven's Door, a range of three whiskeys. The Times described the venture as "Mr. Dylan's entry into the booming celebrity-branded spirits market, the latest career twist for an artist who has spent five decades confounding expectations". Dylan has been involved in both the creation and the marketing of the range; on September 21, 2020, Dylan resurrected Theme Time Radio Hour with a two-hour special with the theme of "Whiskey". On November 2, 2018, Dylan released More Blood, More Tracks as Volume 14 in the Bootleg Series. The set comprises all Dylan's recordings for Blood on the Tracks and was issued as a single CD and also as a six-CD Deluxe Edition.
In 2019, Netflix released Rolling Thunder Revue: A Bob Dylan Story by Martin Scorsese, billed as "Part documentary, part concert film, part fever dream". The film received largely positive reviews but also aroused controversy because it mixed documentary footage filmed during the Rolling Thunder Revue in the fall of 1975 with fictitious characters and stories. Coinciding with the film release, the box set The Rolling Thunder Revue: The 1975 Live Recordings, was released by Columbia Records. The set comprises five full Dylan performances from the tour and recently discovered tapes from Dylan's tour rehearsals. The box set received an aggregate score of 89 on Metacritic, indicating "universal acclaim". The next installment of Dylan's Bootleg Series, Bob Dylan (featuring Johnny Cash) – Travelin' Thru, 1967 – 1969: The Bootleg Series Vol. 15, was released on November 1. The set comprises outtakes from Dylan's albums John Wesley Harding and Nashville Skyline, and songs that Dylan recorded with Johnny Cash in Nashville in 1969 and with Earl Scruggs in 1970.
### 2020s
#### Rough and Rowdy Ways
On March 26, 2020, Dylan released "Murder Most Foul", a seventeen-minute song revolving around the Kennedy assassination, on his YouTube channel. Billboard reported on April 8 that "Murder Most Foul" had topped the Billboard Rock Digital Song Sales Chart, the first time that Dylan had scored a number one song on a pop chart under his own name. Three weeks later, on April 17, 2020, Dylan released another new song, "I Contain Multitudes". The title is from Walt Whitman's poem "Song of Myself". On May 7, Dylan released a third single, "False Prophet", accompanied by the news that the three songs would all appear on a forthcoming double album.
Rough and Rowdy Ways, Dylan's 39th studio album and his first album of original material since 2012, was released on June 19 to favorable reviews. Alexis Petridis wrote: "For all its bleakness, Rough and Rowdy Ways might well be Bob Dylan's most consistently brilliant set of songs in years: the die-hards can spend months unravelling the knottier lyrics, but you don't need a PhD in Dylanology to appreciate its singular quality and power." Rob Sheffield wrote: "While the world keeps trying to celebrate him as an institution, pin him down, cast him in the Nobel Prize canon, embalm his past, this drifter always keeps on making his next escape. On Rough and Rowdy Ways, Dylan is exploring terrain nobody else has reached before—yet he just keeps pushing on into the future". The album earned a score of 95 on Metacritic, indicating "universal acclaim". In its first week of release Rough and Rowdy Ways reached number one on the UK album chart, making Dylan "the oldest artist to score a No. 1 of new, original material".
In December 2020, it was announced that Dylan had sold his entire song catalog to Universal Music Publishing Group, including both the income he receives as a songwriter and his control of their copyright. Universal, a division of the French media conglomerate Vivendi, will collect all future income from the songs. The New York Times stated Universal had purchased the copyright to over 600 songs and the price was "estimated at more than $300 million", although other reports suggested the figure was closer to $400 million.
In February 2021, Columbia Records released 1970, a three-CD set of recordings from the Self Portrait and New Morning sessions, including the entirety of the session Dylan recorded with George Harrison on May 1, 1970. Dylan's 80th birthday was commemorated by a virtual conference, Dylan@80, organized by the University of Tulsa Institute for Bob Dylan Studies. The program featured seventeen sessions over three days delivered by over fifty international scholars, journalists and musicians. Several new biographies and studies of Dylan were published.
In July 2021, livestream platform Veeps presented a 50-minute performance by Dylan, Shadow Kingdom: The Early Songs of Bob Dylan. Filmed in black and white with a film noir look, the performance featured Dylan playing 13 songs in a club setting with an audience. It was favorably reviewed, and one critic suggested the backing band resembled the style of the musical Girl from the North Country. The soundtrack to the film was released in 2 LP and CD formats in June 2023. In September 2021, Dylan released Springtime in New York: The Bootleg Series Vol. 16 (1980–1985), issued in 2 LP, 2 CD and 5 CD formats. It comprised rehearsals, live recordings, out-takes and alternative takes from Shot of Love, Infidels and Empire Burlesque. In The Daily Telegraph, Neil McCormick wrote: "These bootleg sessions remind us that Dylan's worst period is still more interesting than most artists' purple patches". Springtime in New York received an aggregate score of 85 on Metacritic.
On July 7, 2022, Christie's, London, auctioned a 2021 recording of Dylan singing "Blowin' in the Wind". The record was in an innovative "one of one" recording medium, branded as Ionic Original, which producer T Bone Burnett claimed "surpasses the sonic excellence and depth for which analogue sound is renowned, while at the same time boasting the durability of a digital recording." The recording fetched £1,482,000 (equivalent to $1,769,508). In November, Dylan published The Philosophy of Modern Song, a collection of 66 essays on songs by other artists. The New Yorker described it as "a rich, riffy, funny, and completely engaging book of essays". Other reviewers praised the book's eclectic outlook, while some questioned its variations in style and its dearth of female songwriters.
In January 2023, Dylan released The Bootleg Series Vol. 17: Fragments – Time Out of Mind Sessions (1996–1997) in multiple formats. The 5-CD version comprised a re-mix of the 1997 album "to sound more like how the songs came across when the musicians originally played them in the room" without the effects and processing which producer Daniel Lanois applied later; 25 previously unreleased out-takes from the studio sessions; and a disc of live performances of each song on the album performed by Dylan and his band in concert. On November 17, 2023, Dylan released The Complete Budokan 1978, containing the full recordings of the February 28 and March 1 Tokyo concerts from his 1978 Tour.
Dylan contributed a cover version of Cole Porter's song "Don't Fence Me In" to the soundtrack of the biographical film Reagan, which was released on August 30, 2024. On September 20, 2024, Dylan released The 1974 Live Recordings, a 27-disc CD boxset of recordings from the 1974 Bob Dylan & The Band tour, featuring 417 previously unreleased live tracks.
## Never Ending Tour
The Never Ending Tour commenced on June 7, 1988. Dylan has played roughly 100 dates a year since, a heavier schedule than most performers who started in the 1960s. By April 2019, Dylan and his band had played more than 3,000 shows, anchored by long-time bassist Tony Garnier.
To the dismay of some of his audience, Dylan's performances are unpredictable as he often alters his arrangements and changes his vocal approach. These variable performances have divided critics. Richard Williams and Andy Gill argued that Dylan has found a successful way to present his rich legacy of material. Others have criticized his live performances for changing "the greatest lyrics ever written so that they are effectively unrecognisable", and giving so little to the audience that "it is difficult to understand what he is doing on stage at all".
In September 2021, Dylan's touring company announced a series of tours which were billed as the "Rough and Rowdy Ways World Wide Tour, 2021–2024". The Rough and Rowdy Ways World Tour replaced Dylan's varied set lists with a more stable repertory, performing nine of the ten songs on his 2020 album. Nevertheless, the tour has been referred to by the media as an extension of his ongoing Never Ending Tour.
In the fall of 2024, Dylan embarked on a European tour, beginning in Prague, Czech Republic, on October 4 and ending in London on November 14. Alex Ross has summarized Dylan's touring career: "his shows cause his songs to mutate, so that no definitive or ideal version exists. Dylan's legacy will be the sum of thousands of performances, over many decades... Every night, whether he's in good or bad form, he says, in effect, 'Think again.'"
## Personal life
### Romantic relationships
#### Echo Helstrom
Echo Helstrom was Dylan's high school girlfriend. The couple listened to rhythm-and-blues together on the radio, and her family exposed him to singers such as Jimmie Rodgers on 78 RPM records, as well as to a wealth of folk music magazines, sheet music, and manuscripts. Helstrom is believed by some to be the inspiration for Dylan's song "Girl from the North Country", though this is disputed.
#### Suze Rotolo
Dylan's first serious relationship was with artist Suze Rotolo, a daughter of Communist Party USA radicals. According to Dylan, "She was the most erotic thing I'd ever seen ... The air was suddenly filled with banana leaves. We started talking and my head started to spin". Rotolo was photographed arm-in-arm with Dylan on the cover of his album The Freewheelin' Bob Dylan. Critics have connected Rotolo to some of Dylan's early love songs, including "Don't Think Twice, It's All Right". The relationship ended in 1964. In 2008, Rotolo published a memoir about her life in Greenwich Village and her relationship with Dylan in the 1960s, A Freewheelin' Time.
#### Joan Baez
When Joan Baez met Dylan in April 1961, she had already released her first album and was acclaimed as the "Queen of Folk". On hearing Dylan perform his song "With God on Our Side", Baez later said, "I never thought anything so powerful could come out of that little toad". In July 1963, Baez invited Dylan to join her on stage at the Newport Folk Festival, setting the scene for similar duets over the next two years. By the time of Dylan's 1965 tour of the UK, their romantic relationship had begun to fizzle out, as captured in D. A. Pennebaker's documentary film Dont Look Back. Baez later toured with Dylan as a performer on his Rolling Thunder Revue in 1975–76. Baez also starred as "The Woman In White" in the film Renaldo and Clara (1978), directed by Dylan. Dylan and Baez toured together again in 1984 with Carlos Santana.
Baez recalled her relationship with Dylan in Martin Scorsese's documentary film No Direction Home (2005). Baez wrote about Dylan in two autobiographies—admiringly in Daybreak (1968), and less admiringly in And A Voice to Sing With (1987). Her song "Diamonds & Rust" has been described as "an acute portrait" of Dylan.
#### Sara Lownds
Dylan married Sara Lownds, who had worked as a model and secretary at Drew Associates, on November 22, 1965. They had four children: Jesse Byron Dylan (born January 6, 1966), Anna Lea (born July 11, 1967), Samuel Isaac Abram (born July 30, 1968), and Jakob Luke (born December 9, 1969). Dylan also adopted Sara's daughter from a prior marriage, Maria Lownds (later Dylan, born October 21, 1961). Sara Dylan played the role of Clara in Dylan's film Renaldo and Clara (1978). Bob and Sara Dylan were divorced on June 29, 1977.
#### Carolyn Dennis
Dylan and his backing singer Carolyn Dennis (often professionally known as Carol Dennis) have a daughter, Desiree Gabrielle Dennis-Dylan, born on January 31, 1986. The couple were married on June 4, 1986, and divorced in October 1992. Their marriage and child remained a closely guarded secret until the publication of Howard Sounes's biography Down the Highway: The Life of Bob Dylan, in 2001.
### Home
When not touring, Dylan is believed to live primarily in Point Dume, a promontory on the coast of Malibu, California, though he owns property around the world.
### Religious beliefs
Growing up in Hibbing, Minnesota, Dylan and his family were part of the area's small, close-knit Jewish community, and Dylan had his Bar Mitzvah in May 1954. Around the time of his 30th birthday, in 1971, Dylan visited Israel, and also met Rabbi Meir Kahane, founder of the New York-based Jewish Defense League.
In the late 1970s, Dylan converted to Christianity. In November 1978, guided by his friend Mary Alice Artes, Dylan made contact with the Vineyard School of Discipleship. Vineyard Pastor Kenn Gulliksen recalled: "Larry Myers and Paul Emond went over to Bob's house and ministered to him. He responded by saying yes, he did in fact want Christ in his life. And he prayed that day and received the Lord". From January to March 1979, Dylan attended Vineyard's Bible study classes in Reseda, California.
By 1984, Dylan was distancing himself from the "born again" label. He told Kurt Loder of Rolling Stone: "I've never said I'm 'born again'. That's just a media term. I don't think I've been an agnostic. I've always thought there's a superior power, that this is not the real world and that there's a world to come." In 1997, he told David Gates of Newsweek:
> Here's the thing with me and the religious thing. This is the flat-out truth: I find the religiosity and philosophy in the music. I don't find it anywhere else. Songs like "Let Me Rest on a Peaceful Mountain" or "I Saw the Light"—that's my religion. I don't adhere to rabbis, preachers, evangelists, all of that. I've learned more from the songs than I've learned from any of this kind of entity. The songs are my lexicon. I believe the songs.
Dylan has supported the Chabad Lubavitch movement, and has privately participated in Jewish religious events, including his sons' Bar Mitzvahs and services at Hadar Hatorah, a Chabad Lubavitch yeshiva. In 1989 and 1991, he appeared on the Chabad telethon.
Dylan has continued to perform songs from his gospel albums in concert, occasionally covering traditional religious songs. He has made passing references to his religious faith, such as in a 2004 interview with 60 Minutes, when he told Ed Bradley, "the only person you have to think twice about lying to is either yourself or to God". He explained his constant touring schedule as part of a bargain he made a long time ago with the "chief commander—in this earth and in the world we can't see".
Speaking to Jeff Slate of The Wall Street Journal in December 2022, Dylan reaffirmed his religious outlook: "I read the scriptures a lot, meditate and pray, light candles in church. I believe in damnation and salvation as well as predestination. The Five Books of Moses, Pauline Epistles, Invocation of the Saints, all of it."
## Accolades
Dylan has been inducted into the Rock and Roll Hall of Fame, Nashville Songwriters Hall of Fame and Songwriters Hall of Fame. In 1997, US President Bill Clinton presented Dylan with a Kennedy Center Honor in the East Room of the White House, saying: "He probably had more impact on people of my generation than any other creative artist. His voice and lyrics haven't always been easy on the ear, but throughout his career Bob Dylan has never aimed to please. He's disturbed the peace and discomforted the powerful". In May 2000, Dylan received the Polar Music Prize from Sweden's King Carl XVI Gustaf. In June 2007, Dylan received the Prince of Asturias Award in the Arts category; the jury called him "a living myth in the history of popular music and a light for a generation that dreamed of changing the world." In 2008, the Pulitzer Prize jury awarded him a special citation for "his profound impact on popular music and American culture, marked by lyrical compositions of extraordinary poetic power".
Dylan received the Presidential Medal of Freedom in May 2012. President Barack Obama, presenting Dylan with the award, said "There is not a bigger giant in the history of American music." Obama praised Dylan's voice for its "unique gravelly power that redefined not just what music sounded like but the message it carried and how it made people feel". In November 2013, Dylan was awarded France's highest honor, the Légion d'Honneur, despite the misgivings of the grand chancellor of the Légion, who had declared the singer unworthy. In February 2015, Dylan accepted the MusiCares Person of the Year award from the National Academy of Recording Arts and Sciences, in recognition of his philanthropic and artistic contributions.
### Nobel Prize in Literature
In 1996, Gordon Ball of the Virginia Military Institute nominated Dylan for the Nobel Prize in Literature, initiating a campaign that lasted for 20 years. On October 13, 2016, the Nobel committee announced that it would be awarding Dylan the prize "for having created new poetic expressions within the great American song tradition". The New York Times reported: "Mr. Dylan, 75, is the first musician to win the award, and his selection on Thursday is perhaps the most radical choice in a history stretching back to 1901." Dylan remained silent for days after receiving the award, and then told journalist Edna Gundersen that it was "amazing, incredible. Whoever dreams about something like that?" Dylan's Nobel Lecture was posted on the Nobel Prize website on June 5, 2017. Horace Engdahl, a member of the Nobel Committee, described Dylan's place in literary history:
> a singer worthy of a place beside the Greek bards, beside Ovid, beside the Romantic visionaries, beside the kings and queens of the blues, beside the forgotten masters of brilliant standards.
## Legacy
Dylan has been described as one of the most influential figures of the 20th century, musically and culturally. He was included in the Time 100: The Most Important People of the Century, where he was called "master poet, caustic social critic and intrepid, guiding spirit of the counterculture generation". Paul Simon suggested that Dylan's early compositions virtually took over the folk genre: "[Dylan's] early songs were very rich ... with strong melodies. 'Blowin' in the Wind' has a really strong melody. He so enlarged himself through the folk background that he incorporated it for a while. He defined the genre for a while."
For many critics, Dylan's greatest achievement was the cultural synthesis exemplified by his mid-1960s trilogy of albums—Bringing It All Back Home, Highway 61 Revisited and Blonde on Blonde. In Mike Marqusee's words:
> Between late 1964 and the middle of 1966, Dylan created a body of work that remains unique. Drawing on folk, blues, country, R&B, rock'n'roll, gospel, British beat, symbolist, modernist and Beat poetry, surrealism and Dada, advertising jargon and social commentary, Fellini and Mad magazine, he forged a coherent and original artistic voice and vision. The beauty of these albums retains the power to shock and console.
Dylan's lyrics began to receive critical study as early as 1998, when Stanford University sponsored the first international academic conference on Bob Dylan held in the United States. In 2004, Richard F. Thomas, Classics professor at Harvard University, created a freshman seminar titled "Dylan", which aimed "to put the artist in context of not just popular culture of the last half-century, but the tradition of classical poets like Virgil and Homer." Thomas went on to publish Why Bob Dylan Matters, exploring Dylan's connections with Greco-Roman literature. Literary critic Christopher Ricks published Dylan's Visions of Sin, an appreciation of Dylan's work. Following Dylan's Nobel win, Ricks reflected: "I'd not have written a book about Dylan, to stand alongside my books on Milton and Keats, Tennyson and T.S. Eliot, if I didn't think Dylan a genius of and with language." The critical consensus that Dylan's songwriting was his outstanding creative achievement was articulated by Encyclopædia Britannica: "Hailed as the Shakespeare of his generation, Dylan ... set the standard for lyric writing." Former British poet laureate Andrew Motion said Dylan's lyrics should be studied in schools. His lyrics have entered the vernacular; Edna Gundersen notes that
> Lines that branded Dylan a poet and counterculture valedictorian in the '60s are imprinted on the culture: "When you got nothing, you got nothing to lose"; "a hard rain's a-gonna fall"; "to live outside the law you must be honest." Some lyrics — "you don't need a weather man to know which way the wind blows" and "the times they are a-changin' " — appear in Bartlett's Familiar Quotations.
Rolling Stone ranked Dylan first on its 2015 list of the 100 Greatest Songwriters of All Time, fifteenth on its 2023 list of the Greatest Singers of All Time, and placed "Like a Rolling Stone" first on its lists of greatest songs in 2004 and 2011. He was listed second on the magazine's list of the hundred greatest artists. The Rolling Stone Encyclopedia of Rock & Roll writes that "His lyrics—the first in rock to be seriously regarded as literature—became so well known that politicians from Jimmy Carter to Václav Havel have cited them as an influence."
Dylan's voice also received critical attention. Robert Shelton described his early vocal style as "a rusty voice suggesting Guthrie's old performances, etched in gravel like Dave Van Ronk's". His voice continued to develop as he began to work with rock'n'roll backing bands; Michael Gray described the sound of Dylan's vocal work on "Like a Rolling Stone" as "at once young and jeeringly cynical". As Dylan's voice aged during the 1980s, for some critics, it became more expressive. Christophe Lebold writes in the journal Oral Tradition:
> Dylan's more recent broken voice enables him to present a world view at the sonic surface of the songs—this voice carries us across the landscape of a broken, fallen world. The anatomy of a broken world in "Everything is Broken" (on the album Oh Mercy) is but an example of how the thematic concern with all things broken is grounded in a concrete sonic reality.
Among musicians who have acknowledged his influence are John Lennon, Paul McCartney, Jerry Garcia, Pete Townshend, Syd Barrett, Joni Mitchell, Neil Young, Bruce Springsteen, David Bowie, Bryan Ferry, Patti Smith, Joe Strummer, Bono, Nick Cave, Leonard Cohen, Tom Waits, Ole Paus and Chuck D. Dylan significantly contributed to the initial success of both the Byrds and the Band: the Byrds achieved chart success with their version of "Mr. Tambourine Man" and the subsequent album, while the Band were Dylan's backing band on his 1966 tour, recorded The Basement Tapes with him in 1967 and featured three previously unreleased Dylan songs on their debut album. Johnny Cash, introducing "Wanted Man", said "I don't have to tell you who Bob Dylan is—the greatest writer of our time."
Some critics have dissented from the view of Dylan as a visionary figure in popular music. In his book Awopbopaloobop Alopbamboom, Nik Cohn objected: "I can't take the vision of Dylan as seer, as teenage messiah, as everything else he's been worshipped as. The way I see him, he's a minor talent with a major gift for self-hype". Australian critic Jack Marx credited Dylan with changing the persona of the rock star: "What cannot be disputed is that Dylan invented the arrogant, faux-cerebral posturing that has been the dominant style in rock since, with everyone from Mick Jagger to Eminem educating themselves from the Dylan handbook".
Fellow musicians have also expressed critical views. Joni Mitchell described Dylan as a "plagiarist" and his voice as "fake" in a 2010 interview in the Los Angeles Times. Mitchell's comments led to discussions on Dylan's use of other people's material, both supporting and criticizing him. Talking to Mikal Gilmore in Rolling Stone in 2012, Dylan responded to the allegation of plagiarism, including his use of Henry Timrod's verse in his album Modern Times, by saying that it was "part of the tradition".
Dylan's music has inspired artists in other fields. Dave Gibbons recalls how he and Alan Moore were inspired by the lines of "Desolation Row" beginning "At midnight, all the agents/ And the superhuman crew...":
> It was a glimpse, a mere fragment of something; something ominous, paranoid and threatening. But something that showed that comics, like poetry or rock and roll or Bob Dylan himself, might feasibly become part of the greater cultural continuum. The lines must have also lodged in Alan's consciousness for, nearly twenty years later, Dylan's words eventually provided the title of the first issue of our comic book series Watchmen.
Gibbons says of their seminal comic, "It began with Bob Dylan."
In 2007, Todd Haynes released I'm Not There, "inspired by the music and many lives of Bob Dylan". The movie used six actors, Christian Bale, Cate Blanchett, Marcus Carl Franklin, Richard Gere, Heath Ledger and Ben Whishaw, to explore different facets of Dylan's life. Dylan's previously unreleased 1967 song from which the film takes its name was included on the original soundtrack along with covers of Dylan songs by such diverse artists as Sonic Youth, Calexico and Yo La Tengo. Irish playwright Conor McPherson wrote and directed the musical Girl from the North Country, which used Dylan's songs to tell the stories of various characters during the Depression years, set in Dylan's birthplace, Duluth, Minnesota. The play premiered in London in 2017.
Dylan's rise to stardom, from his arrival in New York in 1961 to his controversial performance at Newport in 1965, is portrayed by actor Timothée Chalamet in the feature film A Complete Unknown, scheduled to open in the U.S. on December 25, 2024.
If Dylan's work in the 1960s was seen as bringing intellectual ambition to popular music, critics in the 21st century described him as a figure who had greatly expanded the folk culture from which he initially emerged. In his review of I'm Not There, J. Hoberman wrote:
> Elvis might never have been born, but someone else would surely have brought the world rock 'n' roll. No such logic accounts for Bob Dylan. No iron law of history demanded that a would-be Elvis from Hibbing, Minnesota, would swerve through the Greenwich Village folk revival to become the world's first and greatest rock 'n' roll beatnik bard and then—having achieved fame and adoration beyond reckoning—vanish into a folk tradition of his own making.
### Archives and honors
The sale of Dylan's archive of about 6,000 items of memorabilia to the George Kaiser Family Foundation and the University of Tulsa was announced on March 2, 2016. It was reported the sale price was "an estimated $15 million to $20 million". The archive comprises notebooks, drafts of Dylan lyrics, recordings, and correspondence. To house the archive, the Bob Dylan Center in Tulsa, Oklahoma opened on May 10, 2022.
In 2005, 7th Avenue East in Hibbing, Minnesota, the street on which Dylan lived from ages 6 to 18, received the honorary name Bob Dylan Drive. In 2006, a cultural pathway, Bob Dylan Way, was inaugurated in Duluth, Minnesota, where Dylan was born. The 1.8-mile path links "cultural and historically significant areas of downtown for the tourists".
In 2015, a 160-foot-wide Dylan mural by Brazilian street artist Eduardo Kobra was unveiled in downtown Minneapolis.
In December 2013, the Fender Stratocaster that Dylan had played at the 1965 Newport Folk Festival fetched $965,000, the second-highest price paid for a guitar. In June 2014, Dylan's hand-written lyrics of "Like a Rolling Stone" fetched $2 million at auction, a record for a popular music manuscript.
## Visual art
Dylan's visual art was first seen by the public via a painting he contributed for the cover of The Band's Music from Big Pink album in 1968. The cover of Dylan's own 1970 album Self Portrait features the painting of a human face by Dylan. More of Dylan's artwork was revealed with the 1973 publication of his book Writings and Drawings. The cover of Dylan's 1974 album Planet Waves again featured one of his paintings. In 1994 Random House published Drawn Blank, a book of Dylan's drawings. In 2007, the first public exhibition of Dylan's paintings, The Drawn Blank Series, opened at the Kunstsammlungen in Chemnitz, Germany; it showcased more than 200 watercolors and gouaches made from the original drawings. The exhibition coincided with the publication of Bob Dylan: The Drawn Blank Series, which includes 170 reproductions from the series. From September 2010 until April 2011, the National Gallery of Denmark exhibited 40 large-scale acrylic paintings by Dylan, The Brazil Series.
In July 2011, a leading contemporary art gallery, Gagosian Gallery, announced their representation of Dylan's paintings. An exhibition of Dylan's art, The Asia Series, opened at the Gagosian Madison Avenue Gallery on September 20, displaying Dylan's paintings of scenes in China and the Far East. The New York Times reported that "some fans and Dylanologists have raised questions about whether some of these paintings are based on the singer's own experiences and observations, or on photographs that are widely available and were not taken by Mr. Dylan". The Times pointed to close resemblances between Dylan's paintings and historic photos of Japan and China, and photos taken by Dmitri Kessel and Henri Cartier-Bresson. Art critic Blake Gopnik has defended Dylan's artistic practice, arguing: "Ever since the birth of photography, painters have used it as the basis for their works: Edgar Degas and Édouard Vuillard and other favorite artists—even Edvard Munch—all took or used photos as sources for their art, sometimes barely altering them". The Magnum photo agency confirmed that Dylan had licensed the reproduction rights of these photographs.
Dylan's second show at the Gagosian Gallery, Revisionist Art, opened in November 2012. The show consisted of thirty paintings, transforming and satirizing popular magazines, including Playboy and Babytalk. In February 2013, Dylan exhibited the New Orleans Series of paintings at the Palazzo Reale in Milan. In August 2013, Britain's National Portrait Gallery in London hosted Dylan's first major UK exhibition, Face Value, featuring twelve pastel portraits.
In November 2013, the Halcyon Gallery in London mounted Mood Swings, an exhibition in which Dylan displayed seven wrought iron gates he had made. In a statement released by the gallery, Dylan said,
> I've been around iron all my life ever since I was a kid. I was born and raised in iron ore country, where you could breathe it and smell it every day. Gates appeal to me because of the negative space they allow. They can be closed but at the same time they allow the seasons and breezes to enter and flow. They can shut you out or shut you in. And in some ways there is no difference.
In November 2016, the Halcyon Gallery featured a collection of drawings, watercolors and acrylic works by Dylan. The exhibition, The Beaten Path, depicted American landscapes and urban scenes, inspired by Dylan's travels across the US. The show was reviewed by Vanity Fair and Asia Times Online. In October 2018, the Halcyon Gallery mounted an exhibition of Dylan's drawings, Mondo Scripto. The works consisted of Dylan's hand-written lyrics to his songs, with each song illustrated by a drawing.
Retrospectrum, the largest retrospective of Dylan's visual art to date, consisting of over 250 works in a variety of media, debuted at the Modern Art Museum in Shanghai in 2019. Building on the exhibition in China, a version of Retrospectrum, which includes a new series of paintings, "Deep Focus", drawn from film imagery, opened at the Frost Art Museum in Miami on November 30, 2021.
Since 1994, Dylan has published nine books of paintings and drawings. In November 2022, Dylan apologized for using an autopen to sign books and artwork which were subsequently sold as "hand-signed" since 2019.
In 2024 an abstract painting by Dylan from the late 1960s sold at auction for approximately $200,000. The painting was originally given to a relative of the seller in exchange for an astrology chart.
## Written works
Dylan has published Tarantula, a work of prose poetry; Chronicles: Volume One, the first part of his memoirs; several books of the lyrics of his songs; and nine books of his art. Dylan's third full-length book, The Philosophy of Modern Song, which contains 66 essays on songs by other artists, was published on November 1, 2022. Dylan has also been the subject of numerous biographies and critical studies.
## Discography
- Bob Dylan (1962)
- The Freewheelin' Bob Dylan (1963)
- The Times They Are a-Changin' (1964)
- Another Side of Bob Dylan (1964)
- Bringing It All Back Home (1965)
- Highway 61 Revisited (1965)
- Blonde on Blonde (1966)
- John Wesley Harding (1967)
- Nashville Skyline (1969)
- Self Portrait (1970)
- New Morning (1970)
- Pat Garrett & Billy the Kid (1973)
- Dylan (1973)
- Planet Waves (1974)
- Blood on the Tracks (1975)
- The Basement Tapes (1975)
- Desire (1976)
- Street-Legal (1978)
- Slow Train Coming (1979)
- Saved (1980)
- Shot of Love (1981)
- Infidels (1983)
- Empire Burlesque (1985)
- Knocked Out Loaded (1986)
- Down in the Groove (1988)
- Oh Mercy (1989)
- Under the Red Sky (1990)
- Good as I Been to You (1992)
- World Gone Wrong (1993)
- Time Out of Mind (1997)
- "Love and Theft" (2001)
- Modern Times (2006)
- Together Through Life (2009)
- Christmas in the Heart (2009)
- Tempest (2012)
- Shadows in the Night (2015)
- Fallen Angels (2016)
- Triplicate (2017)
- Rough and Rowdy Ways (2020)
- Shadow Kingdom (2023)
# U.S. Route 40 Alternate (Keysers Ridge–Cumberland, Maryland)
U.S. Route 40 Alternate (Alt US 40) is the U.S. Highway designation for a former segment of U.S. Route 40 (US 40) through Garrett and Allegany counties in Maryland. The highway begins at US 40 near exit 14 on Interstate 68 (I-68) and runs 31.80 miles (51.18 km) eastward to Cumberland, where it ends at exit 44 on I-68. Alt US 40 is maintained by the Maryland State Highway Administration (MDSHA).
The highway is known as National Pike because it follows the original alignment of the historic National Road. As a result, there are many historic sites along Alt US 40, including the Casselman Bridge in Grantsville and the last remaining National Road toll gate house in Maryland, located in LaVale.
When the National Freeway was built in western Maryland paralleling the old National Road, parts of US 40 were bypassed. The part of the bypassed road between Keysers Ridge and Cumberland became Alt US 40, and other bypassed sections east of Cumberland became Maryland Route 144 (MD 144) and U.S. Route 40 Scenic. Although Alt US 40 has diminished in importance from its original status as the National Road with the construction of I-68, it remains an important route for local traffic and serves as the main street of both Grantsville and Frostburg.
## Route description
Alt US 40 runs from Keysers Ridge to Cumberland, following part of the route of the National Road through some of Maryland's most mountainous terrain in Garrett and Allegany counties. The highway is a part of the National Highway System as a principal arterial from the eastern junction with MD 36 in Frostburg to the intersection of Mechanic Street and Henderson Avenue in Cumberland.
### Garrett County
Alt US 40 branches from US 40 near exit 14 on I-68 at Keysers Ridge. It runs parallel to I-68 through northern Garrett County as a two-lane road with truck lanes on some uphill sections. The annual average daily traffic (AADT)—that is, the number of vehicles that use the road per day, averaged over the course of one year—is 1,831 at the western end of Alt US 40. For comparison, the parallel section of I-68 has an AADT of 14,271. Alt US 40 passes through some of the most mountainous terrain in Maryland. The route runs perpendicular to the mountain ridges in Garrett County, and as a result much of the section of the road in Garrett County runs uphill or downhill. The first mountain encountered by the highway east of Keysers Ridge is Negro Mountain. The road passes over the mountain at an elevation of 3,075 feet (937 m), which is the highest point on Alt US 40 and was also the highest point along the National Road. East of Negro Mountain, the highway enters Grantsville, where traffic increases to an AADT of 3,711, the highest traffic density on Alt US 40 in Garrett County. In Grantsville, Alt US 40 meets MD 669, which connects with Pennsylvania Route 669 toward Salisbury, Pennsylvania. A short distance east of this intersection, the highway meets MD 495, which connects with I-68 and continues southward toward Oakland. East of Grantsville, Alt US 40 passes over the Casselman River on a steel bridge built in 1933. Downstream from this bridge is the Casselman River Bridge State Park, centered on the stone arch bridge which originally carried the National Road over the Casselman River.
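The AADT figures quoted throughout this description follow the standard definition sketched below; the underlying annual traffic totals are not given here, so this is only a reminder of how the statistic is computed, not a calculation from source data.

```latex
\text{AADT} = \frac{\text{total traffic volume counted over one year}}{365\ \text{days}}
```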
Continuing eastward from Grantsville, Alt US 40 intersects US 219 Bus. a short distance north of exit 22 on I-68, before passing under the US 219 freeway. East of this intersection, traffic decreases, with an AADT of 1,681, the lowest traffic density along the entire route. The US 219 Bus. intersection is at the top of a hill known as Chestnut Ridge.
East of Chestnut Ridge, the highway passes over Meadow Mountain at an elevation of 2,789 feet (850 m). In eastern Garrett County, traffic on the route gradually increases to an AADT of 2,232. Alt US 40 passes under MD 546, which runs north from I-68, through Finzel, to the Pennsylvania border. Although Alt US 40 does not directly intersect MD 546, it is connected to MD 546 by way of access road MD 546F, and also by MD 946, which intersects Alt US 40 near the top of Little Savage Mountain. Just east, the route crosses the larger Big Savage Mountain at an elevation of 2,847 feet (868 m) before entering Allegany County.
### Allegany County
After continuing into Allegany County, Alt US 40 descends Savage Mountain into Frostburg, where it passes through the town as Main Street. Main Street in Frostburg has the highest traffic density on the route, with an AADT of 15,022. For comparison, the parallel section of I-68 between exits 33 and 34 has an AADT of 20,931. In west Frostburg, the highway intersects MD 36, which runs concurrently with Alt US 40 for about a mile before separating from it in east Frostburg. In central Frostburg, Main Street intersects MD 936, an old alignment of MD 36. Continuing eastward from Frostburg, traffic density decreases to an AADT of 13,585 at the MD 55 intersection, staying between 13,000 and 15,000 for the remainder of the highway. Alt US 40 passes through Eckhart Mines, where it intersects MD 638, which connects with MD 36 north of Frostburg. In the eastern part of Eckhart Mines, the highway intersects MD 743, an old alignment of US 40 that was bypassed by the roadway that became Alt US 40.
East of Eckhart Mines, Alt US 40 passes through Clarysville, where it intersects MD 55. It is near Clarysville that the terrain followed by Alt US 40 changes: from Clarysville westward to the summit of Savage Mountain, the road runs uphill, while east of Clarysville, the road follows valleys, first the valley of Braddock Run and then the valley of Wills Creek, into Cumberland. Near the MD 55 intersection is a stone arch bridge, initially built in 1812 and rebuilt in the 1830s, which carried the National Road over Braddock Run, a tributary of Wills Creek. East of Clarysville, the highway passes through a gap carved by Braddock Run between Piney Mountain and Dan's Mountain. I-68, having been built later, is located on the hillside above Alt US 40, on the Dan's Mountain side of the gap. Alt US 40 then descends Red Hill into LaVale. At the bottom of Red Hill is the La Vale toll gate house; built in 1836, it collected tolls until the early 1900s and is the last original National Road toll gate house standing in Maryland. In LaVale, the route intersects MD 53, which serves as a truck bypass for US 220 to Cresaptown. Alt US 40 interchanges with westbound I-68 at exit 39, but eastbound access is only available via MD 53 and MD 658, which intersects Alt US 40 east of the exit 39 interchange. The highway expands to a four-lane road near its intersection with MD 53, then narrows to a two-lane road near its intersection with MD 658. East of the intersection with MD 658, Alt US 40 turns northward, passing through LaVale toward the Narrows, bypassing Haystack Mountain to the north, as opposed to I-68, which passes directly over Haystack Mountain, paralleling Braddock Road (MD 49).
Northeast of LaVale, Alt US 40 intersects MD 36 at the northern terminus of MD 36. Alt US 40 then passes through the Narrows, a gap between Haystack Mountain and Wills Mountain carved by Wills Creek, into Cumberland, where it follows Henderson Avenue and Baltimore Avenue to exit 44 on I-68, where Alt US 40 ends. The roadway continues eastward as MD 639.
## History
The roadway which became Alt US 40 in Garrett and Allegany counties is, with some realignments, the route followed by the National Road through western Maryland. Various historic sites associated with the National Road can be found along Alt US 40, including a toll-gate house (La Vale Tollgate House) and mile-marker in LaVale. The toll-gate house in LaVale is the last remaining toll-gate house on the National Road in Maryland. Several historic bridges from the National Road, since bypassed by newer bridges, are still present along the route of Alt US 40, including the Casselman Bridge over the Casselman River in Grantsville and a bridge in Clarysville.
### Braddock Road and the National Road
In 1755, during the French and Indian War, British troops under the command of General Edward Braddock completed the arduous task of building a road westward from Fort Cumberland. They largely followed an Indian trail known as Nemacolin's Path, expanding it to a 12-foot-wide (3.7 m) road using only hand tools. The road construction was part of the Braddock Expedition, the British campaign to seize Fort Duquesne from the French and Indian forces. Although the military expedition was a failure, the road continued to be used afterwards. However, with little maintenance, the road decayed over time until, by the early nineteenth century, little of it remained. The route followed by Alt US 40 today is very similar to that of Braddock's Road, apart from various realignments made over the years. For example, Braddock's Road crossed directly over Haystack Mountain west of Cumberland rather than following the Cumberland Narrows as later roads did.
The National Road, the first road funded by the U.S. federal government, was authorized by the United States Congress in 1806, and ran from Cumberland, Maryland, to Vandalia, Illinois. Construction started in 1811, and by 1837 the road reached Vandalia. Many sites from the National Road remain along Alt US 40, in particular the LaVale toll gate house, built in 1836. Following the completion of the National Road in 1837, the federal government ceded the road to the states to operate as a toll road, and toll gate houses such as the one in LaVale were built along its path in preparation for the transfer. Tolls continued to be collected along the National Road at the LaVale toll house until the late nineteenth century. The LaVale toll house was the first of its kind built along the National Road, and it is the last toll house still standing along the National Road in Maryland. The LaVale toll house was listed on the National Register of Historic Places in 1977.
### Realignments
Multiple realignments of the road that is now Alt US 40 have occurred since it was originally built as the National Road. Most such realignments are minor, such as to bypass an old bridge, but some have significantly affected the path of the road. One such realignment occurred in 1834, when a new route for the National Road was built through the Cumberland Narrows. The previous route had followed Braddock Road, a route now followed by MD 49; it passed over Haystack Mountain and was much steeper than the newer route through the Narrows, which allowed the road to bypass this steep mountain ascent. The stone arch bridge built across Wills Creek for the new alignment remained in service until 1932, when it was replaced by the present bridge across Wills Creek. The old bridge was torn down during the construction of the Wills Creek flood control system in the 1950s.
Another realignment of Alt US 40 occurred in Eckhart Mines, where in 1969 the road, then designated as US 40, was realigned to the north, bypassing the section of the highway through Eckhart Mines, which has a lower speed limit and sharp curves. The speed limit on the old alignment is 25 miles per hour (40 km/h), and the new alignment has a speed limit of 50 miles per hour (80 km/h) along most of the bypass. The new alignment intersects the old alignment, designated as MD 743, on the east end between MD 638 and MD 55. The west end of the old alignment meets MD 36 just south of its intersection with Alt US 40. MD 638, which prior to the realignment ended at US 40, was not truncated, and thus ends at MD 743.
### Historic bridges
There are several historic bridges along the National Road that are still present near the current route of Alt US 40. Among them are the Casselman River bridge in Grantsville, and the bridge over Braddock Run, a tributary of Wills Creek, in Clarysville. The original National Road bridge over the Casselman River was a stone arch bridge constructed in 1813. The 80-foot (24 m) span was built to be the largest bridge of its type in the United States at the time, and during its construction it was believed that the bridge could not stand on its own. The bridge was given such a large span in the hope that the Chesapeake and Ohio Canal would eventually pass under it, though construction on the canal stopped at Cumberland in 1850. When US 40 was first designated in 1925, it crossed the Casselman River on the original stone bridge. In 1933, a new steel bridge was constructed to replace the National Road bridge, and it is this bridge that Alt US 40 now follows. The original bridge was declared a National Historic Landmark in 1964, and is now part of the Casselman River Bridge State Park.
Another historic bridge stands in Clarysville, near the intersection of Alt US 40 and MD 55. This bridge, which crosses Braddock Run, was built in 1812, with later work being done in 1843. The stone arch bridge, located just south of the current alignment of Alt US 40, was restored in 1976.
### Origins of Alt US 40
Prior to the construction of I-68, US 40 followed the route currently designated as U.S. Route 40 Alternate. The first segment of what would become I-68 was built in Cumberland in the mid-1960s. The freeway, first designated as US 48, was extended westward through the 1970s, reaching West Virginia in 1976. The portions of US 40 that were bypassed between Cumberland and Keysers Ridge became U.S. Route 40 Alternate, which first appeared on MDSHA maps in the early 1980s. At this time, US 40 was realigned to follow the US 48 freeway, sharing the freeway with US 48. In 1991 the freeway was completed from Hancock to Morgantown, West Virginia. The US 48 designation was retired, and on August 2, 1991, the freeway became I-68.
# AMX-30E
The AMX-30E (E stands for España, Spanish for Spain) is a Spanish main battle tank based on France's AMX-30. Although originally the Spanish government sought to procure the German Leopard 1, the AMX-30 was ultimately awarded the contract due to its lower price and the ability to manufacture it in Spain. 280 units were manufactured by Santa Bárbara Sistemas for the Spanish Army, between 1974 and 1983.
First acquired in 1970, the tank was intended to supplement Spain's fleet of American M47 and M48 Patton tanks and to reduce the army's reliance on American equipment. The first 19 AMX-30 tanks were acquired from France in 1970, while another 280 were assembled in Spain. It was Spain's first mass-produced tank and developed the country's industry to the point where the government felt it could produce a tank on its own, leading to the development of the Lince tank project in 1985. It also gave Santa Bárbara Sistemas the experience that led to the production of the Leopard 2E in late 2003. Although final assembly was carried out by Santa Bárbara Sistemas, production of the AMX-30E also involved other companies in the country. Total production within Spain amounted to as much as 65% of the tank.
Spain's AMX-30E fleet went through two separate modifications in the late 1980s, a modernization program and a reconstruction program. The former, named the AMX-30EM2 (150 tanks), sought to modernize and improve the vehicle's automotive characteristics, while the latter, or the AMX-30EM1 (149 tanks), resulted in a more austere improvement of the tank's power plant by maintaining the existing engine and replacing the transmission. Both programs extended the vehicle's lifetime. Spain's fleet of AMX-30EM1s was replaced in the late 1990s by the German Leopard 2A4, and the AMX-30EM2s were replaced by Centauro wheeled anti-tank vehicles in the early 21st century.
Although 19 AMX-30Es were deployed to the Spanish Sahara in 1970, the tank never saw combat. In 1985 Indonesia expressed interest in the AMX-30E, while in 2004 the Spanish and Colombian governments discussed the possible sale of around 40 AMX-30EM2s. Both trade deals fell through.
## Background
In 1960, Spain's tank fleet was composed mainly of American M47 Patton tanks, with some newer M48 Patton tanks. The M47s had been acquired by the Spanish army in the mid-1950s, replacing the previous fleet of 1930s-vintage Panzer I, T-26 and Panzer IV tank designs. During the 1957–58 Ifni War, the United States' ban on the use of American ordnance supplied earlier as military aid pushed Spain to look for alternative equipment that could be freely employed in the Spanish Sahara.
In the early 1960s, Spain looked towards its European neighbors for a new tank. The Spanish government first approached Krauss-Maffei, the German manufacturer of the Leopard 1, and the company applied for an export license from the German Economics Ministry. Spain's status as a non-NATO country meant that the decision to grant the export license had to be reviewed by the Bundessicherheitsrat (Federal Security Council), or the BSR, which was responsible for the coordination of the national defense policy. Ultimately, the council ruled that Krauss-Maffei could sign an export contract with Spain. The deal was, however, stalled by pressure from the United Kingdom's Labour Party on the basis that the Leopard's 105-millimeter (4.13 in) L7 tank gun was British technology. Meanwhile, Spain tested the French AMX-30 between 2 and 10 June 1964.
The Leopard 1 and the AMX-30 originated from a joint tank development program known as the Europanzer. For a tank, the AMX-30 had a low silhouette; the height of the tank was 2.28 meters (7.5 feet), compared to the Leopard's 2.6 meters (8.5 feet). In terms of lethality, the AMX-30's Obus G high-explosive anti-tank (HEAT) round was one of the most advanced projectiles at the time. Because HEAT warheads become less efficient during spin stabilization induced by the rifling of a tank-gun barrel, the Obus G was designed so that the shaped charge warhead was mounted on ball bearings within an outer casing, allowing the round to be spin stabilized through the rifling of the gun without affecting the warhead inside. The Obus G was designed to penetrate up to 400 millimetres (16 inches) of steel armor. On the other hand, the Leopard was armed with the L7A3 tank gun, capable of penetrating the frontal armor of most contemporary tanks. Although the Leopard boasted greater armor than the AMX-30—partially accounting for the weight difference between the two tanks—the latter was sold at a cheaper price.
In May 1970, the Spanish government decided to sign a contract with the French company GIAT to begin production of the AMX-30. However, it was not the advantages of the French vehicle itself that influenced the decision. Rather, it was the UK's unwillingness to sell its L7 tank gun, the low cost of the AMX-30, and the French offer to allow Spain to manufacture the tank that led the Spanish Army to favor the French armored vehicle.
## Production
On 22 June 1970, the Spanish Ministry of Defense signed an agreement of military cooperation with France, which outlined plans for the future acquisition of around 200 tanks for the Spanish Army. Of these, 180 were to be manufactured under license in Spain and 20 were to be manufactured by France. Ultimately, GIAT was contracted to manufacture 19 tanks. These were delivered to the Spanish Legion's Bakali company, deployed in the Spanish Sahara. The first six AMX-30s were delivered by rail to the Spanish border city of Irún, in the Basque Country, and then transferred to Bilbao. Finally, they were shipped by the Spanish Navy, on the transport Almirante Lobo, to El Aaiún in the Spanish Sahara. This unit existed until 1975, when it was disbanded and its tanks transferred to the Uad Ras Mechanized Infantry Regiment.
This agreement laid the foundations for the upcoming tank plant at the Las Canteras industrial park, near the town of Alcalá de Guadaíra. Several parts of the tank were subcontracted to other Spanish companies, including Astilleros Españoles (turret), Boetticher, Duro Felguera and E. N. Bazán. The degree of local production varied by batch. The first 20 tanks were to have 18% of each vehicle manufactured in Spain; the next 40 would have 40% of the vehicle manufactured in Spain. The other 120 had 65% of the tank manufactured in the country. Production began in 1974, at a rate of five tanks per month, and ended on 25 June 1979. The first five tanks were delivered to the Uad Ras Mechanized Infantry Regiment on 30 October 1974. This batch also replaced the M41 Walker Bulldog light tanks and M48 Patton tanks in the Armored Cavalry Regiment Villaviciosa and the Armored Infantry Regiment Alcázar de Toledo, which received 23 and 44 tanks, respectively.
On 27 March 1979, prior to the end of production of the first batch, the Spanish Army and Santa Bárbara Sistemas signed a contract for the production and delivery of a second batch of 100 AMX-30Es. In 1980, after the 200th AMX-30E was delivered to the Spanish Army, the tank's patent was awarded to Spain. This allowed minor modifications to be made to the vehicle without having to consult GIAT. It also meant that the degree of local construction of each vehicle increased considerably. Production of the second batch lasted from 1979 to 1983. By the time production ended, the Spanish Army fielded 299 AMX-30Es (280 produced between 1974 and 1983, and 19 delivered from France in 1970) and 4 training vehicles delivered in 1975. Santa Bárbara Sistemas also manufactured 18 Roland España (designated AMX-30RE) anti-air vehicles and 10 AMX-30D armored recovery vehicles. The average cost per tank in the first batch was 45 million pesetas (US$642,800); the cost per tank increased during the second batch to 62 million pesetas (US$885,700).
Although newly built, the AMX-30E entered service with automotive issues, including problems with the antiquated 5SD-200D transmission. Consequently, as production of the first batch drew to a close, the Spanish Army and Santa Bárbara Sistemas began to study possible upgrades. The main objectives were to increase the power and reliability of the power pack, to improve the tank's firepower and accuracy, and to increase the vehicle's ballistic protection and overall survivability. A number of modernization packages were proposed, including a suggestion to mount the AMX-30E's turret on a Leopard 1 chassis. Other options included swapping the existing power pack for a new American diesel engine and transmission, or for a new German diesel engine and transmission. More austere versions of these same options were offered, pairing the existing HS-110 engine with the aforementioned transmissions. Another prototype was produced using the Leopard's more modern tracks, and a similar prototype mounted a new 12.7-millimeter (0.5 in) machine gun at the loader's position. France's GIAT also offered to modernize Spain's AMX-30Es to the AMX-30B2 standard, a modernization being applied to French AMX-30s.
### Modernization
Ultimately, a mixed solution named Tecnología Santa Bárbara-Bazán (Santa Bárbara-Bazán Technology), or TSB, was chosen. The improvement of the tank's mobility entailed replacing the HS-110 diesel engine with an MTU 833 Ka-501 diesel engine, producing 850 metric horsepower (625 kW), and the transmission with a German ZF LSG-3000, compatible with engines of up to 1,500 metric horsepower (1103 kW). The first 30 engines were to have 50% of their components manufactured in Spain; for the rest, 73% was to be produced indigenously. This new engine gave the modernized tank a power-to-weight ratio of 23 metric horsepower per tonne (21.13 hp/S/T). The new engine was coupled with the AMX-30B2's improved torsion-bar suspension, which used larger-diameter torsion bars and new shock absorbers.
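As a rough consistency check on the figures above (the modernized tank's combat weight is not stated in this section, so the tonnage below is only what the quoted power and power-to-weight figures imply, not a sourced specification):

```latex
\text{implied combat weight} \approx \frac{850\ \text{PS}}{23\ \text{PS/t}} \approx 37\ \text{t}
```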
To improve the tank's firepower, the gun mount around the loader's turret hatch was modified to allow the installation of a 12.7-millimeter (0.5 in) machine gun, while the main gun's firepower was augmented through the introduction of the new CETME437A armor-piercing, fin-stabilized discarding sabot (APFSDS) round. The gun's accuracy was improved through the installation of the new Mark 9 modification A/D fire control system, designed by Hughes Aircraft Company. The new system allowed firing during the day and during night operations, and increased the likelihood of a first-round hit. The fire control system was also modernized by replacing the old M282 gunner's periscope with a new periscope and a new Nd:YAG laser rangefinder. A new ballistics computer, the NSC-800, was issued, as well as a new digital panel for the gunner, designed and manufactured by the Spanish company INISEL. The tank commander also received a control unit that allowed the choice of ammunition for the gun and provided information on the ballistics of the round and the target to be engaged. In turn, the loader received a presentation unit to display information on which round to load into the gun's breech and to communicate the ballistic data received, including angular velocity, wind velocity, gun elevation and vehicle inclination. The fire control system also allowed for a future upgrade to a more sophisticated stabilization system for the tank's main gun. Survivability improvements included the addition of new steel side-skirts, a new smoke generating system linked to the engine and a new fire suppression system.
One hundred fifty AMX-30Es received this modernization package and were designated AMX-30EM2s. The program began in 1989 and ended in 1993. Ultimately, Spain's AMX-30EM2s were replaced by brand-new Centauro anti-tank vehicles, which were partially manufactured in Spain, in the early 21st century.
### Reconstruction
The other 149 AMX-30Es were reconstructed to improve their mobility. The reconstruction consisted of the replacement of the original French transmission with the American Allison CD-850-6A. Furthermore, several parts of the tank, such as the brakes, were renovated in order to bring them up to their original standards. The CD-850-6A was an automatic transmission, with a triple differential providing two forward speeds and one reverse speed. However, the new transmission introduced a new problem: the excessive heat it produced reduced the vehicle's range. The reconstruction of the 149 AMX-30Es began in 1988, and these vehicles were designated AMX-30EM1s. In the early 1990s, Spain received a large number of M60 Patton tanks, replacing its fleet of M47s and M48s, as well as its AMX-30EM1s.
## Export
In the mid-1980s, Indonesia approached Spain in an attempt to procure armaments for the modernization of its armed forces. Of the possible armaments for sale, Indonesia expressed interest in procuring the AMX-30. Although this deal fell through, in 2004 the Spanish and Colombian governments agreed on the sale of between 33 and 46 second-hand AMX-30EM2s, which had recently been replaced in the Spanish Army. However, the deal was canceled after José María Aznar was replaced by José Luis Rodríguez Zapatero as prime minister of Spain; the new Spanish government declared that Spain did not have enough AMX-30EM2s in working condition to sell to Colombia.
## See also
- Tanks in the Spanish Army
- List of armoured fighting vehicles by country
# The American Bible Challenge
The American Bible Challenge is an American biblical-themed television game show created by Game Show Network. The series is hosted by comedian Jeff Foxworthy, with gospel musician Kirk Franklin joining Foxworthy as co-host and announcer in the second season. The series debuted on August 23, 2012.
Each season of the series is played as a nine-episode tournament with six episodes of opening rounds, two semi-finals, and a final. Each opening round starts with three teams of three contestants answering questions about the Bible. The teams then nominate their strongest contestants to answer questions by themselves without any assistance from their teammates. After this part of the round, the third-placed team is eliminated and the two highest-scoring teams compete in a final round with the scores being reset to zero. The remaining teams answer as many questions correctly as possible within one minute, and the highest-scoring team from this round wins a $20,000 prize, which is given to the team's nominated charity. The winning team then advances to a semi-final game against two other winning teams; the winner of that game advances to a final game, where the grand prize is raised to $100,000. Thus, the team that wins the season-long tournament earns a total of $140,000 for their charity.
The show became the highest-rated original program in the history of the Game Show Network. In 2014, The American Bible Challenge received two nominations at the 41st Daytime Emmy Awards: one for the series as Outstanding Game Show and the other for Foxworthy as Outstanding Game Show Host; they lost to Jeopardy! and Steve Harvey (host of Family Feud), respectively.
## Gameplay
### Main game
To begin the game, a category is revealed and the three teams of three contestants are asked multiple-choice questions under that category, with each question having four possible answers. The contestant who "buzzes in" (sounds the buzzer indicating he or she is ready to answer) with the correct answer earns the respective team 10 points; an incorrect answer loses 10 points and opens the question to the other teams. Contestants must wait until the host finishes the entire question (including the choices) to buzz in.
Each team then participates in a physical stunt that involves teams using common household items to answer questions about biblical figures. For example, in the game Stick a Fork In It, the teams must answer the question by using a spoon to catapult a fork into one of several glasses labeled with different possible answers. When teams compete individually, each team is given 60 seconds; occasional games in which teams compete simultaneously are either untimed (with the first team to complete the game winning) or played in 90 seconds. In all cases, the team that wins the stunt receives 20 points; in the case of a tie, each of the tied teams receives the points. The next round, titled Kirk's Righteous Remix, features Grammy Award–winner Kirk Franklin and the show's choir singing songs relating to various books of the Bible. Each team is then given one question, worth 30 points, based on an announced subject, and no penalty is assessed for an incorrect answer.
The teams then set their strongest respective contestants aside for the final round of main gameplay. These contestants move to an area behind the teams, and cannot participate in this round. The host then asks each team, in turn, a question based on an announced category. Each question in this round is worth 50 points, with no penalty assessed for an incorrect answer. Only the two contestants standing at the podium may confer and answer the question. Each team is asked two questions in this round. In the final round titled "The Chosen Three", the contestants who were set aside from the previous round stand alone at their podiums, with their teammates standing in the area behind them. The host asks each contestant, in turn, a question with six possible answers, three of which are correct. The contestants then make three selections without conferring with the rest of their respective teams. Each individual correct answer is worth 100 points; thus, a total of 300 points is available to each team in this round. The two teams with the highest total scores advance to the final round, while the third-place team is eliminated and leaves with $2,500 for their charity.
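For readers who prefer to see the arithmetic, the sketch below tallies a hypothetical team's main-game score using the point values described above; the function name, the team's results and the number of buzzer questions are invented for illustration.

```python
# A rough tally of the point values described above, using a hypothetical
# team's results; round names and the number of buzzer questions are illustrative.

def main_game_score(buzzer_correct, buzzer_wrong, won_stunt, remix_correct,
                    fifty_point_correct, chosen_three_correct):
    score = 0
    score += 10 * buzzer_correct - 10 * buzzer_wrong   # opening multiple-choice round
    score += 20 if won_stunt else 0                    # physical stunt (20 points; ties share)
    score += 30 if remix_correct else 0                # Kirk's Righteous Remix question
    score += 50 * fifty_point_correct                  # two questions at 50 points, no penalty
    score += 100 * chosen_three_correct                # up to 3 correct picks, 100 points each
    return score

# Hypothetical team: 4 buzzes right, 1 wrong, wins the stunt, answers the remix
# question, gets both 50-point questions, and finds 2 of the 3 correct answers.
print(main_game_score(4, 1, True, True, 2, 2))  # 380 points
```

These totals only determine which two teams advance; as described above, scores are reset to zero before The Final Revelation.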
### The Final Revelation
Before the final round of regular gameplay, titled The Final Revelation, both teams' scores from the main game are discarded. The host announces the category for the round and gives each team a copy of the Bible. The teams then move to a backstage area, and are permitted up to ten minutes to study the Bible for information based on that category. In season two, while backstage, the teams also have the option to use the YouVersion mobile app of the Bible on an electronic tablet along with their physical copy of the Bible.
After ten minutes, the first team comes on stage, while the second team is placed in a soundproof booth. The host then asks the team questions from the announced category. Each question is given, in rotation, to one player, who cannot confer with teammates. Both teams play the same set of questions. Each team has a total of 60 seconds to answer as many questions as possible, and the team that answers more questions correctly wins $20,000 for their charity, while the runners-up win $5,000 for their charity. Teams that win this round advance to a semi-final game; the winners of that episode advance to the season finale, where the winning team receives $100,000 for their charity as well as all winnings from previous episodes.
### Previous rules
Immediately following the first round in season two, each team had a chance to earn 25 additional points. Before the show, a question was asked to 100 YouVersion users (e.g. "Would you rather fast for 40 days or eat manna for 40 years?"). A question with three possible choices was then asked about the percentage of people who answered (e.g. "What percentage said they would rather fast for 40 days than eat manna for 40 years?"). During the break, each team wrote their answer on a tablet computer, and each team that submitted the correct answer earned 25 points. This round was removed from the game in the third season and replaced with another opening round-style game played for ±10 points a question.
Following the second round in season one, teams had to pick out the apocryphal verse from a set that also contained three true Bible verses. Since the scoring was disappointingly low, this round was eliminated in later episodes.
## Production
The series began development with production staff approaching Troy Schmidt, a pastor at First Baptist Church in Windermere, Florida, to work as both a writer and a consultant for the show. One of Schmidt's initial roles was to be an "on-camera Bible expert" for the series, one of many aspects of the pilot episode that a test audience rejected, and one with which Schmidt himself later admitted he was not comfortable. In response to the early criticism from the test audience, Game Show Network (GSN) took a six-week period to bring in several new staff members and make various changes to the show's format.
After these changes were made, the test audience became more appreciative of the series, and GSN announced its development to the public at an upfront presentation in New York City on March 21, 2012, for the network's upcoming programming. By this time, a pilot episode, hosted by American stand-up comedian and television personality Jeff Foxworthy, had already been shot. When first asked if he was interested in hosting the show, Foxworthy was hesitant; he agreed to take on the role after learning that contestants would be playing for charity rather than on their own behalf. Casting for the series was held in various cities from May to June 2012. On July 7, 2012, GSN confirmed the show would premiere on August 23, 2012, alongside the premiere of Beat the Chefs.
### Season one
The first season of The American Bible Challenge premiered its first of nine episodes on August 23, 2012. An audience of 1.73 million viewers watched the debut episode, making it GSN's highest-rated original program in its history. In total, when combined with a rerun of the episode later that evening, the show brought in over 2.3 million viewers (1.730 million at 8:00 p.m., 571,000 at 11:00 p.m.) for the night. On October 18, 2012, Team Judson's Legacy, consisting of married couple Drake and Christina Levasheff of Irvine, California and their friend Dean Bobar, were crowned champions of the inaugural season's tournament, winning a total of $140,000 for Hunter's Hope, a leukodystrophy charity, chosen in honor of the Levasheffs' son, Judson, who had died in 2007 of late-onset Krabbe disease. By the end of the first season, the series had become GSN's most successful original program ever, garnering a total of over 13 million total viewers.
### Season two
A second nine-episode season of the series was announced on October 9, 2012. GSN advertised that auditions would be held nationwide in November and December, and that the season would also feature the addition of Franklin to the series. The second season premiered on GSN March 21, 2013, debuting to 1.152 million viewers. On May 23, 2013, Team Wagner Warriors, consisting of brothers Joshua, Jesse, and Daniel Wagner from Owasso and Tulsa, Oklahoma, were crowned champions of the second season, winning a total of $140,000 for Wagner Ministries International, a missionary organization founded by their father. A portion of the winnings was used for Wagner Ministries' involvement in the "One Nation One Day" evangelical event in Honduras in July 2013. The Wagners had previously won the national championship of Assemblies of God Teen Bible Quiz three times in four years.
### Season three
On August 8, 2013, GSN announced plans to renew The American Bible Challenge for a third season, with both Foxworthy and Franklin returning as hosts. The third season once again consisted of nine episodes, which began airing May 22, 2014. On July 17, 2014, Team Bible Belts, consisting of Jonathan King, Matt Phipps and Brad Harris from Otway, Ohio, were crowned as the third season's champions, winning a total of $140,000 for Kicks for Jesus, a nonprofit which combines Bible study and taekwondo. Although GSN never formally canceled the series, the third season remains the most recent to air, as no further production has been announced since 2014.
## Reception
David Hinckley of the New York Daily News gave The American Bible Challenge a positive review, saying, "Anyone who knows even a little about the Bible will be unable to resist playing along and matching answers with the teams on the screen." Neil Genzlinger of The New York Times was also pleased, calling the show "nothing if not magnanimous, sending even the losing teams home with a little something for their charities. A spirit of good will prevails." Hank Stuever of The Washington Post was critical, calling the series "just as dull as it sounds," and arguing that "weariness" could be detected in Foxworthy's hosting. Rebecca Cusey of Patheos recommended the series for Christians in particular, saying, "Those that take the Bible as the word of God will enjoy this show." Additionally, Bounce TV expressed excitement when announcing their acquisition of the series in 2013; network chief operating officer Jonathan Katz commented, "We are very confident that the broadcast premieres of The American Bible Challenge and Catch 21 will add fuel to Bounce TV's skyrocketing growth."
The series was honored with two Emmy Award nominations at the 41st Daytime Emmy Awards in 2014. It received a nomination for Outstanding Game Show, while Foxworthy received one for Outstanding Game Show Host. Both the show and Foxworthy lost to Jeopardy! and Steve Harvey of Family Feud, respectively.
## Merchandise
In an effort to promote the show's second season, Schmidt released a study book titled The American Bible Challenge: A Daily Reader, Volume 1 in 2013. An online Bible study was also launched on GSN's website at the start of the second season. In addition to the Bible studies, GSN released a mobile game based on the show for Facebook, iOS devices, and Android devices in 2012, while Talicor released a board game based on the series in 2014.
# Barber coinage
The Barber coinage consists of a dime, quarter, and half dollar designed by United States Bureau of the Mint Chief Engraver Charles E. Barber. They were minted between 1892 and 1916, though no half dollars were struck in the final year of the series.
By the late 1880s, there were increasing calls for the replacement of the Seated Liberty design, used since the 1830s on most denominations of silver coins. In 1891, Mint Director Edward O. Leech, having been authorized by Congress to approve coin redesigns, ordered a competition, seeking a new look for the silver coins. As only the winner would receive a cash prize, invited artists refused to participate and no entry from the public proved suitable. Leech instructed Barber to prepare new designs for the dime, quarter, and half dollar, and after the chief engraver made changes to secure Leech's endorsement, they were approved by President Benjamin Harrison in November 1891. Striking of the new coins began the following January.
Public and artistic opinion of the new pieces was, and remains, mixed. In 1915, Mint officials began plans to replace them once the design's minimum term expired in 1916. The Mint issued Barber dimes and quarters in 1916 to meet commercial demand, but before the end of the year, the Mercury dime, Standing Liberty quarter, and Walking Liberty half dollar had begun production. Most dates in the Barber coin series are not difficult to obtain, but the 1894 dime struck at the San Francisco Mint (1894-S), with a mintage of 24, is a great rarity.
## Background
### Charles Barber
Charles E. Barber was born in London in 1840. His grandfather, John Barber, led the family to America in the early 1850s. Both John and his son William were engravers and Charles followed in their footsteps. The Barber family initially lived in Boston upon their arrival to the United States, though they later moved to Providence to allow William to work for the Gorham Manufacturing Company. William Barber's skills came to the attention of Mint Chief Engraver James B. Longacre, who hired him as an assistant engraver in 1865; when Longacre died in 1869, William Barber became chief engraver and Charles was hired as an assistant engraver.
William Barber died on August 31, 1879, of an illness contracted after swimming at Atlantic City, New Jersey. His son applied for the position of chief engraver, as did George T. Morgan, another British-born engraver hired by the Mint. In early December 1879, Treasury Secretary John Sherman, Mint Director Horatio Burchard, and Philadelphia Mint Superintendent A. Loudon Snowden met to determine the issue. They decided to recommend the appointment of Barber, who was subsequently nominated by President Rutherford B. Hayes and in February 1880, was confirmed by the Senate. Barber would serve nine presidents in the position, remaining until his death in 1917, when Morgan would succeed him.
Coinage redesign was being considered during Barber's early years as chief engraver. Superintendent Snowden believed that the base-metal coins then being struck (the one-, three-, and five-cent pieces) should have uniform designs, as should many of the silver pieces and some gold coins. He had Barber create experimental pattern coins. In spite of Snowden's desires, the only design modified was that of the five-cent coin, or nickel; Barber's design, known as the Liberty Head nickel, entered production in 1883. The new coin had its denomination designated by a Roman numeral "V" on the reverse; the three-cent coin had always had a "III" to designate its denomination. Enterprising fraudsters soon realized that the nickel and half eagle (or five-dollar gold piece) were close in size, and gold-plated the base-metal coins to pass them off to the unwary as half eagles. Amid public ridicule of the Mint, production came to a halt until Barber hastily added the word "cents" to the reverse of his design.
### Movement towards redesign
For much of the second half of the 19th century, most U.S. silver coins bore a design of a seated Liberty. This design had been created by Christian Gobrecht, an engraver at the United States Mint in Philadelphia, after a sketch by artist Thomas Sully, and introduced to U.S. coins in the late 1830s. The design reflected an English influence, and as artistic tastes changed over time, was increasingly disliked in the United States. In 1876, The Galaxy magazine said of the then current silver coins:
> Why is it we have the ugliest money of all civilized nations? The design is poor, commonplace, tasteless, characterless, and the execution is like thereunto. They have rather the appearance of tokens or mean medals. One reason for this is that the design is so inartistic, and so insignificant. That young woman sitting on nothing in particular, wearing nothing to speak of, looking over her shoulder at nothing imaginable, and bearing in her left hand something that looks like a broomstick with a woolen nightcap on it—what is she doing there?
Public dissatisfaction with the newly-issued Morgan dollar led the Mint's engravers to submit designs for the smaller silver coins in 1879. Among those who called for new coinage was editor Richard Watson Gilder of The Century Illustrated Monthly Magazine. Sometime in the early 1880s, he, along with one of his reporters and sculptor Augustus Saint-Gaudens visited Mint Director Burchard to argue for the creation of new designs. They brought along classic Greek and Roman coins in an attempt to persuade Burchard that the coinage could easily be made more beautiful. The visitors left disappointed, after learning that Burchard considered the much-criticized Morgan dollar as beautiful as any of them.
In 1885, Burchard was succeeded as Mint director by James Kimball. The new director was more receptive to Gilder's ideas and in 1887 announced a competition for new designs for the non-gold coinage. These plans were scuttled when Vermont Senator Justin Morrill questioned the Mint's authority to produce new designs. The Mint had claimed authority under the Coinage Act of 1873 in issuing the Morgan dollar in 1878 and the Liberty Head nickel in 1883. Morrill was a supporter of coin redesign and had in the past introduced bills to accomplish this; he felt, however, that this could not be done without an act of Congress. Kimball submitted the issue to government lawyers; they indicated that the Mint lacked the claimed authority. All three men worked to secure a bill to authorize new designs: Morrill by introducing and pressing legislation, Kimball by lobbying for the authority in his annual report, and Gilder by orchestrating favorable coverage. With legislators busy with other matters, it was not until September 26, 1890, that President Benjamin Harrison signed legislation making all denominations of U.S. coins available for immediate redesign by the Mint upon obtaining the Secretary of the Treasury's approval. Each design could thereafter be changed once it had been in use for 25 years, counting the year of first issue; for example, a coin first struck in 1892 would be eligible for redesign in 1916.
## Inception
Three days before the signing of the 1890 act, Barber wrote to the superintendent of the Philadelphia Mint, Oliver Bosbyshell, setting forth proposed terms of a competition to select new coin designs. Barber suggested that entrants be required to submit models, as opposed to drawings, and that the designs be in low relief, which was used for coins. He proposed that the entries include the lettering and denomination, as submissions without them would not adequately show the appearance of the finished coin. He received a reply that due to other work, the Mint would not be able to address the question until the spring of 1891.
On October 16, 1890, a new Mint director, Edward O. Leech, took office. Leech, aged 38 at the time, had spent his career at the Bureau of the Mint, and was an enthusiastic supporter of redesign. He took the precaution of obtaining recommendations from Barber as to suitable outside artists who might participate in a competition. Since most of the proposed artists were New York-based, Andrew Mason, superintendent of the New York Assay Office, was given the task of finalizing the list of invitees. Leading Mason's list of ten names was that of Saint-Gaudens. Mason sent Leech the recommendations on April 3, 1891; the following day, the Mint director announced the competition, open to the public, but he specifically invited the ten artists named by Mason to participate. Besides Saint-Gaudens, artists asked to compete included Daniel Chester French, Herbert Adams and Kenyon Cox. Although Barber had warned the director that reputable artists would likely not enter a contest in which only the winner received compensation, Leech offered a $500 prize to the winner, and no payments to anyone else. He sought new designs for both sides of the dollar, and for the obverses of the half dollar, quarter, and dime—Leech was content to let the reverses of the Seated Liberty coins continue. By law, an eagle had to appear on the quarter and half dollar, but could not appear on the dime.
Most of the artists conferred in New York and responded in a joint letter that they would be willing to participate, but not on the terms set. They proposed a competition with set fees for sketches and designs submitted by the invited artists, to be judged by a jury of their peers, and with the Mint committed to replace the Seated Liberty coins with the result. They also insisted that the same artist create both sides of a given coin, and that more time be given to allow the development of designs. Leech was unable to meet these terms, as there was only enough money available for the single prize. In addition to inviting the ten artists, he had sent thousands of solicitations through the country; a number of designs were submitted in response to the circulars. To judge the submissions, he appointed a jury consisting of Saint-Gaudens, Barber, and Henry Mitchell, a Boston seal engraver and member of the 1890 Assay Commission. The committee met in June 1891 and quickly rejected all entries.
Leech was quoted in the press regarding the result of the contest:
> It is not likely that another competition will ever be tried for the production of designs for United States coins. The one just ended was too wretched a failure ... The result is not very flattering to the boasted artistic development of this country, inasmuch as only two of the three hundred suggestions submitted were good enough to receive honorable mention.
Barber wrote years later about the competition, "many [entries] were sent in, but Mr. St. Gaudens, [sic] who was appointed one of the committee to pass upon designs, objected to everything submitted". Numismatic historian Roger Burdette explained the artistic differences between the two men:
> It is likely they were so far apart in their artistic understanding that neither listened to what the other had to say ... Barber was from the English trades-apprentice approach where engraving and die sinking were crafts closely aligned to other metal workers such as machine tool makers. His father and grandfather were both engravers. Saint-Gaudens was a classically trained sculptor who began his career as an apprentice cameo cutter in New York, later moving to Paris and Rome for extensive training while perfecting his artistry. Barber generally worked in small, circular formats—a three-inch medal was a large size for his sculptures. Saint-Gaudens was uncomfortable with small medals and typically designed life-size or larger figures ... the 1891 competition turned the two against each other for the rest of their lives.
## Preparation
Frustrated at the competition's outcome, Leech instructed Barber on June 11, 1891, to prepare designs for the half dollar, quarter, and dime. As the Morgan dollar was then being heavily struck, the Mint director decided to leave that design unaltered for the time being. For the obverse of the new coins, Leech suggested a depiction of Liberty similar to that on the French coins of the period; he was content that the current reverses be continued. Leech had previously suggested to Barber that he engage outside help if the work was to be done at the Mint; the chief engraver replied that he was aware of no one who could be of help in the preparation of new designs. Leech had spoken with Saint-Gaudens on the same subject; the sculptor stated that only four men in the world were capable of executing high-quality coin designs; three lived in France and he was the fourth.
Leech announced the decision to have Barber do the work in July, stating that he had instructed the engraver to prepare designs for presentation to Secretary of the Treasury Charles Foster. In a letter printed in the New York Tribune, Gilder expressed disappointment that the Mint was planning to generate the new designs in-house, feeling that the Mint, essentially a factory for coins, was ill-equipped to generate artistic coin designs. Due to Gilder's prominence in the coinage redesign movement, Leech felt the need to respond personally, which he did in early August. He told Gilder that "artistic designs for coins, that would meet the ideas of an art critic like yourself, and artists generally, are not always adapted for practical coining". He assured Gilder that the designs which Barber had already prepared had met with the approval of Mitchell, though Leech himself had some improvements to suggest to the chief engraver.
Barber's first attempt, modeled for the half dollar, disregarded Leech's instructions. Instead of a design based on French coinage, it depicted a standing figure of Columbia, bearing a pileus (a cap used as a symbol of liberty) atop a liberty pole; an eagle spreading its wings stands behind her. The reverse utilized the heraldic eagle from the Great Seal of the United States, enclosed inside a thick oak wreath, with the required legends surrounding the rim. Leech rejected the design, and Barber submitted a revised obverse in mid-September with a head of Liberty similar to that on the adopted coin. Leech got feedback from friends and from Secretary Foster; on September 28, he wrote Barber that Liberty's lips were "rather voluptuous" and directed him to prepare a reverse without the wreath. Barber did so, and pattern coins based on the revised design were struck. Barber complained, in a letter on October 2 to Superintendent Bosbyshell, but intended for Leech, that the constant demands for changes were wasting his time. Leech replied, stating that he did not care how much effort was expended in order to improve the design, especially since, once issued, they would have to be used for 25 years. Barber's reply was transmitted to Leech on October 6 with a cover letter from Acting Superintendent Mark Cobb (Bosbyshell was traveling) stating that Barber "disclaims any intention to be captious and certainly did not intend to question your prerogative as one of the officers designed by law to pass upon new designs for coinage". The letter from Barber was a lengthy technical explanation for various design elements, and requested further advice from Leech if he had preferences; the overall tone was argumentative. Leech chose not to write again; he addressed one concern, about whether the olive branches in the design were rendered accurately, by visiting the National Botanical Garden, obtaining one, and sending it to Barber.
The question of how to render the stars (representing the 13 original states) on the coin was posed in the letters; in the end, Leech opted for six-pointed stars on the obverse and five-pointed ones on the reverse. Barber had prepared three versions of the design, each with clouds over the eagle; Leech approved one on October 31 and ordered working dies prepared, but then began to question the presence of the clouds, and had two more versions made. On November 6, President Harrison and his Cabinet considered which of the designs to approve, and chose one without the clouds; the following day, Leech ordered working dies prepared. Barber scaled down his design for the quarter and dime. While the Cabinet approved the designs, members requested that the Mint embolden the words "Liberty" on the obverse and "E Pluribus Unum" on the reverse, believing that these legends would wear away in circulation; despite the resulting changes, this proved to be accurate. For the reverse of the dime, on which, by law, an eagle could not appear, a slight modification of the reverse of the Seated Liberty dime was used, with a wreath of foliage and produce surrounding the words "One Dime".
It is uncertain when pattern dimes and quarters were struck, but this was most likely in mid-November 1891. One variety each of pattern dime and quarter are known, whereas five different half dollars are extant; all known Barber coin patterns are in the National Numismatic Collection and none are in private hands. On December 11, Bosbyshell requested a delay in production to mid-January 1892 to allow the dies to be more thoroughly tested; Leech refused. The first Barber coins were struck at the Philadelphia Mint on January 2, 1892, at 9:00 a.m. By the end of the day, all three denominations had been coined.
## Design
All three denominations of the Barber coinage depict a head of Liberty, facing right. She wears a pileus and a small headband inscribed "Liberty". On the quarter and half dollar, the motto "In God We Trust" appears above her head; she is otherwise surrounded with 13 six-pointed stars and the date. On the dime, her head is surrounded with "United States of America" and the year. The reverse of the quarter and the half dollar depicts a heraldic eagle, based on the Great Seal. The bird holds in its mouth a scroll inscribed "E Pluribus Unum" and in its right claws an olive branch; in its left it holds 13 arrows. Above the eagle are 13 five-pointed stars; it is surrounded by the name of the country and by the coin's denomination. The reverse of the dime depicts a wreath of corn, wheat, maple and oak leaves surrounding the words "One Dime". Barber's monogram "B" is on the cutoff of Liberty's neck; the mint mark, on the dime, is placed beneath the wreath on the reverse and beneath the eagle on the larger denominations.
Barber's head of Liberty is purely classical, and is rendered in the Roman style. The head is modeled after the French "Ceres" silver coinage of the late 19th century, but bears a resemblance to Morgan's design for the silver dollar. This did not escape numismatist Walter Breen in his comprehensive guide to U.S. coins: "Barber must have been feeling unusually lazy. He left the [dime] rev[erse] design as it had been since 1860, with minor simplifications. His obv[erse] was a mirror image of the Morgan dollar head, with much of Miss Anna Willess Williams' back hair cropped off, the rest concealed ... within a disproportionately large cap." In his text introducing the Barber quarter, Breen states, "the whole composition is Germanically stolid, prosy, crowded (especially on rev[erse]), and without discernible merit aside from the technical one of low relief". Burdette terms Barber's designs, "typically mediocre imitations of the current French-style—hardly better than the arcane seated Liberty type they replaced".
Art historian Cornelius Vermeule, in his work on U.S. coins, took a more positive view of Barber's coinage: "the last word as to their aesthetic merits has yet to be written. Little admired or collected for more than three generations after their appearance [writing in 1971], these essentially conservative but most dignified coins have suddenly become extremely popular with collectors". Vermeule argued that "the designs of Barber's coins were more attuned to the times than he perhaps realized. The plumpish, matronly gravitas of Liberty had come to America seven years earlier in the person of Frédéric Bartholdi's giant statue [the Statue of Liberty] ... " He suggested that the features of Daniel Chester French's huge statue Republic, created for the World Columbian Exposition, "were absolutely in harmony with what Charles Barber had created for the coinage in the year of the Fair's opening".
## Reception
Leech released the new designs to the press about November 10, 1891. According to numismatist David Lange, the new coinage received mixed reviews: "while the general press and public seemed satisfied with the new dime, quarter dollar, and half dollar, numismatists were either mildly disappointed with the new coins or remained silent on the matter." Moran records a number of unfavorable reviews, without listing any favorable ones. Vermeule stated that "the initial comment on the new coinage concerned the novelty of a contest, its failure, and the inevitable result that the commission would go, as always, to the Chief Engraver [Barber] and his staff."
George Heath, editor of The Numismatist, discussed the new pieces: "the mechanical work is all that could be desired, and it is probable that owing to the conventional rut in which our mint authorities seem obliged to keep, this is the best that could be done". W.T.R. Martin wrote in the American Journal of Numismatics, "The general effect is pleasing, of the three the Dime is to many the most attractive piece. The head of Liberty is dignified, but although the silly story has been started that the profile is that of a 'reigning belle' of New York, she can hardly be called a beauty; there is a suggestion ... of the classic heads on some of the Roman coins, and a much stronger suggestion of the head on the French Francs of 1871 and onward ... these coins are an advance on what has hitherto been accomplished, but there is yet a long distance between them and the ideal National coin."
Other reactions were unfavorable. Artist Kenyon Cox, one of the invited artists to the 1891 competition, stated, "I think it disgraceful that this great country should have such a coin as this." Harper's Weekly proclaimed, "The mountain had labored and brought forth a mouse." Saint-Gaudens was also interviewed, and as author Moran put it, "injudiciously ranted": "This is inept; this looks like it had been designed by a young lady of sixteen, a miss who had taken only a few lessons in modeling. It is beneath criticism ... There are hundreds of artists in this country, any of whom, with the aid of a designer, could have made a very respectable coin, which this is not."
## Production and collecting
Soon after issuance of the new quarters, the Mint received complaints that they would not stack properly. Barber made adjustments in his design to remedy this problem. Accordingly, there are two versions of the 1892 quarter, dubbed "Type I" and "Type II", both for the version without mint mark struck at Philadelphia and for those struck at the New Orleans Mint (1892-O) and the San Francisco Mint (1892-S). They may be distinguished by their reverses: Type I quarters have about half of the letter "E" in "UNITED" covered by the eagle's wing; with Type II quarters, the letter is almost entirely obscured. Type I quarters are rarer for each mint.
The 1894-S Barber dime is one of the great numismatic rarities, with a published mintage of 24 proof pieces. Various stories attend the question of how so few came to be coined. According to Nancy Oliver and Richard Kelly in their 2011 article for The Numismatist, the San Francisco Mint in June 1894 needed to coin $2.40 in silver left over from the melting of worn-out coins, just enough to coin 24 dimes. More ten-cent pieces were expected to be struck there later in the year, but this did not occur. Breen, on the other hand, related that San Francisco Mint Superintendent John Daggett had the dimes struck for a group of banker friends, giving three to each. He also gave three to his young daughter Hallie, telling her to retain them until she was as old as he was, and she would be able to sell them for a good price. According to the story, she spent one on a dish of ice cream, but kept the other two until 1954. One of the approximately nine known dimes was retrieved from circulation in 1957, and Breen speculated this may have been the ice cream specimen. One sold for $1,552,500 at auction in 2007.
In 1900, Barber modified the dies. This change resulted in quarters that were thinner, so that 21 of the new coins would stack in the space occupied by 20 of the old. Barber again set to work on the dies. San Francisco Mint officials wanted permission to use the old dies, which was refused, as it was felt that all mints should be producing coins with the same specifications. There are small differences between quarters produced at the different mints.
Except for the 1894-S dime, there are no great rarities in the Barber series, as mintages were generally adequate to high. Key dates for the dime include the 1895-O (with the lowest mintage), 1896-S, 1897-O, 1901-S and 1903-S. For the quarter, key dates are the very low mintage 1896-S, 1901-S, and 1913-S issues, with the 1901-S particularly scarce. The rarest half dollar is the 1892-O "Micro O", in which the mint mark "O" for New Orleans was impressed on the half dollar die with a puncheon intended for the quarter; other key dates are the regular 1892-O, 1892-S, 1893-S, 1897-O, 1897-S, 1913, 1914, and 1915. The last three dates have very low mintages but were preserved in substantial numbers. As half dollars were heavily circulated, prices tend to steeply rise for all coins in higher grades. "Condition rarities", relatively common and inexpensive in circulated condition but costly in high grades, include the 1901-S, 1904-S, and 1907-S half dollars. Thus, although most dates are easily obtainable, many are scarce in higher and uncirculated grades. Also, in 1909, a new half dollar hub was introduced, which made the headband word "Liberty" stronger, thus changing a grading diagnostic. Earlier Barber halves are frequently separately graded for their obverse and reverse characteristics, as the reverse tended to wear faster. Finally, large quantities of lower grade Barber coins were melted for bullion when silver prices rose in 1979 and early 1980.
In 1989, a group of collectors founded the Barber Coin Collectors Society, a nonprofit organization "dedicated to furthering the knowledge of coins designed" by Charles E. Barber and William Barber. The organization publishes a quarterly journal, and holds an annual meeting in conjunction with the American Numismatic Association World's Fair of Money.
## Replacement
According to Burdette, "agitation to replace Barber's banal 1892 Liberty head began almost before the first coins were cold from the press." In 1894, the American Numismatic and Antiquarian Society, in conjunction with various artistic and educational institutes, began to advocate for better designs for U.S. coins, but no change took place in the remainder of the 19th century.
In 1904, President Theodore Roosevelt started to push for improvements to U.S. coins, and arranged for the Mint to engage Saint-Gaudens to redesign coins which could be changed under the 1890 act. Before his death in 1907, the sculptor provided designs for the double eagle and eagle, though the double eagle required adjustment by Barber to lower the relief before it could be released as a circulating coin. Redesign of the smaller gold pieces, Lincoln cent, and Buffalo nickel followed between 1908 and 1913. By then, the dime, quarter, and half dollar were the only coins being struck which had not received a redesign in the 20th century. As the 1916 date approached when the Barber coins could be changed without an act of Congress, calls for a new design increased.
In 1915, a new Mint director, Robert W. Woolley, took office. Woolley advocated the replacement of the silver coins when it was legal to do so, and instructed Barber and Morgan to prepare new designs. He consulted with the Commission of Fine Arts, asking them to examine the designs produced by the Mint's engravers and, if they felt they were not suitable, to recommend artists to design the new coins. The Commission rejected the Barber and Morgan designs and proposed Adolph Weinman, Hermon MacNeil, and Albin Polasek as designers. Although Woolley had hoped that each artist would produce one design, different concepts by Weinman were accepted for the dime and half dollar, and one by MacNeil for the quarter.
Woolley had hoped to begin production of the new coins on July 1, 1916. There was heavy demand for small change, and as delays in actual production stretched into the second half of the year, Woolley was forced to have Barber prepare dies for 1916-dated dimes and quarters bearing the chief engraver's 1892 design. According to numismatist David Lange, "Barber must have secretly smiled to himself as his familiar Roman bust of Liberty once again dropped from the presses by the thousands, and then by the millions." There were sufficient half dollars from 1915 available to meet demand; no Barber halves were struck in 1916. The production difficulties were eventually ironed out, and at least token quantities of each of the new coins were struck in 1916, putting an end to the Barber coinage series.
# Venus
Venus is the second planet from the Sun. It is a terrestrial planet and is the closest in mass and size to its orbital neighbour Earth. Venus has by far the densest atmosphere of the terrestrial planets, composed mostly of carbon dioxide with a thick, global sulfuric acid cloud cover. At the surface it has a mean temperature of 737 K (464 °C; 867 °F) and a pressure 92 times that of Earth's at sea level. These extreme conditions compress carbon dioxide into a supercritical state at Venus's surface.
Internally, Venus has a core, mantle, and crust. Venus lacks an internal dynamo, and its weak, induced magnetosphere is caused by atmospheric interactions with the solar wind. Internal heat escapes through active volcanism, resulting in resurfacing instead of plate tectonics. Venus is one of two planets in the Solar System, the other being Mercury, that have no moons. Conditions perhaps favourable for life on Venus have been identified at its cloud layers. Venus may have had liquid surface water early in its history with a habitable environment, before a runaway greenhouse effect evaporated any water and turned Venus into its present state.
The rotation of Venus has been slowed and turned against its orbital direction (retrograde) by the currents and drag of its atmosphere. It takes 224.7 Earth days for Venus to complete an orbit around the Sun, and a Venusian solar year is just under two Venusian days long. The orbits of Venus and Earth are the closest between any two Solar System planets, approaching each other in synodic periods of 1.6 years. Venus and Earth have the lowest difference in gravitational potential of any pair of Solar System planets. This allows Venus to be the most accessible destination and a useful gravity assist waypoint for interplanetary flights from Earth.
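As a rough check of the orbital figures above, the short sketch below applies the standard synodic-period relation for an inferior planet and estimates the length of a Venusian solar day; the 243.02-day retrograde sidereal rotation period and the 365.256-day Earth year are assumed textbook values rather than figures from this article.

```python
# A back-of-the-envelope check of the figures above. The 243.02-day retrograde
# sidereal rotation period and 365.256-day Earth year are assumed standard
# values, not taken from this text.

P_VENUS = 224.70     # Venus orbital period, Earth days (as stated above)
P_EARTH = 365.256    # Earth sidereal year, Earth days (assumed)
T_SIDEREAL = 243.02  # Venus sidereal rotation period, Earth days, retrograde (assumed)

# Synodic period of an inferior planet: 1/S = 1/P_inner - 1/P_outer
synodic_days = 1 / (1 / P_VENUS - 1 / P_EARTH)
print(f"Synodic period: {synodic_days:.0f} days = {synodic_days / P_EARTH:.1f} years")  # ~584 d, ~1.6 yr

# With retrograde rotation, the Sun's apparent motion combines rotation and orbit
solar_day = 1 / (1 / T_SIDEREAL + 1 / P_VENUS)
print(f"Solar day: {solar_day:.0f} Earth days; "
      f"year = {P_VENUS / solar_day:.2f} Venusian solar days")  # ~117 d; ~1.92 days per year
```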
Venus figures prominently in human culture and in the history of astronomy. Orbiting inferiorly (inside of Earth's orbit), it always appears close to the Sun in Earth's sky, as either a "morning star" or an "evening star". While this is also true for Mercury, Venus appears more prominent, since it is the third brightest object in Earth's sky after the Moon and the Sun. In 1961, Venus became the target of the first interplanetary flight, Venera 1, followed by many essential interplanetary firsts, such as the first soft landing on another planet by Venera 7 in 1970. These probes demonstrated the extreme surface conditions, an insight that has informed predictions about global warming on Earth. This finding ended the theories and then popular science fiction about Venus being a habitable or inhabited planet.
## Physical characteristics
Venus is one of the four terrestrial planets in the Solar System, meaning that it is a rocky body like Earth. It is similar to Earth in size and mass and is often described as Earth's "sister" or "twin". Venus is close to spherical due to its slow rotation. Venus has a diameter of 12,103.6 km (7,520.8 mi)—only 638.4 km (396.7 mi) less than Earth's—and its mass is 81.5% of Earth's, making it the third-smallest planet in the Solar System. Conditions on the Venusian surface differ radically from those on Earth because its dense atmosphere is 96.5% carbon dioxide, with most of the remaining 3.5% being nitrogen. The surface pressure is 9.3 megapascals (93 bars), and the average surface temperature is 737 K (464 °C; 867 °F), above the critical points of both major constituents, making the near-surface atmosphere a supercritical fluid composed mainly of supercritical carbon dioxide with some supercritical nitrogen.
### Geography
The Venusian surface was a subject of speculation until some of its secrets were revealed by planetary science in the 20th century. Venera landers in 1975 and 1982 returned images of a surface covered in sediment and relatively angular rocks. The surface was mapped in detail by Magellan in 1990–91. The ground shows evidence of extensive volcanism, and the sulphur in the atmosphere may indicate that there have been recent eruptions.
About 80% of the Venusian surface is covered by smooth, volcanic plains, consisting of 70% plains with wrinkle ridges and 10% smooth or lobate plains. Two highland "continents" make up the rest of its surface area, one lying in the planet's northern hemisphere and the other just south of the equator. The northern continent is called Ishtar Terra after Ishtar, the Babylonian goddess of love, and is about the size of Australia. Maxwell Montes, the highest mountain on Venus, lies on Ishtar Terra. Its peak is 11 km (7 mi) above the Venusian average surface elevation. The southern continent is called Aphrodite Terra, after the Greek mythological goddess of love, and is the larger of the two highland regions at roughly the size of South America. A network of fractures and faults covers much of this area.
Recent evidence (reported in 2024) points to lava flows on Venus, such as flows on Sif Mons, a shield volcano, and on Niobe Planitia, a flat plain. There are visible calderas. The planet has few impact craters, demonstrating that the surface is relatively young, at 300–600 million years old. Venus has some unique surface features in addition to the impact craters, mountains, and valleys commonly found on rocky planets. Among these are flat-topped volcanic features called "farra", which look somewhat like pancakes and range in size from 20 to 50 km (12 to 31 mi) across, and from 100 to 1,000 m (330 to 3,280 ft) high; radial, star-like fracture systems called "novae"; features with both radial and concentric fractures resembling spider webs, known as "arachnoids"; and "coronae", circular rings of fractures sometimes surrounded by a depression. These features are volcanic in origin.
Most Venusian surface features are named after historical and mythological women. Exceptions are Maxwell Montes, named after James Clerk Maxwell, and highland regions Alpha Regio, Beta Regio, and Ovda Regio. The last three features were named before the current system was adopted by the International Astronomical Union, the body which oversees planetary nomenclature.
The longitude of physical features on Venus is expressed relative to its prime meridian. The original prime meridian passed through the radar-bright spot at the centre of the oval feature Eve, located south of Alpha Regio. After the Venera missions were completed, the prime meridian was redefined to pass through the central peak in the crater Ariadne on Sedna Planitia.
The stratigraphically oldest tessera terrains have consistently lower thermal emissivity than the surrounding basaltic plains, as measured by Venus Express and Magellan, indicating a different, possibly more felsic, mineral assemblage. The mechanism for generating a large amount of felsic crust usually requires the presence of a water ocean and plate tectonics, implying that habitable conditions with large bodies of water existed on early Venus at some point. However, the nature of the tessera terrains is far from certain.
Studies reported on 26 October 2023 suggest for the first time that Venus may have had plate tectonics during ancient times and, as a result, may have had a more habitable environment, possibly one capable of sustaining life. Venus has gained interest as a case for research into the development of Earth-like planets and their habitability.
#### Volcanism
Much of the Venusian surface appears to have been shaped by volcanic activity. Venus has several times as many volcanoes as Earth, and it has 167 large volcanoes that are over 100 km (60 mi) across. The only volcanic complex of this size on Earth is the Big Island of Hawaii. More than 85,000 volcanoes on Venus have been identified and mapped. This is not because Venus is more volcanically active than Earth, but because its crust is older and is not subject to the same erosion process. Earth's oceanic crust is continually recycled by subduction at the boundaries of tectonic plates, and has an average age of about 100 million years, whereas the Venusian surface is estimated to be 300–600 million years old.
Several lines of evidence point to ongoing volcanic activity on Venus. Sulfur dioxide concentrations in the upper atmosphere dropped by a factor of 10 between 1978 and 1986, jumped in 2006, and again declined 10-fold. This may mean that levels had been boosted several times by large volcanic eruptions. It has been suggested that Venusian lightning (discussed below) could originate from volcanic activity (i.e. volcanic lightning). In January 2020, astronomers reported evidence that suggests that Venus is currently volcanically active, specifically the detection of olivine, a volcanic product that would weather quickly on the planet's surface.
This massive volcanic activity is fuelled by a superheated interior, which models say could be explained by energetic collisions from when the planet was young. Impacts would have had significantly higher velocity than on Earth, both because Venus's orbit is faster due to its closer proximity to the Sun and because objects would require higher orbital eccentricities to collide with the planet.
In 2008 and 2009, the first direct evidence for ongoing volcanism was observed by Venus Express, in the form of four transient localized infrared hot spots within the rift zone Ganis Chasma, near the shield volcano Maat Mons. Three of the spots were observed in more than one successive orbit. These spots are thought to represent lava freshly released by volcanic eruptions. The actual temperatures are not known, because the size of the hot spots could not be measured, but are likely to have been in the 800–1,100 K (527–827 °C; 980–1,520 °F) range, relative to a normal temperature of 740 K (467 °C; 872 °F). In 2023, scientists reexamined topographical images of the Maat Mons region taken by the Magellan orbiter. Using computer simulations, they determined that the topography had changed during an 8-month interval, and concluded that active volcanism was the cause.
#### Craters
Almost a thousand impact craters on Venus are evenly distributed across its surface. On other cratered bodies, such as Earth and the Moon, craters show a range of states of degradation. On the Moon, degradation is caused by subsequent impacts, whereas on Earth it is caused by wind and rain erosion. On Venus, about 85% of the craters are in pristine condition. The number of craters, together with their well-preserved condition, indicates the planet underwent a global resurfacing event 300–600 million years ago, followed by a decay in volcanism. Whereas Earth's crust is in continuous motion, Venus is thought to be unable to sustain such a process. Without plate tectonics to dissipate heat from its mantle, Venus instead undergoes a cyclical process in which mantle temperatures rise until they reach a critical level that weakens the crust. Then, over a period of about 100 million years, subduction occurs on an enormous scale, completely recycling the crust.
Venusian craters range from 3 to 280 km (2 to 174 mi) in diameter. No craters are smaller than 3 km, because of the effects of the dense atmosphere on incoming objects. Objects with less than a certain kinetic energy are slowed so much by the atmosphere that they do not create an impact crater. Incoming projectiles less than 50 m (160 ft) in diameter will fragment and burn up in the atmosphere before reaching the ground.
### Internal structure
Without data from reflection seismology or knowledge of its moment of inertia, little direct information is available about the internal structure and geochemistry of Venus. The similarity in size and density between Venus and Earth suggests that they share a similar internal structure: a core, mantle, and crust. Like that of Earth, the Venusian core is most likely at least partially liquid because the two planets have been cooling at about the same rate, although a completely solid core cannot be ruled out. The slightly smaller size of Venus means pressures are 24% lower in its deep interior than Earth's. The predicted values for the moment of inertia based on planetary models suggest a core radius of 2,900–3,450 km. This is in line with the first observation-based estimate of 3,500 km.
The principal difference between the two planets is the lack of evidence for plate tectonics on Venus, possibly because its crust is too strong to subduct without water to make it less viscous. This results in reduced heat loss from the planet, preventing it from cooling and providing a likely explanation for its lack of an internally generated magnetic field. Instead, Venus may lose its internal heat in periodic major resurfacing events.
### Magnetic field and core
In 1967, Venera 4 found Venus's magnetic field to be much weaker than that of Earth. This magnetic field is induced by an interaction between the ionosphere and the solar wind, rather than by an internal dynamo as in the Earth's core. Venus's small induced magnetosphere provides negligible protection to the atmosphere against solar and cosmic radiation.
The lack of an intrinsic magnetic field on Venus was surprising, given that it is similar to Earth in size and was expected to contain a dynamo at its core. A dynamo requires three things: a conducting liquid, rotation, and convection. The core is thought to be electrically conductive and, although its rotation is often thought to be too slow, simulations show it is adequate to produce a dynamo. This implies that the dynamo is missing because of a lack of convection in Venus's core. On Earth, convection occurs in the liquid outer layer of the core because the bottom of the liquid layer is much higher in temperature than the top. On Venus, a global resurfacing event may have shut down plate tectonics and led to a reduced heat flux through the crust. This insulating effect would cause the mantle temperature to increase, thereby reducing the heat flux out of the core. As a result, no internal geodynamo is available to drive a magnetic field. Instead, the heat from the core is reheating the crust.
One possibility is that Venus has no solid inner core, or that its core is not cooling, so that the entire liquid part of the core is at approximately the same temperature. Another possibility is that its core has already been completely solidified. The state of the core is highly dependent on the concentration of sulphur, which is unknown at present.
Another possibility is that the absence of a late, large impact on Venus (contra the Earth's "Moon-forming" impact) left the core of Venus stratified from the core's incremental formation, and without the forces to initiate/sustain convection, and thus a "geodynamo".
The weak magnetosphere around Venus means that the solar wind is interacting directly with its outer atmosphere. Here, ions of hydrogen and oxygen are being created by the dissociation of water molecules from ultraviolet radiation. The solar wind then supplies energy that gives some of these ions sufficient velocity to escape Venus's gravity field. This erosion process results in a steady loss of low-mass hydrogen, helium, and oxygen ions, whereas higher-mass molecules, such as carbon dioxide, are more likely to be retained. Atmospheric erosion by the solar wind could have led to the loss of most of Venus's water during the first billion years after it formed. However, the planet may have retained a dynamo for its first 2–3 billion years, so the water loss may have occurred more recently. The erosion has increased the ratio of higher-mass deuterium to lower-mass hydrogen in the atmosphere 100 times compared to the rest of the solar system.
## Atmosphere and climate
Venus has a dense atmosphere composed of 96.5% carbon dioxide, 3.5% nitrogen—both exist as supercritical fluids at the planet's surface with a density 6.5% that of water—and traces of other gases including sulphur dioxide. The mass of its atmosphere is 92 times that of Earth's, whereas the pressure at its surface is about 93 times that at Earth's—a pressure equivalent to that at a depth of nearly 1 km (5⁄8 mi) under Earth's ocean surfaces. The density at the surface is 65 kg/m<sup>3</sup> (4.1 lb/cu ft), 6.5% that of water or 50 times as dense as Earth's atmosphere at 293 K (20 °C; 68 °F) at sea level. The CO<sub>2</sub>-rich atmosphere generates the strongest greenhouse effect in the Solar System, creating surface temperatures of at least 735 K (462 °C; 864 °F). This makes the Venusian surface hotter than Mercury's, which has a minimum surface temperature of 53 K (−220 °C; −364 °F) and maximum surface temperature of 700 K (427 °C; 801 °F), even though Venus is nearly twice Mercury's distance from the Sun and thus receives only about 25% of Mercury's solar irradiance, roughly 2,600 W/m<sup>2</sup> (about double that of Earth). Because of its runaway greenhouse effect, Venus has been identified by scientists such as Carl Sagan as a warning and research object linked to climate change on Earth.
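As a rough back-of-envelope illustration (not taken from the article's sources, and assuming a seawater density of about 1,025 kg/m³), the pressure-to-ocean-depth comparison can be checked against the hydrostatic relation P = ρgh:

```python
# Hedged sanity check: express Venus's ~93-bar surface pressure as an
# equivalent depth of water on Earth using P = rho * g * h.
P_surface = 93 * 1.013e5    # Venus surface pressure in pascals (93 Earth atmospheres)
rho_water = 1025            # assumed seawater density, kg/m^3
g_earth = 9.81              # Earth surface gravity, m/s^2

depth_m = P_surface / (rho_water * g_earth)
print(f"Equivalent ocean depth: {depth_m:.0f} m")   # roughly 940 m, i.e. "nearly 1 km"
```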
Venus's atmosphere is rich in primordial noble gases compared to that of Earth. This enrichment indicates an early divergence from Earth in evolution. An unusually large comet impact or the accretion of a more massive primary atmosphere from the solar nebula has been proposed to explain the enrichment. However, the atmosphere is depleted of radiogenic argon, a proxy for mantle degassing, suggesting an early shutdown of major magmatism.
Studies have suggested that billions of years ago, Venus's atmosphere could have been much more like the one surrounding the early Earth, and that there may have been substantial quantities of liquid water on the surface. After a period of 600 million to several billion years, solar forcing from the rising luminosity of the Sun, and possibly large volcanic resurfacing, caused the evaporation of the original water and produced the current atmosphere. A runaway greenhouse effect was created once a critical level of greenhouse gases (including water) was added to its atmosphere. Although the surface conditions on Venus are no longer hospitable to any Earth-like life that may have formed before this event, there is speculation on the possibility that life exists in the upper cloud layers of Venus, 50 km (30 mi) up from the surface, where the atmospheric conditions are the most Earth-like in the Solar System, with temperatures ranging between 303 and 353 K (30 and 80 °C; 86 and 176 °F) and the pressure and radiation being about the same as at Earth's surface, but with acidic clouds and an atmosphere of carbon dioxide. Venus's atmosphere could also have a potential thermal habitable zone at elevations of 54 to 48 km, with lower elevations inhibiting cell growth and higher elevations exceeding evaporation temperature. The putative detection of an absorption line of phosphine in Venus's atmosphere, with no known pathway for abiotic production, led to speculation in September 2020 that there could be extant life currently present in the atmosphere. Later research attributed the spectroscopic signal that had been interpreted as phosphine to sulphur dioxide, or found that in fact there was no absorption line.
Thermal inertia and the transfer of heat by winds in the lower atmosphere mean that the temperature of Venus's surface does not vary significantly between the planet's two hemispheres, those facing and not facing the Sun, despite Venus's slow rotation. Winds at the surface are slow, moving at a few kilometres per hour, but because of the high density of the atmosphere at the surface, they exert a significant amount of force against obstructions, and transport dust and small stones across the surface. This alone would make it difficult for a human to walk through, even without the heat, pressure, and lack of oxygen.
Above the dense CO<sub>2</sub> layer are thick clouds consisting mainly of sulfuric acid, which is formed from sulphur dioxide and water through a chemical reaction resulting in sulfuric acid hydrate. Additionally, the clouds consist of approximately 1% ferric chloride. Other possible constituents of the cloud particles are ferric sulfate, aluminium chloride and phosphoric anhydride. Clouds at different levels have different compositions and particle size distributions. Like thick cloud cover on Earth, these clouds reflect about 70% of the sunlight that falls on them back into space, and since they cover the whole planet they prevent visual observation of Venus's surface. The permanent cloud cover means that although Venus is closer than Earth to the Sun, it receives less sunlight on the ground: only 10% of the incoming sunlight reaches the surface, resulting in average daytime illumination levels at the surface of 14,000 lux, comparable to that on Earth "in the daytime with overcast clouds". Strong 300 km/h (185 mph) winds at the cloud tops circle Venus about every four to five Earth days. Winds on Venus move at up to 60 times the speed of its rotation, whereas Earth's fastest winds are only 10–20% of its rotation speed.
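A minimal sketch of the arithmetic behind the "up to 60 times" figure, assuming the round values quoted above (a 243-day sidereal rotation and cloud tops that circle the planet in roughly four Earth days):

```python
# Cloud-top winds circle Venus in about 4-5 Earth days, while the solid surface
# takes ~243 days to rotate once; the ratio gives the super-rotation factor.
sidereal_rotation_days = 243.0
cloud_circulation_days = 4.0     # fastest quoted circulation time at the cloud tops

print(sidereal_rotation_days / cloud_circulation_days)   # ~61, i.e. "up to 60 times"
```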
The surface of Venus is effectively isothermal; it retains a constant temperature not only between the two hemispheres but between the equator and the poles. Venus's minute axial tilt—less than 3°, compared to 23° on Earth—also minimizes seasonal temperature variation. Altitude is one of the few factors that affect Venusian temperatures. The highest point on Venus, Maxwell Montes, is therefore the coolest point on Venus, with a temperature of about 655 K (380 °C; 715 °F) and an atmospheric pressure of about 4.5 MPa (45 bar). In 1995, the Magellan spacecraft imaged a highly reflective substance at the tops of the highest mountain peaks, a "Venus snow" that bore a strong resemblance to terrestrial snow. This substance likely formed from a similar process to snow, albeit at a far higher temperature. Too volatile to condense on the surface, it rose in gaseous form to higher elevations, where it is cooler and could precipitate. The identity of this substance is not known with certainty, but speculation has ranged from elemental tellurium to lead sulfide (galena).
Although Venus has no seasons, in 2019 astronomers identified a cyclical variation in sunlight absorption by the atmosphere, possibly caused by opaque, absorbing particles suspended in the upper clouds. The variation causes observed changes in the speed of Venus's zonal winds and appears to rise and fall in time with the Sun's 11-year sunspot cycle.
The existence of lightning in the atmosphere of Venus has been controversial since the first suspected bursts were detected by the Soviet Venera probes. In 2006–07, Venus Express clearly detected whistler mode waves, the signatures of lightning. Their intermittent appearance indicates a pattern associated with weather activity. According to these measurements, the lightning rate is at least half that on Earth; however, other instruments have not detected lightning at all. The origin of any lightning remains unclear, but it could originate in the clouds or in Venusian volcanoes.
In 2007, Venus Express discovered that a huge double atmospheric polar vortex exists at the south pole. Venus Express discovered, in 2011, that an ozone layer exists high in the atmosphere of Venus. On 29 January 2013, ESA scientists reported that the ionosphere of Venus streams outwards in a manner similar to "the ion tail seen streaming from a comet under similar conditions."
In December 2015, and to a lesser extent in April and May 2016, researchers working on Japan's Akatsuki mission observed bow-shaped objects in the atmosphere of Venus. This was considered direct evidence of the existence of perhaps the largest stationary gravity waves in the solar system.
## Orbit and rotation
Venus orbits the Sun at an average distance of about 0.72 AU (108 million km; 67 million mi), and completes an orbit every 224.7 days. Although all planetary orbits are elliptical, Venus's orbit is currently the closest to circular, with an eccentricity of less than 0.01. Simulations of the orbital dynamics of the early Solar System have shown that the eccentricity of Venus's orbit may have been substantially larger in the past, reaching values as high as 0.31 and possibly affecting early climate evolution.
All planets in the Solar System orbit the Sun in an anticlockwise direction as viewed from above Earth's north pole. Most planets also rotate on their axes in an anticlockwise direction, but Venus rotates clockwise in retrograde rotation once every 243 Earth days—the slowest rotation of any planet. A Venusian sidereal day therefore lasts longer than a Venusian year (243 versus 224.7 Earth days). Slowed by its strong atmospheric currents, the length of the day also fluctuates by up to 20 minutes. Venus's equator rotates at 6.52 km/h (4.05 mph), whereas Earth's rotates at 1,674.4 km/h (1,040.4 mph). Venus's rotation period measured with Magellan spacecraft data over a 500-day period is shorter than the rotation period measured during the 16-year interval between the Magellan and Venus Express visits, with a difference of about 6.5 minutes. Because of the retrograde rotation, the length of a solar day on Venus is significantly shorter than the sidereal day, at 116.75 Earth days; this makes the Venusian solar day shorter than Mercury's 176 Earth days, and the 116-day figure is close to Mercury's synodic period, the average interval between successive occasions on which Mercury passes between Earth and the Sun. One Venusian year is about 1.92 Venusian solar days. To an observer on the surface of Venus, the Sun would rise in the west and set in the east, although Venus's opaque clouds prevent observing the Sun from the planet's surface.
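The 116.75-day solar day follows from the retrograde spin: the apparent motion of the Sun combines the rotation and orbital rates. A minimal sketch, using only the periods quoted in this section:

```python
# For a retrograde rotator, the rates add: 1/T_solar = 1/T_sidereal + 1/T_orbit.
T_sidereal = 243.0   # sidereal rotation period, Earth days
T_orbit = 224.7      # orbital period, Earth days

T_solar = 1.0 / (1.0 / T_sidereal + 1.0 / T_orbit)
print(f"Venusian solar day ~ {T_solar:.2f} Earth days")   # ~116.75
```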
Venus may have formed from the solar nebula with a different rotation period and obliquity, reaching its current state because of chaotic spin changes caused by planetary perturbations and tidal effects on its dense atmosphere, a change that would have occurred over the course of billions of years. The rotation period of Venus may represent an equilibrium state between tidal locking to the Sun's gravitation, which tends to slow rotation, and an atmospheric tide created by solar heating of the thick Venusian atmosphere. The 584-day average interval between successive close approaches to Earth is almost exactly equal to 5 Venusian solar days (5.001444 to be precise), but the hypothesis of a spin-orbit resonance with Earth has been discounted.
Venus has no natural satellites. It does have several co-orbital companions: a quasi-satellite and two temporary trojan asteroids. In the 17th century, Giovanni Cassini reported a moon orbiting Venus, which was named Neith; numerous sightings were reported over the following 200 years, but most were determined to be stars in the vicinity. Alex Alemi's and David Stevenson's 2006 study of models of the early Solar System at the California Institute of Technology suggests Venus likely had at least one moon created by a huge impact event billions of years ago. About 10 million years later, according to the study, another impact reversed the planet's spin direction, and the resulting tidal deceleration caused the Venusian moon gradually to spiral inward until it collided with Venus. If later impacts created moons, these were removed in the same way. An alternative explanation for the lack of satellites is the effect of strong solar tides, which can destabilize large satellites orbiting the inner terrestrial planets.
The orbital space of Venus has a dust ring-cloud, with a suspected origin either from Venus-trailing asteroids, interplanetary dust migrating in waves, or the remains of the Solar System's original circumstellar disc that formed the planetary system.
### Orbit in respect to Earth
Earth and Venus have a near orbital resonance of 13:8 (Earth orbits eight times for every 13 orbits of Venus). Therefore, they approach each other and reach inferior conjunction in synodic periods of 584 days, on average. The path that Venus makes in relation to Earth viewed geocentrically draws a pentagram over five synodic periods, shifting every period by 144°. This pentagram of Venus is sometimes referred to as the petals of Venus due to the path's visual similarity to a flower.
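A rough cross-check of the 13:8 near-resonance and the 584-day synodic period, using commonly quoted orbital periods (assumed round values, not figures stated in this article):

```python
# Synodic period of two co-orbiting planets: 1/T_syn = 1/T_inner - 1/T_outer.
T_venus = 224.70     # Venus orbital period, Earth days
T_earth = 365.256    # Earth orbital period, Earth days

T_synodic = 1.0 / (1.0 / T_venus - 1.0 / T_earth)
print(f"Synodic period ~ {T_synodic:.1f} days")        # ~583.9

print(f"13 Venus orbits ~ {13 * T_venus:.0f} days")    # ~2921
print(f"8 Earth orbits  ~ {8 * T_earth:.0f} days")     # ~2922
# Five synodic periods (~2920 days) also fit the same ~8-year window, which is
# why successive inferior conjunctions trace the five-lobed "pentagram".
```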
When Venus lies between Earth and the Sun in inferior conjunction, it makes the closest approach to Earth of any planet, at an average distance of 41 million km (25 million mi). Because of the decreasing eccentricity of Earth's orbit, the minimum distances will become greater over tens of thousands of years. From the year 1 to 5383, there are 526 approaches closer than 40 million km (25 million mi); then there are none for about 60,158 years.
While Venus makes the closest approaches to Earth, Mercury is more often the closest planet to Earth overall. Venus also has the smallest gravitational potential difference with Earth of any planet, requiring the lowest delta-v to transfer between them.
Venus exerts the third-strongest tidal force on Earth, after the Moon and the Sun, though the effect is significantly weaker.
## Observability
To the naked eye, Venus appears as a white point of light brighter than any other planet or star (apart from the Sun). The planet's mean apparent magnitude is −4.14 with a standard deviation of 0.31. The brightest magnitude occurs during the crescent phase about one month before or after an inferior conjunction. Venus fades to about magnitude −3 when it is backlit by the Sun. The planet is bright enough to be seen in broad daylight, but is more easily visible when the Sun is low on the horizon or setting. As an inferior planet, it always lies within about 47° of the Sun.
Venus "overtakes" Earth every 584 days as it orbits the Sun. As it does so, it changes from the "Evening Star", visible after sunset, to the "Morning Star", visible before sunrise. Although Mercury, the other inferior planet, reaches a maximum elongation of only 28° and is often difficult to discern in twilight, Venus is hard to miss when it is at its brightest. Its greater maximum elongation means it is visible in dark skies long after sunset. As the brightest point-like object in the sky, Venus is a commonly misreported "unidentified flying object".
### Phases
As it orbits the Sun, Venus displays phases like those of the Moon in a telescopic view. The planet appears as a small and "full" disc when it is on the opposite side of the Sun (at superior conjunction). Venus shows a larger disc and "quarter phase" at its maximum elongations from the Sun, and appears at its brightest in the night sky. The planet presents a much larger thin "crescent" in telescopic views as it passes along the near side between Earth and the Sun. Venus displays its largest size and "new phase" when it is between Earth and the Sun (at inferior conjunction). Its atmosphere is visible through telescopes by the halo of sunlight refracted around it. The phases are clearly visible in a 4-inch (100 mm) telescope. Although naked-eye visibility of Venus's phases is disputed, records exist of observations of its crescent.
### Daylight apparitions
When Venus is sufficiently bright and has enough angular distance from the Sun, it is easily observed in a clear daytime sky with the naked eye, though most people do not know to look for it. Astronomer Edmund Halley calculated its maximum naked-eye brightness in 1716, when many Londoners were alarmed by its appearance in the daytime. French emperor Napoleon Bonaparte once witnessed a daytime apparition of the planet while at a reception in Luxembourg. Another historical daytime observation of the planet took place during the inauguration of the American president Abraham Lincoln in Washington, D.C., on 4 March 1865.
### Transits
A transit of Venus is the passage of Venus in front of the Sun as seen from Earth, during inferior conjunction. Since the orbit of Venus is slightly inclined relative to Earth's orbit, most inferior conjunctions, which occur every synodic period of about 1.6 years, do not produce a transit. Transits occur only when an inferior conjunction takes place during a few days in June or December, when Venus crosses the plane of Earth's orbit and the two planets line up with the Sun. Transits therefore recur in a sequence of intervals of currently 8 years, 105.5 years, 8 years and 121.5 years, forming cycles of 243 years.
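The quoted intervals can be verified with simple arithmetic; the synodic-period cross-check below uses assumed round values rather than figures stated in the text:

```python
# The four transit intervals sum to one full 243-year cycle.
intervals_years = [8, 105.5, 8, 121.5]
print(sum(intervals_years))              # 243.0

# 243 Earth years is also very nearly a whole number (about 152) of 583.9-day
# synodic periods, which is why the transit geometry repeats on this cycle.
print(243 * 365.25 / 583.9)              # ~152.0
```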
Historically, transits of Venus were important, because they allowed astronomers to determine the size of the astronomical unit, and hence the size of the Solar System as shown by Jeremiah Horrocks in 1639 with the first known observation of a Venus transit (after history's first observed planetary transit in 1631, of Mercury).
Only seven Venus transits have been observed so far, since their occurrences were first calculated by Johannes Kepler in 1621. Captain Cook sailed to Tahiti in 1768 to record the third observed transit of Venus, a voyage that subsequently resulted in the exploration of the east coast of Australia.
The latest pair of transits occurred on 8 June 2004 and 5–6 June 2012; the 2012 transit could be watched live from many online outlets or observed locally with the right equipment and conditions. The preceding pair occurred in December 1874 and December 1882.
The next transit will occur in December 2117 and December 2125.
### Ashen light
A long-standing mystery of Venus observations is the so-called ashen light—an apparent weak illumination of its dark side, seen when the planet is in the crescent phase. The first claimed observation of ashen light was made in 1643, but the existence of the illumination has never been reliably confirmed. Observers have speculated it may result from electrical activity in the Venusian atmosphere, but it could be illusory, resulting from the physiological effect of observing a bright, crescent-shaped object. The ashen light has often been sighted when Venus is in the evening sky, when the evening terminator of the planet is towards Earth.
## Observation and exploration history
### Early observation
Venus is bright enough in Earth's sky to be visible without aid, making it one of the classical planets that human cultures have known and identified throughout history; it is the third-brightest object in Earth's sky after the Sun and the Moon. Because the movements of Venus appear to be discontinuous (it disappears for many days at a time owing to its proximity to the Sun, and then reappears on the other horizon), some cultures did not recognize Venus as a single entity; instead, they assumed it to be two separate stars on each horizon: the morning star and the evening star. Nonetheless, a cylinder seal from the Jemdet Nasr period and the Venus tablet of Ammisaduqa from the First Babylonian dynasty indicate that the ancient Sumerians already knew that the morning and evening stars were the same celestial object.
In the Old Babylonian period, the planet Venus was known as Ninsi'anna, and later as Dilbat. The name "Ninsi'anna" translates to "divine lady, illumination of heaven", which refers to Venus as the brightest visible "star". Earlier spellings of the name were written with the cuneiform sign si4 (= SU, meaning "to be red"), and the original meaning may have been "divine lady of the redness of heaven", in reference to the colour of the morning and evening sky.
The Chinese historically referred to the morning Venus as "the Great White" (Tàibái 太白) or "the Opener (Starter) of Brightness" (Qǐmíng 啟明), and the evening Venus as "the Excellent West One" (Chánggēng 長庚).
The ancient Greeks initially believed Venus to be two separate stars: Phosphorus, the morning star, and Hesperus, the evening star. Pliny the Elder credited the realization that they were a single object to Pythagoras in the sixth century BC, while Diogenes Laërtius argued that Parmenides (early fifth century) was probably responsible for this discovery. Though they recognized Venus as a single object, the ancient Romans continued to designate the morning aspect of Venus as Lucifer, literally "Light-Bringer", and the evening aspect as Vesper, both of which are literal translations of their traditional Greek names.
In the second century, in his astronomical treatise Almagest, Ptolemy theorized that both Mercury and Venus were located between the Sun and the Earth. The 11th-century Persian astronomer Avicenna claimed to have observed a transit of Venus (although there is some doubt about it), which later astronomers took as confirmation of Ptolemy's theory. In the 12th century, the Andalusian astronomer Ibn Bajjah observed "two planets as black spots on the face of the Sun"; these were thought to be the transits of Venus and Mercury by 13th-century Maragha astronomer Qotb al-Din Shirazi, though this cannot be true as there were no Venus transits in Ibn Bajjah's lifetime.
### Venus and early modern astronomy
When the Italian physicist Galileo Galilei first observed the planet with a telescope in the early 17th century, he found it showed phases like the Moon, varying from crescent to gibbous to full and vice versa. When Venus is furthest from the Sun in the sky, it shows a half-lit phase, and when it is closest to the Sun in the sky, it shows as a crescent or full phase. This could be possible only if Venus orbited the Sun, and this was among the first observations to clearly contradict the Ptolemaic geocentric model that the Solar System was concentric and centred on Earth.
The 1639 transit of Venus was accurately predicted by Jeremiah Horrocks and observed by him and his friend, William Crabtree, at each of their respective homes, on 4 December 1639 (24 November under the Julian calendar in use at that time).
The atmosphere of Venus was discovered in 1761 by Russian polymath Mikhail Lomonosov. Venus's atmosphere was observed in 1790 by German astronomer Johann Schröter. Schröter found when the planet was a thin crescent, the cusps extended through more than 180°. He correctly surmised this was due to scattering of sunlight in a dense atmosphere. Later, American astronomer Chester Smith Lyman observed a complete ring around the dark side of the planet when it was at inferior conjunction, providing further evidence for an atmosphere. The atmosphere complicated efforts to determine a rotation period for the planet, and observers such as Italian-born astronomer Giovanni Cassini and Schröter incorrectly estimated periods of about 24 h from the motions of markings on the planet's apparent surface.
### Early 20th century advances
Little more was discovered about Venus until the 20th century. Its almost featureless disc gave no hint what its surface might be like, and it was only with the development of spectroscopic and ultraviolet observations that more of its secrets were revealed.
Spectroscopic observations in the 1900s gave the first clues about the Venusian rotation. Vesto Slipher tried to measure the Doppler shift of light from Venus, but found he could not detect any rotation. He surmised the planet must have a much longer rotation period than had previously been thought.
The first ultraviolet observations were carried out in the 1920s, when Frank E. Ross found that ultraviolet photographs revealed considerable detail that was absent in visible and infrared radiation. He suggested this was due to a dense, yellow lower atmosphere with high cirrus clouds above it.
It had been noted that Venus had no discernible oblateness in its disc, suggesting a slow rotation, and some astronomers concluded from this that it was tidally locked, as Mercury was believed to be at the time. Other researchers, however, had detected a significant quantity of heat coming from the planet's nightside, suggesting a quick rotation (a high surface temperature was not suspected at the time), which confused the issue. Later work in the 1950s showed the rotation was retrograde.
### Space age
Humanity's first interplanetary spaceflight was achieved in 1961 with the robotic space probe Venera 1 of the Soviet Venera programme flying to Venus, but it lost contact en route.
The first successful interplanetary mission, also to Venus, was Mariner 2 of the United States' Mariner programme, passing on 14 December 1962 at 34,833 km (21,644 mi) above the surface of Venus and gathering data on the planet's atmosphere.
Additionally, radar observations of Venus were first carried out in the 1960s and provided the first measurements of the rotation period, which were close to the actual value.
Venera 3, launched in 1966, became humanity's first probe and lander to reach and impact a celestial body other than the Moon, but it could not return data as it crashed into the surface of Venus. In 1967, Venera 4 was launched and successfully deployed science experiments in the Venusian atmosphere before impacting. Venera 4 showed the surface temperature was hotter than Mariner 2 had calculated, at almost 500 °C (932 °F), determined that the atmosphere was 95% carbon dioxide (CO<sub>2</sub>), and discovered that Venus's atmosphere was considerably denser than Venera 4's designers had anticipated.
In an early example of space cooperation, the Venera 4 data were combined with the 1967 Mariner 5 data and analysed by a combined Soviet–American science team in a series of colloquia over the following year.
On 15 December 1970, Venera 7 became the first spacecraft to soft land on another planet and the first to transmit data from there back to Earth.
In 1974, Mariner 10 swung by Venus to bend its path towards Mercury and took ultraviolet photographs of the clouds, revealing the extraordinarily high wind speeds in the Venusian atmosphere. This was the first interplanetary gravity assist, a technique that later probes would also employ.
Radar observations in the 1970s revealed details of the Venusian surface for the first time. Pulses of radio waves were beamed at the planet using the 300 m (1,000 ft) radio telescope at Arecibo Observatory, and the echoes revealed two highly reflective regions, designated the Alpha and Beta regions. The observations revealed a bright region attributed to mountains, which was called Maxwell Montes. These three features are now the only ones on Venus that do not have female names.
In 1975, the Soviet Venera 9 and 10 landers transmitted the first images from the surface of Venus, which were in black and white. NASA obtained additional data with the Pioneer Venus project, consisting of two separate missions: the Pioneer Venus Multiprobe and Pioneer Venus Orbiter, orbiting Venus between 1978 and 1992. In 1982 the first colour images of the surface were obtained with the Soviet Venera 13 and 14 landers. After Venera 15 and 16 operated between 1983 and 1984 in orbit, conducting detailed mapping of 25% of Venus's terrain (from the north pole to 30°N latitude), the Soviet Venera programme came to a close.
In 1985, the Soviet Vega programme's Vega 1 and Vega 2 missions carried the last entry probes and the first extraterrestrial aerobots, inflatable balloons that achieved the first atmospheric flight outside Earth.
Between 1990 and 1994, Magellan operated in orbit, mapping the surface of Venus until it was deorbited. Probes such as Galileo (1990), Cassini–Huygens (1998/1999), and MESSENGER (2006/2007) also visited Venus with flybys en route to other destinations. In April 2006, Venus Express, the first dedicated Venus mission by the European Space Agency (ESA), entered orbit around Venus and provided unprecedented observation of Venus's atmosphere. ESA concluded the Venus Express mission in December 2014, deorbiting the spacecraft in January 2015.
In 2010, the first successful interplanetary solar sail spacecraft IKAROS travelled to Venus for a flyby.
Between 2015 and 2024, Japan's Akatsuki probe was active in orbit around Venus, and BepiColombo performed flybys in 2020 and 2021.
### Active and future missions
Currently, NASA's Parker Solar Probe and BepiColombo are performing flybys of Venus.
Besides these flybys, several probes are under development, and multiple proposed missions remain in their early conceptual stages.
Venus has been identified for future research as an important case for understanding:
- the origins of the solar system and Earth, and whether systems and planets like ours are common or rare in the universe.
- how planetary bodies evolve from their primordial states to today's diverse objects.
- the development of conditions leading to habitable environments and life.
## Search for life
Speculation on the possibility of life on Venus's surface decreased significantly after the early 1960s when it became clear that conditions were extreme compared to those on Earth. Venus's extreme temperatures and atmospheric pressure make water-based life, as currently known, unlikely.
Some scientists have speculated that thermoacidophilic extremophile microorganisms might exist in the cooler, acidic upper layers of the Venusian atmosphere. Such speculations go back to 1967, when Carl Sagan and Harold J. Morowitz suggested in a Nature article that tiny objects detected in Venus's clouds might be organisms similar to Earth's bacteria (which are of approximately the same size):
> While the surface conditions of Venus make the hypothesis of life there implausible, the clouds of Venus are a different story altogether. As was pointed out some years ago, water, carbon dioxide and sunlight—the prerequisites for photosynthesis—are plentiful in the vicinity of the clouds.
In August 2019, astronomers led by Yeon Joo Lee reported that a long-term pattern of absorbance and albedo changes in the atmosphere of Venus, caused by "unknown absorbers" that may be chemicals or even large colonies of microorganisms high up in the atmosphere, affects the planet's climate. Their light absorbance is almost identical to that of micro-organisms in Earth's clouds. Similar conclusions have been reached by other studies.
In September 2020, a team of astronomers led by Jane Greaves from Cardiff University announced the likely detection of phosphine, a gas not known to be produced by any chemical processes on the Venusian surface or in its atmosphere, in the upper levels of the planet's clouds. One proposed source for this phosphine is living organisms. The phosphine was detected at heights of at least 30 miles (48 km) above the surface, and primarily at mid-latitudes, with none detected at the poles. The discovery prompted NASA administrator Jim Bridenstine to publicly call for a new focus on the study of Venus, describing the phosphine find as "the most significant development yet in building the case for life off Earth".
Subsequent analysis of the data processing used to identify phosphine in the atmosphere of Venus raised concerns that the detected line may be an artefact. The use of a 12th-order polynomial fit may have amplified noise and generated a false reading (see Runge's phenomenon). Observations of the atmosphere of Venus in other parts of the electromagnetic spectrum where a phosphine absorption line would be expected did not detect phosphine. By late October 2020, re-analysis of the data with a proper subtraction of background did not show a statistically significant detection of phosphine.
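As a generic illustration of this concern (a classic Runge's-phenomenon demonstration on synthetic data, not the actual ALMA processing), a degree-12 polynomial fitted through equally spaced samples of a smooth curve oscillates strongly near the ends of the interval, which is one way a high-order baseline fit can produce spurious features:

```python
# Classic Runge's phenomenon: a degree-12 polynomial through 13 equally spaced
# samples of a smooth function oscillates strongly near the interval edges.
import numpy as np

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)   # Runge's standard test function

x_nodes = np.linspace(-1.0, 1.0, 13)             # 13 points -> degree-12 fit
coeffs = np.polyfit(x_nodes, runge(x_nodes), deg=12)

x_fine = np.linspace(-1.0, 1.0, 1001)
error = np.polyval(coeffs, x_fine) - runge(x_fine)
print(f"max deviation from the true curve: {np.abs(error).max():.2f}")  # of order 1 or more
```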
Members of the team around Greaves are working, as part of a project by MIT with the rocket company Rocket Lab, to send the first private interplanetary spacecraft: a probe that will enter the atmosphere of Venus to look for organics, set to launch in January 2025.
### Planetary protection
The Committee on Space Research is a scientific organization established by the International Council for Science. Among its responsibilities is the development of recommendations for avoiding interplanetary contamination, for which purpose space missions are categorized into five groups. Because of its harsh surface environment, Venus falls under planetary protection category two, which indicates that there is only a remote chance that spacecraft-borne contamination could compromise investigations.
## Human presence
Venus was the site of the first interplanetary human presence, mediated through robotic missions, including the first successful landings on another planet and on an extraterrestrial body other than the Moon. Akatsuki is currently in orbit, and other probes routinely use Venus for gravity assist manoeuvres, capturing some data about the planet on the way.
The Soviet Union is the only nation to have sent lander probes to the surface of Venus, a fact that Russian officials have used to call Venus a "Russian planet".
### Crewed flight
Since the 1960s, studies of routes for crewed missions to Mars have proposed opposition-class missions with Venus gravity assist flybys instead of direct conjunction-class missions, arguing that they would be quicker and safer, with better return or abort flight windows and the same or less radiation exposure than direct Mars flights.
Early in the space age the Soviet Union and the United States proposed the TMK-MAVR and Manned Venus flyby crewed flyby missions to Venus, though they were never realized.
### Habitation
While the surface conditions of Venus are inhospitable, the atmospheric pressure, temperature, and solar and cosmic radiation 50 km above the surface are similar to those at Earth's surface. With this in mind, Soviet engineer Sergey Zhitomirskiy (Сергей Житомирский, 1929–2004) in 1971 and NASA aerospace engineer Geoffrey A. Landis in 2003 suggested the use of aerostats for crewed exploration and possibly for permanent "floating cities" in the Venusian atmosphere, an alternative to the popular idea of living on planetary surfaces such as Mars. Among the many engineering challenges for any human presence in the atmosphere of Venus are the corrosive amounts of sulfuric acid in the atmosphere.
NASA's High Altitude Venus Operational Concept is a mission concept that proposed a crewed aerostat design.
## In culture
Venus is a primary feature of the night sky, and so has been of remarkable importance in mythology, astrology and fiction throughout history and in different cultures.
Several hymns praise Inanna in her role as the goddess of the planet Venus. Theology professor Jeffrey Cooley has argued that, in many myths, Inanna's movements may correspond with the movements of the planet Venus in the sky. The discontinuous movements of Venus relate to both mythology and Inanna's dual nature. In Inanna's Descent to the Underworld, unlike any other deity, Inanna is able to descend into the netherworld and return to the heavens. The planet Venus appears to make a similar descent, setting in the west and then rising again in the east. An introductory hymn describes Inanna leaving the heavens and heading for Kur, presumed to be the mountains, replicating the rising and setting of Inanna in the west. The myths Inanna and Shukaletuda and Inanna's Descent into the Underworld appear to parallel the motion of the planet Venus. In Inanna and Shukaletuda, Shukaletuda is described as scanning the heavens in search of Inanna, possibly searching the eastern and western horizons. In the same myth, while searching for her attacker, Inanna herself makes several movements that correspond with the movements of Venus in the sky.
The ancient Egyptians and ancient Greeks possibly knew, by the second millennium BC or at the latest by the Late Period under Mesopotamian influence, that the morning star and the evening star were one and the same. The Egyptians knew the morning star as Tioumoutiri and the evening star as Ouaiti. They depicted Venus at first as a phoenix or heron (see Bennu), calling it "the crosser" or "star with crosses" and associating it with Osiris; later they depicted it as two-headed with human or falcon heads and associated it with Horus, son of Isis (who, during the even later Hellenistic period, was together with Hathor identified with Aphrodite). The Greeks used the names Phōsphoros (Φωσφόρος), meaning "light-bringer" (whence the element phosphorus; alternately Ēōsphoros (Ἠωσφόρος), meaning "dawn-bringer"), for the morning star, and Hesperos (Ἕσπερος), meaning "Western one", for the evening star, both children of the dawn goddess Eos and therefore grandchildren of Aphrodite. Though by the Roman era they were recognized as one celestial object, known as "the star of Venus", the traditional two Greek names continued to be used, though usually translated to Latin as Lūcifer and Vesper.
Classical poets such as Homer, Sappho, Ovid and Virgil spoke of the star and its light. Poets such as William Blake, Robert Frost, Letitia Elizabeth Landon, Alfred Lord Tennyson and William Wordsworth wrote odes to it. The composer Holst included it as the second movement of his The Planets suite.
In India, Shukra Graha ("the planet Shukra") is named after the powerful saint Shukra. The word Shukra, used in Indian Vedic astrology, means "clear, pure" or "brightness, clearness" in Sanskrit. One of the nine Navagraha, it is held to affect wealth, pleasure and reproduction; it was the son of Bhrgu, preceptor of the Daityas, and guru of the Asuras.
The English name of Venus was originally the ancient Roman name for it. Romans named Venus after their goddess of love, who in turn was based on the ancient Greek goddess of love Aphrodite, who was herself based on the similar Sumerian religion goddess Inanna (which is Ishtar in Akkadian religion), all of whom were associated with the planet. The weekday of the planet and these goddesses is Friday, named after the Germanic goddess Frigg, who has been associated with the Roman goddess Venus.
Venus is known as Kejora in Indonesian and Malaysian Malay.
In Chinese the planet is called Jīn-xīng (金星), the golden planet of the metal element. Modern Chinese, Japanese, Korean and Vietnamese cultures refer to the planet literally as the "metal star" (金星), based on the Five elements.
The Maya considered Venus to be the most important celestial body after the Sun and Moon. They called it Chac ek, or Noh Ek', "the Great Star". The cycles of Venus were important to their calendar and were described in some of their books such as Maya Codex of Mexico and Dresden Codex. The Estrella Solitaria ("Lone Star") Flag of Chile depicts Venus.
### Modern culture
The impenetrable Venusian cloud cover gave science fiction writers free rein to speculate on conditions at its surface; all the more so when early observations showed that not only was it similar in size to Earth, it possessed a substantial atmosphere. Closer to the Sun than Earth, the planet was often depicted as warmer, but still habitable by humans. The genre reached its peak between the 1930s and 1950s, at a time when science had revealed some aspects of Venus, but not yet the harsh reality of its surface conditions. Findings from the first missions to Venus showed reality to be quite different and brought this particular genre to an end. As scientific knowledge of Venus advanced, science fiction authors tried to keep pace, particularly by conjecturing human attempts to terraform Venus.
### Symbols
The symbol of a circle with a small cross beneath is the so-called Venus symbol, so named because it is used as the astronomical symbol for Venus. The symbol is of ancient Greek origin and represents femininity more generally; it has been adopted by biology as the gender symbol for female, just as the Mars symbol is used for male and sometimes the Mercury symbol for hermaphrodite. This gendered association of Venus and Mars has been used to pair them heteronormatively, describing women and men stereotypically as being so different that they can be understood as coming from different planets, an understanding popularized in 1992 by the book Men Are from Mars, Women Are from Venus.
The Venus symbol was also used in Western alchemy to represent the element copper (just as the symbol of Mercury also represents the element mercury), and since polished copper was used for mirrors from antiquity, the symbol for Venus has sometimes been called the Venus mirror, representing the mirror of the goddess, although this derivation has been discredited as unlikely.
Besides the Venus symbol, many other symbols have been associated with Venus; other common ones are the crescent and, particularly, the star, as with the Star of Ishtar.
## See also
- Outline of Venus
- Physical properties of planets in the Solar System
- Venus zone
# Joe Warbrick
Joseph Astbury Warbrick (1 January 1862 – 30 August 1903) was a Māori rugby union player who represented New Zealand on their 1884 tour to Australia and later captained the 1888–89 New Zealand Native football team that embarked on a 107-match tour of New Zealand, Australia, and the British Isles.
Born in Rotorua, Warbrick played club rugby for Auckland side Ponsonby while boarding at St Stephen's Native School. In 1877, he was selected to play fullback for Auckland Provincial Clubs as a 15-year-old, making him the youngest person to play first-class rugby in New Zealand. He played for Auckland against the visiting New South Wales team, the first overseas side to tour the country, in 1882. Two years later, he was selected for the first New Zealand representative team, and playing mainly as a three-quarter, appeared in seven of the side's eight matches on their tour of New South Wales.
In 1888, Warbrick conceived of, selected, and led the privately funded New Zealand Native team. The squad, which included four of Warbrick's brothers, was originally envisaged to contain only Māori players, but eventually included several New Zealand-born and foreign-born Europeans. Although the team played 107 matches, including 74 in the British Isles, Warbrick took part in only 21 matches due to injury. The tour, the first from the Southern Hemisphere to visit Britain, remains the longest in rugby's history. In 2008, Warbrick and the Natives were inducted into the World Rugby Hall of Fame.
Warbrick effectively retired from rugby after returning from the tour, with the exception of an appearance for Auckland in 1894, and went on to work as a farmer and tourist guide in the Bay of Plenty. In 1903, he was killed along with several others by an eruption of the Waimangu Geyser.
## Background and early career
Joseph Warbrick was born in Rotorua, New Zealand, on 1 January 1862, the third of five children. His father, Abraham Warbrick, was originally from England, while his mother, Nga Karauna Paerau, was Māori and the daughter of a Ngāti Rangitihi chief. After Joe Warbrick's mother died, his father remarried and had seven more children. Four of his brothers – Alfred, Arthur, Fred, and Billy – went on to tour with Joe as part of the 1888–89 New Zealand Native football team.
With his family still based in the Bay of Plenty, Joe Warbrick was sent to board at St Stephen's Native School in the Bombay Hills, where he started playing rugby union. While living in Bombay in 1877, he started playing club rugby with Ponsonby in Auckland, even though the club was based well north of Bombay. Warbrick played well enough for Ponsonby to earn selection for Auckland Provincial Clubs (now Auckland) that year despite being only 15 years old. Playing at fullback for them against Otago, he became the youngest person to play first-class rugby in New Zealand – a record he still holds as of 2017.
By 1878 Warbrick had left both St Stephen's and Ponsonby and was employed as a public servant. The work required him to relocate regularly, and he moved throughout the North Island for the remainder of his rugby career. By 1879 he was living in Wellington, and represented the provincial team three times that season. He played three further matches for Wellington in 1880, including one against his old province of Auckland. The 1880 match, the first in Auckland for Wellington, was won by the visitors 4–0. Warbrick was renowned for his drop-kicking, and his goal in the match was the only score; many Aucklanders claimed that his performance was the difference between the two sides.
The Australian New South Wales colonial team became the first overseas rugby side to tour New Zealand in 1882 and played seven matches throughout the country. By this point Warbrick was back in Auckland, but this time playing for the North Shore club, and he again won selection for the provincial side. He appeared in both of Auckland's matches against the New South Welshmen: 7–0 and 18–4 victories over the tourists. Warbrick remained in Auckland the following year when he toured with the province again, playing in away matches against Wellington, Canterbury and Otago.
## 1884 New Zealand team
In 1884 a team of New Zealanders, organised by the Canterbury player and administrator William Millton, and Dunedin businessman Samuel Sleigh, was selected to tour New South Wales. This is now officially regarded as the first New Zealand representative rugby side. Warbrick was included in a squad that was selected from throughout the country; the entire endeavour was performed without the oversight of a national body – several provincial rugby unions existed, but the New Zealand Rugby Football Union was not formed until 1892.
The squad's 19 players were expected to assemble in Wellington before departing for Sydney on 21 May; however, Warbrick missed his ship from Auckland and so travelled to Sydney alone. Millton was elected captain, and Sleigh managed the team. The side won all eight of their matches on tour, including the three games against New South Wales. Warbrick appeared in seven matches and scored three drop goals; one of the goals was reportedly kicked from well inside his own half. He played at both fullback and three-quarter, and was noted for his good ball handling and speed, as well as his ability to drop kick.
## Later provincial career
After returning from tour, Warbrick moved to Napier, and in 1885 represented Hawke's Bay provincially, including captaining them against Poverty Bay. By 1886 he was back playing for Auckland, and that year captained them in their wins over both Wellington, and also New South Wales – who were again touring the country. He returned to Hawke's Bay for the 1887 season, and played for them against Wellington, Poverty Bay, and Canterbury. Warbrick had returned to Wellington by the 1888 season when he again played for the province.
The first British Isles side (now known as the British and Irish Lions) toured New Zealand and Australia in 1888. The side was privately organised, without the sanction or prohibition of England's Rugby Football Union, and toured New Zealand in April and May that year where they played against a number of provincial sides. Although the team was not representative of the best British and Irish players, it did include three internationals with the rest predominantly county representatives. Warbrick was in the Wellington team that faced the tourists on 13 May. The match was ill-tempered, with each side accusing the other of rough play, and eventually finished as a 3–3 draw.
## 1888–89 New Zealand Native football team
### Preparations
In early 1888 Warbrick announced plans to assemble a Māori side to face the visiting British during their tour, but he later revealed he wanted to take a team of Māori or part-Māori to tour the British Isles. His ambition was for "Māori football" to be as famous as Australian cricket, whose national side had already developed a strong rivalry with the English. It is not known exactly when Warbrick had conceived of the idea for this tour, but it was well before the arrival of the British Isles team in April 1888.
The touring British did help demonstrate the feasibility of Warbrick's proposal, which was daunting – no New Zealand side had ever toured the Northern Hemisphere. Hearing of Warbrick's plans, civil servant Thomas Eyton contacted him to offer help managing the tour, which Warbrick accepted. By May 1888, James Scott, a publican, had joined the partnership. The three men decided that Warbrick would be the team's captain, coach and selector, Scott its manager, and Eyton its promoter. Although Warbrick had chiefly sporting reasons for conducting the tour, for Eyton and Scott profit was the major motivation.
A New Zealand Māori side had never been selected – the first official side did not play until 1910 – but Warbrick's experience in provincial rugby ensured that he was well qualified to select the team. He travelled the country trying to find players who were talented and willing to spend a year on tour. The make-up of the team changed significantly between March 1888 and when the team departed New Zealand in August. Warbrick encountered challenges assembling the side; there was opposition from some players to including part-Māori in the squad, which prompted several early recruits to withdraw.
Initially, 20 players were selected for the side, named the "New Zealand Māori team". Some of these players had strong family and playing links to Warbrick (such as his four brothers). Warbrick was eventually compelled to add five Pākehā (European non-Māori) players to the squad, which resulted in the side being renamed the "New Zealand Native football team". Warbrick may have wanted a team of exclusively Māori or part-Māori players, but according to historian Greg Ryan, including the Pākehā players was "necessary to strengthen the Native team and create a more effective combination". A further player, Pie Wynyard, was added to the side after they arrived in Britain in November 1888.
### Domestic tour and British Isles
The side's first match was against Hawke's Bay on 23 June 1888, with Warbrick playing in the backs. The match was won 5–0, and was followed by a second match a week later in which Warbrick contributed 10 points in an 11–0 victory.
The next match was against a strong Auckland side, who defeated the Natives 9–0. The heavy defeat was costly for the Native team, with Warbrick breaking several bones in his foot. It was his last game until November that year, and the loss prompted the addition of Patrick Keogh – one of the five Pākehā in the side – to the squad before its departure from New Zealand.
The team departed New Zealand on 1 August 1888, and sailed to England via Melbourne. After their six-week voyage from Australia, the Native team arrived in England on 27 September 1888. Their first match was against Surrey, on 3 October, but Joe Warbrick was still injured and did not play. The side played regularly – they averaged just over three games per week while in Britain – but Warbrick did not appear until 7 November when the team faced Tynemouth. The match was won 7–1, but Warbrick – who played at fullback – exacerbated his foot injury. He managed to play six matches between mid-December and early January before he was again injured. He appeared against Stockport, a match drawn 3–3, on 12 January, but despite being fit enough to play his form was poor.
Warbrick only played twice more in the following month, and was not fit enough to be selected for the team that faced England on 16 February. The match resulted in a controversial 7–0 loss for the Natives, and included the awarding of two dubious English tries by the referee George Rowland Hill – who was also Secretary of the English Rugby Football Union (RFU). The loss and aftermath soured the relationship between Warbrick's team and the RFU – who accused the Natives of poor sportsmanship after they had protested at the awarding of the controversial tries.
By the time the team departed for Australia in late March they had played 74 matches in Britain, winning 49, losing 20, and drawing 5. However, due to injury, Warbrick only appeared in 14 matches; in contrast David Gage featured in 68, and eight other members played more than 50. Warbrick was not the only player to experience injury; the taxing schedule of matches took a toll, and he frequently struggled to find a full complement of 15 fit players. On top of playing relatively few matches in Britain, Warbrick scored only once there – a conversion against Devon.
The high injury toll and congested schedule contributed to complaints about Joe Warbrick's behaviour. His comments to the English press – who directed much of their focus towards him – were viewed negatively by some members of the squad; he was accused of neglecting to acknowledge the contributions of players such as Thomas Ellison, Gage, Keogh, and Edward McCausland but to extol the efforts of himself and his brothers.
Warbrick said of his time in the British Isles: "My impression of England and its people during the tour was a very favourable one, more especially does this apply to private individuals. I found them everywhere very kind and attentive and apparently anxious to make one's visit as pleasant as possible". The term "private individuals" may have been used to exclude from praise both the RFU and London press.
Following the tour he also criticised the partiality of the English referees, and believed that the English administrators displayed a double standard in their treatment of the Natives; the RFU treated the Native team's motives for touring with suspicion, believing the enterprise to be speculative and criticising them for not upholding the amateur principles the RFU liked to espouse. Yet the RFU continued to select Andrew Stoddart for England, despite him touring with the speculative and unsanctioned 1888 British team that travelled to New Zealand and Australia.
### Australia and return to New Zealand
Warbrick and the team sailed to Australia for a leg of their tour described by historian Greg Ryan as "little more than a testimony to the motives of Scott and Eyton as speculators." Their time in Australia started in Victoria, where the side mostly played Victorian Rules Football against Melbourne clubs. These matches were played for financial rather than sporting reasons, and the team had little success at the sport. While the side only played a single rugby match in Victoria, in New South Wales and Queensland they almost exclusively played rugby. Warbrick made few appearances in Australia – two in total – but continued functioning as team captain. The Natives had not lost a rugby match in Australia when they played their second match against the Queensland representative side. The first match was won 22–0, and the second – held on 20 July – was expected to be another comfortable victory for the Natives.
However, at half-time the scores were level, and with the exception of Billy Warbrick, the Natives had played poorly. There were rumours that four of the Natives had been paid by local bookmakers to throw the match. When Joe Warbrick spoke to the team at half-time, he threatened to expose the accused players; this was enough to prompt an improvement in the Natives' play, and the side recovered to win 11–7.
The team returned to New Zealand in August 1889, but the Queensland controversy still hung over the side. The Northern Rugby Union (later renamed the Queensland Rugby Union) did not take any action over the accusations, but the Otago Rugby Union (ORU) decided to conduct an inquiry. The matter was not resolved until after the team arrived in Dunedin when the ORU announced there was no evidence "justifying the accusations", and dismissed taking any further action. The team continued to travel north and to play fixtures throughout the country. Joe Warbrick had played an earlier match in Gore – against Mataura District XVI – where he again suffered injury.
The team's final match was against Auckland on 24 August. The fixture was lost 7–2, but by this point several of the Natives' players had departed the team, including Keogh, Ellison and Gage. Despite the gruelling schedule and high number of injuries, the loss to Auckland was the end of a remarkable streak that had started with a victory over Widnes on 9 March: the Natives had not lost a rugby game in 31 matches, winning 30 and drawing the other. The Natives played a total of 107 rugby matches, including 74 in the British Isles, and the tour remains the longest in the sport's history.
## Retirement and later life
Warbrick retired from rugby at the conclusion of the Natives' tour. He moved to the Bay of Plenty to farm, and occasionally turned out for the Tauranga representative team. Five years after he retired he made a one-match first-class comeback when he played for Auckland against Taranaki in 1894. After this match, an Auckland newspaper wrote:
> Considering that Joe won his cap in 1877, it must be very pleasing to him to be able to record 1894 on it. As I said before, Joe's career as a footballer is, I believe, unparalleled in the colonies. It is certainly a feat Joe may well feel proud of, that after battling the storms for a period of 17 years, he has again been called to render assistance to his province ...
Warbrick married Harriet Burt with whom he had one daughter, and he later worked as a tourist guide in the Rotorua area, where his brother Alfred was the Chief Government Guide. On 30 August 1903, while working with his brother in the geothermal region of the area, Joe Warbrick was killed. The Waimangu Geyser – then the largest geyser in the world – unexpectedly erupted with Joe Warbrick and several tourists in the vicinity; four of them, including Warbrick, were killed instantly by the superheated water ejected during the eruption before they were swept towards Lake Rotomahana. Joe Warbrick had warned one of the tourists not to venture too close to the geyser; however, she insisted on moving closer to get a better photograph. Warbrick accompanied her, and barely two minutes later the geyser erupted and killed the entire party.
## Impact and legacy
As the captain and instigator of the 1888–89 Natives – the first New Zealand team to tour the British Isles – Warbrick had a lasting impact on the development of rugby in his homeland. When the Natives returned from the tour, they introduced a style of rugby as good as any ever seen in the country. According to Ryan, "their brand of sensational running style and combined forward play had never been seen in New Zealand." The speculative nature of the tour, which was outside the control of an official authority, concerned many of the provincial unions and gave further momentum to efforts to form a national body. In 1892 the New Zealand Rugby Football Union was founded; among other things, it would organise any future representative tours. Many of the Natives went on to contribute to rugby as representative players, administrators, or referees. Two players, Ellison and Gage, went on to captain New Zealand.
In 2008 Warbrick was inducted into the World Rugby Hall of Fame, and is a member of the Māori Sports Awards Hall of Fame. A short film, Warbrick, written and directed by brothers Pere and Meihana Durie, was released in 2009 and depicts Joe Warbrick preparing an injury-depleted Natives squad for a match. The film was played for the All Blacks during their preparations for a match against Australia in 2009.
## See also
- List of 1888–89 New Zealand Native football team matches
# Battle of Heraklion
The Battle of Heraklion was part of the Battle of Crete, fought during World War II on the Greek island of Crete between 20 and 30 May 1941. British, Australian and Greek forces of 14th Infantry Brigade, commanded by Brigadier Brian Chappel, defended Heraklion port and airfield against a German paratrooper attack by the 1st Parachute Regiment of the 7th Air Division, commanded by Colonel Bruno Bräuer.
The attack on Heraklion during the afternoon of 20 May was one of four airborne assaults on Crete that day, following German attacks against Maleme airfield and the main port of Chania in the west of Crete in the morning. The aircraft that dropped the morning attackers were scheduled to drop the 1st Regiment over Heraklion later the same day. Confusion and delays at the airfields in mainland Greece meant that the assault was launched without direct air support, and over several hours rather than simultaneously; some units were still at the airfields by the end of the day. Those German units dropping near Heraklion suffered very high casualties, both from ground fire and upon landing. Those dropping further away were severely hampered by armed Cretan civilians. The initial German attack failed; when it was renewed the next day it failed again. The fighting then settled into a stalemate.
A battalion of the German 5th Mountain Division was supposed to reinforce the paratroopers at Heraklion by sea, bringing with it artillery and anti-aircraft guns. It was delayed en route, diverted to Maleme, then intercepted by a British naval squadron and scattered. The German overall commander, Lieutenant-general Kurt Student, concentrated all resources on the battle for Maleme airfield, which the Germans won. The Allied Commander-in-Chief Middle East, General Archibald Wavell, ordered an evacuation of Crete on 27 May and the 14th Brigade was taken off by Allied warships on the night of 28/29 May. During the return to Alexandria two destroyers were sunk, two cruisers badly damaged, more than 440 Allied servicemen killed, over 250 wounded and 165 taken prisoner. Due to their heavy losses on Crete the Germans attempted no further large-scale airborne operations during the war.
## Background
Greece became a belligerent in World War II when it was invaded by Italy on 28 October 1940. A British and Commonwealth expeditionary force was sent to support the Greeks; this force eventually totalled more than 60,000 men. British forces also garrisoned Crete, enabling the Greek Fifth Cretan Division to reinforce the mainland campaign. This arrangement suited the British as Crete could provide their navy with harbours on its north coast. The Italians were repulsed by the Greeks without the aid of the expeditionary force. In April 1941, six months after the failed Italian invasion, a German attack overran mainland Greece and the expeditionary force was withdrawn. By the end of April, 57,000 Allied troops were evacuated by the Royal Navy. Some were sent to Crete to bolster its garrison, though most had lost their heavy equipment.
The German army high command (Oberkommando des Heeres (OKH)) was preoccupied with the forthcoming Operation Barbarossa, the invasion of the Soviet Union, and was largely opposed to an attack on Crete. Adolf Hitler was concerned about attacks on the Romanian oil fields from Crete, and Luftwaffe commanders were enthusiastic about the idea of seizing the island by an airborne attack. In Führer Directive 28 Hitler ordered that Crete was to be invaded to use it "as an airbase against Britain in the Eastern Mediterranean". The directive also stated that the operation was to take place in May and must not interfere with the planned campaign against the Soviet Union.
## Opposing forces
### Allies
On 30 April 1941 Major-general Bernard Freyberg, who had been evacuated from mainland Greece with the 2nd New Zealand Division, was appointed commander-in-chief on Crete. He noted the acute lack of heavy weapons, equipment, supplies and communication facilities. Equipment was scarce in the Mediterranean, particularly in the backwater of Crete. The British forces on Crete had seven commanders in seven months. No Royal Air Force (RAF) units were based permanently on Crete until April 1941 but airfield construction took place, radar sites were built and stores delivered. By early April airfields at Maleme and Heraklion and the landing strip at Rethymno, all on the north coast, were ready and another strip at Pediada-Kastelli was nearly finished.
Of the seven airstrips on Crete, the best equipped, and the only one with a concrete runway, was at Heraklion. It was also the only one with blast pens to protect aircraft on the ground. It was nevertheless improvised in nature: the fuel store, for example, was located outside the positions defending the airfield. A radar station was established on Ames Ridge, a hill south east of Heraklion airfield, but it was outside the defensive perimeter and its communications were unreliable. By 29 April 47,000 Commonwealth troops of the defeated Allied expeditionary force were evacuated from mainland Greece. In the space of a week 27,000 of these arrived on Crete from Greece, many lacking any equipment other than their personal weapons, sometimes not even those. Of these, 9,000 were further evacuated and 18,000 remained on Crete when the battle commenced. With the pre-existing garrison of 14,000 this gave the Allies a total of 32,000 Commonwealth troops to face the German attack, supplemented by 10,000 Greeks.
Heraklion was defended by the British 14th Infantry Brigade, commanded by Brigadier Brian Chappel. The brigade was made up of: the 2nd Battalion, the York and Lancaster Regiment (2nd York and Lancs; with a complement of 742 officers and men on the eve of the battle) and the 2nd Battalion, the Black Watch (Royal Highland Regiment) (2nd Black Watch; 867), with the Australian 2/4th Battalion (2/4th; 550) temporarily attached. The men of the 2/4th had been evacuated from mainland Greece to Crete, arriving on 27 April. On the night of 15/16 May, four days before the battle, the brigade was reinforced by the 2nd Battalion of the Leicester Regiment (2nd Leicesters; 637), which was transported from Alexandria to Heraklion by the cruisers HMS Gloucester and HMS Fiji. Also attached were 450 artillerymen of the 7th Medium Regiment Royal Artillery fighting as infantry and the Greek 3rd and 7th regiments (both battalion-sized and lacking training and weaponry) and a Greek depot battalion undergoing training. Supporting the brigade was the 234 Battery of the 68 Medium Regiment of artillery, which was equipped with 13 captured Italian field guns. Also attached to the brigade were the 7th Battery of the 2/3rd Light Anti-Aircraft Regiment, five heavy infantry tanks and six light tanks – not all necessarily operational at a given time – and a variety of other small anti-aircraft, support and ancillary units. As well as the Italian guns the brigade could field a further 2 artillery pieces and 14 anti-aircraft guns of several calibres. In total Heraklion was defended by just over 7,000 men, of whom approximately 2,700 were Greek.
### Germans
The German assault on Crete was code-named "Operation Mercury" (Unternehmen Merkur) and was controlled by the 12th Army commanded by Field Marshal Wilhelm List. The German 8th Air Corps (VIII Fliegerkorps) provided close air support; it was equipped with 570 combat aircraft. The infantry available for the assault were the German 7th Air Division, with the Air-landing Assault Regiment (Luftlande-Sturm-Regiment) attached, and the 5th Mountain Division. They totalled 22,000 men grouped under the 11th Air Corps (XI Fliegerkorps) which was commanded by Lieutenant-general Kurt Student who was in operational control of the attack. Over 500 Junkers Ju 52 transport aircraft were assembled to carry them. Student planned a series of four parachute assaults against Allied facilities on the north coast of Crete by the 7th Air Division, which would then be reinforced by the 5th Mountain Division, part transported by air and part by sea; the latter would also ferry much of the heavy equipment.
For the assault on Heraklion the Germans assigned their strongest individual force of those launching the initial assault on Crete: the 1st Parachute Regiment, the 2nd Battalion of the 2nd Parachute Regiment and an anti-aircraft machinegun battalion, all from the 7th Air Division. This force totalled approximately 3,000 men and was commanded by Colonel Bruno Bräuer. A few days before the attack, German intelligence summaries stated that the total Allied force on Crete consisted of 5,000 men and that the garrison of Heraklion was 400 strong. Before the invasion, the Germans conducted a bombing campaign against Crete and the surrounding waters to establish air superiority. The RAF rebased its surviving aircraft to Alexandria after 29 of their 35 Crete-based fighters were destroyed.
#### Paratroopers
The design of the German parachutes and the mechanism for opening them imposed operational constraints on the paratroopers. The static lines, which automatically opened the parachutes as the men jumped from the aircraft, were easily fouled, and so each man wore a coverall over all of his webbing and equipment. This precluded their jumping with any weapon larger than a pistol or a grenade. Rifles, automatic weapons, mortars, ammunition, food and water were dropped in separate containers. Until and unless the paratroopers reached these they had only their pistols and hand grenades with which to defend themselves.
The danger of fouling the static lines also required German paratroopers to leap headfirst from their aircraft, and so they were trained to land on all fours – rather than the usually recommended feet-together, knees-bent posture – which resulted in a high incidence of wrist injuries. Once out of the plane, German paratroopers were unable to control their fall or to influence where they landed. Given the importance of landing close to one of the weapons containers, doctrine required jumps to take place from no higher than 400 feet (120 m) and in winds no stronger than 14 miles per hour (23 km/h). The transport aircraft had to fly straight, low and slowly, making them an easy target for any ground fire. Paratroopers were carried by the reliable tri-motored Ju 52. Each aircraft could lift 13 paratroopers, with their weapons containers carried on the planes' external bomb racks.
## Opposing plans
### Allied defences
Chappel deployed the three Greek units in Heraklion and the open ground to the town's west and south. The town was protected by its early modern walls. Running east from the town Chappel deployed in turn the 2nd York and Lancs, the 2nd Leicesters and the 2/4th. The 2/4th was deployed to overlook the airfield from two hills, known as "the Charlies". To their east, closing the gap between the Charlies and the sea were the 2nd Black Watch; within their perimeter was East Hill, from which they could dominate both the coast road and the airfield itself, which lay some 3 miles (5 km) from the centre of Heraklion. The field guns and the artillerymen fighting as infantry were positioned to the east of Heraklion, behind the front-line battalions. Ten Bofors anti-aircraft guns were positioned around the airfield. All units were well dug in and camouflaged.
### German assault
The overall German plan for their assault on Crete was to land two reinforced German regiments by parachute and assault glider at Maleme airfield and near the main port of Chania in the west of Crete on the morning of 20 May. The aircraft which dropped them were then scheduled to make further drops at Rethymno and Heraklion in the afternoon.
Bräuer, anticipating opposition from half a battalion rather than seven, planned for the 2nd Battalion of the 1st Parachute Regiment (II/1), reinforced by an anti-aircraft machinegun company, to land in two groups on or near the airfield and capture it. The regiment's 3rd Battalion (III/1) would land in the open areas south west of Heraklion, rapidly concentrate and take the town by a coup de main. The 2nd Battalion of the 2nd Parachute Regiment (II/2) would land immediately to the west of the III/1. Bräuer would drop with the 1st Battalion (I/1) 5 miles (8 km) to the east as an operational reserve. After capturing the airfield and town, Bräuer intended to move his regiment west towards the Germans landing 80 kilometres (50 mi) away at Rethymno, while deploying a scouting screen eastwards.
## Battle
### Initial assault
On the morning of 20 May the attacks on Maleme airfield and Chania took place as scheduled. The transport aircraft involved in them returned to mainland Greece to embark the paratroopers scheduled to drop at Rethymno and Heraklion in the afternoon. The Germans were having problems with their hastily constructed airfield facilities, which had consequences for the attack on Heraklion. The airfields were blanketed with dust clouds blown up by the aircraft's engines, reducing safe taxiing speeds and making taking off and landing hazardous. Several Ju 52s which had been damaged by Allied ground fire crashed on landing and had to be towed clear of the runways. Refuelling was carried out by hand and took longer than anticipated. Aware that this would significantly delay the start of the drop around Heraklion, the commander of the Ju 52 wing, Rüdiger von Heyking, attempted to have the air support attack similarly delayed. Inadequate communication systems prevented this message from getting through in time.
The assault on Heraklion began with a strong German air attack at about 16:00. This was intended to prevent Allied ground fire against the vulnerable Ju 52s. Both the infantry and the anti-aircraft guns were under orders not to return fire, so the attackers were unable to identify their positions and there were few casualties among the well dug in and camouflaged Allies. The German attack was also intended to provide close air support for the paratrooper drop. In the event the attacking bombers and fighters ran low on fuel and departed before the paratrooper transports arrived. Due to a failure of wireless communications, the 14th Brigade was unaware of the airborne assault in western Crete that morning and did not associate the unusually heavy air raid with the possibility of a parachute attack. At around 17:30 the Ju 52s, paralleling the coast, commenced their drop runs. Flying straight and low they were easy targets for the limited number of Allied anti-aircraft guns. Even the Allied infantry were able to engage them. The Australians on the Charlies reported being able to fire directly into aircraft doors as the paratroopers were jumping. Many paratroopers were killed in the air as they slowly descended. The II/1 Battalion was dropped close to the airfield; its men who reached the ground alive were attacked by infantry of the Black Watch supported by tanks before being able to reach their weapons containers. A few attempted an assault on East Hill, but were easily repulsed. Within thirty minutes the battalion lost 400 dead and wounded, the survivors consolidating near Ames Ridge, south east of the Allied positions, or in an abandoned barracks on the coast road to the east.
West of Heraklion the III/1 Battalion also suffered badly from the Allied anti-aircraft fire. Greek troops and armed civilians immediately counter-attacked the Germans on the ground. As the Cretans exhausted their ammunition, the Germans attacked the town. The old walls obstructed their advance, as they lacked heavy artillery or explosives with which to breach them. They concentrated on the town gates and were able to fight their way into the town in two groups; house-to-house fighting ensued and continued late into the night. The German battalion commander, Major Karl-Lothar Schulz, attempted to regroup in the southern part of the town but was unable to recall all of those fighting in the narrow streets. Some groups got as far as the harbour. Schulz withdrew from the town with those troops he could gather.
The II/2 Battalion landed uneventfully further west, but it was at half strength; the balance of the battalion was still in mainland Greece, trapped in the chaos at the airfields. Next morning the missing troops were diverted by Student to Maleme airfield, 100 miles (160 km) to the west. The half-battalion which had dropped took up a position to block the coast road to Heraklion from the west.
The I/1 Battalion landed successfully 5 miles (8 km) east of Heraklion at around 20:00 and captured a radio station near the village of Gournes. Bräuer landed with this unit and, although he was unable to make contact with his other battalions, reported that the attack was progressing "as smooth as silk". He then took one platoon from the battalion and marched west with it and the regimental headquarters section. Nearing the town he discovered that the II/1 had been all but wiped out and that the airfield was still strongly held by the Allies. A little after midnight he passed this update on to mainland Greece and launched his solitary platoon in an attack on East Hill. Facing the dug-in Black Watch battalion, the attack failed and the platoon was cut off from Bräuer. The rest of the battalion was unable to move up in time to reinforce the attack as it was delayed by having to assemble itself and retrieve its weapons containers in the dark and by attacks from Cretan civilians. These attacks were responsible for the elimination of an entire platoon, and caused approximately 200 casualties in total.
Because of the disorder at the Greek airfields the German air operations over Heraklion were ill-coordinated. The paratrooper drop continued for two to three hours, providing a succession of easy targets for Allied anti-aircraft guns. During this period no German fighters or bombers returned to suppress the ground fire. A total of 15 Ju 52s were shot down. Before the Germans had completed their drop, Chappel had already committed his reserve battalion and tanks to a counter-attack. On receipt of the initial report from Bräuer, Student gave orders for the 5th Mountain Division to be ferried by air to Heraklion airfield on the 21st. When Bräuer's subsequent report was received, Student realised that all four paratrooper assaults had failed. Determined, in the words of the historian Callum MacDonald, "to snatch victory from the jaws of defeat", he ordered that all resources be reallocated to capturing the airfield at Maleme, 100 miles (160 km) west of Heraklion.
### Seaborne contingent
Meanwhile, the 2nd Battalion of the 85th Mountain Regiment (II/85) from the 5th Mountain Division and much of the division's artillery and anti-aircraft guns had loaded onto commandeered Greek caiques at Piraeus, the port of Athens, and sailed to Milos escorted by the Italian torpedo boat Sagittario. This convoy was known as the 2nd Motor Sailing Flotilla. Alongside them was the 1st Motor Sailing Flotilla, bound for Maleme. The two flotillas totalled 70 small vessels. The plan was for both to sail for Milos, then cross from Milos to Crete while a strong escort of aircraft deterred the British navy from attacking. They were detected by Allied signal intelligence and their position confirmed by aerial observation. At nightfall on 20 May an Allied naval squadron known as Force C and consisting of two cruisers, HMAS Perth and HMS Naiad, and four destroyers, commanded by Rear Admiral Edward King, entered the Aegean via the Strait of Kasos to the east of Crete. They sailed to intercept the force believed to be heading towards Heraklion and were attacked by Italian aircraft and light ships at dusk. They found no invasion force between Milos and Heraklion, patrolled off Heraklion until dawn and then returned to the Mediterranean. En route they were attacked by German dive bombers but suffered no losses.
Wary of the Allied naval patrols, the German convoys had spent the night in the vicinity of Milos. At first light on the 21st they headed south. Student had asked Admiral Karlgeorg Schuster to divert the Heraklion-bound convoy to Maleme, in keeping with his new concentration on the latter. The caiques moved at around 6 knots (10 km/h; 7 mph) and the impressed Greek crews were suspected of not getting the best out of their vessels. At 10:00 the convoy was ordered back to Milos due to inaccurate reports of Allied ships in the area; this order was subsequently cancelled, reinstated and cancelled again. Aware of the convoy's progress through Ultra signals intercepts, the Allies sent a squadron through the Kythira Strait to the west of Crete. This was Force D, consisting of the cruisers HMS Ajax, HMS Orion and HMS Dido, and three destroyers commanded by Rear Admiral Irvine Glennie. They were unsuccessfully dive bombed as they entered the Aegean and intercepted the 1st Motor Sailing Flotilla at about 22:30. The British squadron attacked the head of the by now scattered convoy, harried by the Italian torpedo boat Lupo, which was hit repeatedly and driven off. Believing that they had destroyed the convoy, the British ships withdrew. In fact many caiques escaped in the confusion. A total of 297 German soldiers from a force of 2,000 were killed. The diminished II/85 was later airlifted into Crete and by the evening of 23 May it was fighting against the New Zealand 5 Brigade at Galatas.
Reports of this setback caused the recall of the 2nd Motor Sailing Flotilla, but these orders did not reach it until 09:30 on the 22nd. Meanwhile, Force C, reinforced by the anti-aircraft cruiser HMS Carlisle, had re-entered the Aegean the previous night to patrol off Heraklion. Not finding any shipping, the squadron searched to the west and intercepted the main escort of the 2nd Motor Sailing Flotilla, the Italian torpedo boat Sagittario, at 10:10, approximately 25 miles (40 km) off Milos. Eventually, the 2nd Motor Sailing Flotilla and its escort managed to slip away undamaged. King's warships, despite their failure to destroy the German troop transports, had succeeded in forcing the Axis to abort the landing by their mere presence at sea. King, knowing that his ships were low on anti-aircraft ammunition and feeling that he had achieved his main objective, ordered Force C to withdraw. As it headed south, Naiad was badly damaged and Carlisle set on fire by German bombers. Admiral Cunningham later criticised King, saying that the safest place during the air attack was amongst the flotilla of caiques.
### Second day and onwards
During the night of 20/21 May many of the Germans who had landed around Heraklion suffered greatly from thirst. Isolated or in small groups, many of them wounded, they were hunted by Allied fighting patrols and Cretan civilians and often pinned down by Allied fire; some contracted dysentery from drinking stagnant water. On the morning of the 21st Bräuer again attacked East Hill, hoping both to relieve the platoon which had been cut off on the hill the previous evening and to gain a position overlooking the airfield. The assaults were ill-coordinated and failed with heavy loss; the isolated platoon was overrun at around 12:00. German air attacks were renewed, but the Allies duplicated the German recognition signals, which confused the attackers. Where these ruses were seen through, the bombing was again ineffective against the well dug-in Allies.
Schulz, to the west of Heraklion, was out of contact with Bräuer, but could hear heavy firing from the east, and when he learnt that the 8th Air Corps was to bomb Heraklion at 10:00 he determined to attempt to capture the town again. He requested reinforcements from the II/2 Battalion, but this unit had heard that its missing components had been diverted to Maleme and, facing large bands of armed Cretan civilians, only sent one platoon. After Heraklion was heavily bombed the III/1 Battalion attacked the shaken Greeks via the South and West Gates, broke into the town and relieved some of the paratroopers isolated the previous evening. The Greeks ran very short of ammunition. The Germans again fought their way as far as the harbour and the Greeks negotiated a surrender of the town. Before this could be put into effect, Chappel sent reinforcements, which threatened the German flanks and forced the Germans to withdraw. He also sent the Greeks large quantities of captured German weapons and ammunition. On the 22nd, the 3rd Regiment and armed Greek civilians cleared the western and southern approaches to Heraklion, and the Black Watch also cleared the eastern approaches to the airfield. Following reports that the Germans were using civilians as human shields, the Greek military governor of Heraklion, Major-general Michail Linardakis, sent an emissary to demand that this cease, threatening to retaliate against German prisoners of war. Schulz agreed on condition that Heraklion surrender within two hours, which Linardakis refused.
When Ju 52s flew over, the Allies ceased fire and displayed captured panels requesting resupply; the confused German pilots dropped large quantities of weapons, ammunition and equipment, including two motorcycles with sidecars, into the Allied positions. Much of the German weaponry was distributed to the local Cretans. On 23 May six Hurricanes from No. 73 Squadron RAF were sent to Heraklion from Egypt, but several suffered landing damage, the facility lacked adequate fuel and ammunition for them, and they were withdrawn the next day. On the 24th four companies of paratroopers were dropped west of Heraklion to reinforce the Germans, and the town was heavily bombed in retaliation for its non-surrender on the 21st and again on the 23rd; according to MacDonald it was reduced to rubble. On the night of 24/25 May the Greek units were withdrawn to the area of Knossos for rest and refitting, and the defence of Heraklion was taken over by Commonwealth units. The 1st Battalion of the Argyll and Sutherland Highlanders (Argylls), 655 men strong, had landed at Tympaki on the south coast of Crete on 19 May; advance elements of it reached Heraklion on 25 May and eventually approximately 340 men of the battalion reinforced the 14th Brigade.
The plan of Middle East Command and the Allied HQ on Crete was that when the Argylls arrived at Heraklion, one of 14th Brigade's existing battalions would move westwards to Rethymno. Chappel, believing the German force to be stronger than it was, was content for the brigade to hold its positions, although several tanks and some artillery pieces were sent by sea to the more active fighting in the Maleme area. Chappel was in radio contact with GHQ Middle East in Cairo, but not with Freyberg, his immediate superior. On the night of 26/27 May he queried Freyberg through Cairo as to whether he should attempt to clear the routes to the west and south. The historian Antony Beevor believes that this would have been impractical due to the state of the roads and German opposition. By the time the query was sent the battle on Crete had already been lost. Meanwhile, the Germans were under constant pressure from the Cretans, despite fierce German reprisals. They were further reinforced by paratroopers landing at Gournes on the 27th.
## Allied evacuation and Greek surrender
Meanwhile, the Germans had succeeded in securing Maleme airfield, captured the port town of Chania and pushed the Allies there east and south. On 26 May Freyberg informed General Archibald Wavell, Commander-in-Chief Middle East, that the Battle of Crete was lost. The next day Wavell ordered an evacuation and that evening 14th Brigade was informed that they would be evacuated by sea on the night of 28/29 May, ships arriving at midnight and departing by 03:00. Shortly before noon on 28 May a further 2,000 paratroopers landed to the east of the brigade's position, and late that afternoon the Allies were heavily bombed for two hours. During the day, stores, equipment and heavy weapons were destroyed. The naval evacuation force consisted of three cruisers – Orion, Ajax and Dido – and six destroyers – Hotspur, Jackal, Decoy, Hereward, Kimberley and Imperial – and was commanded by Rear Admiral Rawlings. Before reaching Heraklion Ajax suffered a near miss from a bomb, which started a fire on the ship; Rawlings ordered her back to Alexandria. The embarkation went smoothly and the squadron was underway by 03:00 with approximately 4,000–4,100 evacuees on board. Many wounded and some detachments guarding road blocks were left behind. To maintain security, the Greeks had not been told of the planned evacuation of the Commonwealth forces and only one Greek soldier was evacuated.
The Imperial's steering gear broke down at about 03:45 and her crew and complement of soldiers had to be taken off at sea, at night, and she was then sunk. This delayed the squadron and they were 90 minutes behind schedule by 06:00, when they were sighted by Luftwaffe reconnaissance planes. Over the following nine hours 400 separate attacks by Junkers Ju 87 dive bombers (Stukas) were counted. The planes were based at Scarpanto, less than 90 miles (140 km) away. At 06:25, Hereward was hit and began sinking. The order to abandon ship was given and six hours later Italian MAS boats rescued 165 crew and about 400 soldiers. Fatalities are estimated at 63 crew and about 50 soldiers. Dido and Orion were both hit repeatedly by the dive bombers, causing heavy casualties. One hit blew Orion's forward turret overboard and temporarily set fire to the ship. There were also high level bombing attacks, although no hits are recorded from them. Allied fighters covered the final part of the journey, and Alexandria was reached at 20:00. Some ships had all but exhausted both their fuel and their ammunition.
The Germans occupied Heraklion on 30 May. Linardakis signed an instrument of surrender there with Bräuer and the Greek troops in the area laid down their weapons. They were taken to Maleme and Chania, but they were gradually released over the following six months.
## Aftermath
During the fighting around the town and the subsequent evacuation the Commonwealth troops of the 14th Brigade suffered 195 killed on Crete; a further 224 are known to have died during the evacuation or of wounds in Egypt. An unknown, but assumed to be small, number were too intoxicated to disembark from the Imperial and were lost when she sank. An unknown number died due to the sinking of the Hereward. An unknown number of wounded men were left on Crete and 244 wounded who subsequently survived were landed in Alexandria. An unknown number of men were taken prisoner during the fighting or captured as a result of not being evacuated, including more than 300 men of the Argylls who were still making their way from the south coast, and approximately 400 were captured as a result of the sinking of Hereward, some of whom may also have been wounded. Over 200 naval personnel were killed and 165 taken prisoner, the latter all from the Hereward. The number of Greeks killed and wounded is not known.
German losses during the battle are uncertain. They have been reported by British and Australian historians as "over 1,000 killed" on 20 May and at least 1,250 or 1,300 dead by the 22nd. Daniel Davin, in the New Zealand Official History, warns "reports of German casualties in British reports are in almost all cases exaggerated". German records do not show losses at regimental level, so their actual casualties cannot be accurately assessed. They launched four regimental assaults against Crete on the 20th, of which the attack on Heraklion was one; total German paratrooper losses throughout the campaign on Crete have been variously assessed as up to 3,022 killed and approximately 1,500 wounded in the British Official History, up to 2,818 killed and 1,505 wounded in a 2002 study for the US Army, or 3,077 killed, 2,046 wounded and up to 17 captured in the New Zealand Official History. Crete fell to the Germans, but they suffered more casualties than during the entire campaign in the Balkans until then. Almost 200 Ju 52s were put out of action. Due to their heavy losses on Crete the Germans attempted no further large-scale airborne operations during the war.
The German occupation of Crete was brutal, with 3,474 Cretan civilians being executed by firing squad and many more killed in reprisals and atrocities. Bräuer was the German commander on Crete from November 1942 to July 1944. When the war ended in May 1945 the commander of the German garrison signed the capitulation of Crete in Heraklion. After the war, Bräuer was charged with war crimes by a Greek military court. He was convicted and subsequently hanged on 20 May 1947, the sixth anniversary of the German invasion of Crete.
After its capture by the Germans, Heraklion airfield was used to transport supplies and reinforcements to Axis forces operating in North Africa in 1941 and 1942. It was successfully raided in June 1942 by Allied special forces. Since the war it has been developed as an international airport and is the second busiest in Greece, with 7 million passenger movements in 2018.
# Edmund Sharpe
Edmund Sharpe (31 October 1809 – 8 May 1877) was an English architect, architectural historian, railway engineer, and sanitary reformer. Born in Knutsford, Cheshire, he was educated first by his parents and then at schools locally and in Runcorn, Greenwich and Sedbergh. Following his graduation from Cambridge University he was awarded a travelling scholarship, enabling him to study architecture in Germany and southern France. In 1835 he established an architectural practice in Lancaster, initially working on his own. In 1845 he entered into partnership with Edward Paley, one of his pupils. Sharpe's main focus was on churches, and he was a pioneer in the use of terracotta as a structural material in church building, designing what were known as "pot" churches, the first of which was St Stephen and All Martyrs' Church, Lever Bridge.
He also designed secular buildings, including residential buildings and schools, and worked on the development of railways in north-west England, designing bridges and planning new lines. In 1851 he resigned from his architectural practice, and in 1856 he moved from Lancaster, spending the remainder of his career mainly as a railway engineer, first in North Wales, then in Switzerland and southern France. Sharpe returned to England in 1866 to live in Scotforth near Lancaster, where he designed a final church near to his home.
While working in his architectural practice, Sharpe was involved in Lancaster's civic affairs. He was an elected town councillor and served as mayor in 1848–49. Concerned about the town's poor water supply and sanitation, he championed the construction of new sewers and a waterworks. He was a talented musician, and took part in the artistic, literary, and scientific activities in the town. Also an accomplished sportsman, he took an active interest in archery, rowing and cricket.
Sharpe achieved national recognition as an architectural historian. He published books of detailed architectural drawings, wrote a number of articles on architecture, devised a scheme for the classification of English Gothic architectural styles, and in 1875 was awarded the Royal Gold Medal of the Royal Institute of British Architects. He was critical of much of the restoration of medieval churches that had become a major occupation of contemporary architects. Towards the end of his career Sharpe organised expeditions to study and draw buildings in England and France. While on such an expedition to Italy in 1877, he was taken ill and died. His body was taken to Lancaster, where he was buried. Sharpe's legacy consists of about 40 extant churches; railway features, including the Conwy Valley Line and bridges on what is now the Lancashire section of the West Coast Main Line; and his archive of architectural books, articles and drawings.
## Early life
Edmund Sharpe was born on 31 October 1809 at Brook Cottage, Brook Street in Knutsford, Cheshire, the first child of Francis and Martha Sharpe. His father, a peripatetic music teacher and organist at Knutsford parish church, came from Stamford in Lincolnshire. At the time of their marriage his wife, Martha Whittaker, was on the staff of an academy for young ladies, Belvedere House, in Bath, Somerset. During his childhood in Knutsford, the young Edmund played with Elizabeth Stevenson, the future Mrs Gaskell. In 1812 the Sharpe family moved across town from Over Knutsford to a farm in Nether Knutsford called Heathside, where Francis Sharpe worked as both farmer and music teacher. Edmund was initially educated by his parents, but by 1818 he was attending a school in Knutsford. Two years later he was a boarder at a school near Runcorn, and in 1821 at Burney's Academy in Greenwich. Edmund's father died suddenly in November 1823, aged 48, and his mother moved to Lancaster with her family, where she later resumed her teaching career.
Edmund continued his education at Burney's Academy, and became head boy. In August 1827 he moved to Sedbergh School (then in the West Riding of Yorkshire, now in Cumbria), where he remained for two years. In November 1829 he entered St John's College, Cambridge as a Lupton scholar. At the end of his course in 1832 he was awarded a Worts Travelling Bachelorship by the University of Cambridge, which enabled him to travel abroad for three years' study. At this time his friend from Lancaster at Trinity College, William Whewell, was Professor of Mineralogy. John Hughes, Edmund Sharpe's biographer, is of the opinion that Whewell was influential in gaining this award for Sharpe. Edmund graduated BA in 1833, and was admitted to the degree of MA in 1836. During his time abroad he travelled in Germany and southern France, studying Romanesque and early Gothic architecture. He had intended to travel further into northern France, but his tour was curtailed in Paris owing to "fatigue and illness". Edmund returned home to Lancaster late in 1835, having by then decided to become an architect. In December he wrote a letter to William Whewell saying that he had "finally determined to adopt the Profession of Architecture". Some sources state that Sharpe was articled to the architect Thomas Rickman. Sharpe did visit Rickman for a few days in 1832 and corresponded with him later. He may have been "acting as a research assistant" while on the Continent, but Hughes states "there is no evidence to suggest that Sharpe spent more time with Rickman, or served any kind of formal apprenticeship with him".
## Architect
### Lancaster practice
Edmund Sharpe started his practice at the end of 1835 in his mother's house in Penny Street, moving into premises in Sun Street in 1838. In October that year he took as his pupil Edward Graham Paley, then aged 15. Later in 1838 Sharpe took a house in St Leonard's Gate large enough to accommodate himself and Paley; the practice continued to use the premises in Sun Street until after Sharpe's retirement. In 1841 Thomas Austin also joined the practice as a pupil, staying until 1852 when he left to set up on his own as an architect in Newcastle upon Tyne. In 1845 Sharpe made Paley a partner, and in 1847 effectively handed the business over to him. At about this time also, John Douglas joined the firm as Paley's assistant, and stayed with the firm until about 1859, when he moved to Chester to establish his own practice. Sharpe retired completely from the practice in 1851, leaving Paley as sole principal. Also in 1851 Paley married Sharpe's sister, Frances.
### Churches
In his letter of December 1835 to William Whewell, Sharpe also mentioned that plans for at least one church, St Mark's at Witton, west of Blackburn, were already well advanced, and that he was working towards another one, St Saviour's near Bamber Bridge, south of Preston. In addition, he was in contact with the Earl of Derby with a view to designing a church for him near his seat at Knowsley, northeast of Liverpool.
Four of Sharpe's earliest churches – St Saviour, Bamber Bridge (1836–37); St Mark, Witton (1836–38); Christ Church, Chatburn (1837–38); and St Paul, Farington, near Leyland (1839–40) – were in the Romanesque style, which he chose because "no style can be worked so cheap as the Romanesque". They "turned out to be little more than rectangular 'preaching boxes'... with no frills and little ornamentation; and many of them were later enlarged". The only subsequent churches in which Sharpe used Romanesque elements were the chapel of All Saints, Marthall, near Knutsford (1839); St Mary, Conistone in Wharfedale (1846); and St Paul, Scotforth in south Lancaster (1874), the last built towards the end of his life.
By 1838 Sharpe had begun to experiment with elements of English Gothic architecture, initially in the Early English style and in particular the lancet window, dating from the early 12th century or earlier. The first church he built in this style was St John the Evangelist, Cowgill, Dent, (1837–38), followed closely by Holy Trinity, Howgill (1837–38), and then by several others in the same style. He was soon incorporating elements from later styles of English Gothic architecture, and by 1839 was designing churches using Perpendicular features, as at St Peter, Stainforth (1839–42), St John the Baptist, Bretherton, and St Peter, Mawdesley (both 1839–40).
Sharpe was one of the architects who designed churches for the Church Building Commission, which had been established by the Church Building Acts of 1818 and 1824. The resulting churches have been called Commissioners' churches, and were built to provide places of worship in newly populated areas. Sharpe designed six churches for the Commission: St John, Dukinfield, St George, Stalybridge (both 1838–40), St John the Baptist, Bretherton, St Paul, Farington, St Catharine, Scholes (near Wigan; 1839–41), and Holy Trinity, Blackburn (1837–46). He is also credited with the design of St Bridget's, Beckermet, Cumberland (1842–43).
Although some architects designed the earlier Commissioners' churches in neoclassical style, most were in Gothic Revival style. The earliest of the Gothic Revival churches were based loosely on the Early English style, with single or paired lancet windows between buttresses in the sides of the church, and stepped triple lancets at the east end. Others were in a "stilted Perpendicular" style, with "thin west towers, thin buttresses, fat pinnacles, and interiors with three galleries and plaster vaults". These features were only loosely derived from medieval Gothic architecture, and were not true representations of it. A major influence on the subsequent development of the Gothic Revival was AWN Pugin (1812–52) and, influenced by him, the Cambridge Camden Society (later named the Ecclesiological Society). Among other things, they argued that not only should Gothic be the only right and proper style for churches, but that their features should be accurate representations of that style; they should be "correct" Gothic features, rather than being loosely derived from the style. The term "pre-archaeological" was used to describe churches designed using features only loosely derived from true Gothic.
Sharpe's early Gothic Revival works were pre-archaeological, including Holy Trinity, Blackburn, built in 1837–46 for Revd JW Whittaker. Hughes expresses the opinion that, although this church is Sharpe's pièce de résistance, it contains "a mongrel mix of Gothic styles". Simultaneously Sharpe was involved in the design of about twelve more churches in north-west England, which increasingly incorporated more "correct" Gothic features. In 1841 he obtained a contract to build three churches and associated structures (vicarages and schools) for the Weaver Navigation Trustees, at Weston Point, Runcorn; Castle, Northwich; and Winsford. All three were in Cheshire, and built between 1841 and 1844. Between 1835 and 1842 Sharpe designed about 30 new churches in Lancashire and Cheshire, all to a low budget, and all to a degree pre-archaeological. In 1843 Sharpe was able to fulfil his promise to build a church for the Earl of Derby; this was St Mary, Knowsley, which was completed and consecrated the following year. It is described by Hughes as "one of Sharpe's loveliest creations". About the same time he designed a new steeple for St Michael, Kirkham; the steeple and St Mary's Church contained much more in the way of "correct" Gothic features, and both were praised by the Camden Society in The Ecclesiologist.
In the early 1840s Sharpe was invited by John Fletcher, his future brother-in-law, to build a church near Fletcher's home in Little Bolton. Fletcher was the owner of a coal mine at Ladyshore, Little Lever, overlooking the River Irwell and the Manchester, Bolton & Bury Canal. He had been using the clay which came up with the coal to make refractory bricks for furnaces, and suggested its use for building the church, as it was much cheaper than stone. Sharpe then designed the first church in England to be built, in whole or in part, from this material (terracotta), St Stephen and All Martyrs, Lever Bridge (1842–44). As terracotta is commonly used to make plant pots and the like, Sharpe himself called this church, and its two successors, "the pot churches", a nickname that has stuck. The advantages of terracotta were its cheapness, its sturdiness as a building material, and the fact that it could be moulded into almost any shape. It could therefore be used for walls, towers, arches, and arcades in a church, for the detailed decoration of capitals and pinnacles, and also, as at St Stephen's, for the furnishings, such as the altar, pulpit, font, organ case, and the pew ends. Apart from the foundations and the rubble within the walls, St Stephen and All Martyrs was constructed entirely from terracotta. The following year, a second church was built using the same material, Trinity Church, Rusholme, south of Manchester (1845–46), built and paid for by Thomas Carill-Worsley, who lived at nearby Platt Hall. In this case, although the exterior is in terracotta, the interior is of plastered brick. The church was consecrated in June 1846, although at the time work on the spire had not yet started and several other features were incomplete, including the heating, seating, and floor tiling.
Towards the end of his life, Sharpe designed one more church incorporating terracotta, St Paul, Scotforth, Lancaster (1874–76). For this he returned to the Romanesque style, and used terracotta as a building and a decorative material. By this time he was living in Scotforth, then a separate village to the south of Lancaster, but now absorbed into the city. The new church was built within 300 yards (274 m) of his home, and again terracotta was not the only material used. It is used for the dressings, windows, doorways, the upper part of the tower, and internally for the piers and arches of the aisle arcades, but the walls are of stone.
### Other structures
During his time as an architect Sharpe was also involved in the building, repair, and restoration of non-ecclesiastic structures, including houses and bridges. In 1837 he was appointed bridgemaster for the Hundred of Lonsdale South of the Sands, and in 1839 he supervised the repair of Skerton Bridge over the River Lune in Lancaster. The following year he designed a new bridge over the River Hyndburn at Fournessford, a village to the east of Wray. He had also been appointed as architect and superintendent of works for Lancaster Castle, the Judges' Lodgings, and the County Lunatic Asylum (later the Lancaster Moor Hospital). For the asylum he designed several new wings and a chapel, followed by extensions to the union workhouse. Sharpe was also involved in designing and altering several domestic buildings. In 1843 he designed a vicarage in Cockermouth, and the following year he started to remodel Capernwray Hall, a country house northeast of Lancaster. In the same year he designed the Governor's House for Knutsford Gaol, and in 1845 he re-designed Redmarshall Old Rectory for the Revd Thomas Austin, father of Sharpe's pupil (also named Thomas). Following Paley's becoming a partner in 1845, the pair worked together to design Lee Bridge in Over Wyresdale (1847), to plan the conversion of a disused manor house into the Furness Abbey Hotel (1847), and to arrange the remodelling of Hornby Castle (1847–52). In 1849–50 they planned the rebuilding and enlargement of the Charity School for Girls in Middle Street, Lancaster, followed in 1851 by the National School for Boys in St Leonard's Gate. The practice then made plans for a new building at Giggleswick School, and new premises for Lancaster Grammar School in Moor Lane, but by then Sharpe was on the point of withdrawing from the practice, and it is likely that most of the designs were prepared by Paley.
## Architectural historian
Sharpe studied and wrote about ecclesiastical architecture throughout his adult life, both sketching and measuring historical churches and ruins. This resulted in a systematic series of published drawings in twelve parts between 1845 and 1847 entitled Architectural Parallels, containing measured drawings of abbey churches in the early Gothic style, and reissued as a single work in 1848. Sharpe intended to produce a further version with text, but this never materialised. Also in 1848 a Supplement to Architectural Parallels was published, containing yet more detailed drawings. Simultaneously, Sharpe had produced the two-volume work Decorated Windows, the first volume being published in 1845, and the second in 1849. The work, which was praised by the art critic John Ruskin in The Stones of Venice, consisted largely of drawings by Sharpe's pupils – Paley, Austin, and R. J. Withers – with text by Sharpe describing and analysing the tracery of Gothic windows.
In 1851 Sharpe published a monograph entitled The Seven Periods of English Architecture, a small book of about 50 pages suggesting a new scheme for classifying the styles of English ecclesiastical architecture "from the Heptarchy to the Reformation". It was intended to replace the scheme then in use, which had been proposed in 1817 by Thomas Rickman. Rickman had divided English architecture into "four distinct periods, or styles" which he termed "Norman", "Early English", "Decorated English", and "Perpendicular English". The Norman style, lasting until about 1189, was characterised by its arches usually being semicircular, although sometimes pointed; the ornamentation was "bold and rude". The Early English style, continuing to about 1307, was distinguished by its pointed arches and long narrow windows without mullions. He called the characteristic ornamentation "toothed" because it resembled the teeth of a shark. The following period, the Decorated English, which lasted until 1377, or possibly 10–15 years later, was characterised by large windows with pointed arches containing mullions, and with tracery "in flowing lines forming circles, arches and other figures". There was much ornamentation, carved very delicately. The final period identified by Rickman, the Perpendicular English, lasted until as long as 1630 or 1640. This was distinguished by the mullions and the "ornamental panellings" running in perpendicular lines. The ornamentation was in many cases "so crowded as to destroy the beauty of the design". The carving was again "very delicately executed".
In his classification, Sharpe first identified two main classes, according to whether the arches were "circular" or "pointed". The class characterised by the circular arch was the Romanesque class; that by the pointed arch was the Gothic. He divided the Romanesque class into two periods by date rather than by stylistic differences, the dividing date being 1066; this divided the "Saxon" from the "Norman" stage. Whereas Rickman allowed pointed arches when they occurred in the same building as round arches in his Norman period, Sharpe separated buildings that contained both types of arches into a separate intermediate style, the "Transitional". When it came to the Gothic class, Sharpe identified four styles, in contrast to Rickman's three, using the windows to differentiate between them. The earliest style was characterised by windows resembling a lancet "in its length, breadth, and principal proportions". These windows might be single, or in groups of two, three, five, or seven. This style he termed the "Lancet Period". During the next period, tracery appeared in the windows, and originally consisted of simple geometric forms, in particular the circle. This period he called the "Geometrical Period". Later the tracery became more complex, including the ogee curve; the characteristic feature being the "sinuosity of form" in the windows and elsewhere. This Sharpe termed the "Curvilinear Period". Finally, the transom appeared in the windows, and the curved line in the tracery became replaced by straight lines, an "angularity of form", and a "square edge was preferred". This style he named the "Rectilinear Period". The approximate dates Sharpe gave for his periods were, following 1066, the Norman Period up to 1145, the Transitional Period to 1190, the Lancet Period to 1245, the Geometrical Period to 1315, the Curvilinear Period to 1360, and the Rectilinear Period to 1550.
In comparing the two classifications, Sharpe divides Rickman's Norman period into two, the Norman and the Transitional periods. Then Rickman has three Gothic periods in contrast to Sharpe's four. Comparing the descriptions of the styles and, approximately, the dates, Sharpe's Lancet Period corresponds generally with Rickman's Early English; and Sharpe's Rectilinear Period with Rickman's Perpendicular English. This leaves Rickman's Decorated English style divided into two periods by Sharpe according to the complexity of the tracery, the Geometrical and the Curvilinear Periods. Following the publication of the monograph, Sharpe read a paper to the Royal Institute of British Architects describing his system. The monograph and the paper led to "a bitter controversy". The debate between Sharpe and his followers on one side and supporters of Rickman's scheme on the other was published as a series of letters to the journal The Builder until the editor called a halt to the correspondence.
In the same year as Sharpe's short book, the distinguished historian Edward Augustus Freeman published An Essay on the Origin and Development of Window Tracery in England, a much larger work on essentially the same subject, which proposed the terms "Flowing" and "Flamboyant" (the latter already in use in France) where Sharpe used "Curvilinear". Although Rickman's scheme remains in general use, despite recognition of its deficiencies, Sharpe's terms "Geometrical" and "Curvilinear" are very often used in addition to distinguish styles or phases within Rickman's "Decorated". They were used by Francis Bond in his 1905 book Gothic Architecture in England, and are used in various recent works including the Pevsner Architectural Guides.
In 1869 Sharpe joined the Architectural Association, established in 1847 "by a group of dissatisfied young architects ... to provide a self-directed, independent education at a time when there was no formal training available". He then proposed and organised a series of six annual expeditions to study and draw buildings in different areas, which took place between 1870 and 1875. In 1870 the expedition was to Lincoln, Sleaford, and Spalding; in 1871 to Ely, Lynn, and Boston; the following year to Stamford, Oundle, Wellingborough, and Northampton; and in 1873 to Grantham, Newark, Southwell, Ashbourne, and Lichfield. The final two expeditions were to France: in 1874 to the northern part of the country, visiting places around Paris including Soissons, Laon, Rheims, and Chartres; the following year it was to the Charente district of southwest France, including Angoulême. In 1876 Sharpe gave a lecture on this expedition in London, linking the architecture of the region with Byzantine architecture elsewhere. Following Sharpe's death in 1877 the Association complied with his wish that the expeditions should be continued; and in 1882 it published Charente: In Memory of Edmund Sharpe, 1875.
Having been a fellow of the Royal Institute of British Architects since 1848, Sharpe was awarded its Royal Gold Medal in 1875. This was presented to him by Sir George Gilbert Scott, largely in recognition of his writings. In addition to those recorded above they include: The Architectural History of St Mary's Church, New Shoreham (1861), An Account of the Churches visited during the Lincoln Excursion of the Architectural Association (1871), The Mouldings of the Six Periods of British Architecture from the Conquest to the Reformation (1871–74), The Ornamentation of the Transitional Period of British Architecture AD 1145–90 (1871), The Ornamentation of the Transitional Period in Central Germany (1877), and The Churches of the Nene Valley, Northamptonshire (published posthumously in 1880). Other writings by Sharpe were published in The Builder and The Architect. He also delivered papers to the Architectural Association, and to the Royal Institute of British Architects. Among other subjects, he argued for restraint in the use of colour in the decoration of churches, in the painting of walls and the stonework, and in the stained glass. He was very critical of recent restorations of medieval churches, which had been a major occupation of architects during the previous 20 years, and was particularly caustic about the removal of whitewash from the interior of churches, and the damage thus caused to the underlying stonework. Between January 1874 and February 1875 Sharpe published The Architecture of the Cistercians, which dealt in considerable detail with the design and functions of Cistercian monasteries built in the 12th and 13th centuries in Britain and in Europe, most of which he had visited. In addition, Sharpe attended several meetings of the Archaeological Institute, and was a Vice-President of the British Archaeological Association.
## Railway developer and engineer
### England
While Sharpe was designing churches, he was augmenting his income by working as a sub-contractor in building railways. These were the lines between Lancaster and Preston, Lancaster and Skipton, and between Liverpool and Southport. He first became involved with the Lancaster and Preston Junction Railway in 1838, two years after Joseph Locke was appointed as engineer for the line. Sharpe submitted a tender to supply the masonry work for the "Lancaster Contract", the northern section of the line; and Peter Perry from Durham submitted a tender for the earthwork. Locke insisted that both earthwork and masonry work should be under one contract, which Perry accepted and subcontracted the masonry work to Sharpe. Subsequently, Perry reneged on his part of the contract, resulting in serious disputes between Sharpe, Locke, and the directors of the railway company concerning the costs involved and the quality of the work. The masonry for this section of the line included 15 under-and-over bridges and the six-arch viaduct over the River Conder at Galgate. The eventual outcome of the conflict was that Sharpe was dismissed from the work in 1839 with agreed financial compensation, having built most but not all of these structures.
Sharpe's next venture into railway building came in 1845 when, with others, he promoted the building of a cross-country line from Lancaster to Skipton to join the Midland Railway in the West Riding of Yorkshire. This became known as the "Little" North Western Railway ("L"NWR), with projected branches joining the Lancaster and Carlisle Railway (then under construction) at points near Milnthorpe and Orton. In the event the Milnthorpe branch was dropped during the committee stage of the passage through Parliament of the enabling Bill, leaving the Lancaster and Orton branches intact, parting at Ingleton and making much use of the Lune Valley.
About this time, the amount of trade handled by the Port of Lancaster was declining, largely owing to silting up of the River Lune. In May 1842 Sharpe had been elected a Port Commissioner, and later proposed what became the Morecambe Bay Harbour Project. This planned to build a new port at Poulton-le-Sands (soon to become part of Morecambe), and link it to Lancaster by means of a ship canal. After prolonged discussion this proved to be too expensive, and it was agreed to link Lancaster and Morecambe by railway rather than by canal. An Act for the creation of the Morecambe Harbour and Railway Company (MH\&R) received Royal assent in July 1846, the revised plan being to link this line to the "L"NWR at Green Ayre, in the northern part of Lancaster next to the River Lune. A clause in the Act allowed the MH\&R to be sold to the "L"NWR, which took place in October. The parts played by Sharpe in all of this financial manoeuvring were conflicting and complex: he was simultaneously a Port Commissioner, a Town Councillor, a member of the board of the Morecambe Bay Harbour Company, and Secretary to the "L"NWR. In 1847, near the Morecambe terminus of the railway, Sharpe laid the first stone of the North Western Hotel (later the Midland), which he (or more probably Paley) had designed. In April that year Sharpe had resigned as Secretary to the "L"NWR to enable him to tender for building the line from Morecambe to Wennington, a village north-east of Lancaster near to the Yorkshire border. His tender of £100,000 for the line (excluding the bridge over the River Lune at Green Ayre) was accepted. He also gained the contract for building the harbour. In June 1848 the section of line from Lancaster to Morecambe was opened, and by October 1849 the ten-mile section from Lancaster to Wennington was completed. In September Sharpe had also resigned as a director of the "L"NWR to become its traffic manager, and was then contracted to manufacture and supply rolling stock for the railway, something for which he had neither expertise nor previous experience. By February 1851 the line was experiencing difficulties, its traffic being less than expected and its costs rising; and in December Sharpe was given notice that his contract with the company would be curtailed the following month.
Sharpe then turned his attention to the Liverpool, Crosby and Southport Railway (LCSR) and acted as its company secretary. When in 1854 the Lancashire and Yorkshire Railway discontinued leasing its rolling stock to the LCSR, Sharpe arranged the manufacture of its own locomotives and carriages. Also in 1854 he submitted proposals for a branch line running from Bootle to the North Docks in Liverpool, part of which was built in March 1855, though the project was never completed.
### North Wales
In early 1856 Sharpe moved with his family to Llanrwst, North Wales with the intention of building a railway along the Conwy Valley. The prospectus for a line running from Conwy to Llanrwst was published in 1858, with Sharpe named as its engineer. The intention for the full line was to build it from the Chester and Holyhead Railway to Betws-y-Coed, passing through Llanrwst; it would be 15 miles (24 km) long, with a gauge of 3 feet 3 inches (991 mm). A series of discussions and negotiations followed, resulting in changes to the route of the line from the west to the east side of the river, building it to the standard gauge (4 ft 81⁄2in (1,435mm)), and running from Conwy only as far as Llanrwst. Construction started on 27 August 1860, and the railway was opened on 17 June 1863. An extension of the line to Betws-y-Coed was completed in 1868, but by this time Sharpe and his family had moved to Geneva.
### Abroad
In 1860 a horse-drawn tramway had been built by Charles Burn, an Englishman, in Switzerland between Geneva and Carouge, a distance of about 4 miles (6 km). This proved to be a success, and Burn planned to build more lines. In 1863 he was joined by Sharpe as a partner, but after a short time of working together the partnership was dissolved, and Sharpe continued with the project alone. By March 1864 a line from the centre of Geneva to Chêne-Bougeries, a distance of about 6 miles (10 km) was under construction, to an innovative design. The line to Carouge had two grooved rails. Sharpe's line had two flat rails, with a third grooved rail between them, along which ran a wheel allowing the tram to be steered. The wheel could also be raised to permit the tram to deviate from the track to pass around obstacles, or come to the pavement. This line was Sharpe's sole venture in Switzerland.
In August 1863 Sharpe was granted the concession for building a railway line in southern France from Perpignan to Prades in the Pyrenees, a distance of 26 miles (42 km). Work on the line began in 1865, but proceeded very slowly; progress was blocked by local landowners, legal processes, and financial problems. Sharpe was managing the project largely from Paris, through a series of agents. By the latter part of 1864 the stress was adversely affecting his health, so in 1865 he spent some time in Italy to recuperate. Following his return the difficulties continued to mount, and in 1867 he renounced his concession. The line was eventually taken over by the State, and was not fully completed until about 1877. At some point Sharpe bought property and iron ore mines along the route of the line.
## Civic life and sanitary reform
Concurrently with designing churches and building railways, Sharpe was heavily involved in the civic life of Lancaster, particularly in pioneering sanitary reform. By political persuasion he was a Conservative, and in 1837 he joined the local Heart of Oak Club, the core of the Lancaster Conservative Association. He was elected a town councillor for Castle Ward in 1841, a post he held for ten years, and in 1843 was appointed the town council's representative on the local Police Commission. He was also a visitor to the national schools, and in November 1848 he was elected as mayor for a year, at that time a position more like that of a "chief magistrate". Through these offices he became aware of the unsatisfactory state of sanitation in the town, and resolved to improve it. The town was overcrowded and suffered from poor housing, open sewers, overflowing cesspits, and a very poor water supply, mainly from wells polluted by infiltration. Many people suffered from typhus, and in 1848 there was an outbreak of cholera. The Police Commission had been established in Lancaster in 1825 with a wider role than suggested by its title, including "cleansing, lighting and watching" the town. However, there was constant friction between the Police Commission and the Town Council, the former tending to block any necessary reforms on the grounds of cost to the ratepayers. The conflict was unresolved until the two bodies merged in 1849. The functions of the new body included the establishment of the first Lancaster Board of Health.
Before, during and after his mayoralty, Sharpe played a major role in promoting sanitary reform, often meeting considerable opposition and needing to use his oratorical, political and persuasive skills to the full. A campaign to deal with the problems had been initiated in 1847 by two Lancaster doctors, Thomas Howitt and Edward Denis de Vitre. Sharpe joined them, drawing extensively on his experience of having accompanied Professor Richard Owen (born and educated in Lancaster) on his tour of inspection of the town in 1844. In 1848 Robert Rawlinson, also from Lancaster, was appointed as local surveyor, and published a further report that recommended new sewers and drains and the construction of a waterworks. Although Sharpe agreed in principle with the report, he was not satisfied with its details. Later that year, which was during his mayoralty, he travelled to London with the town clerk and a former mayor to meet representatives of the General Board of Health, including its chairman, Lord Morpeth, and its secretary Edwin Chadwick. As a result of this meeting, the Board of Health appointed James Smith from Scotland as an inspector, and commissioned him to produce a further report on Lancaster's problems. Smith's investigation took place in January 1849, and his report was received in July. In his conclusions, Smith noted that Lancaster was favourably situated to provide a healthy environment for its inhabitants, and that this could be achieved by "a complete and constant supply of pure and soft water, and ... a thorough system of drainage and sewerage". Subsequently, an Act of Parliament gave approval for these measures to be carried out, and in 1852 royal assent was given for the waterworks to be constructed. Delays, disputes and controversies continued, until the waterworks was eventually opened in 1855, when work on the drainage and sewage systems was already under way. This enabled underground pipes for the two systems to be laid simultaneously. Sharpe had played a significant part in arranging Queen Victoria's visit to Lancaster in October 1851, and with Paley designed four triumphal arches for the occasion. He also took part in the proceedings on the day, escorting the Queen, Prince Albert, and the Prince of Wales (the future King Edward VII) to the top of the castle tower.
In 1859 Sharpe was appointed as a Justice of the Peace for Lancashire and for Denbighshire. Shortly after his return to Lancaster in 1866 he again became involved in local politics. In 1867 the constituency of Lancaster was disfranchised because of corruption, and so lost its two members of parliament. Sharpe wrote a long letter to Benjamin Disraeli (Chancellor of the Exchequer, and responsible for the Reform Act of that year), arguing the case for reinstating Lancaster as a parliamentary constituency, and putting forward his own proposals for electoral reform. His letter received no reply, and Lancaster remained without parliamentary representation for the next 20 years.
## Personal and family life
On 27 July 1843 Sharpe married Elizabeth Fletcher, second sister of John Fletcher, at Bolton Parish Church. The couple had five children: Francis in 1845, Edmund junior (known as Ted) in 1847, Emily in 1849, Catherine (known as Kate) in 1850, and Alfred in 1853.
When Sharpe moved his family from Lancaster to live in North Wales in early 1856 he was aged 47. The seven years he spent there were later described, in a Memoir published in 1882 by the Architectural Association, as "perhaps the happiest years of his life". The family initially lived in a semi-detached house called Bron Haul near Betws-y-Coed, on what is now the A5 road. Two years later he bought a larger property called Coed-y-Celyn on the east bank of the River Lledr, about a mile south of Betws-y-Coed. After moving to Geneva, the family lived for about three years in a rented property called Richemont on the road from Geneva to Chêne-Bougeries. Finally in 1866 the family moved back to Lancaster to live in Scotforth, then a small village to the south of the town.
Elizabeth Sharpe died on 15 March 1876, a month after the consecration of St Paul, Scotforth, where a plaque to her memory can be found in the chancel of the church. A year later, Sharpe travelled to northern Italy with his two daughters, his youngest son Alfred, and three research assistants, to make drawings of 12th-century churches in the region. During the trip he became seriously ill with a chest infection and died on 8 May, in or near Milan. His body was taken to Lancaster, where he was buried on 19 May, alongside his wife, in the municipal cemetery. "Glowing obituaries" were carried by the local newspapers and the architectural press, including The Builder, The Building News, and The Architect. His estate was valued at "under £14,000". A plaque to his memory was placed in the chancel of St Paul's, next to that of his wife.
## Other interests
Throughout his life, Sharpe took an interest in sport, as an active participant and as an organiser. At Cambridge, he was a member of the Lady Margaret Boat Club, and coxed the college boat. Back in Lancaster, he took up archery, joined the John O'Gaunt Bowmen, played cricket and coxed. In June 1841 he helped to found the Lancaster Lunesdale Cricket Club and the Lancaster Rowing Club. Sharpe was also an accomplished musician, and a member of the committee that organised the Lancaster Choral Society's first concert in September 1836. The society thrived for a number of years, and for a time Sharpe was its conductor. By the beginning of 1837 he was a member of the Lancaster Literary, Scientific, and Natural History Society, giving a number of talks to the society, and eventually becoming a committee member. That same year he became the secretary and treasurer of the Lancaster Institution for the Encouragement of the Fine Arts, and in April 1840 he joined the committee of Lancaster's Protestant Association. In 1842 he was part of a committee promoting congregational singing, and he gave an illustrated series of lectures on its history and merits. His love of music continued throughout his life, and included training choirs, composing hymn tunes, and manufacturing musical instruments similar to small harmoniums.
In early 1843 Sharpe bought Lancaster's Theatre Royal (now the Grand Theatre), the third-oldest extant provincial theatre in Britain, which had opened in 1782. He spent £680 on converting it into the Music Hall and Museum. It was the only place in Lancaster, other than the churches, able to accommodate 400 or more people, and so was used for a variety of purposes, including concerts, lectures, and religious meetings. In 1848 Sharpe founded the Lancaster Athenaeum, a private society for "the promotion of public entertainment and instruction", to which end it organised lectures on literary and scientific subjects, concerts and exhibitions. It held its meetings in the Music Hall, which was at one period renamed the Athenaeum. In 1852 Sharpe became the proprietor of the Phoenix Foundry on Germany Street, which among other things supplied cast iron pipes for the Lancaster waterworks, sewers and drains, and shells for the Crimean War.
## Appraisal
Hughes considers that Sharpe was never in the "first division" of 19th-century church architects; his designs were "basic, workmanlike, and occasionally imaginative, though hardly inspiring". There is no such thing as a "typical" church designed by Sharpe. He was an innovator and experimenter, and throughout his life a student of architecture. The architectural styles he used started with the Romanesque, passed through "pre-archaeological" Gothic to "correct" Gothic, and then back to Romanesque for his last church. The sizes of the churches varied, from the small simple chapels at Cowgill and Howgill to the large and splendid church of Holy Trinity, Blackburn. During Sharpe's earlier years in practice, between 1838 and 1842, Britain was going through a period of severe economic recession, which may have been why he designed many of his churches to be built as cheaply as possible.
As an architectural historian, Hughes considers Sharpe to be "in the top rank". His drawings of authentic Gothic buildings were still in use a century after his death. The architectural historian James Price states that Sharpe was "considered the greatest authority on Cistercian Abbeys in England". Some writers have regarded Sharpe as an early pioneer of the Gothic Revival, although in Hughes' opinion this is "probably more for his books than for his buildings". In 1897, 20 years after his death, Sharpe was considered to be sufficiently notable to merit an entry in the Dictionary of National Biography. In the article, the author refers to his being "an enthusiastic and profound student of medieval architecture". As a railway engineer he was "hardly an unqualified success"; but his administrative and persuasive skills were considerable, as is shown in his planning of railways in Northwest England, and in the sanitary reform and water supply of Lancaster. As an amateur musician his "gifts were prodigious". Hughes considers that Sharpe "used his talents to the full", and in view of the ways in which he employed his many gifts, Price describes him as Lancaster's "Renaissance man".
## See also
- List of architectural works by Edmund Sharpe
- List of works by Sharpe and Paley
- Sharpe, Paley and Austin
# Arch of Remembrance
The Arch of Remembrance is a First World War memorial designed by Sir Edwin Lutyens and located in Victoria Park, Leicester, in the East Midlands of England. Leicester's industry contributed significantly to the British war effort. A temporary war memorial was erected in 1917, and a committee was formed in 1919 to propose a permanent memorial. The committee resolved to appoint Lutyens as architect and to site the memorial in Victoria Park. Lutyens's first proposal was accepted by the committee but was scaled back and eventually cancelled due to a shortage of funds. The committee then asked Lutyens to design a memorial arch, which he presented to a public meeting in 1923.
The memorial is a single Portland stone arch with four legs (a tetrapylon or quadrifrons), 69 feet 4¼ inches (21 metres) tall. The legs form four arched openings, two large on the main axis, 36 feet (11 metres) tall, oriented north-west to south-east, and two small on the sides, 24 feet (7.3 metres) tall. At the top of the structure is a large dome, set back from the edge. The main arches are aligned so the sun shines through them at sunrise on 11 November (Armistice Day). The inside of the arch has a decorative coffered ceiling and the legs support painted stone flags which represent each of the British armed forces and the Merchant Navy. The arch is surrounded by decorative iron railings, and complemented by the later addition of a set of gates at the University Road entrance to the park and a pair of gates and lodges at the London Road entrance—the war memorial is at the intersection of the paths leading from the two entrances.
With a large budget devoted entirely to the structure, the result is one of Lutyens's largest and most imposing war memorials. It dominates Victoria Park and the surrounding area, and can be seen from the main southward routes out of the city (though building work in the intervening years has reduced the area from which it is visible). The memorial was unveiled on 4 July 1925 by two local widows in front of a large crowd, including Lutyens. It cost £27,000, though the committee was left with a funding shortfall of £5,500, which several members made up from their own pockets; the committee was sharply criticised in the local press for its handling of the campaign. The arch is a Grade I listed building and since 2015 has been part of a national collection of Lutyens's war memorials.
## Background
In the aftermath of the First World War and its unprecedented casualties, thousands of war memorials were built across Britain. Amongst the most prominent designers of memorials was Sir Edwin Lutyens, described by Historic England as "the leading English architect of his generation". Lutyens established his reputation designing country houses for wealthy clients, but the war had a profound effect on him; following it, he devoted much of his time to memorialising its casualties. He became renowned for his commemorative works through his design for The Cenotaph in London, which became Britain's national war memorial. This, along with his work for the Imperial War Graves Commission (IWGC), led to commissions for war memorials across Britain and the Empire.
Victoria Park is a 35-hectare (86-acre) area of open land to the south-east of Leicester city centre. Formerly a racetrack, it was laid out as a public park in the late 19th century. At the beginning of the First World War, five part-time Territorial Force units were based in Leicester, along with elements of the regular Leicestershire Regiment. The special reserve battalion of the Leicestershire Regiment was sent to man coastal defences near Hull, while all five territorial units were sent to the front. Among them was the city's former Member of Parliament (MP), Eliot Crawshay-Williams, who served in the Middle East with the 1st Leicestershire Royal Horse Artillery. Recruitment to the army was lower in Leicester than in other English industrial towns, partly because of low unemployment in the area—the town's major industries were textile and footwear manufacturing, both of which were necessary for the war effort. Later in the war, many of the town's factories were given over to munitions production; Leicester produced the first batch of howitzer shells made by a British company that had not manufactured ammunition before the war. The local authorities held recruiting rallies as the war progressed, aided by William Buckingham, a local soldier who won the Victoria Cross at the Battle of Neuve Chapelle in 1915.
Leicester was granted city status by King George V in 1919, in recognition of its industries' contribution to the British war effort. The king and Queen Mary visited Leicester that summer, calling at several businesses in the city. In De Montfort Hall, the king presented gallantry medals to several servicemen who had yet to receive them, and the lord mayor was knighted, after which the king was honoured with a march past by local soldiers and demobilised veterans in the adjacent Victoria Park. As well as members of the public, the parade was viewed by thousands of disabled veterans, Voluntary Aid Detachment nurses, and war widows and orphans. Such was the size of the force on parade that it took 45 minutes to proceed past the royal pavilion.
## Commissioning
A temporary war memorial was placed outside Leicester Town Hall in 1917. A public meeting was held on 14 May 1919 (the fighting having ceased with the armistice of 11 November 1918), which led to the creation of a War Memorial Committee of 23 members to propose a suitable permanent memorial. The committee was chaired by Henry Manners, 8th Duke of Rutland, with Sir Jonathan North (the lord mayor of Leicester) as vice-chair. Two sub-committees were established, one to look after finance and the other to supervise the design. The Duke of Rutland suggested siting the permanent memorial outside the town hall, but this was rejected unanimously by the city council; the committee then examined potential sites at Leicester Castle and Victoria Park.
A suggestion from a member of the public was examined by the design sub-committee, but in October 1919 the full committee resolved to appoint Lutyens as architect and to build the memorial in Victoria Park, which had been in the ownership of the city council since the 1860s and was laid out as a public park in 1883. Lutyens visited on 20 October 1919 and was accompanied by the duke and other committee members on an inspection of the chosen site. The original plan involved crossing avenues of lime trees to create a tree cathedral, with a cenotaph (identical to the one in London) at the western end, and a Stone of Remembrance at the crossing, within a circular walled enclosure, which would be inscribed with the names of the dead. The paths along the plan of the cathedral would be paved to accentuate the purpose of the structure. This proposal was accepted, and a model was made and displayed in the city museum. By March 1922, the project had been scaled back due to a shortage of funds and lack of public enthusiasm for the project—the costs were estimated at £23,000, of which only around £4,300 had been raised. At a public meeting on 29 March, the committee agreed to abandon the scheme and that "a memorial worthy of the city be erected on the ground near the main entrance gates".
Two days later, the committee asked Lutyens to design a memorial arch. Lutyens advised that such an arch would cost in the region of £25,000; he suggested they consider alternatives, such as an obelisk (which he estimated would be around half the cost) but the committee decided to proceed with the arch despite the cost. They presented the new design to another public meeting in May 1923. Lutyens told the meeting that the arch represented the city's triumphal spirit, and he announced the name "Arch of Remembrance". The name was chosen to avoid the impression that the memorial would be a triumphal arch, something the committee felt was incompatible with the mood of mourning for the dead.
The new proposal was approved; construction of the revised memorial started in 1923 and was completed by 1925. The structure was begun by Nine Elms Stone and Masonry Works, and completed by Holloway Brothers (who built several other memorials for Lutyens, including Southampton Cenotaph). Due to a continuing shortfall of funding, the War Memorial Committee took out a bank loan to pay for the works to be completed. Five committee members served as guarantors.
## Design
The memorial, in Portland stone, is a square-plan arch with four legs (piers; a tetrapylon or quadrifrons) which dominates the surrounding level ground. It is 69 feet 4¼ inches (21 metres) tall, with large arched openings on the main axis (north-west to south-east), and smaller, lower arches on the north-east and south-west sides. The widths, heights and depths of the arches are in simple 2:4:1 proportions: the larger arches are 18 feet (5.5 m) wide, 36 feet (11 m) tall and 9 feet (2.7 m) deep; and the smaller arches are 12 feet (3.7 m) wide, 24 feet (7.3 m) tall and 6 feet (1.8 m) deep. Stone wreaths are carved in relief on the legs at the front (north-west side, facing University Road) and rear of the largest arch; inside these are carved the dates of the First World War: MCM XIV (1914) on the left side, and MCM XIX (1919) on the right. The structure is topped with a dome (attic), stepped back and concave at the front and rear. The city's coat of arms is carved in relief on the rear, surrounded by large swags.
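As a quick arithmetic check (illustrative only, using the imperial dimensions quoted above), both sets of openings reduce to the same 2:4:1 width:height:depth ratio:

```python
# Illustrative check that both arch openings share the 2:4:1
# width : height : depth proportions described above (dimensions in feet).
from functools import reduce
from math import gcd

def simplest_ratio(*dims: int) -> tuple:
    """Reduce a set of dimensions to their simplest integer ratio."""
    divisor = reduce(gcd, dims)
    return tuple(d // divisor for d in dims)

main_arches = (18, 36, 9)   # width, height, depth of the larger openings
side_arches = (12, 24, 6)   # width, height, depth of the smaller openings

print(simplest_ratio(*main_arches))  # (2, 4, 1)
print(simplest_ratio(*side_arches))  # (2, 4, 1)
```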
The larger arches on the main axis form a coffered, barrel vault ceiling, crossed by the lower arches to either side. The main axis is aligned so the sun would have been at its centre at sunrise on Armistice Day, 11 November (trees and buildings to the south have lowered the apparent horizon since the memorial was built, meaning the sun appears to be slightly off-centre). Four painted stone flags are set inside the archway, raised on corbels on the inside of the legs: the Union Flag (representing the British Army) and the flag of the Royal Navy (the White Ensign) at the front, and the flags of the Merchant Navy (the Red Ensign), and Royal Air Force (the Royal Air Force Ensign) at the rear. Painted stone flags are a recurring feature in Lutyens's war memorial designs; he first proposed them for the Cenotaph, where they were rejected in favour of fabric, though they feature on several of his other designs besides Leicester (other examples include Northampton War Memorial and Rochdale Cenotaph).
Above the front arch (facing University Road) is the inscription GLORY TO GOD IN THE HIGHEST AND ON EARTH PEACE and on the opposite side (facing the park), ALL THEY HOPED FOR, ALL THEY HAD, THEY GAVE TO SAVE MANKIND – THEMSELVES THEY SCORNED TO SAVE from the hymn "O Valiant Hearts". Inscriptions lower down, facing into the park, were added later to display the dates of the Second World War: MCM XXXIX (1939) and MCM XLV (1945). The side arches also have inscriptions. The north-east arch (left, when viewed from the direction of University Road) reads REMEMBER IN GRATITUDE TWELVE THOUSAND MEN OF THIS CITY AND COUNTY WHO FOUGHT AND DIED FOR FREEDOM. REMEMBER ALL WHO SERVED AND STROVE AND THOSE WHO PATIENTLY ENDURED; the right (south-west) arch contains an excerpt from William Blake's poem "And did those feet in ancient time": I WILL NOT CEASE FROM MENTAL FIGHT NOR SHALL MY SWORD SLEEP IN MY HAND TILL WE HAVE BUILT JERUSALEM IN ENGLAND'S GREEN AND PLEASANT LAND.
The memorial is encircled by iron railings, which are pierced by four pairs of stone piers supporting gates opposite each arch (the arch was not intended to be passed through and the gates are kept closed). The piers are decorated with meanders (Greek key patterns) and swags and topped by stone urns, similar to the one on Lutyens's Royal Berkshire Regiment War Memorial in Reading.
Arches are a relatively uncommon form of memorial, particularly for the First World War. Leicester's is one of three by Lutyens and the only one in Britain, the other two being the Thiepval Memorial on the Somme in France (unveiled in 1932) and the India Gate (originally named the All India War Memorial, unveiled in 1931) in New Delhi. The India Gate in particular bears a close resemblance to the Arch of Remembrance, though it is nearly twice as tall; Thiepval is a much more complex structure, using multiple interlocking arches to form one, much larger, arch. Lutyens proposed an arch with a dome similar to Leicester's for an IWGC memorial at Saint-Quentin in France in 1924, though this was later abandoned in favour of the Thiepval Memorial. The three arches that were built and the abandoned proposal all share a strong visual resemblance.
### Setting
The arch is situated on the highest point of Victoria Park, dominating its surroundings. It is visible for a considerable distance down Lancaster Road (which leads to the park from the city centre), and from London Road (the A6) and Welford Road, the two main routes out of Leicester to the south. At the time of the memorial's construction, the area was much more open and the arch would have been visible from a greater distance, including from the railway to the south-west. Development in the area through the 20th century, including the buildings of the University of Leicester, now partially obscures the view.
The setting was enhanced when, following the death of his wife in the 1930s, North commissioned Lutyens to design two processional entrances to Victoria Park, leading to the war memorial, as a gift to the city. Lutyens produced a pair of lodges and gates at the Granville Road entrance to Victoria Park, to the north-east of the memorial, and a set of gates and gate piers to the north-west, leading out onto University Road. The lodges are single-storey rectangular pavilions which flank the gates. The external walls are stuccoed, giving the effect of ashlar, with quoins at the angles and large sash windows. Both have architraves above the doorways and a pulvinated frieze below the pyramidal slate roofs and large chimney stacks. The four gate piers are made of ashlar, matching the lodges. They support ornate iron gates which feature an overthrow incorporating Leicester's coat of arms. The gate piers at the University Road entrance are in Portland stone, matching the memorial, decorated with Tuscan pilasters and topped with an entablature and tall urns. They support two smaller pedestrian gates, one each side of a central pair. Above the central gates is an overthrow, again featuring the city's coat of arms. A processional way leads from both entrances to the war memorial, where the two paths meet. The 150-metre (490-foot) long path from the memorial to the University Road gates is known as the Peace Walk (formerly War Memorial Approach) and is lined by shrub borders and formal flower beds.
The access from the London Road entrance to Victoria Park was laid out in 2016 as a ceremonial approach to the war memorial.
## History
The arch was unveiled on 4 July 1925 by two local widows, Mrs Elizabeth Butler and Mrs Annie Glover, in front of 30,000 people, including Lutyens and local dignitaries. Eight of Butler's sons served in the army during the war, of whom four were killed in action; Glover lost three sons, along with two nephews and two brothers-in-law. The memorial was dedicated by Cyril Bardsley, Bishop of Peterborough, to the 12,000 men from Leicester and Leicestershire killed during the First World War. The total cost of the memorial was £27,000, of which £1,635 was Lutyens's fee and expenses. At the time of the unveiling, only £16,000 had been raised and by the end of 1925, the committee still had a shortfall of £5,500, which the five guarantors repaid to the bank out of their own pockets. The sum spent was similar to that raised for Rochdale Cenotaph, but the committee in Leicester decided to spend the entire sum on a monument (rather than a fund for wounded servicemen or war widows as in Rochdale), with the result that Leicester's is Lutyens's largest war memorial in Britain. It is described by Historic England as "the most imposing of Lutyens' English war memorials" and by Lutyens's biographer Christopher Hussey as one of Lutyens's "most spectacular" memorials, "in appearance and setting". Another biographer, Tim Skelton, laments that the memorial could have been yet more impressive had the commissioning process been smoother.
Reporting on the unveiling, the local newspaper, the Leicester Advertiser, praised the design but stridently criticised the war memorial committee, describing it as a "disgrace" that
> nearly seven years after the cessation of hostilities we should be touting around to get money to pay for what should have been bought and paid for at least five years ago. It could have been obtained then quite easily, but dilatoriness on the part of those who had control and a lack of tact in dealing with the public caused the whole thing to fall flat.
The paper went on to compare the scheme with the carillon erected as a war memorial in the nearby town of Loughborough, noting that "Leicester, though some eight times as big as Loughborough, has had a struggle to raise as much money as Loughborough has already spent".
A ceremony is held at the memorial every year on Remembrance Sunday. In 2017, the memorial was twinned with the India Gate in New Delhi to honour members of the Indian Labour Corps who served in the First World War. As part of the ceremonies, India's high commissioner to Britain laid a wreath at the Arch of Remembrance and Britain's high commissioner to India laid one at the India Gate. In 2018, Leicester City Council commissioned photography of the arch using a drone to reach parts of the memorial that cannot be viewed from the ground.
The arch was designated a Grade II\* listed building in 1955 and upgraded to Grade I in 1996. The gates and gate piers leading to University Road are separately listed at Grade II\*. Victoria Park itself is listed at Grade II on the Register of Historic Parks and Gardens. Listed status provides legal protection from demolition or modification; Grade II\* is applied to "particularly important buildings of more than special interest" and applies to about 5.5 per cent of listed buildings. Grade I is reserved for buildings of "exceptional interest" and applied to only 2.5 per cent of listings. The Arch of Remembrance was one of 44 works included in a national collection of Lutyens's war memorials, designated by Historic England in November 2015 as part of commemorations for the centenary of the First World War.
## See also
Other war memorials:
- Anglo-Boer War Memorial (Johannesburg), an earlier war memorial arch by Lutyens
- Midland Railway War Memorial, another Lutyens memorial, in nearby Derby
- City War Memorial, Nottingham, another First World War memorial arch in a nearby city
Lists:
- Grade I listed buildings in Leicester
- Grade I listed war memorials in England
# Master System
The Master System is an 8-bit third-generation home video game console manufactured and developed by Sega. It was originally a remodeled export version of the Sega Mark III, the third iteration of the SG-1000 series of consoles, which was released in Japan in 1985 with graphical capabilities improved over its predecessors. The Master System launched in North America in 1986, followed by Europe in 1987, and then Brazil and South Korea in 1989. A Japanese version of the Master System was also launched in 1987, which features a few enhancements over the export models (and by proxy the original Mark III): a built-in FM audio chip, a rapid-fire switch, and a dedicated port for the 3D glasses. The Master System II, a cheaper model, was released in 1990 in North America, Australasia and Europe.
The original Master System models use both cartridges and a credit card-sized format known as Sega Cards. Accessories include a light gun and 3D glasses that work with specially designed games. The later Master System II redesign removed the card slot, turning it into a strictly cartridge-only system, and is incompatible with the 3D glasses.
The Master System was released in competition with the Nintendo Entertainment System (NES). Its library is smaller than the NES's, with fewer well-reviewed games, due in part to Nintendo's licensing policies requiring platform exclusivity. Though the Master System had newer, improved hardware, it failed to overturn Nintendo's significant market share advantage in Japan and North America. However, it attained significantly greater success in other markets, including Europe, Brazil, South Korea and Australia.
The Master System is estimated to have sold between 10 million and 13 million units worldwide. In addition, Tectoy has sold 8 million licensed Master System variants in Brazil. Retrospective criticism has recognized its role in the development of the Sega Genesis and a number of well-received games, particularly in PAL (including PAL-M) regions, but is critical of its limited library in the NTSC regions, which were dominated by the NES.
## History
### Mark III
On July 15, 1983, Sega released its first video game console, the SG-1000, in Japan, the same day its competitor Nintendo launched the Famicom. In 1984, parent company Gulf and Western Industries divested its non-core businesses including Sega, and Sega president Hayao Nakayama was installed as CEO. Sega released another console, the SG-1000 II, featuring several hardware alterations, including detachable controllers. Nakayama and Sega co-founder David Rosen arranged a management buyout with financial backing from CSK Corporation and installed CSK CEO Isao Okawa as chairman.
Hoping to better compete with Nintendo, Sega released another console, the Sega Mark III, in Japan in 1985. The Mark III was a redesigned version of the SG-1000. It was engineered by the same team, including Hideki Sato and Masami Ishikawa, who had worked on the II and later led development of the Sega Genesis. According to Sato, the console was redesigned because of the limitations of the TMS9918 graphics chip in the SG-1000, which did not have the power for the kinds of games Sega wanted to make. The Mark III's chip was designed in-house, based around the unit in Sega's System 2 arcade system board.
The Sega Mark III was released in Japan in October 1985 at a price of ¥15,000. Though its hardware was more powerful than the Famicom's, the Mark III was not successful at launch. Problems arose from Nintendo's licensing practices with third-party developers, whereby Nintendo required that games for the Famicom not be published on other consoles. Sega developed its own games and obtained the rights to port games from other developers, but they did not sell well.
### North American release as Master System
Though the SG-1000 had not been released in the United States, Sega hoped that their video game console business would fare better in North America than it had in Japan. To accomplish this, Sega of America was established in 1986 to manage the company's consumer products in North America. Rosen and Nakayama hired Bruce Lowry, Nintendo of America's vice president of sales. Lowry was persuaded to change companies because Sega would allow him to start his new office in San Francisco. He chose the name "Sega of America" for his division because he had worked for Nintendo of America and liked the combination of words. Initially, Sega of America was tasked with repackaging the Mark III for a Western release. Sega of America rebranded the Mark III as the Master System, similar to Nintendo's reworking of the Famicom into the Nintendo Entertainment System (NES). The name was chosen by Sega of America employees throwing darts against a whiteboard of suggested names. Plans to release a cheaper console, the Base System, also influenced the decision. Okawa approved of the name after being told it was a reference to the competitive nature of both the video game industry and martial arts, in which only one competitor can be the "Master". The console's futuristic final design was intended to appeal to Western tastes. The North American packaging was white to differentiate it from the black NES packaging, with a white grid design inspired by Apple computer products.
The Master System was first revealed in North America at the Summer Consumer Electronics Show (CES) in Chicago in June 1986. It was initially sold in a package with the "Power Base" console, a light gun, two controllers, and a pack-in multicart. The console was launched in September 1986 at a price of $200, including the games Hang-On and Safari Hunt. Nintendo was exporting the Famicom to the US as the NES, and both companies planned to spend $15 million in late 1986 to market their consoles; Sega hoped to sell 400,000 to 750,000 consoles in 1986. By the end of 1986, at least 125,000 Master System consoles had been sold, more than the Atari 7800's 100,000 but less than Nintendo's 1.1 million. Other sources indicate that more than 250,000 Master System consoles were sold by Christmas 1986.
As in Japan, the Master System in North America had a limited game library. Limited by Nintendo's licensing practices, Sega only had two third-party American publishers, Activision and Parker Brothers. Agreements with both of those companies came to an end in 1989. Sega claimed that the Master System was the first console "where the graphics on the box are actually matched by the graphics of the game", and pushed the "arcade experience" in adverts. Its marketing department was run by only two people, giving Sega a disadvantage in advertising. As one method of promoting the console, at the end of 1987 Sega partnered with astronaut Scott Carpenter to start the "Sega Challenge", a traveling program set up in recreational centers where kids were tested on non-verbal skills such as concentration and the ability to learn new skills. Out Run and Shooting Gallery were two games included in the challenge.
In 1987, amid struggling sales in the US, Sega sold the US distribution rights for the Master System to the toy company Tonka, which had no experience with electronic entertainment systems. The thinking at Sega behind the deal was to leverage Tonka's knowledge of the American toy market, since Nintendo had marketed the NES as a toy to great success in the region. The announcement was made shortly after the 1987 Summer CES. During this time, much of Sega of America's infrastructure shifted from marketing and distribution to focus on customer service, and Lowry departed the company. Tonka blocked the localization of several popular Japanese games and, during a 1988 shortage, was less willing to purchase the EPROMs needed to manufacture game cartridges. The company also became less willing to invest in video games after taking on massive loans to purchase Kenner Toys in 1987, followed by poor holiday-season sales and financial losses. Though the distributor of the console had changed, the Master System continued to perform poorly in the market.
The Mark III was rereleased as the Master System in Japan in October 1987 for ¥16,800, but still sold poorly. Neither model posed a serious challenge to Nintendo in Japan, and, according to Sato, Sega was only able to attain 10% of the Japanese console market.
### Europe, Brazil, and other markets
The Master System was launched in Europe in 1987. It was distributed by Mastertronic in the United Kingdom, Master Games in France, and Ariolasoft in West Germany, though Ariolasoft initially purchased the distribution rights for the United Kingdom. Because Ariolasoft could not reach a pricing agreement with Sega, Mastertronic signed a deal in 1987 to take control of UK distribution, and announced the deal at the 1987 Summer CES. The company announced the release of 12 titles by autumn. Mastertronic advertised the Master System as "an arcade in the home" and launched it at £99. Advance orders from retailers were high, but Sega proved unable to deliver inventory until Boxing Day (December 26), causing many retailers to cancel their orders; Mastertronic and Master Games entered financial crises and Ariolasoft vowed never to work with Sega again. Mastertronic had already sold a minority interest to the Virgin Group to enter the console business, and sold the remainder to avoid bankruptcy. The newly rebranded Virgin Mastertronic took over all European distribution in 1988.
Virgin Mastertronic focused its marketing of the Master System on ports of Sega's arcade games, positioning it as a superior gaming alternative to the Commodore 64 and ZX Spectrum computers. As a result of this marketing and of Nintendo's less effective early approaches in Europe, the Master System began to attract European developers. The Master System held a significant part of the video game console market in Europe through the release of Sega's succeeding console, the Mega Drive. In 1989, Virgin Mastertronic began offering rentals of the Master System console and 20 games. The United Kingdom also hosted a Sega video games national championship, with the winner competing against Japanese and American champions on the British television show Motormouth. Players competed in a variety of games, including Astro Warrior, platform games, and sports games. During the late 1980s, the Master System was outselling the NES in the United Kingdom.
The Master System was successful in Europe, and by 1990 it was the best-selling console there, though the NES user base in the UK was beginning to grow quickly. In 1990, Virgin Mastertronic sold 150,000 Master Systems in the United Kingdom, more than the 60,000 Mega Drives and the 80,000 Nintendo consoles sold in the same period. Across Europe that year, Sega sold a combined 918,000 consoles to Nintendo's 655,000.
The Master System was also successful in Brazil, where it was distributed by Tectoy and launched in September 1989. Tectoy, a Brazilian startup focused on electronic toys, had reached out to Sega about distributing its products. Despite hesitation given the situation with Tonka in the US, Sega eventually gave Tectoy a free hand to manage its products in Brazil. Tectoy's success distributing Sega's laser tag gun based on the anime Zillion gave Sega the confidence to let it distribute the Master System. By the end of 1990, the installed base in Brazil was about 280,000 units. Tectoy introduced a telephone service with game tips, created a Master System club, and presented the program Master Tips during commercial breaks of the Rede Globo television show Sessão Aventura. Nintendo did not arrive in Brazil until 1993, and its official products were unable to compete, given that clones of the NES dominated the Brazilian market. Tectoy claimed 80% of the Brazilian video game market.
In South Korea, the Sega Mark III was released by Samsung under the name "Gam\*Boy" in April 1989, and the Master System II followed under the name "Aladdin Boy" in 1992. It had sold 720,000 units in South Korea by 1993, outselling the NES (released by Hyundai Group as the "Comboy") and becoming the country's best-selling console up to that point. The Master System was also popular in Australia, where 250,000 units were sold in 1990 alone and where it was more successful than the NES; 650,000 Master System consoles had been sold there by November 1994.
### Decline
Although the Master System was a success in Europe, and later in Brazil, it failed to ignite significant interest in the Japanese or North American markets, which, by the mid-to-late 1980s, were both dominated by Nintendo. By 1988, Nintendo held 83 percent of the North American video game market. With Sega continuing to have difficulty penetrating the home market, Sega's console R\&D team, led by Ishikawa and supervised by Sato, began work on a successor to the Master System almost immediately after its launch. Another competitor arose in Japan in 1987 when Japanese computer giant NEC released the PC Engine (TurboGrafx-16 in North America) amid great publicity.
Sega released its next console, the 16-bit Mega Drive, in Japan on October 29, 1988. The final licensed release for the Master System in Japan was Bomber Raid in 1989. In the same year, Sega was preparing to release the new Mega Drive, rebranded as the Genesis, in North America. Displeased with Tonka's handling of the Master System, Sega reacquired the marketing and distribution rights to the Master System for the United States. In 1990, Sega released the remodeled Master System II, designed as a lower-cost version without the Sega Card slot. Sega promoted the new model, but it sold poorly. By early 1992, Master System production had ceased in North America, where the console had sold between 1.5 million and 2 million units, behind both Nintendo and Atari, which controlled 80 percent and 12 percent of the market respectively. The last licensed Master System release in North America was Sonic the Hedgehog (1991).
In Europe, where the Master System was the best-selling console up until 1990, the NES caught up with and narrowly overtook the Master System in Western Europe during the early 1990s, though the Master System maintained its lead in several markets such as the United Kingdom, Belgium and Spain. In 1993, the Master System's estimated active installed user base in Europe was 6.25 million units, larger than that of the Mega Drive's 5.73 million that year but less than the NES's 7.26 million. Combined with the Mega Drive, Sega represented the majority of the European console market that year. The Master System II was also successful and helped Sega to sustain their significant market share. Releases continued into the 1990s in Europe, including Mercs, Sonic the Hedgehog 2 (both 1992), and Streets of Rage 2 (1994).
The Master System has had continued success in Brazil, where dedicated "plug and play" consoles emulating the original hardware continue to be sold by Tectoy, including portable versions. These systems include the Master System Compact and the Master System III, and Tectoy has also received requests to remake the original Master System. A 2012 article on UOL wrote that Tectoy re-releases of the Master System and Mega Drive combined sold around 150,000 units per year in Brazil. By 2016, Tectoy said they had sold 8 million units of Master System branded systems in Brazil.
## Technical specifications
The Master System's main CPU is a Zilog Z80A, an 8-bit processor rated for 4 MHz, but runs at 3.58 MHz. It has 8 KB of ROM, 8 KB of RAM and 16 KB of video RAM. Video is provided through an RF switch (though Model 1s with an AV port can also output composite and even RGB video) and displays at a resolution of 256 × 192 pixels and up to 32 colors at one time from a total palette of 64 colors; the Video Display Processor (VDP) graphics chip was designed by Sega for the Mark III. The Master System measures 365 by 170 by 70 millimetres (14.4 in × 6.7 in × 2.8 in), while the Mark III measures 318 by 145 by 52 millimetres (12.5 in × 5.7 in × 2.0 in). Both consoles use two slots for game input: one for Mega Cartridges and one for Sega Cards, along with an expansion slot and two controller ports. Sound is provided by the SN76489 PSG built into the VDP, which can provide three square wave channels and one noise channel. The Japanese version also integrates the YM2413 FM chip, an optional feature on the Mark III. With few exceptions, Master System hardware is identical to the hardware in the Mark III. Games for the console are playable on the Sega Genesis using the Power Base Converter accessory, and on the Game Gear using the Master Gear Converter. Compared to the base NES, the Master System has four times as much system memory, eight times as much video memory, and a higher CPU clock rate.
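The 64-colour figure follows from a palette entry that allots 2 bits each to red, green and blue; the sketch below is an illustration rather than Sega documentation, and the --BBGGRR bit layout and the 0/85/170/255 intensity steps are assumptions commonly used by emulators.

```python
# Illustrative sketch: enumerate the Master System's 64-colour master palette,
# assuming a 6-bit --BBGGRR entry (2 bits per channel) and the common emulator
# convention of scaling each 2-bit level to 0, 85, 170 or 255.
def sms_palette():
    levels = (0, 85, 170, 255)           # four intensity steps per channel
    palette = []
    for entry in range(64):              # 64 colours = 4 * 4 * 4
        r = levels[entry & 0b11]         # bits 0-1: red
        g = levels[(entry >> 2) & 0b11]  # bits 2-3: green
        b = levels[(entry >> 4) & 0b11]  # bits 4-5: blue
        palette.append((r, g, b))
    return palette

colours = sms_palette()
assert len(colours) == 64
print(colours[0], colours[63])           # (0, 0, 0) black ... (255, 255, 255) white
```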
Sega produced several iterations of the Master System. The Master System II, released in 1990, removed a number of components to reduce cost: the Sega Card slot, reset button, power light, expansion port, and startup music and logo. In most regions, the Master System II's A/V port was omitted, leaving only RF output available; this was reversed in France, where the local version of the Master System II had only A/V video output available and omitted the RF hardware. In Brazil, Tectoy released several licensed variations; the Master System Super Compact functions wirelessly with an RF transmitter, and the Master System Girl, molded in bright pink plastic, was targeted at girls. The Master System 3 Collection, released in 2006, contains 120 built-in games. Handheld versions of the Master System were released under several brands, such as Coleco in 2006.
### Accessories
A number of cross-compatible accessories were created for the Mark III and Master System. The controller consists of a rectangle with a D-pad and two buttons. Sega also introduced additional Mark III controllers, such as a paddle controller. A combination steering wheel and flight stick, the Handle Controller, was released in 1989. The Sega Control Stick is an arcade-style joystick with its buttons positioned on the opposite side from those of the standard controller. Unreleased in Europe, the Sega Sports Pad uses a trackball and is compatible with three games. Sega also created an expansion for its controller, the Rapid Fire Unit, which allows auto-fire when one of two buttons is held down; it connects between the console and the controller. A light gun peripheral, the Light Phaser, was based on the weapon of the same name from the Japanese anime Zillion. It is compatible with 13 games and was released exclusively in the West.
A pair of 3D glasses, the SegaScope 3-D, was created for games such as Space Harrier 3-D, although Mark III users need an additional converter to use them. The SegaScope 3-D works via an active shutter 3D system, creating a stereoscopic effect. The glasses connect through the Sega Card slot, and thus do not function with the Master System II, which lacks the card slot. A total of eight games, including Zaxxon 3-D and Out Run 3-D, are compatible with the glasses.
The Mark III has an optional RF transmitter accessory, allowing wireless play that broadcasts the game being played on a UHF television signal.
### Game Gear
Developed under the name "Project Mercury" and designed based on the Master System's hardware, the Game Gear is a handheld game console. It was first released in Japan on October 6, 1990, in North America and Europe in 1991, and in Australia and New Zealand in 1992. Originally retailing at JP¥19,800 in Japan, $149.99 in North America, and GB£99.99 in the United Kingdom, the Game Gear was designed to compete with the Game Boy, which Nintendo had released in 1989. There are similarities between the Game Gear and the Master System hardware, but the games are not directly compatible; Master System games are only playable on the Game Gear using the Master Gear Converter accessory. A large part of the Game Gear's game library consists of Master System ports. Because of hardware similarities, including the landscape screen orientation, Master System games are easily ported to the handheld. Conversely, Tectoy ported many Game Gear games to the Master System for the Brazilian market, where the Master System was more popular than the Game Gear.
## Game library
Master System games came in two formats: ROM cartridges held up to 4 Mbit (512 KB) of code and data, while Sega Cards held up to 256 Kbit (32 KB). Cartridges were marketed by their storage size: One Mega (1 Mbit), Two Mega (2 Mbit), and Four Mega (4 Mbit). Cards, cheaper to manufacture than cartridges, included Spy vs. Spy and Super Tennis, but were eventually dropped because of their small capacity. The size of the release library varies by region; North America received just over 100 games, and Japan fewer still. Europe, by contrast, received over 300 licensed games, including 8-bit ports of Genesis games and PAL-exclusive releases. The first Mark III-specific cartridge was Fantasy Zone, released on June 15, 1986, and Bomber Raid was the final Japanese release on February 4, 1989, a few months after the launch of the Mega Drive. The final North American release was Sonic the Hedgehog in October 1991. Games for PAL regions continued to be released until the mid-1990s.
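The capacities above are quoted in megabits and kilobits; converting to kilobytes is a matter of dividing by eight (with 1 Mbit = 1024 Kbit). The snippet below is a small illustrative check of the conversions stated in the paragraph, not a description of any Sega tooling.

```python
# Convert the storage sizes quoted above from bits to kilobytes (1 byte = 8 bits).

def mbit_to_kb(mbit: float) -> float:
    """Megabits (1 Mbit = 1024 Kbit) to kilobytes."""
    return mbit * 1024 / 8

def kbit_to_kb(kbit: float) -> float:
    """Kilobits to kilobytes."""
    return kbit / 8

print(mbit_to_kb(4))    # 512.0 KB -> "Four Mega" cartridge
print(mbit_to_kb(1))    # 128.0 KB -> "One Mega" cartridge
print(kbit_to_kb(256))  # 32.0 KB  -> Sega Card capacity
```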
The Sega Mark III and the Japanese Master System are backwards-compatible with SC-3000/SG-1000 cartridges, and can play Sega Card games without the Card Catcher peripheral. However, educational and programming cartridges for the SC-3000 require the SK-1100 keyboard peripheral, which is compatible with the Mark III but not the Japanese Master System. Mark III-specific games were initially available in card format (labelled My Card Mark III to distinguish them from games designed for the SC-3000/SG-1000), starting with Teddy Boy Blues and Hang-On, both released on October 20, 1985.
Of the games released for the Master System, Phantasy Star is considered a benchmark role-playing game (RPG), and became a successful franchise. Sega's flagship character at the time, Alex Kidd, featured in games including Alex Kidd in Miracle World. Wonder Boy III: The Dragon's Trap was influential for its blend of platform gameplay with RPG elements. Different Master System consoles included built-in games, including Snail Maze, Hang-On/Safari Hunt, Alex Kidd in Miracle World and Sonic the Hedgehog. Battery-backup save game support was included in eight cartridges, including Penguin Land, Phantasy Star, Ys, and Miracle Warriors.
The more extensive PAL region library includes 8-bit entries in Genesis franchises such as Streets of Rage, a number of additional Sonic the Hedgehog games, and dozens of PAL exclusives such as The Lucky Dime Caper Starring Donald Duck, Asterix, Ninja Gaiden, Master of Darkness, and Power Strike II. Retro Gamer's Damien McFerran praised the "superb" PAL library of "interesting ports and excellent exclusives", which was richer than the North American library and provided a "drip-feed of quality titles".
After the Master System was discontinued in other markets, additional games were released in Brazil by Tectoy, including ports of Street Fighter II: Champion Edition and Dynamite Headdy. Tectoy created Portuguese translations of games exclusive to the region. Some of these tied in to popular Brazilian entertainment franchises; for example, Teddy Boy became Geraldinho, certain Wonder Boy titles became Monica's Gang games, and Ghost House became Chapolim vs. Dracula: Um Duelo Assustador, based on the Mexican TV series El Chapulín Colorado. Tectoy also ported games to the Master System, including various games from the Genesis and Game Gear. Beyond porting, the company developed Férias Frustradas do Pica-Pau, after finding that Woody Woodpecker (named Pica-Pau in Portuguese) was the most popular cartoon on Brazilian television, along with at least twenty other exclusives. These titles were developed in-house by Tectoy in Brazil.
Due in part to Nintendo's licensing practices, which stipulated that third-party NES developers could not release games on other platforms, few third-party developers released games for the Master System. According to Sato, Sega was focused on porting its arcade games instead of building relationships with third parties. According to Sega designer Mark Cerny, most of Sega's early Master System games were developed within a strict three-month deadline, which affected their quality. Computer Gaming World compared new Sega games to "drops of water in the desert". Games for the Master System took advantage of more advanced hardware compared to the NES; Alex Kidd in Miracle World, for example, showcases "blistering colors and more detailed sprites" than NES games. The Master System version of R-Type was praised for its visuals, comparable to those of the TurboGrafx-16 port.
In 2005, Sega reached a deal with the company AtGames to release emulated Master System software in Taiwan, Hong Kong, and China. Several Master System games were released for download on Nintendo's Wii Virtual Console, beginning with Hokuto no Ken in 2008 in Japan and Wonder Boy in North America. Master System games were also released via the GameTap online service.
## Reception and legacy
Due to the continued release of new variants in Brazil, the Master System is considered by many video gaming publications to be the longest-lived console in video game history, a title it took from the Atari 2600. Sales of the Master System have been estimated at between 10 million and 13 million units, not including later Brazilian sales. It saw much more continued success in Europe and Brazil than it did in Japan and North America. In 1989, the Master System was listed in the top 20 products of NPD Group's Toy Retail Sales Tracking Service. However, the Electronic Gaming Monthly 1992 Buyer's Guide indicated souring interest in the console: four reviewers scored it 5, 4, 5, and 5 out of a possible 10 points each, focusing on the better value of the Genesis and the lack of quality games for the Master System. In 1993, reviewers scored it 2, 2, 3, and 3 out of 10, noting its abandonment by Sega in North America and the lack of new releases. By contrast, over 34 million NES units were sold in North America alone, outselling the Master System's lifetime global sales nearly three times over. According to Bill Pearse of Playthings, the NES gained an advantage through better software and more recognizable characters. Sega closed the gap with Nintendo in the next generation with the release of the Genesis, which sold 30.75 million consoles compared with 49 million Super Nintendo Entertainment System consoles.
Retrospective feedback on the Master System praises its role in supporting the development of the Sega Genesis, but has been critical of its small game library. Writing for AllGame, Dave Beuscher noted that the Master System "was doomed by the lack of third-party software support and all but disappeared from the American market by 1992." Retro Gamer writer Adam Buchanan praised the larger PAL library as a "superb library of interesting ports and excellent exclusives". Damien McFerran, also of Retro Gamer, recognized its importance to the success of the Genesis, stating, "Without this criminally undervalued machine, Sega would not have enjoyed the considerable success it had with the Mega Drive. The Master System allowed Sega to experiment with arcade conversions, original IP and even create a mascot in the form of the lovable monkey-boy Alex Kidd." In 2009, the Master System was named the 20th best console of all time by IGN, behind the Atari 7800 (17th) and the NES (1st). IGN cited the Master System's small and uneven NTSC library as its major problem: "Months could go by between major releases and that made a dud on the Master System feel even more painful."
# Grey's Anatomy season 17
The seventeenth season of the American medical drama television series Grey's Anatomy was ordered in May 2019 by the American Broadcasting Company (ABC), as part of a double renewal with the sixteenth season. Shortly after, Krista Vernoff signed an agreement to continue serving as the showrunner of the series. Filming began in September 2020 and the season premiered on November 12, 2020, as part of the 2020–2021 broadcast television season; both dates were delayed as a result of the COVID-19 pandemic. The pandemic's impact on television production limited the season to seventeen episodes, the fewest of any season since the fourth. Numerous safety protocols were also implemented across various areas of production to prevent COVID-19 transmission.
All starring cast members from the previous season returned with the exception of Justin Chambers, who departed early in the sixteenth season. In addition, Richard Flood and Anthony Hill, who both appeared in the sixteenth season in recurring and guest capacities, respectively, received promotions to the main cast. This season also marked the return of former series regulars Patrick Dempsey, T. R. Knight, Chyler Leigh, and Eric Dane to the series. Meanwhile, main cast members Giacomo Gianniotti, Jesse Williams, and Greg Germann all departed the series during the season. Former series regular Sarah Drew also appeared in the season as part of Williams' departure.
Grey's Anatomy centers on the professional and personal lives of a group of medical professionals who work at the fictional Grey Sloan Memorial Hospital. Nearly every main storyline in the season centered on the COVID-19 pandemic, with a number of plot points also connecting to the spin-off series Station 19 through fictional crossover events. The season primarily received mixed reviews from critics and remained ABC's most-watched scripted series, even though viewing figures dropped significantly from the previous season. The season concluded on June 3, 2021. Despite initial uncertainty from the cast, crew, and the network, the series was renewed for an eighteenth season.
## Episodes
The number in the "No. overall" column refers to the episode's number within the overall series, whereas the number in the "No. in season" column refers to the episode's number within this particular season. "U.S. viewers in millions" refers to the number of Americans in millions who watched the episodes live. Each episode of this season is named after a song.
## Cast and characters
### Main
- Ellen Pompeo as Dr. Meredith Grey
- Chandra Wilson as Dr. Miranda Bailey
- James Pickens Jr. as Dr. Richard Webber
- Kevin McKidd as Dr. Owen Hunt
- Jesse Williams as Dr. Jackson Avery
- Caterina Scorsone as Dr. Amelia Shepherd
- Camilla Luddington as Dr. Jo Wilson
- Kelly McCreary as Dr. Maggie Pierce
- Giacomo Gianniotti as Dr. Andrew DeLuca
- Kim Raver as Dr. Teddy Altman
- Greg Germann as Dr. Tom Koracick
- Jake Borelli as Dr. Levi Schmitt
- Chris Carmack as Dr. Atticus "Link" Lincoln
- Richard Flood as Dr. Cormac Hayes
- Anthony Hill as Dr. Winston Ndugu
### Recurring
- Patrick Dempsey as Dr. Derek Shepherd
- Jason George as Dr. Ben Warren
- Debbie Allen as Dr. Catherine Fox
- Stefania Spampinato as Dr. Carina DeLuca
- Alex Landi as Dr. Nico Kim
- Jaicy Elliot as Dr. Taryn Helm
- Mackenzie Marsh as Val Ashton
- Lisa Vidal as Dr. Alma Ortiz
- Melissa DuPrey as Dr. Sara Ortiz
- Nikhil Shukla as Dr. Reza Khan
- Robert I. Mesa as Dr. James Chee
- Zaiver Sinnett as Dr. Zander Perez
### Notable guests
- T. R. Knight as Dr. George O'Malley
- Eric Dane as Dr. Mark Sloan
- Chyler Leigh as Dr. Lexie Grey
- Sarah Drew as Dr. April Kepner
- Barrett Doss as Victoria "Vic" Hughes
- Jay Hayden as Travis Montgomery
- Grey Damon as Lt. Jack Gibson
- Danielle Savre as Captain Maya Bishop
- Okieriete Onaodowan as Dean Miller
- T. J. Thyne as Aaron Morris
- Dorien Wilson as Clifford Ndugu
- Sherri Saum as Allison Brown
- Frankie Faison as William Bailey
- Bianca Taylor as Elena Bailey
- Bess Armstrong as Maureen Lincoln
- Granville Ames as Eric Lincoln
- Phylicia Rashad as Nell Timms
- Eric Roberts as Robert Avery
- Kyle Harris as Dr. Mason Post
- Debra Mooney as Evelyn Hunt
## Production
### Development
On May 10, 2019, ABC renewed Grey's Anatomy for both a sixteenth and seventeenth season. Krista Vernoff, who serves as the series showrunner and an executive producer, signed a multi-year deal with ABC Studios in 2019 to continue working on Grey's Anatomy and spin-off series Station 19. The deal also attached Vernoff's production company, Trip the Light Productions, to the series. Production on the sixteenth season was later cut short as a result of the COVID-19 pandemic, finishing only twenty-one of the twenty-five episodes ordered; at the time it was unknown whether or not the four additional episodes would be produced as part of the seventeenth season. In September 2020, Variety reported that the season would begin filming later that month. Pompeo announced that filming had begun on September 8. An ABC insider later revealed that the network was looking to produce a season of sixteen episodes, down from the twenty-four to twenty-five episodes per season that had been produced since the eighth season, but that the number could change since conditions were uncertain due to COVID. One additional episode was ordered, bringing the total episode count of the season up to seventeen.
The lower episode count caused the season to tie with the fourth for the second-lowest number of episodes; only the first season had fewer. To limit the spread of COVID-19, cast and crew members worked ten-hour days rather than the usual twelve. The number of people in each scene also had to be reduced to allow for social distancing. Vernoff said that face masks were worn by all cast and crew members while not filming, including between takes and during rehearsals, and that speaking was not allowed in the hair and makeup trailer. Cast members carried their own makeup bags to do last-minute touch-ups, and different camera lenses were used to make people standing far apart appear closer together. In addition, cast and crew members were tested for the virus three times a week. In March 2021, Deadline Hollywood reported that another spin-off series was in the works following an interview with ABC Entertainment President Craig Erwich. A few days later, ABC Signature President Jonnie Davis clarified that the comment was only meant to show support for Grey's Anatomy and that a spin-off was not being discussed, as the network was focused on future seasons of Grey's Anatomy. Despite initial uncertainty expressed by Vernoff, Pompeo, and network executives, the series was renewed for an eighteenth season.
### Casting
Kim Raver, Camilla Luddington, and Kevin McKidd each signed a three-year contract in July 2020, keeping them attached to the series through a potential nineteenth season to portray Dr. Teddy Altman, Dr. Jo Wilson, and Dr. Owen Hunt, respectively. Pompeo signed a one-year contract to return as Dr. Meredith Grey, the title character, earning a per-episode salary, producing credits on both Grey's Anatomy and Station 19, and a signing bonus, with the package totaling around $20 million for her work. On July 30, 2020, it was announced that Richard Flood and Anthony Hill had been promoted to series regulars. Flood recurred in the previous season as Dr. Cormac Hayes, while Hill made a guest appearance in the nineteenth episode of the sixteenth season as Dr. Winston Ndugu. Justin Chambers was the only main cast member from the previous season not to return, having departed early in the sixteenth season.
A number of previous series regulars appeared in the season during a storyline revolving around Meredith Grey battling COVID-19 while imagining herself on a beach. Patrick Dempsey was the first actor to return to the series as Dr. Derek Shepherd; Dempsey's last appearance had been in the eleventh-season finale, "You're My Home". He recurred throughout the season, appearing in four episodes in total. T. R. Knight also returned as Dr. George O'Malley in "You'll Never Walk Alone"; Knight had last appeared in "Now or Never" in the fifth season. Chyler Leigh and Eric Dane both appeared in "Breathe" as Dr. Lexie Grey and Dr. Mark Sloan, respectively. Prior to their return, Leigh had last appeared in the eighth-season finale "Flight" and Dane in "Remember the Time", the second episode of the ninth season. Due to travel restrictions, Leigh was unable to travel to Los Angeles, where production takes place; instead, she filmed her scenes in Vancouver, Canada, where she was filming Supergirl at the time. A green screen was used to place her on the beach, an apple box was used to simulate rocks, and tennis balls stood in for Pompeo and Dane during dialogue portions.
Giacomo Gianniotti, who portrayed Dr. Andrew DeLuca, exited the series as a main character after being killed off in "Helplessly Hoping." He later appeared in two other episodes as a vision to Raver's Dr. Teddy Altman. On May 6, 2021, it was reported that Jesse Williams, who joined the series in the sixth season as Dr. Jackson Avery, would be departing as a series regular following the fifteenth episode, "Tradition". As part of his departure, former series regular Sarah Drew returned as Dr. April Kepner in Williams' penultimate episode, "Look Up Child", after last appearing in the fourteenth-season episode "All of Me". Greg Germann, who had portrayed Dr. Tom Koracick since the fourteenth season, also departed in "Tradition", being written out in the same storyline as Williams' character. Williams and Germann both briefly reprised their roles in the season finale, "Someone Saved My Life Tonight." Germann is expected to return as a guest star in later seasons while Williams said that he would be open to returning in the following season.
Stefania Spampinato continued to make recurring appearances in the season as Dr. Carina DeLuca after being promoted to a series regular on the spin-off series Station 19. Debbie Allen and former series regular Jason George also continued to appear in recurring roles as Dr. Catherine Fox and Dr. Ben Warren, respectively; George is also a series regular on Station 19. Phylicia Rashad, Allen's sister, guest starred in the season's twelfth episode, "Sign O' the Times". In addition, Barrett Doss, Jay Hayden, Grey Damon, Danielle Savre, and Okieriete Onaodowan made guest appearances as their Station 19 characters in crossover events. Mackenzie Marsh was cast in a recurring role for the season as Val Ashton. Eric Roberts reprised his role as Robert Avery in "Look Up Child". Lisa Vidal and Melissa DuPrey recurred as a mother-daughter pair, Alma and Sara Ortiz, who were part of Grey Sloan's new intern class. Robert I. Mesa was also cast in a recurring role, portraying Dr. James Chee, the first Indigenous doctor depicted on the series.
### Writing
The overarching storyline of the season centered on the doctors battling the COVID-19 pandemic. Krista Vernoff initially considered beginning the season before the pandemic, or not including it at all, but ultimately decided to set the season at the height of the pandemic, stating:
> To be the biggest medical show and ignore the biggest medical story of the century felt irresponsible to the medical community, it just felt like we had to tell this story. The conversation became: How do we tell this painful and brutal story that has hit our medical community so intensely and permanently changed medicine? And create some escapism? And create romance, comedy and joy and fun? That's the challenge this season.
To properly tell the story of the pandemic, the writers opted to begin the season in April 2020, with time slowly progressing throughout the season, instead of telling the story from a present-day standpoint as in previous seasons. Zoanne Clack, a medical doctor who serves as a consultant, writer, and executive producer on the series and previously worked for the Centers for Disease Control and Prevention, said that the goal of the season was to accurately show the infection rate and transmission of COVID-19. One sub-storyline had Meredith Grey contracting COVID-19 early in the season; she drifted in and out of consciousness throughout the season, imagining herself on a beach where she saw past and present characters from the series. Other central characters were also written to have COVID-19, including Germann's Tom Koracick and the mother of Dr. Miranda Bailey. Chandra Wilson, who portrays Bailey, noted that nursing homes such as the one where the character's mother lived had been severely affected by COVID-19, and that when she was given the script she knew the experience needed to be told.
The second half of the season picked up in May 2020. These storylines encompassed both Grey's Anatomy and Station 19 through fictional crossover events. One of them concluded the storyline about the mental health of Gianniotti's Andrew DeLuca, introduced in the sixteenth season in connection with a patient who had been sexually abused and trafficked; it ended with DeLuca's death from a stabbing that occurred in Station 19. Sources close to the production reported that the sixteenth season had been intended to include a character death, but those plans were scrapped when the season was cut short by the pandemic; Vernoff said that the death would not have been DeLuca's, because she wanted to show that people could experience a mental health crisis and be successful afterwards. Vernoff described her reaction to the storyline:
> My reaction to [the story idea] was, 'What?! Fuck! No! Really!? This is what I'm doing?! No!' Many times after I pitched it to the writers and we designed the season around this story, I started to chicken out and second-guess myself. 'Can we save him?! Can he live?! He can't.' We've done a lot of near-deaths and saved them since I took over the show. So now people are expecting that. This was the story. It was as shocking to me as it was to you.
The season also touched on other issues, such as police brutality, racial profiling, and the murder of George Floyd. The episode centering on George Floyd included characters' internal conflict over whether to participate in protests. The exits of Williams' and Germann's characters, Jackson Avery and Koracick, were explained by their leaving Seattle for Boston with the aim of "combat[ing] the inequalities in medicine as leaders of the Avery Foundation." Before leaving, Germann's character stated, "I want to be an ally, I want to spend whatever time I've got left making this lousy, stinking place better, I'll operate, I'll administrate, I'll do anything. I don't want money, I don't want a title, just let me help", explaining that while he was hospitalized with COVID-19 he had six roommates and was the only white person among them. Later storylines in the season centered on COVID-19 vaccine trials and the struggles of adoption. The final two episodes featured periodic time jumps, allowing the final episode to end in April 2021.
## Release
### Broadcast
When ABC revealed its fall schedule for the 2020–2021 broadcast television season, it was reported that the season would retain its previous timeslot of Thursdays at 9:00 pm Eastern Time (ET), serving as a lead-out of Station 19. It was later announced that the season would premiere on November 12, 2020. The second episode of the season aired outside its regular time slot at 10:00 pm ET, immediately following the first episode in a two-hour back-to-back block. Six episodes aired prior to the mid-season finale on December 17. ABC initially planned to air the remaining episodes beginning on March 4, 2021, but delayed the return by one week; the second half of the season began airing on March 11, 2021, with the season's seventh episode. This episode also aired outside its regular timeslot, beginning at 9:25 pm ET because of a programming delay caused by a presidential address by Joe Biden. The season finale aired on June 3. Internationally, the season aired in simulcast in Canada on CTV, while in the United Kingdom episodes began airing on Sky Witness on April 17, 2021.
### Home media and streaming services
Hulu continued to hold next-day streaming video on demand rights to the series during the season and the most recently aired episodes were also available for streaming on the ABC website. The season was added to Netflix on July 3, 2021, as part of a streaming deal that adds some ABC Shondaland series to Netflix thirty days after the final episode of the season airs. Outside of the United States, the season, along with all past seasons, was added to Star, a content hub within the Disney+ streaming service. A 4-disc DVD set containing all seventeen episodes was released in multiple regions on June 7, 2021.
## Reception
### Critical response
Ani Bundel of NBC Think stated that the season stayed true to the medical community, noting that even though cheerful and funny moments were mixed in, viewers were never able to forget how many people had died. Alex Cranz of Jezebel felt that the season premiere crossover was "a series of memes ripped straight out of May 2020 instead of November 2020", writing that the episodes would have landed better three to four months earlier. TVLine's Charlie Mason found the rules of Meredith's beach confusing, because she was able to see both living and dead characters there, and said that although the device seemed nice at first, it eventually lost its appeal. Meanwhile, Jack Wilhelmi of Screen Rant called the return of Patrick Dempsey to the series a "major mistake"; however, Saloni Gajjar of The A.V. Club stated that all of the former series regulars who returned during the season helped bring nostalgia to the series, particularly mentioning that Sarah Drew gave Williams' character a believable exit. Rebecca Nicholson of The Guardian said that the show effectively normalized what had come to be called the new normal.
### Awards and nominations
Patrick Dempsey and T. R. Knight both received nominations at the 2021 Gold Derby Awards for Best Drama Guest Actor; the award went to Charles Dance for his work on The Crown. The season received the ReFrame Stamp, a certification given to scripted television productions that hire "women or individuals of other underrepresented gender identities/expressions [...] in four out of eight key roles including writer, director, producer, lead, co-leads, and department heads." At the 47th People's Choice Awards, Grey's Anatomy was nominated for The Show of 2021 and The Drama Show of 2021, and Ellen Pompeo received nominations for The Female TV Star of 2021 and The Drama TV Star of 2021 for her work on the series. The series won The Drama Show of 2021 and Pompeo won The Female TV Star of 2021, while the other two awards went to Loki and to Chase Stokes for Outer Banks, respectively. At the 33rd GLAAD Media Awards, Grey's Anatomy received its tenth nomination for Outstanding Drama Series, an award for which nominated series must include an LGBT character in a leading, supporting, or recurring role; the award ultimately went to Pose. Chandra Wilson also received a nomination for Outstanding Supporting Actress in a Drama Series at the 53rd NAACP Image Awards, which went to Mary J. Blige for Power Book II: Ghost.
### Ratings
The season was ABC's most-watched television series during the 2020–2021 television season. Throughout its broadcast, in same-day viewership, the season averaged a 1.02 rating in the 18–49 demographic and 5.17 million viewers, down 20 and 17 percent, respectively, from the previous season. In Live+7 the season averaged a 1.9 rating in the 18–49 demographic and 8.16 million viewers, down 17 and 13 percent from the sixteenth season. |
# Princess Victoria of Hesse and by Rhine
Princess Victoria of Hesse and by Rhine, then Princess Louis of Battenberg, later Victoria Mountbatten, Marchioness of Milford Haven (5 April 1863 – 24 September 1950), was the eldest daughter of Louis IV, Grand Duke of Hesse and by Rhine, and Princess Alice of the United Kingdom, daughter of Queen Victoria and Prince Albert of Saxe-Coburg and Gotha.
Born at Windsor Castle in the presence of her maternal grandmother, Princess Victoria was raised in Germany and England. Her mother died while Victoria's brother and sisters were still young, which placed her in an early position of responsibility over her siblings. Over her father's objections, she married his morganatic first cousin Prince Louis of Battenberg, an officer in the British Royal Navy. Victoria spent most of her married life moving between her husband's naval posts in various parts of Europe and visiting her many royal relations. She was perceived by her family as liberal in outlook, straightforward, practical and bright. The couple had four children: Alice, Louise, George, and Louis.
During World War I, Victoria and her husband abandoned their German titles and adopted the surname of Mountbatten, which was an anglicised version of the German "Battenberg". Two of her sisters—Elisabeth and Alix, who had married into the Russian imperial family—were murdered by communist revolutionaries. After World War II, her daughter Louise became queen consort of Sweden and her son Louis was appointed the last viceroy of India. She was the maternal grandmother of Prince Philip, Duke of Edinburgh, consort of Queen Elizabeth II; and paternal great-grandmother of King Charles III.
## Early life
Victoria was born on Easter Sunday at Windsor Castle in the presence of her maternal grandmother, Queen Victoria. She was christened in the Lutheran faith in the Green Drawing Room at Windsor Castle, in the arms of the Queen on 27 April. Her godparents were Queen Victoria, Princess Mary Adelaide of Cambridge, Louis III, Grand Duke of Hesse and by Rhine (represented by Prince Alexander of Hesse and by Rhine), the Prince of Wales and Prince Heinrich of Hesse and by Rhine.
Victoria's early life was spent at Bessungen, a suburb of Darmstadt, until the family moved to the New Palace in Darmstadt when she was three years old. There, she shared a room with her younger sister, Elisabeth, until adulthood. She was privately educated to a high standard and was, throughout her life, an avid reader.
During the Prussian invasion of Hesse in June 1866, Victoria and Elisabeth were sent to Britain to live with their grandmother until hostilities were ended by the absorption of Hesse-Kassel and parts of Hesse-Darmstadt into Prussia. During the Franco-Prussian War of 1870, military hospitals were set up in the palace grounds at Darmstadt, and she helped in the soup kitchens with her mother. She remembered the intense cold of the winter, and being burned on the arm by hot soup.
In 1872, Victoria's eighteen-month-old brother, Friedrich, was diagnosed with haemophilia. The diagnosis came as a shock to the royal families of Europe; it had been twenty years since Queen Victoria had given birth to her haemophiliac son, Prince Leopold, Duke of Albany, and it was the first indication that the bleeding disorder in the royal family was hereditary. The following year, Friedrich fell from a window onto stone steps and died. It was the first of many tragedies to beset the Hesse family.
In early November 1878, Victoria contracted diphtheria. Elisabeth was swiftly moved out of their room and was the only member of the family to escape the disease. For days, Victoria's mother nursed the sick, but she was unable to save her youngest daughter, Marie, who died in mid-November. Just as the rest of the family seemed to have recovered, Princess Alice fell ill. She died on 14 December, the anniversary of the death of her father, Prince Albert. As the eldest child, Victoria partly assumed the role of mother to the younger children and of companion to her father. She later wrote, "My mother's death was an irreparable loss ... My childhood ended with her death, for I became the eldest and most responsible".
## Marriage and family
At family gatherings, Victoria had often met Prince Louis of Battenberg, who was her first cousin once removed and a member of a morganatic branch of the Hessian royal family. Prince Louis had adopted British nationality and was serving as an officer in the Royal Navy. In the winter of 1882, they met again at Darmstadt, and were engaged the following summer.
After a brief postponement because of the death of her maternal uncle Prince Leopold, Duke of Albany, Victoria married Prince Louis on 30 April 1884 at Darmstadt. Her father did not approve of the match; in his view Prince Louis—his own first cousin—had little money and would deprive him of his daughter's company, as the couple would naturally live abroad in Britain. However, Victoria was of an independent mind and took little notice of her father's displeasure. Remarkably, that same evening, Victoria's father secretly married his mistress, Countess Alexandrine von Hutten-Czapska, the former wife of Alexander von Kolemine, the Russian chargé d'affaires in Darmstadt. His marriage to a divorcee who was not of equal rank shocked the assembled royalty of Europe and through diplomatic and family pressure Victoria's father was forced to seek an annulment of his own marriage.
Over the next sixteen years, Victoria and her husband had four children: Alice, Louise, George, and Louis.
They lived in a succession of houses at Chichester, Sussex, Walton-on-Thames, and Schloss Heiligenberg, Jugenheim. When Prince Louis was serving with the Mediterranean Fleet, she spent some winters in Malta. In 1887, she contracted typhoid but, after being nursed through her illness by her husband, was sufficiently recovered by June to attend Queen Victoria's Golden Jubilee celebrations in London. She was interested in science, drew a detailed geological map of Malta, and participated in archaeological digs both on the island and in Germany. In leather-bound volumes she kept meticulous records of books she had read, which reveal a wide range of interests, including socialist philosophy.
She personally taught her own children and exposed them to new ideas and inventions. She gave lessons to her younger son, Louis, until he was ten years of age. He said of her in 1968 that she was "a walking encyclopedia. All through her life she stored up knowledge on all sorts of subjects, and she had the great gift of being able to make it all interesting when she taught it to me. She was completely methodical; we had time-tables for each subject, and I had to do preparation, and so forth. She taught me to enjoy working hard, and to be thorough. She was outspoken and open-minded to a degree quite unusual in members of the Royal Family. And she was also entirely free from prejudice about politics or colour and things of that kind."
In 1906, she flew in a Zeppelin airship, and even more daringly later flew in a biplane even though it was "not made to carry passengers, and we perched securely attached on a little stool holding on to the flyer's back". Up until 1914, Victoria regularly visited her relatives abroad in both Germany and Russia, including her two sisters who had married into the Russian imperial family: Elisabeth, who had married Grand Duke Sergei Alexandrovich, and Alix, who had married Emperor Nicholas II. Victoria was one of the Empress's relatives who tried to persuade her away from the influence of Rasputin. On the outbreak of war between Germany and Britain in 1914, Victoria and her daughter, Louise, were in Russia at Yekaterinburg. By train and steamer, they travelled to St Petersburg and from there through Tornio to Stockholm. They sailed from Bergen, Norway, on "the last ship" back to Britain.
## Later life
Prince Louis was forced to resign from the navy at the start of the war when his German origins became an embarrassment, and the couple retired for the war years to Kent House on the Isle of Wight, which Victoria had been given by her aunt Princess Louise, Duchess of Argyll. Victoria blamed her husband's forced resignation on the Government "who few greatly respect or trust". She distrusted the First Lord of the Admiralty, Winston Churchill, because she thought him unreliable—he had once borrowed a book and failed to return it. Continued public hostility to Germany led King George V of the United Kingdom to renounce his German titles, and at the same time on 14 July 1917 Prince Louis and Victoria renounced theirs, assuming an anglicised version of Battenberg—Mountbatten—as their surname. Four months later Louis was re-ennobled by the King as Marquess of Milford Haven. During the war, Victoria's two sisters, Alix and Elisabeth, were murdered in the Russian revolution, and her brother, Ernest Louis, Grand Duke of Hesse, was deposed. On her last visit to Russia in 1914, Victoria had driven past the very house in Yekaterinburg where Alix would be murdered. In January 1921, after a long and convoluted journey, Elisabeth's body was interred in Jerusalem in Victoria's presence. Alix's body was never recovered during Victoria's lifetime.
Victoria's husband died in London in September 1921. After meeting her at the Naval and Military Club in Piccadilly, he complained of feeling unwell and Victoria persuaded him to rest in a room they had booked in the club annexe. She called a doctor, who prescribed some medicine, and Victoria went out to fill the prescription at a nearby pharmacy. When she came back, Louis was dead. Upon her widowhood, Victoria moved into a grace-and-favour residence at Kensington Palace and, in the words of her biographer, "became a central matriarchal figure in the lives of Europe's surviving royalty". In 1930, her eldest daughter, Alice, suffered a nervous breakdown and was diagnosed as schizophrenic. In the following decade Victoria was largely responsible for her grandson Prince Philip's education and upbringing during his parents' separation and his mother's institutionalisation. Prince Philip recalled, "I liked my grandmother very much and she was always helpful. She was very good with children ... she took the practical approach to them. She treated them in the right way—the right combination of the rational and the emotional."
In 1937, Victoria's brother, Ernest Louis, died and soon afterwards her widowed sister-in-law, nephew, granddaughter and two of her great-grandchildren all died in an air crash at Ostend. Victoria's granddaughter, Princess Cecilie of Greece and Denmark, had married Victoria's nephew (Ernest Louis's son), George Donatus of Hesse. They and their two young sons, Louis and Alexander, were all killed. Cecilie's youngest child, Johanna, who was not on the plane, was adopted by her uncle Prince Louis of Hesse and by Rhine, whose wedding the crash victims were en route to, but the little girl only survived her parents and older brothers by eighteen months, dying in 1939 of meningitis.
Further tragedy soon followed when Victoria's son, George, died of bone cancer the following year. Her granddaughter, Lady Pamela Hicks, remembered her grandmother's tears. In World War II Victoria was bombed out of Kensington Palace, and spent some time at Windsor Castle with King George VI. Her surviving son (Louis) and her two grandsons (David Mountbatten and Prince Philip) served in the Royal Navy, while her German relations fought with the opposing forces. She spent most of her time reading and worrying about her children; her daughter, Alice, remained in occupied Greece and was unable to communicate with her mother for four years at the height of the war. After the Allied victory, her son, Louis, was made Viscount Mountbatten of Burma. He was offered the post of Viceroy of India, but she was deeply opposed to his accepting, knowing that the position would be dangerous and difficult; he accepted anyway.
On 15 December 1948, the Dowager Marchioness attended the christening of her great-grandson, Prince Charles. She was one of eight sponsors or godparents, along with King George VI, King Haakon VII of Norway, Queen Mary, Princess Margaret, Prince George of Greece and Denmark, Lady Brabourne, and David Bowes-Lyon.
She fell ill with bronchitis (she had smoked since the age of sixteen) at Lord Mountbatten's home at Broadlands, Hampshire, in the summer of 1950. Saying "it is better to die at home", Victoria moved back to Kensington Palace, where she died on 24 September aged 87. She was buried four days later in the grounds of St. Mildred's Church, Whippingham on the Isle of Wight.
## Legacy
With the help of her lady-in-waiting, Baroness Sophie Buxhoeveden, Victoria wrote a memoir, held in the Mountbatten archive at the University of Southampton, which remains an interesting source for royal historians. A selection of Queen Victoria's letters to Victoria have been published with a commentary by Richard Hough and an introduction by Victoria's granddaughter, Patricia Mountbatten.
Lord Mountbatten remembered her fondly: "My mother was very quick on the uptake, very talkative, very aggressive and argumentative. With her marvellous brain she sharpened people's wits." Her granddaughter thought her "formidable, but never intimidating ... a supremely honest woman, full of commonsense and modesty". Victoria wrote her own typically forthright epitaph at the end of her life in letters to and conversation with her son: "What will live in history is the good work done by the individual & that has nothing to do with rank or title ... I never thought I would be known only as your mother. You're so well known now and no one knows about me, and I don't want them to."
## Honours
- Grand Duchy of Hesse: Dame of the Order of the Golden Lion, 1 January 1883
- Kingdom of Prussia: Red Cross Medal, 1st Class
- Russian Empire: Dame Grand Cross of the Order of St. Catherine
- United Kingdom:
  - Queen Victoria Golden Jubilee Medal, 1887
  - Royal Order of Victoria and Albert, 1st Class
# Galton Bridge
The Galton Bridge is a cast-iron bridge in Smethwick, near Birmingham, in the West Midlands of England. Opened in 1829 as a road bridge, the structure has been pedestrianised since the 1970s. It was built by Thomas Telford to carry a road across the new main line of the Birmingham Canal, which was built in a deep cutting. The bridge is 150 feet (46 metres) long, 26 feet (7.9 metres) wide, and 70 feet (21 metres) above the canal, which reputedly made it the highest single-span arch bridge in the world when it was built. The iron components were fabricated at the nearby Horseley Ironworks and assembled atop the masonry abutments. The design includes decorative lamp-posts and X-shaped bracing in the spandrels.
In the 1840s, a railway bridge was built from one of the abutments, with a parapet in keeping with the original. The Galton Bridge carried traffic for over 140 years until it was bypassed by a new road, named Telford Way, in the 1970s, and now carries only pedestrians and cyclists. The bridge is one of six built by Telford that share common design features and the only one still standing without modification. It underwent minor repair work in the 1980s, after which it was repainted from its original black into a colour scheme intended to enhance its features. It is maintained by the Canal and River Trust and lends its name to the nearby Smethwick Galton Bridge railway station. It is a grade I listed building.
## Background
The original Birmingham Canal was built from the late 1760s along a meandering route, connecting Birmingham to Wolverhampton via the Black Country coalfields in the modern-day West Midlands. One of the major obstacles on the route was a patch of high ground at Smethwick, roughly 4 miles (6.4 kilometres) north-west of Birmingham. The engineers had originally planned to tunnel through, but discovered that the ground conditions were not suitable. Thus, the canal was carried over the hill by a flight of locks.
By the 1820s canal traffic had grown enormously and its narrowness was causing congestion. The summit at Smethwick was short and bordered by locks at each end; as a result, it was common for long queues of boats to form at either end and fights often broke out among boat crews. Improvements had been mooted for years, though the immediate catalyst for investment was a proposal for a railway connecting Birmingham to Liverpool via Wolverhampton. The canal proprietors consulted Thomas Telford, the most eminent canal engineer of the day, and he designed a new, straighter route (known as the New Main Line, the original canal becoming the Old Main Line) which significantly reduced the length of the canal. This scheme involved the excavation of an artificial valley through the high ground in Smethwick. The bridge was named after Samuel Tertius Galton, a local businessman and major investor in the Birmingham Canal Company.
Three local roads were severed by the work, two of which were replaced with traditional masonry bridges, but Roebuck Lane was to cross the cutting at its widest and deepest point. Like all the bridges on the new route, it needed to span the canal without obstructing the waterway or the towpaths, so Telford decided that a lighter structure was necessary. Telford was a pioneer in the use of cast iron and became famed for his bridges and aqueducts using the material, which he discovered could be used to create wider spans than had previously been possible in brick or stone. Cast iron is brittle under tension but strong under compression; in bridge construction, it tended to be used in arch form. The world's first iron bridge had opened in Shropshire fifty years before the Galton Bridge. Engineers including Telford spent the rest of the 18th century and much of the 19th refining the construction methods.
## Design
The bridge is a single span of 150 ft (46 m), 26 ft (7.9 m) wide and 70 ft (21 m) above the canal. It consists of six cast-iron ribs, each made of seven segments bolted together. The bridge is supported by tall brick abutments built into the valley sides. The deck plate is supported by X-shaped bracing in the spandrels. Telford added a decorative parapet and lamp-posts, also in cast iron. When built, it was believed to be the longest bridge over a canal and the highest single-span arch bridge in the world; Telford wrote in his memoirs, "At the place of greatest excavation is erected the largest canal bridge in the world; it is made of iron." All the ironwork was cast by Horseley Ironworks at its canal-side factory in nearby Tipton. The name "Galton Bridge" is cast into the centre of the structure, below the parapet, on both sides, and "Horseley Iron Works 1829" is cast below both spandrels on both sides.
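The imperial dimensions quoted above convert to the rounded metric values used in this article (1 ft = 0.3048 m). The snippet below is only an illustrative check of that arithmetic; the dimension values are taken from the text, not from any engineering source.

```python
# Convert the bridge's quoted imperial dimensions to metres (1 ft = 0.3048 m).

FT_TO_M = 0.3048

for name, feet in [("span", 150), ("width", 26), ("height above canal", 70)]:
    print(f"{name}: {feet} ft = {feet * FT_TO_M:.1f} m")

# Expected output:
# span: 150 ft = 45.7 m              (rounded to 46 m in the text)
# width: 26 ft = 7.9 m
# height above canal: 70 ft = 21.3 m (rounded to 21 m in the text)
```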
In his memoirs, published posthumously, Telford described the Galton Bridge as an "extraordinary span". He explained that his decision to build such a high bridge and to build it in cast iron, then still a novel material, was one of "safety, combined with economy". A masonry bridge tall enough to reach the top of the banks of the cutting would require substantial abutments which risked the stonework becoming waterlogged and bulging during heavy rain, whereas an iron span was lighter and required smaller abutments. Telford wrote that "the proportion of masonry is small, and produces variety by its appearance of lightness, which agreeably strikes every spectator."
The Galton Bridge is the last of a series of six cast-iron arch bridges built by Telford to a similar design. The first was at Bonar Bridge in the Scottish Highlands, built in 1810, which became the prototype. Others include the Mythe Bridge at Tewkesbury, built three years before the Galton Bridge, and the Holt Fleet Bridge in Worcestershire, completed in 1828. The Galton Bridge is the only one of the six surviving without later modification; Bonar Bridge was washed away in a flood and Mythe and Holt Fleet bridges were both strengthened with modern materials in the 20th century. The others are Craigellachie Bridge (1814) in north-eastern Scotland, and Waterloo Bridge (1816) in Betws-y-Coed, North Wales, both also strengthened in the 20th century.
The Galton Bridge originally held commanding views of the valley on either side, but these are now obstructed. The bridge is hemmed in between the Smethwick Station Bridge, a railway bridge built in the 1860s, on the west (Wolverhampton) side, and a partial infill of the cutting where a 1970s road scheme crosses the canal on the east (Birmingham) side.
## History
Construction work on the cutting began in 1827, and the cutting and the bridge opened in December 1829. Isambard Kingdom Brunel, then a young engineer, visited the following year and described the bridge as "prodigious". In the 1840s and 1850s, the Stour Valley Railway built its Wolverhampton–Birmingham line along a route mostly parallel to the new main line canal. The railway company built an adjacent bridge to take its tracks under the road, using one of the abutments from the canal bridge. The span is a masonry arch, but the railway company built an iron parapet in keeping with the Galton Bridge.
The bridge carried increasingly heavy vehicles for almost 150 years until the 1970s, when Roebuck Lane (the road which crosses the Galton Bridge and the adjacent Summit Bridge) was bypassed by a road improvement scheme. A much wider road (the A4252) was built and the Galton Bridge was closed to vehicles but continues to carry pedestrians and cyclists. Instead of constructing a new bridge, the 1970s engineers partly filled in the cutting and built a concrete tunnel for the canal, which was reduced in width. The new road, which runs parallel to the Galton Bridge, was named Telford Way and the canal tunnel named Galton Tunnel. The area around the bridge is sometimes known as the Galton Valley. The structure lends its name to the nearby Smethwick Galton Bridge railway station.
The bridge underwent minor structural repair work in 1987 and was repainted in colour to enhance its features; before this, it had always been painted black. An inspection using ropes to access the underside in the 2000s established that the bridge was in excellent condition and that the 1980s paint work had survived well. The bridge is the responsibility of the Canal and River Trust (formerly British Waterways). It has been a grade I listed building, the highest of three grades, since 1971. Listed building status provides legal protection from demolition or modification. The list entry explicitly includes the attached span across the railway.
## See also
- Engine Arm Aqueduct, another Telford cast-iron structure on the same canal
- Grade I listed buildings in the West Midlands
- List of bridges in the United Kingdom |
# Rosewood massacre
The Rosewood massacre was a racially motivated massacre of black people and the destruction of a black town that took place during the first week of January 1923 in rural Levy County, Florida, United States. At least six black people were killed, but eyewitness accounts suggested a higher death toll of 27 to 150. In addition, two white people were killed by a black resident acting in self-defense. The town of Rosewood was destroyed in what contemporary news reports characterized as a race riot. Florida had an especially high number of lynchings of black men in the years before the massacre, including the lynching of Charles Strong and the Perry massacre in 1922.
Before the massacre, the town of Rosewood had been a quiet, primarily black, self-sufficient whistle stop on the Seaboard Air Line Railway. Trouble began when white men from several nearby towns lynched a black Rosewood resident because of accusations that a white woman in nearby Sumner had been assaulted by a black drifter. A mob of several hundred whites combed the countryside hunting for black people and burned almost every structure in Rosewood. For several days, survivors from the town hid in nearby swamps until they were evacuated to larger towns by train and car. No arrests were made for what happened in Rosewood. The town was abandoned by its former black and white residents; none of them ever moved back and the town ceased to exist.
Although the rioting was widely reported around the United States at the time, few official records documented the event. The survivors, their descendants, and the perpetrators remained silent about Rosewood for decades. Sixty years after the rioting, the story of Rosewood was revived by major media outlets when several journalists covered it in the early 1980s. Survivors and their descendants organized in an attempt to sue the state for failing to protect Rosewood's black community. In 1993, the Florida Legislature commissioned a report on the incident. As a result of the findings, Florida compensated the survivors and their descendants for the damages they had incurred because of the racial violence. The incident was the subject of a 1997 feature film directed by John Singleton. In 2004, the state designated the site of Rosewood as a Florida Heritage Landmark.
Officially, the recorded death toll during the first week of January 1923 was eight (six black and two white). Some survivors' stories claim that up to 27 black residents were killed, and also assert that newspapers did not report the total number of white deaths. Minnie Lee Langley, who was in the Carrier house when it was besieged, recalled stepping over many white bodies on the porch as she left the house. A newspaper article published in 1984 stated that estimates of up to 150 victims might have been exaggerations. Several eyewitnesses claimed to have seen a mass grave filled with the bodies of black people; one remembered seeing 26 bodies being covered with a plow brought from Cedar Key. However, by the time authorities investigated these claims, most of the witnesses were dead or too elderly and infirm to lead them to a site to confirm the stories.
## Background
### Settlement
Rosewood was settled in 1847, nine miles (14 km) east of Cedar Key, near the Gulf of Mexico. Most of the local economy drew on the timber industry; the name Rosewood refers to the reddish color of cut cedar wood. Two pencil mills were founded nearby in Cedar Key; local residents also worked in several turpentine mills and a sawmill three miles (4.8 km) away in Sumner, in addition to farming of citrus and cotton. The hamlet grew enough to warrant the construction of a post office and train depot on the Florida Railroad in 1870, but it was never incorporated as a town.
Initially, Rosewood had both black and white settlers. When most of the cedar trees in the area had been cut by 1890, the pencil mills closed, and many white residents moved to Sumner. By 1900, the population in Rosewood had become predominantly black. The village of Sumner was predominantly white, and relations between the two communities were relatively amicable. Two black families in Rosewood named Goins and Carrier were the most powerful. The Goins family brought the turpentine industry to the area, and in the years preceding the attacks were the second largest landowners in Levy County. To avoid lawsuits from white competitors, the Goins brothers moved to Gainesville, and the population of Rosewood decreased slightly. The Carriers were also a large family, primarily working at logging in the region. By the 1920s, almost everyone in the close-knit community was distantly related to each other. The population of Rosewood peaked in 1915 at 355 people. Florida had effectively disenfranchised black voters since the start of the 20th century by high requirements for voter registration; both Sumner and Rosewood were part of a single voting precinct counted by the U.S. Census. In 1920, the combined population of both towns was 638 (344 black and 294 white).
As was common in the late 19th century South, Florida had imposed legal racial segregation under Jim Crow laws requiring separate black and white public facilities and transportation. Black and white residents created their own community centers: by 1920, the residents of Rosewood were mostly self-sufficient. They had three churches, a school, a large Masonic Hall, a turpentine mill, a sugarcane mill, a baseball team named the Rosewood Stars, and two general stores, one of which was white-owned. The village had about a dozen two-story wooden plank homes, other small two-room houses, and several small unoccupied plank farm and storage structures. Some families owned pianos, organs, and other symbols of middle-class prosperity. Survivors of Rosewood remember it as a happy place. In 1995, survivor Robie Mortin, then aged 79, recalled that when she was a child there, "Rosewood was a town where everyone's house was painted. There were roses everywhere you walked. Lovely."
### Racial tensions in Florida
Racial violence at the time was common throughout the nation, manifested as individual incidents of extra-legal action or as attacks on entire communities. Lynchings reached a peak around the start of the 20th century as southern states disenfranchised black voters and imposed white supremacy; white supremacists used lynching as a means of social control throughout the South. In 1866, Florida, like many Southern states, passed laws known as Black Codes that disenfranchised black citizens. Although these were quickly overturned, and black citizens enjoyed a brief period of improved social standing, by the late 19th century black political influence was virtually nil. The legislature, dominated by white Democrats, passed a poll tax in 1885, which largely served to disenfranchise all poor voters. Having lost political power, black residents suffered a deterioration of their legal and political rights in the years that followed. Without the right to vote, they could not serve as jurors or run for office, effectively shutting them out of the political process. The United States as a whole was experiencing rapid social changes: an influx of European immigrants, industrialization and the growth of cities, and political experimentation in the North. In the South, black Americans grew increasingly dissatisfied with their lack of economic opportunity and their status as second-class citizens.
Elected officials in Florida represented the voting white majority. Governor Napoleon Bonaparte Broward (1905–1909) suggested finding a location out of state for black people to live separately. Tens of thousands of people moved to the North during and after World War I in the Great Migration, unsettling labor markets and introducing more rapid changes into cities. They were recruited by many expanding northern industries, such as the Pennsylvania Railroad, the steel industry, and meatpacking. Florida governors Park Trammell (1913–1917) and Sidney Catts (1917–1921) generally ignored the emigration of blacks to the North and its causes. While Trammell was state attorney general, none of the 29 lynchings committed during his term were prosecuted, nor were any of the 21 that occurred while he was governor. Catts ran on a platform of white supremacy and anti-Catholic sentiment; he openly criticized the National Association for the Advancement of Colored People (NAACP) when they complained he did nothing to investigate two lynchings in Florida. Catts changed his message when the turpentine and lumber industries claimed labor was scarce; he began to plead with black workers to stay in the state. By 1940, 40,000 black people had left Florida to find employment, but also to escape the oppression of segregation, underfunded education and facilities, violence, and disenfranchisement.
When U.S. troop training began for World War I, many white Southerners were alarmed at the thought of arming black soldiers. A confrontation regarding the rights of black soldiers culminated in the Houston Riot of 1917. German propaganda encouraged black soldiers to turn against their "real" enemies: American whites. Rumors reached the U.S. that French women had been sexually active with black American soldiers, which University of Florida historian David Colburn argues struck at the heart of Southern fears about power and miscegenation. Colburn connects growing concerns of sexual intimacy between the races to what occurred in Rosewood: "Southern culture had been constructed around a set of mores and values which places white women at its center and in which the purity of their conduct and their manners represented the refinement of that culture. An attack on women not only represented a violation of the South's foremost taboo, but it also threatened to dismantle the very nature of southern society." The transgression of sexual taboos subsequently combined with the arming of black citizens to raise fears among whites of an impending race war in the South.
The influx of black people into urban centers in the Northeast and Midwest increased racial tensions in those cities. Between 1917 and 1923, racial disturbances erupted in numerous cities throughout the U.S., motivated by economic competition between different racial groups for industrial jobs. One of the first and most violent instances was a riot in East St. Louis, sparked in 1917. In the Red Summer of 1919, racially motivated mob violence erupted in 23 cities—including Chicago, Omaha, and Washington, D.C.—caused by competition for jobs and housing by returning World War I veterans of both races, and the arrival of waves of new European immigrants. Further unrest occurred in Tulsa in 1921, when whites attacked the black Greenwood community. David Colburn distinguishes two types of violence against black people up to 1923: Northern violence was generally spontaneous mob action against entire communities. Southern violence, in contrast, took the form of individual incidents of lynchings and other extrajudicial actions. The Rosewood massacre, according to Colburn, resembled violence more commonly perpetrated in the North in those years.
In the mid-1920s, the Ku Klux Klan (KKK) reached its peak membership in the South and Midwest after a revival beginning around 1915. Its growth was due in part to tensions from rapid industrialization and social change in many growing cities; in the Midwest and West, its growth was related to the competition of waves of new immigrants from Southern and Eastern Europe. The KKK was strong in the Florida cities of Jacksonville and Tampa; Miami's chapter was influential enough to hold initiations at the Miami Country Club. The Klan also flourished in smaller towns of the South where racial violence had a long tradition dating back to the Reconstruction era. An editor of The Gainesville Daily Sun admitted that he was a member of the Klan in 1922, and praised the organization in print.
Despite Governor Catts' change of attitude, white mob action frequently occurred in towns throughout north and central Florida and went unchecked by local law enforcement. Extrajudicial violence against black residents was so common that it was seldom covered by newspapers. In 1920, whites in Macclenny removed four black men, suspects accused of raping a white woman, from jail and lynched them. In Ocoee the same year, two black citizens armed themselves to go to the polls during an election. A confrontation ensued and two white election officials were shot, after which a white mob destroyed Ocoee's black community, causing as many as 30 deaths and destroying 25 homes, two churches, and a Masonic Lodge. Just weeks before the Rosewood massacre, the Perry Race Riot occurred on December 14 and 15, 1922, in which whites burned Charles Wright at the stake and attacked the black community of Perry, Florida, after a white schoolteacher was murdered. On the day following Wright's lynching, whites shot and hanged two more black men in Perry; they then burned the town's black school, Masonic lodge, church, amusement hall, and several families' homes.
## Events in Rosewood
### Fannie Taylor's story
The Rosewood massacre occurred after a white woman in Sumner claimed she had been assaulted by a black man. Frances "Fannie" Taylor was 22 years old in 1923 and married to James, a 30-year-old millwright employed by Cummer & Sons in Sumner. They lived there with their two young children. James' job required him to leave each day during the darkness of early morning. Neighbors remembered Fannie Taylor as "very peculiar": she was meticulously clean, scrubbing her cedar floors with bleach so that they shone white. Other women attested that Taylor was aloof; no one knew her very well.
On January 1, 1923, the Taylors' neighbor reported that she heard a scream while it was still dark, grabbed her revolver and ran next door to find Fannie bruised and beaten, with scuff marks across the white floor. Taylor was screaming that someone needed to get her baby. She said a black man was in her house; he had come through the back door and assaulted her. The neighbor found the baby, but no one else. Taylor's initial report stated her assailant beat her about the face but did not rape her. Rumors circulated—widely believed by whites in Sumner—that she was both raped and robbed. The charge of rape of a white woman by a black man was inflammatory in the South: the day before, the Klan had held a parade and rally of over 100 hooded Klansmen 50 miles (80 km) away in Gainesville under a burning cross and a banner reading, "First and Always Protect Womanhood".
The neighbor also reported the absence that day of Taylor's laundress, Sarah Carrier, whom the white women in Sumner called "Aunt Sarah". Philomena Goins, Carrier's granddaughter, told a different story about Fannie Taylor many years later. She joined her grandmother Carrier at Taylor's home as usual that morning. They watched a white man leave by the back door later in the morning before noon. She said Taylor did emerge from her home showing evidence of having been beaten, but it was well after morning. Carrier's grandson and Philomena's brother, Arnett Goins, sometimes went with them; he had seen the white man before. Carrier told others in the black community what she had seen that day; the black community of Rosewood believed that Fannie Taylor had a white lover, they got into a fight that day, and he beat her. When the man left Taylor's house, he went to Rosewood.
Quickly, Levy County Sheriff Robert Elias Walker raised a posse and started an investigation. When they learned that Jesse Hunter, a black prisoner, had escaped from a chain gang, they began a search to question him about Taylor's attack. Men arrived from Cedar Key, Otter Creek, Chiefland, and Bronson to help with the search. Adding confusion to the events recounted later, as many as 400 white men began to gather. Sheriff Walker deputized some of them, but was unable to deputize them all. Walker asked for dogs from a nearby convict camp, but one dog may have been used by a group of men acting without Walker's authority. Dogs led a group of about 100 to 150 men to the home of Aaron Carrier, Sarah's nephew. Aaron was taken outside, where his mother begged the men not to kill him. He was tied to a car and dragged to Sumner. Sheriff Walker put Carrier in protective custody at the county seat in Bronson to remove him from the men in the posse, many of whom were drinking and acting on their own authority. Worried that the group would quickly grow further out of control, Walker also urged black employees to stay at the turpentine mills for their own safety.
A group of white vigilantes, who had become a mob by this time, seized Sam Carter, a local blacksmith and teamster who worked in a turpentine still. They tortured Carter into admitting that he had hidden the escaped chain gang prisoner. Carter led the group to the spot in the woods where he said he had taken Hunter, but the dogs were unable to pick up a scent. To the surprise of many witnesses, someone fatally shot Carter in the face. The group hung Carter's mutilated body from a tree as a symbol to other black men in the area. Some in the mob took souvenirs of his clothes. Survivors suggest that Taylor's lover, knowing he was in trouble, fled to Rosewood and went to the home of Aaron Carrier, a fellow veteran and Mason. Carrier and Carter, another Mason, covered the fugitive in the back of a wagon. Carter took him to a nearby river, let him out of the wagon, then returned home, where he was met by the mob, which had been led there by dogs following the fugitive's scent.
After lynching Sam Carter, the mob met Sylvester Carrier—Aaron's cousin and Sarah's son—on a road and told him to get out of town. Carrier refused, and when the mob moved on, he suggested gathering as many people as possible for protection.
### Escalation
Despite the efforts of Sheriff Walker and mill supervisor W. H. Pillsbury to disperse the mobs, white men continued to gather. On the evening of January 4, a mob of armed white men went to Rosewood and surrounded the house of Sarah Carrier. It was filled with approximately 15 to 25 people seeking refuge, including many children hiding upstairs under mattresses. Some of the children were in the house because they were visiting their grandmother for Christmas. They were protected by Sylvester Carrier and possibly two other men, but Carrier may have been the only one armed. He had a reputation of being proud and independent. In Rosewood, he was a formidable character, a crack shot, expert hunter, and music teacher, who was simply called "Man". Many white people considered him arrogant and disrespectful.
Sylvester Carrier was reported in The New York Times as saying that the attack on Fannie Taylor was an "example of what negroes could do without interference". Whether or not he said this is debated, but a group of 20 to 30 white men, inflamed by the reported statement, went to the Carrier house. They believed that the black community in Rosewood was hiding escaped prisoner Jesse Hunter.
Reports conflict about who shot first, but after two members of the mob approached the house, someone opened fire. Sarah Carrier was shot in the head. Her nine-year-old niece at the house, Minnie Lee Langley, had witnessed Aaron Carrier taken from his house three days earlier. When Langley heard someone had been shot, she went downstairs to find her grandmother, Emma Carrier. Sylvester placed Minnie Lee in a firewood closet in front of him as he watched the front door, using the closet for cover: "He got behind me in the wood [bin], and he put the gun on my shoulder, and them crackers was still shooting and going on. He put his gun on my shoulder ... told me to lean this way, and then Poly Wilkerson, he kicked the door down. When he kicked the door down, Cuz' Syl let him have it."
Several shots were exchanged: the house was riddled with bullets, but the whites did not capture it. The standoff lasted long into the next morning, when Sarah and Sylvester Carrier were found dead inside the house; several others were wounded, including a child who had been shot in the eye. Two white men, C. P. "Poly" Wilkerson and Henry Andrews, were killed; Wilkerson had kicked in the front door, and Andrews was behind him. At least four white men were wounded, one possibly fatally. The remaining children in the Carrier house were spirited out the back door into the woods. They crossed dirt roads one at a time, then hid under brush until they had all gathered away from Rosewood.
### Razing Rosewood
News of the armed standoff at the Carrier house attracted white men from all over the state to take part. Reports were carried in the St. Petersburg Independent, the Florida Times-Union, the Miami Herald, and The Miami Metropolis, in conflicting and often overstated versions. The Miami Metropolis listed 20 black people and four white people dead and characterized the event as a "race war". National newspapers also put the incident on the front page. The Washington Post and St. Louis Dispatch described a band of "heavily armed Negroes" and a "negro desperado" as being involved. Most of the information reached part-time reporters, who wired their stories to the Associated Press, through discreet messages from Sheriff Walker, mob rumors, and other embellishments. Details about the armed standoff were particularly explosive. According to historian Thomas Dye, "The idea that blacks in Rosewood had taken up arms against the white race was unthinkable in the Deep South".
Black newspapers covered the events from a different angle. The Afro-American in Baltimore highlighted the acts of African-American heroism against the onslaught of "savages". Another newspaper reported: "Two Negro women were attacked and raped between Rosewood and Sumner. The sexual lust of the brutal white mobbists satisfied, the women were strangled."
The white mob burned black churches in Rosewood. Philomena Goins' cousin, Lee Ruth Davis, heard the bells tolling in the church as the men were inside setting it on fire. The mob also destroyed the white church in Rosewood. Many black residents fled for safety into the nearby swamps, some clothed only in their pajamas. Wilson Hall was nine years old at the time; he later recounted his mother waking him to escape into the swamps early in the morning when it was still dark; the lights from approaching cars of white men could be seen for miles. The Hall family walked 15 miles (24 km) through swampland to the town of Gulf Hammock. The survivors recall that it was uncharacteristically cold for Florida, and people suffered when they spent several nights in raised wooded areas called hammocks to evade the mob. Some took refuge with sympathetic white families. Sam Carter's 69-year-old widow hid for two days in the swamps, then was driven by a sympathetic white mail carrier, under bags of mail, to join her family in Chiefland.
White men began surrounding houses, pouring kerosene on them and setting them alight, then shooting at those who emerged. Lexie Gordon, a light-skinned 50-year-old woman who was ill with typhoid fever, had sent her children into the woods. She was killed by a shotgun blast to the face when she fled from hiding underneath her home, which the mob had set on fire. Fannie Taylor's brother-in-law claimed to be her killer. On January 5, more whites converged on the area, forming a mob of between 200 and 300 people. Some came from out of state. Mingo Williams, who was 20 miles (32 km) away near Bronson, was collecting turpentine sap by the side of the road when a car full of whites stopped and asked his name. As was the custom among many residents of Levy County, both black and white, Williams went by a nickname better known than his given name; when he gave his nickname, "Lord God", they shot him dead.
Sheriff Walker pleaded with news reporters covering the violence to send a message to the Alachua County Sheriff P. G. Ramsey to send assistance. Carloads of men came from Gainesville to assist Walker; many of them had probably participated in the Klan rally earlier in the week. W. H. Pillsbury tried desperately to keep black workers in the Sumner mill, and worked with his assistant, a man named Johnson, to dissuade the white workers from joining others using extra-legal violence. Armed guards sent by Sheriff Walker turned away black people who emerged from the swamps and tried to go home. W. H. Pillsbury's wife secretly helped smuggle people out of the area. Several white men declined to join the mobs, including the town barber who also refused to lend his gun to anyone. He said he did not want his "hands wet with blood".
Governor Cary Hardee was on standby, ready to order National Guard troops in to neutralize the situation. Despite his message to the sheriff of Alachua County, Walker informed Hardee by telegram that he did not fear "further disorder" and urged the governor not to intervene. The governor's office monitored the situation, in part because of intense Northern interest, but Hardee would not activate the National Guard without Walker's request. Walker insisted he could handle the situation; records show that Governor Hardee took Sheriff Walker's word and went on a hunting trip.
James Carrier, Sylvester's brother and Sarah's son, had previously suffered a stroke and was partially paralyzed. He left the swamps and returned to Rosewood. He asked W. H. Pillsbury, the white turpentine mill supervisor, for protection; Pillsbury locked him in a house but the mob found Carrier, and tortured him to find out if he had aided Jesse Hunter, the escaped convict. After they made Carrier dig his own grave, they fatally shot him.
### Evacuation
On January 6, white train conductors John and William Bryce managed the evacuation of some black residents to Gainesville. The brothers were independently wealthy Cedar Key residents who had an affinity for trains. They knew the people in Rosewood and had traded with them regularly. As they passed the area, the Bryces slowed their train and blew the horn, picking up women and children. Fearing reprisals from mobs, they refused to pick up any black men. Many survivors boarded the train after having been hidden by white general store owner John Wright and his wife, Mary Jo. Over the next several days, other Rosewood residents fled to Wright's house, facilitated by Sheriff Walker, who asked Wright to transport as many residents out of town as possible.
Lee Ruth Davis, her sister, and two brothers were hidden by the Wrights while their father hid in the woods. On the morning of Poly Wilkerson's funeral, the Wrights left the children alone to attend. Davis and her siblings crept out of the house to hide with relatives in the nearby town of Wylly, but they were turned away because it was too dangerous. The children spent the day in the woods but decided to return to the Wrights' house. After spotting men with guns on their way back, they crept back to the Wrights, who were frantic with fear. Davis later described the experience: "I was laying that deep in water, that is where we sat all day long ... We got on our bellies and crawled. We tried to keep people from seeing us through the bushes ... We were trying to get back to Mr. Wright house. After we got all the way to his house, Mr. and Mrs. Wright were all the way out in the bushes hollering and calling us, and when we answered, they were so glad." Several other white residents of Sumner hid black residents of Rosewood and smuggled them out of town. Gainesville's black community took in many of Rosewood's evacuees, waiting for them at the train station and greeting survivors as they disembarked, covered in sheets. On Sunday, January 7, a mob of 100 to 150 whites returned to burn the remaining dozen or so structures of Rosewood.
### Response
Many people were alarmed by the violence, and state leaders feared negative effects on the state's tourist industry. Governor Cary Hardee appointed a special grand jury and special prosecuting attorney to investigate the outbreak in Rosewood and other incidents in Levy County. In February 1923, the all-white grand jury convened in Bronson. Over several days, they heard 25 witnesses, eight of whom were black, but found insufficient evidence to prosecute any perpetrators. The judge presiding over the case deplored the actions of the mob.
By the end of the week, Rosewood no longer made the front pages of major white newspapers. The Chicago Defender, the most influential black newspaper in the U.S., reported that 19 people had died in Rosewood's "race war" and that a soldier named Ted Cole had appeared to fight the lynch mobs, then disappeared; his existence was never confirmed beyond this report. A few editorials appeared in Florida newspapers summarizing the event. The Gainesville Daily Sun justified the actions of whites involved, writing "Let it be understood now and forever that he, whether white or black, who brutally assaults an innocent and helpless woman, shall die the death of a dog." The Tampa Tribune, in a rare comment on the excesses of whites in the area, called it "a foul and lasting blot on the people of Levy County".
Northern publications were more willing to note the breakdown of law, but many attributed it to the backward mindset in the South. The New York Call, a socialist newspaper, remarked "how astonishingly little cultural progress has been made in some parts of the world", while the Nashville Banner compared the events in Rosewood to recent race riots in Northern cities, but characterized the entire event as "deplorable". A three-day conference in Atlanta organized by the Southern Methodist Church released a statement that similarly condemned the chaotic week in Rosewood. It concluded, "No family and no race rises higher than womanhood. Hence, the intelligence of women must be cultivated and the purity and dignity of womanhood must be protected by the maintenance of a single standard of morals for both races."
Officially, the recorded death toll of the first week of January 1923 was eight people (six black and two white). Historians disagree about this number. Some survivors' stories claim there may have been up to 27 black residents killed, and assert that newspapers did not report the total number of white deaths. Minnie Lee Langley, who was in the Carrier house siege, recalls that she stepped over many white bodies on the porch when she left the house. Several eyewitnesses claim to have seen a mass grave filled with black people; one remembers a plow brought from Cedar Key that covered 26 bodies. However, by the time authorities investigated these claims, most of the witnesses were dead, or too elderly and infirm to lead them to a site to confirm the stories.
Aaron Carrier was held in jail for several months in early 1923; he died in 1965. James Carrier's widow Emma was shot in the hand and the wrist and reached Gainesville by train. She never recovered, and died in 1924. Sarah Carrier's husband Haywood did not see the events in Rosewood. He was on a hunting trip, and discovered when he returned that his wife, brother James, and son Sylvester had all been killed and his house destroyed by a white mob. Following the shock of learning what had happened in Rosewood, Haywood rarely spoke to anyone but himself; he sometimes wandered away from his family unclothed. His grandson, Arnett Goins, thought that he had been unhinged by grief. Haywood Carrier died a year after the massacre. Jesse Hunter, the escaped convict, was never found. Many survivors fled in different directions to other cities, and a few changed their names from fear that whites would track them down. None ever returned to live in Rosewood.
Fannie Taylor and her husband moved to another mill town. She was "very nervous" in her later years, until she succumbed to cancer. John Wright's house was the only structure left standing in Rosewood. He lived in it and acted as an emissary between the county and the survivors. After they left the town, almost all of their land was sold for taxes. Mary Jo Wright died around 1931; John developed a problem with alcohol. He was ostracized and taunted for assisting the survivors, and rumored to keep a gun in every room of his house. He died after drinking too much one night in Cedar Key, and was buried in an unmarked grave in Sumner. The sawmill in Sumner burned down in 1925, and the owners moved the operation to Lacoochee in Pasco County. Some survivors as well as participants in the mob action went to Lacoochee to work in the mill there. W. H. Pillsbury was among them, and he was taunted by former Sumner residents. No longer having any supervisory authority, Pillsbury was retired early by the company. He moved to Jacksonville and died in 1926.
## Culture of silence
Despite nationwide news coverage in both white and black newspapers, the incident, and the small abandoned village, slipped into oblivion. Most of the survivors scattered around Florida cities and started over with nothing. Many, including children, took on odd jobs to make ends meet. Education had to be sacrificed to earn an income. As a result, most of the Rosewood survivors took on manual labor jobs, working as maids, shoe shiners, or in citrus factories or lumber mills.
Although the survivors' experiences after Rosewood were disparate, none publicly acknowledged what had happened. Robie Mortin, Sam Carter's niece, was seven years old when her father put her on a train to Chiefland, 20 miles (32 km) east of Rosewood, on January 3, 1923. Mortin's father avoided the heart of Rosewood on the way to the depot that day, a decision Mortin believes saved their lives. Mortin's father met them years later in Riviera Beach, in South Florida. None of the family ever spoke about the events in Rosewood, on order from Mortin's grandmother: "She felt like maybe if somebody knew where we came from, they might come at us".
This silence was an exception to the practice of oral history among black families. Minnie Lee Langley knew James and Emma Carrier as her parents. She kept the story from her children for 60 years: "I didn't want them to know what I came through and I didn't discuss it with none of them ... I just didn't want them to know what kind of way I come up. I didn't want them to know white folks want us out of our homes." Decades passed before she began to trust white people. Some families spoke of Rosewood, but forbade the stories from being told: Arnett Doctor heard the story from his mother, Philomena Goins Doctor, who was with Sarah Carrier the day Fannie Taylor claimed she was assaulted, and was in the house with Sylvester Carrier. She told her children about Rosewood every Christmas. Doctor was consumed by his mother's story; he would bring it up to his aunts only to be dissuaded from speaking of it.
In 1982, an investigative reporter named Gary Moore from the St. Petersburg Times drove from the Tampa area to Cedar Key looking for a story. When he commented to a local on the "gloomy atmosphere" of Cedar Key, and questioned why a Southern town was all-white when at the start of the 20th century it had been nearly half black, the local woman replied, "I know what you're digging for. You're trying to get me to talk about that massacre." Moore was hooked. He was able to convince Arnett Doctor to join him on a visit to the site, which he did without telling his mother. Moore addressed the disappearance of the incident from written or spoken history: "After a week of sensation, the weeks of January 1923 seem to have dropped completely from Florida's consciousness, like some unmentionable skeleton in the family closet".
When Philomena Goins Doctor found out what her son had done, she became enraged and threatened to disown him, shook him, then slapped him. A year later, Moore took the story to CBS' 60 Minutes, and was the background reporter on a piece produced by Joel Bernstein and narrated by African-American journalist Ed Bradley. Philomena Doctor called her family members and declared Moore's story and Bradley's television exposé were full of lies. A psychologist at the University of Florida later testified in state hearings that the survivors of Rosewood showed signs of posttraumatic stress disorder, made worse by the secrecy. Many years after the incident, they exhibited fear, denial, and hypervigilance about socializing with whites—which they expressed specifically regarding their children, interspersed with bouts of apathy. Despite such characteristics, survivors counted religious faith as integral to their lives following the attack in Rosewood, to keep them from becoming bitter. Michael D'Orso, who wrote a book about Rosewood, said, "[E]veryone told me in their own way, in their own words, that if they allowed themselves to be bitter, to hate, it would have eaten them up." Robie Mortin described her past this way: "I knew that something went very wrong in my life because it took a lot away from me. But I wasn't angry or anything."
The legacy of Rosewood remained in Levy County. For decades no black residents lived in Cedar Key or Sumner. Robin Raftis, the white editor of the Cedar Key Beacon, tried to place the events in an open forum by printing Moore's story. She had been collecting anecdotes for many years, and said, "Things happened out there in the woods. There's no doubt about that. How bad? We don't know ... So I said, 'Okay guys, I'm opening the closet with the skeletons, because if we don't learn from mistakes, we're doomed to repeat them'." Raftis received notes reading, "We know how to get you and your kids. All it takes is a match". University of Florida historian David Colburn stated, "There is a pattern of denial with the residents and their relatives about what took place, and in fact they said to us on several occasions they don't want to talk about it, they don't want to identify anyone involved, and there's also a tendency to say that those who were involved were from elsewhere."
In 1993, a black couple retired to Rosewood from Washington D.C. They told The Washington Post, "When we used to have black friends down from Chiefland, they always wanted to leave before it got dark. They didn't want to be in Rosewood after dark. We always asked, but folks wouldn't say why."
## Seeking justice
### History includes Rosewood
Philomena Goins Doctor died in 1991. Her son Arnett was, by that time, "obsessed" with the events in Rosewood. Although he was originally excluded from the Rosewood claims case, he was included after publicity brought the omission to light. By that point, the case had been taken on a pro bono basis by one of Florida's largest legal firms. In 1993, the firm filed a lawsuit on behalf of Arnett Goins, Minnie Lee Langley, and other survivors against the state government for its failure to protect them and their families.
Survivors participated in a publicity campaign to draw wider attention to the case. Langley and Lee Ruth Davis appeared on The Maury Povich Show on Martin Luther King Day in 1993. Gary Moore published another article about Rosewood in the Miami Herald on March 7, 1993; he had to negotiate with the newspaper's editors for about a year to publish it. The editors were at first skeptical that the incident had taken place; in addition, reporter Lori Rosza of the Miami Herald had reported on the first stage of what proved, in December 1992, to be a deceptive claims case that excluded most of the survivors. "If something like that really happened, we figured, it would be all over the history books", an editor wrote.
Arnett Doctor told the story of Rosewood to print and television reporters from all over the world. He inflated the number of historic residents of Rosewood, as well as the number who died in the Carrier house siege; he exaggerated the town's contemporary importance by comparing it to Atlanta, Georgia as a cultural center. Doctor wanted to keep Rosewood in the news; his accounts were printed with few changes. According to historian Thomas Dye, Doctor's "forceful addresses to groups across the state, including the NAACP, together with his many articulate and heart-rending television appearances, placed intense pressure on the legislature ... to do something about Rosewood". In December 1996, Doctor told a meeting at Jacksonville Beach that 30 women and children had been buried alive at Rosewood, and that his facts had been confirmed by journalist Gary Moore. He was embarrassed to learn that Moore was in the audience. As the Holland & Knight law firm continued the claims case, they represented 13 survivors, people who had lived in Rosewood at the time of the 1923 violence, in the claim to the legislature.
The lawsuit missed the filing deadline of January 1, 1993. The speaker of the Florida House of Representatives commissioned a group to research and provide a report by which the equitable claim bill could be evaluated. The research, interviews, and writing took nearly a year. On December 22, 1993, historians from Florida State University, Florida A&M University, and the University of Florida delivered a 100-page report (with 400 pages of attached documentation) on the Rosewood massacre. It was based on available primary documents and interviews, mostly with black survivors of the incident. Because of the media attention that residents of Cedar Key and Sumner received after the survivors filed their claim, white participants were discouraged from giving interviews to the historians. The report used a taped description of the events by Jason McElveen, a Cedar Key resident who had since died, and an interview with Ernest Parham, who was in high school in 1923 and happened upon the lynching of Sam Carter. Parham said he had never spoken of the incident because he was never asked. The report was titled "Documented History of the Incident which Occurred at Rosewood, Florida in January 1923". Gary Moore, the investigative journalist who wrote the 1982 story in the St. Petersburg Times that reopened the Rosewood case, criticized demonstrable errors in the report. The commissioned group retracted the most serious of these without public discussion. They delivered the final report to the Florida Board of Regents, and it became part of the legislative record.
### Rosewood victims v. the State of Florida
Florida's consideration of a bill to compensate victims of racial violence was the first by any U.S. state. Opponents argued that the bill set a dangerous precedent and put the onus of paying survivors and descendants on Floridians who had nothing to do with the incident in Rosewood. James Peters, who represented the State of Florida, argued that the statute of limitations applied because the law enforcement officials named in the lawsuit—Sheriff Walker and Governor Hardee—had died many years before. He also pointed to shortcomings in the report: although the historians were instructed not to write it with compensation in mind, they offered conclusions about the actions of Sheriff Walker and Governor Hardee. The report was based on investigations led by historians rather than legal experts, and it relied in some cases on hearsay from witnesses who had since died. Critics thought that some of the report's writers asked leading questions in their interviews.
Even legislators who agreed with the sentiment of the bill asserted that the events in Rosewood were typical of the era. One survivor interviewed by Gary Moore said that to single out Rosewood as an exception, as if the entire world was not a Rosewood, would be "vile". State legislators who supported the bill, Democrat Al Lawson and Republican Miguel De Grandy, argued that, unlike Native Americans or slaves who had suffered atrocities at the hands of whites, the residents of Rosewood were tax-paying, self-sufficient citizens who deserved the protection of local and state law enforcement. While mob lynchings of black people around the same time tended to be spontaneous and quickly concluded, the incident at Rosewood was prolonged over a period of several days. Some legislators began to receive hate mail, including some claiming to be from Ku Klux Klan members. One legislator remarked that his office received an unprecedented response to the bill, with constituents opposing it by a ratio of ten to one.
In 1994, the state legislature held a hearing to discuss the merits of the bill. Lee Ruth Davis died a few months before testimony began, but Minnie Lee Langley, Arnett Goins, Wilson Hall, Willie Evans, and several descendants from Rosewood testified. Other witnesses were a clinical psychologist from the University of Florida, who testified that survivors had suffered post-traumatic stress, and experts who offered testimony about the scale of property damages. Langley spoke first; the hearing room was packed with journalists and onlookers who were reportedly mesmerized by her statement. Ernest Parham also testified about what he saw. When asked when law enforcement had contacted him regarding the death of Sam Carter, Parham replied that he had first been contacted about Carter's death two weeks before testifying. The coroner's inquest into Sam Carter's death had taken place the day after he was shot in January 1923; the coroner concluded that Carter had been killed "by Unknown Party".
After hearing all the evidence, the Special Master Richard Hixson, who presided over the testimony for the Florida Legislature, declared that the state had a "moral obligation" to make restitution to the former residents of Rosewood. He said, "I truly don't think they cared about compensation. I think they simply wanted the truth to be known about what happened to them ... whether they got fifty cents or a hundred and fifty million dollars. It didn't matter."
Black and Hispanic legislators in Florida took on the Rosewood compensation bill as a cause, and refused to support Democratic Governor Lawton Chiles' healthcare plan until he put pressure on the Democrat-controlled state assembly to vote in favor of the bill. Chiles was offended, as he had supported the compensation bill from its early days, and the legislative caucuses had previously promised their support for his healthcare plan. The legislature passed the bill, and Governor Chiles signed the Rosewood Compensation Bill, a $2.1 million package to compensate survivors and their descendants. Seven survivors and their family members were present at the signing to hear Chiles say
> Because of the strength and commitment of these survivors and their families, the long silence has finally been broken and the shadow has been lifted ... Instead of being forgotten, because of their testimony, the Rosewood story is known across our state and across our nation. This legislation assures that the tragedy of Rosewood will never be forgotten by the generations to come.
Originally, the compensation total offered to survivors was $7 million, which aroused controversy. The legislature eventually settled on $1.5 million: this would enable payment of $150,000 to each person who could prove he or she lived in Rosewood during 1923, and provide a $500,000 pool for people who could apply for the funds after demonstrating that they had an ancestor who owned property in Rosewood during the same time. The four survivors who testified automatically qualified; four others had to apply. More than 400 applications were received from around the world.
Robie Mortin came forward as a survivor during this period; she was the only one added to the list who could prove that she had lived in Rosewood in 1923, bringing the total number of survivors compensated to nine. Gaining compensation changed some families, whose members began to fight among themselves. Some descendants refused it, while others went into hiding in order to avoid the press of friends and relatives who asked them for handouts. Some descendants, after dividing the funds among their siblings, received not much more than $100 each. Later, the Florida Department of Education set up the Rosewood Family Scholarship Fund for Rosewood descendants and ethnic minorities.
## Rosewood remembered
### Representation in other media
The Rosewood massacre, the ensuing silence, and the compensation hearing were the subject of the 1996 book titled Like Judgment Day: The Ruin and Redemption of a Town Called Rosewood by Mike D'Orso. It was a New York Times bestseller and won the Lillian Smith Book Award, bestowed by the University of Georgia Libraries and the Southern Regional Council to authors who highlight racial and social inequality in their works.
The dramatic feature film Rosewood (1997), directed by John Singleton, was based on these historic events. Minnie Lee Langley served as a source for the set designers, and Arnett Doctor was hired as a consultant. Recreated forms of the towns of Rosewood and Sumner were built in Central Florida, far away from Levy County. The film version, written by screenwriter Gregory Poirier, created a character named Mann, who enters Rosewood as a type of reluctant Western-style hero. Composites of historic figures were used as characters, and the film offers the possibility of a happy ending. In The New York Times E.R. Shipp suggests that Singleton's youth and his background in California contributed to his willingness to take on the story of Rosewood. She notes Singleton's rejection of the image of black people as victims and the portrayal of "an idyllic past in which black families are intact, loving and prosperous, and a black superhero who changes the course of history when he escapes the noose, takes on the mob with double-barreled ferocity and saves many women and children from death". Singleton has offered his view: "I had a very deep—I wouldn't call it fear—but a deep contempt for the South because I felt that so much of the horror and evil that black people have faced in this country is rooted here ... So in some ways this is my way of dealing with the whole thing."
Reception of the film was mixed. Shipp commented on Singleton's creating a fictional account of Rosewood events, saying that the film "assumes a lot and then makes up a lot more". The film version alludes to many more deaths than the highest counts by eyewitnesses. Gary Moore believes that creating an outside character who inspires the citizens of Rosewood to fight back condescends to survivors, and he criticized the inflated death toll specifically, saying the film was "an interesting experience in illusion". In contrast, in 2001 Stanley Crouch of The New York Times described Rosewood as Singleton's finest work, writing, "Never in the history of American film had Southern racist hysteria been shown so clearly. Color, class and sex were woven together on a level that Faulkner would have appreciated."
### Legacy
The State of Florida declared Rosewood a Florida Heritage Landmark in 2004 and subsequently erected a historical marker on State Road 24 that names the victims and describes the community's destruction. Scattered structures remain within the community, including a church, a business, and a few homes, notably John Wright's. Mary Hall Daniels, the last known survivor of the massacre at the time of her death, died at the age of 98 in Jacksonville, Florida, on May 2, 2018. Vera Goins-Hamilton, who had not previously been publicly identified as a survivor of the Rosewood massacre, died at the age of 100 in Lacoochee, Florida, in 2020.
Rosewood descendants formed the Rosewood Heritage Foundation and the Real Rosewood Foundation, Inc., in order to educate people both in Florida and all over the world about the massacre. The Rosewood Heritage Foundation created a traveling exhibit that tours internationally in order to share the history of Rosewood and the attacks; a permanent display is housed in the library of Bethune-Cookman University in Daytona Beach. The Real Rosewood Foundation presents a variety of humanitarian awards to people in Central Florida who help preserve Rosewood's history. The organization also recognized Rosewood residents who protected blacks during the attacks by presenting an Unsung Heroes Award to the descendants of Sheriff Robert Walker, John Bryce, and William Bryce. Lizzie Jenkins, executive director of the Real Rosewood Foundation and niece of the Rosewood schoolteacher, explained her interest in keeping Rosewood's legacy current:
> It has been a struggle telling this story over the years, because a lot of people don't want to hear about this kind of history. People don't relate to it, or just don't want to hear about it. But Mama told me to keep it alive, so I keep telling it ... It's a sad story, but it's one I think everyone needs to hear.
The Real Rosewood Foundation, Inc., under the leadership of Jenkins, is raising funds to move John Wright's house to nearby Archer, Florida, and make it a museum.
The state of Florida in 2020 established a Rosewood Family Scholarship Program, paying up to $6,100 each to up to 50 students each year who are direct descendants of Rosewood families.
## See also
- African Americans in Florida
- Black genocide – the notion that African Americans have been subjected to genocide
- Domestic terrorism in the United States
- Elaine massacre
- List of ethnic cleansing campaigns
- List of ethnic riots#United States
- List of expulsions of African Americans
- List of incidents of civil unrest in the United States
- List of massacres in the United States
- Lynching in the United States
- Mass racial violence in the United States
- Murder of Harry and Harriette Moore
- Nadir of American race relations
- Newberry Six lynchings
- Ocoee massacre
- Opelousas massacre
- Perry massacre
- Racism against African Americans
- Racism in the United States
- Red Summer
- Terrorism in the United States
- Timeline of terrorist attacks in the United States
- Tulsa race massacre
- Wilmington insurrection
# Resident Evil 5
Resident Evil 5 is a 2009 third-person shooter video game developed and published by Capcom. It is a major installment in the Resident Evil series, and was announced in 2005—the same year its predecessor Resident Evil 4 was released. Resident Evil 5 was released for the PlayStation 3 and Xbox 360 consoles in March 2009 and for Windows in September 2009. It was re-released for PlayStation 4 and Xbox One in June 2016. The plot involves an investigation of a terrorist threat by Bioterrorism Security Assessment Alliance agents Chris Redfield and Sheva Alomar in Kijuju, a fictional region of West Africa. Chris learns that he must confront his past in the form of an old enemy, Albert Wesker, and his former partner, Jill Valentine.
The gameplay of Resident Evil 5 is similar to that of the previous installment, though it is the first in the series designed for two-player cooperative gameplay. It has also been considered the first game in the main series to depart from the survival horror genre, with critics saying it bore more resemblance to an action game. Motion capture was used for the cutscenes, and it was the first video game to use a virtual camera system. Several staff members from the original Resident Evil worked on Resident Evil 5. The Windows version was developed by Mercenary Technology.
Resident Evil 5 received a positive reception, despite some criticism for its control scheme. The game received some complaints of racism, though an investigation by the British Board of Film Classification found the complaints were unsubstantiated. As of December 2023, when including the original, special and remastered versions, the game had sold 13.4 million units. It is the best-selling game of the Resident Evil franchise when not including remakes, and the original version remained the best-selling individual Capcom release until March 2018, when it was outsold by Monster Hunter: World. A sequel, Resident Evil 6, was released in 2012.
## Plot
In 2009, five years after the events of Resident Evil 4, Chris Redfield, now an agent of the Bioterrorism Security Assessment Alliance (BSAA), is dispatched to Kijuju in West Africa. He and his new partner Sheva Alomar are tasked with apprehending Ricardo Irving before he can sell a bio-organic weapon (BOW) on the black market. When they arrive, they discover that the locals have been infected by the parasites Las Plagas (those infected are called "Majini") and the BSAA Alpha Team have been killed. Chris and Sheva are rescued by BSAA's Delta Team, which includes Sheva's mentor Captain Josh Stone. In Stone's data Chris sees a photograph of Jill Valentine, his old partner, who has been presumed dead after a confrontation with Albert Wesker. Chris, Sheva and Delta Team close in on Irving, but he escapes with the aid of a hooded figure. Irving leaves behind documents that lead Chris and Sheva to marshy oilfields, where Irving's deal is to occur, but they discover that the documents are a diversion. When Chris and Sheva try to regroup with Delta Team, they find the team slaughtered by a BOW; Sheva cannot find Stone among the dead. Determined to learn if Jill is still alive, Chris does not report to headquarters.
Continuing through the marsh, they find Stone and track down Irving's boat with his help. Irving injects himself with a variant of the Las Plagas parasite and mutates into a huge octopus-like beast. Chris and Sheva defeat him, and his dying words lead them to a nearby cave. The cave is the source of a flower used to create viruses previously used by the Umbrella Corporation, as well as a new strain named Uroboros. Chris and Sheva find evidence that Tricell, the company funding the BSAA, took over a former Umbrella underground laboratory and continued Umbrella's research. In the facility, they discover thousands of capsules holding human test subjects. Chris finds Jill's capsule, but it is empty. When they leave, they discover that Tricell CEO Excella Gionne has been plotting with Wesker to launch missiles with the Uroboros virus across the globe; it is eventually revealed that Wesker hopes to take a chosen few from the chaos of infection and rule them, creating a new breed of humanity. Chris and Sheva pursue Gionne but are stopped by Wesker and the hooded figure, who is revealed to be a brainwashed Jill. Gionne and Wesker escape to a Tricell oil tanker; Chris and Sheva fight Jill, subduing her and removing the mind-control device before she urges Chris to follow Wesker.
Chris and Sheva board the tanker and encounter Gionne, who escapes after dropping a case of syringes; Sheva keeps several. When Chris and Sheva reach the main deck, Wesker announces over the ship's intercom that he has betrayed Gionne and infected her with Uroboros. She mutates into a giant monster, which Chris and Sheva defeat. Jill radios in, telling Chris and Sheva that Wesker must take precise, regular doses of a serum to maintain his strength and speed; a larger or smaller dose would poison him. Sheva realizes that Gionne's syringes are doses of the drug. Chris and Sheva follow Wesker to a bomber loaded with missiles containing the Uroboros virus, injecting him with the syringes Gionne dropped. Wesker tries to escape on the bomber; Chris and Sheva disable it, making him crash-land in a volcano. Furious, Wesker exposes himself to Uroboros and chases Chris and Sheva through the volcano. They fight him, and the weakened Wesker falls into the lava before Chris and Sheva are rescued by a helicopter, which is piloted by Jill and Stone. As a dying Wesker attempts to drag the helicopter into the volcano, Chris and Sheva fire rocket-propelled grenades at him, killing him. In the game's final cutscene, Chris wonders if the world is worth fighting for. Looking at Sheva and Jill, he decides to live in a world without fear.
## Gameplay
Resident Evil 5 is a third-person shooter with an over-the-shoulder perspective. Players can use several weapons including handguns, shotguns, automatic rifles, sniper rifles, and grenade launchers, as well as melee attacks. Players can make quick 180-degree turns to evade enemies. The game involves boss battles, many of which contain quick time events.
As in its predecessor Resident Evil 4, players can upgrade weapons with money and treasure collected in-game and heal themselves with herbs, but cannot run and shoot at the same time. New features include infected enemies with guns and grenades, the ability to upgrade weapons at any time from the inventory screen without having to find a merchant, and the equipping of weapons and items in real-time during gameplay. Each player can store nine items. Unlike in previous games, item size is irrelevant: a herb and a grenade launcher each occupy one space, and four items may be assigned to the D-pad. The game features puzzles, though fewer than in previous titles.
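The slot rules described above can be summarized in a short sketch. The following Python snippet is purely illustrative and is not Capcom's code; the class and method names are hypothetical, and it models only the nine-slot, one-item-per-slot inventory with up to four D-pad shortcuts.

```python
# Illustrative sketch of the inventory rules described above (not Capcom's code):
# nine slots per player, every item occupies exactly one slot regardless of its
# size, and up to four carried items can be bound to D-pad directions.

class Inventory:
    SLOTS = 9        # each player can store nine items
    DPAD_SLOTS = 4   # four quick-select shortcuts

    def __init__(self):
        self.items = []   # carried items, at most SLOTS entries
        self.dpad = {}    # D-pad direction -> item name

    def add(self, item: str) -> bool:
        """Add an item; a herb and a grenade launcher each cost one slot."""
        if len(self.items) >= self.SLOTS:
            return False  # inventory full
        self.items.append(item)
        return True

    def bind_dpad(self, direction: str, item: str) -> bool:
        """Bind a carried item to one of the four D-pad directions."""
        if item not in self.items:
            return False
        if direction not in self.dpad and len(self.dpad) >= self.DPAD_SLOTS:
            return False
        self.dpad[direction] = item
        return True


inv = Inventory()
inv.add("herb")
inv.add("grenade launcher")   # occupies one slot, same as the herb
inv.bind_dpad("up", "herb")
print(inv.items, inv.dpad)
```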
Resident Evil 5 is the first game in the Resident Evil series designed for two-player cooperative gameplay. The player controls Chris, a former member of the fictional Special Tactics and Rescue Service (STARS) and member of the BSAA, and a second player can control Sheva, who is introduced in this game. If a person plays alone, Sheva is controlled by the game's artificial intelligence (AI). When the game has been completed once, there is an option to make Sheva the primary character. Two-player mode is available online or split screen with a second player using the same console. A second player joining a split screen game in progress will make the game reload the last checkpoint (the point at which the game was last saved); the second player joining an online game will have to wait until the first player reaches the next checkpoint, or restarts the previous one, to play. In split-screen mode, one player's viewpoint is presented in the top half of the screen, and the other in the bottom half, but each viewpoint is presented in widescreen format, rather than using the full width of the screen, resulting in unused space to the left and right of the two windows. If one player has critical health, only their partner can resuscitate them, and they will die if their partner cannot reach them. At certain points, players are deliberately separated. Players can trade items during gameplay, although weapons cannot be traded with online players. The game's storyline is linear, and interaction with other characters is mostly limited to cutscenes.
A version of the Mercenaries minigame, which debuted in Resident Evil 3: Nemesis, is included in Resident Evil 5. This minigame places the player in an enclosed environment with a time limit. Customized weapons cannot be used and players must search for weapons, ammunition, and time bonuses while fighting a barrage of enemies, to score as many points as possible within the time limit. The minigame multiplayer mode was initially offline only; a release-day patch needed to be downloaded to access the online multiplayer modes. Mercenaries is unlocked when the game's story mode has been completed.
## Development
Resident Evil 5 was developed by Capcom and produced by Jun Takeuchi, who directed Onimusha: Warlords and produced Lost Planet: Extreme Condition. Keiji Inafune, promotional producer for Resident Evil 2 and executive producer of the PlayStation 2 version of Resident Evil 4, supervised the project. Production began in 2005 and at its peak, over 100 people were working on the project. In February 2007, some members of Capcom's Clover Studio began working on Resident Evil 5 while others were working on Resident Evil: The Umbrella Chronicles, which debuted for the Wii. Yasuhiro Anpo, who worked as a programmer on the original Resident Evil, directed Resident Evil 5. He was one of several staff members who worked on the original game to be involved in Resident Evil 5's development. The game's scenario was written by Haruo Murata and Yoshiaki Hirabayashi, based on a story idea by concept director Kenichi Ueda. Takeuchi announced that the game would retain the gameplay model introduced in Resident Evil 4, with "thematic tastes" from both Resident Evil 4 and the original Resident Evil.
While previous Resident Evil games are set mainly at night, the events of Resident Evil 5 occur almost entirely during the day. This decision resulted from a combination of the game's African setting and hardware advances that allowed increasingly detailed graphics. On the subject of changes to Jill and Chris's appearance, production director Yasuhiro Anpo explained that designers tried "to preserve their image and imagined how they would have changed over the passage of time". Their new designs retained the characters' signature colors: green for Chris and blue for Jill. Sheva was redesigned several times during production, though all versions tried to emphasize a combination of "feminine attraction and the strength of a fighting woman". The Majini were designed to be more violent than the "Ganado" enemies in Resident Evil 4.
The decision to include cooperative gameplay was made part-way through development, to offer a new experience in a Resident Evil game. Despite initial concern that a second player would dampen the game's tension and horror, the developers later realized that cooperative play could actually heighten those elements in situations where one player had to be rescued. The decision to retain widescreen proportions in two-player mode was made to avoid having the first player's screen directly on top of the second, which might be distracting, and the restriction on simultaneously moving and shooting was retained to increase tension by not allowing players to maneuver freely. Takeuchi cited the film Black Hawk Down as an influence on the setting of Resident Evil 5, and his experience working on Lost Planet: Extreme Condition as an influence on its development. When asked why the game was not being released on the Wii, the most popular gaming console at the time, Takeuchi responded that although that may have been a good decision "from a business perspective", the Wii was not the best choice in terms of power and visual quality, concluding that he was happy with the console choices that had been made.
Resident Evil 5 runs on version 1.4 of Capcom's MT Framework engine and scenes were recorded by motion capture. It was the first video game to use a virtual camera system, which allowed the developers to see character movements in real time as the motion-capture actors recorded. Actors Reuben Langdon, Karen Dyer and Ken Lally portrayed Chris Redfield, Sheva Alomar and Albert Wesker respectively. Dyer also voiced Sheva, while Chris's voice was performed by Roger Craig Smith. Dyer's background training in circus skills helped her win the role of Sheva, as Capcom were searching for someone who could handle the physical skills her motion capture required. She performed her own stunts, and worked in production on the game for over a year, sometimes working 14 hours a day. All of the human character motions were based on motion capture, while the non-human characters in the game were animated by hand.
Kota Suzuki was the game's principal composer and additional music was contributed by Hideki Okugawa, Akihiko Narita and Seiko Kobuchi. The electronic score includes 15 minutes of orchestral music, recorded at the Newman Scoring Stage of 20th Century Fox Studios in Los Angeles with the 103-piece Hollywood Studio Symphony. Other orchestral music and arrangements were by Wataru Hokoyama, who conducted the orchestra. Capcom recorded in Los Angeles because they wanted a Hollywood-style soundtrack to increase the game's cinematic value and global interest. Resident Evil 5's soundtrack features an original theme song, titled "Pray", which was composed by Suzuki and sung by Oulimata Niang.
## Marketing and release
Capcom announced Resident Evil 5 on July 20, 2005, and the company showed a brief trailer for the game at the Electronic Entertainment Expo (E3) in July 2007. The full E3 trailer became available on the Xbox Live Marketplace and the PlayStation Store that same month. A new trailer debuted on Spike TV's GameTrailers TV in May 2008, and on the GameTrailers website. A playable game demo was released in Japan on December 5, 2008, for the Xbox 360, in North America and Europe for the Xbox 360 on January 26, 2009, and on February 2 for the PlayStation 3. Worldwide downloads of the demo exceeded four million for the two consoles; over 1.8 million were downloaded between January 26 and January 29.
In January 2009, D+PAD Magazine reported that Resident Evil 5 would be released with limited-edition Xbox 360 box art; pictures of the limited-edition box claimed that it would allow two to sixteen players to play offline via System Link. Although Capcom said that their "box art isn't lying", the company did not provide details. Capcom soon issued another statement that the box-art information was incorrect, and System Link could support only two players. Microsoft released a limited-edition, red Xbox 360 Elite console which was sold with the game. The package included an exclusive Resident Evil theme for the Xbox 360 Dashboard and a download voucher for Super Street Fighter II Turbo HD Remix from Xbox Live.
Resident Evil 5 was released for PlayStation 3 and Xbox 360 in March 2009, alongside a dedicated Game Space on PlayStation Home. The space, Resident Evil 5 "Studio Lot" (Biohazard 5 "Film Studio" in Japan), had as its theme the in-game location of Kijuju. Its lounge offered Resident Evil 5-related items for sale, events and full game-launching support. Some areas of the space were available only to owners of Resident Evil 5. A Windows version was released in September 2009. This version, using Nvidia's 3D Vision technology through DirectX 10, includes more costumes and a new mode in the Mercenaries minigame. Resident Evil 5 was re-released on Shield Android TV in May 2016, and was re-released on PlayStation 4 and Xbox One the following month, with a physical disc copy following in America that July. It was also released for Nintendo Switch on October 29, 2019.
## Additional content
Shortly before the release of Resident Evil 5, Capcom announced that a competitive multiplayer mode called Versus would be available for download in several weeks. Versus became available for download in Europe and North America on April 7, 2009, through the Xbox Live Marketplace and the PlayStation Store. Versus has two online game types: "Slayers", a point-based game challenging players to kill Majini, and "Survivors", where players hunt each other while dodging and attacking Majini. Both modes can be played by two-player teams. The Windows version of Resident Evil 5 originally did not support downloadable content (DLC).
During Sony's press conference at the 2009 Tokyo Game Show, Capcom announced that a special edition of the game, Biohazard 5: Alternative Edition, would be released in Japan for the PlayStation 3 in the spring of 2010. This edition supports the PlayStation Move accessory and includes a new scenario, "Lost in Nightmares", where Chris Redfield and Jill Valentine infiltrate one of Umbrella Corporation co-founder Oswell E. Spencer's estates in 2006. Another special edition of the game, Resident Evil 5: Gold Edition, was released for the Xbox 360 and PlayStation 3 in North America and Europe. Gold Edition includes "Lost in Nightmares" and another campaign-expansion episode, "Desperate Escape", where players control Josh Stone and Jill Valentine as they assist Chris and Sheva. The edition also includes the previously released Versus mode, four new costumes and an alternate Mercenaries mode with eight new playable characters, new items and maps. Like Alternative Edition, Gold Edition supports the PlayStation Move accessory with a patch released on September 14, 2010. The Xbox 360 version of Gold Edition came on a DVD with a token allowing free download of all DLC, while the PlayStation 3 version had all of the new content on a single Blu-ray disc. On November 5, 2012, Resident Evil 5: Gold Edition was placed on the PlayStation Network as a free download for PlayStation Plus users during that month.
As part of the game's conversion to Steamworks, Gold Edition was released for Microsoft Windows on March 26, 2015. Owners of the Steam version or of a boxed retail Games for Windows – Live copy could acquire a free Steamworks copy of the base game and purchase the new Gold Edition content. The Steamworks version did not allow the use of Nvidia's 3D Vision technology or fan modifications, though Capcom later confirmed a way to work around these issues. In 2023, an update was released for the Windows version that removed Games for Windows – Live, restoring the split-screen co-op feature to the game.
## Reception
Resident Evil 5 received generally favorable reviews, according to review aggregator Metacritic. Reviewers praised the game's visuals and content. Corey Cohen of Official Xbox Magazine complimented the game's fast pace, and called the graphics gorgeous. It was praised by Joe Juba and Matt Miller of Game Informer, who said that it had the best graphics of any game to date and that the music and voice acting helped bring the characters to life, and Brian Crecente of Kotaku said it was one of the most visually stunning games he had ever played. Adam Sessler of X-Play said the game's graphics were exceptional, and Edge praised the gameplay as exhilarating and frantic. For IGN, Ryan Geddes wrote that the game had a surprisingly high replay value, and GameZone's Louis Bedigian said the game was "worth playing through twice in one weekend".
While still giving favorable reviews of the game, several reviewers considered it to be a departure from the survival horror genre, a decision they lamented. Chris Hudak of GameRevolution considered the game to be a "full-on action blockbuster", and Brian Crecente said that about halfway through the game it "dropp[ed] all pretense of being a survival horror title and unmask[ed] itself as an action shooter title". Kristan Reed of Eurogamer said the game "morphs what was a survival horror adventure into a survival horror shooter", and believed that this attempt to appeal to action gamers would upset some of the series' fans.
Aspects of the game's control scheme were viewed negatively by critics. James Mielke of 1UP.com criticized several inconsistencies in the game, such as only being able to take cover from enemy fire in very specific areas. Mielke also criticized its controls, saying that aiming was too slow and noting the inability to strafe away from (or quickly jump back from) enemies; despite these problems, he found it was "still a very fun game". Kristan Reed also criticized some of the controls, such as the speed at which 180-degree turns were performed and the difficulty of accessing the inventory. Joe Juba said that the inability to move and shoot at the same time seemed more "like a cheap and artificial way to increase difficulty than a technique to enhance tension." While praising some aspects of the AI control of Sheva, Ryan Geddes thought that it also had its annoyances, such as its tendency to recklessly expend ammunition and health supplies.
Reception of the downloadable content was favorable. Steven Hopper of GameZone rated the "Lost in Nightmares" DLC eight out of ten, saying that despite the episode's brevity it had high replay value and the addition of new multiplayer elements made it a "worthy investment for fans of the original game." Samuel Claiborn of IGN rated the "Desperate Escape" DLC seven out of ten: "Despite Desperate Escape's well-crafted action sequences, I actually found myself missing the unique vibe of Lost in Nightmares. The dynamic between Jill and Josh isn't particularly thrilling, and the one-liners, banter and endearing kitsch are kept to a minimum."
### Allegations of racism
Resident Evil 5's 2007 E3 trailer was criticized for depicting a white protagonist killing black enemies in a small African village. According to Newsweek editor N'Gai Croal, "There was a lot of imagery in that trailer that dovetailed with classic racist imagery", although he acknowledged that only the preview had been released. Takeuchi said the game's producers were completely surprised by the complaints. The second trailer for the game, released on May 31, 2008, revealed a more racially diverse group of enemies and the African BSAA agent Sheva, who assists the protagonist. Critics felt that Sheva's character was added to address the issue of racism, though Karen Dyer said the character had been in development before the first trailer was released. Takeuchi denied that complaints about racism had any effect in altering the design of Resident Evil 5. He acknowledged that different cultures may have had differing opinions about the trailer, though said he did not expect there to be further complaints once the game was released and people were "able to play the game and see what it is for themselves". In a Computer and Video Games interview, producer Masachika Kawata also addressed the issue: "We can't please everyone. We're in the entertainment business—we're not here to state our political opinion or anything like that. It's unfortunate that some people felt that way."
In Eurogamer's February 2009 preview of Resident Evil 5, Dan Whitehead expressed concern about controversy the game might generate: "It plays so blatantly into the old clichés of the dangerous 'dark continent' and the primitive lust of its inhabitants that you'd swear the game was written in the 1920s". Whitehead said that these issues became more "outrageous and outdated" as the game progressed and that the addition of the "light-skinned" Sheva just made the overall issue worse. Hilary Goldstein from IGN believed that the game was not deliberately racist, and though he did not personally find it offensive, he felt that others would due to the subjective nature of offensiveness. Chris Hudak dismissed any allegations of racism as "stupid". Karen Dyer, who is of Jamaican descent, also dismissed the claims. She said that in over a year of working on the game's development she never encountered anything racially insensitive, and would not have continued working there if she had.
Wesley Yin-Poole of VideoGamer.com said that despite the controversy the game was attracting due to alleged racism, no expert opinion had been sought. He asked Glenn Bowman, senior lecturer in social anthropology at the University of Kent, whether he thought the game was racist. Bowman considered the racism accusations "silly", saying that the game had an anti-colonial theme and those complaining about the game's racism might be expressing an "inverted racism which says that you can't have scary people who are black". It was reported that one cutscene in the game showed "black men" dragging off a screaming white woman; according to Yin-Poole, the allegation was incorrect and the single man dragging the woman was "not obviously black". The scene was submitted to the British Board of Film Classification for evaluation. BBFC head of communications Sue Clark said, "There is only one man pulling the blonde woman in from the balcony [and he] is not black either. As the whole game is set in Africa it is hardly surprising that some of the characters are black ... we do take racism very seriously, but in this case, there is no issue around racism."
Academic journals and conferences, however, have continued to comment on the theme of race within the game. In 2011, André Brock, writing in Games and Culture, said that the game drew from well-established racial and gender stereotypes, arguing that the African people were depicted only as savage, even before transitioning into zombies. Writing for the Digital Games Research Association in 2011, Geyser and Tshabalala noted that racial stereotyping had never been intended by Capcom, though they compared its depiction of Africa to that of the 1899 novel Heart of Darkness. Post-colonial Africa, they opined, was portrayed as being unable to take care of itself, and at the mercy of Western influences.
Writing for The Philosophy of Computer Games Conference in 2015, Harrer and Pichlmair considered Resident Evil 5 to be "yet another moment in the history of commodity racism, which from the late 19th century onwards allowed popular depictions of racial stereotypes to enter the most intimate spaces of European homes". The authors state that Africa is presented from a Western gaze; "what is presented as 'authentic' blackness conforms to the projected fantasy of predominantly white gaming audience". In 2016, Paul Martin, writing in Games and Culture, said that the theme of the game could be described as "dark continent", stating it drew on imagery of European colonialism and depictions of "Blackness" reminiscent of 19th-century European theories on race.
### Sales
The PlayStation 3 version of Resident Evil 5 was the top-selling game in Japan in the two weeks following its release, with 319,590 units sold. In March 2009, it became the fastest-selling game of the franchise in the United Kingdom, and the biggest Xbox 360 and PlayStation 3 game release in the country. By December 2023, Resident Evil 5 had sold 9 million units worldwide on PlayStation 3 and Xbox 360 with its original release. The Gold Edition had sold an additional 2.4 million units on PlayStation 3 and Xbox 360. The PlayStation 4 and Xbox One versions sold another 3 million units combined, bringing the total sales to 14.4 million units.
The original release of Resident Evil 5 was Capcom's best-selling individual edition of a game until March 2018, when Monster Hunter: World's sales reached 7.5 million units, compared to 7.3 million for Resident Evil 5 at the time. It was still the best-selling title in the Resident Evil franchise as of 2018, and remains so as of December 2023 when not including remakes; Resident Evil 2 has sold 18.06 million copies when including both the original 1998 release and the 2019 remake.
### Awards
Resident Evil 5 won the "Award of Excellence" at the 2009 Japan Game Awards. It was nominated for both Best Action/Adventure Game and Best Console Game at the 2008 Game Critics Awards, Best Action Game at the 2009 IGN Game of the Year Awards, and Best Sound Editing in Computer Entertainment at the 2010 Golden Reel Awards. It received five nominations at the 2010 Game Audio Network Guild Awards: Audio of the Year, Best Cinematic/Cut-Scene Audio, Best Dialogue, Best Original Vocal Song – Pop (for the theme song "Pray") and Best Use of Multi-Channel Surround in a Game. Karen Dyer's portrayal of Sheva Alomar was nominated for Outstanding Achievement in Character Performance at the 13th Annual Interactive Achievement Awards, while the game itself garnered a nomination for Outstanding Achievement in Art Direction.
# Kinzua Bridge
The Kinzua Bridge or the Kinzua Viaduct (/ˈkɪnzuː/, /-zuːə/) was a railroad trestle that spanned Kinzua Creek in McKean County in the U.S. state of Pennsylvania. The bridge was 301 feet (92 m) tall and 2,052 feet (625 m) long. Most of its structure collapsed during a tornado in July 2003.
Billed as the "Eighth Wonder of the World", the original 1882 wrought-iron structure held the record for the tallest railroad bridge in the world for two years. In 1900, the bridge was dismantled and simultaneously rebuilt out of steel to allow it to accommodate heavier trains. It stayed in commercial service until 1959, when it was sold to a salvage company. In 1963 the Commonwealth of Pennsylvania purchased the bridge as the centerpiece of a state park.
Restoration of the bridge began in 2002, but before it was finished, a tornado struck in July 2003, causing a large portion of the structure to collapse. Corroded anchor bolts holding the towers to their foundations failed, contributing to the collapse.
Before its collapse, the Kinzua Bridge was ranked as the fourth-tallest railway bridge in the United States. It was listed on the National Register of Historic Places in 1977 and as a National Historic Civil Engineering Landmark by the American Society of Civil Engineers in 1982. The ruins of the Kinzua Bridge are in Kinzua Bridge State Park off U.S. Route 6 near the borough of Mount Jewett, Pennsylvania.
## Original construction and service
In 1882, Thomas L. Kane, president of the New York, Lake Erie and Western Railway (NYLE&W), was faced with the challenge of building a branch line off the main line in Pennsylvania, from Bradford south to the coalfields in Elk County. The fastest way to do so was to build a bridge across the Kinzua Valley. The only alternative would have been to lay an additional 8 miles (13 km) of track over rough terrain. When built, the bridge was larger than any ever attempted and over twice as large as the largest similar structure at the time, the Portage Bridge over the Genesee River in western New York.
The first Kinzua Bridge was built by a crew of 40 from 1,552 short tons (1,408 t) of wrought iron in just 94 working days, between May 10 and August 29, 1882. The reason for the short construction time was that scaffolding was not used in the bridge's construction; instead a gin pole was used to build the first tower, then a traveling crane was built atop it and used in building the second tower. The process was then repeated across all 20 towers.
The bridge was designed by the engineer Octave Chanute and was built by the Phoenix Iron Works, which specialized in producing patented, hollow iron tubes called "Phoenix columns". Because of the design of these columns, it was often mistakenly believed that the bridge had been built out of wooden poles. The bridge's 110 masonry piers, used for the foundations of the bridge, were built of sandstone quarried from the hillside. The tallest tower had a base that was 193 feet (59 m) wide. The bridge was designed to support a load of 266 short tons (241 t), and was estimated to cost between $167,000 and $275,000.
On completion, the bridge was the tallest railroad bridge in the world and was advertised as the "Eighth Wonder of the World". Six of the bridge's 20 towers were taller than the Brooklyn Bridge. Excursion trains from as far away as Buffalo, New York, and Pittsburgh would come just to cross the Kinzua Bridge, which held the height record until the Garabit viaduct, 401 feet (122 m) tall, was completed in France in 1884. Trains crossing the bridge were restricted to a speed of 5 miles per hour (8.0 km/h) because the locomotive, and sometimes the wind, caused the bridge to vibrate. People sometimes visited the bridge in hopes of finding the loot of a bank robber, who supposedly hid $40,000 in gold and currency under or near it.
## Reconstruction and service
By 1893, the NYLE&W had gone bankrupt and was merged with the Erie Railroad, which became the owner of the bridge. By the start of the 20th century, locomotives were almost 85 percent heavier and the iron bridge could no longer safely carry trains. The last traffic crossed the old bridge on May 14, 1900, and removal of the old iron began on May 24.
The new bridge was designed by C.R. Grimm and was built by the Elmira Bridge Company out of 3,358 short tons (3,046 t) of steel, at a cost of $275,000. Construction began on May 26, starting from both ends of the old bridge. A crew of between 100 and 150 worked 10-hour days for almost four months to complete the new steel frame. Two Howe truss "timber travelers", each 180 feet (55 m) long and 16 feet (5 m) deep, were used to build the towers. Each "traveler" was supported by a pair of the original wrought-iron towers, separated by the one that was to be replaced. After the middle tower was demolished and a new steel one built in its place, the traveler was moved down the line by one tower and the process was repeated. Construction of each new tower and the spans adjoining it took one week to complete. The bolts used to hold the towers to the anchor blocks were reused from the first bridge, a decision that would eventually play a major role in the bridge's demise. Grimm, the designer of the bridge, later admitted that the bolts should have been replaced.
The Kinzua Viaduct reopened to traffic on September 25, 1900. The new bridge was able to safely accommodate Erie's heavy 2-8-2 Mikados. The Erie Railroad maintained a station at the Kinzua Viaduct. Constructed between 1911 and 1916, the station was not manned by an agent. The station was closed sometime between 1923 and 1927.
Train crews would sometimes play a trick on a brakeman on his first journey on the line. When the train was a short distance from the bridge, the crew would send the brakeman over the rooftops of the cars to check on a small supposed problem. As the train crossed the bridge, the rookie "suddenly found himself terrified, staring down three hundred feet (90 m) from the roof of a rocking boxcar". Even after being reconstructed, the bridge still had a speed limit of 5 miles per hour (8 km/h). As the bridge aged, heavy trains pulled by two steam locomotives had to stop so the engines could cross the bridge one at a time. Diesel locomotives were lighter and did not face that limit; the last steam locomotive for commercial service crossed on October 5, 1950.
The Erie Railroad obtained trackage rights on the nearby Baltimore and Ohio Railroad (B&O) line in the late 1950s, allowing it to bypass the aging Kinzua Bridge. Regular commercial service ended on June 21, 1959, and the Erie sold the bridge to the Kovalchick Salvage Company of Indiana, Pennsylvania, for $76,000. The bridge was reopened for one day in October 1959 when a wreck on the B&O line forced trains to be rerouted across it. According to the American Society of Civil Engineers, the Kinzua Bridge "was a critical structure in facilitating the transport of coal from Northwestern Pennsylvania to the Eastern Great Lakes region, and is credited with causing an increase in coal mining that led to significant economic growth."
## Creation of state park
Nick Kovalchick, head of the Kovalchick Salvage Company, which then owned the bridge, was reluctant to dismantle it. On seeing it for the first time he is supposed to have said "There will never be another bridge like this." Kovalchick worked with local groups who wanted to save the structure, and Pennsylvania Governor William Scranton signed a bill into law on August 12, 1963, to purchase the bridge and nearby land for $50,000 and create Kinzua Bridge State Park. The deed for the park's 316 acres (128 ha) was recorded on January 20, 1965, and the park was opened to the public in 1970.
An access road to the park was built in 1974, and new facilities there included a parking lot, drinking water and toilets, and installation of a fence on the bridge deck. On July 5, 1975, there was an official ribbon cutting ceremony for the park, which "was and is unique in the park system" since "its centerpiece is a man-made structure". The bridge was listed on the National Register of Historic Places on August 29, 1977, and was named a National Historic Civil Engineering Landmark by the American Society of Civil Engineers on June 26, 1982.
The Knox and Kane Railroad (KKRR) operated sightseeing trips from Kane through the Allegheny National Forest and over the Kinzua Bridge from 1987 until the bridge was closed in 2002. In 1988 it operated the longest steam train excursion in the United States, a 97-mile (156 km) round trip to the bridge from the village of Marienville in Forest County, with a stop in Kane. The New York Times described being on the bridge as "more akin to ballooning than railroading" and noted "You stare straight out with nothing between you and an immense sea of verdure a hundred yards [91 m] below." The railroad still operated excursions through the forest and stopped at the bridge's western approach until October 2004.
As of 2009, Kinzua Bridge State Park is a 329-acre (133 ha) Pennsylvania state park surrounding the bridge and the Kinzua Valley. The park is located off U.S. Route 6 north of Mount Jewett in Hamlin and Keating Townships. A scenic overlook within the park allows views of the fallen bridge and of the valley, and is also a prime location to view the fall foliage in mid-October. The park has a shaded picnic area with a centrally located modern restroom. Before the bridge's collapse, visitors were allowed on or under the bridge and hiking was allowed in the valley around the bridge. In September 2002 the bridge was closed even to pedestrian traffic. About 100 acres (40 ha) of Kinzua Bridge State Park are open to hunting. Common game species are turkey, bear and deer.
## Bridge collapse
Since 2002, the Kinzua Bridge had been closed to all "recreational pedestrian and railroad usage" after it was determined that the structure was at risk from high winds. Engineers had determined that during high winds, the bridge's center of gravity could shift, putting weight onto only one side of the bridge and causing it to fail. An Ohio-based bridge construction and repair company had started work on restoring the Kinzua Bridge in February 2003.
On July 21, 2003, construction workers had packed up and were starting to leave for the day when a storm arrived. A tornado spawned by the storm struck the Kinzua Bridge, snapping and uprooting nearby trees, as well as causing 11 of the 20 bridge towers to collapse. There were no deaths or injuries. The tornado was produced by a mesoscale convective system (MCS), a complex of strong thunderstorms, which had formed over an area that included eastern Ohio, western Pennsylvania, western New York, and southern Ontario. The MCS traveled east at around 40 miles per hour (60 km/h). As the MCS crossed northwestern Pennsylvania, it formed into a distinctive comma shape. The northern portion of the MCS contained a long-lived mesocyclone, a thunderstorm with a rotating updraft that is often conducive to tornadoes.
At approximately 15:20 EDT (19:20 UTC), the tornado touched down in Kinzua Bridge State Park, 1 mile (1.6 km) from the Kinzua Bridge. The tornado, classified as F1 on the Fujita scale, passed by the bridge and continued another 2.5 miles (4 km) before it lifted. It touched down again 2 miles (3 km) from Smethport and traveled another 3 miles (4.8 km) before finally dissipating. It was estimated to have been 1⁄3-mile (540 m) wide and it left a path 3.5 miles (5.6 km) long. The same storm also spawned an F3 tornado in nearby Potter County.
When the tornado touched down, the winds had increased to at least 94 miles per hour (151 km/h) and were coming from the east, perpendicular to the bridge, which ran north–south. An investigation determined that Towers 10 and 11 had collapsed first, in a westerly direction. Meanwhile, Towers 12 through 14 had actually been picked up off their foundations, moved slightly to the northwest and set back down intact and upright, held together by only the railroad tracks on the bridge. Next, Towers 4 through 9 collapsed to the west, twisting clockwise, as the tornado started to move northward. As it moved north, inflow winds came in from the south and caused Towers 12, 13, and 14 to finally collapse towards the north, twisting counterclockwise.
The failures were caused by the badly rusted iron base bolts holding the bases of the towers to concrete anchor blocks embedded in the ground. An investigation determined that the tornado had a wind speed of at least 94 miles per hour (151 km/h), which applied an estimated 90 short tons-force (800 kN) of lateral force against the bridge. The investigation also hypothesized that the whole structure oscillated laterally four to five times before fatigue started to cause the base bolts to fail. The towers fell intact in sections and suffered damage upon impact with the ground. The century-old bridge was destroyed in less than 30 seconds.
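As a rough arithmetic check of the unit conversions quoted above (illustrative only, not part of the investigation itself):

```latex
% Wind speed: 94 mph expressed in km/h
94 \times 1.609 \approx 151 \ \text{km/h}
% Lateral force: 90 short tons-force expressed in kilonewtons
90 \times 2000 \ \text{lbf} \times 4.448 \ \tfrac{\text{N}}{\text{lbf}}
  \approx 8.0 \times 10^{5} \ \text{N} \approx 800 \ \text{kN}
```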
## Aftermath
The state decided not to rebuild the Kinzua Bridge, which would have cost an estimated $45 million. Instead, it was proposed that the ruins be used as a visitor attraction to show the forces of nature at work. Kinzua Bridge State Park had attracted 215,000 visitors annually before the bridge collapsed, and was chosen by the Pennsylvania Bureau of Parks for its list of "Twenty Must-See Pennsylvania State Parks". The viaduct and its collapse were featured in the History Channel's Life After People as an example of how corrosion and high winds would eventually lead to the collapse of any steel structure. The bridge was removed from the National Register of Historic Places on July 21, 2004.
The Knox and Kane Railroad was forced to suspend operations in October 2006 after a 75 percent decline in the number of passengers, possibly brought about by the collapse of the Kinzua Bridge. The Kovalchick Corporation bought the Knox and Kane's tracks and all other property owned by the railroad, including the locomotives and rolling stock. The Kovalchick Corporation also owns the East Broad Top Railroad and was the company that owned the Kinzua Bridge before selling it to the state in 1963. The company disclosed plans in 2008 to remove the tracks and sell them for scrap. The right-of-way would then be used to establish a rail trail.
### Sky Walk
The state of Pennsylvania reimagined Kinzua Bridge State Park as one anchored by a "sky walk" viewing platform and a network of hiking trails. In June 2005, it released $700,000 to design repairs to the remaining towers and to plan the development of the new park facilities. In late 2005, the Pennsylvania Department of Conservation and Natural Resources (DCNR) put forward an $8 million proposal for a new observation deck and visitors' center, with plans to allow access to the bridge and a hiking trail giving views of the fallen towers. The Kinzua Sky Walk was opened on September 15, 2011, in a ribbon-cutting ceremony. The Sky Walk consists of a pedestrian walkway to an observation deck with a glass floor at the end of the bridge, allowing views of the bridge and the valley directly below. The walkway cost $4.3 million to construct, but in 2011 a local tourism expert estimated it could eventually bring in $11.5 million of tourism revenue each year.
## See also
- List of bridges documented by the Historic American Engineering Record in Pennsylvania
- List of Erie Railroad structures documented by the Historic American Engineering Record
- National Register of Historic Places listings in McKean County, Pennsylvania
- Tornadoes of 2003
# Haane Manahi
Haane Te Rauawa Manahi, DCM (28 September 1913 – 29 March 1986) was a New Zealand Māori soldier during the Second World War whose gallantry during the Tunisian campaign resulted in a recommendation that he be awarded the Victoria Cross (VC). The subsequent award of the Distinguished Conduct Medal (DCM) disappointed his fellow soldiers who, after his death, advocated greater recognition of his valour. This eventually resulted in a special award in 2007 of an altar cloth for use in a local church, a ceremonial sword and a personal letter from Queen Elizabeth II in recognition of his gallantry.
Born in Ohinemutu, New Zealand, Manahi worked as a labourer when, in November 1939, he volunteered to join the Māori Battalion, newly raised for service in the Second World War. In 1941, he participated in the Battle of Greece and fought in the Battle of Crete during which he was wounded. After recovering from his wounds, he returned to his unit and fought through the Western Desert and Tunisian campaigns, during which he was recommended for a VC for his actions at Takrouna over the period 19–21 April 1943. Despite the support of four generals, his VC nomination was downgraded to an award of a DCM, possibly by the British Chief of the Imperial General Staff, General Alan Brooke.
In June 1943, he returned to New Zealand on a three-month furlough but, when this was completed, was not required to rejoin his battalion, as Māori soldiers on furlough were made exempt from further active duty. After his discharge from the New Zealand Military Forces in 1946, he was employed as a traffic inspector. After his death in a car crash in 1986, a committee was established to urge the New Zealand Government to make representations to Buckingham Palace for a posthumous award of the VC to Manahi. These efforts were ultimately unsuccessful because of the time that had elapsed since the end of the Second World War.
## Early life
Haane Te Rauawa Manahi was the son of Manahi Ngākahawai Te Rauawa, a farm worker, and his wife Neti Mariana née Insley. He was born on 28 September 1913 in Ohinemutu, a village near the town of Rotorua in the North Island of New Zealand. A Māori, he was descended from the Te Arawa and Ngāti Raukawa iwi (tribes) on his father's side, while his mother was also of the Te Arawa iwi in addition to having some Scottish heritage. He attended local schools up to secondary school level. After leaving school, he worked in road construction and farm labour. He also spent time in the timber and building industries alongside his paternal uncle, Matiu Te Rauawa, who had served in the New Zealand Pioneer Battalion that had been raised for military duty during the First World War.
## Second World War
In November 1939, following the outbreak of the Second World War, Manahi was one of the first men to enlist in the newly formed Māori Battalion. The battalion was composed of a headquarters company and four rifle companies, which were organised along tribal lines. Manahi was assigned to B Company, made up largely of other men from Te Arawa. The Māori Battalion was one of ten infantry battalions in the 2nd New Zealand Division and training commenced at Trentham Military Camp in January 1940. Shortly before he departed his home for Trentham, Manahi married Rangiawatea née Te Kiri, the mother of his son, who had been born in 1936.
In early May 1940, after Manahi and his fellow soldiers had been given two weeks of home leave, the battalion embarked for the Middle East as part of the second echelon of the division. While in transit, the convoy carrying the second echelon was diverted to England following the entry of Italy into the war on the side of Nazi Germany. In England, the threat of invasion was high following the evacuation of the British Expeditionary Force from France. After a short period of leave in London, the New Zealanders were engaged in further training and defensive duties, with the Māori Battalion based in Kent and then, once the threat of invasion had receded, in Aldershot. Manahi's company was briefly separated from the battalion and stationed at Waverley Abbey House in Surrey. By late November, it had been decided that the New Zealanders could be sent to the Middle East. The second echelon left for Egypt in early January 1941, with Manahi and the rest of his battalion aboard the Athlone Castle.
### Greece and Crete
On 27 March 1941, Manahi's battalion, having spent two months in Egypt, arrived in Greece to assist in its defence against an anticipated German invasion. Subordinate to the 5th Infantry Brigade, it initially took up defensive positions around Olympus Pass, and in the days following the beginning of the invasion on 6 April, rebuffed initial contact by the advancing Germans. The battalion had to withdraw as the flanks of the Allied positions were threatened. B Company was the last of the battalion's units to abandon its positions, and together with the rest of the Allies, withdrew over the following days to Porto Rafti, where it boarded a transport ship for the island of Crete.
On Crete, the Allies dug in for the expected airborne attack by German paratroopers. The Māori Battalion was positioned near the town of Platanias, as a reserve for the 5th Infantry Brigade, which was tasked with the defence of Maleme Airfield. On 20 May, the Germans commenced their invasion of the island. Manahi was returning to his trench, having just had breakfast, as planes flew overhead, discharging paratroopers. On 23 May, following the loss of the airfield to the Germans, he received a gunshot wound to the chest. Despite this wound, he remained with his company as it was forced to withdraw to the south-west in the following days and was eventually evacuated from Crete on 31 May and transported to Egypt.
### North Africa
By mid-June 1941, after a period of recuperation and leave, Manahi had returned to the Māori Battalion, which had undergone a reorganisation following the campaign in Greece and Crete. It was now training for desert warfare and constructing defensive positions around the Baggush Box, about 150 kilometres (93 mi) west of El Alamein. During this time he participated in a divisional swimming competition, winning the freestyle 50 yards (46 m) race. In November he, along with the rest of the division, participated in Operation Crusader. After crossing the Egyptian border into Libya, this involved near-constant fighting for well over a month, during which Manahi, with two others, captured and commandeered a German tank which had become stuck in B Company's trenches. He drove the tank during an engagement with elements of the 21st Panzer Division on 26 November, helping capture an enemy field gun. In early 1942, the New Zealanders were withdrawn to Syria for a period of rest and garrison duty.
In late May 1942, the Panzer Army Africa, commanded by Generaloberst (colonel general) Erwin Rommel, attacked into Libya. The 2nd Division was rushed back from Syria and dug in at Minqar Qaim. Encircled by the Germans during the Battle of Mersa Matruh, the division was forced to break out from Minqar Qaim on 26 June and withdrew to positions around El Alamein in Egypt. Here, suffering regular artillery barrages, it dug in to await an expected attack. By late August, no attack had been launched and it was decided a raid for prisoners would be undertaken by two companies, one of them being Manahi's B Company. This was successfully executed on 26 August, with over 40 enemy soldiers made prisoners of war. The next month, the battalion was taken out of the line for a brief period of rest before returning for the Second Battle of El Alamein. During the fourth stage of the battle, in what was codenamed Operation Supercharge, Manahi and his company were involved in a successful bayonet charge against well dug-in Germans that had resisted a previous attack by another battalion.
By now, it was clear that the Germans were in retreat and the Allies pursued them into Libya and Tunisia. After a battle at Tebaga Gap, during which Second Lieutenant Moana-Nui-a-Kiwa Ngarimu of the Māori Battalion's C Company won the Victoria Cross (VC), planning began for a push into Tunis, the capital city of Tunisia. Before this could be achieved, a defensive line around Enfidaville needed to be broken.
### Takrouna
By April 1943, the 2nd New Zealand Division had advanced into mountainous country overlooking Enfidaville. Takrouna was a hill, about 1,000 feet (300 m) high, held by soldiers of the Italian Trieste Division's I/66° Battalion, and a German platoon. A village was situated on the summit of the hill with a prominent ledge to one side. The Māori Battalion was tasked by Brigadier Howard Kippenberger, the acting commander of the 2nd New Zealand Division, with the capture of Takrouna. B Company would make the main assault on 19 April, with C and D companies on the flanks. The initial attack petered out due to heavy machine gun fire from the enemy. The battalion's commander, Lieutenant Colonel Charles Bennett, ordered Manahi to take a party of 12 men to make a feint attack while the remainder of B Company linked up with C Company. The party split into two sections, with one under the command of Manahi, newly promoted to lance sergeant. At dawn, they began their attack up a steep and at times near sheer slope and were successfully able to overwhelm the Italians defending the ledge, capturing 60 prisoners. The New Zealanders then dug in and prepared for a counter-attack. Artillery and mortar fire killed half of the platoon, including its commander. This left Manahi, as the senior non-commissioned officer, in charge.
With two attempts to contact the battalion having failed, Manahi made his way down Takrouna to locate reinforcements and supplies. Ignoring an officer's advice that he abandon the ledge, he returned with a section from C Company as well as ammunition and stretcher-bearers. A further platoon arrived to help consolidate the position. The expected counter-attack commenced and was successfully beaten off. It was only then, after having been on Takrouna for 16 hours, that Manahi and what was left of his section withdrew, leaving the newly arrived platoon to hold the ledge.
Despite reinforcements, a further counter-attack launched by Italian forces on 21 April dislodged the New Zealanders and control of the ledge was lost. Kippenberger ordered the Māori Battalion to send reinforcements to rectify the situation. Manahi was specifically requested to join the effort to recapture the ledge due to his knowledge of the terrain. He went with a group of volunteers to regain the lost position and, with artillery support, the attack was successful. By midday, the ledge was reoccupied by the New Zealanders but the village on the summit remained in the hands of the Italians. Later in the afternoon of 21 April, Manahi led an attacking party of seven soldiers which, working with a group from 21st Battalion, captured the village and took 300 prisoners. After the battle, and with Takrouna secure, he assisted with the recovery of the bodies of his dead comrades.
Manahi's exploits quickly became known throughout the 2nd New Zealand Division, and within a few days of his actions a nomination for the VC had been prepared by the commander of his battalion. Brigadier Ralph Harding, commander of the 5th Infantry Brigade, endorsed the nomination, as did four senior officers: Kippenberger; Lieutenant General Bernard Freyberg, the acting commander of X Corps; General Bernard Montgomery, the commander of the Eighth Army; and General Harold Alexander, the commander of 18th Army Group. General Henry Maitland Wilson, commander-in-chief of Middle East Command, likewise endorsed the award after considering the evidence. When the nomination reached the Army Council in London, the award was downgraded to an immediate Distinguished Conduct Medal (DCM). Who authorised the downgrade is not clear, but the historian Paul Moon notes that, given the individuals who had endorsed the VC recommendation, most likely only the Chief of the Imperial General Staff, General Alan Brooke, had the seniority to do so. Manahi's DCM was duly gazetted on 22 July 1943.
The citation for the DCM read:
> On the night of 19–20 April 1943 during the attack upon the Takrouna feature, Tunisia, Lance Sergeant Manahi was in command of a section. The objective of his platoon was the pinnacle, a platform of rock right on top of the feature. By morning the platoon was reduced in strength to ten by heavy mortar and small-arms fire and were pinned to the ground a short way up the feature. The platoon continued towards their objective, Lance Sergeant Manahi leading a party of three up the western side. During this advance, they encountered heavy machine-gun fire from posts on the slope and extensive sniping from the enemy actually on the pinnacle. In order to reach their objective, he and his party had to climb some 500 feet under heavy fire, the last 50 feet being almost sheer. He personally led the party after silencing several machine-gun posts and by climbing hand-over-fist they eventually reached the pinnacle. After a brief fight some sixty enemies, including an artillery observation officer, surrendered. They were then joined by the remainder of the platoon and the pinnacle was captured.
>
> The area was subjected to intense mortar fire from a considerable enemy force still holding Takrouna Village and the northern and western slopes of the feature and later to heavy and continuous shelling. The Platoon Sergeant was killed and other casualties reduced the party holding the pinnacle to Lance Sergeant Manahi and two Privates. An artillery observation officer who had arrived ordered him to withdraw but he and his men remained and held the feature. This action was confirmed by Brigade Headquarters as soon as communications were established. Late morning found the party short of ammunition, rations and water. Lance Sergeant Manahi himself returned to his Battalion at the foot of the feature and brought back supplies and reinforcements, the whole time being under fire. During the afternoon the enemy counter-attacked in force, some of them gaining a foothold. In the face of grenades and small-arms fire he personally led his men against the attackers. Fierce hand-to-hand fighting ensued but eventually, the attackers were driven off. Shortly after this, the party was relieved. The following morning urgent and immediate reinforcements were required as the enemy had once more gained a foothold and Lance Sergeant Manahi led one of two parties which attacked and drove back the enemy despite concentrated mortar and heavy machine-gun fire. All that day the feature was heavily shelled, mortared and subjected to continual machine-gun fire from in and about Takrouna. Late in the afternoon of 21 April, Lance Sergeant Manahi on his own initiative took two men and moved around the north-western side of the feature. In that area were several enemy machine-gun and mortar posts and two 25-pounder guns operated by the enemy. With cool determination Lance Sergeant Manahi led his party against them, stalking one post after another always under the shell and machine-gun fire. By his skill and daring, he compelled the surrender of the enemy in that area.
>
> This courageous action undoubtedly led to the ultimate collapse of the enemy defence and the capture of the whole Takrouna feature with over 300 prisoners, two 25-pounder guns, several mortars and seventy-two machine guns. On the night of 21–22 April, Lance Sergeant Manahi remained on the feature assisting in the evacuation of the dead and wounded and refused to return to his Battalion until this task was completed. During that time the area was being heavily and continually shelled.
>
> Throughout the action, Lance Sergeant Manahi showed the highest qualities of an infantry soldier. He made a supreme contribution to the capture and holding of a feature vital to the success of the operation.
The decision to downgrade the VC recommendation to an award of the DCM was a disappointment to many in the 2nd New Zealand Division. Even outside of the division there was some surprise; the British Lieutenant-General Brian Horrocks, who was present during the fighting at Takrouna and visited the site of the action afterwards, expressed his dismay at the downgrade of Manahi's award in his postwar memoirs. Reports that Manahi's men had killed Italians attempting to surrender were thought by some historians to be a factor in the downgrading of his award. The official history of the Māori Battalion, published in 1956, stated that the surrendering soldiers were "shot, bayonetted or thrown over a cliff" but only after an Italian grenade had been thrown into a building in which wounded New Zealanders were sheltering. However, these reports may not have emerged until after the downgrading, and at the time the killings were alleged to have occurred, Manahi himself was reportedly dealing with an advance by Italian soldiers against the ledge. Another factor in the downgrading may have been the recent VC nomination for Ngarimu, just three weeks earlier. The subsequent nomination of Manahi, a Māori like Ngarimu and from the same battalion, may have led to a perception that VCs were being too easily awarded.
### Return to New Zealand
The surrender of the Axis forces in Tunisia in May left the Allies in control of North Africa. The 2nd New Zealand Division withdrew to its base in Egypt and it was announced that 6,000 of its personnel would return to New Zealand for a three-month furlough. Manahi, as one of around 180 surviving original members of the Māori Battalion, was among those selected and shipped out on 15 June 1943. He was not to return to the war; after many of those on furlough vocalised their displeasure at the prospect of going back to war while other able-bodied men had yet to serve in the military, the New Zealand Government decided to exempt certain long-serving personnel from a return to duty. Māori soldiers, such as Manahi, would be among those released from service.
On returning to Rotorua, Manahi entered a woodworking course and then began working at the local hospital as a carpenter. On 18 December 1945, he was presented with his DCM by Cyril Newall, the Governor-General of New Zealand, in a ceremony at the Auckland Town Hall. He was later selected for the New Zealand Victory Contingent, destined for England to celebrate the Commonwealth's role in the war. As part of the contingent, he participated in the Victory Parade in London on 8 June 1946. This fulfilled his last military obligations, and he was discharged in August.
## Later life
Manahi settled back in Rotorua and returned to the workforce. Employed by the Ministry of Works, he became a traffic inspector which involved travelling around the Bay of Plenty. By this time, he had become estranged from his wife, although the couple never divorced. Manahi later had relationships with other women, and fathered another son with one of them.
A keen sportsman, he became involved in swimming coaching as well as golf and fishing. When his estranged wife died in 1976, he moved away from Rotorua to nearby Maketu, on the coast. He still commuted to Rotorua to socialise at the local branch of the New Zealand Returned Servicemen's Association (RSA). Following his retirement in 1978, he spent even more time at the RSA in Rotorua. On the evening of 29 March 1986, on the way home to Maketu from the RSA clubrooms, he was involved in a car crash. His car veered over the centre line of the road, hit an oncoming vehicle and flipped. The driver and passenger of the other vehicle went to Manahi's aid. He received severe chest and abdominal injuries and was rushed to Tauranga Hospital, where he died later in the evening. His tangi (funeral) was held at the marae (tribal meeting area) in his home village of Ohinemutu, and was attended by former soldiers of the Māori Battalion. Survived by his two sons, he was buried at Muruika cemetery.
## The Manahi VC Committee
The downgrading of Manahi's VC recommendation for his actions at Takrouna still rankled with many members of the Māori Battalion, but while he was alive Manahi's modesty and unwillingness to draw attention to himself meant that he had no interest in pursuing reconsideration of the award. Following his death, the Manahi VC Committee was established by his former comrades and iwi to lobby for an upgrade to his award.
The committee, which felt the downgrade of Manahi's proposed VC award to a DCM was due to him being a Māori, lobbied the New Zealand Government to make representations to Buckingham Palace. It was hoped that Queen Elizabeth II would reconsider the case and make a posthumous grant of the VC to Manahi. This was despite the Queen's father, King George VI, having ruled in 1949 that no further awards from the Second World War ought to be made. The New Zealand Government was reluctant to be officially involved, fearing an outright rejection if formal approaches were made. It favoured a more gradual and casual method to better assess the likely receptiveness of the Palace to the issue and so supported two informal applications made to the Queen in the early 1990s through former Governors-General of New Zealand; these were unsuccessful, with the passage of time since the events of Takrouna cited as a factor.
Further agitation by the committee for an official approach to be made to the Queen resulted in a formal application to the Government in late 1993. This was rejected, one reason cited being the alleged conduct of the Māori soldiers towards Italian prisoners at Takrouna. The rejection prompted the committee to collect more evidence in support of its case, including rebuttal evidence regarding the treatment of the Italians. It also stressed that the Manahi case was about rectifying an error made by the military authorities in downgrading the VC to a DCM; it was not, as the military historian Christopher Pugsley alleged, an attempt to have a soldier awarded a medal for which he had been overlooked. Finally, in 1997, the then Prime Minister of New Zealand, Jenny Shipley, formally broached the subject with Buckingham Palace. The feedback indicated that the time elapsed since the events at Takrouna was a barrier to awarding Manahi a VC.
The campaign to seek redress for Manahi continued, and in 2000 his iwi, Te Arawa, supported by the New Zealand RSA, lodged a claim with the Waitangi Tribunal. Te Arawa alleged that the New Zealand Government's failure to give full consideration to the award of a VC to Manahi constituted a breach of the Treaty of Waitangi, which required the government to act in good faith regarding Māori grievances. In December 2005 the tribunal reported that there was no breach of the treaty. While not making any formal conclusions or recommendations, the tribunal suggested that the Manahi VC Committee work with the New Zealand Government in making an approach to Buckingham Palace.
In October 2006, after further dialogue with Buckingham Palace, the New Zealand Minister of Defence, Phil Goff, announced that Manahi's bravery at Takrouna would be recognised by the presentation of an altar cloth for use at St. Faith's Church at Ohinemutu, a personal letter from the Queen acknowledging his gallantry, and a ceremonial sword. These were presented by Prince Andrew to Manahi's sons, Rauawa and Geoffrey, at a ceremony in Rotorua on 17 March 2007. The sword was later presented to the Chief of the New Zealand Defence Force, Lieutenant General Jerry Mateparae, along with a patu (war club) in memory of Haane Manahi.
# Supernova
A supernova (plural: supernovae or supernovas) is a powerful and luminous explosion of a star. A supernova occurs during the last evolutionary stages of a massive star, or when a white dwarf is triggered into runaway nuclear fusion. The original object, called the progenitor, either collapses to a neutron star or black hole, or is completely destroyed to form a diffuse nebula. The peak optical luminosity of a supernova can be comparable to that of an entire galaxy before fading over several weeks or months.
The last supernova directly observed in the Milky Way was Kepler's Supernova in 1604, appearing not long after Tycho's Supernova in 1572, both of which were visible to the naked eye. The remnants of more recent supernovae have been found, and observations of supernovae in other galaxies suggest they occur in the Milky Way on average about three times every century. A supernova in the Milky Way would almost certainly be observable through modern astronomical telescopes. The most recent naked-eye supernova was SN 1987A, which was the explosion of a blue supergiant star in the Large Magellanic Cloud, a satellite galaxy of the Milky Way.
Theoretical studies indicate that most supernovae are triggered by one of two basic mechanisms: the sudden re-ignition of nuclear fusion in a white dwarf, or the sudden gravitational collapse of a massive star's core.
- In the re-ignition of a white dwarf, the object's temperature is raised enough to trigger runaway nuclear fusion, completely disrupting the star. Possible causes are the accumulation of material from a binary companion through accretion, or a stellar merger.
- In the case of a massive star's sudden implosion, the core of a massive star will undergo sudden collapse once it is unable to produce sufficient energy from fusion to counteract the star's own gravity, which must happen once the star begins fusing iron, but may happen during an earlier stage of metal fusion.
Supernovae can expel several solar masses of material at speeds up to several percent of the speed of light. This drives an expanding shock wave into the surrounding interstellar medium, sweeping up an expanding shell of gas and dust observed as a supernova remnant. Supernovae are a major source of elements in the interstellar medium from oxygen to rubidium. The expanding shock waves of supernovae can trigger the formation of new stars. Supernovae are a major source of cosmic rays. They might also produce gravitational waves.
## Etymology
The word supernova has the plural form supernovae (/-viː/) or supernovas and is often abbreviated as SN or SNe. It is derived from the Latin word nova, meaning "new", which refers to what appears to be a temporary new bright star. Adding the prefix "super-" distinguishes supernovae from ordinary novae, which are far less luminous. The word supernova was coined by Walter Baade and Fritz Zwicky, who began using it in astrophysics lectures in 1931. Its first use in a journal article came the following year in a publication by Knut Lundmark, who may have coined it independently.
## Observation history
Compared to a star's entire history, the visual appearance of a supernova is very brief, sometimes spanning several months, so that the chances of observing one with the naked eye are roughly once in a lifetime. Only a tiny fraction of the 100 billion stars in a typical galaxy have the capacity to become a supernova, the ability being restricted to those having high mass and those in rare kinds of binary star systems with at least one white dwarf.
### Early discoveries
The earliest record of a possible supernova, known as HB9, was likely viewed by an unknown prehistoric people of the Indian subcontinent and recorded on a rock carving in the Burzahama region of Kashmir, dated to 4500±1000 BC. Later, SN 185 was documented by Chinese astronomers in 185 AD. The brightest recorded supernova was SN 1006, which was observed in AD 1006 in the constellation of Lupus. This event was described by observers in China, Japan, Iraq, Egypt and Europe. The widely observed supernova SN 1054 produced the Crab Nebula.
Supernovae SN 1572 and SN 1604, the latest Milky Way supernovae to be observed with the naked eye, had a notable influence on the development of astronomy in Europe because they were used to argue against the Aristotelian idea that the universe beyond the Moon and planets was static and unchanging. Johannes Kepler began observing SN 1604 at its peak on 17 October 1604, and continued to make estimates of its brightness until it faded from naked eye view a year later. It was the second supernova to be observed in a generation, after Tycho Brahe observed SN 1572 in Cassiopeia.
There is some evidence that the youngest known supernova in our galaxy, G1.9+0.3, occurred in the late 19th century, considerably more recently than Cassiopeia A from around 1680. Neither was noted at the time. In the case of G1.9+0.3, high extinction from dust along the plane of the galactic disk could have dimmed the event sufficiently for it to go unnoticed. The situation for Cassiopeia A is less clear; infrared light echoes have been detected showing that it was not in a region of especially high extinction.
### Telescope findings
With the development of the astronomical telescope, observation and discovery of fainter and more distant supernovae became possible. The first such observation was of SN 1885A in the Andromeda Galaxy. A second supernova, SN 1895B, was discovered in NGC 5253 a decade later. Early work on what was originally believed to be simply a new category of novae was performed during the 1920s. These were variously called "upper-class Novae", "Hauptnovae", or "giant novae". The name "supernovae" is thought to have been coined by Walter Baade and Zwicky in lectures at Caltech in 1931. It was used, as "super-Novae", in a journal paper published by Knut Lundmark in 1933, and in a 1934 paper by Baade and Zwicky. By 1938, the hyphen was no longer used and the modern name was in use.
American astronomers Rudolph Minkowski and Fritz Zwicky developed the modern supernova classification scheme beginning in 1941. During the 1960s, astronomers found that the maximum intensities of supernovae could be used as standard candles, hence indicators of astronomical distances. Some of the most distant supernovae observed in 2003 appeared dimmer than expected. This supports the view that the expansion of the universe is accelerating. Techniques were developed for reconstructing supernovae events that have no written records of being observed. The date of the Cassiopeia A supernova event was determined from light echoes off nebulae, while the age of supernova remnant RX J0852.0-4622 was estimated from temperature measurements and the gamma ray emissions from the radioactive decay of titanium-44.
The most luminous supernova ever recorded is ASASSN-15lh, at a distance of 3.82 gigalight-years. It was first detected in June 2015 and peaked at , which is twice the bolometric luminosity of any other known supernova. The nature of this supernova is debated and several alternative explanations, such as tidal disruption of a star by a black hole, have been suggested.
SN 2013fs was recorded three hours after the supernova event on 6 October 2013, by the Intermediate Palomar Transient Factory. This is among the earliest supernovae caught after detonation, and it is the earliest for which spectra have been obtained, beginning six hours after the actual explosion. The star is located in a spiral galaxy named NGC 7610, 160 million light-years away in the constellation of Pegasus.
The supernova SN 2016gkg was detected by amateur astronomer Victor Buso from Rosario, Argentina, on 20 September 2016. It was the first time that the initial "shock breakout" from an optical supernova had been observed. The progenitor star has been identified in Hubble Space Telescope images from before its collapse. Astronomer Alex Filippenko noted: "Observations of stars in the first moments they begin exploding provide information that cannot be directly obtained in any other way."
The James Webb Space Telescope (JWST) has significantly advanced our understanding of supernovae by identifying around 80 new instances through its JWST Advanced Deep Extragalactic Survey (JADES) program. This includes the most distant spectroscopically confirmed supernova at a redshift of 3.6, indicating its explosion occurred when the universe was merely 1.8 billion years old. These findings offer crucial insights into the early universe's stellar evolution and the frequency of supernovae during its formative years.
### Discovery programs
Because supernovae are relatively rare events within a galaxy, occurring about three times a century in the Milky Way, obtaining a good sample of supernovae to study requires regular monitoring of many galaxies. Today, amateur and professional astronomers are finding several hundred every year, some when near maximum brightness, others on old astronomical photographs or plates. Supernovae in other galaxies cannot be predicted with any meaningful accuracy. Normally, when they are discovered, they are already in progress. To use supernovae as standard candles for measuring distance, observation of their peak luminosity is required. It is therefore important to discover them well before they reach their maximum. Amateur astronomers, who greatly outnumber professional astronomers, have played an important role in finding supernovae, typically by looking at some of the closer galaxies through an optical telescope and comparing them to earlier photographs.
Toward the end of the 20th century, astronomers increasingly turned to computer-controlled telescopes and CCDs for hunting supernovae. While such systems are popular with amateurs, there are also professional installations such as the Katzman Automatic Imaging Telescope. The Supernova Early Warning System (SNEWS) project uses a network of neutrino detectors to give early warning of a supernova in the Milky Way galaxy. Neutrinos are subatomic particles that are produced in great quantities by a supernova, and they are not significantly absorbed by the interstellar gas and dust of the galactic disk.
Supernova searches fall into two classes: those focused on relatively nearby events and those looking farther away. Because of the expansion of the universe, the distance to a remote object with a known emission spectrum can be estimated by measuring its Doppler shift (or redshift); on average, more-distant objects recede with greater velocity than those nearby, and so have a higher redshift. Thus the search is split between high redshift and low redshift, with the boundary falling around a redshift range of z=0.1–0.3, where z is a dimensionless measure of the spectrum's frequency shift.
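Concretely, for a spectral feature emitted at wavelength λ<sub>emit</sub> and observed at wavelength λ<sub>obs</sub>, the redshift used for this split is defined as

$$z=\frac{\lambda_{\mathrm{obs}}-\lambda_{\mathrm{emit}}}{\lambda_{\mathrm{emit}}},$$

so that at low redshift the recession velocity is approximately v ≈ cz, consistent with the Doppler interpretation described above.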
High redshift searches for supernovae usually involve the observation of supernova light curves. These are useful for standard or calibrated candles to generate Hubble diagrams and make cosmological predictions. Supernova spectroscopy, used to study the physics and environments of supernovae, is more practical at low than at high redshift. Low redshift observations also anchor the low-distance end of the Hubble curve, which is a plot of distance versus redshift for visible galaxies.
As survey programmes rapidly increase the number of detected supernovae, collated collections of observations (light decay curves, astrometry, pre-supernova observations, spectroscopy) have been assembled. The Pantheon data set, assembled in 2018, detailed 1048 supernovae. In 2021, this data set was expanded to 1701 light curves for 1550 supernovae taken from 18 different surveys, a 50% increase in under 3 years.
## Naming convention
Supernova discoveries are reported to the International Astronomical Union's Central Bureau for Astronomical Telegrams, which sends out a circular with the name it assigns to that supernova. The name is formed from the prefix SN, followed by the year of discovery, suffixed with a one or two-letter designation. The first 26 supernovae of the year are designated with a capital letter from A to Z. Next, pairs of lower-case letters are used: aa, ab, and so on. Hence, for example, SN 2003C designates the third supernova reported in the year 2003. The last supernova of 2005, SN 2005nc, was the 367th (14 × 26 + 3 = 367). Since 2000, professional and amateur astronomers have been finding several hundred supernovae each year (572 in 2007, 261 in 2008, 390 in 2009; 231 in 2013).
Historical supernovae are known simply by the year they occurred: SN 185, SN 1006, SN 1054, SN 1572 (called Tycho's Nova) and SN 1604 (Kepler's Star). Since 1885 the additional letter notation has been used, even if there was only one supernova discovered that year (for example, SN 1885A, SN 1907A, etc.); this last happened with SN 1947A. SN, for SuperNova, is a standard prefix. Until 1987, two-letter designations were rarely needed; since 1988, they have been needed every year. Since 2016, the increasing number of discoveries has regularly led to the additional use of three-letter designations. After zz comes aaa, then aab, aac, and so on. For example, the last supernova retained in the Asiago Supernova Catalogue when it was terminated on 31 December 2017 bears the designation SN 2017jzp.
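The lettering scheme above amounts to counting in an extended alphabetic sequence. The sketch below is an illustrative implementation of that counting; the function name and interface are ours, not part of any IAU tool.

```python
def sn_suffix(n: int) -> str:
    """Suffix for the n-th supernova reported in a year (1-indexed):
    A-Z for the first 26, then lower-case pairs aa..zz, then aaa, aab, ..."""
    if n <= 26:
        return chr(ord("A") + n - 1)
    n -= 26                      # position within the lower-case sequence
    length, block = 2, 26 ** 2
    while n > block:             # skip complete blocks of 2-letter, 3-letter, ... names
        n -= block
        length += 1
        block = 26 ** length
    n -= 1                       # 0-based index within names of this length
    letters = []
    for _ in range(length):
        n, r = divmod(n, 26)
        letters.append(chr(ord("a") + r))
    return "".join(reversed(letters))

# Examples from the text: SN 2003C is the 3rd of 2003; SN 2005nc the 367th of 2005.
assert sn_suffix(3) == "C"
assert sn_suffix(367) == "nc"
```

Counting the same way by hand reproduces the figure quoted above: 26 single letters, plus 13 × 26 pairs for aa–mz, plus three more for na, nb and nc, gives 367.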
## Classification
Astronomers classify supernovae according to their light curves and the absorption lines of different chemical elements that appear in their spectra. If a supernova's spectrum contains lines of hydrogen (known as the Balmer series in the visual portion of the spectrum) it is classified Type II; otherwise it is Type I. In each of these two types there are subdivisions according to the presence of lines from other elements or the shape of the light curve (a graph of the supernova's apparent magnitude as a function of time).
### Type I
Type I supernovae are subdivided on the basis of their spectra, with type Ia showing a strong ionised silicon absorption line. Type I supernovae without this strong line are classified as type Ib and Ic, with type Ib showing strong neutral helium lines and type Ic lacking them. Historically, the light curves of type I supernovae were seen as all broadly similar, too much so to make useful distinctions. While variations in light curves have been studied, classification continues to be made on spectral grounds rather than light-curve shape.
A small number of type Ia supernovae exhibit unusual features, such as non-standard luminosity or broadened light curves, and these are typically categorised by referring to the earliest example showing similar features. For example, the sub-luminous SN 2008ha is often referred to as SN 2002cx-like or class Ia-2002cx.
A small proportion of type Ic supernovae show highly broadened and blended emission lines which are taken to indicate very high expansion velocities for the ejecta. These have been classified as type Ic-BL or Ic-bl.
Calcium-rich supernovae are a rare type of very fast supernova with unusually strong calcium lines in their spectra. Models suggest they occur when material is accreted from a helium-rich companion rather than a hydrogen-rich star. Because of helium lines in their spectra, they can resemble type Ib supernovae, but are thought to have very different progenitors.
### Type II
The supernovae of type II can also be sub-divided based on their spectra. While most type II supernovae show very broad emission lines which indicate expansion velocities of many thousands of kilometres per second, some, such as SN 2005gl, have relatively narrow features in their spectra. These are called type IIn, where the "n" stands for "narrow".
A few supernovae, such as SN 1987K and SN 1993J, appear to change types: they show lines of hydrogen at early times, but, over a period of weeks to months, become dominated by lines of helium. The term "type IIb" is used to describe the combination of features normally associated with types II and Ib.
Type II supernovae with normal spectra dominated by broad hydrogen lines that remain for the life of the decline are classified on the basis of their light curves. The most common type shows a distinctive "plateau" in the light curve shortly after peak brightness where the visual luminosity stays relatively constant for several months before the decline resumes. These are called type II-P referring to the plateau. Less common are type II-L supernovae that lack a distinct plateau. The "L" signifies "linear" although the light curve is not actually a straight line.
Supernovae that do not fit into the normal classifications are designated peculiar, or "pec".
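As a summary of the scheme described in the two subsections above, the decision logic can be sketched as follows. The boolean flags are illustrative stand-ins for what is in practice a detailed fit to observed spectra and light curves.

```python
def classify_supernova(has_hydrogen: bool,
                       strong_silicon: bool = False,
                       has_helium: bool = False,
                       narrow_lines: bool = False,
                       becomes_helium_dominated: bool = False,
                       plateau: bool = False) -> str:
    """Toy decision tree for the spectral/light-curve classes described above."""
    if not has_hydrogen:                         # type I branch
        if strong_silicon:
            return "Ia"                          # strong ionised silicon line
        return "Ib" if has_helium else "Ic"      # helium present vs. absent
    if narrow_lines:                             # type II branch
        return "IIn"
    if becomes_helium_dominated:
        return "IIb"                             # hydrogen early, helium-dominated later
    return "II-P" if plateau else "II-L"

print(classify_supernova(has_hydrogen=False, strong_silicon=True))  # Ia
print(classify_supernova(has_hydrogen=True, plateau=True))          # II-P
```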
### Types III, IV and V
Zwicky defined additional supernovae types based on a very few examples that did not cleanly fit the parameters for type I or type II supernovae. SN 1961i in NGC 4303 was the prototype and only member of the type III supernova class, noted for its broad light curve maximum and broad hydrogen Balmer lines that were slow to develop in the spectrum. SN 1961f in NGC 3003 was the prototype and only member of the type IV class, with a light curve similar to a type II-P supernova, with hydrogen absorption lines but weak hydrogen emission lines. The type V class was coined for SN 1961V in NGC 1058, an unusual faint supernova or supernova impostor with a slow rise to brightness, a maximum lasting many months, and an unusual emission spectrum. The similarity of SN 1961V to the Eta Carinae Great Outburst was noted. Supernovae in M101 (1909) and M83 (1923 and 1957) were also suggested as possible type IV or type V supernovae.
These types would now all be treated as peculiar type II supernovae (IIpec), of which many more examples have been discovered, although it is still debated whether SN 1961V was a true supernova following an LBV outburst or an impostor.
## Current models
Supernova type codes, as summarised in the table above, are taxonomic: the type number is based on the light observed from the supernova, not necessarily its cause. For example, type Ia supernovae are produced by runaway fusion ignited on degenerate white dwarf progenitors, while the spectrally similar type Ib/c are produced from massive stripped progenitor stars by core collapse.
### Thermal runaway
A white dwarf star may accumulate sufficient material from a stellar companion to raise its core temperature enough to ignite carbon fusion, at which point it undergoes runaway nuclear fusion, completely disrupting it. There are three avenues by which this detonation is theorised to happen: stable accretion of material from a companion, the collision of two white dwarfs, or accretion that causes ignition in a shell that then ignites the core. The dominant mechanism by which type Ia supernovae are produced remains unclear. Despite this uncertainty, type Ia supernovae have very uniform properties and are useful standard candles over intergalactic distances. Some calibrations are required to compensate for the gradual change in properties or different frequencies of abnormal luminosity supernovae at high redshift, and for small variations in brightness identified by light curve shape or spectrum.
#### Normal type Ia
There are several means by which a supernova of this type can form, but they share a common underlying mechanism. If a carbon-oxygen white dwarf accreted enough matter to reach the Chandrasekhar limit of about 1.44 solar masses (for a non-rotating star), it would no longer be able to support the bulk of its mass through electron degeneracy pressure and would begin to collapse. However, the current view is that this limit is not normally attained; increasing temperature and density inside the core ignite carbon fusion as the star approaches the limit (to within about 1%) before collapse is initiated. In contrast, for a core primarily composed of oxygen, neon and magnesium, the collapsing white dwarf will typically form a neutron star. In this case, only a fraction of the star's mass will be ejected during the collapse.
Within a few seconds of the collapse process, a substantial fraction of the matter in the white dwarf undergoes nuclear fusion, releasing enough energy (1–2×10<sup>44</sup> J) to unbind the star in a supernova. An outwardly expanding shock wave is generated, with matter reaching velocities on the order of 5,000–20,000 km/s, or roughly 3% of the speed of light. There is also a significant increase in luminosity, reaching an absolute magnitude of −19.3 (or 5 billion times brighter than the Sun), with little variation.
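As a rough consistency check of these figures, assuming about one solar mass of ejecta moving at 10,000 km/s (an assumed value near the middle of the quoted velocity range):

$$E_{\mathrm{kin}}\approx\tfrac12\,Mv^{2}\approx\tfrac12\,(2\times10^{30}\,\mathrm{kg})\,(10^{7}\,\mathrm{m\,s^{-1}})^{2}=10^{44}\,\mathrm{J},\qquad \frac{v}{c}\approx\frac{10^{4}\,\mathrm{km/s}}{3\times10^{5}\,\mathrm{km/s}}\approx3\%,$$

which matches the order of the 1–2×10<sup>44</sup> J and the few-percent-of-light-speed figures quoted above.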
The model for the formation of this category of supernova is a close binary star system. The larger of the two stars is the first to evolve off the main sequence, and it expands to form a red giant. The two stars now share a common envelope, causing their mutual orbit to shrink. The giant star then sheds most of its envelope, losing mass until it can no longer continue nuclear fusion. At this point, it becomes a white dwarf star, composed primarily of carbon and oxygen. Eventually, the secondary star also evolves off the main sequence to form a red giant. Matter from the giant is accreted by the white dwarf, causing the latter to increase in mass. The exact details of initiation and of the heavy elements produced in the catastrophic event remain unclear.
Type Ia supernovae produce a characteristic light curve—the graph of luminosity as a function of time—after the event. This luminosity is generated by the radioactive decay of nickel-56 through cobalt-56 to iron-56. The peak luminosity of the light curve is extremely consistent across normal type Ia supernovae, having a maximum absolute magnitude of about −19.3. This is because typical type Ia supernovae arise from a consistent type of progenitor star by gradual mass acquisition, and explode when they acquire a consistent typical mass, giving rise to very similar supernova conditions and behaviour. This allows them to be used as a secondary standard candle to measure the distance to their host galaxies.
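As a check on the quoted peak brightness, and on how it is used as a distance indicator, assuming a solar absolute magnitude of about +4.8:

$$\frac{L_{\mathrm{SN}}}{L_{\odot}}=10^{(M_{\odot}-M_{\mathrm{SN}})/2.5}\approx10^{(4.8-(-19.3))/2.5}\approx4\times10^{9},$$

of the same order as the "5 billion times brighter than the Sun" figure given earlier. The distance to the host galaxy then follows from the distance modulus, $m-M=5\log_{10}(d/10\,\mathrm{pc})$, where m is the observed peak apparent magnitude and M ≈ −19.3 is the assumed standard-candle absolute magnitude.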
A second model for the formation of type Ia supernovae involves the merger of two white dwarf stars, with the combined mass momentarily exceeding the Chandrasekhar limit. This is sometimes referred to as the double-degenerate model, as both stars are degenerate white dwarfs. Due to the possible combinations of mass and chemical composition of the pair there is much variation in this type of event, and, in many cases, there may be no supernova at all, in which case they will have a less luminous light curve than the more normal SN type Ia.
#### Non-standard type Ia
Abnormally bright type Ia supernovae occur when the white dwarf already has a mass higher than the Chandrasekhar limit, possibly enhanced further by asymmetry, but the ejected material will have less than normal kinetic energy. This super-Chandrasekhar-mass scenario can occur, for example, when the extra mass is supported by differential rotation.
There is no formal sub-classification for non-standard type Ia supernovae. It has been proposed that a group of sub-luminous supernovae that occur when helium accretes onto a white dwarf should be classified as type Iax. This type of supernova may not always completely destroy the white dwarf progenitor and could leave behind a zombie star.
One specific type of supernova originates from exploding white dwarfs, like type Ia, but contains hydrogen lines in their spectra, possibly because the white dwarf is surrounded by an envelope of hydrogen-rich circumstellar material. These supernovae have been dubbed type Ia/IIn, type Ian, type IIa and type IIan.
The quadruple star HD 74438, belonging to the open cluster IC 2391 in the Vela constellation, has been predicted to become a non-standard type Ia supernova.
### Core collapse
Very massive stars can undergo core collapse when nuclear fusion becomes unable to sustain the core against its own gravity; passing this threshold is the cause of all types of supernova except type Ia. The collapse may cause violent expulsion of the outer layers of the star resulting in a supernova. However, if the release of gravitational potential energy is insufficient, the star may instead collapse into a black hole or neutron star with little radiated energy.
Core collapse can be caused by several different mechanisms: exceeding the Chandrasekhar limit; electron capture; pair-instability; or photodisintegration.
- When a massive star develops an iron core larger than the Chandrasekhar mass it will no longer be able to support itself by electron degeneracy pressure and will collapse further to a neutron star or black hole.
- Electron capture by magnesium in a degenerate O/Ne/Mg core (8–10 solar mass progenitor star) removes support and causes gravitational collapse followed by explosive oxygen fusion, with very similar results.
- Electron-positron pair production in a large post-helium burning core removes thermodynamic support and causes initial collapse followed by runaway fusion, resulting in a pair-instability supernova.
- A sufficiently large and hot stellar core may generate gamma-rays energetic enough to initiate photodisintegration directly, which will cause a complete collapse of the core.
The table below lists the known reasons for core collapse in massive stars, the types of stars in which they occur, their associated supernova type, and the remnant produced. The metallicity is the proportion of elements other than hydrogen or helium, as compared to the Sun. The initial mass is the mass of the star prior to the supernova event, given in multiples of the Sun's mass, although the mass at the time of the supernova may be much lower.
Type IIn supernovae are not listed in the table. They can be produced by various types of core collapse in different progenitor stars, possibly even by type Ia white dwarf ignitions, although it seems that most will be from iron core collapse in luminous supergiants or hypergiants (including LBVs). The narrow spectral lines for which they are named occur because the supernova is expanding into a small dense cloud of circumstellar material. It appears that a significant proportion of supposed type IIn supernovae are supernova impostors, massive eruptions of LBV-like stars similar to the Great Eruption of Eta Carinae. In these events, material previously ejected from the star creates the narrow absorption lines and causes a shock wave through interaction with the newly ejected material.
#### Detailed process
When a stellar core is no longer supported against gravity, it collapses in on itself with velocities reaching 70,000 km/s (0.23c), resulting in a rapid increase in temperature and density. What follows depends on the mass and structure of the collapsing core, with low-mass degenerate cores forming neutron stars, higher-mass degenerate cores mostly collapsing completely to black holes, and non-degenerate cores undergoing runaway fusion.
The initial collapse of degenerate cores is accelerated by beta decay, photodisintegration and electron capture, which causes a burst of electron neutrinos. As the density increases, neutrino emission is cut off as the neutrinos become trapped in the core. The inner core eventually reaches a diameter of typically 30 km, with a density comparable to that of an atomic nucleus, and neutron degeneracy pressure tries to halt the collapse. If the core mass is more than about 15 solar masses, neutron degeneracy is insufficient to stop the collapse and a black hole forms directly with no supernova.
In lower mass cores the collapse is stopped and the newly formed neutron core has an initial temperature of about 100 billion kelvin, 6,000 times the temperature of the Sun's core. At this temperature, neutrino-antineutrino pairs of all flavours are efficiently formed by thermal emission. These thermal neutrinos are several times more abundant than the electron-capture neutrinos. About 10<sup>46</sup> joules, approximately 10% of the star's rest-mass energy, is converted into a ten-second burst of neutrinos, which is the main output of the event. The suddenly halted core collapse rebounds and produces a shock wave that stalls in the outer core within milliseconds as energy is lost through the dissociation of heavy elements. A process that is not clearly understood is necessary to allow the outer layers of the core to reabsorb around 10<sup>44</sup> joules (1 foe) from the neutrino pulse, producing the visible brightness, although there are other theories that could power the explosion.
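As an order-of-magnitude check, assuming the collapsed core is a proto-neutron star of roughly 1.4 solar masses (an assumption for illustration, not a figure from the text):

$$E\sim0.1\,Mc^{2}\approx0.1\times(1.4\times2\times10^{30}\,\mathrm{kg})\times(3\times10^{8}\,\mathrm{m\,s^{-1}})^{2}\approx2.5\times10^{46}\,\mathrm{J},$$

consistent with the roughly 10<sup>46</sup> joules carried away by the neutrino burst.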
Some material from the outer envelope falls back onto the neutron star, and, for cores beyond about , there is sufficient fallback to form a black hole. This fallback will reduce the kinetic energy created and the mass of expelled radioactive material, but in some situations, it may also generate relativistic jets that result in a gamma-ray burst or an exceptionally luminous supernova.
The collapse of a massive non-degenerate core will ignite further fusion. When the core collapse is initiated by pair instability (photons turning into electron-positron pairs, thereby reducing the radiation pressure) oxygen fusion begins and the collapse may be halted. For core masses of , the collapse halts and the star remains intact, but collapse will occur again when a larger core has formed. For cores of around , the fusion of oxygen and heavier elements is so energetic that the entire star is disrupted, causing a supernova. At the upper end of the mass range, the supernova is unusually luminous and extremely long-lived due to many solar masses of ejected <sup>56</sup>Ni. For even larger core masses, the core temperature becomes high enough to allow photodisintegration and the core collapses completely into a black hole.
#### Type II
Stars with initial masses less than about never develop a core large enough to collapse and they eventually lose their atmospheres to become white dwarfs. Stars with at least (possibly as much as ) evolve in a complex fashion, progressively burning heavier elements at hotter temperatures in their cores. The star becomes layered like an onion, with the burning of more easily fused elements occurring in larger shells. Although popularly described as an onion with an iron core, the least massive supernova progenitors only have oxygen-neon(-magnesium) cores. These super-AGB stars may form the majority of core collapse supernovae, although less luminous and so less commonly observed than those from more massive progenitors.
If core collapse occurs during a supergiant phase when the star still has a hydrogen envelope, the result is a type II supernova. The rate of mass loss for luminous stars depends on the metallicity and luminosity. Extremely luminous stars at near solar metallicity will lose all their hydrogen before they reach core collapse and so will not form a supernova of type II. At low metallicity, all stars will reach core collapse with a hydrogen envelope but sufficiently massive stars collapse directly to a black hole without producing a visible supernova.
Stars with an initial mass up to about 90 times the Sun, or a little less at high metallicity, result in a type II-P supernova, which is the most commonly observed type. At moderate to high metallicity, stars near the upper end of that mass range will have lost most of their hydrogen when core collapse occurs and the result will be a type II-L supernova. At very low metallicity, stars of around will reach core collapse by pair instability while they still have a hydrogen atmosphere and an oxygen core and the result will be a supernova with type II characteristics but a very large mass of ejected <sup>56</sup>Ni and high luminosity.
#### Type Ib and Ic
These supernovae, like those of type II, are produced by massive stars that undergo core collapse. Unlike the progenitors of type II supernovae, the stars which become type Ib and Ic supernovae have lost most of their outer (hydrogen) envelopes, either through strong stellar winds or through interaction with a companion. These stars are known as Wolf–Rayet stars, and they occur at moderate to high metallicity where continuum-driven winds cause sufficiently high mass-loss rates. Observations of type Ib/c supernovae do not match the observed or expected occurrence of Wolf–Rayet stars. Alternative explanations for this type of core collapse supernova involve stars stripped of their hydrogen by binary interactions. Binary models provide a better match for the observed supernovae, with the proviso that no suitable binary helium stars have ever been observed.
Type Ib supernovae are the more common and result from Wolf–Rayet stars of type WC which still have helium in their atmospheres. For a narrow range of masses, stars evolve further before reaching core collapse to become WO stars with very little helium remaining, and these are the progenitors of type Ic supernovae.
A few percent of the type Ic supernovae are associated with gamma-ray bursts (GRB), though it is also believed that any hydrogen-stripped type Ib or Ic supernova could produce a GRB, depending on the circumstances of the geometry. The mechanism for producing this type of GRB is the jets produced by the magnetic field of the rapidly spinning magnetar formed at the collapsing core of the star. The jets would also transfer energy into the expanding outer shell, producing a super-luminous supernova.
Ultra-stripped supernovae occur when the exploding star has been stripped (almost) all the way to the metal core, via mass transfer in a close binary. As a result, very little material is ejected from the exploding star (c. ). In the most extreme cases, ultra-stripped supernovae can occur in naked metal cores, barely above the Chandrasekhar mass limit. SN 2005ek might be the first observational example of an ultra-stripped supernova, giving rise to a relatively dim and fast decaying light curve. The nature of ultra-stripped supernovae can be both iron core-collapse and electron capture supernovae, depending on the mass of the collapsing core. Ultra-stripped supernovae are believed to be associated with the second supernova explosion in a binary system, producing for example a tight double neutron star system.
In 2022 a team of astronomers led by researchers from the Weizmann Institute of Science reported the first supernova explosion showing direct evidence for a Wolf-Rayet progenitor star. SN 2019hgp was a type Icn supernova and is also the first in which the element neon has been detected.
#### Electron-capture supernovae
In 1980, a "third type" of supernova was predicted by Ken'ichi Nomoto of the University of Tokyo, called an electron-capture supernova. It would arise when a star "in the transitional range (\~8 to 10 solar masses) between white dwarf formation and iron core-collapse supernovae", and with a degenerate O+Ne+Mg core, imploded after its core ran out of nuclear fuel, causing gravity to compress the electrons in the star's core into their atomic nuclei, leading to a supernova explosion and leaving behind a neutron star. In June 2021, a paper in the journal Nature Astronomy reported that the 2018 supernova SN 2018zd (in the galaxy NGC 2146, about 31 million light-years from Earth) appeared to be the first observation of an electron-capture supernova. The 1054 supernova explosion that created the Crab Nebula in our galaxy had been thought to be the best candidate for an electron-capture supernova, and the 2021 paper makes it more likely that this was correct.
### Failed supernovae
The core collapse of some massive stars may not result in a visible supernova. This happens if the initial core collapse cannot be reversed by the mechanism that produces an explosion, usually because the core is too massive. These events are difficult to detect, but large surveys have detected possible candidates. The red supergiant N6946-BH1 in NGC 6946 underwent a modest outburst in March 2009, before fading from view. Only a faint infrared source remains at the star's location.
### Light curves
The ejecta gases would dim quickly without some energy input to keep them hot. The source of this energy, which can maintain the optical supernova glow for months, was at first a puzzle. Some considered rotational energy from the central pulsar as a source. Although the energy that initially powers each type of supernova is delivered promptly, the light curves are dominated by subsequent radioactive heating of the rapidly expanding ejecta. The intensely radioactive nature of the ejecta gases was first calculated on sound nucleosynthesis grounds in the late 1960s, and this has since been demonstrated as correct for most supernovae. It was not until SN 1987A that direct observation of gamma-ray lines unambiguously identified the major radioactive nuclei.
It is now known by direct observation that much of the light curve (the graph of luminosity as a function of time) after the occurrence of a type II Supernova, such as SN 1987A, is explained by those predicted radioactive decays. Although the luminous emission consists of optical photons, it is the radioactive power absorbed by the ejected gases that keeps the remnant hot enough to radiate light. The radioactive decay of <sup>56</sup>Ni through its daughters <sup>56</sup>Co to <sup>56</sup>Fe produces gamma-ray photons, primarily with energies of 847 keV and 1,238 keV, that are absorbed and dominate the heating and thus the luminosity of the ejecta at intermediate times (several weeks) to late times (several months). Energy for the peak of the light curve of SN1987A was provided by the decay of <sup>56</sup>Ni to <sup>56</sup>Co (half-life 6 days) while energy for the later light curve in particular fit very closely with the 77.3-day half-life of <sup>56</sup>Co decaying to <sup>56</sup>Fe. Later measurements by space gamma-ray telescopes of the small fraction of the <sup>56</sup>Co and <sup>57</sup>Co gamma rays that escaped the SN 1987A remnant without absorption confirmed earlier predictions that those two radioactive nuclei were the power sources.
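For reference, the abundance of the intermediate <sup>56</sup>Co follows the standard two-step radioactive decay (Bateman) solution, with decay constants derived from the half-lives quoted above (about 6 days for <sup>56</sup>Ni and 77.3 days for <sup>56</sup>Co):

$$N_{\mathrm{Co}}(t)=N_{\mathrm{Ni}}(0)\,\frac{\lambda_{\mathrm{Ni}}}{\lambda_{\mathrm{Co}}-\lambda_{\mathrm{Ni}}}\left(e^{-\lambda_{\mathrm{Ni}}t}-e^{-\lambda_{\mathrm{Co}}t}\right),\qquad \lambda=\frac{\ln 2}{t_{1/2}},$$

so at late times the heating rate, and hence the light curve, declines on the 77.3-day <sup>56</sup>Co timescale, as described above.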
The late-time decay phase of visual light curves for different supernova types all depend on radioactive heating, but they vary in shape and amplitude because of the underlying mechanisms, the way that visible radiation is produced, the epoch of its observation, and the transparency of the ejected material. The light curves can be significantly different at other wavelengths. For example, at ultraviolet wavelengths there is an early extremely luminous peak lasting only a few hours corresponding to the breakout of the shock launched by the initial event, but that breakout is hardly detectable optically.
The light curves for type Ia are mostly very uniform, with a consistent maximum absolute magnitude and a relatively steep decline in luminosity. Their optical energy output is driven by radioactive decay of ejected nickel-56 (half-life 6 days), which then decays to radioactive cobalt-56 (half-life 77 days). These radioisotopes excite the surrounding material to incandescence. Modern studies of cosmology rely on <sup>56</sup>Ni radioactivity providing the energy for the optical brightness of supernovae of type Ia, which are the "standard candles" of cosmology but whose diagnostic 847 keV and 1,238 keV gamma rays were first detected only in 2014. The initial phases of the light curve decline steeply as the effective size of the photosphere decreases and trapped electromagnetic radiation is depleted. The light curve continues to decline in the B band while it may show a small shoulder in the visual at about 40 days, but this is only a hint of a secondary maximum that occurs in the infra-red as certain ionised heavy elements recombine to produce infra-red radiation and the ejecta become transparent to it. The visual light curve continues to decline at a rate slightly greater than the decay rate of the radioactive cobalt (which has the longer half-life and controls the later curve), because the ejected material becomes more diffuse and less able to convert the high energy radiation into visual radiation. After several months, the light curve changes its decline rate again as positron emission from the remaining cobalt-56 becomes dominant, although this portion of the light curve has been little-studied.
Type Ib and Ic light curves are similar to type Ia although with a lower average peak luminosity. The visual light output is again due to radioactive decay being converted into visual radiation, but there is a much lower mass of the created nickel-56. The peak luminosity varies considerably and there are even occasional type Ib/c supernovae orders of magnitude more and less luminous than the norm. The most luminous type Ic supernovae are referred to as hypernovae and tend to have broadened light curves in addition to the increased peak luminosity. The source of the extra energy is thought to be relativistic jets driven by the formation of a rotating black hole, which also produce gamma-ray bursts.
The light curves for type II supernovae are characterised by a much slower decline than type I, on the order of 0.05 magnitudes per day, excluding the plateau phase. The visual light output is dominated by kinetic energy rather than radioactive decay for several months, due primarily to the existence of hydrogen in the ejecta from the atmosphere of the supergiant progenitor star. In the initial destruction this hydrogen becomes heated and ionised. The majority of type II supernovae show a prolonged plateau in their light curves as this hydrogen recombines, emitting visible light and becoming more transparent. This is then followed by a declining light curve driven by radioactive decay although slower than in type I supernovae, due to the efficiency of conversion into light by all the hydrogen.
In type II-L the plateau is absent because the progenitor had relatively little hydrogen left in its atmosphere, sufficient to appear in the spectrum but insufficient to produce a noticeable plateau in the light output. In type IIb supernovae the hydrogen atmosphere of the progenitor is so depleted (thought to be due to tidal stripping by a companion star) that the light curve is closer to a type I supernova and the hydrogen even disappears from the spectrum after several weeks.
Type IIn supernovae are characterised by additional narrow spectral lines produced in a dense shell of circumstellar material. Their light curves are generally very broad and extended, occasionally also extremely luminous and referred to as a superluminous supernova. These light curves are produced by the highly efficient conversion of kinetic energy of the ejecta into electromagnetic radiation by interaction with the dense shell of material. This only occurs when the material is sufficiently dense and compact, indicating that it has been produced by the progenitor star itself only shortly before the supernova occurs.
Large numbers of supernovae have been catalogued and classified to provide distance candles and test models. Average characteristics vary somewhat with distance and type of host galaxy, but can broadly be specified for each supernova type.
### Asymmetry
A long-standing puzzle surrounding type II supernovae is why the remaining compact object receives a large velocity away from the epicentre; pulsars, and thus neutron stars, are observed to have high peculiar velocities, and black holes presumably do as well, although they are far harder to observe in isolation. The initial impetus can be substantial, propelling an object of more than a solar mass at a velocity of 500 km/s or greater. This indicates an expansion asymmetry, but the mechanism by which momentum is transferred to the compact object remains a puzzle. Proposed explanations for this kick include convection in the collapsing star, asymmetric ejection of matter during neutron star formation, and asymmetrical neutrino emissions.
One possible explanation for this asymmetry is large-scale convection above the core. The convection can create radial variations in density giving rise to variations in the amount of energy absorbed from neutrino outflow. However analysis of this mechanism predicts only modest momentum transfer. Another possible explanation is that accretion of gas onto the central neutron star can create a disk that drives highly directional jets, propelling matter at a high velocity out of the star, and driving transverse shocks that completely disrupt the star. These jets might play a crucial role in the resulting supernova. (A similar model is used for explaining long gamma-ray bursts.) The dominant mechanism may depend upon the mass of the progenitor star.
Initial asymmetries have also been confirmed in type Ia supernovae through observation. This result may mean that the initial luminosity of this type of supernova depends on the viewing angle. However, the expansion becomes more symmetrical with the passage of time. Early asymmetries are detectable by measuring the polarisation of the emitted light.
### Energy output
Although supernovae are primarily known as luminous events, the electromagnetic radiation they release is almost a minor side-effect. Particularly in the case of core collapse supernovae, the emitted electromagnetic radiation is a tiny fraction of the total energy released during the event.
There is a fundamental difference between the balance of energy production in the different types of supernova. In type Ia white dwarf detonations, most of the energy is directed into heavy element synthesis and the kinetic energy of the ejecta. In core collapse supernovae, the vast majority of the energy is directed into neutrino emission, and while some of this apparently powers the observed destruction, 99%+ of the neutrinos escape the star in the first few minutes following the start of the collapse.
Standard type Ia supernovae derive their energy from a runaway nuclear fusion of a carbon-oxygen white dwarf. The details of the energetics are still not fully understood, but the result is the ejection of the entire mass of the original star at high kinetic energy. Around half a solar mass of that mass is <sup>56</sup>Ni generated from silicon burning. <sup>56</sup>Ni is radioactive and decays into <sup>56</sup>Co by beta plus decay (with a half-life of six days) and gamma rays. <sup>56</sup>Co itself decays by the beta plus (positron) path with a half-life of 77 days into stable <sup>56</sup>Fe. These two processes are responsible for the electromagnetic radiation from type Ia supernovae. In combination with the changing transparency of the ejected material, they produce the rapidly declining light curve.
Core collapse supernovae are on average visually fainter than type Ia supernovae, but the total energy released is far higher, as outlined in the following table.
In some core collapse supernovae, fallback onto a black hole drives relativistic jets which may produce a brief energetic and directional burst of gamma rays and also transfers substantial further energy into the ejected material. This is one scenario for producing high-luminosity supernovae and is thought to be the cause of type Ic hypernovae and long-duration gamma-ray bursts. If the relativistic jets are too brief and fail to penetrate the stellar envelope then a low-luminosity gamma-ray burst may be produced and the supernova may be sub-luminous.
When a supernova occurs inside a small dense cloud of circumstellar material, it will produce a shock wave that can efficiently convert a high fraction of the kinetic energy into electromagnetic radiation. Even though the initial energy was entirely normal the resulting supernova will have high luminosity and extended duration since it does not rely on exponential radioactive decay. This type of event may cause type IIn hypernovae.
Although pair-instability supernovae are core collapse supernovae with spectra and light curves similar to type II-P, the nature after core collapse is more like that of a giant type Ia with runaway fusion of carbon, oxygen and silicon. The total energy released by the highest-mass events is comparable to other core collapse supernovae but neutrino production is thought to be very low, hence the kinetic and electromagnetic energy released is very high. The cores of these stars are much larger than any white dwarf and the amount of radioactive nickel and other heavy elements ejected from their cores can be orders of magnitude higher, with consequently high visual luminosity.
### Progenitor
The supernova classification type is closely tied to the type of progenitor star at the time of the collapse. The occurrence of each type of supernova depends on the star's metallicity, since this affects the strength of the stellar wind and thereby the rate at which the star loses mass.
Type Ia supernovae are produced from white dwarf stars in binary star systems and occur in all galaxy types. Core collapse supernovae are only found in galaxies undergoing current or very recent star formation, since they result from short-lived massive stars. They are most commonly found in type Sc spirals, but also in the arms of other spiral galaxies and in irregular galaxies, especially starburst galaxies.
Type Ib and Ic supernovae are hypothesised to have been produced by core collapse of massive stars that have lost their outer layer of hydrogen and helium, either via strong stellar winds or mass transfer to a companion. They normally occur in regions of new star formation, and are extremely rare in elliptical galaxies. The progenitors of type IIn supernovae also have high rates of mass loss in the period just prior to their explosions. Type Ic supernovae have been observed to occur in regions that are more metal-rich and have higher star-formation rates than average for their host galaxies. The table shows the progenitor for the main types of core collapse supernova, and the approximate proportions that have been observed in the local neighbourhood.
There are a number of difficulties reconciling modelled and observed stellar evolution leading up to core collapse supernovae. Red supergiants are the progenitors for the vast majority of core collapse supernovae, and these have been observed but only at relatively low masses and luminosities, below about and , respectively. Most progenitors of type II supernovae are not detected and must be considerably fainter, and presumably less massive. This discrepancy has been referred to as the red supergiant problem. It was first described in 2009 by Stephen Smartt, who also coined the term. After performing a volume-limited search for supernovae, Smartt et al. found the lower and upper mass limits for type II-P supernovae to form to be and , respectively. The former is consistent with the expected upper mass limits for white dwarf progenitors to form, but the latter is not consistent with massive star populations in the Local Group. The upper limit for red supergiants that produce a visible supernova explosion has been calculated at 19<sup>+4</sup><sub>−2</sub> M<sub>☉</sub>.
It is thought that higher mass red supergiants do not explode as supernovae, but instead evolve back towards hotter temperatures. Several progenitors of type IIb supernovae have been confirmed, and these were K and G supergiants, plus one A supergiant. Yellow hypergiants or LBVs are proposed progenitors for type IIb supernovae, and almost all type IIb supernovae near enough to observe have shown such progenitors.
Blue supergiants form an unexpectedly high proportion of confirmed supernova progenitors, partly due to their high luminosity and easy detection, while not a single Wolf–Rayet progenitor has yet been clearly identified. Models have had difficulty showing how blue supergiants lose enough mass to reach supernova without progressing to a different evolutionary stage. One study has shown a possible route for low-luminosity post-red supergiant luminous blue variables to collapse, most likely as a type IIn supernova. Several examples of hot luminous progenitors of type IIn supernovae have been detected: SN 2005gy and SN 2010jl were both apparently massive luminous stars, but are very distant; and SN 2009ip had a highly luminous progenitor likely to have been an LBV, but is a peculiar supernova whose exact nature is disputed.
The progenitors of type Ib/c supernovae are not observed at all, and constraints on their possible luminosity are often lower than those of known WC stars. WO stars are extremely rare and visually relatively faint, so it is difficult to say whether such progenitors are missing or just yet to be observed. Very luminous progenitors have not been securely identified, despite numerous supernovae being observed near enough that such progenitors would have been clearly imaged. Population modelling shows that the observed type Ib/c supernovae could be reproduced by a mixture of single massive stars and stripped-envelope stars from interacting binary systems. The continued lack of unambiguous detection of progenitors for normal type Ib and Ic supernovae may be due to most massive stars collapsing directly to a black hole without a supernova outburst. Most of these supernovae are then produced from lower-mass low-luminosity helium stars in binary systems. A small number would be from rapidly rotating massive stars, likely corresponding to the highly energetic type Ic-BL events that are associated with long-duration gamma-ray bursts.
## External impact
Supernova events generate heavier elements that are scattered throughout the surrounding interstellar medium. The expanding shock wave from a supernova can trigger star formation, and supernova explosions generate galactic cosmic rays.
### Source of heavy elements
Supernovae are a major source of elements in the interstellar medium from oxygen through to rubidium, though the theoretical abundances of the elements produced or seen in the spectra vary significantly depending on the supernova type. Type Ia supernovae produce mainly silicon and iron-peak elements, metals such as nickel and iron. Core collapse supernovae eject much smaller quantities of the iron-peak elements than type Ia supernovae, but larger masses of light alpha elements such as oxygen and neon, and elements heavier than zinc. The latter is especially true of electron capture supernovae. The bulk of the material ejected by type II supernovae is hydrogen and helium. The heavy elements are produced by: nuclear fusion for nuclei up to <sup>34</sup>S; silicon photodisintegration rearrangement and quasiequilibrium during silicon burning for nuclei between <sup>36</sup>Ar and <sup>56</sup>Ni; and rapid capture of neutrons (the r-process) during the supernova's collapse for elements heavier than iron. The r-process produces highly unstable nuclei that are rich in neutrons and that rapidly beta decay into more stable forms. In supernovae, r-process reactions are responsible for about half of all the isotopes of elements beyond iron, although neutron star mergers may be the main astrophysical source for many of these elements.
In the modern universe, old asymptotic giant branch (AGB) stars are the dominant source of dust from oxides, carbon and s-process elements. However, in the early universe, before AGB stars formed, supernovae may have been the main source of dust.
### Role in stellar evolution
Remnants of many supernovae consist of a compact object and a rapidly expanding shock wave of material. This cloud of material sweeps up surrounding interstellar medium during a free expansion phase, which can last for up to two centuries. The wave then gradually undergoes a period of adiabatic expansion, and will slowly cool and mix with the surrounding interstellar medium over a period of about 10,000 years.
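The timing of these phases can be illustrated with the standard Sedov–Taylor similarity solution for a point explosion in a uniform medium. The display below is a textbook scaling added here for orientation rather than a result quoted above; E denotes the explosion energy and ρ<sub>0</sub> the ambient density, both assumed symbols.

```latex
% Sedov-Taylor (adiabatic) phase of a supernova remnant:
% standard dimensional estimate, not taken from the source text.
% E = explosion energy, \rho_0 = ambient density; the coefficient
% ~1.15 applies to a monatomic (gamma = 5/3) gas.
R(t) \approx 1.15 \left( \frac{E\,t^{2}}{\rho_{0}} \right)^{1/5},
\qquad
v(t) = \frac{\mathrm{d}R}{\mathrm{d}t} \approx \frac{2}{5}\,\frac{R(t)}{t}
```

Under this scaling the shock decelerates steadily through the adiabatic phase, until cooling becomes important and the remnant gradually mixes into the surrounding interstellar medium, as described above.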
The Big Bang produced hydrogen, helium and traces of lithium, while all heavier elements are synthesised in stars, supernovae, and collisions between neutron stars (thus being indirectly due to supernovae). Supernovae tend to enrich the surrounding interstellar medium with elements other than hydrogen and helium, which astronomers usually refer to as "metals". These ejected elements ultimately enrich the molecular clouds that are the sites of star formation. Thus, each stellar generation has a slightly different composition, going from an almost pure mixture of hydrogen and helium to a more metal-rich composition. Supernovae are the dominant mechanism for distributing these heavier elements, which are formed in a star during its period of nuclear fusion. The different abundances of elements in the material that forms a star have important influences on the star's life, and may influence the possibility of planets orbiting it: more giant planets form around stars of higher metallicity.
The kinetic energy of an expanding supernova remnant can trigger star formation by compressing nearby, dense molecular clouds in space. The increase in turbulent pressure can also prevent star formation if the cloud is unable to lose the excess energy.
Evidence from daughter products of short-lived radioactive isotopes shows that a nearby supernova helped determine the composition of the Solar System 4.5 billion years ago, and may even have triggered the formation of this system.
Fast radio bursts (FRBs) are intense, transient pulses of radio waves that typically last no more than milliseconds. Many explanations for these events have been proposed; magnetars produced by core-collapse supernovae are leading candidates.
### Cosmic rays
Supernova remnants are thought to accelerate a large fraction of galactic primary cosmic rays, but direct evidence for cosmic ray production has only been found in a small number of remnants. Gamma rays from pion-decay have been detected from the supernova remnants IC 443 and W44. These are produced when accelerated protons from the remnant impact on interstellar material.
### Gravitational waves
Supernovae are potentially strong galactic sources of gravitational waves, but none have so far been detected. The only gravitational wave events so far detected are from mergers of black holes and neutron stars, probable remnants of supernovae. Like the neutrino emissions, the gravitational waves produced by a core-collapse supernova are expected to arrive without the delay that affects light. Consequently, they may provide information about the core-collapse process that is unavailable by other means. Most gravitational-wave signals predicted by supernova models are short in duration, lasting less than a second, and thus difficult to detect. Using the arrival of a neutrino signal may provide a trigger that can identify the time window in which to seek the gravitational wave, helping to distinguish the latter from background noise.
### Effect on Earth
A near-Earth supernova is a supernova close enough to the Earth to have noticeable effects on its biosphere. Depending upon the type and energy of the supernova, it could be as far as 3,000 light-years away. In 1996 it was theorised that traces of past supernovae might be detectable on Earth in the form of metal isotope signatures in rock strata. Iron-60 enrichment was later reported in deep-sea rock of the Pacific Ocean. In 2009, elevated levels of nitrate ions were found in Antarctic ice, which coincided with the 1006 and 1054 supernovae. Gamma rays from these supernovae could have boosted atmospheric levels of nitrogen oxides, which became trapped in the ice.
Historically, nearby supernovae may have influenced the biodiversity of life on the planet. Geological records suggest that nearby supernova events have led to an increase in cosmic rays, which in turn produced a cooler climate. A greater temperature difference between the poles and the equator created stronger winds, increased ocean mixing, and resulted in the transport of nutrients to shallow waters along the continental shelves. This led to greater biodiversity.
Type Ia supernovae are thought to be potentially the most dangerous if they occur close enough to the Earth. Because these supernovae arise from dim, common white dwarf stars in binary systems, it is likely that a supernova that can affect the Earth will occur unpredictably and in a star system that is not well studied. The closest-known candidate is IK Pegasi (HR 8210), about 150 light-years away, but observations suggest it could be as long as 1.9 billion years before the white dwarf can accrete the critical mass required to become a type Ia supernova.
According to a 2003 estimate, a type II supernova would have to be closer than eight parsecs (26 light-years) to destroy half of the Earth's ozone layer, and there are no such candidates closer than about 500 light-years.
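The distance figures quoted above are consistent with the usual parsec-to-light-year conversion; the line below is a simple illustrative check rather than a calculation given in the source.

```latex
% Unit check for the quoted ozone-damage threshold (1 pc ~ 3.26 ly):
8\ \text{pc} \times 3.26\ \frac{\text{ly}}{\text{pc}} \approx 26\ \text{ly}
```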
## Milky Way candidates
The next supernova in the Milky Way will likely be detectable even if it occurs on the far side of the galaxy. It is likely to be produced by the collapse of an unremarkable red supergiant, and it is very probable that it will already have been catalogued in infrared surveys such as 2MASS. There is a smaller chance that the next core collapse supernova will be produced by a different type of massive star such as a yellow hypergiant, luminous blue variable, or Wolf–Rayet. The chances of the next supernova being a type Ia produced by a white dwarf are calculated to be about a third of those for a core collapse supernova. Again it should be observable wherever it occurs, but it is less likely that the progenitor will ever have been observed. It is not even known exactly what a type Ia progenitor system looks like, and it is difficult to detect them beyond a few parsecs. The total supernova rate in the Milky Way is estimated to be between 2 and 12 per century, although one has not actually been observed for several centuries.
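Taken at face value, the quoted rate implies fairly short average waiting times between Galactic supernovae. The figures below are a rough back-of-the-envelope illustration rather than values stated in the source; the apparent gap of several centuries without an observed event is generally attributed to supernovae hidden behind dust in the Galactic plane.

```latex
% Mean interval implied by a rate of 2 to 12 supernovae per century:
\frac{100\ \text{yr}}{12} \approx 8\ \text{yr}
\quad\text{to}\quad
\frac{100\ \text{yr}}{2} = 50\ \text{yr}
```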
Statistically, the most common variety of core-collapse supernova is type II-P, and the progenitors of this type are red supergiants. It is difficult to identify which of those supergiants are in the final stages of heavy element fusion in their cores and which have millions of years left. The most-massive red supergiants shed their atmospheres and evolve to Wolf–Rayet stars before their cores collapse. All Wolf–Rayet stars end their lives from the Wolf–Rayet phase within a million years or so, but again it is difficult to identify those that are closest to core collapse. One class that is expected to have no more than a few thousand years before exploding are the WO Wolf–Rayet stars, which are known to have exhausted their core helium. Only eight of them are known, and only four of those are in the Milky Way.
A number of close or well-known stars have been identified as possible core collapse supernova candidates: the high-mass blue stars Spica and Rigel; the red supergiants Betelgeuse, Antares, and VV Cephei A; the yellow hypergiant Rho Cassiopeiae; the luminous blue variable Eta Carinae that has already produced a supernova impostor; and both components, a blue supergiant and a Wolf–Rayet star, of the Regor or Gamma Velorum system. Mimosa and Acrux, two bright star systems in the southern constellation of Crux, each contain blue stars with sufficient masses to explode as supernovae. Others have gained notoriety as possible, although not very likely, progenitors for a gamma-ray burst; for example, WR 104.
Identification of candidates for a type Ia supernova is much more speculative. Any binary with an accreting white dwarf might produce a supernova, although the exact mechanism and timescale are still debated. These systems are faint and difficult to identify, but novae and recurrent novae are such systems that conveniently advertise themselves. One example is U Scorpii.
## See also
- List of supernovae
- List of supernova remnants
# Battle of Khafji
The Battle of Khafji was the first major ground engagement of the Gulf War. It took place in and around the Saudi Arabian city of Khafji, from 29 January to 1 February 1991.
Iraqi leader Saddam Hussein, who had already tried and failed to draw Coalition forces into costly ground engagements by shelling Saudi Arabian positions and oil storage tanks and firing Scud surface-to-surface missiles at Israel, ordered the invasion of Saudi Arabia from southern Kuwait. The 1st and 5th Mechanized Divisions and 3rd Armored Division were ordered to conduct a multi-pronged invasion toward Khafji, engaging Saudi Arabian, Kuwaiti, and U.S. forces along the coastline, with a supporting Iraqi commando force ordered to infiltrate further south by sea and harass the Coalition's rear.
These three divisions, which had suffered significant losses from attacks by Coalition aircraft in the preceding days, attacked on 29 January. Most of their attacks were repulsed by U.S. Marine Corps and U.S. Army forces but one of the Iraqi columns occupied Khafji on the night of 29–30 January. Between 30 January and 1 February, two Saudi Arabian National Guard battalions and two Qatari tank companies attempted to retake control of the city, aided by Coalition aircraft and U.S. artillery. By 1 February, the city had been recaptured at the cost of 43 Coalition servicemen dead and 52 wounded. Iraqi Army fatalities numbered between 60 and 300, while an estimated 400 were captured as prisoners of war.
Although the invasion of Khafji was initially a propaganda victory for the Ba'athist Iraqi regime, the battle that led to its swift recapture by Coalition forces demonstrated the importance of air support for ground forces.
## Background
On 2 August 1990, the Iraqi Army invaded and occupied the neighboring state of Kuwait. The invasion, which followed the inconclusive Iran–Iraq War and three decades of political conflict with Kuwait, offered Saddam Hussein the opportunity to distract political dissent at home and add Kuwait's oil resources to Iraq's own, a boon in a time of declining petroleum prices.
In response, the United Nations began to pass a series of resolutions demanding the withdrawal of Iraqi forces from Kuwait. Afraid that Saudi Arabia would be invaded next, the Saudi Arabian government requested immediate military aid. As a result, the United States began marshalling forces from a variety of nations, styled the Coalition, on the Arabian peninsula. Initially, Saddam Hussein attempted to deter Coalition military action by threatening Kuwait's and Iraq's petroleum production and export. In December 1990, Iraq experimented with the use of explosives to destroy wellheads in the area of the Ahmadi loading complex, developing its capability to destroy Kuwait's petroleum infrastructure on a large scale. On 16 January, Iraqi artillery destroyed an oil storage tank in Khafji, Saudi Arabia, and on 19 January the pumps at the Ahmadi loading complex were opened, pouring crude oil into the Persian Gulf. The oil flowed into the sea at a rate of 200,000 barrels (32,000 m<sup>3</sup>) a day, becoming one of the worst ecological disasters up to that date.
Despite these Iraqi threats, the Coalition launched a 38-day aerial campaign on 17 January 1991. Flying an estimated 2,000 sorties a day, Coalition aircraft rapidly crippled the Iraqi air defense systems and effectively destroyed the Iraqi Air Force, whose daily sortie rate plummeted from a prewar level of an estimated 200 per day to almost none by 17 January. On the third day of the campaign, many Iraqi pilots fled across the Iranian border in their aircraft rather than be destroyed. The air campaign also targeted command-and-control sites, bridges, railroads, and petroleum storage facilities.
Saddam Hussein, who is believed to have said, "The air force has never decided a war," nevertheless worried that the air campaign would erode Iraq's national morale. The Iraqi leader also believed that the United States would not be willing to lose many troops in action, and therefore sought to draw Coalition ground troops into a decisive battle. In an attempt to provoke a ground battle, he directed Iraqi forces to launch Scud missiles against Israel, while continuing to threaten the destruction of oilfields in Kuwait. These efforts were unsuccessful in provoking a large ground battle, so Saddam Hussein decided to launch a limited offensive into Saudi Arabia with the aim of inflicting heavy losses on the Coalition armies.
As the air campaign continued, the Coalition's expectations of an Iraqi offensive decreased. As a result, the United States redeployed the XVIII Airborne Corps and the VII Corps 480 kilometers (300 mi) to the west. The Coalition's leadership believed that should an Iraqi force go on the offensive, it would be launched from the al-Wafra oil fields, in Southern Kuwait.
## Order of battle
The Iraqi Army had between 350,000 and 500,000 soldiers in theater, organized into 51 divisions, including eight Republican Guard divisions. Republican Guard units normally received the newest equipment; for example, most of the estimated 1,000 T-72 tanks in the Iraqi Army on the eve of the war were in Republican Guard divisions. The Iraqi Army in the Kuwaiti Theater of Operations (KTO) also included nine heavy divisions, composed mostly of professional soldiers, but with weapons of a generally lesser grade than those issued to the Republican Guard.
Most non-Republican Guard armored units had older tank designs, mainly the T-55 or its Chinese equivalents, the Type 59 and Type 69. The remaining 34 divisions were composed of poorly trained conscripts. These divisions were deployed to channel the Coalition's forces through a number of break points along the front, allowing the Iraqi Army's heavy divisions and the Republican Guard units to isolate them and counterattack. However, the Iraqis left their western flank open, failing to account for tactics made possible by the Global Positioning System and other new technologies.
In Saudi Arabia, the Coalition originally deployed over 200,000 soldiers, 750 aircraft and 1,200 tanks. This quickly grew to 3,600 tanks and over 600,000 personnel, of whom over 500,000 were from the United States.
### Iraqi forces
Earmarked for the offensive into Saudi Arabia were the Iraqi Third Corps, the 1st Mechanized Division from Fourth Corps and a number of commando units. Third Corps, commanded by Major General Salah Aboud Mahmoud (who would also command the overall offensive), had the 3rd Armored Division and 5th Mechanized Division, as well as a number of infantry divisions. Fourth Corps' commander was Major General Ayad Khalil Zaki. The 3rd Armored Division had a number of T-72 tanks, the only non-Republican Guard force to have them, while the other armored battalions had T-62s and T-55s, a few of which had an Iraqi appliqué armor similar to the Soviet bulging armor also known as "brow" laminate armor or BDD.
During the Battle of Khafji, these upgraded T-55s survived impacts from MILAN anti-tank missiles. These divisions also had armored personnel carriers such as the BMP-1, scout vehicles such as the BRDM-2, and several types of artillery. Also deployed along this portion of the front, though not chosen to participate in the invasion, were five infantry divisions that were under orders to remain in their defensive positions along the border.
U.S. Marine Corps reconnaissance estimated that the Iraqi Army had amassed around 60,000 troops across the border, near the Kuwaiti town of Wafra, in as many as five or six divisions. Infantry divisions normally consisted of three brigades with an attached commando unit, although some infantry divisions could have up to eight brigades. However, most infantry divisions along the border were understrength, primarily due to desertion.
Armored and mechanized divisions normally made use of three brigades, with each brigade having up to four combat battalions; depending on the division type, these were generally a three to one mix, with either three mechanized battalions and one armored battalion, or vice versa. Given the size of the forces deployed across the border, it is thought that the Iraqi Army planned to continue the offensive, after the successful capture of Khafji, in order to seize the valuable oil fields at Dammam.
The attack would consist of a four-pronged offensive. The 1st Mechanized Division would pass through the 7th and 14th Infantry Divisions to protect the flank of the 3rd Armored Division, which would provide a blocking force west of Khafji while the 5th Mechanized Division took the town. The 1st Mechanized and 3rd Armored divisions would then retire to Kuwait, while the 5th Mechanized Division would wait until the Coalition launched a counteroffensive. The principal objectives were to inflict heavy casualties on the attacking Coalition soldiers and take prisoners of war, who Saddam Hussein theorized would be an excellent bargaining tool with the Coalition.
As the units moved to the Saudi Arabian border, many were attacked by Coalition aircraft. Around the Al-Wafrah forest, about 1,000 Iraqi armored fighting vehicles were attacked by Harrier aircraft with Rockeye cluster bombs. Another Iraqi convoy of armored vehicles was hit by A-10s, which destroyed the first and last vehicles before systematically attacking the stranded remainder. Such air raids prevented the majority of the Iraqi troops deployed for the offensive from taking part in it.
### Coalition forces
During the buildup of forces, the U.S. had built observation posts along the Kuwaiti-Saudi Arabian border to gather intelligence on Iraqi forces. These were manned by U.S. Navy SEALs, U.S. Marine Corps Force Reconnaissance and Army Special Forces personnel. Observation post 8 was farthest to the east, on the coast, and another seven observation posts were positioned at 20 km (12 mi) intervals until the end of the "heel", the geographic panhandle of southernmost Kuwait. Observation posts 8 and 7 overlooked the coastal highway that ran to Khafji, considered the most likely invasion route of the city. 1st Marine Division had three companies positioned at observation posts 4, 5 and 6 (Task Force Shepard), while the 2nd Marine Division's 2nd Light Armored Infantry Battalion set up a screen between observation post 1 and the Al-Wafrah oil fields. The U.S. Army's 2nd Armored Division provided its 1st Brigade to give the Marines some much-needed armored support.
The Saudi Arabians gave responsibility for the defense of Khafji to the 2nd Saudi Arabian National Guard Brigade, attached to Task Force Abu Bakr. The 5th Battalion of the 2nd Saudi Arabian National Guard Brigade set up a screen north and west of Khafji, under observation post 7. At the time, a Saudi Arabian National Guard Brigade could have up to four motorized battalions, each with three line companies. The brigade had a nominal strength of an estimated 5,000 soldiers. The Saudi Arabians also deployed the Tariq Task Force, composed of Saudi Arabian marines, a Moroccan mechanized infantry battalion, and two Senegalese infantry companies. Two further task forces, Othman and Omar, consisted of two mechanized Ministry of Defense and Aviation brigades, providing screens about 3 km (1.9 mi) south of the border. The road south of Khafji was covered by a battalion of the Saudi Arabian National Guard supported by a battalion of Qatari tanks. The country's main defenses were placed 20 km (12 mi) south of the screen.
The majority of the Arab contingent was led by General Khaled bin Sultan. The forces around Khafji were organized into the Joint Forces Command-East, while Joint Forces Command-North defended the border between observation post 1 and the Kuwaiti-Iraqi border.
## Battle
On 27 January 1991, Iraqi President Saddam Hussein met in Basra with the two Iraqi army corps commanders who were to lead the operation, and Major General Salah Mahmoud told him that Khafji would be his by 30 January. During his return trip to Baghdad, Saddam Hussein's convoy was attacked by Coalition aircraft; the Iraqi leader escaped unscathed.
Throughout 28 January, the Coalition received a number of warnings suggesting an impending Iraqi offensive. The Coalition was flying two brand-new E-8A Joint Surveillance Target Attack Radar System (Joint STARS) aircraft, which picked up the deployment and movement of Iraqi forces to the area opposite Khafji. Observation posts 2, 7 and 8 also detected heavy Iraqi reconnoitering along the border, and their small teams of air-naval gunfire liaison Marines called in air and artillery strikes throughout the day. Lieutenant Colonel Richard Barry, commander of the forward headquarters of the 1st Surveillance, Reconnaissance and Intelligence Group, sent warnings about an impending attack to Central Command. CentCom leaders were too preoccupied with the air campaign to heed them, however, and so the Iraqi operation came as a surprise.
### Beginning of Iraqi offensive: 29 January
The Iraqi offensive began on the night of 29 January, when approximately 2,000 soldiers in several hundred armored fighting vehicles moved south. Post-war analysis by the US Air Force's Air University suggests Iraq planned to use the 3rd Armored Division and 5th Mechanized Division to make the actual attack on Khafji, with the 1st Mechanized Division assigned to protect the attacking force's western flank. The Iraqi incursion into Saudi Arabia consisted of three columns, mostly made up of T-62 tanks and armored personnel carriers (APCs). The Gulf War's first ground engagement took place near observation post 4 (OP-4), established at the Al-Zabr police building. Elements of the Iraqi 6th Armored Brigade, ordered to take the heights above Al-Zabr, engaged Coalition units there. At 20:00 hours, U.S. Marines at the observation post, who had noticed large groups of armored vehicles through their night vision devices, attempted to contact battalion headquarters but received no response. Since earlier contact had presented no problems, there was a strong presumption that the reconnaissance platoon's radios were being jammed. Using runners, Lieutenant Ross alerted his platoon and continued trying to get through to inform higher headquarters and Company D of the oncoming Iraqi force. Contact was not established until 20:30 hours, which prompted Task Force Shepard to respond to the threat. Coalition soldiers at observation post 4 were lightly armed, and could only respond with TOW anti-tank missiles before calling in air support. Air support arrived by 21:30 in the form of several F-15Es and F-16Cs, four A-10 tank killers and three AC-130 gunships, which intervened in a heavy firefight between Iraqi and Coalition ground forces at OP-4. The reconnaissance platoon stationed at OP-4 was the first to come under attack; its withdrawal from the engagement was covered by fire from another company. The attempt by the soldiers stationed at OP-4 to fend off or delay the Iraqi advance cost them several casualties, and in the face of a heavy Iraqi response the platoon was forced to retire south on the order of its commanding officer.
To cover the withdrawal, the company's platoon of LAV-25s and LAV-ATs (anti-tank variants) moved to engage the Iraqi force. After receiving permission, one of the anti-tank vehicles opened fire at what it believed was an Iraqi tank. Instead, the missile destroyed a friendly LAV-AT a few hundred meters in front of it. Despite this loss, the platoon continued forward and soon opened fire on the Iraqi tanks with the LAV-25s' autocannons. The fire could not penetrate the tanks' armor, but did damage their optics and prevented the tanks from fighting back effectively.
Soon thereafter, a number of A-10 ground-attack aircraft arrived but found it difficult to pinpoint enemy targets and began dropping flares to illuminate the zone. One of these flares landed on a friendly vehicle, and although the vehicle radioed in its position, it was hit by an AGM-65 Maverick air-to-ground missile that killed the entire crew except for the driver. Following the incident, the company was withdrawn and the remaining vehicles reorganized into another nearby company. With observation post 4 cleared, the Iraqi 6th Armored Brigade withdrew over the border to Al-Wafrah under heavy fire from Coalition aircraft. Coalition forces had lost 11 troops to friendly fire and none to enemy action.
While the events at observation post 4 were unfolding, the Iraqi 5th Mechanized Division crossed the Saudi Arabian border near observation post 1. A Company of the 2nd Light Armored Infantry Battalion, which was screening the Iraqi unit, reported a column of 60–100 BMPs. The column was engaged by Coalition A-10s and Harrier jump jets. This was then followed by another column with an estimated 29 tanks. One of the column's T-62 tanks was engaged by an anti-tank missile and destroyed. Coalition air support, provided by A-10s and F-16s, engaged the Iraqi drive through observation post 1 and ultimately repulsed the attack back over the Kuwaiti border. Aircraft continued to engage the columns throughout the night, until the next morning. Another column of Iraqi tanks, approaching observation post 2, was engaged by aircraft and also repulsed that night.
An additional Iraqi column crossed the Saudi Arabian border farther to the east, along the coast, towards the city of Khafji. These Iraqi tanks were screened by the 5th Mechanized Battalion of the 2nd Saudi Arabian National Guard Brigade. This battalion withdrew when it came under heavy fire, as it had been ordered not to engage the Iraqi column. Elements of the 8th and 10th Saudi Arabian National Guard Brigades also conducted similar screening operations. Because of the order not to engage, the road to Khafji was left open. At one point, Iraqi T-55s of another column rolled up to the Saudi Arabian border, signaling that they intended to surrender. As they were approached by Saudi Arabian troops, they swung their turrets around and opened fire. This prompted air support from a nearby AC-130, which destroyed 13 vehicles.
Nevertheless, the Iraqi advance towards Khafji continued in this sector, despite repeated attacks from an AC-130. Attempts by the Saudi Arabian commanders to call in additional air strikes on the advancing Iraqi column failed when the requested heavy air support never arrived. Khafji was occupied by approximately 00:30 on 30 January, trapping two six-man reconnaissance teams from the 1st Marine Division in the city. The teams occupied two apartment buildings in the southern sector of the city and called in artillery fire on their position to persuade the Iraqis to call off a search of the area. Throughout the night, Coalition air support composed of helicopters and fixed-wing aircraft continued to engage Iraqi tanks and artillery.
### Initial response: 30 January
Distressed by the occupation of Khafji, Saudi Arabian commander General Khaled bin Sultan appealed to U.S. General Norman Schwarzkopf for an immediate air campaign against Iraqi forces in and around the city. However, this was turned down because the buildings would make it difficult for aircraft to spot targets without getting too close. It was instead decided that the city would be retaken by Arab ground forces. The task fell to the 2nd Saudi Arabian National Guard Brigade's 7th Battalion, composed of Saudi Arabian infantry with V-150 armored cars, with two Qatari tank companies attached to the task force. These were supported by U.S. Army Special Forces and Marine Reconnaissance personnel.
The force was put under the command of Saudi Arabian Lieutenant Colonel Matar, who moved out by 17:00 hours. The force met up with elements of the U.S. 3rd Marine Regiment, south of Khafji, and was ordered to directly attack the city. A platoon of Iraqi T-55s attacked south of the city, leading to the destruction of three T-55s by Qatari AMX-30s and the capture of a fourth Iraqi tank. Because the force lacked any coordinated artillery support of its own, artillery fire was provided by the 10th Marine Regiment.
An initial attack on the city was called off after the Iraqi occupants opened up with heavy fire, prompting the Saudi Arabians to reinforce the 7th Battalion with two more companies from adjacent Saudi Arabian units. The attempt to retake the city had been preceded by a 15-minute preparatory barrage from U.S. Marine artillery. However, Iraqi fire did manage to destroy one Saudi Arabian V-150.
Meanwhile, 2nd Saudi Arabian National Guard Brigade's 5th Battalion moved north of Khafji to block Iraqi reinforcements attempting to reach the city. This unit was further bolstered by the 8th Ministry of Defense and Aviation Brigade, and heavily aided by Coalition air support. Although fear of friendly fire forced the 8th Ministry of Defense and Aviation Brigade to pull back the following morning, Coalition aircraft successfully hindered Iraqi attempts to move more soldiers down to Khafji and caused large numbers of Iraqi troops to surrender to Saudi Arabian forces.
That night, two U.S. Army heavy equipment transporters entered the city of Khafji, apparently lost, and were fired upon by Iraqi troops. Although one truck managed to turn around and escape, the two drivers of the second truck were wounded and captured. This led to a rescue mission organized by 3rd Battalion 3rd Marine Regiment, which sent a force of 30 men to extract the two wounded drivers. Although encountering no major opposition, they did not find the two drivers, who had by this time been taken prisoner. The Marines did find a burnt-out Qatari AMX-30 with its dead crew. In the early morning hours, an AC-130 providing overwatch stayed on station beyond sunrise despite the significant risk to its crew; it was shot down by an Iraqi surface-to-air missile (SAM), killing the crew of 14.
The interdiction on the part of Coalition aircraft and Saudi Arabian and Qatari ground forces was having an effect on the occupying Iraqi troops. Referring to Saddam Hussein's naming of the ground engagement as the "mother of all battles", Iraqi General Salah radioed in a request to withdraw, stating, "The mother was killing her children." Since the beginning of the battle, Coalition aircraft had flown at least 350 sorties against Iraqi units in the area and on the night of 30–31 January, Coalition air support also began to attack units of the Iraqi Third Corps assembled on the Saudi Arabian border.
### Recapture of Khafji: 31 January – 1 February
On 31 January, the effort to retake the city began anew. The attack was launched at 08:30 hours, and was met by heavy but mostly inaccurate Iraqi fire; however, three Saudi Arabian V-150 armored cars were knocked out by RPG-7s at close range. The 8th Battalion of the Saudi Arabian brigade was ordered to deploy to the city by 10:00 hours, while the 5th Battalion to the north engaged another column of Iraqi tanks attempting to reach the city. The latter engagement led to the destruction of around 13 Iraqi tanks and armored personnel carriers, and the capture of 6 more vehicles and 116 Iraqi soldiers, costing the Saudi Arabian battalion two dead and two wounded. The 8th Battalion engaged the city from the northeast, linking up with the 7th Battalion. These units cleared the southern portion of the city until the 7th Battalion withdrew south to rest and rearm at 18:30 hours, while the 8th remained in Khafji. The two Qatari tank companies, with U.S. Marine artillery and air support, moved north of the city to block Iraqi reinforcements.
The 8th continued clearing buildings and by the time the 7th had withdrawn to the south, the Saudi Arabians had lost approximately 18 dead and 50 wounded, as well as seven V-150 vehicles. Coalition aircraft continued to provide heavy support throughout the day and night. A veteran of the Iran–Iraq War later remarked that Coalition airpower "imposed more damage on his brigade in half an hour than it had sustained in eight years of fighting against the Iranians." During the battle, an Iraqi amphibious force was sent to land on the coast and move into Khafji. As the boats made their way through the Persian Gulf towards Khafji, U.S. and British aircraft caught them in the open and destroyed over 90% of the Iraqi amphibious force.
The Saudi Arabian and Kuwaiti units renewed operations the following day. Two Iraqi companies, with about 20 armored vehicles, remained in the city and had not attempted to break out during the night. While the Saudi Arabian 8th Battalion continued operations in the southern portion of the city, the 7th Battalion began to clear the northern sector of the city. Iraqi resistance was sporadic and most Iraqi soldiers surrendered on sight; as a result, the city was recaptured on 1 February 1991.
## Aftermath
During the battle, Coalition forces incurred 43 fatalities and 52 injured casualties. These included 25 Americans killed, 11 of them by friendly fire, along with 14 airmen killed when their AC-130 was shot down by an Iraqi SAM. The U.S. also had two soldiers wounded and another two captured in Khafji.
Saudi Arabian casualties totaled 18 killed and 50 wounded. Two Saudi main battle tanks and ten lightly armored V-150s were knocked out. Most of the V-150s were knocked out by RPG-7 fire in close-range fighting inside the town of Khafji. One of the two main battle tanks was hit by a 100 mm main gun round from a T-55.
Iraq listed its casualties as 71 dead, 148 wounded and 702 missing. U.S. sources present at the battle claim that 300 Iraqis lost their lives, and at least 90 vehicles were destroyed. Another source suggests that 60 Iraqi soldiers were killed and at least 400 taken prisoner, while no less than 80 armored vehicles were knocked out; however these casualties are attributed to the fighting both inside and directly north of Khafji.
Whatever the exact casualties, the majority of three Iraqi mechanized/armored divisions had been destroyed.
The Iraqi capture of Khafji was a major propaganda victory for Iraq: on 30 January Iraqi radio claimed that they had "expelled Americans from the Arab territory". For many in the Arab world, the battle of Khafji was seen as an Iraqi victory, and Hussein made every possible effort to turn the battle into a political victory. On the other side, confidence within the United States Armed Forces in the abilities of the Saudi Arabian and Kuwaiti armies increased as the battle progressed. After Khafji, the Coalition's leadership began to sense that the Iraqi Army was a "hollow force", and it provided them with an impression of the degree of resistance they would face during the Coalition's ground offensive that would begin later that month. The Saudi Arabian government, having successfully defended its territory, also regarded the battle as a major propaganda victory.
Despite the success of the engagements between 29 January and 1 February, the Coalition did not launch its main offensive into Kuwait and Iraq until the night of 24–25 February. The invasion of Iraq was completed about 48 hours later. The Battle of Khafji served as a modern example of the ability of air power to serve a supporting role to ground forces. It offered the Coalition an indication of the manner in which Operation Desert Storm would be fought, but also hinted at future friendly-fire casualties which accounted for nearly half of the U.S. dead.
# Ray Lindwall with the Australian cricket team in England in 1948
Ray Lindwall was a key member of Donald Bradman's famous Australian cricket team, which toured England in 1948. The Australians went undefeated in their 34 matches; this unprecedented feat by a Test side touring England earned them the sobriquet The Invincibles.
Lindwall played as a right-arm opening fast bowler and right-handed batsman in the lower middle-order. Along with Keith Miller, Lindwall formed Australia's first-choice pace duo, regarded as one of the best of all time, and Bradman typically used them in short and sharp bursts against the home batsmen. The pair were used to target England's leading batsmen, Len Hutton and Denis Compton, during the major matches, and subdued Hutton for much of the summer. England had agreed to make a new ball available after every 55 overs, more often than the usual regulations at the time, thereby allowing the pair more frequent use of a shiny ball that swung at high pace. Bradman gave the duo lighter workloads in the tour matches in order to preserve their energy for the new ball battles against England's key batsmen in the Tests. Lindwall was a capable lower-order batsman who made two Test centuries during his career, and he featured in several rearguard actions that boosted Australia's scores during the tour.
Lindwall was the equal leading wicket-taker in the Tests (27 along with Bill Johnston) and had the best bowling average (19.62) and strike rate. In the first-class matches, he led the averages although he was second in the wicket-taking list with 86 at 15.68 behind Johnston (102), who was assigned more of the workload in order to keep Miller and Lindwall fresh for the Tests. With the bat, Lindwall scored 191 runs at a batting average of 31.83 in the Tests.
Lindwall's most influential contributions in the Ashes matches were his 5/70 in the first innings of the Second Test at Lord's, a hard-hitting 77 that limited Australia's first innings deficit in the Fourth Test at Headingley, and most notably, his 6/20 on the first day of the Fifth Test at The Oval. The performance was a display of extreme pace and swing that earned high praise from pundits and was largely responsible for England being bowled out for 52. Outside the Tests, Lindwall took 11/59 in a match against Sussex, with eight of his victims being bowled as the ball curved through their defences at high pace. In recognition of his achievements, Lindwall was chosen as one of the five Wisden Cricketers of the Year. Wisden said that "by whatever standard he is judged ... [Lindwall] must be placed permanently in the gallery of great fast bowlers".
## Background
A bowler of express pace, Lindwall was a regular member of the Test team and had opened Australia's attack since the resumption of cricket following World War II. During the Australian summer of 1947–48, Lindwall played in all five Tests against the touring India national cricket team. He played a major part in Australia's 4–0 series win as the leading wicket-taker with 18 scalps at an average of 16.88, ahead of Ian Johnson and Bill Johnston who took 16 apiece at averages of 16.31 and 11.37 respectively. As a result, Lindwall was selected as part of Don Bradman's Invincibles that toured England without defeat in 1948, with the intention of leading the pace attack. There were two concerns for Lindwall in the lead-up to the tour. He had been playing with an injured leg tendon, and his foot drag during the delivery stride led to discussion in the media and among umpires as to its legality. Bradman arranged for Lindwall to see his Melbourne masseur Ern Saunders, who restored the paceman's leg to prime condition in a fortnight. On the public relations front, Bradman stated his firm belief in the fairness of Lindwall's delivery. During the lengthy sea voyage to England, Bradman emphasized the importance of caution with respect to Lindwall's bowling action. Bradman advised Lindwall to ensure that his dragging rear right foot was further behind the line than usual to avoid being no-balled, and to refrain from bowling at full speed until the umpires were satisfied with his delivery stride. The Australian captain guaranteed Lindwall selection for the Tests and told him that his first priority in the lead-in tour matches was passing the umpires' scrutiny. Bradman recalled how paceman Ernie McCormick had been no-balled 35 times in the traditional tour opener against Worcester during the 1938 campaign, destroying his confidence for the rest of the season.
## Early tour
In the lead-up to the match against Worcestershire, photographers and cameramen constantly followed Lindwall, trying to capture visual evidence of an illegal drag when he was bowling in the practice nets. Bradman tried to stop the journalists from taking photographs of the crease as Lindwall came in to bowl, and deflected media questions about his bowler's drag. The spectators and media were keenly observing the position of his foot in the tour opener, but Lindwall was not no-balled in the first match at Worcester, and complaints about his drag faded away for the rest of the tour. The hosts elected to bat first, and Lindwall took 2/41 in the first innings, bowling and trapping his victims leg before wicket before delivering three wicketless overs for 19 runs in the second innings. Lindwall took a wicket with his second ball in England, trapping Don Kenyon for a duck to leave the hosts at 1/0. He was promoted to No. 4 as Bradman rotated his batting order and he scored a quickfire 32 from 34 balls with six boundaries before Australia completed an innings victory. According to former Australian Test batsman and journalist Jack Fingleton, Lindwall "took things very quietly ... The fast bowler is very wise who builds up his speed match by match". With the media and public attention now focused on actual bowling, Lindwall's classical bowling action evoked almost as much interest as his captain's batting.
Bradman rested Lindwall for the second tour match against Leicestershire, which ended in an innings victory for the Australians. Lindwall returned for the next fixture against Yorkshire at Bradford, but bowled only nine overs for a total of 1/16. He made a duck in the first innings on a damp pitch favourable to slower bowling, as Australia scraped home by four wickets. Lindwall was due in next when Australia collapsed to 6/31 in pursuit of 60 for victory in the second innings. The tourists were effectively seven wickets down with the injured Sam Loxton unable to bat, but Neil Harvey and Don Tallon saw Australia to the target without further loss. It was the closest Australia came to defeat for the whole tour. The Australians travelled to London to play Surrey at The Oval. They batted first and Lindwall managed only four, clean bowled by Alec Bedser, as Australia amassed 632. He then took the first two wickets to reduce Surrey to 2/15 in the first innings. Bradman used Lindwall sparingly, taking a match total of 3/45 from 25 overs as Surrey were defeated by an innings. Fingleton felt that Lindwall was at his fastest for the season during the Surrey match. One of Lindwall's bouncers flew over three feet above the batsman's head.
Lindwall had another light workout in the match against Cambridge University, taking match figures of 1/33 from nine overs. His solitary wicket was that of Doug Insole, and he was not required to bat as Australia completed another innings victory. After the fixture against Cambridge, Lindwall was rested for two consecutive matches. In the first, Australia crushed Essex by an innings and 451 runs, its largest winning margin for the summer. The second match resulted in another innings victory, this time over Oxford University.
Lindwall was brought back for the match against the Marylebone Cricket Club (MCC) at Lord's. The MCC fielded seven players who would represent England in the Tests, and were basically a full-strength Test team, while Australia selected their first-choice team. It was a chance for the Australian pace attack to gain a psychological advantage ahead of the Tests, with Len Hutton, Denis Compton and Bill Edrich—three of England's first four batsmen—all playing. Australia batted first and Lindwall scored 29, including three sixes from the bowling of England Test off spinner Jim Laker. After Australia amassed 552, Lindwall took combined figures of 2/68, removing Compton and England and MCC captain Norman Yardley, as the tourists enforced the follow on and won by an innings and 158 runs. The MCC fixture was followed by Australia's first non-victory of the tour against Lancashire. After the first day was washed out, Lindwall made a duck and then took 3/44, removing Test players Ken Cranston and Jack Ikin during the hosts' only innings. He was not required to bat in the second innings as the match petered into a draw on the final day.
In the next match against Nottinghamshire at Trent Bridge, Lindwall took 6/14 to gain a psychological blow ahead of the First Test to be held at the same venue. After taking the first four wickets and cutting through the top order, the paceman returned to finish off the tail and ensure the hosts were dismissed for 179. Lindwall bowled three of his victims with fast and swinging yorkers. He delivered 91 balls, conceding less than a run per over. Only ten of his deliveries were scored from, and not a single run was taken from his last 30 balls. Fingleton said that Lindwall "absolutely paralysed" the batsmen, with some of his bowling "in the real Larwood manner". Harold Larwood was a Nottinghamshire express pace bowler of the 1920s and 1930s who led the controversial Bodyline attack in Australia in 1932–33; Lindwall had modelled his bowling action on Larwood after seeing him in action at the Sydney Cricket Ground during that tour. Lindwall could only manage eight runs with the bat and was unable to repeat his incisive bowling in the second innings, sending down 14 wicketless overs as the match ended in a draw.
Lindwall was rested for Australia's eight-wicket win over Hampshire, before returning in the following match against Sussex. Bowling with a tailwind, he took two early wickets before returning to remove the last four batsmen to fall. Five of Lindwall's opponents were bowled as he ended with 6/34 from 19.4 overs; the hosts were dismissed for 86. During the tourists' reply, Bradman promoted Lindwall to No. 4 and he came to the wicket at 2/342, scored 57 and put on 93 runs with Neil Harvey, who scored an unbeaten 100 as Australia declared at 5/549. Lindwall then broke through with two early wickets to leave Sussex at 2/2, and went on to finish with 5/25 in 15 overs as the home side were bowled out for 138, handing Bradman's men a victory by an innings and 325 runs in two days. He ended with match figures of 11/59; eight of his victims were bowled, five of them by swinging yorkers they were unable to counter.
## First Test
Lindwall lined up for the First Test at Trent Bridge, a venue where he had taken six wickets in the tour game against Nottinghamshire. Australia bowled first, and Lindwall delivered the first ball of the match at a moderate pace; Hutton pushed it square of the wicket on the off side for a single to start proceedings. Lindwall gradually increased his speed and rhythm, and took the wicket of Cyril Washbrook, caught on the run by Bill Brown on the fine leg boundary after the batsman had attempted a hook shot. However, he was forced to leave the field with a groin strain midway through the first day, ending with figures of 1/30. After England were out for 165, Lindwall came out to bat at 7/365 without a runner on the third day, and appeared to be able to run twos and threes without significant difficulty. He added 107 runs for the eighth wicket with vice-captain Lindsay Hassett. Hassett reached his century and proceeded to 137 in almost six hours of batting before Bedser struck his off stump. Four runs later, Lindwall was caught by wicket-keeper Godfrey Evans down the leg side to leave the total at 9/476, but some free hitting by Australia's last pair allowed them to advance to 509, giving them a 344-run lead.
Although Lindwall was able to jog between the wickets, he did not take the field in the second innings and the 12th man Neil Harvey replaced him. This gave Australia a marked fielding advantage—Fingleton described Harvey as "by far the most brilliant fieldsman of both sides". Yardley was sceptical as to whether Lindwall was sufficiently injured to be forced from the field, but he did not formally object to Harvey's presence on the field. Journalist, former Australian Test cricketer and Lindwall mentor Bill O'Reilly said that as Lindwall had demonstrated his mobility during his innings, he was in no way "incapacitated" and that the English captain "must be condemned for carrying his concepts of sportsmanship too far" when no substitute was justified. O'Reilly decried the benefit Australia derived through the substitution, agreeing with Fingleton that Harvey was the tourists' best fielder by far. English commentator John Arlott went further, calling Harvey the best fielder in the world.
However, the gains provided by Harvey were outweighed by Lindwall's absence, which severely hampered Australia's bowling and made their eventual victory much more difficult. England made 441 and Australia reached their target of 98 on the final afternoon to complete an eight-wicket victory. Due to his injury, Lindwall was omitted from the team for the two matches leading up to the Second Test. Australia proceeded to defeat Northamptonshire by an innings before drawing with Yorkshire.
## Second Test
Two weeks after injuring himself at Trent Bridge, Lindwall underwent a thorough fitness examination on the morning of the Second Test at Lord's. Bradman was not convinced of the bowler's fitness, but Lindwall—keen to play at the historic ground known as the "home of cricket"—was able to convince his captain to risk his inclusion. Australia won the toss and elected to bat, allowing Lindwall further time to recover. He came in late on the first afternoon with the score at 7/246 and made three runs before stumps were called at 7/258. The next morning, Lindwall batted confidently from the outset. He hit two cover drives for four from Bedser after the new ball had been taken, prompting O'Reilly to say that Lindwall was playing in the same manner as when he made his maiden Test century in the last Ashes series in 1946–47. However, he then played around a straight ball from Bedser, and was bowled for 15 to leave the score at 8/275. Australia were out for 350 shortly before lunch on the second morning. In the hosts' reply, Lindwall took the new ball and felt pain in his groin after delivering the first ball of the innings to Len Hutton. Despite this, the paceman persevered through the pain.
In his fourth over, Lindwall had Washbrook caught behind for eight after needlessly playing at a ball wide outside off stump. Before lunch, Lindwall bowled six overs and took 1/7, while Bill Johnston accompanied him from the other end because Keith Miller had a back injury and was only able to bat. Neither Hutton nor Washbrook appeared comfortable against the bowling, and the new batsman Bill Edrich tried to hit Lindwall through the off side, leading to a loud appeal for caught behind, which was turned down. After the lunch break, Hutton fell, and Compton came in, having been dismissed hit wicket after falling over in the First Test while trying to avoid a bouncer. Lindwall delivered a few short balls straight away, but the new batsman was not caught off-guard. Lindwall then clean bowled Bill Edrich—who was playing across the line of the ball—for five. Tom Dollery was the new batsman and stopped the first ball with his pads before Lindwall's next delivery broke through his defences and hit the stumps to send him back to the pavilion for a duck. Dollery's bat was about to start its downward swing by the time Lindwall's outswinger had passed him and hit the stumps. O'Reilly said that Dollery's inability to deal with Lindwall was typical of English cricket's lack of answers to express pace bowling. This was part of a six-over post-lunch spell of 2/11 by Lindwall; the batsmen appeared unable to deal with his swing and extreme pace. England were 4/46 and Australia firmly in control, but the home side were given some respite when both Lindwall and Johnston were taken out of the attack. Australia had the option of taking the new ball just before the tea break, but Bradman decided to wait so his two pacemen could have an extra 20 minutes to replenish their energy levels.
Once the second new ball was taken after tea, Lindwall returned but appeared to be tired and lacking in spirit in his first over. Johnston removed Compton, and one run later, Lindwall clipped Yardley's off stump with the first ball of the next over to leave England at 6/134; the home skipper had made 44 before an outswinger had evaded his bat as he attempted to play a back foot defensive shot. At the start of the third morning, he bowled Alec Bedser off an inside edge from a bouncer to finish with 5/70 as Australia took a 135-run first innings lead. O'Reilly said Lindwall "bowled as well as any fast bowler can bowl. He always seemed to have the situation sized up correctly and he knew just when to put his all into the task ... and enjoyed a triumph which seldom comes to any bowler." Arlott praised Lindwall for his subtle variations in pace, line and length, and how he kept the batsmen guessing as to what was coming to them.
Bradman's batsmen set about building on their lead and Lindwall was initially not expected to bat; the Australian captain was expected to declare just before lunch on the fourth day so he could attack the English openers for a short period before the adjournment, but a shower deterred him from doing so, as his bowlers would have struggled to grip the ball. Lindwall had also previously been injured on a slippery surface. Lindwall came to the crease to join Miller with Australia at 5/416 on the fourth afternoon. They attacked at every opportunity before the declaration. Miller fell and Lindwall ran out of his crease in an attempt to hit Laker across the line to the boundary. He missed and was stumped for 25. This prompted Bradman to declare at 7/460 immediately upon his dismissal, leaving England to chase a world record 596 for victory.
Further showers breathed extra life into the pitch at the start of the run chase, and Lindwall and Johnston extracted steep bounce with the new ball, troubling the English batsmen. Lindwall dropped Hutton from Johnston's bowling before he had scored and the English batsmen played and missed multiple times. Hutton had trouble seeing and playing Lindwall's deliveries in the deteriorating light with no sightscreen available, and Fingleton described it as "probably Hutton's worst effort in a Test". O'Reilly said Hutton "seemed to have lost all power of concentration and looked like a man being led to the gallows", calling him "little more than a masquerader compared to the Hutton [of 1938]". Hutton took 32 minutes of batting to score his first run of the innings.
Hutton and Washbrook took the score to 42—England's highest opening partnership of the series thus far—before Hutton edged Lindwall to Johnson in the slips and was out for 13. Edrich and Washbrook were then subjected to repeated short balls, and the latter was hit several times on the fingers while fending down Lindwall's bouncers, having decided to avoid the hook shot. However, both survived Lindwall's spell. Late in the day, Lindwall was brought back to put pressure on Dollery, having bowled him for a duck in the first innings, but the batsman had already been in the middle for a short period and played the pace bowling with more assurance.
Lindwall returned on the final morning with the score at 6/141. Dollery, who had been batting with assurance, shaped to duck under a Lindwall bouncer, but it skidded through low and bowled him. Later in the same over, Lindwall bowled Laker for a duck to leave England at 8/141. The match ended when Doug Wright hit Ernie Toshack to Lindwall and was caught for four. Lindwall ended with 3/61 as Australia took a 409-run victory. Arlott said that while Toshack (5/40) had the best figures, Lindwall was the pivotal performer, and that when Lindwall "so patently disturbed Hutton he struck a blow at the morale of the English batting that was never overcome." In later years, Bradman told Lindwall that he had noticed the bowler's pain but had pretended otherwise; Lindwall had worried that his captain would be disappointed if he detected the injury, and Bradman said he had feigned ignorance so that Lindwall could relax and focus on his bowling.
Lindwall had carried an injury into the Second Test, so Bradman rested him for the match against Surrey, which started the day after the Lord's Test ended; Australia took a ten-wicket victory. Lindwall returned for the match against Gloucestershire. Australia batted first and made 7/774, their highest score of the tour and the second highest by any Australian team in England. Lindwall was unable to partake in the prolific run-getting, as the tourists declared before he had scored. He bowled a total of 21 overs, taking 0/41 as Australia enforced the follow on and won by an innings and 363 runs.
## Third Test
When the teams reconvened at Old Trafford for the Third Test, Hutton had been dropped, sparking much acrimony and controversy in the English cricket community. The reason was said to be Hutton's difficulties against Lindwall's short-pitched bowling. Observers noticed Hutton backing away, and the English selectors believed such a sight would have a negative effect on the rest of the side as it was a poor example from a key player. The Australians were pleased and believed the England selectors had erred, because they regarded Hutton as the hosts' best batsman. England batted first and Lindwall removed Hutton's replacement as opener, George Emmett, who fended a short ball to Sid Barnes at short leg to leave England at 2/28. Emmett had been surprised by Lindwall's bouncer and took his eyes from the ball, fending with one hand on the bat, while ducking his head below his arms. The ball bounced slowly off the pitch and after hitting Emmett's bat, rebounded gently up in the air for Barnes to collect. In Australia's match against Gloucestershire immediately preceding the Test, Lindwall bowled a bouncer to Emmett, who hesitantly parried it away for a single. Lindwall did not bounce Emmett again during the match, and O'Reilly surmised that the paceman was quietly waiting until the Tests to expose his opponent's weakness against the short ball. O'Reilly concluded that Lindwall and Johnston "had again disposed of the English opening batsmen with the minimum amount of effort".
Lindwall then struck Edrich on the hand with a short ball, provoking angry heckling from spectators who compared him to Larwood. During this period, both Edrich and Compton found it difficult to position themselves quickly enough to play Lindwall. The Australian paceman then hit Compton on the arm and, soon after, felled him with a bouncer that the batsman top-edged into his face, forcing Compton to leave the field with a bloodied eyebrow at 2/33. Compton had heard the umpire's call of no-ball while the ball was travelling towards him and, knowing he was immune from dismissal, had decided to change his stroke: having initially positioned himself to deflect the ball into the leg side, he attempted to hook it, but could not readjust quickly enough. The velocity of the ball was such that after rebounding from his head, it flew more than half-way to the boundary before landing. This was followed by a period of slow play as England tried to regroup.
Lindwall returned to take the second new ball and trapped Jack Crapp, who did not offer a shot, for 37 to leave the score at 3/87. The batsman misjudged the line of a straight ball and thought it had pitched and struck his leg outside off stump. Lindwall later had Edrich gloving a rearing ball to the wicket-keeper to leave England at 5/119. Compton returned upon the fall of the fifth wicket to revive the innings and Lindwall ended with 4/99 after having Godfrey Evans caught behind from an expansive cut shot. England ended on 363 all out on the second day; Compton made an unbeaten 145. Lindwall had beaten Compton in each of his last three overs before lunch on the second day, but the Englishman survived to add more than 20 further runs. On the third day, Lindwall came to the crease at 6/172 after Barnes—who had collapsed due to the aftereffects of being hit in the ribs from point blank range when fielding—had been forced to retire hurt. Australia faced the prospect of being forced to follow on, and Lindwall received five consecutive bouncers from Edrich, one of which hit him in the hand, evoking cheers from the home crowd. Lindwall made 23 as Australia struggled to 221 and avoided the follow on by eight runs; he was the last man to be dismissed.
At the start of England's second innings, Washbrook took a single from Lindwall, who promptly removed Emmett for a duck. Lindwall pitched an outswinger on the line of off stump and Emmett edged it to wicket-keeper Tallon, who dived and took it in his right hand. This brought Lindwall's tormentor Edrich to the crease. Bradman advised Lindwall not to bounce Edrich, fearing this would be interpreted as retaliation and generate negative media attention. However, Miller retaliated with four consecutive bouncers, angering the crowd. He struck Edrich on the body before Bradman ordered him to stop; the Australian captain apologised to Edrich for the hostile bowling. Lindwall bounced Washbrook and was no-balled by umpire Dai Davies for dragging his foot beyond the line. After a disagreement, Davies threw Lindwall his jumper, but the tension faded away and the paceman was not no-balled again after discussing the matter with Bradman. Lindwall bounced Washbrook again and this time the England opener went for the hook shot. The ball flew off the top edge in the air, straight towards Hassett at fine leg, who dropped the catch after juggling it three times. Having received a reprieve on 21, Washbrook settled down and reached 50 in only 70 minutes with England at 1/80.
Lindwall returned for a new spell late on the third day and almost hit Washbrook in the head. Hassett again dropped Washbrook, who was on 78 when he again hooked Lindwall to long leg. The Australian vice-captain responded by borrowing a helmet from a nearby policeman to signify his need for protection from the ball, much to the amusement of the crowd. Lindwall ended with 1/37 as England declared at their stumps score of 3/174 after the entire fourth day and the final morning were washed out. The match petered out into a draw with Lindwall not required as Australia safely batted out the final day. Lindwall played in Australia's only match—against Middlesex—before the Fourth Test, but was not at his best. He and Miller had been partying heavily in the preceding days and were out drinking heavily on the eve of the match; they did not return until after dawn and were found severely inebriated by Bradman at breakfast. Miller was rested, but Lindwall was selected. The home team won the toss and elected to bat, so Lindwall could not rest and sober up in the dressing room while the specialist batsmen were at work. He was asked to bowl a lengthy spell in warm and sunny conditions on the opening morning, and was at times lying on the ground in an attempt to recover during stoppages. Nevertheless, he ended with 1/28 from 16 overs, the opposition still unable to score heavily despite his obvious lethargy. He took a total of 3/59 from 25 overs and scored one as the tourists won by an innings, removing the home team's captain George Mann twice and Edrich once. In the second innings, Lindwall took 2/31, but effectively had a third wicket. He bowled a bouncer at opener Jack Robertson, who tried to hook, but missed, suffered a fractured jaw and was forced to retire hurt, prompting angry shouting and booing from sections of the crowd. Robertson defended Lindwall, contending that the delivery was fair and that he had executed his shot incorrectly.
## Fourth Test
Hutton returned for the Fourth Test at Headingley and played effectively. At one stage he and Washbrook took five boundaries from six Lindwall overs, and the pair put on 168 for the first wicket before Lindwall bowled Hutton for 81. The English opener went onto the front foot and was clean bowled, much to the dismay of the home crowd. The stand came after Washbrook had decided to refrain from hooking Lindwall's bouncers, which had caused him problems in the earlier Tests. England did not lose their second wicket until the last over of the day, when Washbrook hit Johnston into Lindwall's hands for 143, leaving the total at 2/268. Both Fingleton and O'Reilly criticised the bowling group as a whole for what they deemed a very lethargic display; the former called it the worst day's performance since World War II and the latter accused all the Australian bowlers of operating "without object".
During the innings, Lindwall appealed for lbw four times while wicket-keeper Ron Saggers—standing in for the injured Tallon—remained silent, not supporting the appeal. England ran up a large score of 496 but squandered a very strong position after losing their last 8 wickets for 73 runs; Lindwall had Compton caught down the leg side to give Saggers his first Test catch, leaving England at 6/473. Lindwall ended with 2/79 from 38 overs. Fingleton said "this grand fast bowler held the side together splendidly and answered every call". O'Reilly said that until England collapsed—mostly due to unforced errors despite favourable conditions—only Lindwall appeared capable of threatening the batsmen. He said the paceman "kept slogging away, tirelessly retaining his pace and enthusiasm long after the other members of the attack had lost all signs of hostility ... Bradman could not afford to spare him from doing much more than his share of the galley-slave work." O'Reilly decried Lindwall's workload as excessive and potentially harmful to his longevity.
In reply, Australia were still some way behind when Lindwall came in at 6/329 on the third afternoon. With the fall of Sam Loxton and Saggers in quick succession, Bradman's men were at 8/355 with only Johnston and Toshack remaining. Lindwall hit out, scoring 77 in an innings marked by powerful driving and pulling; he dominated in stands of 48 and 55 with Johnston and Toshack respectively. He particularly liked to use his feet to get to the ball on the half-volley to hit lofted drives. Of the 103 runs added for the last two wickets, the two tail-enders managed only 25 between them. Johnston accompanied Lindwall for 80 minutes, before the injured Toshack survived the last 50 minutes until stumps with Johnston running for him. Australia were 9/457 at stumps, with Lindwall on 76 and Toshack on 12. During Lindwall's partnership with Johnston, Yardley bowled himself for over an hour, failing to bring on a frontline bowler in his stead despite being unable to dislodge the batsmen. Lindwall farmed the strike by trying to hit boundaries and twos during the over, but Yardley did not resort to the tactic of setting a deep field to yield a single to Lindwall to get the tail-enders on strike. Despite Toshack's unfamiliarity with batting with a runner and Johnston's with acting as one, and the resulting confusion in running between the wickets, Lindwall was able to manipulate the strike and face most of the balls. O'Reilly speculated that Yardley may have bowled himself in an attempt to contain the Australians rather than dismiss them before the close of play, so his openers would not have to bat for a short period before stumps when the visitors' attack could have made inroads. However, Yardley was able neither to contain Lindwall nor to dismiss the Australians. Sunday was a rest day, and on Monday, the fourth morning, Lindwall was the last man out in the third over of the day, leaving Australia on 458, 38 runs in arrears on the first innings. Lindwall edged Bedser into the slips cordon and Crapp took the catch low down in his left hand.
England made a strong start in their second innings—the openers registered their second century stand for the match. When Australia took the second new ball, Lindwall—worried by the substantial and hazardous craters in the pitch he and the other bowlers had created while following through on the left-hand side of the crease—changed to bowling from around the wicket and was warned for running on the pitch. He reverted to bowling from over the wicket, although he delivered from the edge of the crease to avoid the holes. O'Reilly said the warning to Lindwall played into Australia's hands, as the bowler's follow-through from around the wicket had been accentuating a rough patch outside the right-hander's off stump that the English bowlers could have exploited when Bradman's men chased their target. England reached 2/232 before Lindwall trapped Edrich to end a 103-run partnership. The paceman followed this by bowling Jack Crapp, who inside edged an attempted forcing stroke through the off side from the back foot onto his own stumps. This triggered a collapse of 4/33 from 3/260 to 7/293. Lindwall took 2/84 as the hosts declared on the final day at 8/365, leaving Australia to chase a world Test record of 404 for victory, with only 345 minutes available. Centuries from Bradman and Morris in a 301-run stand saw Australia seal the series 3–0 with a record-breaking seven-wicket win, with 15 minutes to spare.
The paceman was rested for the match against Derbyshire immediately after the Headingley Test, which Australia won by an innings. Lindwall returned for the match against Glamorgan and took 2/36 in a rain-affected draw that did not reach the second innings. In the next match against Warwickshire, he claimed 3/27 in the first innings, taking three consecutive middle-order wickets—including Test batsmen Tom Dollery and Abdul Hafeez Kardar—in the space of 12 balls as the hosts fell for 138. In reply, Australia stumbled to 6/161 when Lindwall joined Hassett. The pair put on 70 for the seventh wicket, the largest partnership in a low-scoring match. Lindwall ended with 45, the second highest score for the entire match, as Australia took a 116-run lead. He took the first wicket and ended with 1/32 as Bradman's men won by nine wickets. Australia proceeded to face Lancashire at Old Trafford for the second time during the season in a match that doubled as Washbrook's benefit. Lindwall made 17 in the tourists' first innings of 321 and then dismissed the home side's first three batsmen, taking 3/32 as Lancashire fell for 130. Washbrook top-scored with 38 before Lindwall had him caught in the slips by Miller; the England opener also collected several painful bruises from Lindwall on his right hand and thumb. Bradman described his leading paceman as being in "stupendous form ... I have not seen before or since such sustained brilliance from a pace bowler". Australia made 3/265 declared in their second innings, leaving the hosts with a target of 457 in less than a day, with Washbrook unable to bat due to Lindwall's bruising bowling. Lindwall bowled both openers with the new ball, but Lancashire appeared to be safely batting out a draw at 5/191 with only eight minutes remaining. Lindwall returned after Bradman took the new ball, and told the slip cordon to move halfway back to the boundary. Bowling with a tailwind, Lindwall was at full pace, in one of the fastest displays Bradman had seen in his long career. He bowled Jack Ikin for 99 and Dick Pollard—who later claimed not to have seen the ball—for a golden duck. The hosts were in danger of suffering a late collapse and defeat, but William Roberts successfully defended the hat-trick ball; Lancashire lost no further wickets and were 7/191 when stumps were drawn; Lindwall ended with 4/27. He also caused Washbrook to miss the final Test with a thumb injury. The paceman was rested from the non-first-class match against Durham, which was a rain-affected draw.
## Fifth Test
According to Bradman, Fingleton and O'Reilly, Lindwall's performance in the final Test at The Oval was one of the best they had ever seen from any player. English skipper Yardley won the toss and elected to bat on a rain-affected pitch, surprising most observers. The damp conditions necessitated the addition of large amounts of sawdust to allow the players to keep their grip. Along with the rain, humidity assisted the bowlers, particularly Lindwall, who managed to make the ball bounce at variable heights.
After Miller had taken an early wicket, Lindwall bounced Compton, resulting in an edge towards the slips cordon. However, the ball continued to rise and cleared the ring of Australian fielders. Hutton called Compton through for a run, but his surprised partner was watching the ball and dropped his bat in panic. Luckily for Compton, the ball went to Hassett at third man, who stopped it and waited for the batsman to regather his bat and his composure before returning it, thereby forfeiting the opportunity to effect a run out. This gesture of sportsmanship cost Australia little, however, because when Compton was on three, Lindwall bowled another bouncer. Compton attempted a hook shot and Arthur Morris ran from his position at short square leg to take a difficult catch, leaving the hosts at 3/17. Fingleton described Morris's feat as "one of the catches of the season".
After the lunch break, England struggled to 4/35, before Lindwall bowled Yardley with a swinging yorker. The debutant Allan Watkins then batted for 16 balls in an attempt to get off the mark, missing a series of hook shots, and was hit in the shoulder by another Lindwall bouncer. Soon after, Watkins was dismissed without scoring after playing across the line and being trapped lbw by Johnston, leaving England at 6/42; the blow from Lindwall had left a bruise on his right shoulder that inhibited his bowling later in the match. Following the departure of Watkins, Lindwall removed Godfrey Evans, Alec Bedser and Young, all yorked in the space of two runs. England fell from 6/45 to 9/47, bringing Hollies in at No. 11 to accompany Hutton, who had batted through the innings. Hutton then hit the only boundary of the innings, lofting a straight drive back over Lindwall's head. The ball almost went for six, landing just short of the fence. The home team's innings ended on 52 when Hutton—who never appeared to be troubled by the bowling—leg glanced Lindwall and was caught by wicket-keeper Don Tallon one-handed, at full stretch to his left. Lindwall described the catch as one of the best he had ever seen. In his post-lunch spell of 8.1 overs, Lindwall took 5/8; he totalled 6/20 in 16.1 overs. Bradman described the spell as "the most devastating and one of the fastest I ever saw in Test cricket". Fingleton, who played against the Bodyline attack in 1932–33, said "I was watching a man almost the equal of Larwood [the Bodyline spearhead] in pace ... Truly a great bowler". O'Reilly said that Lindwall's "magnificent performance must go down as one of the greatest bowling efforts in Anglo-Australian Tests. He had two gruelingly long sessions in the innings and overcame each so well that he set the seal on his well-earned reputation as one of the best bowlers ever."
In Australia's reply, Lindwall came in at 6/304 and attacked immediately, scoring two fours before falling for nine. He played a cover drive off the bowling of Young, but hit the ball too early and launched it into the air, where it was caught by Edrich at cover point to leave the score at 7/332. The visitors ended on 389 and England started their second innings late on the second day. Debutant John Dewes took strike and got off the mark when he aimed a hook shot at a steepling Lindwall bouncer that rose over his bat and narrowly missed his head; the ball came off his shoulder and he was credited with a boundary. Lindwall made the early breakthrough soon after, bowling Dewes—who offered no shot—for 10 to leave England at 1/20. Dewes often committed to the front foot before the ball was delivered, thereby putting himself into difficulty.
Early on the third day, Lindwall bowled Edrich—who was playing across the line—for 28, hitting the off stump with a ball that cut inwards to leave England's score at 64, before Compton and Hutton consolidated the innings and took the total to 2/121 at lunch. Soon after, with his score on 39, Compton aimed a hard cut shot that flew into Lindwall's left hand at second slip for a "freak slip catch" off Johnston's bowling, leaving England at 3/125. Lindwall returned for another spell after lunch and bowled Evans for eight; Evans appeared not to detect Lindwall's yorker in the fading light, and the umpires called off play due to poor visibility after an appeal by Yardley. The next morning, England were bowled out for 188, giving Australia an innings victory and the series 4–0. Lindwall took 3/50 to total 9/70 for the match. He ended the Tests as the leading wicket-taker with 27 wickets at 19.62, and scored 191 runs at 31.83.
## Later tour matches
Seven matches remained in Bradman's quest to go through a tour of England without defeat. Australia batted first against Kent and Lindwall made only one in a total of 361. The paceman took two wickets with the new ball to help reduce the hosts to 5/16, before they were all out for 51; Lindwall ended with 2/16. Forced to follow on, Kent were reduced to 4/37 by three early Lindwall wickets; the victims included Tony Pawson and former Test wicket-keeper Les Ames. The hosts were further hampered by the absence of opener Leslie Todd, who had been hit on the foot by a Lindwall swinging yorker in the first innings. The blow caused so much bruising that Todd's foot could not fit inside his cricket boots. Lindwall ended with 4/25 as Kent fell for 124 to lose by an innings. In his match total of 6/41 from 15 overs, Lindwall bowled four of his victims. The Kent fixture was followed by a match against the Gentlemen of England. Lindwall was not required to bat, and after he took 1/39 in the first innings, Bradman allowed him to rest as Australia enforced the follow on and completed an innings victory. Bradman rested Lindwall for the match against Somerset, which resulted in another innings victory for the tourists. The paceman returned against the South of England—a representative team—scoring an unbeaten 17 in an unbroken 61-run stand with Sam Loxton before Australia declared at 7/522. He then took 1/45 as rain ended the match at the conclusion of the hosts' second innings.
Australia's biggest challenge in the post-Test tour matches was against Leveson Gower's XI. During the last Australian tour in 1938, this team was effectively a full-strength England outfit, but this time Bradman insisted that only six current Test players be allowed to represent the hosts. After the hosts had complied, the Australian skipper fielded a full-strength team. In a rain-interrupted match, Lindwall bowled Hutton for a duck with the new ball. He returned to take five of the last six wickets as the hosts lost their final six wickets for 57 runs. His last five victims were former England captain Walter Robins and Test players Freddie Brown, Evans, Bedser and Laker. Lindwall ended with 6/59; four of his wickets were bowled, while the other two were caught by Ian Johnson in the slips. He made five as Australia declared at 8/489; time ran out in the rain-affected match with Leveson Gower's XI at 0/75.
The tour ended with two non-first-class matches against Scotland. Lindwall was rested from the first, which Australia won by an innings. In the second fixture, Lindwall signed off in a low key manner, scoring 15 and taking a total of 0/28 from 14 overs as Australia ended their campaign with another innings triumph.
## Role
When fit, Lindwall opened the bowling with Miller in the Tests, and the pair operated in short and fiery bursts with the new ball. English cricket administrators had agreed to make a new ball available every 55 overs in the Tests; at the time, the norm was to allow a new ball for every 200 runs scored, something which usually took longer than 55 overs. The new regulation played directly into the hands of the Australians, as a new ball is ideal for fast bowling and the tourists had a vastly superior pace attack. Bradman thus wanted to preserve his two first-choice pacemen for a vigorous attack on the English batsmen every 55 overs. As a result, Lindwall bowled 224 overs, while Australia's third fast bowler Bill Johnston bowled 306. Lindwall led the Test bowling averages with 27 wickets at 19.62, making him the equal leading wicket-taker along with Johnston, who averaged 23.33. Their hauls of 27 Test wickets apiece equalled the record for an Australian fast bowler during a tour of England. The Australian pair were substantially ahead of the next most successful bowler, England's Alec Bedser, who took 18 wickets at 38.22. Lindwall's role as the leading strike bowler was borne out by his economy rate and strike rate in both the Tests and all first-class matches: he was the least economical of the three pacemen, but took his wickets more frequently than any other frontline bowler.
In all first-class matches, he took 86 wickets at 15.68 and held onto 14 catches, fielding in the slips. There were many consecutive matches during the tour with no rest day in between, so Bradman ensured Miller and Lindwall remained fresh for the new ball bursts in the Tests by giving them a lighter workload during the tour matches. During all first-class matches, Johnston bowled 851.1 overs, Johnson 668, Lindwall 573.4, Toshack 502, while Miller bowled only 429.4 overs. Outside the Tests, Lindwall bowled 349.3 overs, only the fifth heaviest workload among the Australians in those matches.
The local batsmen were unable to cope with Lindwall's high pace and swing; 43 of his wickets came after his opponent had missed the ball and been bowled. Lindwall scored 411 runs at 24.17 with two fifties in the first-class matches, including 191 runs at 31.83 in the Tests. He had limited batting opportunities, usually batting at No. 7, 8 or 9; it was hard for him to move higher up the order because Australia's other frontline bowlers, such as Colin McCool, Ian Johnson and Doug Ring, were of similar batting ability, each having scored centuries and more than 20 fifties during their first-class careers. As Australia often won by an innings, and frequently declared in the first innings because of their batting strength, Lindwall had only 20 innings in his 22 matches. However, he was often effective when he did get an opportunity.
Wisden recognised Lindwall by naming him as one of its five Cricketers of the Year in 1949. The publication cited the paceman's ability to seize the initiative for Australia in all but one of the Tests by taking early wickets. The fast bowler's success was attributed to a "superb control of length and direction, his change of pace and general skill, the like of which in a slower bowler could be classed as cunning". The ferocity of Lindwall's bouncer often prompted opponents to retreat onto the back foot before he had even released the ball. Wisden said that "by whatever standard he is judged ... [Lindwall] must be placed permanently in the gallery of great fast bowlers".
# Kala (album)
Kala is the second studio album by British recording artist M.I.A. It was released on 8 August 2007 through XL Recordings. M.I.A. named the album after her mother and has stated that her mother's struggles in life are a major theme of the album. It was mainly written and produced by M.I.A. and Switch, and features contributions from Timbaland, Diplo, Afrikan Boy and The Wilcannia Mob.
Initially planning to work with American producer Timbaland for the bulk of the album, M.I.A. was unable to gain a long-term work visa to enter the US. Therefore, she recorded Kala in numerous countries around the world, including India, Angola, Trinidad, Liberia, Jamaica and Australia. M.I.A. and Switch relied heavily on the digital audio workstation Logic Pro and recorded additional vocals and background sounds outside the studio environment. Kala incorporates prominent influences from South Asian music, featuring samples of Bollywood and Tamil cinema. The album draws on various styles, from funk carioca to African folk. The songs are about political themes related to the Third World, including illegal immigration, poverty and capitalism.
Kala was the best-performing album on the US Billboard Electronic Albums chart of 2007, and was certified gold by the RIAA for selling 500,000 copies in the US. It was certified gold in Canada and silver in the UK. It spawned the singles "Bird Flu", "Boyz", "Jimmy" and "Paper Planes", the latter of which received a Grammy nomination for Record of the Year at the 2009 Grammy Awards.
The album received widespread critical acclaim and was ranked as one of the best albums of 2007 by many publications. Since its initial release, it has been included in several greatest albums lists. Renowned music critic Robert Christgau remarked that the album is his favourite of the 21st century.
## Composition and recording
M.I.A. (Mathangi "Maya" Arulpragasam) had released her debut album Arular in 2005, which achieved critical acclaim and sold 130,000 copies. Plans for a second album were first revealed when she spoke later that year of her intention to work with American producer Timbaland. At one point it was anticipated that he would produce the bulk of the album. However, she was unable to gain a long-term work visa to enter the US, reportedly due to her family's connections with guerrillas in Sri Lanka. This led to conflicts between the two artists' schedules and meant that Timbaland's involvement was restricted to a poorly received guest verse on the track "Come Around". M.I.A. instead opted to record the album at a variety of locations around the world, beginning by travelling to India following the last date of her Arular Tour in Japan in February 2006.
She initially travelled to India to meet A. R. Rahman, but found it hard to communicate her ideas to him and the planned musical collaboration did not take place. Rahman did, however, provide M.I.A. with a number of contacts and allow her to use his studio, where 22 members of drumming group The Tapes were recorded for Kala. Producer Switch, who had initially travelled to India purely to engineer the planned sessions, ultimately became involved in the composition of several tracks for the album. A visit to Angola to work with DJ Znobia was cancelled after Znobia was involved in a car accident, but M.I.A. was able to record in Trinidad, Liberia, Jamaica and Australia. She and Switch relied heavily on Logic Pro, a digital audio workstation produced by Apple, and were able to capture vocals and background sounds outside the traditional studio environment, using a microphone and a MacBook Pro. The album features guest vocals from Afrikan Boy, The Wilcannia Mob, and Timbaland, and further collaborations with Switch, Blaqstarr, Morganics and Diplo. She likened the process of recording the album to "making a big old marble cake with lots of different countries and influences. Then you slice it up and call each slice a song".
## Music and lyrics
Kala is named after M.I.A.'s mother, in contrast to her previous album, Arular, which was named after her father. She contends that Arular was a "masculine" album, but that Kala "is about my mum and her struggle–how do you work, feed your children, nurture them and give them the power of information?" She further summed the album up as "shapes, colours, Africa, street, power, bitch, nu world, and brave." Musically, the album is more aggressive and sample-heavy than M.I.A.'s debut and features a range of styles, including dance music. It makes extensive use of South Asian music, such as that of the urumee, a drum used in gaana music native to Tamil Nadu, India, and features samples of Bollywood and Tamil cinema vocals. Like Arular, the album draws heavy influence from funk carioca, while M.I.A. features African chanting and the indigenous Australian instrument the didgeridoo for the first time. The album's themes relate to the Third World, including illegal immigration, poverty, capitalism, violence, and the availability of guns in Africa, while also touching on facets of urban British culture such as rave culture.
According to Dominic Horner of The Independent, the album's music may not be appropriately described exclusively as either dance or world music, but it is a mixture of the two. Music critic Robert Christgau characterised its music as an "eclectic world-underclass dance amalgam", and Jonathan Brown of the Irish Independent cited Kala as a proper introduction to "world fusion", a genre that "blends sounds from across the globe which wouldn't – and some say shouldn't – be put together." By contrast, NPR's Oliver Wang argued that, rather than a "so-called world music fusion" attempt, Kala is "agitated, propulsive pop" informed by international sounds. Music journalist Jody Rosen called it "an agitprop dance record" that reappropriates hip hop in an international setting with beatbox riddims, "playground" rhymes, unconventional samples, and gunshot sounds.
The tracks "Boyz" and "Bird Flu" use urumee drums, a signature instrument of Gaana, a Tamil genre of music, with which M.I.A. was familiar from her time spent living in Sri Lanka. She later worked on these tracks in Trinidad, where she absorbed influences from the country's love of soca music. The lyrics of "Boyz" deal with the artist's time in Jamaica, and reference Jamaican dance moves. The song "Hussel" began as an image in M.I.A.'s head of refugees being smuggled in boats, which she expressed musically by imagining how "if they banged that beat on the side of a boat, what would it sound like? That's why it's all echo-y and submarine-y". The sounds on the intro were recorded from Keralan [sic] fishermen chanting as they pull their fishing boats into the water. "World Town" used instrumentation from the temple music she recalled waking up to as a child in Sri Lanka. After playing the track to children in Liberia, she expressed a desire to record a video for the song there. M.I.A.'s "flat, unaffected vocals and delivery of lyrics" on some songs drew comparisons to British post-punk bands such as Delta 5 and The Slits. She says it "was just what was happening to me naturally ... I wanted it to be difficult and raw and not get into it so much".
M.I.A. described opening track "Bamboo Banga" as having a "bamboo-stick beat, house-y feel". Afrikan Boy, an exponent of grime, a UK-based genre of urban music, provided vocals on the song "Hussel". M.I.A. opted to work with him because she felt he seemed comfortable with his identity as a "real immigrant" and because his background was different to that of most MCs in the genre. She had originally planned to include "Mango Pickle Down River"—her remix of The Wilcannia Mob's song "Down River"—on a mixtape, but chose to include it on the album because she felt it was rare to hear the "aboriginal voice" in recorded music. "Jimmy" is a tribute to her mother and is M.I.A.'s version of an old Bollywood film track to which she danced at parties as a child. Despite the involvement of Baltimore club musician Blaqstarr, "The Turn" is the album's only ballad, and the track has been described as the least like club music. "20 Dollar" was written about the relative ease of buying AK-47s in war-torn Liberia. "XR2" recalls part of M.I.A.'s life growing up with rave music in early 1990s London, while "Paper Planes" jokingly plays on her problems with visas and certain perceptions of immigrants.
## Release and artwork
In April 2007 Rolling Stone reported that Kala would be released on 26 June of that year. After being delayed for unknown reasons, the album was eventually released by XL Recordings on 8 August 2007 in Japan and on 20 August in the UK, and by Interscope Records on 21 August in the United States. The Japanese edition featured three extra tracks not included on the versions released in other countries, with "Far Far" re-released shortly afterwards on the How Many Votes Fix Mix EP. Following the unexpected commercial success of "Paper Planes", Kala was re-issued in the United Kingdom in October 2008. A 4 November 2008 US re-release was announced, but as of late 2009 the album had not been re-issued in the United States.
The album's packaging includes photographs taken by M.I.A. and others in Liberia and Jamaica. The cover artwork to Kala, designed by Steve Loveridge, features neon fractal patterns and repeated slogans, including "Fight On! Fight On! Fight On!", which surrounds her image on the front cover. The cover was considered garish, prompting The Village Voice to comment "Maybe one day [she'll] make an album cover that it doesn't hurt to look at". Additional graphics for the album were provided by English fashion designer Carri Mundane (also known as Cassette Playa) and Loveridge. The album's artwork was inspired by African art, "from dictator fashion to old stickers on the back of cars," which M.I.A. hoped, like her "Okley Run" clothing range that extended the artwork, would capture "a 3-D sense, the shapes, the prints, the sound, film, technology, politics, economics" of a certain time.
## Promotion
M.I.A. began her promotion of the new album with a live appearance at Radio 1's Big Weekend in Preston in May 2007, where she performed six songs from Kala. In July she began the full KALA Tour with dates in the United States before going on to play a number of festivals in Europe and America. After dates in Asia, she returned to America for a series of shows in October and November, before ending the year with concerts in the UK. The tour continued during the first half of 2008 under the banner of the People Vs. Money Tour with further dates in North America, although the planned European leg of the tour was eventually cancelled.
The first track from the album to reach the public was "Bird Flu", which was released digitally as a promotional single on 13 November 2006. The first official single to be lifted from the album, "Boyz", was released on 11 June 2007, and the second single, "Jimmy", followed on 1 October 2007. The Paper Planes – Homeland Security Remixes EP, featuring various mixes of "Paper Planes", was released digitally on 11 February 2008 and physically three weeks later. A new physical single version was released in the UK on 13 October 2008. Also in October 2008, the How Many Votes Fix Mix EP was released, containing a remix of "Boyz" with Jay-Z and the tracks "Shells" and "Far Far".
## Critical reception
Kala was met with widespread critical acclaim. At Metacritic, which assigns a normalised rating out of 100 to reviews from mainstream publications, the album received an average score of 87, based on 37 reviews.
Reviewing for the Los Angeles Times, Ann Powers wrote that Kala succeeded at embodying disenfranchised characters in the dissonant Third World with "truly multi-vocal" music whose "every sound signified something different, driving the music's meaning into some new corner". Andy Battaglia from The A.V. Club said the music ventured far enough to serve as both a party album and a progressive aural assault, while AllMusic's Andy Kellman felt that Kala was better for intensifying Arular's qualities and "mixing cultures with respectful irreverence". Barry Walters of Spin credited M.I.A. with evoking both the social demands and percussive sounds of the Third World, while finding the album relevant at a time when "more Americans than ever feel like outsiders in their own country". Writing for MSN Music, Robert Christgau said the lyrics were "cannier politically" than Arular, and the music was more decisive in embodying the imagination and recreation of "an unbowed international underclass that proves how smart it is just by stating its business, which includes taking your money". He later said that he had "pressed" the editors of Rolling Stone to let him give Kala four-and-a-half stars in his review for the magazine, wishing he "had the foresight to fight for five" because the album "kept growing on me till I even dug the Timbaland remnant ['Come Around']".
Jonathan Keefe from Slant Magazine was somewhat less impressed, deeming Kala less successful than Arular, which he said had more immediate hooks and clever pop sensibility. Writing for Pitchfork, Mark Pytlik described Kala as "clattering, buzzy, and sonically audacious", but said that, because most of M.I.A.'s lyrics gave the impression they were written "in the service of the rhythms", her allusions sounded more "rewarding" than what she literally had written. Andy Gill of The Independent found her lyrics unclear in their message about money and social concerns, while remarking that the gun references on "World Town" and "Paper Planes" blemished "an otherwise fine album". In The Guardian, Alexis Petridis wrote that Kala was still a "unique" listen in spite of occasionally tuneless songs. Writing for NME, Alex Miller acknowledged that the record's music polarised opinion, claiming that some members of the magazine's staff had "fed several copies [of the album] into the shredder claiming aural abuse", although others went on to praise the album for its innovation and referred to it as M.I.A.'s masterpiece.
At the end of 2007, Kala was named one of the year's best albums in critics' lists, including rankings at number eight (The Wire and Stylus Magazine), number seven (NME), number six (Paste, The A.V. Club and Entertainment Weekly), number four (The Guardian and Drowned in Sound), and number three (Pitchfork). The album was also listed at number three on The Village Voice's 35th annual Pazz & Jop poll. Blender and Rolling Stone both named Kala as their number one album of 2007. "Boyz" was number nine and "Shells" number sixty-seven on the same magazine's list of the 100 Best Songs of 2007 and 2008 respectively. The album was nominated for the 2007 Shortlist Music Prize.
## Commercial performance
Kala debuted and peaked at number 18 on the US Billboard 200, selling 29,000 copies in its first week. It also topped the Top Electronic Albums chart, and was ranked first on the Billboard Year-End Top Electronic Albums of 2007. The album was certified gold by the Recording Industry Association of America (RIAA) on 5 March 2010, and by September 2013, it had sold 559,000 copies in the United States.
The album debuted and peaked at number 39 on the UK Albums Chart. The album was certified silver by the British Phonographic Industry (BPI) on 30 January 2009, denoting sales in excess of 60,000 copies within the United Kingdom. In Canada, Kala was certified gold by Music Canada on 27 August 2018. The album also reached the top 40 in a number of other countries, including Belgium, Finland, Ireland, Japan, Norway and Sweden.
## Legacy
Kala appears on professional rankings of the greatest albums. In 2009, NME placed the album at number seventy-two in its list of the 100 greatest records of the decade, and Rolling Stone ranked it as the ninth best album of the same period. Christgau named it the decade's best album in his ballot for the magazine. In 2012, Rolling Stone ranked it at number 393 in a revised edition of its 500 Greatest Albums of All Time list. In 2013, NME ranked it number 184 in its list of the 500 Greatest Albums of All Time. In 2015, the album was ranked number 42 by Spin in its list of "The 300 Best Albums of the Past 30 Years (1985–2014)". The album was also included in the book 1001 Albums You Must Hear Before You Die. In 2019, the album was ranked 75th on The Guardian's 100 Best Albums of the 21st Century list. When asked in March 2020 whether Kala remains his favourite album of the 21st century, Christgau responded, "Yup. No contest."
Writing for Dazed Digital, Grant Rinder praised the album for transforming M.I.A. from a "cult hero" to an "international star". Rinder commented that the album was a "tremendous" step forward towards shedding light on the realities of Third World countries that the Western world may not have thoroughly understood. Reflecting on diversity and representation issues in society, as well as politics surrounding President Donald Trump, Rinder said that Kala "feels particularly ahead of its time", and concluded that "M.I.A. was truly a pioneer for a global humanitarian perspective that no artist has been able to deliver quite as well since." Frank Guan of Vulture said that M.I.A. "sounded like the future" and that "her immediate influence was remarkable", as the album "seems to herald certain trends current in contemporary hip-hop". Guan further praised M.I.A. as the "precursor" of "fashion-rap" acts, including Travis Scott, Lil Uzi Vert, Playboi Carti, and ASAP Rocky.
In a 2013 Rolling Stone article titled "How M.I.A. made Kala", Jody Rosen regarded the album as "a landmark, agitprop dance record that restyled hip-hop as one big international block party, mixing up beatbox riddims, playground rhymes, left-field samples and gunshots. It was also, against all odds, a hit, which spawned a huge single and transformed M.I.A. from a cult heroine to an A-lister." In 2017, following the 10th anniversary of its release, Spencer Kornhaber of The Atlantic also felt that Kala "feels newly relevant amid global political currents trending toward isolationism". Reflecting on the lack of mainstream music tackling global issues, Simran Hans of Noisey wrote that Kala "felt, and still feels, like both a party, and a fight", and that it is "hard to imagine a dance record as combative being released now". Gabriela Tully Claymore of Stereogum wrote that the album "promotes a global conscience not easily heard in a lot of popular music at the time [and] it was a dance album that confronted the hegemony of a market largely dominated by quote-unquote Western forms ... M.I.A. had her finger on a pulse that spanned nations, and she figured out a way to harness disparate influences into a singular style that could thrive in various markets."
## Track listing
### Sample credits
- "Bamboo Banga" incorporates elements of "Roadrunner" written by Jonathan Richman and "Kaattukkuyilu", written and performed by Ilaiyaraaja from the film Thalapathi.
- "Bird Flu" incorporates elements of "Thirvizha Na Vantha" written and performed by R. P. Patnaik from the film Jayam.
- "Jimmy" incorporates elements of "Jimmy Jimmy Jimmy Aaja" written by Bappi Lahiri from the film Disco Dancer.
- "Mango Pickle Down River" is remixed from the original recording "Down River" by the Wilcannia Mob.
- "20 Dollar" incorporates elements of "Where Is My Mind?" by the Pixies.
- "World Town" incorporates elements of "Hands Up, Thumbs Down" written by Blaqstarr.
- "Paper Planes" incorporates elements of "Straight to Hell" by the Clash.
## Personnel
Credits adapted from the liner notes of the expanded edition of Kala.
- M.I.A. – vocals (all tracks), production (tracks 1–6, 8, 9, bonus disc tracks 2, 3), additional vocal production (track 6), artwork, photography
- Afrikan Boy – vocals (track 5, bonus disc track 1)
- Jim Beanz – vocal production (track 12)
- Janette Beckman – photography
- Blaqstarr – production (track 9, bonus disc track 5), vocal production (bonus disc track 1)
- Demacio "Demo" Castellon – engineering, mixing, programming (track 12)
- Conductor – production (bonus disc track 6)
- Diplo – production (tracks 5, 10, 11, bonus disc tracks 1, 2, 4), additional vocal production (track 6)
- DJ Ability – cuts (track 6)
- Marty Green – assistant engineering (track 12)
- Liz Johnson-Artur – photography
- Michael Kamber – photography
- Lil' John – production (bonus disc track 6)
- Steve Loveridge – additional graphics
- Larry "Live" Lyons – assistant engineering (track 12)
- Morganics – production (track 6)
- Carri Mundane – additional graphics
- Riot – production (bonus disc track 6)
- Rye Rye – vocals (bonus disc track 1)
- Spike Stent – mixing (tracks 3, 4)
- Switch – production (tracks 1, 3–5, 7, 8, bonus disc track 3), additional production (tracks 2, 10, 11, bonus disc tracks 1, 4), mixing (track 11)
- Ron Taylor – vocal Pro Tools editing (track 12)
- Timbaland – all instruments, production, vocals (track 12)
- The Wilcannia Mob – vocals (track 6)
## Charts
### Weekly charts
### Year-end charts
## Certifications
## Release history
# Arthur Gould (rugby union)
Arthur Joseph "Monkey" Gould (10 October 1864 – 2 January 1919) was a Welsh international rugby union centre and fullback who was most associated as a club player with Newport Rugby Football Club. He won 27 caps for Wales, 18 as captain, and critics consider him the first superstar of Welsh rugby. A talented all-round player and champion sprinter, Gould could side-step and kick expertly with either foot. He never ceased practising to develop his fitness and skills, and on his death was described as "the most accomplished player of his generation".
Following the withdrawal of their regular fullback, Newport RFC first selected Gould in 1882, when he was 18. He was never dropped from the side thereafter and played regularly until he retired in 1898. Gould played for Newport during their "invincible" season of 1891–92, when they did not lose a match, and scored 37 tries in Newport's 24-game 1893–94 season, a club record that still stands. Gould frequently travelled due to his job as a public contractor, and consequently turned out for a number of other sides during his career, including the clubs Richmond and London Welsh, and the county side Middlesex.
Gould was first selected for Wales in 1885 when he played at fullback against England. He was awarded the captaincy in 1889, by which time he was playing at centre, and led Wales to their first Home Nations Championship and Triple Crown titles in 1893; that tournament's match against England established Gould as a great player and captain. By the time Gould retired he was the most capped Welsh centre, a record he held until 1980, with 25 caps in the position. He ended his international career against England on 9 January 1897. The game, played in front of 17,000 supporters at Rodney Parade, was Gould's 18th as Wales captain – a record eventually broken by Ieuan Evans in 1994.
Towards the end of his career, Gould was at the centre of a controversy known as the "Gould affair" that saw Wales withdraw from international rugby for a year. The controversy centred on the support of the Welsh Football Union (WFU) for a testimonial for Gould on his retirement. The English Rugby Football Union and International Rugby Football Board (IRFB) argued that the testimonial constituted professionalism – which they claimed breached the sport's by-laws. The WFU withdrew from the IRFB in protest, rejoining a year later under the IRFB-imposed condition that Gould would not represent Wales again. He worked as a brewery representative after retiring from rugby, and died of an internal haemorrhage in 1919 at the age of 54.
## Family and early years
Arthur Joseph Gould was born into a sporting family in Newport, Wales, on 10 October 1864 to Joseph and Elizabeth. His father, from Oxford, England, moved to Newport to find work, setting up his own brass foundry business. Joseph was also an ardent sportsman, playing for the local cricket team.
Gould's five brothers were all notable rugby players and athletes. His brother Bob was a forward who played 136 times for Newport Rugby Football Club, which he captained in the 1886–87 season. Bob was also capped 11 times for Wales between 1882 and 1887, and captained his country once, versus Scotland in 1887. A younger brother, Bert, was a centre who played three times for Wales – he appeared with Gould in the Welsh team that won the Triple Crown for the first time in 1893. His other brothers – Harry, Gus and Wyatt – all played rugby for Newport. Wyatt captained Newport in 1905–06, and Harry played for them in their inaugural season of 1875–76. For the first 29 seasons of its existence, Newport RFC always had at least one of the Gould brothers in the team. Wyatt played for the club until 1907; he also ran the 400 m hurdles for Great Britain in the 1908 Summer Olympics.
The young Gould often climbed trees, and thus acquired the childhood nickname "Monkey", which was soon contracted by most to "Monk". Like his brother Wyatt, he was a keen athlete and made £1,000 during his years as a rugby player by entering track and field meets. A county champion sprinter and hurdler, Gould finished third in the Amateur Athletic Association 120-yard hurdles in 1887 and 1893.
## Rugby career
### Club and county history
At the age of 14 Gould captained the Newport Junior team, and later played a few games for the Third XV. He was drafted into the First XV – the senior team – as a fullback at the age of 18. On 18 November 1882 Newport had a home fixture against Weston-super-Mare at Rodney Parade. The Newport groundsman, John Butcher, had been sent by the club to collect the regular fullback, who had not appeared for the game. Gould, who was returning from a youth match, saw Butcher outside the missing fullback's home and approached the groundsman to discover that the player was at a funeral. Butcher offered Gould the position instead and persuaded the club captain to play him. Gould ran in two of his team's three tries after disregarding the instructions of his captain, Charlie Newman, who kept shouting for Gould to "Kick, kick!" After this, he was not dropped by Newport until his retirement in 1898.
As rugby was then an exclusively amateur sport, Gould and his brother Bob travelled Britain working as public works contractors. During this time he entered open athletic meets and played for various English rugby teams including the Southampton Trojans, and from 1887 was a regular member of the London side Richmond. In 1885 Gould was invited to play for the newly formed Welsh exiles team London Welsh. London Scottish F.C. had been founded for Scottish players working or studying in the city, but until this time a London club for Welsh players had not existed. The side's first game, a trial match, was played on 21 October 1885 at Putney, and three days later the first team played London Scottish at the Saracens' Palmerston Road ground in Walthamstow. Gould played at half-back, and was joined in the team by Martyn Jordan, Thomas Judson, Rowley Thomas, Charles Taylor and T. Williams – all past or future Welsh internationals. During the 1885–86 season London Welsh were invited to form a combined "exiles" team with London Scottish, to face a London XV in a charity match at The Oval. Gould was one of six Welsh players selected to play in front of a crowd of 8,000 that included the Prince of Wales.
In the 1885–86 season he was moved up to play as a threequarter for Newport. As he was frequently travelling and playing in England between 1885 and 1890, Gould was not a regular member of the team. After playing just a handful of games across the three preceding seasons, he managed 15 games for the club in 1889–90, scoring ten tries and five dropped goals.
In June 1890 Gould left Britain to complete a works contract in the West Indies, but returned to Newport in time for the 1891–92 season. Newport were unbeaten throughout that campaign, which was later dubbed their "invincible" season. Gould captained Newport in 1893–94, when the team lost only three games, and in 1894–95, when the club lost only to Llanelli. In his first season as captain, Gould scored 37 tries in 24 games, a club record that still stands as of 2013. Although records before 1886 are incomplete, Newport RFC acknowledge Gould's scoring record at the club between the 1882–83 and 1898–99 seasons as 159 tries, 66 conversions, 61 dropped goals and a single penalty, over 231 appearances.
Gould also turned out for the Middlesex county side, and was a mainstay during their "invincible" season of 1887–88. He also played for them against the New Zealand Native team in 1888. The match, hosted by the Earl of Sheffield, was an invitation only event. The Middlesex side won 9–0, and scored three tries in the match, the second one resulting from a smart pass by Gould. In addition to Gould, the Middlesex backline of the time regularly fielded a number of English and Scottish internationals – this earned the side the sarcastic nickname "the Imperial team".
### International career
#### 1885–89
Gould was first capped for Wales against England in the opening game of the 1885 Home Nations Championship. He joined his brother Bob in the side, and played at fullback, his preferred position at the time. Played under the captaincy of Newport teammate Charlie Newman, this was Wales' eighth-ever international and fourth encounter with England. The Welsh lost by a goal and a try to a goal and four tries. Wing Martyn Jordan of London Welsh scored both Welsh tries, with one successfully converted into a goal. Some accounts award the conversion to Charles Taylor, though it is now generally credited to Gould. Gould was selected for the second game of the tournament, an away draw to Scotland, in which both teams played a pair of brothers; George and Richard Maitland for Scotland, and Arthur and Bob Gould for Wales.
By 1886 the four threequarter system had spread throughout Wales. First instituted by Cardiff RFC in 1884, the system was designed to allow Cardiff centres Frank Hancock and Tom Williams to play at the same time, and involved dropping the ninth forward to include a second centre. Newport were reluctant to adopt this style of play, mainly due to Gould's excellent kicking and covering abilities, which allowed the club to continue with the advantage of the extra forward.
For the 1885–86 season Gould switched from fullback at Newport to the centre position; this tactic was adopted by Wales and Gould replaced Cardiff's Hancock at centre for their first match of the 1886 Home Nations Championship against England. The match resulted in a Welsh loss. In Wales' next game, against Scotland, Wales became the first country to trial the four threequarter system. They did this by bringing back Hancock as captain and having him play at centre alongside Gould. The experiment was a tactical disaster – the eight Welsh forwards struggled against the nine Scottish forwards. Hancock duly regrouped the team at half-time and readopted the standard formation, bringing Harry Bowen from fullback into the pack, and pushing Gould from centre into Bowen's vacant position. Wales lost by two goals to nil, the system was deemed a failure and Hancock never represented Wales again. The whole affair had a negative effect on Gould, who initially disliked the strategy, stating that he was "prejudiced against the four three-quarters." Gould even went as far as persuading the Welsh selectors to revert to the old formation. The next time Wales trialled the system was in the 1888 encounter with the touring New Zealand Natives, a match in which Gould was unavailable to play.
In the next season, Wales completed their first full Home Nations Championship; Gould played as the lone centre in all three games. It was a fairly successful Championship for the Welsh, with a draw, a win and a loss, leaving them second in the table. Of note during the series was Bob Gould's captaincy in the second match, against Scotland, and Gould's first ever international dropped goal – which gave Wales a win over Ireland and made up for him missing a dropped goal by just a yard in their draw against England. Due to work commitments, Gould only played one of the two Wales games of the 1888 Home Nations Championship, in the country's first victory over Scotland, thanks to a single try from Thomas Pryce-Jenkins. In the second game, played away to Ireland, George Bowen was given the centre position in the last match Wales would play with a three threequarter system. Gould then missed the first Welsh international against an overseas touring side, when the New Zealand Natives were beaten at St. Helen's in Swansea, and was still absent two months later for the opening game of the 1889 Championship. Gould returned in time for the clash with Ireland, where he was given the captaincy and played alongside Llanelli centre Tom Morgan. His first match as captain ended in a home defeat, by two tries to nil – the first of the 18 caps he earned leading his country.
#### 1890–93
Gould appeared in Wales' three matches of the 1890 Home Nations Championship, where he partnered Dickie Garrett, a coal tipper who played for Penarth, at centre. For the first match, a 5–1 defeat by Scotland, Gould lost the team captaincy to Frank Hill, though he did score his first international try. The game is also notable for featuring the first appearance of Billy Bancroft, the Swansea all-round sportsman who would take over the captaincy from Gould on his retirement. Bancroft was fullback in Gould's next 18 international games. Gould regained the captaincy for the next game, an encounter with England at Crown Flatt in Dewsbury, and from that point held the captaincy whenever he represented Wales. The match was an historic one for Wales, the country's first win over England, with a single try from Buller Stadden giving Wales the victory. The campaign ended in a disappointing away draw with Ireland, which saw the introduction of Tom Graham, a Newport forward who would become Gould's club captain during the 1891–92 "invincible" season.
Gould missed the entire 1891 campaign as he and his brother Bob had travelled to the West Indies to conduct civil engineering work. Gould regained his international place and the captaincy on his return for the 1892 Home Nations Championship. The tournament was a failure for Wales; the team lost all three of their matches. There was little consistency for Gould at centre, with three different centre-pairings in each of the matches; Garrett against England, Conway Rees at home to Scotland and in the Irish encounter, Gould's younger brother Bert. The 1892 Championship was soured by the aftermath of the Wales–Scotland encounter, which was played in Swansea at St. Helen's. After Wales lost the game 7–2, members of the crowd, angered by Jack Hodgson's refereeing of the game, attacked him. The assailants by-passed the police and the referee had to be rescued by members of the Welsh team. In the struggle, Gould was struck on the chin, and it was reported that Hodgson only reached the Mackworth Hotel because Gould accompanied him on the coach.
The Welsh performance during the 1893 Home Nations Championship was in stark contrast to the previous year. Under the captaincy of Gould, Wales not only won the Championship for the first time, but also the Triple Crown. The first match of the campaign was against England, and played at the Cardiff Arms Park. The pitch had been kept from freezing over the night before by 500 braziers dispersed across the playing field. This led to a slippery ground, with play further hampered by a strong wind.
The English played the first half with the wind behind them and their nine-man scrum dominated the smaller Welsh pack. At half time Wales were 7–0 down following tries from Frederick Lohden and Howard Marshall and a conversion from England captain Andrew Stoddart. The second half started poorly for Wales when Marshall scored a second try following excellent English forward pressure. The game turned not long after: the English forwards could not maintain the pace they had set in the first half of the game, and began to slow. Then Welsh forward Charles Nicholl broke through a line-out with the ball, transferred it to Hannan, who passed to Gould at the halfway line. Gould evaded both Alderson and Lockwood before outpacing Edwin Field to score beneath the posts. Bancroft converted. A near-identical move resulted in Conway Rees then releasing Cardiff wing Norman Biggs, who scored with a run from the halfway line, though this time the conversion missed.
The Welsh backs repeatedly exposed the three threequarter system used by the English: once the Welsh backs broke through the pack there was little defensive cover to prevent runaway scores. With the score at 9–7 to England, Marshall extended the lead with his third try of the match, giving England an 11–7 lead with only ten minutes remaining. The game swung again when Percy Phillips received the ball quickly before passing to Gould, who broke through the English defence and scored, though again Bancroft missed the conversion. With further Welsh pressure, a penalty was awarded to Wales on the English 25-yard line, but at a wide angle. Accounts differ as to what happened: some say that Gould tried to place the ball for Bancroft but failed on the frozen ground; another states that Bancroft defied his captain to take the penalty as a drop kick; other accounts mention Bancroft and Gould arguing on the pitch before the attempt. Regardless, Bancroft kicked the penalty, the first to be scored in an international match. It was the final score of the game and Wales were victorious, 12–11.
At the final whistle the pitch was invaded by Welsh fans and Gould was carried shoulder-high back to the Angel Hotel, cheered all the way. It was a defining moment for the Welsh style of play. England adopted the four threequarter system the following year.
Gould continued to captain the Wales team through victory over Scotland, with tries coming from Bert Gould, Biggs and William McCutcheon; all the result of precision handling from the backs. This left the final encounter with Ireland, played at Stradey Park in Llanelli, as the deciding match for a Welsh Triple Crown. Despite an unconvincing Welsh display, an enthusiastic crowd of 20,000 watched their country win the game and with it the title, decided by a single try from Bert Gould.
#### 1894–97
The 1894 Championship began with a loss for the defending champions against England, during a game in which Welsh in-fighting affected the result. Before the game, Gould instructed his forwards to heel the ball from the scrums swiftly, so it would get to the backs quickly and allow them to run at the English. Frank Hill decided that this was the wrong option and put all his might into wheeling the scrums instead, which worked against the efforts of Jim Hannan, who was trying to follow his captain's wishes. In the next match Gould was partnered by Dai Fitzgerald in a win over Scotland, but was unavailable for the encounter with Ireland and was replaced by Jack Elliott from Cardiff RFC.
By 1895 the only backs remaining from the 1893 Championship-winning team were Bancroft and Gould. Gould was now partnered with Owen Badger, who kept his place for the whole campaign. As the other teams adopted the Welsh style of play, Wales lost their advantage; the livelier English forwards outplayed their Welsh counterparts to give England victory in the opening game of the 1895 Championship. This was followed by a close loss to Scotland at Raeburn Place and then a narrow win over Ireland at the Cardiff Arms Park.
1896 was Gould's last full international tournament. The Championship started badly for Wales with a heavy defeat by England, during which Wales were reduced to 14 men after Badger broke his collar-bone in the first 15 minutes. In the second game Wales beat Scotland 6–0, with a try each for Gould and Cliff Bowen. The final game of the Championship was an away loss to Ireland, in which Gould scored his last international points with a dropped goal. At the end of 1896 Gould decided to retire from rugby.
In 1897, Gould was enticed out of retirement for one last Championship. By now Gould was a household name throughout Britain, as much due to his personality and good looks as his brilliant centre play; a testimonial fund had been started with contributions being made by the public. This caused a stir among the other Home Unions, who viewed this as an effort to pay Gould for playing, which would constitute professionalism. As the arguments continued, Gould played his final international game, a solid 11–0 win over England in early January. Wales played no further matches that season after the events behind Gould's testimonial fund caused Wales to leave the International Rugby Football Board (IRFB), in a situation now referred to as the "Gould affair".
## Gould affair
By 1896 Gould had played more first-class matches, scored more tries and dropped more goals than any other player on record. This led the South Wales Argus journalist W. J. Townsend Collins to write in the paper: "... as Arthur Gould is as pre-eminent in football as W. G. Grace is in cricket, the footballing enthusiasts of Wales might recognise his services to the game ... by some national testimonial."
A Welsh shipbroker, W. J. Orders, organised a collection on the floor of the Cardiff Coal Exchange and opened a public testimonial with a subscription of one shilling. The national response was considerable and within weeks the total ran into hundreds of pounds. This drew the Welsh Football Union (WFU) into a confrontation with the IRFB, as rule 2 on professionalism stated that no player was allowed to receive money from his club, or any member of his club, for services rendered to football. The fund could have been seen as a professional fee to Gould, thereby making him ineligible to play for his country. The WFU argued that the money raised was not given by the club, but was rather an outpouring of thanks from the Welsh public to a national hero.
By April 1896 the Welsh Football Union had sanctioned a subscription of 1,000 shillings to be contributed to the Gould testimonial. The Rugby Football Union (RFU) complained and the IRFB reacted by informing the WFU that only a plate up to the value of a hundred pounds sterling could be given to Gould, and that the remaining funds should be donated to charity; otherwise Wales would lose their international fixtures. The WFU stood down and withdrew their subscription. The reaction in Wales was one of anger, with the people feeling that the WFU had bowed to English pressure, and had been bullied into a decision against the people's wishes.
In a move that was described as an act of hurt pride by social historian David Smith, but also as a manoeuvre to appease the Welsh supporters, in February 1897 the WFU wrote to the IRFB and withdrew their membership. The WFU claimed that they alone had authority over the matter because the IRFB did not have any rules regarding amateurism. The WFU then reinstated their subscription to Gould, and on Easter Monday 1897 a banquet was arranged at the Drill Hall in Newport in Gould's honour. Many civic and sporting worthies were in attendance to witness the WFU president Sir John Llewellyn present Gould with the title deeds of a house as a gift. The 250 guests, including David A. Thomas, were joined by a reed and string orchestra, the band of the Fourth Battalion of the South Wales Borderers, and galleries packed with members of the public.
Wales did not field an international team until the IRFB, supported by the RFU, recommended that the WFU be readmitted into the organisation in February 1898. The WFU agreed that they would in future abide by all IRFB by-laws, and that Gould would not be allowed to play in any future internationals. Gould accepted the ruling but returned to rugby as a referee and Welsh international selector. The compromise prevented a long-term split in the sport, and by 1901 the IRFB had added laws to the game banning professionalism, clarifying their authority on the issue.
## Later life and legacy
After retiring from rugby, Gould became a brewery representative around Newport. He remained a very popular figure and was followed by fans as he worked; his image was still worth money, appearing on merchandise such as cigarette cards and matchboxes.
Gould died in 1919, at the age of 54. Falling ill at work on 2 January, he was rushed home where he died later that day of an internal haemorrhage. His funeral was reported as the biggest ever seen in Wales up to that time; it was surpassed three decades later by that of the former British Prime Minister David Lloyd George. Gould was buried at St Woolos Cemetery, Newport. In reporting his death, The Times stated:
> To him more than anyone else is due the rise of Welsh football, and so football as we know it now. He did more than any one else to transform a game from one in which brute force and individual skill were the chief characteristics to one in which scientific combination became the main feature, without the sacrifice of individualism.
In 1923 a memorial fund was raised in Gould's name, the donations of which were given to the Royal Gwent Hospital in Newport. The hospital recognised the gift with the Arthur Gould Memorial Bed, inscribed: "To the memory of Arthur Gould – Greatest of Rugby Football Players". The bed was lost, however, when a portion of the hospital was demolished. Donations for the memorial, which totalled £1,525, were received from all over the world, and several matches were staged to raise funds, including a fixture between Newport and Cardiff.
Gould has been described as the first superstar of his sport by the rugby historian Terry Godwin, while David Smith, in the Official History of the Welsh Rugby Union, described him as the first player to achieve recognition beyond the national level, becoming in both meanings of the word "an international". The Welsh Academy's Encyclopedia of Wales, published over 90 years after his death, records Gould as "Welsh rugby's first superstar", while a 1919 obituary described him as "the most accomplished player of his generation". He set several long-standing records for his country, including captaining Wales 18 times, a number eventually surpassed by Ieuan Evans in 1994. Gould played 25 matches at centre for Wales, a record that stood until beaten by Steve Fenwick in 1980. He was also the most capped Welsh player, with 27, at the time of his retirement.
Gould was inducted into the Welsh Sports Hall of Fame in June 2007; members of his family were in attendance, including his granddaughter Mary Hales. When Newport RFC set up their own hall of fame in 2012, Gould was the first person inducted. He was inducted into the World Rugby Hall of Fame in November 2016.
# Bill Ponsford
William Harold Ponsford MBE (19 October 1900 – 6 April 1991) was an Australian cricketer. Usually playing as an opening batsman, he formed a successful and long-lived partnership opening the batting for Victoria and Australia with Bill Woodfull, his friend and state and national captain. Ponsford is the only player to twice break the world record for the highest individual score in first-class cricket; Ponsford and Brian Lara are the only cricketers to twice score 400 runs in an innings. Ponsford holds the Australian record for a partnership in Test cricket: 451 for the second wicket, set in 1934 with Don Bradman, the man who broke many of Ponsford's other individual records. The stand is also the highest partnership for any wicket in Tests played away from home.
Despite being heavily built, Ponsford was quick on his feet and renowned as one of the finest ever players of spin bowling. His bat, much heavier than the norm and nicknamed "Big Bertha", allowed him to drive powerfully and he possessed a strong cut shot. However, critics questioned his ability against fast bowling, and the hostile short-pitched English bowling in the Bodyline series of 1932–33 was a contributing factor in his early retirement from cricket a year and a half later. Ponsford also represented his state and country in baseball, and credited the sport with improving his cricketing skills.
Ponsford was a shy and taciturn man. After retiring from cricket, he went to some lengths to avoid interaction with the public. He spent over three decades working for the Melbourne Cricket Club, where he had some responsibility for the operations of the Melbourne Cricket Ground (MCG), the scene of many of his great performances with the bat. In 1986 the Western Stand at the MCG was renamed the W.H. Ponsford Stand in his honour. This stand was demolished in 2003 as part of the redevelopment of the ground for the 2006 Commonwealth Games, but its replacement was also named the W.H. Ponsford Stand. At the completion of the stadium redevelopment in 2005, a statue of Ponsford was installed outside the pavilion gates. In recognition of his contributions as a player, Ponsford was one of the ten initial inductees into the Australian Cricket Hall of Fame.
## Early life
The son of William and Elizabeth (née Best) Ponsford, Bill Ponsford was born in the Melbourne suburb of Fitzroy North on 19 October 1900. His father was a postman whose family had emigrated from Devon, England, to Bendigo, Victoria, to work in the mines during the 1850s gold rush. His mother was also born in the goldfields, at Guildford, before moving to Melbourne with her father, a Crown Lands bailiff. Ponsford grew up on Newry St in Fitzroy North, and attended the nearby Alfred Crescent School, which stood beside the Edinburgh Gardens.
Ponsford learnt the rudiments of cricket from his uncle Cuthbert Best—a former club player for Fitzroy. He had the best batting and bowling averages for his school team in 1913, 1914 and 1915 and eventually rose to the captaincy. His local grade club, Fitzroy, awarded Ponsford a medallion—presented by the local mayor—as the most outstanding cricketer for his school during the 1913–14 and 1914–15 seasons. The medallion was awarded along with an honorary membership of the club, and Ponsford trained enthusiastically, running from school to the nearby Brunswick Street Oval in the Edinburgh Gardens to practise in the nets. Les Cody, the general secretary of Fitzroy Cricket Club and a first-class cricketer with New South Wales and Victoria, was Ponsford's first cricketing role model.
In December 1914, Ponsford completed his schooling and earned a qualifying certificate, which allowed him to continue his education at a high school should he wish. He instead chose to attend a private training college, Hassett's, to study for the Bank Clerk's exam. Ponsford passed the exam and commenced employment with the State Savings Bank at the Elizabeth Street head office in early 1916. In May 1916, the Ponsford family moved to Orrong Road in Elsternwick, a wealthier part of Melbourne. Ponsford played with Fitzroy in a minor league for the remainder of the 1915–16 season, but under the geographical "zoning" rules in place for club cricket, he was required to transfer to St Kilda Cricket Club in the following season.
## Cricket career
### Early record breaking
The First World War and the creation of the First Australian Imperial Force led to a significant shortage of players available for cricket. As a result, Ponsford was called up to make his first-grade debut for St Kilda during the 1916–17 season, just one week before his sixteenth birthday. This match was against his old club Fitzroy, and was played at the familiar Brunswick Street Oval. The young Ponsford's shot-making lacked power, and after making twelve singles, he was bowled. He played ten matches in his first season with the St Kilda First XI and averaged 9.30 runs per innings. By the 1918–19 season, Ponsford topped the club batting averages with an average of 33. He also topped the bowling averages, taking 10 wickets at an average of 16.50 runs per wicket with his leg spin.
Despite failing to score a century for his club side (something he did not rectify until the 1923–24 season), Ponsford was called up to represent Victoria against the visiting England team in February 1921—his first-class cricket debut. His selection was controversial; the leading personality in Victorian cricket and national captain, "The Big Ship" Warwick Armstrong, had been dropped. Armstrong's omission sparked a series of angry public meetings protesting against the perceived persecution of Armstrong by administrators. While making his way to the Melbourne Cricket Ground (MCG) for the match, Ponsford had to walk through demonstrators carrying placards that denounced his selection at the expense of Armstrong. Without Armstrong, the Victorians were comfortably beaten by Johnny Douglas's English team by seven wickets. Batting down the order, Ponsford made six in the first innings and 19 in the second innings. Later that month, Ponsford made his maiden first-class century, scoring 162 against Tasmania at the NTCA Ground in Launceston, despite batting low in the order, at number eight.
Ponsford was named captain of a Victorian side made up of promising youngsters, to play against Tasmania at the MCG on 2–5 February 1923. In this, only his third first-class match, Ponsford broke the world record for the highest individual innings score at that level on the final day of the match, scoring 429 runs and batting for nearly eight hours. Along the way, he broke Armstrong's record for the highest score for Victoria (250), before surpassing former England captain Archie MacLaren's world record individual score of 424. The team score of 1,059 was also a new record for a first-class innings—an impromptu paint job was needed to show the score on a board that was not designed to display a four-figure total.
The Governor General of Australia, Lord Forster, visited the dressing rooms after the day's play to congratulate Ponsford personally. Cables from around the world applauded the new record-holder, including one from Frank Woolley, whose 305* was the previous highest score against Tasmania. The former world record holder MacLaren was not so forthcoming: he thought that the two teams were both short of first-class standard and therefore the record should not be recognised. However, an agreement made in 1908 confirmed that matches against Tasmania should be categorised as first-class matches. An exchange of letters between MacLaren and the Victorian Cricket Association, and speculation over possible political motives followed in the popular press, but the famous Wisden Cricketers' Almanack recognised and published Ponsford's score as the record.
Selected for his first Sheffield Shield match, against South Australia three weeks after his record-breaking innings, Ponsford—still batting down the order, at number five—made 108. The South Australian (and former Australian) captain Clem Hill watched Ponsford bat and commented, "[Ponsford] is young and full of promise; in fact, since Jim Mackay, the brilliant New South Welshman, I think he is the best." In 1923–24 Ponsford continued to score at a heavy rate. Against Queensland in December, he made 248 and shared in a partnership of 456 runs with Edgar Mayne—the highest first wicket partnership by an Australian pair to this day. Later that season, he scored a pair of centuries against arch-rivals New South Wales, accumulating 110 in each innings.
### Test debut and more records
Ponsford broke into international cricket in the 1924–25 season. After scoring 166 for Victoria against South Australia, and 81 for an Australian XI against the touring English team, he was selected for the first Test against England at the Sydney Cricket Ground (SCG). Batting at number three, Ponsford joined his captain Herbie Collins at the wicket after the dismissal of opening batsman Warren Bardsley. Although Ponsford initially struggled against the "baffling" swing bowling of Maurice Tate, the experienced Collins was confident enough to farm the strike during Tate's initial spell, and Ponsford went on to make a century (110) on his Test debut. Ponsford later said, "I was most grateful for Herbie taking [Tate's bowling] until I was settled in. I doubt I would have scored a century but for his selfless approach." He scored 128 in the second Test at Melbourne, thereby becoming the first batsman to score centuries in his first two Tests. Ponsford played in all five Tests of the series, scoring 468 runs at an average of 46.80.
There were no international visitors to Australia in the 1925–26 season, so Ponsford was able to play a full season for Victoria. He scored 701 runs at an average of 63.72, including three centuries, making him the fourth-highest run-scorer for the season. At the end of the season, Ponsford was chosen for the Australian team to tour England in 1926. He was one of the younger players in the squad; 9 of the 15 players were over the age of 36. He made a good start to the tour, scoring a century (110*) in his first innings at Lord's against the Marylebone Cricket Club in May. Unfortunately for Ponsford, tonsillitis caused him to miss three weeks of cricket in June and he was not chosen for the first three Tests of the English summer. He returned for the fourth and fifth Tests. The fifth Test was the only match that saw a result—an English victory—which meant that the hosts won the series and the Ashes one Test to nil. For the tour, Ponsford made 901 runs at an average of 40.95, including three centuries. Wisden described Ponsford's performances for the season as "something of a disappointment" but noted that "he batted well enough on occasion to demonstrate his undoubted abilities".
In the season following his return to Australia, Ponsford continued to make large scores. He started the season by hitting 214 runs (out of a Victoria team total of 315) against South Australia at the Adelaide Oval and followed this with 151 at the MCG against Queensland. In his next match, against New South Wales, Ponsford again rewrote the record books. He scored 352 runs, 334 of them in a single day, and helped Victoria to an innings total of 1,107, breaking Victoria's own record set four years earlier; the total remains the highest in first-class cricket. On playing the ball back onto his stumps to be dismissed, Ponsford turned to look at his broken wicket and famously said, "Cripes, I am unlucky." For the season, Ponsford scored 1,229 runs at an average of 122.90, including six centuries and two half-centuries from only ten innings.
In the 1927–28 season, Ponsford continued where he had left off at the end of the previous summer. Ponsford topped the aggregate and the averages for the season, scoring 1,217 runs at an average of 152.12. In December 1927, he improved on his own first-class world record score, hitting 437 against Queensland; later that month he scored 202 and 38 against New South Wales and he then added another 336 against South Australia over the New Year. He had scored 1,013 runs in the space of four innings. This feat was part of a sequence in which he scored a century in a record ten consecutive first-class matches from December 1926 to December 1927. In January 1928 the Daily News in London described Ponsford as "the most remarkable and the most heart-breaking scoring-machine ever invented". Ponsford toured New Zealand with an Australian squad in 1928. In the six first-class matches scheduled, he scored 452 at an average of 56.50, second only to his opening partner Bill Woodfull in both average and aggregate. In the 1929–30 domestic season, Ponsford scored 729 runs at an average of 45.56, including three centuries, to finish fourth in the season aggregates.
### Struggles and success
A strong England team—captained by Percy Chapman and including Jack Hobbs, Herbert Sutcliffe, Wally Hammond and Harold Larwood—toured Australia in 1928–29. Ponsford's form was good in the lead-up to the Tests; he scored 60 not out for Victoria against the tourists, and added 275* against South Australia. Before the Test series started, Ponsford had declared in a column in the Herald that Harold Larwood's "pace through the air is not all that fast for a fast bowler", with the qualification that "he makes great pace off the pitch". Larwood dismissed him for scores of two and six in the first Test, and fractured a bone in Ponsford's hand in the second. The injury sidelined Ponsford for the remainder of the Test series.
Ponsford travelled to England for a second time, with the 1930 Australian team. In a wet summer, Australia won the series two Tests to one, recovering the Ashes. For the second time in as many trips to England, Ponsford fell ill—gastritis caused him to miss the third Test at Headingley Stadium. Despite this setback, Ponsford scored 330 runs in the Tests at an average of 55.00. Wisden wrote, "Ponsford had a much better season—especially in the Test matches—than four years previously. ... In helping his captain to wear down England's bowling he accomplished great work and, even if he was seldom really attractive to watch, there could be no question about his skill and how difficult he was to get out." The outstanding performer of the tour was the young Don Bradman, who scored 974 runs in the Test matches—this remains a world record for the most runs scored in a Test series. Ponsford played a part in Bradman's success; Wisden stated that "it is only fair to say that on more than one occasion [Bradman's] task was rendered the easier by the skilful manner in which Woodfull and Ponsford, by batting of different description, had taken the sting out of the England bowling."
In 1930–31, the West Indies sent their first-ever touring team to Australia for a five-Test series. Ponsford was paired with a new partner, Archie Jackson; Woodfull chose to bat down the order to allow the young New South Welshman to open the batting. The change had little effect on Ponsford, who scored 467 runs at an average of 77.83 against the Caribbean tourists. Ponsford and Jackson started the Test series well, their 172-run partnership in the second innings taking Australia to a 10-wicket victory in the first Test. Ponsford finished just short of his century, unbeaten on 92. Before walking out to bat, Jackson had said to Ponsford, "I see the skipper padded up. We won't give him a hit!" Jackson failed in the second Test at the SCG, but Ponsford went on to make his highest Test score to date, 183, before being bowled by Tommy Scott. Another century (109) in the third Test was part of a 229-run partnership with Bradman, who went on to score 223. Ponsford was reunited with Woodfull as his opening partner for the remaining Tests after Jackson, ill and struggling for form, was omitted. The West Indies had a famous victory in the fifth Test, but lost the series 4–1.
Ponsford had less success against the South Africans during their tour of Australia in 1931–32. While the Australians swept the Test series 5–0, Ponsford's highest score in the four Tests he played was 34; he totalled 97 runs at an average of 19.40. It was Bradman who dominated with the bat for Australia, scoring four centuries and 806 runs overall.
### Bodyline
In response to the record-breaking feats of Don Bradman, the English team that toured Australia in 1932–33—led by Douglas Jardine—adopted a tactic of fast, short-pitched bowling directed at the body, later known as Bodyline. While Bodyline sought to curb Bradman, it was used against all the Australian batsmen, including Ponsford. After being bowled twice behind his legs in the first Test at Sydney—by Larwood for 32 in the first innings and by Bill Voce for two in the second—Ponsford was omitted from the team for the second Test at Melbourne. Ponsford returned for the third Test in Adelaide, batting down the order. The Test was controversial and highly acrimonious; several Australian batsmen—including Woodfull and Bert Oldfield—were hit on the body and head by the English fast bowling. Ponsford was hit on several occasions during his innings of 85; he chose to turn his torso and take the rising balls on his body—especially on his left shoulder blade and backside—rather than risk a catch to the leg-side fielders. When Ponsford returned to the dressing room after his dismissal, his teammates were amazed by the mass of bruises that covered his back and shoulders. Ponsford remarked to Bill O'Reilly, "I wouldn't mind having a couple more if I could get a hundred."
After failing in the fourth Test, Ponsford was again dropped. The hostile barrage of short-pitched bowling had a major impact on Ponsford's technique and career. In the three Tests that Ponsford played during the Bodyline series, he estimated he was hit around fifty times. During the series Ponsford developed a habit of turning his back on the rising ball and, if hit, glowering at the affected bowler. While the manager of the England team, Pelham Warner, thought that Ponsford "met the fast-leg theory in plucky and able style", this behaviour was criticised by the British cricket writer, R. C. Robertson-Glasgow. Bradman thought that the Bodyline tactics hastened Ponsford's eventual retirement.
### Triumph and retirement
After the disappointments of the Bodyline series, Ponsford returned to domestic cricket in 1933–34, scoring 606 runs at an average of 50.50. At the end of the domestic season, he was selected for his third tour of England with the Australian team in 1934. Illness again interrupted Ponsford's English summer, causing him to miss the second Test at Lord's. In the final two Tests of the series, the two record breakers—Ponsford and Bradman—combined in two remarkable partnerships.
In the fourth Test at Headingley, Bradman joined Ponsford at the fall of the third wicket when the Australians had scored only 39 runs (39/3). By the time Ponsford was dismissed hit wicket for 181, Australia were 427/4; the partnership had yielded 388 runs. Bradman went on to make 304. The partnership was the highest ever in Test cricket at the time and as of 2009 is still the highest fourth wicket partnership for Australia. Wisden said of Ponsford's innings "... he hit the ball hard and placed it well when scoring in front of the wicket. Moreover, his defence was rock-like in its steadiness and accuracy."
With the series locked at 1–1, the fifth and deciding Test at The Oval saw an even larger partnership between Bradman and Ponsford. The pair added 451 runs for the second wicket in an Australian total of 701 runs. Bradman scored 244 and Ponsford—again dismissed hit wicket—his highest Test score, 266. This partnership remained the highest in Tests until 1991 and the highest for the second wicket until 1997. As of 2009, it remains the highest ever in Australian Test history. Again Wisden was complimentary, saying "It would be hard to speak in too high terms of praise of the magnificent displays of batting given by Ponsford and Bradman" but noted that "Before Bradman joined him Ponsford had shown an inclination to draw away from the bowling of Bowes."
In the four Tests that Ponsford played that English summer, he made 569 runs at an average of 94.83. His performance saw him named as one of the five Wisden Cricketers of the Year in 1935.
> It is, perhaps, scarcely too much to say that English bowlers last summer thought he was every bit as difficult to get rid of as Bradman. Never a graceful or elegant batsman, Ponsford could with greater emphasis be called sound and workmanlike. He seemed in 1934 to hit the ball much harder than when he was here in 1926 and 1930, while his placing improved out of all knowledge. A delivery overpitched to any degree, he almost invariably punished to the full, while he could cut and turn the ball to leg with great certainty.
Upon their return to Australia, a testimonial match was arranged on behalf of the two Victorian opening batsmen, Woodfull and Ponsford. Woodfull—the senior member of the partnership—had announced his retirement from first-class cricket before returning from England and the press had speculated that Ponsford would succeed him as captain of Victoria. Walking out to bat in the match, the pair were cheered by the crowd to the strains of "For He's a Jolly Good Fellow". Together, the two Bills made another century partnership, before Ponsford was dismissed for 83; Woodfull went on to make a century.
During the match, to the surprise of the public, the press and his teammates, Ponsford announced his retirement from first-class cricket at the relatively young age of 34. His announcement remarked upon the changing atmosphere in high level cricket and touched on the effects of Bodyline.
> I am feeling the strain of the last tour. I am thirty four and when you get to that age you start to lose your keenness. ... Test cricket has become too serious. It is not a game anymore but a battle ... I can remember when it was all quite different to what it is now. I do not want to refer to that "bodyline" business—I am out of all that. Cricket was a different game before bodyline. Naturally I have a tinge of regret ... but it is better to go out of cricket before being dropped.
Woodfull remarked that Ponsford's retirement was premature, while teammate and journalist Jack Fingleton believed that the task of maintaining such high standards had affected Ponsford's nervous energy: "At the age of 34 he felt that he never wanted to see a bat or a cricket game again." Arthur Mailey suspected that Ponsford's sensitivity to criticism, especially from the media, was a key factor behind the early retirement. The memory of being omitted from the Australian side twice during the Bodyline series also stung Ponsford sorely. Ponsford continued playing for the Melbourne Cricket Club until 1939, but never represented his state or country again.
## Off the field
### Personal life
Ponsford began his working life at the State Savings Bank. On his return from England in 1926, the bank advised him that they might not tolerate so much leave for cricket in the future. Ponsford received a lucrative offer to play for Blackpool Cricket Club, which he was inclined to accept. This news was received with dismay by Australian fans, who had earlier seen players such as Ted McDonald leave Australia and accept contracts in the professional English leagues. To keep Ponsford in Melbourne, The Herald—a local newspaper—employed him on the basis that he would remain available for all representative cricket. The new role included writing articles for the paper.
In 1932, at the end of his five-year contract with the newspaper, Ponsford successfully applied for a position on the staff of the Melbourne Cricket Club. He was appointed to an unspecified office job working for the club secretary Hugh Trumble, which required him to transfer his cricket and baseball allegiances from St Kilda to Melbourne. The Herald unsuccessfully tried to retain his services, and Keith Murdoch—the Editor-in-Chief of the Herald and father of Rupert Murdoch—visited the Ponsford home to lobby against the move. Ponsford's new role included managing the staffing arrangements and crowd control at the Melbourne Cricket Ground for Australian rules football and cricket matches. In 1956, following the retirement of Vernon Ransford, Ponsford unsuccessfully applied for the position of club secretary, effectively its chief executive officer and one of the most prestigious positions in Australian cricket. In the event, the position went to the recently retired Test cricketer Ian Johnson. Ponsford remained with the club until his retirement in June 1969.
Ponsford met Vera Neill at his local Methodist church; the pair married in 1924 and settled in the Melbourne suburb of Caulfield South. They had two sons, Bill Jr. and Geoff. Ponsford became a Freemason in 1922 and continued in the movement until 1985, retiring with the rank of Master Mason. During the Second World War, Ponsford attempted to volunteer with the Royal Australian Air Force, but was rejected on account of his colour blindness. In 1978, four years after the death of his wife, Ponsford moved in with his son, Geoff, at Woodend in rural Victoria, and was an active lawn bowler. An infection after an operation in 1988 saw Ponsford admitted to a nursing home in nearby Kyneton. He died there on 6 April 1991; at the time he was Australia's oldest living Test cricketer.
### Baseball
Baseball was a reasonably popular sport in Australia in the early 20th century and Ponsford alternated between cricket and baseball throughout his sporting life. At the time, baseball was generally played in Australia during the winter months, as many of the leading players were enthusiastic cricketers who viewed the sport as a means of improving their fielding skills. As with cricket, Ponsford started his baseball career at Alfred Crescent School, where his coach was the former Victorian player Charles Landsdown. As a junior Ponsford played shortstop; later, as a senior with the Fitzroy Baseball Club, he converted to catcher.
Ponsford improved rapidly and by 1913 he was included in the Victorian schoolboys side for a tournament in Adelaide. He was again selected in the following year—now as a catcher—representing his state at the first national schoolboys championship in Sydney. The tournament coincided with a visit to Australia by two professional major league teams from the United States—the Chicago White Sox and the New York Giants. The manager of the Giants, John "Mugsy" McGraw, watched part of the tournament; the Ponsford family claim that McGraw was so impressed with Ponsford's skills that he later spoke to Ponsford's parents about the possibility of Bill playing in the United States.
In 1919, Ponsford was selected for Victoria's baseball team, alongside future Test cricket teammate Jack Ryder. In 1923, The Sporting Globe claimed that Ponsford was "... the best batter of the season. ... Indeed, as an all-round man, it is doubtful if he has a superior in the state." In 1925, Ponsford captained the Victorian team and was selected as centre fielder in an Australian representative team that played three matches against an outfit from the United States Pacific Fleet, which had docked in Melbourne. Over the three matches, won by the Australians, Ponsford made five safe hits, gained eight bases and his batting average was .357. Ponsford's next match against American opposition was against a team from Stanford University that visited Australia in 1927. Ponsford's Victorian team defeated Stanford 5–3; it was the visitors' only loss on the tour.
Ponsford simultaneously retired from baseball and cricket in 1934. In his newspaper column, he said that he liked both sports equally. He felt that baseball gave a player more opportunities to perform: "In cricket you may have the bad luck to get out early; which often means a blank afternoon. It is not so with baseball; you are in the game all the time." Joe Clark, the author of History of Australian Baseball, said "Ponsford is considered by many to be the best baseballer of his time in Australia." The official program for the 1952 Claxton Shield—held in Perth—made a similar claim.
> One name in Australian baseball stands pre-eminent above all others and that is the name of Bill Ponsford ... During his long career he was a star outfielder, perhaps the finest third baseman to represent his state and certainly as a catcher the equal of anybody. ... But it was as a batter that Bill outshone anyone ... Ponsford could, and did, hit to any part of a baseball field at will, and would nominate innings by innings, where he would hit the ball ... Ponsford will always remain amongst the greatest sportsmen of all time.
## Context
### Legacy and statistical analysis
In first-class cricket, Ponsford scored 13,819 runs at an average of 65.18, as of 2009 the fifth highest complete career average of any player, worldwide. Ponsford was not satisfied with merely making centuries; he strove to score 200 and more. Arriving in big cricket a few years before Bradman, for a time Ponsford was considered the heaviest scorer in cricket history. Jack Fingleton claimed that "The true perspective of Ponsford's deeds had barely dawned on the game when Bradman ruthlessly thrust him from public thought ..."
Apart from Brian Lara, Ponsford is the only man to twice score 400 runs in a first-class innings, and, along with Bradman and Wally Hammond, he remains one of only three men to have scored four triple-centuries. His 437 against Queensland is, as of 2009, still the fifth highest score in first-class cricket.
Ponsford was known for batting in partnerships, sharing in five that amassed over 375 runs each. Ponsford and his long-time partner, Woodfull, were known as "the two Bills", "Willy Wo and Willy Po" and "Mutt and Jeff" amongst other names. Together, the pair made 23 century partnerships; 12 of these exceeded 150 runs. Ponsford's other prolific partnership was with Bradman. In two Tests in 1934, the pair set records that still stand today:
- The highest partnership for Australia in Test cricket and the highest for the second wicket: 451
- The third highest partnership for Australia in Test cricket and the highest for Australia for the fourth wicket: 388
Cricket writer Ray Robinson said of the pair batting together, "[Ponsford] was the only one who could play in Bradman's company and make it a duet."
For services to cricket, Ponsford was made a Member of the Order of the British Empire (MBE) in the 1982 New Year Honours announced on 31 December 1981. Ponsford was one of the ten inaugural inductees when the Australian Cricket Hall of Fame was launched in 1996. In 2000, Ponsford and Arthur Morris were chosen to open the batting for the Australian Cricket Board's Team of the Century, a theoretical selection of the best team of Australian cricketers of the 20th century. In 2001, Ponsford was selected in the Melbourne Cricket Club Team of the Century.
In 1986 the Western Stand of the Melbourne Cricket Ground was renamed the "W.H. Ponsford Stand". Ponsford was described by his son as being "tickled pink" by the honour, but he would only agree to the renaming if he was not required to participate in any public appearance or media interview. As part of the ongoing modernisation of the MCG, the W.H. Ponsford Stand was torn down in 2003; the new stand was completed in 2004 and again named in his honour. A statue of the cricketer was installed outside the W.H. Ponsford Stand in 2005—one of a series in place around the stadium commemorating Australia's sporting heroes.
### Style and personality
Answering to the nickname of "Puddin'", Ponsford was a thickset man, weighing in at around 13 stone (83 kg) during his playing career. Despite this, he was known for his quick footwork, and was regarded as an excellent player of spin bowling. Ponsford was noted for his ability to maintain intense levels of concentration for extended periods. He possessed a strong cut shot and he drove through mid off powerfully, although critics noted that his backlift was not completely straight. He had a tendency to shuffle too far to the off; this exposed his leg stump and he was bowled behind his legs on six occasions in Tests against England. However, Ray Robinson felt that "no bowler could have got a marble, much less a [cricket] ball between his bat and his left leg."
Fingleton wrote, "He crouched a little at the crease ... he tapped the ground impatiently with his bat while awaiting the ball, and his feet were so eager to be on the move that they began an impulsive move forward just before the ball was bowled. This was the shuffle that sometimes took him across the pitch against a fast bowler; but, that aside, his footwork was perfection. I never saw a better forcer of the ball to the on-side, and for this stroke his body moved beautifully into position." However, Ponsford was not a stylish batsman. Bradman said "There were more beautiful players, but for absolute efficiency and results where can one turn to equal [Ponsford]?" Robinson described Ponsford as the "founder of total batting, the first to make a habit of regarding 100 as merely the opening battle in a campaign for a larger triumph." The New South Wales and Australian bowler Arthur Mailey later said that "I don't think it was the rungetting Ponny enjoyed so much as the bowlers' discomfort, especially when those bowlers came from New South Wales."
Ponsford used a heavy bat—2 pounds 10 ounces (1.2 kg)—nicknamed "Big Bertha". Opposition players sometimes joked that Ponsford's bat was larger than allowed under the laws of cricket and indeed in one match in Sydney, it was found to be slightly larger than permitted—the result of the bat spreading from his powerful hitting. Throughout his innings, Ponsford would pull his cap further to the left. Robinson claimed that "if you saw the peak at a rakish angle towards his left ear you could tell he was heading for his second hundred". When volunteering for service with the Royal Australian Air Force, Ponsford discovered he possessed abnormal colour vision, unable to distinguish red from green. The examining doctor was astonished and asked Ponsford, "What colour did [the ball] look to you after it was worn?" Ponsford replied, "I never noticed its colour, only its size." A later study identified Ponsford's specific colour vision as protanopia, a form of dichromacy in which red appears dark. Ponsford did not enjoy batting on rain-affected wickets. When on tour his teammates did not ask if it had rained last night, merely "Did Ponny wake during the night?"—legend had it that even the slightest trickle would wake him and have him anguishing over having to bat on the "sticky" in the morning.
Ponsford was a shy person, on the field and off. Robinson wrote that Ponsford "was so reserved that you had to know him for three years or the duration of a Test tour before his reticence relaxed." Similarly, when photographed Ponsford would hang his head so his cap would cover most of his face. This shyness intensified after his retirement. He would often walk along laneways to his work at the MCC, rather than be recognised on the way to the train station. While on the train, he would cover his face with the newspaper. At work, he disliked interaction with the public and would direct staff to advise visitors that he was not in, despite often being clearly in view. Bill O'Reilly said of Ponsford, "He spoke rarely and even then only if he could improve on silence." Nonetheless he was popular with his teammates and was said to have a droll sense of humour. |
# Siege of Sidney Street
The siege of Sidney Street of January 1911, also known as the Battle of Stepney, was a gunfight in the East End of London between a combined police and army force and two Latvian revolutionaries. The siege was the culmination of a series of events that began in December 1910 with an attempted jewellery robbery at Houndsditch in the City of London by a gang of Latvian immigrants, which resulted in the murder of three policemen, the wounding of two others, and the death of George Gardstein, a key member of the gang.
An investigation by the Metropolitan and City of London Police forces identified Gardstein's accomplices, most of whom were arrested within two weeks. The police were informed that the last two members of the gang were hiding at 100 Sidney Street in Stepney. The police evacuated local residents, and on the morning of 3 January a firefight broke out. Armed with inferior weapons, the police sought assistance from the army. The siege lasted for about six hours. Towards the end of the stand-off, the building caught fire; no single cause has been identified. One of the two men in the building was shot before the fire spread. While the London Fire Brigade were damping down the ruins—in which they found the two bodies—the building collapsed, killing a fireman.
The siege marked the first time the police had requested military assistance in London to deal with an armed stand-off. It was also the first siege in Britain to be caught on camera, as the events were filmed by Pathé News. Some of the footage included images of the Home Secretary, Winston Churchill. His presence caused a political row over the level of his operational involvement. At the trial in May 1911 of those arrested for the Houndsditch jewellery robbery, all but one of the accused were acquitted; the conviction was overturned on appeal. The events were fictionalised in film—in The Man Who Knew Too Much (1934) and The Siege of Sidney Street (1960)—and novels. On the centenary of the events two tower blocks in Sidney Street were named after Peter the Painter, one of the minor members of the gang who was probably not present at either Houndsditch or Sidney Street. The murdered policemen and the fireman who died are commemorated with memorial plaques.
## Background
### Immigration and demographics in London
In the 19th century, the Russian Empire was home to about five million Jews, the largest Jewish community at the time. Subjected to religious persecution and violent pogroms, many emigrated, and between 1875 and 1914 around 120,000 arrived in the United Kingdom, mostly in England. The influx reached its peak in the late 1890s when large numbers of Jewish immigrants—mostly poor and semi-skilled or unskilled—settled in the East End of London. In some areas the concentration of Jewish immigrants approached 100 per cent of the population, and in a study undertaken in 1900 both Houndsditch and Whitechapel were identified as a "well-defined intensely Jewish district".
Some of the expatriates were revolutionaries, many of whom struggled to adapt to life in the politically less oppressive environment of London. The social historian William J. Fishman writes that "the meschuggena (crazy) Anarchists were almost accepted as part of the East End landscape"; the terms "socialist" and "anarchist" had been conflated in the British press, which used them interchangeably to refer to those with revolutionary beliefs. A leading article in The Times described the Whitechapel area as one that "harbours some of the worst alien anarchists and criminals who seek our too hospitable shore. And these are the men who use the pistol and the knife."
From the turn of the century, gang warfare persisted in the Whitechapel and Aldgate areas of London between groups of Bessarabians and refugees from Odessa; various revolutionary factions were active in the area. The Tottenham Outrage of January 1909, carried out by two Russian revolutionaries in London—Paul Helfeld and Jacob Lepidus—was an attempted robbery of a payroll van that left two dead and twenty injured. The robbery used a tactic often employed by revolutionary groups in Russia: the expropriation or theft of private property to fund radical activities.
The influx of émigrés, and the increase in violent crime associated with it, led to public concern and comment in the press. The government passed the Aliens Act 1905 in an attempt to reduce immigration. The popular press reflected the opinions of many at the time; a leading article in The Manchester Evening Chronicle supported the bill to bar "the dirty, destitute, diseased, verminous and criminal foreigner who dumps himself on our soil". The journalist Robert Winder, in his examination of migration into Britain, opines that the Act "gave official sanction to xenophobic reflexes which might ... have remained dormant".
### Latvian émigré gang
By 1910 Russian émigrés met regularly at the Anarchist Club in Jubilee Street, Stepney. Many of its members were not anarchists, and the club became a meeting and social venue for the Russian émigré diaspora, most of whom were Jewish. The small group of Latvians who became involved in the events at Houndsditch and Sidney Street were not all anarchists—although anarchist literature was later found among their possessions. Most members of the group were revolutionaries who had been radicalised by their involvement in the unsuccessful 1905 revolution in Latvia and its violent suppression. All had left-wing political views and believed the expropriation of private property was a valid practice.
A leading figure in the group was George Gardstein, whose real name was probably Hartmanis; he also used the aliases Garstin, Poloski, Poolka, Morountzeff, Mourimitz, Maurivitz, Milowitz, Morintz, Morin and Levi. Gardstein, who probably was an anarchist, had been accused of murder and acts of terrorism in Warsaw in 1905 before his arrival in London. Another member of the group, Jacob (or Yakov) Peters, had been an agitator in Russia while in the army and later as a dockyard worker. He had served a term in prison for his activities and had been tortured by the removal of his fingernails. Yourka Dubof was another Russian agitator who had fled to England after being flogged by Cossacks. Fritz Svaars (Fricis Svars) was a Latvian who had been arrested by the Russian authorities three times for terrorist offences, but escaped each time. He had travelled through the United States, where he undertook a series of robberies, before arriving in London in June 1910.
Another member was "Peter the Painter", a nickname for an man also known as Peter Piaktow (or Piatkov, Pjatkov or Piaktoff); his real name was Janis Zaklis. The police suggested he was the ringleader of the gang, although there is no evidence that he was present at Houndsditch or Sidney Street. William (or Joseph) Sokoloff (or Sokolow) was a Latvian who had lived in Latvia and had been arrested in Riga in 1905 for murder and robbery before travelling to London. Another of the group's members was Karl Hoffman—whose real name was Alfred Dzircol—who had been involved in revolutionary and criminal activities for several years, including gun-running. In London he had worked as a decorator. John Rosen—real name John Zelin or Tzelin—came to London in 1909 from Riga and worked as a barber, while another member of the gang was Max Smoller, also known as Joe Levi and "Josepf the Jew". He was wanted in his native Crimea for several jewel robberies. Three women members of the gang, or associates of members of the gang, were among those who faced charges arising from the Houndsditch robbery attempt: Nina Vassileva—who was convicted of a minor offence but was cleared on appeal—Luba Milstein and Rosa Trassjonsky.
### Policing in the capital
Following the Metropolitan Police Act 1829 and the City of London Police Act 1839, the capital was policed by two forces, the Metropolitan Police, who held sway over most of the capital, and the City of London Police, who were responsible for law enforcement within the historic City boundaries. The events in Houndsditch in December 1910 fell into the purview of the City of London service, and the subsequent actions at Sidney Street in January 1911 were in the jurisdiction of the Metropolitan force. Both services came under the political control of the Home Secretary, who in 1911 was the 36-year-old rising politician Winston Churchill.
While on the beat, or in the course of their normal duties, the officers of the City of London and Metropolitan forces were provided with a short wooden truncheon for protection. When they faced armed opponents—as was the case in Sidney Street—the police were issued with Webley and Bull Dog revolvers, shotguns and small-bore rifles fitted with .22 Morris-tube barrels, the last of which were more commonly used in small indoor shooting galleries.
## Houndsditch murders, December 1910
At the beginning of December 1910 Smoller, using the name Joe Levi, visited Exchange Buildings, a small cul-de-sac that backed onto the properties of Houndsditch. He rented No. 11 Exchange Buildings; a week later Svaars rented number 9 for a month, saying he needed it for storage. The gang were unable to rent number 10, which was directly behind their target, 119 Houndsditch, the jeweller's shop owned by Henry Samuel Harris. The safe in the jeweller's was reputed to contain between £20,000 and £30,000 worth of jewellery; Harris's son later stated the total was only around £7,000. Over the next two weeks the gang brought in various pieces of necessary equipment, including a 60 foot (18 m) length of India rubber gas hose, a cylinder of compressed gas and a selection of tools, including diamond-tipped drills. Some of this equipment had been obtained from the Italian anarchist exile Errico Malatesta, who had a workshop in Islington; he was not aware it was for use in a robbery.
With the exception of Gardstein, the identities of the gang members present in Houndsditch on the night of 16 December 1910 have never been confirmed. It is likely that as well as Gardstein, Fritz Svaars and William Sokoloff—the two gunmen who died in the Sidney Street siege—were present, along with Max Smoller and Nina Vassileva. Bernard Porter, writing in the Dictionary of National Biography, considers that Peter the Painter was not at the property that night. Donald Rumbelow, a former policeman who wrote a history of the events, takes a different view. He considers that those present were Gardstein, Smoller, Peters and Dubof, with a second group—which included Sokoloff and Svaars—held in reserve in case the work needed to continue into the following day. Rumbelow considers a third group on standby, staying at Hoffman's lodgings, to have comprised Hoffman, Rosen and Osip Federoff, an unemployed locksmith. He also considers that Peter the Painter and Nina Vassileva were present at the events, either as lookouts or in unknown capacities.
On 16 December, working from the small yard behind 11 Exchange Buildings, the gang began to break through the back wall of the shop; number 10 had been unoccupied since 12 December. At around 10:00 that evening, returning to his home at 120 Houndsditch, Max Weil heard curious noises coming from his neighbour's property. Outside his house Weil found Police Constable Piper on his beat and informed him of the noises. Piper checked at 118 and 121 Houndsditch, where he could hear the noise, which he thought was unusual enough to investigate further. At 11:00 he knocked at the door of 11 Exchange Buildings—the only property with a light on in the back. The door was opened in a furtive manner and Piper immediately became suspicious. So as not to arouse the man's suspicions, Piper asked him, "Is the missus in?" The man answered in broken English that she was out, and the policeman said he would return later.
Piper reported that as he was leaving Exchange Buildings to return to Houndsditch he saw a man acting suspiciously in the shadows of the cul-de-sac. As the policeman approached him, the man walked away; Piper later described him as being approximately 5 feet 7 inches (1.70 m) tall, pale and fair-haired. When Piper reached Houndsditch he saw two policemen from the adjoining beats—constables Woodhams and Walter Choate—who watched 120 Houndsditch and 11 Exchange Buildings while Piper went to the nearby Bishopsgate Police Station to report. By 11:30 seven uniformed and two plain clothes policemen had gathered in the locality, each armed with his wooden truncheon. Sergeant Robert Bentley from Bishopsgate police station knocked at number 11, unaware that Piper had already done so, which alerted the gang. The door was answered by Gardstein, who made no response when Bentley asked if anyone was working there. Bentley asked him to fetch someone who spoke English; Gardstein left the door half-closed and disappeared inside. Bentley entered the hall with Sergeant Bryant and Constable Woodhams; seeing the bottom of a man's trouser legs on the stairs, they soon realised that someone was watching them from above. The police asked the man if they could step into the back of the property, and he agreed. As Bentley moved forward, the back door opened and one of the gang exited, firing a pistol as he did so; the man on the stairs also began firing. Bentley was shot in the shoulder and the neck—the second round severing his spine. Bryant was shot in the arm and chest and Woodhams was wounded in the leg, which broke his femur; both collapsed. Although they survived, neither Bryant nor Woodhams fully recovered from their injuries.
As the gang exited the property and made to escape up the cul-de-sac, other police intervened. Sergeant Charles Tucker from Bishopsgate police station was hit twice, once in the hip and once in the heart by Peters: he died instantly. Choate grabbed Gardstein and wrestled for his gun, but the Russian managed to shoot him in the leg. Other members of the gang ran to Gardstein's assistance, shooting Choate twelve times in the process, but Gardstein was also wounded; as the policeman collapsed, Gardstein was carried away by his accomplices, who included Peters. As these men, aided by an unknown woman, made their escape with Gardstein they were accosted by Isaac Levy, a passer-by, whom they threatened at pistol-point. He was the only witness to the escape who was able to provide firm details; other witnesses confirmed they saw a group of three men and a woman, and thought one of the men was drunk as he was being helped by his friends. The group went to Svaars' and Peter the Painter's lodgings at 59 Grove Street (now Golding Street), off Commercial Road, where Gardstein was tended by two of the gang's associates, Milstein and Trassjonsky. As they left Gardstein on the bed, Peters left his Dreyse pistol under the mattress, either to make it seem the wounded man was the one who had killed Tucker, or to enable him to defend himself against a possible arrest.
Other policemen arrived in Houndsditch, and began to attend to the wounded. Tucker's body was put into a taxi and taken to the London Hospital (now the Royal London Hospital) in Whitechapel Road. Choate was also taken there, where he was operated on, but he died at 5:30 am on 17 December. Bentley was taken to St Bartholomew's Hospital. He was half-conscious on arrival, but recovered enough to have a conversation with his pregnant wife and answer questions about the events. At 6:45 pm on 17 December his condition worsened, and he died at 7:30. The killings of Tucker, Bentley and Choate remain one of the largest multiple murders of police officers carried out in Britain in peacetime.
## Investigation, 17 December 1910 – 2 January 1911
The City of London police informed the Metropolitan force, as their protocol demanded, and both services issued revolvers to the detectives involved in the search. The subsequent investigation was challenging for the police because of the cultural differences between the British police and the largely foreign residents of the area covered by the search; neither force had any Russian, Latvian or Yiddish speakers.
In the early hours of the morning of 17 December Milstein and Trassjonsky became increasingly concerned as Gardstein's condition worsened, and they sent for a local doctor, explaining that their patient had been wounded accidentally by a friend. The doctor thought the bullet was still in the chest—it was later found to be touching the right ventricle of the heart. The doctor wanted to take Gardstein to the London Hospital, but he refused; with no other course open to him, the doctor sold them pain medication and left. The Russian was dead by 9:00 that morning. The doctor returned at 11:00 am and found the body. He had not heard of the events at Exchange Buildings the night before, and so reported the death to the coroner, not the police. At midday the coroner reported the death to the local police who, led by Divisional Detective Inspector Frederick Wensley, went to Grove Street and discovered the corpse. Trassjonsky was in the next room when they entered, and she was soon found by the police, hastily burning papers; she was arrested and taken to the police headquarters at Old Jewry. Many of the papers recovered linked the suspects to the East End, particularly to the anarchist groups active in the area. Wensley, who had extensive knowledge of the Whitechapel area, subsequently acted as a liaison officer to the City of London force throughout the investigation.
Gardstein's body was removed to a local mortuary where his face was cleaned, his hair brushed, his eyes opened and his photograph taken. The picture, and descriptions of those who had helped Gardstein escape from Exchange Buildings, were distributed on posters in English and Russian, asking locals for information. About 90 detectives vigorously searched the East End, spreading details of those they were looking for. A local landlord, Isaac Gordon, reported one of his lodgers, Nina Vassileva, after she had told him she had been one of the people living at Exchange Buildings. Wensley questioned the woman, finding anarchist publications in her rooms, along with a photograph of Gardstein. Information began to come in from the public and the group's associates: on 18 December Federoff was arrested at home, and on 22 December Dubof and Peters were both captured.
On 22 December a public memorial service took place for Tucker, Bentley and Choate at St Paul's Cathedral. King George V was represented by Edward Wallington, his Groom in Waiting; also present were Churchill and the Lord Mayor of London. The crime had shocked Londoners, and the strength of public feeling was evident at the service. An estimated ten thousand people waited in St Paul's environs, and many local businesses closed as a mark of respect; the nearby London Stock Exchange ceased trading for half an hour to allow traders and staff to watch the procession along Threadneedle Street. After the service, when the coffins were being transported on an eight-mile (13 km) journey to the cemeteries, it was estimated that 750,000 people lined the route, many throwing flowers onto the hearses as they passed.
Identity parades were held at Bishopsgate police station on 23 December. Isaac Levy, who had seen the group leaving Exchange Buildings, identified Peters and Dubof as the two he had seen carrying Gardstein. Witnesses also placed Federoff at the scene. The following day Federoff, Peters and Dubof all appeared at the Guildhall police court, where they were charged in connection with the murder of the three policemen and with conspiracy to burgle the jewellery shop. All three pleaded not guilty.
On 27 December the poster bearing Gardstein's picture was seen by his landlord, who alerted the police. Wensley and his colleagues visited the lodgings in Gold Street, Stepney, and found knives, a gun, ammunition, false passports and revolutionary publications. Two days later there was another hearing at the Guildhall police court. In addition to Federoff, Peters and Dubof, present in the dock were Milstein and Trassjonsky. Because some of the defendants spoke little English, interpreters were used throughout the proceedings. At the end of the day the case was adjourned until 6 January 1911.
On New Year's Day 1911 the body of Léon Beron, a Russian Jewish immigrant, was found on Clapham Common in South London. He had been badly beaten, and two S-shaped cuts, each two inches long, had been made on his cheeks. The case became connected in the press with the Houndsditch murders and the subsequent events at Sidney Street, although the evidence at the time for the link was scant. The historian F G Clarke, in his history of the events, found information from another Latvian stating that Beron had been killed not because he had already passed on information, but because he was planning to do so; the killing was a pre-emptive act, designed to frighten locals out of informing on the anarchists. The police believed that the Clapham Common murder was not connected to the Houndsditch police murders.
The posters of Gardstein proved effective, and late on New Year's Day a member of the public came forward to provide information about Svaars and Sokoloff. The informant told police that the men were hiding at 100 Sidney Street, along with a lodger, Betty Gershon, who was Sokoloff's mistress. The informant was persuaded to visit the property the following day to confirm the two men were still present. A meeting took place on the afternoon of 2 January to decide the next steps. Wensley, high-ranking members of the Metropolitan force and Sir William Nott-Bower, the Commissioner of the City Police, were present.
## Events of 3 January
Just after midnight on 3 January, 200 police officers from the City of London and Metropolitan forces cordoned off the area around 100 Sidney Street. Armed officers were placed at number 111, directly opposite number 100, and throughout the night the residents of the houses on the block were roused and evacuated. Wensley woke the ground-floor tenants at number 100 and asked them to fetch Gershon, claiming that she was needed by her sick husband. When Gershon appeared she was grabbed by the police and taken to the City of London police headquarters; the ground-floor lodgers were also evacuated. Number 100 was now empty of all residents, apart from Svaars and Sokoloff, neither of whom seemed to be aware of the evacuation.
The police's operating procedure—and the law which governed their actions—meant they were unable to open fire without being fired upon first. This, along with the structure of the building, which had a narrow, winding stairwell up which police would have to pass, meant any approach to the gang members was too perilous to attempt. It was decided to wait until dawn before taking any action. At about 7:30 am a policeman knocked on the door of number 100, which elicited no response; stones were then thrown at the window to wake the men. Svaars and Sokoloff appeared at the window and opened fire at the police. A police sergeant was wounded in the chest; he was evacuated under fire across the rooftops and taken to the London Hospital. Some members of the police returned fire, but their guns were effective only at short range and were no match for the comparatively advanced automatic weapons of Svaars and Sokoloff.
By 9:00 am it was apparent that the two gunmen possessed superior weapons and ample ammunition. The police officers in charge on the scene, Superintendent Mulvaney and Chief Superintendent Stark, contacted Assistant Commissioner Major Frederick Wodehouse at Scotland Yard. He telephoned the Home Office and obtained permission from Churchill to bring in a detachment of Scots Guards, who were stationed at the Tower of London. It was the first time that the police had requested military assistance in London to deal with an armed siege. Twenty-one volunteer marksmen from the Guards arrived at about 10:00 am and took firing positions at each end of the street and in the houses opposite. The shooting continued without either side gaining any advantage.
Churchill arrived on the scene at 11:50 am to observe the incident at first hand; he later reported that he thought the crowd were unwelcoming to him, as he heard people asking "Oo let 'em in?", in reference to the Liberal Party's immigration policy that had allowed the influx from Russia. Churchill's role during the siege is unclear. His biographers, Paul Addison and Roy Jenkins, both consider that he gave no operational commands to the police, but a Metropolitan police history of the event states that the events of Sidney Street were "a very rare case of a Home Secretary taking police operational command decisions". In a subsequent letter to The Times, Churchill clarified his role while he was present:
> I did not interfere in any way with the dispositions made by the police authorities on the spot. I never overruled those authorities nor overrode them. From beginning to end the police had an absolutely free hand. ... I did not send for the Artillery or the Engineers. I was not consulted as to whether they should be sent for.
Shooting between the two sides reached a peak between 12:00 and 12:30 pm, but at 12:50 smoke was seen coming from the building's chimneys and from the second floor windows; it has not been established how the fire was started, whether by accident or design. The fire slowly spread, and by 1:30 it had taken a firm hold and had spread to the other floors. A second detachment of Scots Guards arrived, bringing with them a Maxim machine gun, which was never used. Horse-drawn artillery field guns were also brought from St John's Wood barracks, but again not used. Shortly afterwards Sokoloff put his head out of the window; he was shot by one of the soldiers and he fell back inside. The senior officer of the London Fire Brigade present on the scene sought permission to extinguish the blaze, but was refused. He approached Churchill in order to have the decision overturned, but the Home Secretary approved the police decision. Churchill later wrote:
> I now intervened to settle this dispute, at one moment quite heated. I told the fire-brigade officer on my authority as Home Secretary that the house was to be allowed to burn down and that he was to stand by in readiness to prevent the conflagration from spreading.
By 2:30 pm the shooting from the house had ceased. One of the detectives present walked close to the wall and pushed the door open, before retreating. Other police officers, and some of the soldiers, came out and waited for the men to exit. None did, and as part of the roof collapsed, it was clear to onlookers that the men were both dead; the fire brigade was allowed to start extinguishing the fire. At 2:40 Churchill left the scene, at about the time the Royal Horse Artillery arrived with two 13-pounder field artillery pieces. Sokoloff's body was found soon after the firemen entered. A wall collapsed on a group of five firemen, who were all taken to the London Hospital. One of the men, Superintendent Charles Pearson, had a fractured spine; he died six months later. After shoring up the building, the firemen resumed their search; at around 6:30 pm the second body—that of Svaars—was found.
## Aftermath
The siege was captured by Pathé News cameras—one of their earliest stories and the first siege to be captured on film—and it included footage of Churchill. When the newsreels were screened in cinemas, Churchill was booed with shouts of "shoot him" from audiences. His presence was controversial to many and the Leader of the Opposition, Arthur Balfour, remarked, "He [Churchill] was, I understand, in military phrase, in what is known as the zone of fire—he and a photographer were both risking valuable lives. I understand what the photographer was doing, but what was the right hon. Gentleman doing? That I neither understood at the time, nor do I understand now." Jenkins suggests that he went simply because "he could not resist going to see the fun himself".
An inquest was held in January into the deaths at Houndsditch and Sidney Street. The jury took fifteen minutes to conclude that the two bodies located were those of Svaars and Sokoloff, and that Tucker, Bentley and Choate had been murdered by Gardstein and others during the burglary attempt. Rosen was arrested on 2 February at work in Well Street, Hackney, and Hoffman was taken into custody on 15 February. The committal proceedings ran from December 1910—with Milstein and Trassjonsky appearing—to March 1911, and included Hoffman from 15 February. The proceedings consisted of 24 individual hearings. In February Milstein was discharged on the basis that there was insufficient evidence against her; Hoffman, Trassjonsky and Federoff were released in March on the same basis.
The case against the four remaining arrested gang members was heard at the Old Bailey by Mr Justice Grantham in May. Dubof and Peters were accused of Tucker's murder; Dubof, Peters, Rosen and Vassileva were charged with "feloniously harbouring a felon guilty of murder" and with "conspiring and agreeing together and with others unknown to break and enter the shop of Henry Samuel Harris with intent to steal his goods". The case lasted for eleven days; there were problems with the proceedings because of the language difficulties and the chaotic personal lives of the accused. The case resulted in acquittals for all except Vassileva, who was convicted of conspiracy in the burglary and sentenced to two years' imprisonment; her conviction was overturned on appeal.
After the high levels of criticism aimed at the Aliens Act, Churchill decided to strengthen the legislation, and proposed the Aliens (Prevention of Crime) Bill under the Ten Minute Rule. The MP Josiah C Wedgwood objected, and wrote to Churchill to ask him not to introduce the hard-line measures: "You know as well as I do that human life does not matter a rap in comparison with the death of ideas and the betrayal of English traditions." The bill did not become law.
## Legacy
Bryant and Woodhams were presented with the King's Police Medal for their bravery; Woodhams was still badly injured and had to be carried to the king on a stretcher for the presentation. Both Bryant and Woodhams were also promoted; as they were being invalided out of the force, the promotions ensured they were paid a higher pension. The Lord Mayor of London presented the King's Police Medal to the families of the three murdered policemen. For each child of the murdered policemen, the City of London Corporation gave five shillings a week until they reached the age of fifteen.
The inadequacy of the police's firepower led to criticism in the press, and on 12 January 1911 several alternative weapons were tested. The trials resulted in the Metropolitan Police replacing the Webley revolver with the Webley & Scott .32 calibre MP semi-automatic pistol later that year; the City of London Police adopted the weapon in 1912.
The members of the group dispersed after the events. Peter the Painter was never seen or heard from again. It was assumed he left the country, and there were several possible sightings in the years afterwards; none were confirmed. Jacob Peters returned to Russia, rose to be deputy head of the Cheka, the Soviet secret police, and was executed in Joseph Stalin's 1938 purge. Trassjonsky had a mental breakdown and was confined for a time at Colney Hatch Lunatic Asylum, where she died in 1971. Dubof and Federoff disappeared from the records; Vassileva remained in the East End for the remainder of her life and died at Brick Lane in 1963. Hoffman moved to New York, where he lived for many years with Luba Milstein, who had given birth to Fritz Svaars' child. Smoller left the country in 1911 and travelled to Paris, after which he disappeared; Milstein later emigrated to the United States.
The siege was the inspiration for the final scene in Alfred Hitchcock's original 1934 version of The Man Who Knew Too Much; the story was heavily fictionalised in the 1960 film The Siege of Sidney Street. The novelist Georges Simenon drew on the story for his 1930 Maigret detective novel Pietr-le-Letton (Pietr the Latvian). The siege was also the inspiration for two other novels, The Siege of Sidney Street (1960) by F Oughton and A Death Out of Season (1973) by Emanuel Litvinoff.
In September 2008 Tower Hamlets London Borough Council named two tower blocks in Sidney Street, Peter House and Painter House; Peter the Painter was only involved in a minor capacity in the events and was not present at the siege. The name plaques on the buildings call Peter the Painter an "anti-hero"; the decision angered the Metropolitan Police Federation. A council spokesman said that "There is no evidence that Peter the Painter killed the three policemen, so we knew we were not naming the block after a murderer. ... but he is the name that East Enders associate with the siege and Sidney Street." In December 2010, on the centenary of the events at Houndsditch, a memorial plaque for the three murdered policemen was unveiled near the location. Three weeks later, on the anniversary of the siege, a plaque was unveiled in honour of Pearson, the fireman fatally injured when the building collapsed.
# Worlds (Porter Robinson album)
Worlds is the debut studio album by the American electronic music producer Porter Robinson, released on August 12, 2014, by Astralwerks. Initially known for his heavier bass-centric production, Robinson became increasingly dissatisfied with the electronic dance music (EDM) genre, believing it limited his artistic expression. In 2012, Robinson released his first song with a greater emphasis on melody, "Language", and decided thereafter to prioritize aesthetic and emotional qualities in his work. He was inspired by media that evoked nostalgia for his childhood, and wrote music integrating elements taken from anime, films, and sounds from 1990s video games.
Robinson's primary inspirations for Worlds were Daft Punk's Discovery (2001) and Kanye West's Graduation (2007). Critics described the work as electropop, noting similarities to the styles of M83 and Passion Pit. In late 2013, a bidding war broke out among record labels over which of them would release the record. The album was preceded by four singles: "Sea of Voices", "Sad Machine", "Lionhearted", and "Flicker", and promoted with a tour in North America and Europe.
Worlds was well received by most critics, who praised it as innovative and forecasted a promising career for Robinson, though others felt the record lacked coherence or was unexciting. Retrospectively, the album was noted for its impact on the EDM scene. It charted in the United States, the United Kingdom, Australia, and the Netherlands. Following Worlds's positive reception, Robinson felt pressured to write an appropriate follow-up work. As a result, he experienced a period of writer's block and depression, leading to the seven-year gap until his next studio album, Nurture (2021).
## Background and development
Porter Robinson was initially known for his electro and complextro music, such as the 2010 single "Say My Name" and the 2011 extended play Spitfire; Robinson described his initial sound as "very heavy" and "bass-aggressive". "Say My Name" topped Beatport's electro house chart, while Spitfire caused the website to crash after being promoted by Skrillex and Tiësto.
Across 2012, Robinson performed at major electronic dance music (EDM) festivals, but gradually became dissatisfied with the genre. He mentioned having four or five intense anxiety attacks that year while performing, at one point shouting that "dance music is terrible" during a show. Robinson came to believe that the genre limited his expression; in an interview with NME, he said "[EDM] is entertainment, it's not really art". Robinson felt that by attempting to add DJ-friendly and dance-oriented features to his music, he frequently compromised and diminished the quality of his songs.
Robinson conceived the idea for Worlds in 2012 following the release of "Language", his first song to have a greater emphasis on melody. Although it was a departure from his earlier sound, "Language" was accepted by audiences, surprising Robinson. As a result, he decided to prioritize "beauty" and "emotion" in his music, which became his first principles for Worlds. He also considered it necessary to be "sincere" and "honest". Rather than creating club-oriented music, he chose to produce the music he wanted to hear and believed should exist. In 2013, he released "Easy" with Mat Zo, which Andy Kellman of AllMusic characterized as one of the standout commercial dance singles of the year.
Robinson moved to his parents' home in Chapel Hill, North Carolina, and spent a year revisiting soundtracks of Nintendo 64 video games from the 1990s and 2000s. Robinson produced the album in FL Studio, and wrote around 50 tracks for the album, which were later narrowed to 12 on the final tracklist. In a May 2013 interview, Robinson said he had set July as the deadline to finish the album, and that the title still had not been chosen. When Robinson signed with Astralwerks in November 2013, the album was nearly complete. Robinson collaborated with Spanish illustrator David Aguado to create the album's artwork and design.
## Composition
Robinson was inspired by themes of fantasy, escapism, fiction, and nostalgia; he said that Worlds is not associated with, nor has a place in, reality. Robinson incorporated elements from video games, anime, and movies. His experiences with massively multiplayer online role-playing games and associated nostalgia were an influence. He admired the worlds these games – Star Wars Galaxies (2003) in particular – provided and was affected by how dwindling player bases and bankruptcies eventually brought them offline. These themes influenced Robinson to title the album Worlds.
Robinson used General MIDI sounds that resembled the music of Nintendo 64 and PlayStation video games, including those he played while growing up in the 1990s, such as The Legend of Zelda: Ocarina of Time (1998), which evoked childhood nostalgia for Robinson. By emulating the "slight[ly] sad vibe" of the stories that inspired him, Robinson wanted to give the album a retrospective and emotional atmosphere. Daft Punk's Discovery (2001), an album Robinson considers the best of all time, was his biggest influence for the record, with Kanye West's Graduation (2007) in second. Multiple critics wrote that the album's sound resembled M83 and Passion Pit.
Larry Fitzmaurice of Pitchfork said that Worlds is clearly electropop, and Megan Buerger of Billboard wrote that the album combines ambient, disco and electropop. Vice's Elissa Stolman felt that several tracks on the album were inspired by new wave. While Robinson intended to stray from EDM, the album still kept some of its elements; some critics described the album's sound as "post-EDM". Sharon O'Connell of Uncut said that Robinson mixed EDM tropes and nu-rave with M83-like synth-pop and "bangers" by Daft Punk and Justice. Conversely, Buerger wrote that bass drops and dance-like rhythms were substituted by "delicate chord progressions and deep, forceful synths". Barry Walters of Wondering Sound wrote that, in contrast to the typically higher tempos of EDM, much of Worlds is at a lower, ballad-like speed.
## Songs
### Tracks 1–5
Worlds opens with "Divinity", which contains vocals by Canadian singer Amy Millan, from the bands Stars and Broken Social Scene. Robinson chose the track as the opener because it was the first he wrote with a slower tempo and more emotional chords, a style he considered representative of Worlds. Tatiana Cirisano of Billboard wrote that there is a large contrast between the intro and chorus; while the former contains "underwater-sounding", smooth vocals, the latter contains a "cacophony" of cymbals and glitch-like sounds reminiscent of video games. Barry Walters of Wondering Sound said that it features common characteristics of EDM, such as a powerful beat, dense layers of synthesizers, and an airy female vocal, while Elissa Stolman of Vice described the track as an indie-electronic "festival rave anthem", with synths that resembled M83's "Midnight City" (2011). Alternatively, Rupert Howe of Q found similarities to electronica and M83-like space rock.
The next track, "Sad Machine", was the first song for which Robinson had recorded his own vocals. Describing it as a "duet between a lonely robot girl and the human boy", Robinson employed Avanna, a Vocaloid voice, as the song's lead singer. Larry Fitzmaurice of Pitchfork considered it one among other tracks on Worlds which resembled the "high-wire synth-pop fantasias" of Passion Pit, as it contained a mid-tempo instrumental and "starry-eyed melodic structure". Las Vegas Weekly's Mike Prevatt identified inspirations from M83 and Sigur Rós. Lucas Villa of AXS described the track as "heroic and awe-inspiring" and felt that it evoked the "dreamier" elements of electronic music. The third song, "Years of War", features Breanne Düren of Owl City and Sean Caskey of Last Dinosaurs. Pursuing a "cutesy synth-pop thing", Robinson said it was the hardest he had ever worked on a song. It leans into electropop, synth-pop, and new wave. The song's main instrument is a trance synth, which Stolman felt contrasted with the song's retro elements, such as a boom-clap rhythm and "sepia-toned synths".
The song is followed by "Flicker", which Robinson considered one of his proudest moments on the album. The song begins with a calm disco beat reminiscent of old video games and a faint bassline building in the background. A female voice enters, speaking chopped-up Japanese phrases. Prevatt said that the song uses a classic hip hop breakbeat before the chorus, which he described as an "emotional payoff". Just after the two-minute mark, the song switches to a bass-heavy atmosphere, and Buerger comments that Robinson retains his "invitation to the party" in spite of the song's experimental elements. She described the song as the most dynamic on the album. Garrett Kamps of Spin identified melodic similarities with Boards of Canada. "Fresh Static Snow", the fifth track on the album, also uses Avanna. Robinson said that the song focuses on his feelings of loneliness and the idea of soulmates. Consequence of Sound's Derek Staples found the song's "ethereal electro vibes" to be reminiscent of The Glitch Mob and The M Machine. Stolman described it as a "coiled, metallic guitar squall" which goes to "midrange bass grit" culminating in a heavenly breakdown with melancholy robotic vocals.
### Tracks 6–12
The album's sixth track, "Polygon Dust", is a collaboration with Lemaitre, a band Robinson was fond of. Its main element is a trance synth. Stolman described the track as one of the safest of the album, containing natural vocals as opposed to "Sad Machine" and "Fresh Static Snow", as well as calmer synths. It is followed by "Hear the Bells", which features Imaginary Cities. It is based on one of the band's existing songs, "Bells of Cologne". Robinson felt that the song is where he sings with the greatest stage presence. Kamps thought the vocal choir was "fantastical and defiantly cheery", while Stolman wrote that the song contains Givers-like layered indie vocals and emotional lyrics. Fitzmaurice said that "Hear the Bells" has a good amount of "rocket fuel" due to its dynamic electronics and anthemic synthesizers.
"Natural Light", Worlds's eighth track, is an interlude. Robinson enjoyed the track due to its intelligent dance music passages inspired by artists such as Aphex Twin and Venetian Snares. Stolman commented that, despite its driving bass, sharp drum hits, vocal fragments, and sparkling keys, the track could be called minimal in the context of the album. The ninth track is "Lionhearted", which features Urban Cone. It was one of the first tracks Robinson wrote for the album, describing it as "anthemic". Critics wrote that this was the album's first display of a faster tempo. Kamps described the sound as "exuberant pop" and Prevatt felt there were similarities to the styles of Holy Ghost\! and Passion Pit. The next song, "Sea of Voices", went through multiple iterations before its release. It is a five-minute orchestral track that contain no drums in its first minutes, being only composed of synths reminiscent of atmospheric big room. Noting the late introduction of beats, Buerger said the track has "the emotions of a tear-jerking blockbuster". Kamps found the build-up similar to ones by Sigur Rós.
"Fellow Feeling" is Worlds's penultimate track. In a criticism of EDM composition, Robinson starkly juxtaposed what he felt was "beautiful and serene" with aggressive and violent elements. Sharon O'Connell of Uncut felt that the opening section was reminiscent of chamber music, that was described by Villa as "cinematic" and Walters as "symphonic". Further into the song, a voice says, "Now, please, hear what I hear", and a strong bass enters. Walters claimed the track is interrupted by aggressive dubstep elements which O'Connell described as electro funk that had been chopped and screwed. Villa named it the album's most climactic moment. The final track, "Goodbye to a World", is the third to use Avanna. Robinson wanted the feeling of a "beautiful apocalypse" for the song. It has lullaby-like moments contrasted with sections that Staples found similar to breakcore and Stolman characterized as "fist-pumping brutality".
## Release and promotion
Following a bidding war over the record, it was announced on November 14, 2013, that Robinson had signed a deal with Astralwerks; Worlds would be released through their Capitol Records imprint in the US and their Virgin EMI Records imprint internationally. Robinson chose Astralwerks because it was not an EDM label. On February 10, 2014, Robinson revealed the album's title in a video that featured a robotic voice repeating the word "worlds" for ten hours. When the video was released, Robinson stated that he disliked marketing campaigns that were "wishy-washy", and that he attempted to create all of his work with a clear intent.
Astralwerks wanted to release "Shepherdess" as the album's first single, which Robinson described as the "most EDM thing" he had made since 2011. However, he decided to lead with the song he felt was "the least accessible to fans of dance music", "Sea of Voices". The single was released on March 2, shortly before the 86th Academy Awards. "Sea of Voices" became a trending topic on Twitter during the event, and received positive reactions from audiences, contrary to Robinson's expectations. Though originally intending "Flicker" to be the album's second single, Robinson changed it to "Sad Machine" three days beforehand, which he claimed caused "mayhem" at the label. "Sad Machine" was the last song written for the album, and Robinson felt certain it should be the next release upon its completion. "Sad Machine" was premiered by The Fader on May 12, 2014, and made available elsewhere a day later. A lyric video was released on May 21.
On June 3, Stereogum premiered Worlds's third single, "Lionhearted", which features Swedish band Urban Cone. It also debuted at BBC Radio 1. The single was officially released on June 17, accompanied by a music video that Mixmag's Carré Orenstein described as showing Robinson and a group of women "wreak[ing] havoc around the city streets, resulting in an eruption of [color]". "Flicker" was premiered on July 28 by Vogue and officially released the next day as Worlds's fourth and final single. An official music video was released on August 14. The video is set on a train and shows glitch effects playing across a passing Japanese landscape seen through the window.
In July, Robinson announced a limited edition box set of Worlds containing bonus remixes and tracks. On August 4, the album was premiered by NPR as part of their "First Listen" series. It was fully released on August 12, 2014. From August 28 to October 18, Robinson performed on a North American tour for Worlds, which later extended to Europe. He once again took inspiration from fictional universes for its visuals, which featured video game-like, pixelated worlds on large LED displays. The visuals were managed by Imaginary Light Network.
On October 2, 2015, Robinson released Worlds Remixed, a remix album involving artists and producers such as Mat Zo, Odesza, Sleepy Tom, Galimatias, and San Holo. As with Worlds, David Aguado illustrated several pieces for the remix album.
A tenth anniversary edition of the album was released on vinyl in 2024. The pressing featured a previously unreleased song, "Hollowheart", which Robinson had intended to appear on Worlds but did not submit in time to be included. A live album featuring recordings of Robinson's performance at the 2019 Second Sky festival was released concurrently.
## Critical reception
According to the review aggregator Metacritic, Worlds received "generally favorable reviews" based on a weighted average score of 63 out of 100 from eight critics' reviews, while on AnyDecentMusic? the album received a rating of 6.4 out of 10 based on seven reviews.
Some reviewers praised Worlds as innovative. Lucas Villa of AXS felt that Robinson exceeded expectations by crafting a complete experience, venturing boldly into uncharted territory for DJs, while Garrett Kamps of Spin said that "it's pretty hard to deny this kid has done something amazing, no matter what you call it". Writing for Billboard, Megan Buerger thought Worlds was "the next frontier" for Robinson, praising its focus on the individual instead of the collective. She described the album as "ideal headphone music", while Rolling Stone's Elissa Stolman wrote that it "manages to retain the thrilling rush of emotions that the best raves inspire", despite not fully sounding like EDM. Las Vegas Weekly writer Mike Prevatt wrote that the album was "a necessary crosscurrent to the swells of EDM" even if it did not catalyze a new musical trend.
Although some reviewers were critical of the album, they acknowledged that Robinson had a promising career ahead of him. Andy Kellman of AllMusic felt that it was clear Robinson was yet to become accustomed to creating music outside the context of raves due to the album's "several clumsier moments". Kellman, considering Robinson's ambitions and accomplishments with the work, forecasted a "fascinating" career for him. While Pitchfork's Larry Fitzmaurice did not find Worlds's style to be inventive, he admired the transition Robinson was making and wrote that his career seemed "extremely promising". Sharon O'Connell of Uncut felt that many characterized Robinson as a stylistic pioneer, a view she disagreed with, but also wrote that "youth is on Robinson's side." Q's Rupert Howe said that Robinson had fulfilled his reputation as an accomplished producer, but that, while having different aspirations than his peers, he "hasn't completely freed himself of their influence".
Some reviewers thought that the record lacked coherence; Consequence of Sound's Derek Staples felt that while reinventing EDM was a noble idea, Robinson's execution was weak, and Worlds more resembled a "remix compilation" than a proper album. Others found the album unexciting. Samuel Tolzmann of Spectrum Culture wrote that Worlds ultimately embraces generic conventions and that the expectation for the album to redefine the genre highlighted more about the stagnation of this style of EDM than Robinson's music's complexity or creativity. Barry Walters of Wondering Sound said that little of Worlds was memorable, suggesting that Robinson's personal universe felt notably derivative.
Worlds was considered the second best album of the year by Thump and appeared in a list of best albums of the year by Complex.
## Commercial performance
In the United States, Worlds debuted at number one on Billboard's Top Dance/Electronic Albums chart, holding that position for a week. The album spent a total of 23 weeks on the chart. On the Billboard 200, the magazine's main album chart, it peaked at number 18 and spent a total of seven weeks on the list. In the United Kingdom, the album debuted and peaked at number 13 on the Official Charts Company's UK Dance Albums chart and at number 86 on the company's main chart, the UK Albums Chart. The album also charted at number 13 in Australia and number 96 in the Netherlands.
## Legacy
Worlds had a notable impact on the EDM scene. John Ochoa of DJ Mag described it as a "breakthrough" that precipitated a wider shift in the electronic music industry, allowing for "softer" and "dreamier" music in the genre. According to Ochoa, "Worlds was Robinson's attempt to change the course of an entire genre and scene. He succeeded." Similarly, Kat Bein of Billboard said that the album influenced "a generation of producers to make pretty, emotional dance music", as well as attempt live performances. According to Paper's Matt Moen, a wave of artists would cite Worlds as a major influence, and Krystal Rodriguez and Bein of Billboard said that Worlds and its tour became a model for a generation of young producers to emulate. In November 2019, Billboard staff ranked Worlds as the fifteenth greatest dance album of the 2010s and as the ninety-seventh greatest album of the decade more broadly.
As a result of the album's positive reception, Robinson set high expectations for himself, stating in 2018 that he felt significant pressure to create something of a similar standard as a follow-up. This caused him to go through an extended period of writer's block and depression, during which he released very little music. Robinson's second studio album, Nurture, was released on April 23, 2021, seven years after Worlds. His experiences with his mental health during this time were reflected in Nurture's lyrics.
## Track listing
## Personnel
Adapted from the CD liner notes.
- Porter Robinson – production, engineering, mixing, additional drum programming
- Mike Marsh – mastering
- Simon Davey – mastering
- Karen Thompson – mastering
- Nicole Frantz – art direction
- David Aguado – artwork, design
- Randall Leddy – layout
## Charts
# The Lucy poems
The Lucy poems are a series of five poems composed by the English Romantic poet William Wordsworth (1770–1850) between 1798 and 1801. All but one were first published during 1800 in the second edition of Lyrical Ballads, a collaboration between Wordsworth and Samuel Taylor Coleridge that was both Wordsworth's first major publication and a milestone in the early English Romantic movement. In the series, Wordsworth sought to write unaffected English verse infused with abstract ideals of beauty, nature, love, longing, and death.
The "Lucy poems" consist of "Strange fits of passion have I known", "She dwelt among the untrodden ways", "I travelled among unknown men", "Three years she grew in sun and shower", and "A slumber did my spirit seal". Although they are presented as a series in modern anthologies, Wordsworth did not conceive of them as a group, nor did he seek to publish the poems in sequence. He described the works as "experimental" in the prefaces to both the 1798 and 1800 editions of Lyrical Ballads, and revised the poems significantly—shifting their thematic emphasis—between 1798 and 1799. Only after his death in 1850 did publishers and critics begin to treat the poems as a fixed group.
The poems were written during a short period while the poet lived in Germany. Although they individually deal with a variety of themes, the idea of Lucy's death weighs heavily on the poet throughout the series, imbuing the poems with a melancholic, elegiac tone. Whether Lucy was based on a real woman or was a figment of the poet's imagination has long been a matter of debate among scholars. Generally reticent about the poems, Wordsworth never revealed the details of her origin or identity. Some scholars speculate that Lucy is based on his sister Dorothy, while others see her as a fictitious or hybrid character. Most critics agree that she is essentially a literary device upon whom he could project, meditate and reflect.
## Background
### Lyrical Ballads
In 1798, Wordsworth and Samuel Taylor Coleridge jointly published Lyrical Ballads, with a Few Other Poems, a collection of verses each had written separately. The book became hugely popular and was published widely; it is generally considered a herald of the Romantic movement in English literature. In it, Wordsworth aimed to use everyday language in his compositions as set out in the preface to the 1802 edition: "The principal object, then, proposed in these Poems was to choose incidents and situations from common life, and to relate or describe them, throughout, as far as was possible in a selection of language really used by men, and at the same time, to throw over them a certain colouring of imagination, whereby ordinary things should be presented to the mind in an unusual aspect."
The two poets had met three years earlier in either late August or September 1795 in Bristol. The meeting laid the foundation for an intense and profoundly creative friendship, based in part on their shared disdain for the artificial diction of the poetry of the era. Beginning in 1797, the two lived within walking distance of each other in Somerset, which solidified their friendship. Wordsworth believed that his life before meeting Coleridge was sedentary and dull and that his poetry amounted to little. Coleridge influenced Wordsworth, and his praise and encouragement inspired Wordsworth to write prolifically. Dorothy, Wordsworth's sister, related the effect Coleridge had on her brother in a March 1798 letter: "His faculties seem to expand every day, he composes with much more facility than he did, as to the mechanism [emphasis in original] of poetry, and his ideas flow faster than he can express them." With his new inspiration, Wordsworth came to believe he could write poetry rivalling that of John Milton. He and Coleridge planned to collaborate, but never moved beyond suggestions and notes for each other.
The expiration of Wordsworth's Alfoxton House lease soon provided an opportunity for the two friends to live together. They conceived a plan to settle in Germany with Dorothy and Coleridge's wife, Sara, "to pass the two ensuing years in order to acquire the German language and to furnish ourselves with a tolerable stock of information in natural science". In September 1798, Wordsworth, Coleridge, and Dorothy travelled to Germany to explore proximate living arrangements, but this proved difficult. Although they lived together in Hamburg for a short time, the city was too expensive for their budgets. Coleridge soon found accommodations in the town of Ratzeburg in Schleswig-Holstein, which was less expensive but still socially vibrant. The impoverished Wordsworth, however, could neither afford to follow Coleridge nor provide for himself and his sister in Hamburg; the siblings instead moved to moderately priced accommodations in Goslar in Lower Saxony, Germany.
### Separation from Coleridge
Between October 1798 and February 1799, Wordsworth worked on the first draft of the "Lucy poems" together with a number of other verses, including the "Matthew poems", "Lucy Gray" and The Prelude. Coleridge had yet to join the siblings in Germany, and Wordsworth's separation from his friend depressed him. In the three months following their parting, Wordsworth completed the first three of the "Lucy poems": "Strange fits", "She dwelt", and "A slumber". They first appeared in a letter to Coleridge dated December 1798, in which Wordsworth wrote that "She dwelt" and "Strange fits" were "little Rhyme poems which I hope will amuse you". Wordsworth characterised the two poems thus to mitigate any disappointment Coleridge might suffer in receiving these two poems instead of the promised three-part philosophical epic The Recluse.
In the same letter, Wordsworth complained that:
> As I have had no books I have been obliged to write in self-defense. I should have written five times as much as I have done but that I am prevented by an uneasiness at my stomach and side, with a dull pain about my heart. I have used the word pain, but uneasiness and heat are words which more accurately express my feelings. At all events it renders writing unpleasant. Reading is now become a kind of luxury to me. When I do not read I am absolutely consumed by thinking and feeling and bodily exertions of voice or of limbs, the consequence of those feelings.
Wordsworth partially blamed Dorothy for the abrupt loss of Coleridge's company. He felt that their finances—insufficient for supporting them both in Ratzeburg—would have easily supported him alone, allowing him to follow Coleridge. Wordsworth's anguish was compounded by the contrast between his life and that of his friend. Coleridge's financial means allowed him to entertain lavishly and to seek the company of nobles and intellectuals; Wordsworth's limited wealth constrained him to a quiet and modest life. Wordsworth's envy seeped into his letters when he described Coleridge and his new friends as "more favored sojourners" who may "be chattering and chatter'd to, through the whole day".
Although Wordsworth sought emotional support from his sister, their relationship remained strained throughout their time in Germany. Separated from his friend and forced to live in the sole company of his sister, Wordsworth used the "Lucy poems" as an emotional outlet.
### Identity of Lucy
Wordsworth did not reveal the inspiration for the character of Lucy, and over the years the topic has generated intense speculation among literary historians. Little biographical information can be drawn from the poems—it is difficult even to determine Lucy's age. In the mid-19th century, Thomas De Quincey (1785–1859), author and one-time friend of Wordsworth, wrote that the poet "always preserved a mysterious silence on the subject of that 'Lucy', repeatedly alluded to or apostrophised in his poems, and I have heard, from gossiping people about Hawkshead, some snatches of tragic story, which, after all, might be an idle semi-fable, improved out of slight materials."
Critic Herbert Hartman believes Lucy's name was taken from "a neo-Arcadian commonplace", and argues she was not intended to represent any single person. In the view of one Wordsworth biographer, Mary Moorman (1906–1994), "The identity of 'Lucy' has been the problem of critics for many years. But Wordsworth is a poet before he is a biographer, and neither 'Lucy' nor her home nor his relations with her are necessarily in the strict sense historical. Nevertheless, as the Lyrical Ballads were all of them 'founded on fact' in some way, and as Wordsworth's mind was essentially factual, it would be rash to say that Lucy is entirely fictitious."
Moorman suggests that Lucy may represent Wordsworth's romantic interest Mary Hutchinson, but wonders why she would be represented as one who died. It is possible that Wordsworth was thinking of Margaret Hutchinson, Mary's sister who had died. There is no evidence, however, that the poet loved any of the Hutchinsons other than Mary. It is more likely that Margaret's death influenced the poems but was not the foundation for Lucy.
In 1980, Hunter Davies contended that the series was written for the poet's sister Dorothy, but found the Lucy–Dorothy allusion "bizarre". Earlier, literary critic Richard Matlak tried to explain the Lucy–Dorothy connection, and wrote that Dorothy represented a financial burden to Wordsworth, which had effectively forced his separation from Coleridge. Wordsworth, depressed over the separation from his friend, in this interpretation, expresses both his love for his sister and fantasies about her loss through the poems. Throughout the poems, the narrator's mixture of mourning and antipathy is accompanied by denial and guilt; his denial of the Lucy–Dorothy relationship and the lack of narratorial responsibility for the death of Lucy allow him to escape from questioning his desires for the death of his sister. After Wordsworth began the "Lucy poems", Coleridge wrote, "Some months ago Wordsworth transmitted to me a most sublime Epitaph / whether it had any reality, I cannot say. —Most probably, in some gloomier moment he had fancied the moment in which his Sister might die." It is, however, possible that Wordsworth simply feared her death and did not wish it, even subconsciously.
Reflecting on the significance and relevance of Lucy's identity, the 19th-century poet, essayist and literary critic Frederic Myers (1843–1901) observed that:
> here it was that the memory of some emotion prompted the lines on "Lucy". Of the history of that emotion, he has told us nothing; I forbear, therefore, to inquire concerning it, or even to speculate. That it was to the poet's honour, I do not doubt; but who ever learned such secrets rightly? or who should wish to learn? It is best to leave the sanctuary of all hearts inviolate, and to respect the reserve not only of the living but of the dead. Of these poems, almost alone, Wordsworth in his autobiographical notes has said nothing whatever.
Literary scholar Karl Kroeber (1926–2009) argues that Lucy "possesses a double existence; her actual, historical existence and her idealised existence in the poet's mind. In the poem, Lucy is both actual and idealised, but her actuality is relevant only insofar as it makes manifest the significance implicit in the actual girl." Hartman holds the same view; to him Lucy is seen "entirely from within the poet, so that this modality may be the poet's own", but then he argues, "she belongs to the category of spirits who must still become human ... the poet describes her as dying at a point at which she would have been humanized." The literary historian Kenneth Johnston concludes that Lucy was created as the personification of Wordsworth's muse, and the group as a whole "is a series of invocations to a Muse feared to be dead ... As epitaphs, they are not sad, a very inadequate word to describe them, but breathlessly, almost wordlessly aware of what such a loss would mean to the speaker: 'oh, the difference to me!'"
Scholar John Mahoney observes that whether Lucy is intended to represent Dorothy, Mary or another is much less important to understanding the poems than the fact that she represented "a hidden being who seems to lack flaws and is alone in the world". Furthermore, she is represented as being insignificant in the public sphere but of the utmost importance in the private sphere; in "She dwelt" this manifests through the comparison of Lucy to both a hidden flower and a shining star. Neither Lucy nor Wordsworth's other female characters "exist as independent self-conscious human beings with minds as capable as the poet's" and are "rarely allowed to speak for themselves". G. Kim Blank takes a psycho-autobiographical approach: he situates the core Lucy poems in the context of what surfaces during Wordsworth's depressive and stressful German experience in the winter of 1798–1799; he concludes that "Lucy dies at the threshold of being fully expressed as a feeling of loss," and that, for Wordsworth, she "represents a cluster of unresolved emotions"—Wordsworth's own emotions, that is.
## The poems
The "Lucy poems" are written from the point of view of a lover who has long viewed the object of his affection from afar, and who is now affected by her death. Yet Wordsworth structured the poems so that they are not about any one person who has died; instead they were written about a figure representing the poet's lost inspiration. Lucy is Wordsworth's inspiration, and the poems as a whole are, according to Wordsworth biographer Kenneth Johnston, "invocations to a Muse feared to be dead". Lucy is represented in all five poems as sexless; it is unlikely that the poet ever realistically saw her as a possible lover. Instead, she is presented as an ideal and represents Wordsworth's frustration at his separation from Coleridge; the asexual imagery reflects the futility of his longing.
Wordsworth's voice slowly disappears from the poems as they progress, and his voice is entirely absent from the fifth poem. His love operates on the subconscious level, and he relates to Lucy more as a spirit of nature than as a human being. The poet's grief is private, and he is unable to fully explain its source. When Lucy's lover is present, he is completely immersed in human interactions and the human aspects of nature, and the death of his beloved is a total loss for the lover. The 20th-century critic Spencer Hall argues that the poet represents a "fragile kind of humanism".
### "Strange fits of passion have I known"
"Strange fits" is probably the earliest of the poems and revolves around a fantasy of Lucy's death. It describes the narrator's journey to Lucy's cottage and his thoughts along the way. Throughout, the motion of the Moon is set in opposition to the motion of the speaker. The poem contains seven stanzas, a relatively elaborate structure which underscores his ambivalent attitude towards Lucy's imagined death. The constant shifts in perspective and mood reflect his conflicting emotions. The first stanza, with its use of dramatic phrases such as "fits of passion" and "dare to tell", contrasts with the subdued tone of the rest of the poem. As a lyrical ballad, "Strange fits" differs from the traditional ballad form, which emphasises abnormal action, and instead focuses on mood.
The presence of death is felt throughout the poem, although it is mentioned explicitly only in the final line. The Moon, a symbol of the beloved, sinks steadily as the poem progresses, until its abrupt drop in the penultimate stanza. That the speaker links Lucy with the Moon is clear, though his reasons are unclear. The Moon nevertheless plays a significant role in the action of the poem: as the lover imagines the Moon slowly sinking behind Lucy's cottage, he is entranced by its motion. By the fifth stanza, the speaker has been lulled into a somnambulistic trance—he sleeps while still keeping his eyes on the Moon (lines 17–20).
The narrator's conscious presence is wholly absent from the next stanza, which moves forward in what literary theorist Geoffrey Hartman describes as a "motion approaching yet never quite attaining its end". When the Moon abruptly drops behind the cottage, the narrator snaps out of his dream, and his thoughts turn towards death. Lucy, the beloved, is united with the landscape in death, while the image of the retreating, entrancing Moon is used to portray the idea of looking beyond one's lover. The darker possibility also remains that the dream state represents the fulfilment of the lover's fantasy through the death of the beloved. In falling asleep while approaching his beloved's home, the lover betrays his own reluctance to be with Lucy.
Wordsworth made numerous revisions to each of the "Lucy poems". The earliest version of "Strange fits" appears in a December 1798 letter from Dorothy to Coleridge. This draft contains many differences in phrasing and does not include a stanza that appeared in the final published version. The new lines direct the narrative towards "the Lover's ear alone", implying that only other lovers can understand the relationship between the Moon, the beloved and the beloved's death. Wordsworth also removed from the final stanza the lines:
> > I told her this; her laughter light
> > Is ringing in my ears;
> > And when I think upon that night
> > My eyes are dim with tears.
This final stanza lost its significance with the completion of the later poems in the series, and the revision allowed for a sense of anticipation at the poem's close and helped draw the audience into the story of the remaining "Lucy poems". Of the other changes, only the description of the horse's movement is important: "My horse trudg'd on" becomes "With quickening pace my horse drew nigh", which heightens the narrator's vulnerability to fantasies and dreams in the revised version.
### "She dwelt among the untrodden ways"
"She dwelt among the untrodden ways" presents Lucy as having lived in solitude near the source of the River Dove. According to literary critic Geoffrey Durrant, the poem charts her "growth, perfection, and death". To convey the dignified, unaffected naturalness of his subject, Wordsworth uses simple language, mostly words of one syllable. In the opening quatrain, he describes the isolated and untouched area where Lucy lived, as well as her innocence and beauty, which he compares to that of a hidden flower in the second. The poem begins in a descriptive rather than narrative manner, and it is not until the line "When Lucy ceased to be" that the reader is made aware that the subject of the verse has died. Literary scholar Mark Jones describes this effect as finding the poem is "over before it has begun", while according to writer Margaret Oliphant (1828–1897), Lucy "is dead before we so much as heard of her".
Lucy's "untrodden ways" are symbolic of both her physical isolation and the unknown details of her thoughts and life as well as her sense of mystery. The third quatrain is written with an economy intended to capture the simplicity the narrator sees in Lucy. Her femininity is described in girlish terms. This has drawn criticism from those who see the female icon, in the words of literary scholar John Woolford, "represented in Lucy by condemning her to death while denying her the actual or symbolic fulfillment of maternity". To evoke the "loveliness of body and spirit", a pair of complementary but paradoxical images are employed in the second stanza: the solitary, hidden violet juxtaposed to the publicly visible Venus, emblem of love and first star of evening. Wondering if Lucy more resembles the violet or the star, the critic Cleanth Brooks (1906–1994) concludes that while Wordsworth likely views her as "the single star, completely dominating [his] world, not arrogantly like the sun, but sweetly and modestly", the metaphor is a conventional compliment with only vague relevance. For Wordsworth, Lucy's appeal is closer to the violet and lies in her seclusion and her perceived affinity with nature.
Wordsworth acquired a copy of the antiquarian and churchman Thomas Percy's (1729–1811) collection of British ballads Reliques of Ancient English Poetry (1765) in Hamburg a few months before he began to compose the series. The influence of the traditional English folk ballad is evident in the metre, rhythm and structure of "She dwelt". It follows the variant ballad stanza a4–b3–a4–b3, and, in keeping with ballad tradition, tells a dramatic story. As Durrant observed, "To confuse the mode of the 'Lucy' poems with that of the love lyric is to overlook their structure, in which, as in the traditional ballad, a story is told as boldly and briefly as possible." Kenneth and Warren Ober compare the opening lines of "She dwelt" to the traditional ballad "Katharine Jaffray" and note similarities in rhythm and structure, as well as in theme and imagery.
The narrator of the poem is less concerned with the experience of observing Lucy than with his reflections and meditations on his observations. Throughout the poem sadness and ecstasy are intertwined, a fact emphasised by the exclamation marks in the second and third verses. The critic Carl Woodring writes that "She dwelt" and the Lucy series can be read as elegiac, as "sober meditation[s] on death". He found that they have "the economy and the general air of epitaphs in the Greek Anthology... [I]f all elegies are mitigations of death, the Lucy poems are also meditations on simple beauty, by distance made more sweet and by death preserved in distance".
An early draft of "She dwelt" contained two stanzas that were omitted from the first edition. The revisions exclude many of the images but emphasise the grief that the narrator experienced. The original version began with floral imagery, which was later cut:
> > My hope was one, from cities far,
> > Nursed on a lonesome heath;
> > Her lips were red as roses are,
> > Her hair a woodbine wreath.
A fourth stanza, also later removed, mentions Lucy's death: "But slow distemper checked her bloom / And on the Heath she died."
### "I travelled among unknown men"
The last of the "Lucy poems" to be composed, "I travelled among unknown men", was the only one not included in the second edition of Lyrical Ballads. Although Wordsworth claimed that the poem was composed while he was still in Germany, it was in fact written in April 1801. Evidence for this later date comes from a letter Wordsworth wrote to Mary Hutchinson referring to "I travelled" as a newly created poem. In 1802, he instructed his printer to place "I travelled" immediately after "A slumber did my spirit seal" in Lyrical Ballads, but the poem was omitted. It was later published in Poems, in Two Volumes in 1807.
The poem has frequently been read as a declaration of Wordsworth's love for his native England and his determination not to live abroad again:
> > 'Tis past, that melancholy dream!
> > Nor will I quit thy shore
> > A second time; for still I seem
> > To love thee more and more. (lines 5–8)
The first two stanzas seem to speak of the poet's personal experience, and a patriotic reading would reflect his appreciation and pride for the English landscape. The possibility remains, however, that Wordsworth is referring to England as a physical rather than a political entity, an interpretation that gains strength from the poem's connections to the other "Lucy poems".
Lucy only appears in the second half of the poem, where she is linked with the English landscape. As such, it seems as if nature joins with the narrator in mourning for her, and the reader is drawn into this mutual sorrow.
Although "I travelled" was written two years after the other poems in the series, it echoes the earlier verses in both tone and language. Wordsworth gives no hint as to the identity of Lucy, and although he stated in the preface to Lyrical Ballads that all the poems were "founded on fact", knowing the basis for the character of Lucy is not necessary to appreciate the poem and understand its sentiment. Similarly, no insight can be gained from determining the exact geographical location of the "springs of Dove"; in his youth, Wordsworth had visited springs of that name in Derbyshire, Patterdale and Yorkshire.
### "Three years she grew in sun and shower"
"Three years she grew in sun and shower" was composed between 6 October and 28 December 1798. The poem depicts the relationship between Lucy and nature through a complex opposition of images. Antithetical couplings of words—"sun and shower", "law and impulse", "earth and heaven", "kindle and restrain"—are used to evoke the opposing forces inherent in nature. A conflict between nature and humanity is described, as each attempts to possess Lucy. The poem contains both epithalamic and elegiac characteristics; Lucy is shown as wedded to nature, while her human lover is left alone to mourn in the knowledge that death has separated her from humanity.
### "A slumber did my spirit seal"
Written in spare language, "A slumber did my spirit seal" consists of two stanzas, each four lines long. The first stanza is built upon even, soporific movement in which figurative language conveys the nebulous image of a girl who "seemed a thing that could not feel / The touch of earthly years". The second maintains the quiet and even tone of the first but serves to undermine its sense of the eternal by revealing that Lucy has died and that the calmness of the first stanza represents death. The narrator's response to her death lacks bitterness or emptiness; instead he takes consolation from the fact that she is now beyond life's trials, and "at last ... in inanimate community with the earth's natural fixtures". The lifeless rocks and stones depicted in the concluding line convey the finality of Lucy's death.
## Grouping as a series
Although the "Lucy poems" share stylistic and thematic similarities, it was not Wordsworth but literary critics who first presented the five poems as a unified set called the "Lucy poems". The grouping was originally suggested by critic Thomas Powell in 1831 and later advocated by Margaret Oliphant in an 1871 essay. The 1861 Golden Treasury, compiled by the English historian Francis Palgrave (1788–1861), groups only four of the verses, omitting "Strange fits". The poems next appeared as a complete set of five in the collection of Wordsworth's poems by English poet and critic Matthew Arnold (1822–1888).
The grouping and sequence of the "Lucy poems" has been a matter of debate in literary circles. Various critics have sought to add poems to the group; among those proposed over the years are "Alcaeus to Sappho", "Among all lovely things", "Lucy Gray", "Surprised by joy", "Tis said, that some have died for love", "Louisa", "Nutting", "Presentiments", "She was a Phantom of delight", "The Danish Boy", "The Two April Mornings", "To a Young Lady", and "Written in Very Early Youth". None of the proposals have met with widespread acceptance. The five poems included in the Lucy "canon" focus on similar themes of nature, beauty, separation and loss, and most follow the same basic ballad form. Literary scholar Mark Jones offers a general characterisation of a Lucy poem as "an untitled lyrical ballad that either mentions Lucy or is always placed with another poem that does, that either explicitly mentions her death or is susceptible of such a reading, and that is spoken by Lucy's lover."
With the exception of "A slumber", all of the poems mention Lucy by name. The decision to include this work is based in part on Wordsworth's decision to place it in close proximity to "Strange fits" and directly after "She dwelt" within Lyrical Ballads. In addition, "I travelled" was sent to the poet's childhood friend and later wife, Mary Hutchinson, with a note that said it should be "read after 'She dwelt'". Coleridge biographer J. Dykes Campbell records that Wordsworth instructed "I travelled" to be included directly following "A slumber", an arrangement that indicates a connection between the poems. Nevertheless, the question of inclusion is further complicated by Wordsworth's eventual retraction of these instructions and his omission of "I travelled" from the two subsequent editions of Lyrical Ballads.
The 1815 edition of Lyrical Ballads organised the poems into the Poems Founded on the Affections ("Strange fits", "She dwelt", and "I travelled") and Poems of the Imagination ("Three years she grew" and "A slumber"). This arrangement allowed the two dream-based poems ("Strange fits" and "A slumber") to frame the series and to represent the speaker's different sets of experiences over the course of the longer narrative. In terms of chronology, "I travelled" was written last, and thus also served as a symbolic conclusion—both emotionally and thematically—to the "Lucy poems".
## Interpretation
### Nature
According to critic Norman Lacey, Wordsworth built his reputation as a "poet of nature". Early works, such as "Tintern Abbey", can be viewed as odes to his experience of nature. His poems can also be seen as lyrical meditations on the fundamental character of the natural world. Wordsworth said that, as a youth, nature stirred "an appetite, a feeling and a love", but by the time he wrote Lyrical Ballads, it evoked "the still sad music of humanity".
The five "Lucy poems" are often interpreted as representing Wordsworth's opposing views of nature as well as meditations on the cycle of life. They describe a variety of relationships between humanity and nature. For example, Lucy can be seen as a connection between humanity and nature, as a "boundary being, nature sprite and human, yet not quite either. She reminds us of the traditional mythical person who lives, ontologically, an intermediate life, or mediates various realms of existence." Although the poems evoke a sense of loss, they also hint at the completeness of Lucy's life—she was raised by nature and survives in the memories of others. She became, in the opinion of the American poet and writer David Ferry (b. 1924), "not so much a human being as a sort of compendium of nature", while "her death was right, after all, for by dying she was one with the natural processes that made her die, and fantastically ennobled thereby".
Cleanth Brooks writes that "Strange fits" presents "Kind Nature's gentlest boon", "Three years" its duality, and "A slumber" the clutter of natural objects. Other scholars see "She dwelt", along with "I travelled", as representing nature's "rustication and disappearance". Mahoney views "Three years" as describing a masculine, benevolent nature similar to a creator deity. Although nature shapes Lucy over time and she is seen as part of nature herself, the poem shifts abruptly when she dies. Lucy appears to be eternal, like nature itself. Regardless, she becomes part of the surrounding landscape in life, and her death only verifies this connection.
The series presents nature as a force by turns benevolent and malign. It is shown at times to be oblivious to and uninterested in the safety of humanity. Hall argues, "In all of these poems, nature would seem to betray the heart that loves her". The imagery used to evoke these notions serves to separate Lucy from everyday reality. The literary theorist Frances Ferguson (b. 1947) notes that the "flower similes and metaphors become impediments rather than aids to any imaginative visualization of a woman; the flowers do not simply locate themselves in Lucy's cheeks, they expand to absorb the whole of her ... The act of describing seems to have lost touch with its goal—description of Lucy."
### Death
The poems Wordsworth wrote while in Goslar focus on the dead and dying. The "Lucy poems" follow this trend, and often fail to delineate the difference between life and death. Each creates an ambiguity between the sublime and nothingness, as they attempt to reconcile the question of how to convey the death of a girl intimately connected to nature. They describe a rite of passage from innocent childhood to corrupted maturity and, according to Hartman, "center on a death or a radical change of consciousness which is expressed in semi-mythical form; and they are, in fact, Wordsworth's nearest approach to a personal myth." The narrator is affected greatly by Lucy's death and cries out in "She dwelt" of "the difference to me!". Yet in "A slumber" he is spared from trauma by sleep.
The reader's experience of Lucy is filtered through the narrator's perception. Her death suggests that nature can bring pain to all, even to those who loved her. According to the British classical and literary scholar H. W. Garrod (1878–1960), "The truth is, as I believe, that between Lucy's perfection in Nature and her death there is, for Wordsworth, really no tragic antithesis at all." Hartman expands on this view to extend the view of death and nature to art in general: "Lucy, living, is clearly a guardian spirit, not of one place but of all English places ... while Lucy, dead, has all nature for her monument. The series is a deeply humanized version of the death of Pan, a lament on the decay of English natural feeling. Wordsworth fears that the very spirit presiding over his poetry is ephemeral, and I think he refuses to distinguish between its death in him and its historical decline."
## Critical assessment
The first mention of the poems came from Dorothy, in a letter sent to Coleridge in December 1798. Of "Strange fits", she wrote, "[this] next poem is a favourite of mine—i.e. of me Dorothy—". The first recorded mention of any of the "Lucy poems" (outside of notes by either William or Dorothy) occurred after the April 1799 death of Coleridge's son Berkeley. Coleridge was then living in Germany, and received the news through a letter from his friend Thomas Poole, who in his condolences mentioned Wordsworth's "A slumber":
> But I cannot truly say that I grieve—I am perplexed—I am sad—and a little thing, a very trifle would make me weep; but for the death of the Baby I have not wept!—Oh! this strange, strange, strange Scene-shifter, Death! that giddies one with insecurity, & so unsubstantiates the living Things that one has grasped and handled!—/ Some months ago Wordsworth transmitted to me a most sublime Epitaph / whether it had any reality, I cannot say.—Most probably, in some gloomier moment he had fancied the moment in which his sister might die.
Later, the essayist Charles Lamb (1775–1834) wrote to Wordsworth in 1801 to say that "She dwelt" was one of his favourites from Lyrical Ballads. Likewise, the Romantic poet John Keats (1795–1821) praised the poem. To the diarist and writer Henry Crabb Robinson (1775–1867), "She dwelt" gave "the powerful effect of the loss of a very obscure object upon one tenderly attached to it—the opposition between the apparent strength of the passion and the insignificance of the object is delightfully conceived."
Besides word of mouth and opinions in letters, there were only a few published contemporary reviews. The writer and journalist John Stoddart (1773–1856), in a review of Lyrical Ballads, described "Strange fits" and "She dwelt" as "the most singular specimens of unpretending, yet irresistible pathos". An anonymous review of Poems in Two Volumes in 1807 had a less positive opinion about "I travell'd": "Another string of flat lines about Lucy is succeeded by an ode to Duty". Critic Francis Jeffrey (1773–1850) claimed that, in "Strange fits", "Mr Wordsworth, however, has thought fit to compose a piece, illustrating this copious subject by one single thought. A lover trots away to see his mistress one fine evening, staring all the way at the moon: when he comes to her door, 'O mercy! to myself I cried, / If Lucy should be dead!' And there the poem ends!" On "A slumber did my spirit seal", Wordsworth's friend Thomas Powell wrote that the poem "stands by itself, and is without title prefixed, yet we are to know, from the penetration of Mr. Wordsworth's admirers, that it is a sequel to the other deep poems that precede it, and is about one Lucy, who is dead. From the table of contents, however, we are informed by the author that it is about 'A Slumber;' for this is the actual title which he has condescended to give it, to put us out of pain as to what it is about."
Many Victorian critics appreciated the emotion of the "Lucy poems" and focused on "Strange fits". John Wilson, a personal friend of both Wordsworth and Coleridge, described the poem in 1842 as "powerfully pathetic". In 1849, critic Rev. Francis Jacox, writing under the pseudonym "Parson Frank", remarked that "Strange fits" contained "true pathos. We are moved to our soul's centre by sorrow expressed as that is; for, without periphrasis or wordy anguish, without circumlocution of officious and obtrusive, and therefore, artificial grief; the mourner gives sorrow words... But he does it in words as few as may be: how intense their beauty!" A few years later, John Wright, an early Wordsworth commentator, described the contemporary perception that "Strange fits" had a "deep but subdued and 'silent fervour'". Other reviewers emphasised the importance of "She dwelt among the untrodden ways", including Scottish writer William Angus Knight (1836–1916), when he described the poem as an "incomparable twelve lines".
At the beginning of the 20th century, literary critic David Rannie praised the poems as a whole: "that strange little lovely group, which breathe a passion unfamiliar to Wordsworth, and about which he—so ready to talk about the genesis of his poems—has told us nothing [...] Let a poet keep some of his secrets: we need not grudge him the privacy when the poetry is as beautiful as this; when there is such celebration of girlhood, love, and death [...] The poet's sense of loss is sublime in its utter simplicity. He finds harmony rather than harshness in the contrast between the illusion of love and the fact of death." Later critics focused on the importance of the poems to Wordsworth's poetic technique. Durrant argued that "The four 'Lucy' poems which appeared in the 1800 edition of Lyrical Ballads are worth careful attention because they represent the clearest examples of the success of Wordsworth's experiment." Alan Grob (1932–2007) focused less on the unity that the poems represent and believed that "the principal importance of the 'Matthew' and 'Lucy' poems, apart from their intrinsic achievement, substantial as that is, is in suggesting the presence of seeds of discontent even in a period of seemingly assured faith that makes the sequence of developments in the history of Wordsworth's thought a more orderly, evolving pattern than the chronological leaps between stages would seem to imply."
Later critics de-emphasised the significance of the poems in Wordsworth's artistic development. Hunter Davies (b. 1936) concluded that their impact relies more on their popularity than importance to Wordsworth's poetic career. Davies went on to claim, "The poems about Lucy are perhaps Wordsworth's best-known work which he did in Germany, along with 'Nutting' and the Matthew poems, but the most important work was the beginning of The Prelude" (emphasis in original). Some critics emphasised the importance behind Lucy as a figure, including Geoffrey Hartman (b. 1929), when he claimed, "It is in the Lucy poems that the notion of spirit of place, and particularly English spirit of place, reaches its purest form." Writer and poet Meena Alexander (b. 1951) believed that the character of Lucy "is the impossible object of the poet's desire, an iconic representation of the Romantic feminine."
## Parodies and allusions
The "Lucy poems" have been parodied numerous times since their first publication. These were generally intended to ridicule the simplification of textual complexities and deliberate ambiguities in poetry. They also questioned the way many 19th-century critics sought to establish definitive readings. According to Jones, such parodies commented in a "meta-critical" manner and themselves present an alternative mode of criticism. Among the more notable is the one by Samuel Taylor Coleridge's son Hartley Coleridge (1796–1849), called "On William Wordsworth" or simply "Imitation", as in the 1827 version published for The Inspector magazine ("He lived amidst th' untrodden ways / To Rydal Lake that lead; / A Bard whom there were none to praise / And very few to read" lines 1–4). Parody also appears in the 1888 murder-mystery reading of the poem by Victorian author Samuel Butler (1835–1902). Butler believed Wordsworth's use of the phrase "the difference to me\!" was overly terse, and remarked that the poet was "most careful not to explain the nature of the difference which the death of Lucy will occasion him to be ... The superficial reader takes it that he is very sorry she was dead ... but he has not said this." Not every work referring to the "Lucy poems" is intended to mock, however; the novelist and essayist Mary Shelley (1797–1851) drew upon the poems to comment on and re-imagine the Romantic portrayal of femininity.
## Settings
The "Lucy poems" (omitting "I travelled among unknown men" but adding "Among all lovely things") have been set for voice and piano by the composer Nigel Dodd. The settings were first performed at St George's, Brandon Hill, Bristol, in October 1995 at a concert marking the bicentenary of the first meeting of Wordsworth and Coleridge.
Among settings of individual poems is Benjamin Britten's "Lucy" ("I travelled among unknown men") composed in 1926.
Three of the five poems were set to music and recorded by the orchestral pop band The Divine Comedy on their album Liberation.
# Sally Ride
Sally Kristen Ride (May 26, 1951 – July 23, 2012) was an American astronaut and physicist. Born in Los Angeles, she joined NASA in 1978, and in 1983 became the first American woman and the third woman to fly in space, after cosmonauts Valentina Tereshkova in 1963 and Svetlana Savitskaya in 1982. She was the youngest American astronaut to have flown in space, having done so at the age of 32.
Ride was a graduate of Stanford University, where she earned a Bachelor of Science degree in physics and a Bachelor of Arts degree in English literature in 1973, a Master of Science degree in 1975, and a Doctor of Philosophy in 1978 (both in physics) for research on the interaction of X-rays with the interstellar medium. She was selected as a mission specialist astronaut with NASA Astronaut Group 8, the first class of NASA astronauts to include women. After completing her training in 1979, she served as the ground-based capsule communicator (CapCom) for the second and third Space Shuttle flights, and helped develop the Space Shuttle's robotic arm. In June 1983, she flew in space on board Challenger on the STS-7 mission. The mission deployed two communications satellites and the first Shuttle pallet satellite (SPAS-1). Ride operated the robotic arm to deploy and retrieve SPAS-1. Her second space flight was the STS-41-G mission in 1984, also on board Challenger. She spent a total of more than 343 hours in space. She left NASA in 1987.
Ride worked for two years at Stanford University's Center for International Security and Arms Control, then at the University of California, San Diego, primarily researching nonlinear optics and Thomson scattering. She served on the committees that investigated the loss of Challenger and of Columbia, the only person to participate in both. Having been married to astronaut Steven Hawley during her spaceflight years and in a private, long-term relationship with former Women's Tennis Association player Tam O'Shaughnessy, she is the first astronaut known to have been LGBTQ. She died of pancreatic cancer in 2012.
## Early life
Sally Kristen Ride was born on May 26, 1951, in the Encino neighborhood of Los Angeles, California, the elder child of Dale Burdell Ride and Carol Joyce Ride. She had one sibling, Karen, known as "Bear". Both parents were elders in the Presbyterian Church. Her mother, who was of Norwegian descent, had worked as a volunteer counselor at a women's correctional facility. Her father served in Europe with the U.S. Army's 103rd Infantry Division during World War II. After the war he went to Haverford College on the G.I. Bill, earned a master's degree in education at the University of California, Los Angeles, and became a political science professor at Santa Monica College.
Ride grew up in the Van Nuys and Encino neighborhoods of Los Angeles. In 1960, when she was nine years old, the family spent a year traveling in Europe. In Spain, Ride played tennis for the first time. She enjoyed sports, tennis most of all, and at age 10 was coached by Alice Marble, a former world number one player. By 1963 Ride was ranked number 20 in Southern California for girls aged 12 and under. She attended Encino Elementary School, Portola Junior High (now Portola Middle School), Birmingham High School and then, as a sophomore on a tennis scholarship, Westlake School for Girls, an exclusive all-girls private school in Los Angeles. Elizabeth Mommaerts, who taught human physiology, became a mentor. Ride resolved to become an astrophysicist. She graduated in June 1968, and then took a class in advanced math at Santa Monica College during the summer break.
Her friend Sue Okie was interested in going to Swarthmore College in Pennsylvania, so Ride applied too. She was interviewed by Fred Hargadon, the dean of admissions, who was impressed by both her mental and her tennis ability. She was admitted on a full scholarship. She commenced classes at Swarthmore on September 18, 1968. She played golf, and made Swarthmore's field hockey varsity team. She won all six of her intercollegiate tennis matches, and became the Eastern Intercollegiate Women's Singles champion. She defended her title in May 1969, winning in straight sets. However, Ride was homesick for California, and before Title IX, women's tennis was not well-supported at the college level; Swarthmore had four tennis courts but no indoor courts and she could not practice when it snowed. After three semesters at Swarthmore, she returned to California in January 1970, with the aim of becoming a professional tennis player.
Ride entered the University of California, Los Angeles, where she enrolled in courses in Shakespeare and quantum mechanics, earning A's in both subjects. She was the only woman majoring in physics. She was romantically involved with her teaching assistant, John Tompkins, but the relationship ended in September when he went to Moscow to conduct research at the Institute for High Energy Physics. Her foray into professional tennis was unsuccessful; after she played three matches in a single August morning, her whole body ached the following day. She realized that far more effort would be necessary to reach the required level of fitness: she would need to practice for eight hours a day. She concluded that she did not have what it took to be a professional tennis player.
Ride applied for a transfer to Stanford University as a junior. The tennis coach was eager to have her on the team, and by coincidence, Fred Hargadon was now the dean of admissions there. He was once again instrumental in approving her admission. She graduated in 1973 with a Bachelor of Science degree in physics and a Bachelor of Arts degree in English literature. She then earned a Master of Science degree in physics in 1975 and a Doctor of Philosophy in 1978. Astrophysics and free-electron lasers were her areas of study. She wrote her doctoral dissertation on "the interaction of X-rays with the interstellar medium", under the supervision of Arthur B. C. Walker Jr.
At Stanford, Ride renewed her acquaintance with Molly Tyson, who was a year younger than her. The two had met as juniors on the tennis circuit. Although Ride was rated number one at Stanford and Tyson was number six, the two played doubles together. Ride later quit the Stanford tennis team in protest against the university's refusal to join the Pac-8 Conference in women's tennis. To earn money, Ride and her then-girlfriend Tyson gave tennis lessons, and in 1971 and 1972 they were counselors at Dennis Van der Meer's TennisAmerica summer camp at Lake Tahoe, Nevada. In August 1972, Ride played in a doubles match with Van der Meer against Billie Jean King, the world number one ranked female player, and Dick Peters, the camp director; Martin Luther King III and Dexter King served as ball boys. Billie Jean King became a mentor and a friend. Ride watched her win the Battle of the Sexes match against Bobby Riggs in 1973. Tyson ended their relationship in 1975, and Ride moved in with Bill Colson, a fellow graduate physics student who was recently divorced.
## NASA astronaut
### Selection and training
In January 1977, Ride spotted an article on the front page of The Stanford Daily reporting that the National Aeronautics and Space Administration (NASA) was recruiting a new group of astronauts for the Space Shuttle program and wanted women to apply. No women had previously been NASA astronauts, although the Soviet Union's cosmonaut Valentina Tereshkova had flown in space in 1963. Ride mailed a request for, and received, the application forms. When asked for three persons with knowledge of her qualifications, she gave the names of three people with whom she had been in relationships: Colson, Tompkins and Tyson.
Ride's was one of 8,079 applications NASA received by the June 30, 1977, deadline. She then became one of 208 finalists. She was the only woman among the twenty applicants in the sixth group, all applicants for mission specialist positions, who reported to NASA's Johnson Space Center (JSC) in Houston, Texas, on October 3, for a week of interviews and medical examinations. Her physical fitness impressed the doctors. They also placed her in a Personal Rescue Enclosure to see if she suffered from claustrophobia. She was asked to write a one-page essay on why she wanted to become an astronaut. Finally, she was interviewed by the selection committee. On January 16, 1978, she received a phone call from George Abbey, NASA's director of flight operations, who informed her that she had been selected as part of NASA Astronaut Group 8. She was one of 35 astronaut candidates in the group, of whom six were women.
Group 8's name for itself was "TFNG". The abbreviation was deliberately ambiguous; for public purposes, it stood for "Thirty-Five New Guys", but within the group itself, it was known to stand for the military phrase, "the fucking new guy", used to denote newcomers to a military unit. Officially, they were astronaut candidates; they would not become fully-fledged astronauts until they had completed their training. Ride was graded a civil service GS-12, with a salary of US$21,883. She bought a unit in the Nassau Bay, Texas, area, and moved in with Colson, who secured a research grant at Rice University so they could move to Houston together. He was the only partner of an astronaut candidate who was not a spouse. Ride and Colson split up in January 1979, and she briefly dated fellow astronaut candidate Robert "Hoot" Gibson.
Astronaut candidate training included learning to fly NASA's T-38 Talon jet aircraft. Officially, mission specialists did not have to qualify as pilots, only ride in the back seat and handle an emergency if the pilot became incapacitated. They were never to control the aircraft below 5,000 feet (1,500 m), but many of the astronaut pilots and pilot candidates, eager to share their love of flying, ignored the rules, and let the more proficient mission specialist candidates fly the jets lower. John Fabian even had her fly "under the hood", with the windows blacked out and using instruments only. Ride enjoyed flying so much she took private flying lessons to earn a private pilot's license. She bought a part interest in a Grumman Tiger aircraft, which she would fly on weekends. On August 31, 1979, NASA announced that the 35 astronaut candidates had completed their training and evaluation, and were now officially astronauts, qualified for selection on space flight crews.
In 1981, Ride began dating Steven Hawley, another one of the TFNGs. They moved in together, and considered themselves engaged. Unlike Colson, he was not aware of her earlier relationship with Tyson. They were married on July 26, 1982, in the backyard of Hawley's parents' house in Salina, Kansas. Ride flew up from Houston for the occasion in her Grumman Tiger, and wore white jeans. The ceremony was jointly conducted by Hawley's father Bernard, the pastor at the local Presbyterian church, and Ride's sister Bear. It was deliberately kept low-key, with only parents and siblings in attendance. They became the third NASA astronaut couple, after Rhea Seddon and Hoot Gibson, who had married a few months before, and Anna Fisher and her husband Bill Fisher, who became an astronaut couple when the latter was selected with NASA Astronaut Group 9 in 1980. Ride did not take her husband's name.
### STS-7
Ride served as a ground-based capsule communicator (CapCom) for the second and third Space Shuttle flights, and helped develop the Shuttle Remote Manipulator System (RMS), also known as the "Canadarm" or robot arm. She was the first woman to serve as a CapCom. By early 1982, George Abbey and the Chief of the Astronaut Office, John Young, wanted to begin scheduling missions with the TFNGs, starting with the seventh Space Shuttle mission. To command it, they chose Robert Crippen, who had flown with Young on the first Space Shuttle mission. They wanted a woman to fly on the mission, and since the mission involved the use of the RMS, the choice narrowed to Ride, Judy Resnik and Anna Fisher, who had specialized on it. Factors in Ride's favor included her agreeable personality and ability to work with others, her performance as CapCom, and her skill with the robot arm. However, JSC director Chris Kraft preferred Fisher, and Abbey had to defend their decision. NASA Headquarters ultimately approved Ride's selection, which was officially announced in April 1982.
As the first American woman to fly in space, Ride was subjected to intense media attention. There were over five hundred requests for private interviews, all of which were declined. Instead, NASA hosted the usual pre-launch press conference on May 24, 1983. Ride was asked questions such as, "Will the flight affect your reproductive organs?" and "Do you weep when things go wrong on the job?" She insisted that she saw herself in only one way—as an astronaut. NASA was still adjusting to female astronauts, and engineers had asked Ride to assist them in developing a "space makeup kit", assuming it would be something a woman would want on board. They also infamously suggested providing Ride with a supply of 100 tampons for the six-day mission.
When Challenger lifted off from the Kennedy Space Center (KSC) on June 18, 1983, Ride became the first American woman to fly in space, and the third woman overall. She also became the youngest American astronaut in space, although there had been younger cosmonauts. Many of the people attending the launch wore T-shirts bearing the words "Ride, Sally Ride", lyrics from Wilson Pickett's song "Mustang Sally". The purpose of the mission was to deploy two communications satellites: Anik C2 for Telesat of Canada and Palapa B1 for Indonesia. Both were deployed during the first two days of the mission.
The mission also carried the first Shuttle pallet satellite (SPAS-1), with ten experiments to study the formation of metal alloys in microgravity. Part of Ride's job was to operate the robot arm to deploy and later retrieve SPAS-1, which was brought back to Earth. The orbiter's small reaction control system rockets were fired while SPAS-1 was held by the remote manipulator system to test the movement on an extended arm. STS-7 was also the first occasion on which a photograph was taken of the Space Shuttle in orbit. This was done using the camera on SPAS-1. Ride manipulated the robot arm into the shape of a "7", as it appeared on the mission patch. The mission also studied space adaptation syndrome, a bout of nausea frequently experienced by astronauts during the early phase of a space flight. Ride was not affected and did not require medication for the syndrome. Bad weather forced Challenger to land at Edwards Air Force Base in California instead of the Shuttle Landing Facility at the KSC. The mission lasted 6 days, 2 hours, 23 minutes and 59 seconds.
Now a celebrity, Ride, along with her STS-7 crewmates, spent the next few months after her flight on tour. She met with the Governor of California, George Deukmejian, and the Mayor of New York, Ed Koch. She testified before the Congressional Space Caucus on the efficacy of the robot arm, and addressed the National Press Club, but declined to appear with Bob Hope, whom she regarded as sexist. The crew presented President Ronald Reagan with jelly beans that had been flown on the flight. In September 1983, on her own initiative, she met with Svetlana Savitskaya, the second woman to fly in space, in Budapest. The two formed an instant camaraderie, and they were able to converse for six hours, thanks to Savitskaya's command of English. They exchanged gifts: Savitskaya presented Ride with Russian dolls, books and a scarf, and Ride gave Savitskaya an STS-7 charm that had flown on the mission and a TFNG shirt. They also signed autographs for each other on Russian first day covers.
### STS-41-G
While she was still engaged on the publicity tour, Abbey assigned Ride to the crew of STS-41-G. This was at Crippen's request; he had been assigned to another mission, STS-41-C, which would fly beforehand as part of a test to see how quickly crews could be turned around, and he wanted Ride as his flight engineer again so that she could sit in for him during crew training for STS-41-G in the meantime. During mission simulations, she sat in the commander's left-hand seat. Ride would become the first American woman to fly twice, and her TFNG crewmate Kathryn Sullivan would become the first American woman to perform an extravehicular activity (EVA); Savitskaya had already become the first woman to do both when she flew in space on Soyuz T-12 in July 1984. However, it would be the first time that two women were in space together.
The mission lifted off from the KSC in Challenger on October 5, 1984. The rookie astronauts on the flight were cautious about moving around too soon, lest they suffer from space adaptation syndrome, but Ride was now a veteran astronaut who knew that she would not be affected. Once in orbit, she immediately and gracefully began moving about. The crew deployed the Earth Radiation Budget Satellite, conducted scientific observations of the Earth with the OSTA-3 pallet (including the SIR-B radar, FILE, and MAPS experiments) and large format camera (LFC), and conducted numerous in-cabin experiments as well as activating eight Getaway Special canisters containing experiments devised by outside groups.
When the SIR-B antenna failed to unfold correctly, Ride used the robot arm to shake it loose, manipulating it much faster than she had been trained to do. She also repaired a broken antenna on the middeck. During the second day of the mission, the SIR-B antenna had to be stowed so Challenger's orbit could be altered, but its latches failed to clamp and close the antenna. Ride then used the RMS to nudge the antenna panel closed. Sullivan performed an EVA with fellow TFNG mission specialist David Leestma, in which they showed that a satellite could be refueled in orbit. On this mission Challenger completed 132 orbits of the Earth in 197.5 hours, landing back at the KSC on October 13, 1984. During the mission, Ride carried a white silk scarf that had been worn by Amelia Earhart. On her two flights Ride had spent over 343 hours in space.
### Planned third mission
Ride was soon back in the rotation, training for her third flight, STS-61-I. This mission was scheduled to be flown no later than July 15, 1986, and was to deploy the Intelsat VI-1 and INSAT 1-C communications satellites and carry the Materials Science Lab-4. The crew was subsequently switched to STS-61-M, a Tracking and Data Relay Satellite System (TDRS) deployment mission scheduled to be flown in July 1986. She also served on two more missions as CapCom. On January 7, 1986, Ride provided a glowing reference for her friend (and eventual biographer) Lynn Sherr for NASA's Journalist in Space Project. Sherr became one of the finalists. During 1985, Ride began an affair with Tam O'Shaughnessy. The two knew each other from the junior tennis circuit, and from when Ride was at Stanford. O'Shaughnessy was now living in Atlanta, and had recently broken up with her female partner. Ride visited when she went to Atlanta on speaking engagements. Hawley was aware that his marriage was in trouble, but not that O'Shaughnessy was more than a friend. Ride still performed her astronaut spouse duties for Hawley when he flew in space for the second time on STS-61-C in January 1986. Astronauts and their spouses were quarantined for a few days before launch, and they stayed at the astronaut beach house at the KSC. Spouses were expected to attend events before and after launches, including the post-mission publicity tour. This could be agonizing for a couple whose marriage was breaking up.
### Rogers Commission
STS-61-M was cancelled after the Space Shuttle Challenger disaster later that month. Ride was appointed to the Rogers Commission, the presidential commission investigating the disaster, and headed its subcommittee on operations. She was the only Space Shuttle astronaut and the only current NASA employee on the commission. After her death in 2012, Major General Donald J. Kutyna revealed that she had discreetly provided him with key information about the O-rings, namely that they become stiff at low temperatures, which eventually led to the identification of the cause of the disaster. To protect his source, Kutyna fed this information to Richard Feynman. Ride was even more disturbed by revelations of NASA's dysfunctional management decision-making and risk-assessment processes. According to Roger Boisjoly, one of the engineers who had warned of the technical problems that led to the Challenger disaster, Ride was the only public figure to show support for him when he went public with his pre-disaster warnings after the entire workforce of Morton-Thiokol shunned him. Ride hugged him publicly to show her support for his efforts. The Rogers Commission submitted its report on June 6, 1986.
Following the Challenger investigation, Ride was assigned to NASA headquarters in Washington, D.C., where she led NASA's first strategic planning effort. She authored a report titled "NASA Leadership and America's Future in Space". NASA management was unhappy with its prioritization of Earth exploration over a mission to Mars. She founded NASA's Office of Exploration, which she headed for two months. On weekends she flew to Atlanta to be with O'Shaughnessy. In October 1986, she published a children's book, To Space and Back, which she co-wrote with Sue Okie, her high school and Swarthmore friend.
## After NASA
In May 1987, Ride announced that she was leaving NASA to take up a two-year fellowship at the Stanford University Center for International Security and Arms Control (CISAC), commencing on August 15, 1987. She divorced Hawley in June. At Stanford, her colleagues included Condoleezza Rice, a specialist on the Soviet Union. Ride researched means by which nuclear warheads could be counted and verified from space, but the impending end of the Cold War made this a much less pressing issue. As the end of her fellowship approached, Ride hoped to secure a permanent position at Stanford. Sidney Drell, who had recruited her, attempted to get a department to appoint her as a professor, but none would. Drell resigned from CISAC in protest.
On July 1, 1989, Ride became a professor of physics at the University of California, San Diego (UCSD), and director of the California Space Institute (Cal Space), part of the university's Scripps Institution of Oceanography. She was paid a professor's salary of $64,000 plus a $6,000 stipend as director of Cal Space, which employed 28 full- and part-time staff and had a budget of $3.3 million. Her research primarily involved the study of nonlinear optics and Thomson scattering. She remained director of Cal Space until 1996. She retired from UCSD in 2007 and became a professor emerita.
From the mid-1990s until her death, Ride led two public-outreach programs for NASA—the ISS EarthKAM and GRAIL MoonKAM projects, in cooperation with NASA's Jet Propulsion Laboratory and UCSD. The programs allowed middle school students to request images of the Earth and the Moon. Ride bought a house in La Jolla, California, and O'Shaughnessy moved in after taking up a teaching position at San Diego Mesa College. She turned down offers from President Bill Clinton to become NASA Administrator, not wanting to leave California, but did agree to serve on the President's Committee of Advisors on Science and Technology (PCAST). This involved flying to Washington, D.C., every few months for studies and presentations. Due to the experience at CISAC, Clinton appointed her to a PCAST panel chaired by John Holdren to assess the risk of fissile materials being stolen in Russia and ending up in the hands of terrorists.
From September 1999 to July 2000, Ride was the president of the space news website Space.com, a company that aggregated news about science and space on its website. She then became the president and CEO of Sally Ride Science, a company she co-founded with O'Shaughnessy, who served as the chief operating officer and chair of the board. Sally Ride Science created entertaining science programs and publications for upper elementary and middle school students, with a particular focus on girls. Ride and O'Shaughnessy co-wrote six books on space aimed at children, with the goal of encouraging children to study science.
In 2003, Ride served on the Columbia Accident Investigation Board, and was the only person to serve on both the panel that investigated the Challenger disaster and the one that investigated the Columbia disaster. She endorsed Barack Obama for president in 2008, and was contacted by Lori Garver, the head of Obama's NASA transition team, but once again made it clear that she was not interested in the post of NASA administrator. She served on the board of the National Math and Science Initiative in 2007 and the Educate to Innovate initiative in 2009, and was a member of the Review of United States Human Space Flight Plans Committee, which conducted an independent review of American space policy requested by the Office of Science and Technology Policy (OSTP) on May 7, 2009.
## Death
When Ride delivered a speech at the National Science Teachers Association Conference in San Francisco on March 10, 2011, O'Shaughnessy and a friend noted that she looked ill. Alarmed, O'Shaughnessy had her book a doctor's appointment for the following day. A medical ultrasound revealed a tumor the size of a golf ball in her abdomen. A follow-up CT scan at UCSD confirmed a diagnosis of pancreatic cancer. She underwent chemotherapy and radiation therapy to reduce the size of the tumor. Ride had ensured that O'Shaughnessy would inherit her estate when she drew up her will in 1992. They registered their domestic partnership on August 15, 2011. On October 27, surgeons removed part of Ride's pancreas, bile duct, stomach and intestine, along with her gallbladder.
Ride died on July 23, 2012, at the age of 61, at her home in La Jolla. Following cremation, her ashes were interred next to those of her father at Woodlawn Memorial Cemetery, Santa Monica. Her papers are in the National Air and Space Museum Archives of the Smithsonian Institution. Ride's obituary publicly revealed for the first time that O'Shaughnessy had been her partner for 27 years. This made Ride the first known LGBT astronaut. The relationship was confirmed by Ride's sister Bear, who said Ride chose to keep her personal life private, including her sickness and treatments.
## Awards and honors
Ride received numerous awards during her lifetime and posthumously. She received the National Space Society's von Braun Award, the Lindbergh Eagle from the Charles A. Lindbergh Fund, and the NCAA's Theodore Roosevelt Award. She was inducted into the National Women's Hall of Fame and the Astronaut Hall of Fame, and was awarded the NASA Space Flight Medal twice. Elementary schools in the United States were named after her, including Sally Ride Elementary School in The Woodlands, Texas, and Sally Ride Elementary School in Germantown, Maryland. In 1984, she received the Samuel S. Beard Award for Greatest Public Service by an Individual 35 Years or Under, an award given out annually by the Jefferson Awards. California Governor Arnold Schwarzenegger and First Lady Maria Shriver inducted Ride into the California Hall of Fame at the California Museum for History, Women, and the Arts on December 6, 2006. The following year she was inducted into the National Aviation Hall of Fame in Dayton, Ohio.
Ride directed public outreach and educational programs for NASA's Gravity Recovery and Interior Laboratory (GRAIL) mission, which sent twin satellites to map the moon's gravity. On December 17, 2012, the two GRAIL probes, Ebb and Flow, were directed to complete their mission by crashing on an unnamed lunar mountain near the crater Goldschmidt. NASA announced that it was naming the landing site in her honor. Also in December 2012, the Space Foundation bestowed upon Ride its highest honor, the General James E. Hill Lifetime Space Achievement Award.
In April 2013, the United States Navy announced that a research ship would be named in honor of Ride. The RV Sally Ride (AGOR-28) was christened by O'Shaughnessy on August 9, 2014, and delivered to Scripps Institution of Oceanography in 2016. It was the first vessel in the research fleet to be named after a female scientist.
A "National Tribute to Sally Ride" was held at the John F. Kennedy Center for the Performing Arts in Washington, D.C., on May 20, 2013. That day, President Barack Obama announced that Ride would receive the Presidential Medal of Freedom, the highest civilian award in the United States. The medal was presented to O'Shaughnessy in a ceremony at the White House on November 20, 2013. In July 2013, Flying magazine ranked Ride at number 50 on their list of the "51 Heroes of Aviation". For their first match of March 2019, the women of the United States women's national soccer team each wore a jersey with the name of a woman they were honoring on the back; Tierna Davidson chose the name of Sally Ride.
Ride was inducted into the Legacy Walk, an outdoor public display in Chicago that celebrates LGBT history and people, in 2014. She was honored with a Google Doodle on what would have been her 64th birthday in 2015. It was reused on International Women's Day in 2017. Stanford University's Serra House located in Lucie Stern Hall was renamed the Sally Ride House in 2019. The U.S. Postal Service issued a first-class postage stamp honoring her in 2018, and Ride appeared as one of the first two honorees of the American Women quarters series in March 2022. She was the first known LGBT person to appear on U.S. currency.
On April 1, 2022, a satellite named after Ride (ÑuSat 27 or "Sally", COSPAR 2022-033R) was launched into space as part of the Satellogic Aleph-1 constellation.
The Cygnus spacecraft used for the NG-18 mission was named the S.S. Sally Ride in her honor. It launched successfully on November 7, 2022.
In 2022, a statue of Ride was unveiled outside the Cradle of Aviation Museum; another statue was unveiled outside the Ronald Reagan Presidential Library in 2023.
## In popular culture
- Ride appeared as herself in the 1999 Touched by an Angel episode "Godspeed".
- In 2013, Janelle Monáe released a song called "Sally Ride".
- Astronauts Chris Hadfield and Catherine Coleman performed a song called "Ride On". The song was later released as part of Hadfield's album Space Sessions: Songs from a Tin Can under the name "Ride That Lightning."
- A 2017 "Women of NASA" Lego set featured mini-figurines of Ride, Margaret Hamilton, Mae Jemison, and Nancy Roman.
- In 2019, Mattel released a Barbie doll in Ride's likeness as part of their "Inspiring Women" series.
- On October 21, 2019, the play Dr. Ride's American Beach House by playwright Liza Birkenmeier premiered off-Broadway at Ars Nova's Greenwich House Theater in New York City. The play is set the evening before Ride's 1983 space flight, and explores women's desires and American norms of sex and power through the lens of Ride's experience as an astronaut in relation to her sex and identity. The title of the play alludes to NASA's astronaut beach house, where astronauts were quarantined before missions.
- In the film Valley Girl (2020), Ride is referred to not only as the first woman astronaut, but also as a valley girl, since she was from Encino.
- In 2021, Ride was featured in the second season of the Apple TV+ streaming series For All Mankind, where she was played by Ellen Wroe.
## See also
- Women in space
- List of female astronauts
- List of female explorers and travelers
- Mercury 13
- Women in science
# Florin (British coin)
The British florin, or two-shilling piece (2/– or 2s.), was a coin worth 1⁄10 of one pound, or 24 pence. It was issued from 1849 until 1967, with a final issue for collectors dated 1970. It was the last coin circulating immediately prior to decimalisation to be demonetised, in 1993, having for a quarter of a century circulated alongside the ten-pence piece, identical in specifications and value.
The florin was introduced as part of an experiment in decimalisation that went no further at the time. The original florins, dated 1849, attracted controversy for omitting a reference to God from Queen Victoria's titles; that type is accordingly known as the "Godless florin", and was succeeded in 1851 by the "Gothic florin", so named for its design and style of lettering. Throughout most of its existence, the florin bore some variation of either the shields of the United Kingdom, or the emblems of its constituent nations on the reverse, a tradition broken between 1902 and 1910, when the coin featured a windswept figure of a standing Britannia.
In 1911, following the accession of George V, the florin regained the shields and sceptres design it had in the late Victorian era, and it kept that motif until 1937, when the national emblems were placed on it. The florin retained such a theme for the remainder of its run, though a new design was used from 1953, following the accession of Elizabeth II. In 1968, prior to decimalisation, the Royal Mint began striking the ten-pence piece. The old two-shilling piece remained in circulation until the ten-pence piece was made smaller, and earlier coins, including the florin, were demonetised.
## History
### Background
The drive for decimalisation of the currency in Britain dates as far back as 1682. Although nothing was done about early proposals, the adoption of decimal currencies in the United States, France and other nations in the late 18th and early 19th centuries renewed the call, and commissions in 1841 and 1843 called for the adoption of decimal coinage. In 1847, a motion was introduced in Parliament by Sir John Bowring calling for the introduction of a decimal currency and the striking of coins of one-tenth and one-hundredth of a pound. Bowring obtained surprisingly strong support for his motion, and the Russell government promised that a coin valued at one-tenth of a pound (two shillings) would be produced to test public opinion, with consideration to be given in future to the introduction of other decimal coins.
There was much discussion about what the coin should be called – centum, decade, and dime were among the suggestions – before florin was eventually settled upon, not because of the old English coin of that name, but because the Netherlands had a florin, or gulden, about that size and value. The original florin, the fiorino d'oro of the Republic of Florence, was a gold coin struck from 1252 to 1533.
### Victorian issues (1849–1901)
The first florins were struck in 1849. They were in the Gothic style, and featured a portrait of Queen Victoria as a very young woman, with the crowned shields of England (twice, per the usual arrangement of the royal coat of arms of the United Kingdom as used in England), Scotland, and Ireland arranged in a cruciform pattern shown on the reverse, and the nations' floral emblems in the angles (again with England's shown twice). The new florin closely resembles the Gothic crown of 1847; the obverse for both was designed by the Chief Engraver of the Royal Mint, William Wyon, while the reverse of both was designed by William Dyce. Unlike the crown's Gothic script, the 1849 florin has Roman lettering. The 1849 florin, issued in silver, weighed 11.3 grams (defined as 4⁄11 troy ounce) and had a diameter of 28 millimetres (1.10 in). The new coin made clear its value with the inscription ONE FLORIN ONE TENTH OF A POUND on the reverse. To aid in the decimal experiment, the half crown (two shillings and sixpence, or one-eighth of a pound), near to the florin in size and value, was not issued between 1850 and 1874, when it was struck again at the request of the banks, and surveys found that both coins played useful parts in commerce. Each would continue to be struck, and would circulate together, until decimalisation.
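The stated weight can be cross-checked with a short calculation; this is only an illustrative sketch, and it assumes the standard value of about 31.103 grams to the troy ounce, a figure not given in the article itself:

$$
\tfrac{4}{11}\ \text{troy oz} \approx \tfrac{4}{11} \times 31.103\ \text{g} \approx 11.31\ \text{g} \approx 11.3\ \text{g}
$$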
These first coins were probably a shock to the public, as for the first time in nearly 200 years a British coin featured a portrait of the monarch wearing a crown. Even more of a shock, including (allegedly) to Queen Victoria herself, was the inscription on the obverse, VICTORIA REGINA 1849, omitting the usual D G for Dei Gratia (By the Grace of God) from the coin's inscription. This resulted in it being known as the "Godless florin". Further controversy was caused by the omission of the usual abbreviation F D for Fidei Defensor (Defender of the Faith): the Master of the Mint, Richard Lalor Sheil, an Irishman and a Roman Catholic, was suspected by some of plotting to overthrow the Protestant regime. The inscription had in fact been suggested by Albert, Prince Consort, Victoria's husband. Sheil said in the House of Commons the inscription had been a mistake, and the florin was redesigned for its next issue in 1851.
The revised florin's diameter was increased to 30 millimetres (1.18 in) (the weight was unchanged), and all the lettering on the coin was in Gothic script, resulting in it being known as the Gothic florin. The coin was by the same designers; its date was rendered in Roman numerals. The bust of Victoria and the heraldry on the reverse were largely unchanged. The Latin inscription on the obverse read VICTORIA D G BRIT REG F D with the date, while the reverse read ONE FLORIN ONE TENTH OF A POUND. Despite a Royal Commission, the drive for decimalisation soon died out; there was only lukewarm support for an 1855 motion in the Commons applauding the issuance of the florin and seeking further decimal coins. The Gothic florin was produced each year until 1887, excepting 1861 and 1882. From 1864 until 1879, many florins were struck with die numbers on the obverse (found to the right of Victoria's brooch), possibly as part of a Mint investigation into how long it took coinage dies to wear out. Beginning with some 1867 issues, BRIT on the obverse was rendered BRITT, following the Latin practice in abbreviations of doubling a final consonant to indicate a plural. Thus, Victoria's title changed from "Queen of Britain" to "Queen of the Britains", reflecting the inclusion of the colonies and other territories.
In 1887, as part of a coinage redesign for Victoria's Golden Jubilee, a new obverse design, showing the queen as an older woman, debuted on the gold and silver coinage. This was dubbed the "Jubilee coinage" and was by Sir Joseph Boehm. The various flora were removed from the florin's reverse and were replaced by sceptres between the shields with a Garter Star in the centre. The Jubilee Head quickly proved unpopular, due in part to the crown worn by the Queen, which was deemed ridiculously small. The Jubilee florin shared its reverse with the short-lived double florin, which Gertrude Rawlings in 1898 described as "radiating kitchen pokers and tea trays". The reverse design was created and engraved by Leonard Charles Wyon (who also engraved the obverse), though it was probably influenced by the gold coinage of Charles II designed by John Roettier. The diameter was reduced to 29.5 millimetres (1.16 in). All the inscriptions were in Latin letters and Arabic numerals. The inscription on the obverse read VICTORIA DEI GRATIA, while the reverse read FID DEF BRITT REG, with no indication of the value. The Jubilee florin was struck each year between 1887 and 1892.
Given the unpopularity of the Jubilee bust, a committee was set up in February 1891 to recommend new designs. An obverse designed by Thomas Brock was selected, and the committee also recommended some new reverses. This advisory committee recommended a different bust (also by Brock) be used on the florin to distinguish it from the half crown. The recommendation was not accepted, and the florin used the same "Veiled Head" or "Old Head" obverse that was introduced to the silver and gold coinage in 1893. To better distinguish it from the half crown, the diameter was reduced from 29 to 28.5 millimetres (1.14 to 1.12 in). The obverse was inscribed VICTORIA DEI GRA BRITT REGINA FID DEF IND IMP, together with a new reverse showing three shields separated by a rose, shamrock, and thistle (symbolising England, Ireland, and Scotland) under a crown, and the inscription ONE FLORIN TWO SHILLINGS. This reverse was created by Sir Edward Poynter, and was issued each year between 1893 and 1901, the year of Victoria's death.
### Edward VII (1901–1910)
Both sides of the florin were redesigned following the accession of Victoria's son, Edward VII, each design being created by the Chief Engraver of the Royal Mint, George William de Saulles. The florin of King Edward VII was minted every year from 1902 to 1910. Its specifications remained at 11.3 grams weight and 28.5 millimetres diameter. The obverse shows the right-facing head of the King, inscribed EDWARDVS VII DEI GRA BRITT OMN REX FD IND IMP, while the other side features what Coincraft's Standard Catalogue of English and UK Coins deems "a most unusual and original reverse". It shows a windswept figure of Britannia standing holding a shield with her left hand and a trident with her right, and inscribed ONE FLORIN TWO SHILLINGS, with the date below. Peter Seaby, in his history of British coinage, described the figure of Britannia as "standing on some mythical ancient ship which could hardly be sea-worthy under her weight", but "a pleasing composition".
De Saulles created the new florin in this manner to distinguish the coin from the half crown, as there had been complaints of confusion. He probably based the design on his British trade dollar (1895). The sitter for the design was Susan Hicks-Beach, the daughter of Michael Hicks-Beach, 1st Earl St Aldwyn who had served as Chancellor of the Exchequer and ex officio Master of the Mint. The modern-day Britannia coinage, bullion pieces struck by the Royal Mint for investors and collectors, has a reverse that strongly resembles that of the Edwardian florin.
### George V (1910–1936)
Florins bearing a left-facing effigy of George V by Sir Bertram Mackennal were minted in each year of the King's reign (1910–1936) except 1910 and 1934. The initial reverse design (1911–1926) was developed internally at the Royal Mint, and was modelled on that of the 1887 double florin, which the Jubilee florin closely resembled. The weight and diameter of the coin were unchanged but, because of rises in the price of silver, the metallic composition was changed in 1920 from 0.925 silver to 50% silver, 40% copper, 10% nickel, then again in 1922 to 50% silver, 50% copper, and again in 1927 to 50% silver, 40% copper, 5% nickel, 5% zinc. The changes in alloy after 1920 reflected the Mint's efforts to find a silver alloy that would remain attractive as it wore. The inscriptions on the obverse of the original version of the George V florin were GEORGIVS V D G BRITT OMN REX F D IND IMP and on the reverse were ONE FLORIN and the year of striking.
The modified florin, dated 1927 to 1936, was designed by George Kruger Gray and did not greatly alter the design of shields and sceptres, but removed the crowns from the shields and placed them on the sceptres. A "G", the King's initial, is at the centre of the design. The obverse inscription became GEORGIVS V DEI GRA BRITT OMN REX and the reverse one was FID DEF IND IMP with the date and denomination ONE FLORIN. The bust of the King on the obverse was slightly modified in 1927.
### Edward VIII (1936)
Throughout 1936, the year in which Edward VIII reigned, coins of all denominations continued to be struck using the designs of George V, pending preparation of the new monarch's coinage. No coins depicting Edward VIII were officially released to circulation. A pattern florin exists for King Edward, which would have been due to receive approval around the time the King abdicated in December 1936. Although there is a tradition of alternating the direction the monarch faces with each reign, and George V had faced left, Edward believed that side more flattering. Thus, the obverse depicts the left-facing effigy of the King by Thomas Humphrey Paget inscribed EDWARDVS VIII D G BR OMN REX. The reverse, by Kruger Gray, shows a crowned rose flanked by a thistle and shamrock, with E below the thistle and R below the shamrock, and the inscription FID DEF IND IMP and TWO SHILLINGS 1937.
### George VI (1936–1952)
King George VI's florin, produced each year between 1937 and 1951, looks very much like the one planned for his brother Edward VIII. As on the patterns for King Edward, the words ONE FLORIN were omitted; they would remain absent for the rest of the coin's existence. The obverse, by Thomas Humphrey Paget, shows the left-facing effigy of the King inscribed GEORGIVS VI D G BR OMN REX. The reverse, by Kruger Gray, depicts a crowned rose with a thistle and shamrock on either side. There is a G below the thistle and R below the shamrock, and the inscription FID DEF IND IMP TWO SHILLINGS with the date, until 1948. From 1949, the coins were struck without the IND IMP, in acknowledgement of India's independence. From 1947, the metal content was changed, as for all British silver circulating coins, to 75% copper, 25% nickel. This was due to the need for Britain to return Lend-Lease silver to the United States. The florin's diameter and weight remained unchanged at 11.3 grams and 28.5 millimetres, despite the change of alloy.
### Elizabeth II (1953–1970)
Florins were produced for Queen Elizabeth II each year between 1953 and 1967, with proof coins dated 1970. The obverse shows the Mary Gillick head of Queen Elizabeth, inscribed ELIZABETH II DEI GRATIA BRITT OMN REGINA (1953 only) or ELIZABETH II DEI GRATIA REGINA (all other years). This change was made to acknowledge the evolving British Commonwealth, which by then contained some republics. The reverse, by Edgar Fuller and Cecil Thomas, depicts a Tudor rose in the centre surrounded by thistles, shamrocks and leeks, with the Latin phrase FID DEF, the denomination and the date. The designs were selected by the Royal Mint Advisory Committee following a public competition. The artists' initials appear either side of the Welsh leek at the bottom of the reverse. When the reverse of the new coin was illustrated in the press, there was no consensus as to which way was up; numismatist H.W.A. Linecar has noted that the second L in SHILLINGS marks the bottom of the coin.
In accordance with the plan for decimalisation of the currency, and 120 years after the florin had first been introduced as part of the initial move towards a decimal coinage, the ten pence coin was introduced from 1968 with the same size, weight and metal composition as the florin. Thus, the florin ceased to be struck for circulation after the 1967-dated pieces. The new and the old coins circulated side by side as florins prior to Decimal Day (15 February 1971) and as ten pence pieces after. Florins (usually dated 1947 or later) remained in circulation after Decimal Day. In 1987, following a study of the currency, the Thatcher government announced its intent to issue a ten pence piece reduced in size. The smaller ten pence piece was issued in 1992, after which the old florin was demonetised on 30 June 1993. The florin, the first step towards a decimal coinage, was thus the last coin in general circulation immediately prior to decimalisation to be withdrawn.
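The equivalence between the old florin and the new ten pence piece can be shown with a brief worked calculation; this is only an illustrative sketch using the relationships already stated in this article (one florin = two shillings = one-tenth of a pound, with twelve old pence to the shilling and therefore 240 old pence to the pound) and the decimal pound of 100 new pence:

$$
1\ \text{florin} = 2\text{s} = 2 \times 12\text{d} = 24\text{d} = \tfrac{24}{240}\ \pounds = \tfrac{1}{10}\ \pounds
$$

$$
\tfrac{1}{10}\ \pounds = \tfrac{1}{10} \times 100\text{p} = 10\text{p}
$$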
## Mintages
Victoria
- 1849 - 413,320
- 1851 - 1,540
- 1852 - 1,014,552
- 1853 - 3,919,950
- 1854 - 550,413
- 1855 - 831,017
- 1856 - 2,201,760
- 1857 - 1,671,120
- 1858 - 2,239,380
- 1859 - 2,568,060
- 1860 - 1,475,100
- 1862 - 594,000
- 1863 - 938,520
- 1864 - 1,861,200
- 1865 - 1,580,040
- 1866 - 914,760
- 1867 - 423,720
- 1868 - 869,940
- 1869 - 297,000
- 1870 - 1,080,648
- 1871 - 3,425,605
- 1872 - 7,199,690
- 1873 - 5,921,839
- 1874 - 1,642,630
- 1875 - 1,117,030
- 1876 - 580,034
- 1877 - 682,292
- 1878 - 1,786,680
- 1879 - 1,512,247
- 1880 - 2,161,170
- 1881 - 2,376,337
- 1883 - 3,555,667
- 1884 - 1,447,379
- 1885 - 1,758,210
- 1886 - 591,773
- 1887 - 543,525 (Gothic)
- 1887 - 1,233,378 (Jubilee)
- 1888 - 1,647,540
- 1889 - 2,973,561
- 1890 - 1,684,737
- 1891 - 836,438
- 1892 - 283,401
- 1893 - 1,667,415
- 1894 - 1,952,842
- 1895 - 2,182,968
- 1896 - 2,944,416
- 1897 - 1,699,921
- 1898 - 3,061,343
- 1899 - 3,966,953
- 1900 - 5,528,630
- 1901 - 2,648,870
Edward VII
- 1902 - 2,204,698
- 1903 - 995,298
- 1904 - 2,769,932
- 1905 - 1,187,596
- 1906 - 6,910,128
- 1907 - 5,947,895
- 1908 - 3,280,010
- 1909 - 3,482,829
- 1910 - 5,650,713
George V
- 1911 - 5,957,291
- 1912 - 8,571,731
- 1913 - 4,545,278
- 1914 - 21,252,701
- 1915 - 12,357,939
- 1916 - 21,064,337
- 1917 - 11,181,617
- 1918 - 29,211,792
- 1919 - 9,469,292
- 1920 - 15,387,833
- 1921 - 34,863,895
- 1922 - 23,861,044
- 1923 - 21,546,533
- 1924 - 4,582,372
- 1925 - 1,404,136
- 1926 - 5,125,410
- 1927 - 15,000 (Proof Only)
- 1928 - 11,087,186
- 1929 - 16,397,279
- 1930 - 5,733,568
- 1931 - 6,566,331
- 1932 - 717,041
- 1933 - 8,685,303
- 1935 - 7,540,546
- 1936 - 9,897,448
George VI
- 1937 - 13,033,183
- 1938 - 7,909,388
- 1939 - 20,850,607
- 1940 - 18,700,338
- 1941 - 24,451,079
- 1942 - 39,895,245
- 1943 - 26,711,987
- 1944 - 27,560,005
- 1945 - 25,858,049
- 1946 - 22,300,254
- 1947 - 22,910,085
- 1948 - 67,553,838
- 1949 - 28,614,939
- 1950 - 24,375,003
- 1951 - 27,431,747
Elizabeth II
- 1953 - 11,998,710
- 1954 - 13,085,422
- 1955 - 25,887,253
- 1956 - 47,824,500
- 1957 - 33,071,282
- 1958 - 9,564,580
- 1959 - 14,080,319
- 1960 - 13,831,782
- 1961 - 37,735,315
- 1962 - 35,129,903
- 1963 - 25,580,000
- 1964 - 16,313,000
- 1965 - 48,723,000
- 1966 - 84,547,000
- 1967 - 22,000,000
- 1970 - 750,476 (Proof Only)
# HMS Courageous (50)
HMS Courageous was the lead ship of her class of three battlecruisers built for the Royal Navy in the First World War. Designed to support the Baltic Project championed by First Sea Lord John Fisher, the ship was very lightly armoured and armed with only a few heavy guns. Courageous was completed in late 1916 and spent the war patrolling the North Sea. She participated in the Second Battle of Heligoland Bight in November 1917 and was present when the German High Seas Fleet surrendered a year later.
Courageous was decommissioned after the war, then rebuilt as an aircraft carrier in the mid-1920s. She could carry 48 aircraft, compared with 36 carried by her half-sister Furious on about the same displacement. After recommissioning she spent most of her career operating off Great Britain and Ireland. She briefly became a training ship, but reverted to her normal role a few months before the start of the Second World War in September 1939. A German U-boat sank Courageous by torpedo later that month, with the loss of more than 500 of her crew.
## Origin and construction
In the First World War, Admiral Fisher was prevented from ordering an improved version of the preceding Renown-class battlecruisers by a 1915 wartime restriction that banned the construction of ships larger than light cruisers. To obtain ships suitable for the doctrinal roles of battlecruisers, such as scouting for fleets and hunting enemy raiders, he settled on ships with the minimal armour of a light cruiser and the armament of a battlecruiser. He justified their existence by claiming he needed fast, shallow-draught ships for his Baltic Project, a plan to invade Germany via its Baltic coast.
Courageous had an overall length of 786 feet 9 inches (239.8 m), a beam of 81 feet (24.7 m), and a draught of 25 feet 10 inches (7.9 m) at deep load. She displaced 19,180 long tons (19,490 t) at load and 22,560 long tons (22,922 t) at deep load. Courageous and her sisters were the first large warships in the Royal Navy to have geared steam turbines. To save design time, the installation used in the light cruiser Champion, the first cruiser in the navy with geared turbines, was simply replicated for four turbine sets. The Parsons turbines were powered by eighteen Yarrow small-tube boilers. They were designed to produce a total of 90,000 shaft horsepower (67 MW) at a working pressure of 235 psi (1,620 kPa; 17 kgf/cm<sup>2</sup>). The ship reached an estimated 30.8 knots (57.0 km/h; 35.4 mph) on sea trials.
The ship's normal design load was 750 long tons (762 t) of fuel oil, but she could carry a maximum of 3,160 long tons (3,211 t). At full capacity, she could steam for an estimated 6,000 nautical miles (11,110 km; 6,900 mi) at a speed of 20 knots (37 km/h; 23 mph).
Courageous carried four BL 15-inch Mk I guns in two hydraulically powered twin gun turrets, designated 'A' and 'Y' from front to rear. Her secondary armament consisted of eighteen BL 4-inch Mk IX guns mounted in six manually powered triple mounts. Each mount placed its three breeches too close together, causing the 23 loaders to get in one another's way and preventing the intended high rate of fire. A pair of QF 3-inch 20 cwt anti-aircraft guns were fitted abreast the mainmast on Courageous. She mounted two submerged tubes for 21-inch torpedoes and carried 10 torpedoes for them.
## First World War
Courageous was laid down on 26 March 1915, launched on 5 February 1916 and completed on 4 November. On her sea trials later that month, she sustained structural damage while running at full speed in a rough head sea; the exact cause is uncertain. The forecastle deck was deeply buckled in three places between the breakwater and the forward turret. The side plating was visibly buckled between the forecastle and upper decks. Water had entered the submerged torpedo room and rivets had sheared in the angle irons securing the deck armour in place. The ship was stiffened with 130 long tons (130 t) of steel in response. As of 23 November 1916, she cost £2,038,225 to build.
Upon commissioning, Courageous was assigned to the 3rd Light Cruiser Squadron of the Grand Fleet. She became flagship of the 1st Cruiser Squadron near the end of 1916 when that unit was re-formed after most of its ships had been sunk at the Battle of Jutland in May. The ship was temporarily fitted as a minelayer in April 1917 by the addition of mine rails on her quarterdeck that could hold over 200 mines, but never laid any mines. In mid-1917, she received half a dozen torpedo mounts, each with two tubes: one mount on each side of the mainmast on the upper deck and two mounts on each side of the rear turret on the quarterdeck. On 30 July 1917, Rear-Admiral Trevylyan Napier assumed command of the 1st Cruiser Squadron and was appointed Acting Vice-Admiral Commanding the Light Cruiser Force until he was relieved on 26 October 1918.
On 16 October 1917, the Admiralty received word of German ship movements, possibly indicating a raid. Admiral Beatty, the commander of the Grand Fleet, ordered most of his light cruisers and destroyers to sea in an effort to locate the enemy ships. Courageous and Glorious were not initially included amongst them, but were sent to reinforce the 2nd Light Cruiser Squadron patrolling the central part of the North Sea later that day. Two German Brummer-class light cruisers managed to slip through the gaps between the British patrols and destroy a convoy bound for Norway on the morning of 17 October, but no word was received of the engagement until that afternoon. The 1st Cruiser Squadron was ordered to intercept, but was unsuccessful as the German cruisers were faster than expected.
### Second Battle of Heligoland Bight
Throughout 1917 the Admiralty was becoming more concerned about German efforts to sweep paths through the British-laid minefields intended to restrict the actions of the High Seas Fleet and German submarines. A preliminary raid on German minesweeping forces on 31 October by light forces destroyed ten small ships. Based on intelligence reports, the Admiralty allocated the 1st Cruiser Squadron on 17 November 1917, with cover provided by the reinforced 1st Battlecruiser Squadron and distant cover by the battleships of the 1st Battle Squadron, to destroy the minesweepers and their light cruiser escorts.
The German ships—four light cruisers of II Scouting Force, eight destroyers, three divisions of minesweepers, eight Sperrbrechers (cork-filled trawlers) and two other trawlers to mark the swept route—were spotted at 7:30 am. Courageous and the light cruiser Cardiff opened fire with their forward guns seven minutes later. The Germans responded by laying an effective smoke screen. The British continued in pursuit, but lost track of most of the smaller ships in the smoke and concentrated fire on the light cruisers. Courageous fired 92 fifteen-inch shells and 180 four-inch shells in the battle, and the only damage she received was from her own muzzle blast. One fifteen-inch shell hit a gun shield of the light cruiser SMS Pillau but did not affect her speed. At 9:30 the 1st Cruiser Squadron broke off their pursuit so that they would not enter a minefield marked on their maps; the ships turned south, playing no further role in the battle.
After the battle, the mine fittings on Courageous were removed, and she spent the rest of the war intermittently patrolling the North Sea. In 1918, short take-off platforms were fitted for a Sopwith Camel and a Sopwith 11⁄2 Strutter on both 15-inch (380 mm) turrets. The ship was present at the surrender of the German High Seas Fleet on 21 November 1918. Courageous was placed in reserve at Rosyth on 1 February 1919 and again became Napier's flagship when he was appointed Vice-Admiral Commanding the Rosyth Reserve, a post he held until 1 May. The ship was assigned to the Gunnery School at Portsmouth the following year as a turret drill ship. She became flagship of the Rear-Admiral Commanding the Reserve at Portsmouth in March 1920. Captain Sidney Meyrick became her Flag Captain in 1920; he was relieved by Captain John Casement in August 1921.
## Between the wars
### Conversion
The Washington Naval Treaty of 1922 severely limited capital ship tonnage, and the Royal Navy was forced to scrap many of its older battleships and battlecruisers. The treaty allowed the conversion of existing ships totalling up to 66,000 long tons (67,059 t) into aircraft carriers, and the Courageous class's combination of a large hull and high speed made these ships ideal candidates. The conversion of Courageous began on 29 June 1924 at Devonport. Her fifteen-inch turrets were placed into storage and reused in the Second World War for HMS Vanguard, the Royal Navy's last battleship. The conversion into an aircraft carrier cost £2,025,800.
The ship's new design improved on her half-sister HMS Furious, which lacked an island and a conventional funnel. All superstructure, guns, torpedo tubes, and fittings down to the main deck were removed. A two-storey hangar was built on top of the remaining hull; each level was 16 feet (4.9 m) high and 550 feet (167.6 m) long. The upper hangar level opened onto a short flying-off deck, below and forward of the main flight deck. The flying-off deck improved launch and recovery cycle flexibility until new fighters requiring longer takeoff rolls made the lower deck obsolete in the 1930s. Two 46-by-48-foot (14.0 m × 14.6 m) lifts were installed fore and aft in the flight deck. An island with the bridge, flying control station and funnel was added on the starboard side, since islands had been found not to contribute significantly to turbulence. By 1939 the ship could carry 34,500 imperial gallons (157,000 L; 41,400 US gal) of petrol for her aircraft.
Courageous received a dual-purpose armament of sixteen QF 4.7-inch Mk VIII guns in single HA Mark XII mounts. Each side of the lower flight deck had a mount, and two were on the quarterdeck. The remaining twelve mounts were distributed along the sides of the ship. In refits in the mid-1930s, Courageous received three quadruple Mk VII mounts for 40-millimetre (1.6 in) 2-pounder "pom-pom" anti-aircraft guns, two of which were transferred from the battleship Royal Sovereign. Each side of the flying-off deck had a mount, forward of the 4.7-inch guns, and one was behind the island on the flight deck. She also received four water-cooled .50-calibre Mk III anti-aircraft machine guns in a single quadruple mounting. This was placed in a sponson on the port side aft.
The reconstruction was completed on 21 February 1928, and the ship spent the next several months on trials and training before she was assigned to the Mediterranean Fleet, based at Malta, in which she served from May 1928 to June 1930. In August 1929, the Palestine riots broke out, and Courageous was ordered to respond. When she arrived off Palestine, her air wing was disembarked to carry out operations to help to suppress the disorder. The ship was relieved from the Mediterranean by Glorious and refitted from June to August 1930. She was assigned to the Atlantic and Home Fleets from 12 August 1930 to December 1938, aside from a temporary attachment to the Mediterranean Fleet in 1936. In the early 1930s, transverse arresting gear was installed, and she received two hydraulic aircraft catapults on the upper flight deck before March 1934. Courageous was refitted again between October 1935 and June 1936, during which she received her pom-pom mounts. She was present at the Coronation Fleet Review at Spithead on 20 May 1937 for King George VI. The ship became a training carrier in December 1938 when Ark Royal joined the Home Fleet. She was relieved of that duty by her half-sister Furious in May 1939. Courageous participated in the Portland Fleet Review on 9 August 1939.
### Air group
Courageous could carry up to 48 aircraft; following completion of her trials and embarking stores and personnel, she sailed for Spithead on 14 May 1928. The following day, a Blackburn Dart of 463 Flight made the ship's first deck landing. The Dart was followed by the Fairey Flycatchers of 404 and 407 Flights, the Fairey IIIFs of 445 and 446 Flights and the Darts of 463 and 464 Flights. The ship sailed for Malta on 2 June to join the Mediterranean Fleet.
From 1933 to the end of 1938 Courageous carried No. 800 Squadron, which flew a mixture of nine Hawker Nimrod and three Hawker Osprey fighters. 810, 820 and 821 Squadrons were embarked for reconnaissance and anti-ship attack missions in the same period. They flew the Blackburn Baffin, the Blackburn Shark, the Blackburn Ripon and the Fairey Swordfish torpedo bombers as well as Fairey Seal reconnaissance aircraft. As a deck landing training carrier, in early 1939 Courageous embarked the Blackburn Skua and Gloster Sea Gladiator fighters of 801 Squadron and the Swordfish torpedo bombers of 811 Squadron, although both of these squadrons were disembarked when the ship was relieved of her training duties in May.
## Second World War and sinking
Courageous served with the Home Fleet at the start of the Second World War with 811 and 822 Squadrons aboard, each squadron equipped with a dozen Fairey Swordfish. In the early days of the war, hunter-killer groups were formed around the fleet's aircraft carriers to find and destroy U-boats. On 31 August 1939 she went to her war station at Portland and embarked the two squadrons of Swordfish. Courageous departed Plymouth on the evening of 3 September 1939 for an anti-submarine patrol in the Western Approaches, escorted by four destroyers. On the evening of 17 September 1939, she was on one such patrol off the coast of Ireland. Two of her four escorting destroyers had been sent to help a merchant ship under attack, and all her aircraft had returned from patrols. The German submarine U-29, commanded by Captain-Lieutenant Otto Schuhart, stalked Courageous for more than two hours. The carrier then turned into the wind to launch her aircraft. This put the ship right across the bow of the submarine, which fired three torpedoes. Two of the torpedoes struck the ship on her port side before any aircraft took off, knocking out all electrical power, and she capsized and sank in 20 minutes with the loss of 519 of her crew, including her captain. The US cargo ship Collingsworth, the Ellerman Lines cargo ship Dido, and the Dutch ocean liner Veendam rescued survivors. The two escorting destroyers counterattacked U-29 for four hours, but the submarine escaped.
An earlier unsuccessful attack on Ark Royal by U-39 on 14 September, followed by the sinking of Courageous three days later, prompted the Royal Navy to withdraw its carriers from anti-submarine patrols. Courageous was the first British warship to be sunk by German forces. (The submarine Oxley had been sunk a week earlier by friendly fire from the British submarine Triton.) The commander of the German submarine force, Commodore Karl Dönitz, regarded the sinking of Courageous as "a wonderful success", and it led to widespread jubilation in the Kriegsmarine (German navy). Grand Admiral Erich Raeder, commander of the Kriegsmarine, directed that Schuhart be awarded the Iron Cross First Class and that all other members of the crew receive the Iron Cross Second Class.
# Tom Derrick
Thomas Currie "Diver" Derrick, (20 March 1914 – 24 May 1945) was an Australian soldier and a recipient of the Victoria Cross, the highest decoration for gallantry "in the face of the enemy" awarded to members of the British and Commonwealth armed forces. In November 1943, during the Second World War, Derrick was awarded the Victoria Cross for his assault on a heavily defended Japanese position at Sattelberg, New Guinea. During the engagement, he scaled a cliff face while under heavy fire and silenced seven machine gun posts, before leading his platoon in a charge that destroyed a further three.
Born in the Adelaide suburb of Medindie, South Australia, Derrick left school at the age of fourteen and found work in a bakery. As the Great Depression grew worse he lost his job and moved to Berri, working on a fruit farm before marrying in 1939. In July 1940, Derrick enlisted in the Second Australian Imperial Force, joining the 2/48th Battalion. He was posted to the Middle East, where he took part in the siege of Tobruk, was recommended for the Military Medal and promoted to corporal. Later, at El Alamein, Derrick was awarded the Distinguished Conduct Medal for knocking out three German machine gun posts, destroying two tanks, and capturing one hundred prisoners.
Derrick returned to Australia with his battalion in February 1943, before transferring to the South West Pacific Theatre where he fought in the battle to capture Lae. Back in Australia the following February he was posted to an officer cadet training unit, being commissioned lieutenant in November 1944. In April 1945 his battalion was sent to the Pacific island of Morotai, an assembly point for the Allied invasion of the Philippines. Engaged in action the following month on the heavily defended hill Freda on Tarakan Island, Derrick was hit by five bullets from a Japanese machine gun. He died from his wounds on 24 May 1945.
## Early life
Derrick was born on 20 March 1914 at the McBride Maternity Hospital in the Adelaide suburb of Medindie, South Australia, to David Derrick, a labourer from Ireland, and his Australian wife, Ada (née Whitcombe). The Derricks were poor, and Tom often walked barefoot to attend Sturt Street Public School and later Le Fevre Peninsula School. In 1928, aged fourteen, Derrick left school and found work in a bakery. By this time, he had developed a keen interest in sports, particularly cricket, Australian Rules Football, boxing and swimming; his diving in the Port River earned him the nickname of "Diver".
With the advent of the Great Depression, Derrick scraped a living from odd jobs—such as fixing bicycles and selling newspapers—to supplement his job as a baker. When the Depression worsened in 1931, Derrick lost his bakery job and, with friends, headed by bicycle for the regional town of Berri, approximately 225 kilometres (140 mi) away, in search of work. Jobs in Berri were hard to come by, and Derrick and two friends spent the next few months living in a tent on the banks of the Murray River. When the annual Royal Adelaide Show opened that year, Derrick went to the boxing pavilion to accept a challenge of staying upright for three rounds with the ex-lightweight champion of Australia. Although he was knocked down in the second round, he immediately got back to his feet and won the bet, albeit at the cost of a black eye and a few bruised ribs.
Eventually, towards the end of 1931, Derrick found work picking fruit at a vineyard in Winkie, a short distance outside Berri. He later moved on to a full-time job at a nearby fruit farm, remaining there for the next nine years. On 24 June 1939, Derrick married Clarance Violet "Beryl" Leslie—his "one true love" whom he had met at a dance in Adelaide seven years earlier—at St Laurence's Catholic Church, North Adelaide.
## Second World War
Derrick did not join up when war broke out in September 1939 but, like many Australians, enlisted after the fall of France in June 1940. He joined the Second Australian Imperial Force on 5 July 1940, and was posted to the 2/48th Battalion, 26th Brigade, as a private. Derrick first joined his unit at the Wayville Showgrounds, before basic training at Woodside. Derrick thrived on military life, but found discipline difficult to accept.
In October, the 2/48th Battalion paraded through the streets of Adelaide to Mitcham railway station before its embarkation for the Middle East. The battalion's voyage overseas was postponed until 17 November, when the unit boarded the SS Stratheden. The ship made a stop at Perth, where Derrick was confined on board for going absent without leave to sightsee. He was soon in more trouble, and was charged and fined for punching another soldier who taunted him over this incident.
### North Africa
On arrival in Palestine, the 2/48th Battalion encamped at El Kantara and began training in desert warfare. For relaxation, the battalion set up athletic events, and Derrick became well known for often winning cross-country races—and for organising a book on the outcomes. In March 1941, the unit went by train and truck to Alexandria, Egypt, then along the North African coast to Cyrenaica, in Libya, to join the 9th Australian Division.
After the 2/48th Battalion completed its training with the 9th Division at Cyrenaica, it was moved further along the coast to Gazala. Then, just as it began to dig in, the battalion was abruptly withdrawn to Tobruk in response to the German Afrika Korps' advance. The battalion entered Tobruk on 9 April 1941, and spent the following eight months besieged by Axis forces. While there, Derrick acquired an Italian Breda machine gun and regularly led fighting patrols against both German and Italian troops. Although Derrick's bravery was noted during the siege, he wrote in his diary about his constant fear of dying.
On the night of 30 April, the Axis forces assaulted Tobruk's outer defences and managed to capture substantial ground. In response, the 2/48th Battalion was ordered to counter-attack the following evening. During the ensuing engagement, Derrick fought as a section member in the far left flank of the attack. After suffering heavy casualties in what Derrick described as "a bobby dazzler of a fire fight", the battalion was forced to withdraw. Praised for his leadership and bravery during the assault, Derrick was immediately promoted to corporal, and recommended for the Military Medal, but the award was never made.
In late May, Derrick discovered a German posing as a British tank officer and reported him to company headquarters; the man was immediately arrested as a spy. Following a period of heavy fighting in June, the 2/48th Battalion was placed in reserve for a few days the following month. Promoted to platoon sergeant in September, Derrick—along with the rest of his battalion—was withdrawn from Tobruk and returned to Palestine aboard HMS Kingston on 22 October. Disembarking at Tel Aviv, they were given three days' leave in the city, before returning for training.
Following a period of rest and light garrison duties in Syria, the 2/48th Battalion was rushed to El Alamein, Egypt, to reinforce the British Eighth Army. During the First Battle of El Alamein on 10 July 1942, Derrick took part in the 26th Australian Brigade's attack on Tel el Eisa. In the initial assault, Derrick, against a barrage of German grenades, led an attack against three machine gun posts and succeeded in destroying the positions before capturing over one hundred prisoners. During the Axis counter-attack that evening, the Australian line was overrun by tanks. As the German infantry following the tanks advanced, Derrick's company led a charge against the men. During the engagement, Derrick managed to destroy two German tanks using sticky bombs. Commended for his "outstanding leadership and courage", Derrick was awarded the Distinguished Conduct Medal for his part in the fighting at Tel el Eisa. The award was announced in a supplement to the London Gazette on 18 February 1943.
Promoted to sergeant on 28 July, Derrick led a six-man reconnaissance on 3 October, successfully pinpointing several German machine gun positions and strongholds; this information was to be vital for the upcoming Second Battle of El Alamein. The El Alamein offensive was launched on 23 October, the 9th Australian Division taking part. At one point during the engagement, Derrick jumped up onto an Allied gun carrier heading towards the Germans. Armed with a Thompson submachine gun and under intense heavy fire, Derrick attacked and knocked out three machine gun posts while standing in the carrier. He then had the driver reverse up to each post so he could ensure each position was silenced. By the following morning, Derrick's platoon occupied all three posts. The members of the 2/48th Battalion who witnessed Derrick's action were sure he would be awarded the Victoria Cross, though no recommendation was made.
For part of 31 October, Derrick assumed command of his company after all of the unit's officers had been killed or wounded in fierce fighting. On 21 November 1942, Derrick was briefly admitted to the 2/3rd Australian Field Ambulance with slight shrapnel wounds to his right hand and buttock. Twelve days later, the 2/48th Battalion left El Alamein and returned to Gaza in Palestine, where, later that month, Derrick attended a corps patrolling course. In January 1943, the 2/48th Battalion sailed home to Australia, aboard the SS Nieuw Amsterdam along with the rest of the 9th Division.
### South West Pacific
Disembarking at Port Melbourne in late February 1943, Derrick was granted a period of leave and travelled by train to Adelaide, where he spent time with Beryl. He rejoined his battalion—now encamped in the outskirts of Adelaide—before they went by train to the Atherton Tableland for training in jungle warfare. Brought up to full strength by the end of April, the 2/48th Battalion completed its training following landing-craft exercises near Cairns. On 23 July, Derrick was attached to the 21st Brigade Headquarters, but was admitted to hospital later the same day for old injuries to his right eye. After his discharge from hospital, Derrick returned briefly to brigade headquarters before rejoining the 2/48th Battalion on 27 August.
For much of August, the 2/48th Battalion had been in training for the Allied attack on Lae, in Papua New Guinea. The unit's objective was to land on a strip of land designated as "Red Beach", and then fight their way approximately 30 kilometres (19 mi) west towards Lae. Following a bombardment by American destroyers, Derrick's wave landed on the beach with minimal casualties on 4 September. Ten days later, the 2/48th Battalion's C Company—led by Derrick's platoon—captured Malahang airstrip, before Lae fell to the Allies on 16 September. Derrick was scornful of the Japanese defence of Lae, and wrote in his diary that "our greatest problem was trying to catch up" with the retreating Japanese force.
#### Victoria Cross
Following Lae, the 9th Division was tasked to seize Finschhafen, clear the Huon Peninsula and gain control of the Vitiaz Strait. By 2 October, one of the division's brigades had gained a foothold on Finschhafen, but soon encountered fierce Japanese resistance. In response to a Japanese counter-attack, the 26th Brigade was transferred to reinforce the Australian position on 20 October and, when the division switched to the offensive in November, the brigade was ordered to capture Sattelberg. Sattelberg was a densely wooded hill rising 1,000 metres (1,100 yd) and dominating the Finschhafen region; it was in an assault on this position that Derrick was to earn the Victoria Cross.
The Australian attack on Sattelberg began in mid-November, the Japanese slowly giving ground and withdrawing back up the precipitous slopes. Each side suffered heavy casualties, and on 20 November, Derrick—who had been acting as company sergeant major for the previous month—was given command of B Company's 11 platoon after the unit had "lost all but one of their leaders". By 22 November, the 2/23rd and 2/48th Battalions had reached the southern slopes of Sattelberg, holding a position approximately 600 metres (660 yd) from the summit. A landslide had blocked the only road, so the final assault was made by infantry alone, without supporting tanks.
On 24 November, the 2/48th Battalion's B Company was ordered to outflank a strong Japanese position sited on a cliff face, before attacking a feature 140 metres (150 yd) from the Sattelberg township. The nature of the terrain meant that the only possible route was up a slope covered with kunai grass directly beneath the cliffs. Over a period of two hours, the Australians made several attempts to clamber up the slopes to reach their objective, but each time they were repulsed by intense machine gun fire and grenade attacks. As dusk fell, it appeared impossible to reach the objective or even hold the ground already gained, and the company was ordered to withdraw. Derrick responded to his company commander: "Bugger the CO [commanding officer]. Just give me twenty more minutes and we'll have this place. Tell him I'm pinned down and can't get out."
Moving forward with his platoon, Derrick attacked a Japanese post that had been holding up the advance. He destroyed the position with grenades and ordered his second section around to the right flank. The section soon came under heavy machine gun and grenade fire from six Japanese posts. Clambering up the cliff face under heavy fire, Derrick held on with one hand while lobbing grenades into the weapon pits with the other, like "a man ... shooting for [a] goal at basketball". Climbing further up the cliff and in full view of the Japanese, Derrick continued to attack the posts with grenades before following up with accurate rifle fire. Within twenty minutes, he had reached the peak and cleared seven posts, while the demoralised Japanese defenders fled from their positions to the buildings of Sattelberg.
Derrick then returned to his platoon, where he gathered his first and third sections in preparation for an assault on the three remaining machine gun posts in the area. Attacking the posts, Derrick personally rushed forward on four separate occasions and threw his grenades at a range of about 7 metres (7.7 yd), before all three were silenced. Derrick's platoon held their position that night, before the 2/48th Battalion moved in to take Sattelberg unopposed the following morning. The battalion commander insisted that Derrick personally hoist the Australian flag over the town; it was raised at 10:00 on 25 November 1943.
The final assault on Sattelberg became known within the 2/48th Battalion as 'Derrick's Show'. Although he was already a celebrity within the 9th Division, the action brought him to wide public attention. On 23 March 1944, the announcement and accompanying citation for Derrick's Victoria Cross appeared in a supplement to the London Gazette. It read:
> Government House, Canberra. 23rd March 1944.
>
> The KING has been graciously pleased to approve the award of the VICTORIA CROSS to:-
>
> Sergeant Thomas Currie Derrick, D.C.M., Australian Military Forces.
>
> For most conspicuous courage, outstanding leadership and devotion to duty during the final assault on Sattelberg in November, 1943.
>
> On 24th November, 1943, a company of an Australian Infantry Battalion was ordered to outflank a strong enemy position sited on a precipitous cliff-face and then to attack a feature 150 yards from the township of Sattelberg. Sergeant Derrick was in command of his platoon of the company. Due to the nature of the country, the only possible approach to the town lay through an open kunai patch situated directly beneath the top of the cliffs. Over a period of two hours many attempts were made by our troops to clamber up the slopes to their objective, but on each occasion the enemy prevented success with intense machine-gun fire and grenades.
>
> Shortly before last light it appeared that it would be impossible to reach the objective or even to hold the ground already occupied and the company was ordered to retire. On receipt of this order, Sergeant Derrick, displaying dogged tenacity, requested one last attempt to reach the objective. His request was granted.
>
> Moving ahead of his forward section he personally destroyed, with grenades, an enemy post which had been holding up this section. He then ordered his second section around on the right flank. This section came under heavy fire from light machine-guns and grenades from six enemy posts. Without regard for personal safety he clambered forward well ahead of the leading men of the section and hurled grenade after grenade, so completely demoralising the enemy that they fled leaving weapons and grenades. By this action alone the company was able to gain its first foothold on the precipitous ground.
>
> Not content with the work already done, he returned to the first section, and together with the third section of his platoon advanced to deal with the three remaining posts in the area. On four separate occasions he dashed forward and threw grenades at a range of six to eight yards until these positions were finally silenced.
>
> In all, Sergeant Derrick had reduced ten enemy posts. From the vital ground he had captured the remainder of the Battalion moved on to capture Sattelberg the following morning.
>
> Undoubtedly Sergeant Derrick's fine leadership and refusal to admit defeat, in the face of a seemingly impossible situation, resulted in the capture of Sattelberg. His outstanding gallantry, thoroughness and devotion to duty were an inspiration not only to his platoon and company but to the whole Battalion.
#### Later war service
The 2/48th Battalion remained at Sattelberg until late December 1943, when it returned to the coast to regroup. On Christmas Eve, Derrick noted in his diary that the next day would be his "4th Xmas overseas" and "I don't care where I spend the next one I only hope I'm still on deck [alive]". On 7 February 1944, the battalion sailed from Finschhafen for Australia, disembarking at Brisbane. Granted home leave, Derrick made his way to South Australia for a short period with Beryl. In April, he was admitted to hospital suffering from malaria before returning to his battalion the following month. During this time, he was charged with being absent without leave and subsequently forfeited a day's pay.
On 20 August 1944, Derrick was posted to an officer cadet training unit in Victoria. He requested that he be allowed to rejoin the 2/48th Battalion at the end of the course, contrary to normal Army policy, which prevented officers commissioned from the ranks from returning to their previous units. An exemption was granted to Derrick only after much lobbying. While at this unit, Derrick shared a tent with Reg Saunders, who later became the Army's first Indigenous Australian officer.
Commissioned as a lieutenant on 26 November 1944, Derrick was granted twenty-four days' leave. When he returned to the 2/48th Battalion as a reinforcement officer, his appointment as a platoon commander in his old company was met with "great jubilation". During this period, the battalion had been posted to Ravenshoe on the Atherton Tablelands for "an extensive training period", before being transported from Cairns to Morotai during April 1945. It was around this time that Derrick converted from his Church of England religious denomination and Salvationist beliefs to Catholicism—his wife's religion—though he was not overtly religious.
On 1 May 1945, Derrick took part in the landing at Tarakan, an island off the coast of Borneo. Under the cover of a naval and aerial bombardment, he led his men ashore in the initial waves of the landing, where they were posted at the boundary of the 2/48th Battalion and 2/24th Battalion's area of responsibility. The Japanese force on the island mounted a determined resistance, and Derrick was later quoted in the Sunday Sun as saying he had "never struck anything so tough as the Japanese on Tarakan".
Slowly pushing inland, the 2/48th Battalion's main task from 19 May was to capture a heavily defended hill code-named Freda. Derrick's platoon unsuccessfully probed Japanese positions on that day and the next, at a loss of two men killed with others wounded. He later recorded in his diary that these setbacks were a "bad show". On 21 May, Derrick and Lieutenant Colonel Bob Ainslie, the 2/48th Battalion's commander, debated the optimum size of the unit which should be used to capture the Freda position. Derrick successfully argued that a company was best, given the restrictions posed by the terrain. He was in high spirits that night, possibly in an attempt to lift his platoon's morale. On 22 May, Derrick's was one of two platoons that attacked a well-defended knoll and captured the position. Derrick played a key role in this action, and coordinated both platoons during the final assault that afternoon.
After capturing the knoll, the two platoons—reinforced by two sections of the 2/4th Commando Squadron—dug in to await an expected Japanese counter-attack. At about 03:30 on 23 May, a Japanese light machine gun fired into the Australian position. Derrick sat upright to see if his men were all right, and was hit by five bullets from the gun's second burst, which struck him from his left hip to the right of his chest. His runner, "Curly" Colby, dragged him behind cover, but Derrick could not be immediately evacuated as Japanese troops attacked at about 04:00. Derrick was in great pain, and told Colby that he had "had it". Despite his wounds, he continued to issue orders for several hours. When day broke, it was discovered that Derrick's platoon were directly overlooked by a Japanese bunker—though this would not have been visible during the assault late the previous evening.
When stretcher bearers reached the position at dawn, Derrick insisted that the other wounded be attended to first. Derrick was carried off Freda later that morning and was met by the 26th Brigade's commander, Brigadier David Whitehead. The two men briefly conversed before Derrick excused himself, fearing that he did not have much time left and wishing to see the padre. Stepping back, Whitehead saluted and sent for Father Arch Bryson. At the hospital, surgeons found that bullets had torn away much of Derrick's liver; he died on 24 May 1945 during a second operation on his wounds. He was buried in the 2/48th Battalion's cemetery on Tarakan that afternoon, and later re-interred at the Labuan War Cemetery, plot 24, row A, grave 9.
## Legacy
Tom Derrick was widely mourned. His widow, Beryl, became prostrate with grief on hearing of his death; many members of the Army were affected, with one soldier lamenting it felt as if "the whole war stopped". By the time Derrick's death was officially announced on 30 May, most Australians on Tarakan had heard the news and rumours had spread claiming that he had been speared or shot at short range by a sub-machine gun.
The Japanese force on Tarakan learned of Derrick's death and tried to exploit it for propaganda purposes. They printed a leaflet which began "We lament over the death of Lieutenant General Terick CinC of Allied Force in Tarakan" and later included the question "what do you think of the death in action of your Commander in Chief ...?" This leaflet reached few Australian soldiers, and had little impact on them. "Tokyo Rose" also broadcast taunts over "Terick's" death.
Derrick's reputation continued to grow after his death, and many Australian soldiers recalled any association, however slight, they had with him. To many Australians, he embodied the 'ANZAC spirit', and he remains perhaps the best-known Australian soldier of the Second World War. Historian Michael McKernan later remarked that, for his war service, Derrick had arguably deserved "a VC and two bars ... at El Alamein, at Sattelberg and now at Tarakan". In a 2004 television interview, then Chief of the Australian Defence Force, General Peter Cosgrove, was asked "Who was the best soldier of all time?" After a short pause, he replied: "Diver Derrick". This sentiment was endorsed by General Sir Francis Hassett. Hassett—who, as a lieutenant colonel, had served at Finschhafen with II Corps headquarters—stated:
> From what I learnt; not only was Derrick a magnificent soldier, but also a splendid leader who, immediately he saw a tactical problem, fixed it with either personal bravery or leadership imbued with determination and common sense.
Derrick is also remembered for his personal qualities. He was sensitive and reflective. Despite a limited education, he was a "forceful and logical debater, with a thirst for knowledge". While on active service, Derrick kept a diary, composed poetry, collected butterflies and frequently wrote to his wife. Historian Peter Stanley has compared Derrick's leadership abilities with those of Edward 'Weary' Dunlop, Ralph Honner and Roden Cutler.
On 7 May 1947, Beryl Derrick attended an investiture ceremony at Government House, Adelaide, where she was presented with her late husband's Victoria Cross and Distinguished Conduct Medal by the Governor of South Australia, Lieutenant General Sir Charles Norrie. Derrick's Victoria Cross and other medals are now displayed at the Australian War Memorial, Canberra, along with a portrait by Sir Ivor Hele. A street in the neighbouring suburb of Campbell and a rest stop in the Remembrance Driveway between Sydney and Canberra were also named in his honour. In 1995, a public park on Carlisle Street, Glanville, was named the Derrick Memorial Reserve in his honour, and his VC citation is displayed on a plaque there. In June 2008, a newly built bridge over the Port River on the Port River Expressway was named the Tom 'Diver' Derrick Bridge following a public campaign.
# Fôrça Bruta
Fôrça Bruta is the seventh studio album by Brazilian singer-songwriter and guitarist Jorge Ben. It was recorded with the Trio Mocotó band and released by Philips Records in September 1970. Conceived at a time of political tension in dictatorial Brazil, its title comes from the Portuguese term meaning "brute force" and has been interpreted ironically due to the music's relatively relaxed style.
The album introduced an acoustic samba-based music that is mellower, moodier, and less ornate than Ben's preceding work. Its largely unrehearsed, nighttime recording session found the singer improvising with Trio Mocotó's groove-oriented accompaniment while experimenting with unconventional rhythmic arrangements, musical techniques, and elements of soul, funk, and rock. Ben's lyrics generally explore themes of romantic passion, melancholy, and sensuality, with women figuring prominently in his songs. In a departure from the carefree sensibility of his past releases, they also feature elements of identity politics and postmodernism, such as irony and reimagining of established idioms.
A commercial and critical success, Fôrça Bruta established Ben as a leading artist in Brazil's Tropicália movement and pioneered a unique sound later known as samba rock. Renowned among collectors and musicians but relatively rare outside of its country of origin, the album was released for the first time in the United States in 2007 by the specialty label Dusty Groove America, attracting further critical recognition. That same year, Rolling Stone Brasil named it among its 100 greatest Brazilian albums.
## Background
In 1969, Jorge Ben re-signed to Philips Records after a four-year absence from the label due to creative differences and recorded his self-titled sixth album. It featured songs performed with Trio Mocotó as his backing band; Ben had met the vocal/percussion group while touring the nightclub circuit in São Paulo in the late 1960s. The band's members were Fritz Escovão (who played the cuíca), Nereu Gargalo (tambourine), and João Parahyba (drums and percussion). The album was a commercial comeback for Ben, and its success created a busy schedule for all four musicians. This "hectic" period led music critic John Bush to believe that it may have contributed to the relaxed samba-soul recording that became Fôrça Bruta.
## Recording and production
Ben regrouped with Trio Mocotó in 1970 to record the album. They held one nighttime session without rehearsing most of the songs beforehand. According to Parahyba, this was intended to give listeners an impression of the mood that developed as they played in the studio.
During the session, Ben first sang his vocal for a song before the accompanying instrumentation was recorded. He played the acoustic guitar for the instrumentals, and specifically the ten-string viola caipira for the songs "Apareceu Aparecida" and "Mulher Brasileira". He also repurposed a tuning fork, a device traditionally used by musicians to keep instruments in tune; the singer instead stimulated it with his mouth to generate sounds that resembled a harmonica.
For their part, Trio Mocotó attempted to develop a distinctive groove with a rhythm that would suit the rock or "iê-iê-iê" feel of Ben's guitar playing. The band played several percussion instruments, including the atabaque and bell plates. For "Charles Jr." and other tracks, Parahyba used the whistle of his sister's electric toy train as a horn instrument, breaking it in the process.
String and horn sections were recorded and included in the final mix but went uncredited in the album's packaging. The packaging credited C.B.D. in Rio de Janeiro and Scatena in São Paulo as the recording locations for Fôrça Bruta, which takes its name from the Portuguese phrase for "brute force". According to Robert Leaver of Amoeba Music's international records department, "one can see a sly irony" in the title, considering the heightened political tension in dictatorial Brazil at the time and the gentle quality of Ben's music for the album.
## Music
Fôrça Bruta has a pervasive sense of melancholy and idiosyncratic contrast, according to Brazilian music scholar Pedro Alexandre Sanches, who identifies each composition on the album as either a samba, samba lament, or "samba-banzo". Greg Caz, a disc jockey specializing in Brazilian music, recognizes this quality as not only melancholic but mysterious and departing from the carefree sensibility that had been the singer's trademark. This is demonstrated in the lyrics, melodies, arrangements, and Ben's "devilish" guitar figures, with "Oba, Lá Vem Ela" and "Domênica Domingava" cited by Sanches as examples. Ben's guitar playing, more developed and prominently featured on this album, leads music journalist Jacob McKean to find the sound altogether subtler and "stripped down" when compared to his previous records, while colored by a "somewhat crunchy, folksy tone" established in the opening songs "Oba, Lá Vem Ela" and "Zé Canjica".
Overall, the songs are longer and more groove-based than on Ben's previous self-titled album. They also experiment with unconventional percussive arrangements, particularly on the cuíca-driven "O Telefone Tocou Novamente" and "Zé Canjica" (with its drum cadence), resulting in rhythmic contrasts between Trio Mocotó and Ben's instruments. This rhythmic direction departs from his earlier music's innovative "chacatum, chacatum" beat, which had become popular and widely imitated by the time of the album.
While still samba-based with hints of bossa nova, Fôrça Bruta also adds understated funk and soul elements in the form of horn and string arrangements. Horn riffs are arranged in the style of Sérgio Mendes on "Pulo, Pulo", in the style of Stax Records on "O Telefone Tocou Novamente", and on the title track, which appropriates the groove of the 1968 Archie Bell & the Drells song "Tighten Up". On "Mulher Brasileira", a string section is heard playing swirling patterns around Escovão's cuíca, while the more uptempo rhythms of "Charles Jr." and "Pulo, Pulo" are given contrast by more relaxed string melodies.
### Vocals
Another source of contrast and funk/soul influence is Ben's singing, which McKean describes as "more intimate" than in the past. Along with his characteristic wails and croons, he exhibits a newfound raspy texture in his typically languid and nasal vocal. His singing also functions as an additional element of rhythm to some songs. According to Peter Margasak, Ben can be heard "reinforcing the rhythmic agility of his songs with pin-point phrasing, surprising intervallic leaps, and a plaintive kind of moan".
On "Zé Canjica" and "Charles Jr.", Ben improvises phrases (such as "Comanchero" and "the mama mama, the mama say") as rhythmic accompaniment during otherwise instrumental sections of the songs. The name "Comanche" is also implored by Ben in moments on the album. As Parahyba explains, it is a nickname given to him by Ben, who originally recorded it as a joke on "Charles Jr." A different explanation came in the form of a lyric in Ben's 1971 song "Comanche": "My mother calls me / Comanche".
### Lyrics
Women are central figures in Ben's lyrics throughout the album, especially in "Mulher Brasileira", "Terezinha", and "Domênica Domingava"; "Domênica" is a variation on Domingas, the surname of his wife and muse Maria. His preoccupation with female characters led Sanches to identify Fôrça Bruta's predominant theme as Ben's "Dionysian body", referring to the philosophical concept of a body that can submit to passionate chaos and suffering before overcoming itself.
Several of the songs deal with romantic disappointment. In "Zé Canjica", the narrator apologizes for being confused, sad, and moody while sending away a lover he feels he does not deserve. "O Telefone Tocou Novamente" expresses grief and pity over an angry lover ringing the phone of the narrator, who leaves to meet her, only not to find her. During the song, Sanches observes a moment of catharsis by Ben, who raises his singing voice to an almost crying falsetto.
Ben's lyrics also appropriate thematic devices from the popular imagination. Sanches compares the verses of the caipira-influenced samba "Apareceu Aparecida" and "Pulo, Pulo" to songs from ciranda, a traditional Brazilian children's dance. In "Apareceu Aparecida" – which employs the "rolling stone" idiom – the narrator rediscovers the euphoric joy of living after his beloved has accepted him again; this leads Sanches to conclude that Ben sings of hedonism in a concentrated state.
Some songs feature expressions of political values. The nationalistic "Mulher Brasileira" celebrates Brazilian women regardless of their physical appearance and is cited by Brazilian journalist Gabriel Proiete de Souza as an early example of Ben's attempt to empower Afro-Brazilian women through his music. In Caz's opinion, the lyrics on Fôrça Bruta reveal deeper concerns than were found in the singer's previous recordings, shown most notably by "Charles Jr.", in which Ben explores his identity as an artist and as a black man. Brazilian music academic Rafael Lemos believes it demonstrates Ben's process of placing "black heritage into modernity", in the aftermath of slavery in Brazil and the continued marginalization of black people there. According to one translation of the lyrics, the narrator proclaims:
> My name is Charles Jr.
> And I'm an angel too
> But I don't want to be the first
> Nor be better than anybody
> I just want to live in peace
> And be treated as an equal among equals
> For in exchange of my love and affection
> I want to be understood and taken into consideration
> And, if possible, loved as well
> 'Cause it doesn't matter what I have
> I'm no longer what my brothers once were, no, no
> I was born of a free womb
> Born of a free womb in the 20th century
> I have love and faith
> To go into the 21st century
> Where the conquests of science, space and medicine
> And the brotherhood of all human beings
> And the humbleness of a king
> Will be the weapons of victory
> For universal peace
> And the whole world will hear
> And the whole world will know
> That my name is Charles Jr.
> And I'm an angel too.
"Charles Jr." and other songs also use elements of postmodernism, such as self-reference, irony, and surrealism (as in the lyrics of "Pulo, Pulo"). Some of Fôrça Bruta's characters and stories had appeared on Ben's earlier work, albeit in slightly different manifestations. On his 1969 album, "Charles" was depicted as a heroic Robin Hood-like figure of the country. The sensually primitive "Domingas" and "Teresa", also from the previous record, are rendered here as the more sophisticated "Domênica" and the irreverent "Terezinha", respectively. Ben sings the latter song in an exceptionally nasal voice interpreted by Sanches as an ironic caricature of música popular brasileira.
## Reception and impact
Fôrça Bruta was released by Philips in September 1970. It was received favorably in Veja magazine, whose reviewer found it impressively rhythmic, full of musical surprises and suspense, and comparable to a comic book in the way familiar fantasies and characters are reformulated in strange yet delightful directions. Commercially, it was a top-10 chart success in Brazil and produced the hit singles "O Telefone Tocou Novamente" and "Mulher Brasileira".
The album's performance established Ben as an integral artist in Brazil's Tropicália movement, led by fellow musicians Caetano Veloso and Gilberto Gil. The following year, on his next album, Negro É Lindo, Ben delved further into the black identity politics of "Charles Jr." while retaining the melancholic musical quality of the previous record.
Fôrça Bruta's fusion of Trio Mocotó's groove and Ben's more rockish guitar proved to be a distinctive feature of what critics and musicians later called samba rock. Its soul and funk elements, most prominent in the title track, helped earn the album a respected reputation among soul enthusiasts and rare-record collectors. In an interview for Guy Oseary's On the Record (2004), music entrepreneur and record collector Craig Kallman named Fôrça Bruta among his 15 favorite albums. Recording artist Beck also named it one of his favorite albums.
### Reissue and reappraisal
In 2007, Fôrça Bruta was re-released by Dusty Groove America, a specialty label in Chicago that reissued rare funk, jazz, soul, and Brazilian music titles in partnership with Universal Music. The reissue marked the first time the album had seen release in the United States. Dusty Groove asked Chicago Reader critic Peter Margasak to write liner notes for the release, but he declined, citing in part the lack of American literature available on Ben. New York-based retailer Other Music later named it the fourth best reissue of 2007 and one of Ben's "deepest, most emotional albums". That same year, Fôrça Bruta was ranked 61st on Rolling Stone Brasil's list of the 100 greatest Brazilian albums. In an essay accompanying the ranking, journalist Marcus Preto called it the singer's most melancholy album.
In a retrospective review for AllMusic, John Bush regards Fôrça Bruta as one of Ben's best records and gives it four-and-a-half out of a possible five stars. In his estimation, it retained each participating musician's abilities over the course of "a wonderful acoustic groove that may have varied little but was all the better for its agreeable evenness". A reviewer for The Boston Globe says Ben's masterful performance of this music – "a fusion of bright samba and mellow soul" – still sounds original and essential more than 30 years after its recording. Recommended even for non-Lusophones, it "transcends language and era with an organic vibe and breezy spontaneity", in his opinion.
Other reviews praise Fôrça Bruta as a "samba-soul heater" (Now's Tim Perlich) and "one of the most buoyantly textured and warmly melodic LPs ever recorded" (Matthew Hickey from Turntable Kitchen), with Hickey naming "Oba, Lá Vem Ela" among its "loveliest tunes". In Impose magazine, Jacob McKean highlights the two opening tracks, finding "Zé Canjica" particularly attractive, and believes that "Apareceu Aparecida" features the album's most appealing hook. He also finds Trio Mocotó incomparable in their performance and the album to be elegant and exquisite overall. However, he quips that Ben's nasal singing on "Terezinha" sounds unusual and that the string section is given slightly too much emphasis on "Mulher Brasileira".
Less enthusiastic about the album is Stylus Magazine's Mike Powell, who writes that it has "a kind of aesthetic gentility" that characterizes most Brazilian music and polarizes its listeners as a consequence. Powell adds that, while his cavil may be silly, Fôrça Bruta remains "demure samba-rock laced with sliding strings, an agreeable, samey atmosphere, no strife on the horizon", assigning it a letter grade of "B-minus". According to Peter Shapiro, it may be "too dainty" or not adventurous enough for some listeners, lacking the stylistically eclectic abandon of other Tropicália music. But in his appraisal in The Wire, he judges the album to be "something of a minor masterpiece of textural contrast" and "a stone cold classic of Brazilian modernism", representative of the country's flair for "weaving beguiling syncretic music from practically any cloth".
Having discovered Ben's music in 2009, indie rock musician Andrew Bird writes in a guest column for Time that Fôrça Bruta is a classic of "raw and soulful Tropicália". He also observes in Ben's singing a "pleading quality" that projects a simultaneous sense of melancholy and delight. Alynda Segarra of Hurray for the Riff Raff listened to it while making her band's 2017 album The Navigator, later citing Fôrça Bruta's string arrangements as an influence on her "cinematic" approach to the album's lyrics.
## Track listing
All songs were composed by Jorge Ben.
## Personnel
Credits are adapted from the album's liner notes.
- Jorge Ben – guitar, vocals
Trio Mocotó
- Fritz Escovão – cuíca
- Nereu Gargalo – percussion
- João Parahyba – drums
Production
- Ari Carvalhaes – engineering
- Manoel Barenbein – production
- Chris Kalis – reissue production
- João Kibelkstis – engineering
- João Moreira – engineering
## Charts
## See also
- 1970s in Latin music
- Cinematic soul
- Jovem Guarda
- Music of Brazil
- Postmodern music
# Green Park tube station
Green Park is a London Underground station located on the edge of Green Park, with entrances on both sides of Piccadilly. The station is served by three lines: Jubilee, Piccadilly and Victoria. On the Jubilee line the station is between Bond Street and Westminster stations, on the Piccadilly line it is between Hyde Park Corner and Piccadilly Circus stations, and on the Victoria line it is between Victoria and Oxford Circus stations. It is in fare zone 1.
The station was opened in 1906 by the Great Northern, Piccadilly and Brompton Railway (GNP\&BR) and was originally named Dover Street due to its location in that street. It was modernised in the 1930s when lifts were replaced with escalators and extended in the 1960s and 1970s when the Victoria and Jubilee lines were constructed.
The station is near The Ritz Hotel, the Royal Academy of Arts, St James's Palace, Berkeley Square, Bond Street, the Burlington Arcade and Fortnum & Mason, and is one of two serving Buckingham Palace (the other being St James's Park).
## History
### Piccadilly line
#### Rival schemes
During the final years of the 19th century and the early years of the 20th century numerous competing schemes for underground railways through central London were proposed. A number of the schemes submitted to parliament for approval as private bills included proposals for lines running under Piccadilly with stations in the area of the current Green Park station.
The first two proposals came before parliament in 1897. The Brompton and Piccadilly Circus Railway (B\&PCR) proposed a line between South Kensington and Piccadilly Circus and the City and West End Railway (C\&WER) proposed a line between Hammersmith and Cannon Street. The B\&PCR proposed a station on the north side at Dover Street and the C\&WER proposed a station on the south side at Arlington Street. Following review by parliament, the C\&WER bill was rejected and the B\&PCR bill was approved and received royal assent in August 1897.
In 1902, the Charing Cross, Hammersmith and District Railway (CCH\&DR) proposed a line between Charing Cross and Barnes with a parallel shuttle line running between Hyde Park Corner and Charing Cross. A station was planned at Walsingham House on the north-east corner of Green Park. This scheme was rejected by parliament.
The same year, the Central London Railway (CLR, now the central section of the Central line) submitted a bill that aimed to turn its line running between Shepherd's Bush and Bank into a loop by constructing a second roughly parallel line to the south. This would have run along Piccadilly with a station at St James's Street just to the east of Dover Street. Delayed while a royal commission considered general principles of underground railways in London, the scheme was never fully considered and although it was re-presented in 1903, it was dropped two years later.
A third scheme for 1902 was the Piccadilly, City and North East London Railway (PC\&NELR) which proposed a route between Hammersmith and Southgate. It planned a station at Albemarle Street, just to the east of Dover Street. Although favoured in parliament and likely to be approved, this scheme failed due to a falling-out between the backers and the sale of part of the proposals to a rival.
In 1905, some of the promoters of the PC\&NELR regrouped and submitted a proposal for the Hammersmith, City and North East London Railway. As the CLR had done previously, the company proposed a station at St James's Street. Owing to failures in the application process, this scheme was also rejected.
#### Construction and opening
While the various rival schemes were unsuccessful in obtaining parliamentary approval, the B\&PCR was unsuccessful in raising the funds needed to construct its line. It was not until after the B\&PCR had been taken over by Charles Yerkes's Metropolitan District Electric Traction Company that the money became available. Tunnelling began in 1902 shortly before the B\&PCR was merged with the Great Northern and Strand Railway to create the Great Northern, Piccadilly and Brompton Railway (GNP\&BR, the predecessor of the Piccadilly line).
The GNP\&BR opened the station on 15 December 1906 as Dover Street. As with most of the other GNP\&BR stations, the station building, on the east side of Dover Street, was designed by Leslie Green. It featured the company's standard red glazed terracotta facade with wide semi-circular arches at first-floor level. Platform and passageway walls were decorated in glazed cream tiles in Green's standard arrangement with margins, patterning and station names in mid-blue. When it opened, the station to the west was Down Street. The station was provided with four Otis electric lifts paired in two 23-foot (7.0 m) diameter shafts and a spiral stair in a smaller shaft. The platforms are 27.4 metres (90 ft) below the level of Piccadilly.
#### Reconstruction
The station was busy, and in 1918 unsuccessful attempts were made to control crowds with gates at platform level. In the 1930s, the station was included amongst those modernised in conjunction with the northern and western extensions of the Piccadilly line. A new sub-surface ticket hall was opened on 18 September 1933 with a pair of Otis escalators provided to replace the lifts. The new ticket hall was accessed from subway entrances in Piccadilly. On the north side, an entrance was provided in Devonshire House on the corner with Stratton Street; on the south side an entrance was constructed on a piece of land taken from the park. The shelter for the southern entrance was designed by Charles Holden. The original station building, the lifts and the redundant below-ground passages were closed and the station was renamed Green Park. Part of the ground floor was used as a tea shop until the 1960s. In 1955, a third escalator was added to help deal with increased passenger numbers.
### Victoria line
Proposals for an underground line linking Victoria to Finsbury Park date from 1937 when planning by the London Passenger Transport Board (LPTB) for future services considered a variety of new routes and extensions to existing lines. Parliament approved the line in 1955, but a shortage of funds meant that work did not start until after government loans were approved in 1962.
Construction works began in 1962. The 1930s ticket hall under the roadway of Piccadilly was enlarged to provide space for new Victoria line escalators and a long interchange passageway was provided between the Victoria line and Piccadilly line platforms. In 1965 a collapse of soft ground during the excavation of one of the tunnels near Green Park station meant that the ground had to be chemically stabilised before work could continue. The disused station building in Dover Street was demolished the following year in conjunction with the works for the new line. A vent shaft was constructed and an electrical sub-station was built in the basement of the new building. The 1930s entrance on the south side of Piccadilly was also reconstructed.
The enlarged ticket hall, new platforms and passageways were decorated in grey tiles. Platforms are approximately 23.4 metres (77 ft) below street level. Platform roundel signs were on backlit illuminated panels. Seat recesses on the Victoria line platforms were tiled in an abstract pattern by Hans Unger of coloured circles representing a bird's-eye view of trees in Green Park.
After trial running of empty trains from 24 February 1969, the Victoria line platforms opened on 7 March 1969 with the opening of the third stage of the line between Warren Street and Victoria. The same day, Queen Elizabeth II officially opened the line by riding a train from Green Park to Oxford Circus.
### Jubilee line
The origins of the Jubilee line are less clearly defined than those of the Victoria line. During World War II and throughout the 1950s and early 1960s consideration was given to various routes connecting north-west and south-east London via the West End and the City of London. Planning of the Victoria line had the greater priority and it was not until after construction of that line started that detailed planning began for the new line, first called the Fleet line in 1965 as it was planned to run in an east–west direction along Fleet Street. Lack of funding meant that only the first stage of the proposed line, from Baker Street to Charing Cross, received royal assent in July 1969; funding was agreed in August 1971.
Tunnelling began in February 1972 and was completed by the end of 1974. In 1977, during construction of the stations, the name of the line was changed to the Jubilee line, to mark the Queen's Silver Jubilee that year. A construction shaft in Hays Mews north of the station was used for an electrical substation and ventilation shaft. At Green Park, the ticket hall was enlarged slightly to provide space for escalators for the new line which connect to an intermediate concourse providing interchange between the Jubilee and Victoria lines. A second flight of escalators descends to the Jubilee line platforms, which are 31.1 metres (102 ft) below street level, the deepest of the three sets. Interchange between the Jubilee and Piccadilly lines was via the ticket hall. Platform walls were tiled in a deep red with black leaf patterns by June Fraser. Trial running of trains began in August 1978 and the Jubilee line opened on 1 May the next year. The line had been officially opened by Prince Charles the previous day, starting with a train journey from Green Park to Charing Cross. In 1993, to alleviate congestion, a third escalator was installed in the lower flight to replace a fixed staircase.
Work on the Fleet line's stages 2 and 3 did not proceed and it was not until 1992 that an alternative route was approved. The Jubilee line extension took the line south of the River Thames via Waterloo, which was impractical to reach from the line's existing terminus at Charing Cross. New tunnels branching from the original route south of Green Park were to be constructed, and the line to Charing Cross was to be closed. Tunnelling began in May 1994, and improvements were carried out at Green Park to provide a direct passageway connection between the Jubilee and Piccadilly lines, including lifts to the platforms at each end. A new ventilation shaft and an emergency exit to Arlington Street were built. The new extension opened in stages starting at Stratford in the east, with services to Charing Cross ending on 19 November 1999 and the final section between Green Park and Waterloo opening the following day.
### Recent changes
In 2008, Transport for London (TfL) announced a project to provide step-free access to all three lines in advance of the 2012 London Olympics. The project also included the construction of a new entrance on the south side of Piccadilly with ramped access directly from Green Park designed by Capita Symonds and Alacanthus LW architects. Work commenced in May 2009 to install two lifts from the ticket hall to the Victoria line platforms and the interchange passageway to the Piccadilly line. This work and a third lift in the new park-side entrance between the street level and the ticket hall were completed ahead of schedule in 2011. At the same time, Green Park station underwent a major improvement programme which saw the tiling on the Victoria and Piccadilly line platforms and the interchange passageways replaced. When the Jubilee line opened, the Hans Unger tiling in the seat recesses of the Victoria line platforms was replaced with a design using the leaf patterns used on the Jubilee line platforms; the Unger design was reinstated during the restoration.
The new park entrance and street level shelter feature artwork within the Portland stone cladding titled Sea Strata designed by John Maine RA. The Diana Fountain was relocated from its original site in the centre of the park to form the centrepiece of the new entrance.
To help moderate temperatures in the station, a system using cool ground water extracted from boreholes sunk 130 metres (430 ft) into the chalk aquifer below London was installed. The extracted water passes through a heat exchanger connected to the cast-iron tunnel lining and the warmed water is returned to the aquifer through a second set of boreholes 200 metres (660 ft) away.
## Proposal for new connection
In July 2005, a report, DLR Horizon 2020 Study, for the Docklands Light Railway (DLR) examined "pragmatic development schemes" to expand and improve the DLR network between 2012 and 2020. One of the proposals was an extension of the DLR from Bank to Charing Cross. Unused tunnels under Strand constructed as part of Stage 1 of the Fleet line would be enlarged to accommodate the larger DLR trains. In 2011, the DLR published a proposal to continue the extension to Victoria via Green Park. No further work has been done on these proposals.
## Piccadilly bombing
At around 9:00 pm on 9 October 1975, members of the Provisional IRA's Balcombe Street Gang detonated a bomb at a bus stop outside Green Park station, killing 23-year-old Graham Ronald Tuck and injuring 20 others. The attack was part of a bombing campaign carried out by the gang; in addition to the death and injuries, it caused damage to the Ritz Hotel and neighbouring buildings.
## Services and connections
### Services
The station is in Travelcard Zone 1, between Bond Street and Westminster on the Jubilee line, Hyde Park Corner and Piccadilly Circus on the Piccadilly line and Victoria and Oxford Circus on the Victoria line. On weekdays Jubilee line trains typically run every 2–2½ minutes between 05:38 and 00:34 northbound and 05:26 and 00:45 southbound; on the Piccadilly line trains typically run every 2½–3½ minutes between 05:48 and 00:34 westbound and 00:32 eastbound, and on the Victoria line trains typically run every 100–135 seconds between 05:36 and 00:39 northbound and 05:36 and 00:31 southbound.
As of it is the station on the London Underground with million passengers using it per year.
### Connections
London Buses day and night routes serve the station.
## In popular culture
The opening scene of the 1997 film version of Henry James's The Wings of the Dove was set on the eastbound platforms at both Dover Street and Knightsbridge stations, both represented by the same studio mock-up, complete with a working recreation of a 1906 Stock train.
# The Thriving Cult of Greed and Power
"The Thriving Cult of Greed and Power" is an article, written in 1991 by U.S. investigative journalist Richard Behar, which is highly critical of Scientology. It was first published by Time magazine on May 6, 1991, as an eight-page cover story, and was later published in Reader's Digest in October 1991. Behar had previously published an article on Scientology in Forbes magazine. He stated that he was investigated by attorneys and private investigators affiliated with the Church of Scientology while researching the Time article, and that investigators contacted his friends and family as well. Behar's article covers topics including L. Ron Hubbard and the development of Scientology, its controversies over the years and history of litigation, conflict with psychiatry and the U.S. Internal Revenue Service, the suicide of Noah Lottick, its status as a religion, and its business dealings.
After the article's publication, the Church of Scientology mounted a public relations campaign to address issues in the piece. It took out advertisements in USA Today for twelve weeks, and Church leader David Miscavige was interviewed by Ted Koppel on Nightline about what he considered to be an objective bias by the article's author. Miscavige alleged that the article was actually driven by the company Eli Lilly, because of Scientology's efforts against the drug Prozac. The Church of Scientology brought a libel suit against Time Warner and Behar, and sued Reader's Digest in multiple countries in Europe in an attempt to stop the article's publication there. The suit against Time Warner was dismissed in 1996, and the Church of Scientology's petition for a writ of certiorari to the Supreme Court of the United States was denied in 2001.
Behar received awards in honor of his work on the article, including the Gerald Loeb Award, the Worth Bingham Prize, and the Conscience-in-Media Award. The article has had ramifications in the current treatment of Scientology in the media, with some publications theorizing that journalists are wary of the litigation that Time Warner went through. The article has been cited by Anderson Cooper on CNN, in a story on Panorama's 2007 program "Scientology and Me" on the BBC, and has been used as a reference for background on the history of Scientology, in books from both the cult and new religious movement perspectives.
## Research for the article
Before penning "The Thriving Cult of Greed and Power", Behar had written a 1986 article in Forbes magazine, "The Prophet and Profits of Scientology", which reported on the Church of Scientology's business dealings and L. Ron Hubbard's financial success. Behar wrote that during research for "The Thriving Cult of Greed and Power", he and a Time contributing editor were themselves investigated by ten attorneys and six private investigators affiliated with the Church of Scientology. According to Behar, investigators contacted his friends and previous coworkers to ask them if he had a history of tax or drug problems, and obtained a copy of his personal credit report that had been obtained illegally from a national credit bureau. Behar conducted 150 interviews in the course of his research for the article.
Behar wrote that the motive of these operatives was to "threaten, harass and discredit him". He later learned that the Church of Scientology had assigned its head private investigator to direct the Church's investigation into Behar. Anderson Cooper 360° reported that Behar had been contacted by Church of Scientology attorneys numerous times while doing research on the article. The parents of Noah Lottick, a Scientologist who had committed suicide, cooperated with Time and Reader's Digest.
## Synopsis
The full title of the article is "The Thriving Cult of Greed and Power: Ruined lives. Lost fortunes. Federal crimes. Scientology poses as a religion but is really a ruthless global scam—and aiming for the mainstream". The article reported on the founding of the Church of Scientology by L. Ron Hubbard and controversies involving the Church and its affiliated business operations, as well as the suicide of a Scientologist. It related the May 11, 1990, suicide of Dr. Edward Lottick's son Noah Antrim Lottick. Lottick was a Russian studies student who had taken a series of Scientology courses; he died after jumping from a tenth-floor hotel window. The Church of Scientology and Lottick's family have differing positions on the effect Scientology coursework had on him. While none of the parties assigned blame, they expressed misgivings about his death. Initially, his father had thought that Scientology was similar to Dale Carnegie's self-improvement techniques; however, after his ordeal, the elder Lottick came to believe that the organization is a "school for psychopaths". Mike Rinder, then head of the Church of Scientology's Office of Special Affairs and a Church spokesman, stated "I think Ed Lottick should look in the mirror... I think Ed Lottick made his son's life intolerable".
The article outlined a brief history of Scientology, discussing Hubbard's initial background as a science fiction writer, and cited a California judge who had deemed Hubbard a "pathological liar". The Church of Scientology's litigation history was described, in addition to its conflicts with the Internal Revenue Service, with countries regarding whether or not to accept it as a religion, and its position against psychiatry. Behar wrote of the high costs involved in participation in the Church of Scientology, what he referred to as "front groups and financial scams", and harassment of critics. He estimated that the Church of Scientology paid US$20 million annually to over one hundred attorneys. Behar maintained that though the Church of Scientology portrays itself as a religion, it was actually a "hugely profitable global racket" which intimidated members and critics in a Mafia-like manner.
Cynthia Kisser, then director of the Cult Awareness Network, was quoted: "Scientology is quite likely the most ruthless, the most classically terroristic, the most litigious and the most lucrative cult the country has ever seen. No cult extracts more money from its members".
## Post-publication
### Church of Scientology's response
The Church of Scientology responded to the publication of "The Thriving Cult of Greed and Power" by taking out color full-page ads in USA Today in May and June 1991, on every weekday for twelve weeks, denouncing the Time magazine cover article. Two official Church of Scientology responses were titled "Facts vs. Fiction, A Correction of Falsehoods Contained in the May 6, 1991, Issues of Time Magazine", and "The Story That Time Couldn't Tell". Prior to the advertising campaign, Scientologists distributed 88-page bound booklets which disputed points from Behar's article. The "Fact vs. Fiction" piece was a 1⁄4-inch-thick (0.64 cm) booklet, which criticized Behar's article and asserted "Behar's article omits the information on the dozens of community service programs conducted by Scientologists ... which have been acknowledged by community officials". One of the advertisements in USA Today accused Time of promoting Adolf Hitler and Nazi Germany, and featured a 1936 issue of Time which had Hitler's picture on the front cover. The Church of Scientology sent out a news release condemning Time's "horrible history of supporting fascism", and said that the article was written because Time had been pressured by "vested interests". When asked by the St. Petersburg Times whether this was the case, Time Executive Editor Richard Duncan responded "Good Lord, no". Heber Jentzsch, at the time president of Church of Scientology International, issued a four-page news release which stated "Advertising is the only way the church could be assured of getting its message and its side of the story out to the public without the same vested interests behind the Time article distorting it".
After the advertising run critiquing Time magazine in USA Today had completed, the Church of Scientology mounted a $3 million public relations campaign about Scientology in USA Today, in June 1991. The Church of Scientology placed a 48-page advertising supplement in 1.8 million copies of USA Today. In a statement to the St. Petersburg Times, Scientology spokesman Richard Haworth explained "What we are trying to do is put the actual facts of Dianetics and Scientology out there".
In response to the Church of Scientology's claims of inaccuracies in the article, a lawyer for Time responded "We've reviewed all of their allegations, and find nothing wrong with the Time story." In June 1991, Newsweek reported that staffers for Time said they had received calls from a man claiming to be a paralegal for Time, who asked them if they had signed a confidentiality form about the article. Time editors sent staffers a computer memo, warning them about calls related to the article, and staffers told Newsweek that "sources named in the story say detectives have asked about their talks with Time". A Church of Scientology spokesman called the claims "scurrilous".
On February 14, 1992, Scientology leader David Miscavige gave Ted Koppel his first interview on Scientology on the ABC News program Nightline. The program noted that Scientology has vocal critics and cited Behar's 1991 article. Behar appeared on the program and gave his opinion of why individuals join Scientology, stating that the organization's "ulterior motive" is really to get people to take high-priced audit counseling. Behar stated on the program that he had evidence that members of the Church of Scientology had obtained his personal phone records. Later in the program, Koppel questioned Miscavige on the Church of Scientology's response to the Time magazine article, particularly the $3 million the church spent advertising in USA Today. Miscavige explained that the first three weeks of the advertising campaign was meant to correct falsehoods from the Time article, and the rest of the twelve-week campaign was dedicated to informing the public about Scientology. Koppel asked Miscavige what specifically had upset him about the Time article, and Miscavige called Behar "a hater". Miscavige noted that Behar had written an article on Scientology and the Internal Revenue Service three years before he began work on the Time piece, and made allegations that Behar had attempted to get two Scientologists kidnapped. When Koppel questioned Miscavige further on this, Miscavige said that individuals had contacted Behar after an earlier article, and Behar had told them to "kidnap Scientologists out". Koppel pressed further, noting that this was a serious charge to make, and asked Miscavige if his allegations were accurate, why he had not pressed charges for attempted kidnapping. Miscavige said Koppel was "missing the issue", and said that his real point was that he thought the article was not an objective piece.
Miscavige alleged on Nightline that the article itself was published as a result of a request by Eli Lilly and Company, because of "the damage we had caused to their killer drug Prozac". When Koppel asked Miscavige if he had affidavits or evidence to this effect, Miscavige responded "You think they'd admit it?" Miscavige stated that "Eli Lilly ordered a reprint of 750,000 copies of Time magazine before it came out", and that his attempts to investigate the matter with Eli Lilly and associated advertising companies were not successful.
### Litigation
The Church brought a libel lawsuit against Time Warner and Behar, seeking damages of $416 million. The Church alleged false and defamatory statements were made concerning the Church of Scientology International in the Time article. More specifically, the Church of Scientology's court statements claimed that Behar had been refining an anti-Scientology focus since his 1986 article in Forbes, which included gathering negative materials about Scientology, and "never accepting anything a Scientologist said and uniformly ignoring anything positive he learned about the Church". In its initial complaint filing, the Church quoted portions of the Behar article that it alleged were false and defamatory, including the quote from Cynthia Kisser, and Behar's own assertion that Scientology was a "global racket" that intimidated individuals in a "Mafia-like manner".
Noah Lottick's parents submitted affidavits in the case, in which they "affirmed the accuracy of each statement in the article"; Edward Lottick "concluded that Scientology therapies were manipulations, and that no Scientology staff members attended the funeral" of their son. During the litigation, the Church of Scientology attempted to subpoena Behar in a separate ongoing lawsuit with the Internal Revenue Service, and accused a federal magistrate of leaking information to him. Behar was questioned for over 190 hours during 30 days of depositions with Scientology attorneys in the libel case. One question was about Behar's life in his parents' home while he was still inside the womb. St. Petersburg Times explained that this question was prompted by Scientology teachings that certain problems come from prenatal memories. Behar told the St. Petersburg Times he "felt it was extremely excessive". In a countersuit, Behar brought up the issues of Church of Scientology private investigators and what he viewed as harassment. By July 1996, all counts of the libel suit had been dismissed. In the course of the litigation through 1996, Time Warner had spent $7.3 million in legal defense costs. The Church of Scientology also sued several individuals quoted in the Time article.
The Church of Scientology sued Reader's Digest in Switzerland, France, Italy, the Netherlands, and Germany for publishing a condensed version of the Time story. The only court to provide a temporary injunction was in Lausanne, Switzerland. In France, Italy, and the Netherlands, the courts either dismissed the Church of Scientology's motions, or set injunction hearings far beyond the date of actual publication. The company defied the injunction and mailed copies of the article, "Scientology: A Dangerous Cult Goes Mainstream", to its 326,000 Swiss subscribers. Reader's Digest's worldwide editor-in-chief, Kenneth Tomlinson, told The New York Times that "a publisher cannot accept a court prohibiting distribution of a serious journalistic piece. ... The court order violates freedom of speech and freedom of the press". The Church of Scientology subsequently filed a criminal complaint against the Digest in Lausanne, and Mike Rinder stated that the mailing was in blatant violation of the law. By defying the Swiss court ban, the Reader's Digest risked a fine of about $3,400, as well as a potential three months' jail time for the Swiss Digest editor-in-chief. A hearing on the injunction was set for November 11, 1991, and the injunction was later lifted by the Swiss court.
In January 2001, a United States federal appeals court upheld the dismissal of the Church of Scientology International's case against Time Warner. In its opinion, the United States Court of Appeals for the Second Circuit ruled that Time Warner had not published "The Thriving Cult of Greed and Power" with actual malice, the standard that must be met in libel cases brought by public figures. On October 1, 2001, the Supreme Court of the United States refused to consider reinstating the church's libel case Church of Scientology International v. Time Warner Inc., 00-1683. Time Warner said it refused to be "intimidated by the church's apparently limitless legal resources." In arguments presented to the Supreme Court, the Church of Scientology acknowledged that church officials had "committed improper acts" in the past, but also claimed that "allegations of past misconduct were false and distorted, the result of the misunderstanding, suspicion and prejudice that typically greet a new religion". Of the rulings for Time Warner, the Church of Scientology complained that they "provide a safe harbor for biased journalism". Behar commented on the Church of Scientology's legal defeat, and said that the lawsuit had a chilling effect: "It's a tremendous defeat for Scientology ... But of course their doctrine states that the purpose of a suit is to harass, not to win, so from that perspective they hurt us all. They've had a real chilling effect on journalism, both before and after my piece".
### Awards
As a result of writing the piece, Behar was presented with the 1992 Gerald Loeb Award for distinguished business and financial journalism, the Worth Bingham Prize, the Conscience-in-Media Award from the American Society of Journalists and Authors, awarded to "those who have demonstrated singular commitment to the highest principles of journalism at notable personal cost or sacrifice," and the Cult Awareness Network's Leo J. Ryan Award, in honor of Congressman Leo J. Ryan. Paulette Cooper was also awarded the 1992 Conscience-in-Media Award by the American Society of Journalists and Authors, for her book The Scandal of Scientology. This was the only time in the history of the American Society of Journalists and Authors that the award was presented to more than one journalist in the same year.
In a February 1992 issue of Time, editor Elizabeth Valk congratulated Behar on his Conscience-in-Media Award, stating "Needless to say, we are delighted and proud". Valk noted that the honor had only been awarded seven times in the previous seventeen years of its existence. Managing editor Henry Muller also congratulated Behar in an April 1992 issue of Time.
## Analysis
Insane Therapy noted that Scientology "achieved more notoriety ... with the publication of the journalist Richard Behar's highly critical article". Larson's Book of World Religions and Alternative Spirituality described the cover design of the article as it appeared in Time, writing that it "shouted" the headline from the magazine cover. In a 2005 piece, Salon.com noted that for those interested in the Church of Scientology, the Time article remains a "milestone in news coverage", and that those who back the Church believe it was "an outrageously biased account".
## Legacy
The Church of Scientology's use of private investigators was cited in a 1998 article in the Boston Herald, and compared to Behar's experiences when researching "The Thriving Cult of Greed and Power". After the paper ran a five-part series of critical articles in 1998, then-Church of Scientology President Heber Jentzsch confirmed that a private investigative firm was hired to look into the personal life of Joseph Mallia, the reporter who wrote the articles. In a later piece titled "Church of Scientology probes Herald reporter—Investigation follows pattern of harassment", this investigation was likened to the harassment Behar said he had experienced, as well as to other reporters' experiences from 1974, 1988, and 1997.
Because of the history of conflict between Reader's Digest and Scientology, the writer of a 2005 cover story on Tom Cruise agreed to certain demands, including giving Scientology issues equal play in the profile of Cruise, submitting questions for Cruise to Church of Scientology handlers, and attending a one-day Church immersion course. Also in 2005, an article in Salon questioned whether the Church's litigation and private-investigation tactics against Time Warner and other media outlets had succeeded in reducing the amount of investigative journalism on Scientology in the press. A 2005 article in The Sunday Times cited the article and concluded that the Church of Scientology's lawsuit against Time Warner "served to warn off other potential investigations", and that "The chill evidently lingers still".
"The Thriving Cult of Greed and Power" continues to be used today by journalists in the media, as a reference for historical information on the Church of Scientology. In April 2007, CNN anchor Anderson Cooper interviewed former Office of Special Affairs director Mike Rinder, in a live piece on Anderson Cooper 360° titled "Inside Scientology". The CNN story was prompted by the May 2007 airing of a BBC Panorama investigative program, "Scientology and Me". In the interview, Anderson Cooper quoted directly from "The Thriving Cult of Greed and Power" article, when asking Rinder about the history of Operation Snow White, and if those tactics were currently used by the Church. Rinder answered by stating that the individuals involved with Operation Snow White were no longer involved in Church of Scientology activities, and that the incident was "ancient history". Cooper then again referenced the Time magazine article noting that Behar asserted that he was illegally investigated by Scientology contacts during research for his article. Cooper questioned Rinder on the dismissed lawsuit against Time Warner, and Rinder acknowledged that all of the Church of Scientology's appeals against Time Warner were eventually rejected. |
# Armadillo shoe
The armadillo shoe (alternately armadillo heel or armadillo boot) is a high fashion platform shoe created by British fashion designer Alexander McQueen for his final collection, Plato's Atlantis (Spring/Summer 2010). Only 24 pairs exist: 21 were made during the initial production in 2009, and three were made in 2015 for a charity auction. The shoes are named for their unusual convex curved shape, said to resemble an armadillo. Each pair is approximately 12 inches (30 cm) from top to sole, with a 9-inch (23 cm) stiletto heel; this extreme height caused some models to refuse to walk in the Plato's Atlantis show. American singer Lady Gaga famously wore the shoes in several public appearances, including the music video for her 2009 single "Bad Romance".
Critical response to the armadillo heels was extensive, both immediately following the show and in retrospect. They are considered iconic in the context of the Plato's Atlantis show, McQueen's body of work, and in fashion history in general. Critics have referred to them as both grotesque and beautiful, sometimes in the same review. Much of the negative criticism focused on the height of the heel, which has been viewed as impractical, even unsafe. Other writers have explored the shoes as artistic statements. Pairs of armadillo heels have been featured in museum exhibitions, most prominently in the McQueen retrospective Alexander McQueen: Savage Beauty, first shown at the Metropolitan Museum of Art in New York City in 2011.
## Background
British fashion designer Alexander McQueen was known for his imaginative, sometimes controversial designs, and dramatic fashion shows. During his nearly twenty-year career, he explored a broad range of ideas and themes, including historicism, romanticism, femininity, sexuality, and death. He had designed extreme footwear for previous collections, including high platform shoes inspired by the Japanese geta and Venetian chopine for his Spring/Summer 2008 collection, La Dame Bleue, and houndstooth platforms for Autumn/Winter 2009, The Horn of Plenty.
For his Spring/Summer 2010 collection, Plato's Atlantis, McQueen took inspiration from climate change and Charles Darwin's theory of evolution, envisioning a world where humans evolved to survive underwater after global flooding. The collection was presented on the catwalk at Paris Fashion Week on 6 October 2009. The show began with designs that used earth tones and digitally printed animal skin patterns to invoke the appearance of land animals, and gradually transitioned into designs featuring abstract prints in aqua and blue, suggesting that the models were adapting to an increasingly submerged planet. The show's final outfit, entitled "Neptune's Daughter", was covered entirely in enormous blue-green opalescent sequins, including the matching armadillo shoes. The outfit represented the final stage of humanity's adaptation to an underwater environment. It was worn on the runway by Polina Kasina, who had long been McQueen's fit model. Plato's Atlantis was McQueen's final fully realised collection; he died by suicide in 2010.
## Design
The armadillo shoes are almost 12 inches (30 cm) from top to sole, with a 9-inch (23 cm) spike heel. The vertical body of the shoe is shaped in a convex curve, which has been compared to the silhouette of an armadillo, lobster claw, or animal hoof. Their shape is generally regarded as unique in high fashion, although museum curator Helen Persson found a precedent in the shape of Persian riding boots of the 16th century.
The shoe hides the entire foot from ankle to toe, creating the illusion that the wearer is walking en pointe in the manner of a ballerina. In actuality the ball of the foot rests at an angle on a concealed platform, with a small bulge above the toe to facilitate lifting the heavy shoe to walk. In keeping with the animalistic theme of the collection, each pair is uniquely decorated in animal skin such as python skin or shagreen (rawhide from the cowtail stingray), or iridescent paillettes resembling scales.
## History
### Development and runway show
McQueen sketched the initial idea for the shoes in early 2009, taking inspiration from the work of British pop artist Allen Jones and Australian fashion designer Leigh Bowery. He commissioned shoe designer Georgina Goodman to realise the concept. Each pair was hand-carved from wood in Italy. The Daily Beast reported that the complex manufacturing process "spanned five days and involved 30 people, using material from three suppliers and passing through three factories". The inner lining and outer shell were shaped separately and fitted together; each section required two zippers for access. For the original collection, 21 pairs were made, 20 of which were worn during the Plato's Atlantis October 2009 fashion show.
Designed as showpieces, the shoes were never commercially produced, although many were sold to private buyers following the show. The Alexander McQueen Archive in London retains ownership of at least five pairs, including the pair covered with iridescent scales worn in the final outfit of the show. The Metropolitan Museum of Art (The Met) in New York City owns two pairs, one made from turquoise shagreen and another in black leather with metal accents.
The unusual shape made walking in the shoes notoriously difficult. The show's producer, Sam Gainsbury, tested them the night before the show and found walking in them impossible. When she complained of this to McQueen and suggested the models were at risk of falling, the designer responded, "If they fall, they fall." In the end, models Abbey Lee Kershaw, Natasha Poly and Sasha Pivovarova all declined to walk in Plato's Atlantis because of their concerns that the heels were too high to be safe. In the 2018 documentary McQueen, model Magdalena Frąckowiak said that she found walking in them "really frightening". Despite these concerns, no models fell at the show, which was regarded as "miraculous" by the fashion press. Shortly after the Plato's Atlantis show, staffers from British Vogue tested the shoes and found them difficult to walk in. Months after the show, McQueen confirmed in an interview with trade journal Women's Wear Daily that he had never tested the armadillos personally. He made it clear that he was far less concerned with practicality than with visual effect, saying elsewhere, "The world needs fantasy, not reality. We have enough reality today."
### Celebrity wear
Celebrities have worn armadillo heels for red carpet appearances and photoshoots. The first of these was in November 2009, when British socialite Daphne Guinness wore a pair in nude-coloured leather and reported that they were "surprisingly comfortable". Guinness also wore a pair of snakeskin armadillo boots in a shoot for Vogue Italia in February 2010. American singer Kelis wore another nude leather pair on the red carpet in January 2010. American actress Demi Moore wore a tan pair on the April 2010 cover of Harper's Bazaar.
American singer Lady Gaga, who became a friend of McQueen's shortly before his suicide, premiered her 2009 single "Bad Romance" at the Plato's Atlantis show. For the single's music video, released November 2009, Gaga wore the opalescent "Neptune's Daughter" outfit that closed the Plato's Atlantis show, including the matching armadillo shoes. Gaga wore a pair of armadillo heels in python skin when she arrived at the MTV Video Music Awards in September 2010; she described this look in 2018 as the top outfit of her career. Later that month, she wore the same pair with a dress made of hair for a performance at The Oak Room at New York's Plaza Hotel.
Three brand-new pairs were created in 2015 by McQueen's label in partnership with Christie's auction house, which sold them to raise money for the UNICEF 2015 Nepal earthquake relief fund. Initially expected to sell for US$10,000–15,000 in total, they eventually sold for a combined $295,000. All three pairs were sold to American actor Taylor Kinney, who gifted them to Lady Gaga, then his fiancée. In 2016 Gaga was the guest editor for the Spring preview issue of V magazine, which featured a photoshoot of herself and Guinness wearing armadillo heels.
## Reception and cultural legacy
Critical reaction to the armadillo shoes was immediate and polarised. Many reviewers described them as both grotesque and beautiful in the same breath. They were particularly noted for their complete visual departure from the natural structure of the human foot. Critics often described the models as looking alien, monstrous, or inhuman while wearing them. They are often described as an iconic element of the Plato's Atlantis collection and of McQueen's body of work in general. In 2012, British Vogue called them one of the 20 all-time most iconic shoes.
Although there was some criticism of their appearance, much of the negative reaction centered on the perceived impracticality of walking in the armadillo heels. Some critics labelled the impractical design a feminist issue, pointing out that female models were being expected to walk in extreme heels designed by a man. Costume design professor Deborah Bell wrote that they transformed the model into a "hunted victim".
Critics viewing them in retrospect have described their effect on high fashion footwear as groundbreaking. By 2010, fashion journalists were crediting the armadillo heels as one source of a trend towards extreme high heels both on the runway and in everyday fashion. In 2018, Aria Darcella argued in Fashionista that "never in fashion has a shoe eclipsed the rest of a collection". Later that year, in an article that celebrated deliberately unappealing fashion, The Paris Review called them "aggressively ugly" while noting that they had "forever change[d] footwear." Writing for the American edition of Vogue in 2020, Steff Yotka described them as "the progenitor of our obsession with really quite bizarre footwear".
Since their debut, the armadillo shoes have been featured in four museum exhibitions. Several pairs from the Alexander McQueen Archive were featured at Alexander McQueen: Savage Beauty, a retrospective exhibition of McQueen's work, which appeared at The Met in 2011 and the Victoria and Albert Museum (V\&A) of London in 2015. The shoes also appeared in the 2015 V\&A exhibition Shoes: Pleasure and Pain. In 2017, Kelis lent her pair to the Museum of Modern Art for a fashion exhibition entitled Items: Is Fashion Modern? For the 2022 exhibition Lee Alexander McQueen: Mind, Mythos, Muse, first shown at the Los Angeles County Museum of Art, designer Michael Schmidt was commissioned to make several replica pairs from various materials, including candy wrappers, broken CDs, and Swarovski crystals.
In 2019 Kerry Taylor Auctions reported the sale of a pair of armadillo heels in turquoise shagreen. For the Spring/Summer 2024 Alexander McQueen pre-collection, the brand introduced a line of shoes with curved heels inspired by the shape of the armadillo shoes.
## Analysis
Some writers have explored the artistic and cultural implications of the armadillo heels. Their existence as impractical but visually striking footwear has been used to support the argument that fashion is an art form in its own right. Writing for The New York Times in 2009, Amanda Fortini connected their immense height to the so-called hemline theory, which posits that fashion designs tend to reflect the state of the economy. She suggested that the extreme heels on the armadillo boots reflected an attempt to "lift our collective spirits" given the impact of the Great Recession of 2008. Fashion historians Beth Dincuff Charleston and Francesca Granata have each argued, in 2010 and 2017 respectively, that the shoes function more like medical or corrective devices than conventional footwear.
Mass media theorist Paul Hegarty discussed Lady Gaga's use of the armadillo heels in her "Bad Romance" video as a combination of dominance and submission: their height restricts Gaga's movement, indicating submissiveness, but her ability to walk in them indicates a subversive kind of dominance. In this way, the video "looks at complicity with controls as a way of surmounting them". In 2014 Isabelle Szmigin and Maria Piacentini discussed them as an example of how high fashion concepts – in this case, extremely high heels – are absorbed into popular culture and then spread to individuals, affecting their desires and behaviour as consumers.
Shahidha Bari, professor of fashion cultures, described them in 2020 as a parody of a ballerina's pointe shoes: "gorgeous and cruel, but it also makes explicit the mercilessness of the pointe shoe". Philosopher Gwenda-Lin Grewal called them an example of surrealist high comedy in fashion, comparing them to the absurdist shoe hat created by Italian designer Elsa Schiaparelli in 1937.
Performance scholar Franziska Bork Petersen picked up the thread of Charleston and Granata's arguments in her book Body Utopianism (2022), analysing the armadillo shoes as analogous to prosthetics in their altering of the human form. Petersen notes that, over the course of the runway show, the distinctive gait of the models wearing the armadillo heels became a visual norm, so that the more typical gaits of models wearing other shoes "stand out in their otherness". She argues that the models' ability to wear the difficult shoes proficiently makes them "technicians" on the runway, and that it is the movement of the models which completes the visual impact of the shoe. She situates the unusual shape of the shoe as typical of fashion rather than an outlier, arguing that throughout the history of fashion design, clothing and footwear have significantly altered the natural shape of the human body. Although she critiques the armadillo shoes for existing as commercial rather than strictly artistic objects, Petersen concludes that the radical alteration the shoes make to the appearance of the body can "open up the possibility to encounter the familiar human body as strange", allowing for unconventional ideas of beauty to emerge in the viewer.
# Wood Badge
The Wood Badge is an award for Scout leader training, first awarded by The Boy Scouts Association in the United Kingdom in 1919 and subsequently adopted, with variations, by some other Scout organizations. Wood Badge courses teach Scout leadership skills and instil an ideological bond and commitment to the organizations. Courses generally have theory and practical phases followed by a practice project. Scouters who complete the course are awarded a pair of wooden beads, one on each end of a leather thong, modelled on a necklace of beads that Robert Baden-Powell claimed to have taken from the African chief Dinizulu.
## Insignia
The Wood Badge is worn around the neck as part of the Scouter's uniform. In some Scout organizations, the wood badge is presented together with a Gilwell scarf and a Gilwell woggle, denoting membership of the notional 1st Gilwell Scout Group.
### Beads
Early Wood Badge beads came from a necklace that Baden-Powell claimed to have taken from a deserted Zulu mountain stronghold while on a failed military campaign to capture Dinizulu in Zululand (now part of South Africa). Such necklaces of beads made from acacia, known as iziQu in Zulu, were presented to brave warrior leaders. In 1919, Baden-Powell threaded beads from the necklace he had taken onto a leather thong he claimed had been given to him by an elderly South African in Mafeking and called it the Wood Badge.
When produced, the thong is joined by a simple overhand knot but the two ends of the thong are often tied together with a decorative diamond knot. Various rituals are practiced in tying the diamond knot, such as having a fellow course member tie it, having a mentor or course leader tie it or having the recipient tie it after completing an additional activity that shows they have mastered training skills.
#### Additional beads
Additional beads are awarded for completion of training for different levels:
- 2 beads for the basic Wood Badge (WB).
- 3 beads (WB3) for trainers at managing, planning and implementing level.
- 4 beads (WB4) for trainers at conceptualising, designing and developing level.
- 5 beads for the Deputy Camp Chief of Gilwell or other training centre, an official representative of Gilwell Park maintaining the global integrity of Wood Badge training.
- 6 beads for those (volunteers or employed staff) with primary responsibility for Wood Badge training in a Scout organization.
Baden-Powell wore six beads, as did his Deputy Chief Scout and right-hand man, Percy Everett. Baden-Powell's beads are on display at Baden-Powell House in London. Everett bequeathed his six beads to be worn by the Camp Chief of Gilwell as a badge of office.
### Gilwell scarf or neckerchief
The Gilwell scarf is a triangular scarf or neckerchief made of cotton or wool twill with a taupe face and red back, with a patch of Clan MacLaren tartan affixed near the point. The patch of Maclaren clan tartan honours William de Bois Maclaren, The Boy Scouts Association commissioner who donated £7000 to The Boy Scouts Association in 1919 to purchase Gilwell Park as a leader training centre and an additional £3000 for improvements to the house on the estate. The Maclaren tartan represents the Wood Badge and training ties to Gilwell Park. Originally, the scarf was made entirely of triangular pieces of the tartan but its expense forced the adoption of the current design.
### Gilwell woggle
The Gilwell woggle is a braided leather two or three-strand Turk's head knot, which has no beginning and no end and symbolizes the commitment to the Scout Movement. In some countries, Wood Badge training is divided into parts and the Gilwell woggle is given for completion of part one. First designed in the early 1920s by British Scouter Bill Shankley, making a Turk's head knot woggle was part of the leader training scheme by 1926.
## Scout leader training course
### History
The Boy Scouts Association conducted early Scoutmaster training camps in London and Yorkshire. The first Wood Badge training, with 18 participants, was organized by The Boy Scouts Association and held from 8 to 19 September 1919 at its newly acquired leader training centre, Gilwell Park, then just outside London. The training was led by The Boy Scouts Association's Gilwell Park Camp Chief, Francis Gidney, and its Commissioner for Training, Percy Everett, with lectures by Baden-Powell and others. Wood Badge training courses continued at Gilwell Park. Other sites providing Wood Badge training have taken the Gilwell name.
### Modern curriculum
The principles underpinning the Wood Badge Training Scheme are:
- "Continuous Development": Emphasizes continuous adult development from both internal and external sources.
- "Essential Areas" Directed to include "Fundamentals of Scouting, Leadership and Team Management, Project management, Communication and Adult development."
- "Progressive With Multi-Entry Points" Adaptive to varying skill and knowledge levels.
- "Not Time-Bound"
- "Adaptable" Specifies that it be flexible, adaptable and responsive to the evolving needs of young people, adults and the scout organizations.
- "Recognizing and Using the Scout Method"
- Acceptance of the Principles and Practices of the "Safe from Harm" framework
- "Recognition of Individual Development" Direct that in each country establish a framework of skills to be attained and the participants be recognized when they are attained.
Within the areas above, the Wood Badge competence framework is directed to cover the development of competencies in the following topic clusters:
- Scouting fundamentals, such as Essential Characteristics of Scouting, Youth Program Implementation, Vision and Growth, Safe from Harm, etc.
- Leadership and Management such as situational leadership, team management and development, taking initiative, leading change, learning organization, etc.
- Project management such as generating ideas, working on plans and solutions, achieving results, evaluating success etc.
- Communicating meaningfully, effectively and with cultural sensitivity.
- Adult development such as facilitating learning, organizing training, providing coaching and mentoring support etc.
Every suggested topic is directed to have a list of competencies developed through various training programs.
Generally, a Wood Badge course consists of classroom work, a series of self-study modules, outdoor training and the Wood Badge "ticket" or "project". Classroom and outdoor training are often combined and taught together and occur over one or more weeks or weekends. As part of completing this portion of the course, participants must write their tickets.
The exact curriculum varies from country to country but the training generally includes both theoretical and experiential learning. All course participants are introduced to the 1st Gilwell Scout Group or Gilwell Scout Troop 1 (the latter name is used in the Boy Scouts of America and some other countries). In the Boy Scouts of America, they are also assigned to one of the traditional Wood Badge "critter" patrols. Instructors deliver training designed to strengthen the patrols. One-on-one work with an assigned troop guide helps each participant to reflect on what they have learned, so that they can better prepare an individualized "ticket". This part of the training program gives the adult Scouter the opportunity to assume the role of a Scout joining the original "model" troop, to learn firsthand how a troop ideally operates. The locale of all initial training is referred to as Gilwell Field, no matter its geographical location.
### Ticket
The phrase 'working your ticket' comes from a story attributed in Scouting legend to Baden-Powell: Upon completion of a British soldier's service in India, he had to pay the cost of his ticket home. The most affordable way for a soldier to return was to engineer a progression of assignments that were successively closer to home.
Part of the transformative power of the Wood Badge experience is the effective use of metaphor and tradition to reach both heart and mind. In most Scout associations, "working your ticket" is the culmination of Wood Badge training. Participants apply themselves and their new knowledge and skills to the completion of items designed to strengthen the individual's leadership and the home unit's organizational resilience in a project or "ticket". The ticket consists of specific goals that must be accomplished within a specified time, often 18 months due to the large amount of work involved. Effective tickets require much planning and are approved by the Wood Badge course staff before the course phase ends. Upon completion of the ticket, a participant is said to have earned his way back to Gilwell.
### On completion
After completion of the Wood Badge course, participants are awarded the insignia in a Wood Badge bead ceremony. They receive automatic membership in the notional 1st Gilwell Park Scout Group or Gilwell Troop 1. These leaders are henceforth called Gilwellians or Wood Badgers. It is estimated that worldwide over 100,000 Scouters have completed their Wood Badge training.
### 1st Gilwell Scout Group
The 1st Gilwell Scout Group is a notional Scout Group composed of Wood Badge recipients. A meeting of the Group is held annually, during the first weekend in September at Gilwell Park for the Gilwell Reunion. Gilwell Reunions are also held in other places, often on that same weekend.
### Training camp symbols
#### Axe and Log
The axe and log logo was conceived by the first Camp Chief, Francis Gidney, in the early 1920s to distinguish Gilwell Park from the Scout Headquarters. Gidney wanted to associate Gilwell Park with the outdoors and Scoutcraft rather than the business or administrative Headquarters offices. Scouters present at the original Wood Badge courses regularly saw axe blades masked for safety by being buried in a log. Seeing this, Gidney chose the axe and log as the totem of Gilwell Park.
#### Other symbols
The kudu horn is another Wood Badge symbol. Baden-Powell first encountered the kudu horn at the Battle of Shangani, where he discovered how the Matabele warriors used it to quickly spread a signal of alarm. He used the horn at the first Scout encampment at Brownsea Island in 1907. It has been used since the early Wood Badge courses to signal the beginning of the course or an activity and to inspire Scouters to always do better.
The grass fields at the back of the White House at Gilwell Park are known as the Training Ground and The Orchard and are where Wood Badge training was held from the early years onward. A large oak, known as the Gilwell Oak, separates the two fields. The Gilwell Oak symbol is associated with Wood Badge, although the beads for the Wood Badge have never been made of this oak.
Wolf Cub leaders briefly followed a separate training system beginning in 1922, in which they were awarded the Akela Badge on completion. The badge was a single fang on a leather thong. Wolf Cub Leader Trainers wore two fangs. The Akela Badge was discontinued in 1925 and all leaders were awarded the Wood Badge on completion of their training. Very few of the fangs issued as Akela Badges can now be found.
## International training centers and trainers
### Australia
The first Australian Wood Badge courses were held in 1920 at Gilwell Park, Gembrook, after the return of two deputy camp chiefs, Charles Hoadley and Mr. Russell, from training at Gilwell Park in England. In 2003, Scouts Australia established its Scouts Australia Institute of Training, a government-registered national Vocational Education and Training (VET) provider, and awards a "Diploma of Leadership and Management" to adult leaders who complete the Wood Badge training and additional competencies. The VET qualifications are recognized throughout Australia by government and private industry.
### Austria
The first Wood Badge training in Austria took place in 1932. Scoutmaster Josef Miegl had taken his Wood Badge training at Gilwell Park, and from September 8 to 17, 1922, he led a leader training course near Vienna, one of the first in Austria, with Scouters from Austria, Germany, Italy and Hungary taking part. He incorporated much of what he had learned at Gilwell Park about international and British Scouting, but the course was not an official Wood Badge training.
### Belgium
The first Wood Badge training in Belgium was held in August 1923 at Jannée, led by Étienne Van Hoof. In the largest Scout association of the country, Les Scouts – Fédération des Scouts Baden-Powell de Belgique, leaders must complete a three-step training programme within three years. After the three steps, the Scout leader becomes a Wood Badger and receives a certificate as a holiday-centre animator (Brevet d'animateur en centre de vacances, BACV) from the French Community of Belgium.
### Canada
Scouts Canada requires Scouters (volunteers) to complete an online Wood Badge Part I course, and encourages them to complete a Wood Badge Part II program that includes self-directed learning, conducted through mentorship and coaching in addition to traditional courses and workshops. Upon completion of the Wood Badge Part II program, a volunteer is awarded their "beads" and the Gilwell Neckerchief.
### Finland
Alfons Åkerman led the first eight Wood Badge courses and served as the first Deputy Camp Chief from 1927 to 1935. In lieu of Gilwell training, the Finnish Scouts have the "Kolmiapila-Gilwell" (Trefoil-Gilwell) training, combining aspects of both girls' and boys' advanced leadership training.
### France
The first Wood Badge training in France was held Easter 1923 by Père Sevin in Chamarande.
### Ireland
Wood Badge training in Ireland goes back to the 1st Larch Hill of the Catholic Boy Scouts of Ireland, who conducted Wood Badge courses that emphasized the Catholic approach to Scouting. This emphasis has disappeared since the formation of Scouting Ireland.
### Hungary
In 2010, 21 years after the reorganization of the Hungarian Scout Association, the first Scoutmaster training under the Wood Badge scheme was held. (Scoutmaster training had been run before, but it was not organized according to the Wood Badge framework.) The head of the first Wood Badge training in Hungary was Balázs Solymosi, a holder of four beads. From 2010 to 2018, more than 50 adult leaders successfully completed the eight courses held and were awarded the Wood Badge. A new era of Wood Badge training in Hungary began in 2019, with two types of courses available: one for leaders within the association and one for local group leaders. The association-level course builds on the foundations laid by Balázs Solymosi, while the group-leader course is based on a new training program. Both programs aim to give participants the highest level of Scouting knowledge from different points of view.
### Madagascar
The first Wood Badge training of the Tily eto Madagasikara, known as the first Lasy Ravinala, was held in 1957 at Dinta Ambohidratrimo, Antananarivo, led by the first Malagasy Chief Commissioner, Samuel Randria.
In Madagascar, participants of the Wood Badge camp may initially wear only the woggle. They receive their first two beads a year later, after writing and defending a dissertation. The Gilwell scarf can only be worn by three-bead (trainer) and four-bead (trainer of trainers) holders.
### The Netherlands
The first Wood Badge training in the Netherlands was held in July 1923 by Scoutmaster Jan Schaap at Gilwell Ada's Hoeve, Ommen. The Catholic Scouts had their training at Gilwell Sint Walrick, Overasselt. Since approximately 2000, the Dutch Wood Badge training has taken place at the Scout campsite Buitenzorg in Baarn, or outdoors in Belgium or Germany, under the name 'Gilwell Training'.
### Norway
In Norway, Woodbadge is known as Trefoil-Gilwell Training.
### Philippines
Wood Badge was introduced in the Philippines in 1953 with the first course held at Camp Gre-Zar in Novaliches, Quezon City. Today, Wood Badge courses are held at the Philippine Scouting Center for the Asia-Pacific Region, at the foothills of Mount Makiling, Los Baños, Laguna province.
### Sweden
As in several other Nordic countries, the Swedish Wood Badge training is known as Trefoil Gilwell, being a unification of the former higher leadership programmes of the Swedish Guides and Scouts, known respectively as the Trefoil training and the Gilwell training.
### United Kingdom
The first Wood Badge training took place at Gilwell Park. The estate continues to provide the service for British Scouters of The Scout Association and international participants. Original trainers include Baden-Powell and Gilwell Camp Chiefs Francis Gidney, John Wilson and John Thurman.
### United States
Wood Badge was introduced to the United States by Baden-Powell. The first course was held in 1936 at the Mortimer L. Schiff Scout Reservation, the Boy Scouts of America national training center until 1979. Despite this early first course, Wood Badge was not formally adopted in the United States until 1948, under the guidance of Bill Hillcourt, who became the first national Deputy Camp Chief of Gilwell in the BSA, also called the Deputy Camp Chief for the United States. Wood Badge courses are held throughout the country at local council camps; others are held at the National High Adventure Bases.
# Yugoslav torpedo boat T6
T6 was a sea-going torpedo boat that was operated by the Royal Yugoslav Navy between 1921 and 1941. Originally 93 F, a 250t-class torpedo boat of the Austro-Hungarian Navy built in 1915–1916, she was armed with two Škoda 66 mm (2.6 in) guns and four 450 mm (17.7 in) torpedo tubes and could carry 10–12 naval mines. She saw active service during World War I, performing convoy, escort, patrol and minesweeping tasks, as well as anti-submarine operations. In 1917 the suffixes of all Austro-Hungarian torpedo boats were removed, and thereafter she was referred to as 93.
Following Austria-Hungary's defeat in 1918, 93 was allocated to the Navy of the Kingdom of Serbs, Croats and Slovenes, which later became the Royal Yugoslav Navy, and was renamed T6. At the time, she and the seven other 250t-class boats were the only modern sea-going vessels of the fledgling maritime force. During the interwar period, T6 and the rest of the navy were involved in training exercises and cruises to friendly ports, but activity was limited by reduced naval budgets. The boat was captured by the Italians during the German-led Axis invasion of Yugoslavia in April 1941. After her main armament was modernised, she served with the Royal Italian Navy under her Yugoslav designation, conducting coastal and second-line escort duties in the Adriatic Sea. Immediately following the Italian capitulation in September 1943, she was scuttled by her crew as she had insufficient fuel on board to reach an Allied port.
## Background
In 1910, the Austro-Hungarian Naval Technical Committee initiated the design and development of a 275-tonne (271-long-ton) coastal torpedo boat, specifying that it should be capable of sustaining 30 knots (56 km/h; 35 mph) for 10 hours. At the same time, the committee issued design parameters for a high seas or fleet torpedo boat of 500–550 t (490–540 long tons), with a top speed of 30 kn and an endurance of 480 nautical miles (890 km; 550 mi). This design would have been a larger and better-armed vessel than the existing Austro-Hungarian 400-tonne (390-long-ton) Huszár-class destroyers. The specification for the high seas torpedo boat was based on an expectation that the Strait of Otranto, where the Adriatic Sea meets the Ionian Sea, would be blockaded by hostile forces during a future conflict. In such circumstances, there would be a need for a torpedo boat that could sail from the Austro-Hungarian Navy base at the Bocche di Cattaro (the Bocche or Bay of Kotor) to the strait during the night, locate and attack blockading ships and return to port before morning. Steam turbine power was selected for propulsion, as diesels with the necessary power were not available and the Austro-Hungarian Navy did not have the practical experience to run turbo-electric boats.
Despite having developed these ideas, the Austro-Hungarian Navy then asked shipyards to submit proposals for a 250 t (250-long-ton) boat with a maximum speed of 28 kn (52 km/h; 32 mph). Stabilimento Tecnico Triestino (STT) of Trieste was selected for the contract to build the first eight vessels, designated as the T-group. Another tender was requested for four more boats, but when Ganz & Danubius reduced their price by ten per cent, a total of sixteen boats were ordered from them, designated the F-group. The F-group designation signified the location of Ganz & Danubius' main shipyard at Fiume.
## Description and construction
The 250t-class F-group boats had short raised forecastles and an open bridge and were fast and agile, well designed for service in the Adriatic. They had a waterline length of 58.76 metres (192 ft 9 in), a beam of 5.84 m (19 ft 2 in), and a normal draught of 1.5 m (4 ft 11 in). While their designed displacement was 243.9 t (240 long tons), they displaced 267 tonnes (263 long tons) fully loaded. The boats were powered by two AEG-Curtis steam turbines driving two propellers, using steam generated by two Yarrow water-tube boilers, one of which burned fuel oil and the other coal. There were two boiler rooms, one behind the other. The turbines were rated at 5,000 shaft horsepower (3,700 kW) with a maximum output of 6,000 shp (4,500 kW) and were designed to propel the boats to a top speed of 28–29 kn (52–54 km/h; 32–33 mph). They carried 20.2 tonnes (19.9 long tons) of coal and 31 tonnes (30.5 long tons) of fuel oil, which gave them a range of 1,200 nautical miles (2,200 km; 1,400 mi) at 16 kn (30 km/h; 18 mph). The F-group had two funnels rather than the single funnel of the T-group. The crew consisted of three officers and thirty-eight enlisted men. The vessel carried one 4 m (13 ft) yawl as a ship's boat.
93 F and the rest of the 250t class were classified as high seas torpedo boats by the Austro-Hungarian Navy, despite being smaller than the original concept for a coastal torpedo boat. The naval historian Zvonimir Freivogel states that this type of situation was common due to the parsimony of the Austro-Hungarian Navy. The 250t class were the first small Austro-Hungarian Navy boats to use turbines, and this contributed to ongoing problems with them, which had to be progressively solved once they were in service.
The boats were armed with two Škoda 66 mm (2.6 in) L/30 guns, with the forward gun mounted on the forecastle and the aft gun on the quarterdeck. A 40 cm (16 in) searchlight was mounted above the bridge. They were also armed with four 450 mm (17.7 in) torpedo tubes mounted in pairs, with one pair mounted between the forecastle and bridge and the other aft of the mainmast. One 8 mm (0.31 in) Schwarzlose M.7/12 machine gun was carried for anti-aircraft work. Four mounting points were installed so that the machine gun could be mounted in the most effective position depending on the expected direction of attack. The boat could also carry 10–12 naval mines.
The fifth of the F-group to be completed at Ganz-Danubius' main shipyard at Fiume, 93 F was laid down on 9 January 1915, launched on 25 November and commissioned on 4 April 1916.
## Career
### World War I
The original concept of operation for the 250t-class boats was that they would sail in a flotilla at the rear of a cruising battle formation and were to intervene in fighting only if the battleships around which the formation was established were disabled, or to attack damaged enemy battleships. When a torpedo attack was ordered, it was to be led by a scout cruiser, supported by two destroyers to repel any enemy torpedo boats. A group of four to six torpedo boats would deliver the attack under the direction of the flotilla commander. On 3 May 1916, 93 F and five other 250t-class torpedo boats were accompanying four destroyers when they were involved in a surface action off Porto Corsini against an Italian force led by the flotilla leaders Cesare Rossarol and Guglielmo Pepe. On this occasion the Austro-Hungarian force retreated behind a minefield without damage. On 12 June, 93 F and the M-group boats 98 M and 99 M were tasked to search for the Nembo-class destroyer Zeffiro and two small torpedo boats after they had attacked the town of Parenzo on the west Istrian coast, but the Italian ships escaped, apart from Zeffiro, which was damaged in an attack by seaplanes. On 12 and 13 July, 93 F conducted trials with new smoke generators. On 29 October, she underwent repairs at the main Austro-Hungarian naval base at Pola in the northern Adriatic.
In 1917, one of 93 F's 66 mm guns may have been placed on an anti-aircraft mount. According to Freivogel, sources vary on whether these mounts were added to all boats of the class, and on whether these mounts were added to the forward or aft gun. In March, the noted inventor Dagobert Müller von Thomamühl was in command of the boat with the rank of Linienschiffsleutnant. At this time, 93 F was allocated to the 7th Torpedo Boat Group of the 5th Torpedo Division. On 11 May 1917, the British submarine HMS H1 stalked 78 T off Pola, missing her with two torpedoes. In response, 93 F, 96 F and 78 T, accompanied by the Huszár-class destroyer Csikós, unsuccessfully pursued H1. On 21 May, the suffix of all Austro-Hungarian torpedo boats was removed, and thereafter they were referred to only by the numeral. Also in May, 93 conducted several minesweeping missions. On 3–4 June, 93, along with 96 as well as Csikós and her sister ships Wildfang and Velebit, were returning from a seaplane support mission when Wildfang struck a mine and sank, with 93 assisting with the rescue of her surviving crew. On 11 July, 93 was transferred south from Pola to the Bocche. On 23 September, she was patrolling near the mouth of the Bojana River that marks the border between Montenegro and Albania, when an unidentified Allied submarine fired a torpedo at her, but it passed under her hull without exploding. During 1917, 93 conducted further minesweeping missions and escorted 36 convoys.
On 1 February 1918, a mutiny broke out among the sailors of some vessels of the Austro-Hungarian Navy at the Đenovići anchorage within the Bocche, largely over poor food, lack of replacement uniforms and supplies, and insufficient leave, although the poor state of the Austro-Hungarian economy and its impact on their families was also a factor. In response, the three Erzherzog Karl-class pre-dreadnought battleships of the 3rd Division were despatched from Pola to the Bocche the following day to put down the rebellion, and 93 was part of their escort. In the event, the mutiny had been suppressed before they arrived, and 93 returned to Pola with the battleships soon after. On 22 April, 93 was part of an escort for the dreadnoughts Viribus Unitis and Szent István when they sailed to the Fasana Channel – between the Brijuni Islands and the Istrian Peninsula – for gunnery practice. 93 escorted the minelayer Camäleon while she laid a minefield on 22 July. On 11 August, 93, along with her sisters 78 and 80, the destroyer Warasdiner and the submarine chasers Arsa and Slavija, were despatched to chase the Italian submarine F7 which had sunk the steamship Euterpe off the island of Pag but had to terminate the pursuit due to poor weather. Two days later she joined the vessels of the anti-submarine flotilla in a hunt, but despite claims of success, no enemy submarine was sunk. On 20 August, 93 was transferred to the Bocche and was part of the 1st Torpedo Flotilla. On 29 September, 93 along with 82, 87 and 96 plus the Ersatz Triglav-class destroyers Lika, Dukla and Uzsok, laid mines in the Bay of Drim off northern Albania. During 1918, 93 also performed 55 more convoy escorts. As the end of the war approached in November the Austro-Hungarian Empire broke apart. On 1 November 1918, 93 was ceded to the State of Slovenes, Croats and Serbs. This was a short-lived fragment of the empire which united with the Kingdom of Serbia and Kingdom of Montenegro on 1 December, becoming the Kingdom of Serbs, Croats and Slovenes (from 1929, the Kingdom of Yugoslavia).
### Interwar period
The Austro-Hungarian Empire sued for peace in November 1918, and 93 survived the war intact. Immediately after the Austro-Hungarian capitulation, French troops occupied the Bocche, which was treated by the Allies as Austro-Hungarian territory. During the French occupation, the captured Austro-Hungarian Navy ships moored at the Bocche were neglected, and 93's original torpedo tubes were destroyed or damaged by French troops. In 1920, under the terms of the previous year's Treaty of Saint-Germain-en-Laye by which rump Austria officially ended World War I, 93 was allocated to the Kingdom of Serbs, Croats and Slovenes. Along with 87, 96 and 97, and four 250t-class T-group boats, she served with the Royal Yugoslav Navy (Serbo-Croatian Latin: Kraljevska Mornarica, KM; Краљевска Морнарица). Transferred in March 1921, 93 was renamed T6 in KM service. When the navy was formed, she and the other seven 250t-class boats were its only modern sea-going vessels. New torpedo tubes of the same size as originally fitted were ordered from the Strojne Tovarne factory in Ljubljana. In KM service T6 was rearmed with a single Bofors 40 mm (1.6 in) L/60 gun, and was also fitted with two Zbrojovka 15 mm (0.59 in) machine guns. Her crew was increased to 52, and she was commissioned in 1923.
In 1925, exercises were conducted off the Dalmatian coast, involving the majority of the navy. In May and June 1929, six of the eight 250t-class torpedo boats – including T6 – accompanied the more recently acquired light cruiser Dalmacija, submarine tender Hvar and submarines Hrabri and Nebojša, on a cruise to Malta, the Greek island of Corfu in the Ionian Sea, and Bizerte in the French protectorate of Tunisia. The ships and crews made a very good impression while visiting Malta. In 1932, the British naval attaché reported that Yugoslav ships engaged in few exercises, manoeuvres or gunnery training due to reduced budgets. By 1939, the maximum speed achieved by the 250t class in Yugoslav service had declined to 24 kn (44 km/h; 28 mph).
### World War II
In April 1941, Yugoslavia entered World War II when it was invaded by the German-led Axis powers. At the time of the invasion, T6 was assigned to the 3rd Torpedo Division located at Šibenik, which also included her sisters T3, T5 and T7. On the first day of the invasion, 6 April, they were anchored across the entrance of the St. Anthony Channel that links Šibenik Bay to the Adriatic, on a line between Jadrija on the northern side of the channel and Zablaće on the southern side, when aircraft of the Regia Aeronautica (Italian Royal Air Force) attacked Šibenik. On the same day, Kapetan bojnog broda Ivan Kern arrived to take command of the division, and the four boats sailed up the channel towards Šibenik then north to Zaton where they were again attacked unsuccessfully by Italian bombers. T3 incurred boiler damage and was sent south to Primošten for repairs to be undertaken.
On 8 April more unsuccessful Italian air attacks on the three remaining boats occurred, and the only effective anti-aircraft gun between them – the 40 mm (1.6 in) gun on T6 – malfunctioned. The three vessels then sailed east across Lake Prokljan to Skradin, where the population begged them to leave the harbour to avoid the town being bombed by the Italians. Their request was rebuffed, and during an Italian bombing raid some of the boats along with the water carrier Perun were slightly damaged. The following morning, Italian aircraft attempted to sink Perun using aerial torpedoes, but all missed. In response, Kern ordered T6 to escort Perun to the Bay of Kotor; the two vessels arrived there the next day without incident. There, T6's malfunctioning gun was repaired, and she was loaded with weapons, supplies and extra men and sent to Šibenik. On the return journey she stopped at Makarska and learned of the declaration of the creation of the Independent State of Croatia (NDH), an Axis puppet fascist state. On the same day, the division, along with other vessels, were tasked to support an attack on the Italian enclave of Zara on the Dalmatian coast, which was quickly cancelled as soon as the establishment of the NDH was declared. On the evening of 11 April, T6 met with T5 and the rest of the division near Šibenik. Kern ordered her to deliver her load to Šibenik then meet the rest of the division at Milna on the island of Brač. While overnighting at Šibenik, T6's crew saw the crew of the Uskok-class torpedo boat Četnik desert their boat during the night. T6 sailed to Milna on 12 April. Kern was unable to obtain orders from Šibenik Command by telephone, so he took the other Uskok-class boat Uskok to try to obtain some. His second-in-command was unable to maintain order, and a third of the crews of the division deserted. When Kern returned, he gave orders to sail to the Bay of Kotor, but the crews of the division refused to follow his orders. Kern retrieved his personal gear from T7 and, taking command of Uskok, sailed to the Bay of Kotor. Eventually he fled into exile with other KM vessels. On 13 April, the Orjen-class torpedo boat Triglav arrived with orders that the division should return to Šibenik to evacuate the staff of the KM Šibenik Command. The first order was complied with, but upon arrival at Šibenik the boat crews were given the choice of returning to their homes or sailing to Split to join the NDH navy. T6's commander, a Slovene, was not interested in serving in a Croatian navy and abandoned his vessel. The boats then sailed to nearby Divulje, to follow through on an intention to join the NDH navy, but the Italians opposed the NDH having a navy and they captured the boats of the division including T6.
T6 was then operated by the Italians under her Yugoslav designation, conducting coastal and second-line escort duties in the Adriatic. Second-line escort duties were those where she was less likely to be required to engage Allied warships. Her main guns were replaced by two 76.2 mm (3 in) L/30 anti-aircraft guns, she was fitted with one or two Breda 20 mm (0.79 in) L/65 anti-aircraft guns, her bridge was enclosed, and one pair of torpedo tubes may have also been removed. She was also painted in a dazzle camouflage pattern. In Italian hands, her crew was increased to 64. She was allocated to Maridalmazia, the military maritime command of Dalmatia (Comando militare marittimo della Dalmazia), which was responsible for the area from the northern Adriatic island of Premuda south to the port of Bar in the Italian governorate of Montenegro. According to Italian records, in February 1942, T6 and her sister T5 chased an Allied submarine between Split and the island of Mulo near Primošten, but there is no record of this incident in British records. During a convoy escort in the same month, an explosion occurred in one of T6's boilers, killing one stoker and wounding four. Based at Šibenik at the time of the Italian capitulation in September 1943, T6 was underway to the island of Lošinj south of Rijeka (formerly Fiume). Her commander suspected that Lošinj had already been occupied by the Germans, so he set course for Cesenatico – 30 km (19 mi) north of Rimini on the Italian coast – with the intention of continuing to Ancona to avoid the boat falling into German hands. He was unable to make contact with the Italian naval command at Ancona, and when no fuel was available at Cesenatico, he scuttled T6 off the town on 11 September as she had insufficient fuel remaining on board to reach an Allied port. Her crew became German prisoners of war.
## See also
- List of ships of the Royal Yugoslav Navy
# HMS New Zealand (1911)
HMS New Zealand was one of three Indefatigable-class battlecruisers. Launched in 1911, the ship was funded by the government of New Zealand as a gift to Britain, and she was commissioned into the Royal Navy in 1912. She had been intended for the China Station, but was released by the New Zealand government at the request of the Admiralty for service in British waters.
During 1913, New Zealand was sent on a ten-month tour of the British Dominions, with an emphasis on a visit to her namesake nation. She was back in British waters at the start of the First World War, and operated as part of the Royal Navy's Grand Fleet, in opposition to the German High Seas Fleet. The battlecruiser participated in all three of the major North Sea battles—Heligoland Bight, Dogger Bank, and Jutland—and was involved in the response to the inconclusive Raid on Scarborough, and the Second Battle of Heligoland Bight. New Zealand contributed to the destruction of two cruisers and was hit by enemy fire only once, sustaining no casualties; her status as a "lucky ship" was attributed by the crew to a Māori piupiu (warrior's skirt) and hei-tiki (pendant) worn by the captain during battle.
After the war, New Zealand was sent on a second world tour, this time to allow Admiral John Jellicoe to review the naval defences of the Dominions. In 1920, the battlecruiser was placed in reserve. She was broken up for scrap in 1922 to meet the United Kingdom's tonnage limit in the disarmament provisions of the Washington Naval Treaty.
## Design
The Indefatigable class was not a significant improvement on the preceding Invincible class; the main difference was the enlargement of the dimensions to give the ships' two wing turrets a wider arc of fire. The ships were smaller and not as well protected as the contemporary German battlecruiser SMS Von der Tann and subsequent German designs. While Von der Tann's characteristics were not known when the lead ship of the class, Indefatigable, was laid down in February 1909, the Royal Navy obtained accurate information on the German ship before work began on New Zealand and her sister ship Australia.
New Zealand had an overall length of 590 feet (179.8 m), a beam of 80 feet (24.4 m), and a draught of 29 feet 9 inches (9.1 m) at deep load. The ship displaced 18,500 long tons (18,800 t) at load and 22,130 long tons (22,490 t) at deep load. She initially had a crew of 818 officers and ratings, though this was to increase in subsequent years. At the time of her visit to New Zealand in 1913 the engineering department had a staff of 335.
The ship was powered by two sets of Parsons direct-drive steam turbines, each driving two propeller shafts using steam provided by 31 coal-burning Babcock & Wilcox boilers. The turbines were rated at 44,000 shaft horsepower (33,000 kW) and were intended to give the ship a maximum speed of 25 knots (46 km/h; 29 mph). However, during trials in 1912, the turbines produced over 49,000 shp (37,000 kW), which allowed New Zealand to reach 26.39 knots (48.87 km/h; 30.37 mph). The ship carried enough coal and fuel oil to give her a range of 6,690 nautical miles (12,390 km; 7,700 mi) at a speed of 10 knots (19 km/h; 12 mph).
The ship carried eight BL 12-inch Mk X guns in four twin gun turrets. Two turrets were mounted fore and aft on the centreline, identified as 'A' and 'X' respectively. The other two were wing turrets mounted amidships and staggered diagonally: 'P' was forward and to port of the centre funnel, while 'Q' was situated starboard and aft. Each wing turret had a limited ability to fire to the opposite side, but if the ship was full broadside to her target she could bring all eight main guns to bear. Her secondary armament consisted of sixteen 4-inch BL Mk VII guns positioned in the superstructure. She mounted two 18-inch (450 mm) submerged torpedo tubes, one on each side aft of 'X' barbette, and twelve torpedoes were carried.
The Indefatigables were protected by a waterline 4–6-inch (102–152 mm) armoured belt that extended between and covered the end barbettes. Their armoured deck ranged in thickness between 1.5 and 2.5 inches (38 and 64 mm) with the thickest portions protecting the steering gear in the stern. The turret faces were 7 inches (178 mm) thick, and the turrets were supported by barbettes of the same thickness.
New Zealand's 'A' turret was fitted with a 9-foot (2.7 m) rangefinder at the rear of the turret roof. It was also equipped to control the entire main armament in the event that the normal fire control positions were knocked out or communication between the primary positions and the gun layers was disabled.
### Wartime modifications
The ship was fitted with a single QF 6 pounder Hotchkiss anti-aircraft (AA) gun from October 1914 to the end of 1915. In March 1915, a single QF 3 inch 20 cwt AA gun was added. It was provided with 500 rounds. The battlecruiser's 4-inch guns were enclosed in casemates and given blast shields during a refit in November to better protect the gun crews from weather and enemy action. Two aft guns were removed at the same time.
New Zealand received a fire-control director sometime between mid-1915 and May 1916; this centralised fire control under the director officer. The turret crewmen merely had to follow pointers transmitted from the director to align their guns on the target. This greatly increased accuracy, as it was easier to spot the fall of shells and eliminated the problem of the ship's roll dispersing the shells when each turret fired independently.
To address deficiencies in the armour of British capital ships revealed by the Battle of Jutland, New Zealand entered the dockyard in November 1916, where an additional inch of armour was added to selected horizontal areas of the main deck. In the forward part of the ship it covered the magazines for 'A' turret and the 4-inch guns; amidships it covered the magazines for 'Q' and 'P' turrets, and was extended vertically by 3 feet 6 inches (1.07 m) to protect the magazine trunks and escape shafts. During a refit in June 1917 the armour was again improved when 1-inch armour plate was added on the lower deck at the bottom of the inner and outer upper coal bunkers as well as over the boilers.
By 1918, New Zealand carried two aircraft, a Sopwith Pup and a Sopwith 1½ Strutter, on flying-off ramps fitted on top of 'P' and 'Q' turrets. The Pup was intended to shoot down Zeppelins while the 1½ Strutter was used for spotting and reconnaissance. Each platform had a canvas hangar to protect the aircraft during inclement weather.
### Post-war modifications
In preparation for her role as Admiral Jellicoe's personal transport for his planned visit to Australia, Canada, India and New Zealand, New Zealand was refitted between December 1918 and February 1919. The fore topmast and both topgallants were replaced. Her flying-off platforms were removed and a new peacetime trim was applied. The range clocks were removed and the deflection scales on the turrets were painted over. The lower forward four-inch guns were removed and replaced with cabins on the port and starboard sides of the forward superstructure to house Jellicoe and provide offices for his staff of eight.
## Acquisition and construction
At the start of the 20th century, the British Admiralty maintained that naval defence of the British Empire, including the Dominions, should be unified under the Royal Navy. Attitudes on this matter softened during the first decade, and at the 1909 Imperial Conference, the Admiralty proposed the creation of Fleet Units: forces consisting of a battlecruiser, three light cruisers, six destroyers, and three submarines. While Australia and Canada were encouraged to purchase fleet units to serve as the core of new national navies, other fleet units would be operated by the Royal Navy at distant bases, particularly in the Far East; New Zealand was asked to partially subsidise a fleet unit for the China Station.
To this end, the Prime Minister of New Zealand, Sir Joseph Ward, announced on 22 March 1909 that his country would fund a battleship (later changed to an Indefatigable-class battlecruiser) as an example to other countries. It is unclear why this design was selected, given that it was known to be inferior to the battlecruisers entering service with the Imperial German Navy. Historian John Roberts has suggested that the request may have been attributable to the Royal Navy's practice of using small battleships and large cruisers as flagships of stations far from the United Kingdom, or it might have reflected the preferences of the First Sea Lord, Admiral of the Fleet John Fisher, preferences that were not widely shared. The New Zealand Government took out a loan to fund the cost of the ship.
When it came to naming the new ship, the most obvious name was already in use by the existing King Edward VII-class battleship HMS New Zealand. It was decided to transfer the name to the new battlecruiser and to rename the older ship. Names including Arawa, Caledonia, Wellington and Maori (which was already in use by a destroyer, and thus would have required a double renaming) were floated before Zealandia was eventually decided upon and subsequently approved by the King.
### Construction
Wright has identified that the Controller of the Admiralty, John Jellicoe, wanted to have Australia and New Zealand constructed by the same shipbuilder, which would have reduced construction costs and simplified administration. Tenders were issued early in 1910, but each of the shipbuilders prepared to tender was willing to construct only one vessel. For unknown reasons both Australia and New Zealand agreed to accept the tender of John Brown & Company, which was the higher of the two successful tenders, but the former signalled its acceptance first, leaving New Zealand to accept that of Fairfield Shipbuilding and Engineering. The estimated cost of Fairfield's offer was £1.8 million, which included the guns and all stores, including the first issue of coal and ammunition. Fairfield had already built HMS Indomitable, which would have given them confidence in their cost estimate. In the end John Brown & Company delivered Australia well under their original tendered price.
New Zealand's keel was laid at Fairfield's yard on the Clyde on 20 June 1910. The construction contract was between the Admiralty and Fairfield (using the Admiralty's standard contract terms) and was overseen by the Admiralty, with the manufacturer's payment claims being approved and then passed on by the Admiralty to the New Zealand High Commission's office in London. Variation claims were often individually itemised (such as £1 12s. 6d. for a specific drawing) and passed on for payment, with some payments still being processed as late as the 1914–15 financial year. The ship was built with all stores supplied from the Admiralty at the "Rate Book" price plus 20 per cent, with the exception of the coal. The Admiralty did not charge New Zealand for its management of the project. Fairfield's share of the contract made a profit of £50,454 (6 per cent).
The four main gun mountings were made by Armstrong Whitworth's Elswick Ordnance Works, at a cost of £207,593 (excluding delivery and assembly), while the guns were supplied by both Armstrong Whitworth and Vickers. The twenty-two 12-inch guns (including six spares) and thirty-six 4-inch guns (including four spares) required to equip both of the Dominions' ships cost a combined total of £249,550.
New Zealand was launched on 1 July 1911 in front of 8,000 onlookers by Lady Theresa Ward, the wife of Sir Joseph Ward, using a bottle of New Zealand wine for the christening. Following her launch New Zealand was moved by the Clyde Shipping Company's tugs Flying Linnet and Flying Swallow to the shipyard's fitting-out basin for the installation of the boilers, engines and auxiliary machinery through temporary openings in the main deck, before the superstructure and armament were installed.
The battlecruiser's first captain, Lionel Halsey, took command on 21 September 1912. Sea trials began in October, with the hull checked in dry dock on 8 October prior to a 30-hour steam test at three-quarter power undertaken on 9 and 10 October. Full power tests were conducted off Polperro on 14 October, with 49,048 shp being generated. Over the "measured mile" she reached 25.49 knots (based on revolutions) and 26.3 knots (by bearings).
New Zealand was formally commissioned at Govan on 19 November 1912. The Admiralty required that all new ships be drydocked as part of the acceptance process to allow the completion and inspection of all underwater fittings. As Fairfield did not have its own drydock, the ship sailed from Govan with the nucleus of her crew to Devonport to use that shipyard's facilities. By now the ship's hull had spent a considerable time in Fairfield's often polluted fitting-out basin, so the hull was cleaned and then painted with a fresh anti-fouling coating.
The ship was officially completed on 23 November 1912, when she reached her nominally full complement of crew. Her officers by now included three New Zealanders, Lieutenant Alexander David Boyle, Lieutenant Rupert Clare Garsia and Midshipman Hugh Beckett Anderson, all from Christchurch.
To signal her upcoming completion, the New Zealand government commissioned the marine artist William Lionel Wyllie to produce a painting of New Zealand, which he titled 'Tower House', Portsmouth [HMS "New Zealand" fitting out]. In subsequent years he also produced other paintings of the ship.
## Service history
In December 1912 the battlecruiser began the task of working up prior to joining the 1st Battlecruiser Squadron. While at sea over the 1912-13 New Year some of the masting was damaged by a storm.
### 1913 circumnavigation of the world
In 1912 it was agreed that the ship would visit her donor country as a 'thank you' for funding her construction, with a basic nine-month itinerary developed in the last months of 1912. To facilitate the flag-waving cruise, New Zealand was temporarily detached from the 1st Battlecruiser Squadron on 20 January 1913 for the duration of the voyage, with Halsey having independent command. The planned date of departure progressively slipped into 1913, with the ship finally departing the Royal Navy dockyard at Devonport on 28 January for Portsmouth, which she reached two days later. On 3 February, 300 expatriate New Zealanders organized by Sir Thomas Mackenzie (New Zealand's High Commissioner to the United Kingdom) visited the ship, during which Mackenzie unveiled the battlecruiser's coat of arms (a gift from the country's expatriate community in the United Kingdom). This was followed by a visit by King George V accompanied by Winston Churchill, James Allen (New Zealand's Minister of Finance and Defence) and other high-ranking officials on 5 February 1913.
As soon as the King's party had departed, New Zealand took on coal before departing Portsmouth on 6 February. There were stops at St Vincent, Ascension Island, Cape Town, Simon's Town and Durban in South Africa, and then at Melbourne in Australia, before New Zealand reached Wellington, New Zealand, on 12 April. This was the start of an event that gripped the country, as thousands of New Zealanders came to catch sight of, and where possible visit, "our Dreadnought". For the ship's crew this meant attending a constant parade of events and festivities. After an 11-day stay in the capital, New Zealand proceeded up the east coast of the North Island to visit Napier, Gisborne and Auckland, before steaming south to Lyttelton and Akaroa, where she exercised with HMS Pyramus, and continuing on to Timaru, Otago Harbour, Bluff, Milford Sound, Greymouth, Westport, Nelson and Picton, before stopping again at Wellington. From there she proceeded up the west coast of the North Island, visiting Wanganui and Russell, before returning to Auckland, which was reached on 21 June.
The battlecruiser received numerous gifts while in New Zealand, including a naval ensign and a union jack. Two greenstone hei-tiki (pendants), which were intended to ward off evil, were gifted to the ship. One was given by the Boy Scouts of Wellington on 13 April, and the second by Christchurch businessman C. J. Sloman in May 1913. Sloman had deposited the hei-tiki at Canterbury Museum in 1913 and then uplifted it a few months later to lend it to the ship, on the condition that it be returned to Canterbury Museum should the name New Zealand ever be removed from the navy list.
The most notable gift was the personal gift to Halsey of a Māori piupiu (a warrior's skirt made from rolled flax). According to legend, the chief who gave the piupiu to Halsey instructed him to wear it during battle to protect the ship and its crew. If he did, then the ship would be involved in three sea battles; it would be hit only once; and no one on board would be killed. On many of these occasions speeches were given in the Māori language, which may have resulted in a misunderstanding about the purpose of the gift. It is unclear exactly who presented the piupiu to Halsey, as he did not record details about the giver or about any prophecy. There are a number of possibilities. One is that it was given by Rotorua Māori in Auckland on 26 June. Another is that it was given by Rangitīaria Dennan in Rotorua on 7 May; this account is supported by Halsey's daughter, who mentions meeting Dennan and discussing her father being gifted a piupiu when he was made an honorary chief of the tribe. Another possibility is that the piupiu was given by the Te Arawa chief Mita Taupopoki. On 17 April a large group from Ngāti Raukawa visited the battlecruiser in Wellington, at which it is recorded that "a presentation of piupiu (garments of war)" was made. Another likely candidate is that the piupiu was given to Halsey on behalf of the Ngāi Tahu chief Mana Himiona Te Ataotu by the Southern Māori MP, Taare Rakatauhake Parata (Charles Rere Parata), when he visited the ship in Wellington on 19 April 1913; on this occasion a piupiu was recorded as being given. A delegation of 25 leading Māori (including Māori members of parliament) did visit the battlecruiser in Wellington on 21 April, among whom was Tureiti Te Heuheu Tukino V, a leading chief of Ngāti Tūwharetoa, but on this occasion it was reported that "two kiwi robes, a tangiwai pendant, two korowai robes, and a kickio ('carpet mat')" were gifted.
As a result of this visit, the officers and crew of New Zealand maintained a close relationship with her donor country and its citizens over her years of service, and her adventures were closely followed in the Dominion's newspapers. Though none of the crew were Māori, they would occasionally perform the haka (in which they had received instruction while in New Zealand) at functions. The ship's Māori connection was also maintained by her official letterhead, which featured the name "Aotearoa", the Māori name for New Zealand.
By the time the battlecruiser departed New Zealand from Auckland on 28 June for Fiji, a total of 376,114 New Zealanders had visited the vessel during her time in the country, though other sources quote 376,086, 368,118 and 378,068. It is estimated that approximately another 125,000 had been able to see the ship either from the shore or from boats. At the time the country had a population of one million. The battlecruiser steamed across the Pacific via Suva, Fiji, and Honolulu to dock on 23 July at the naval base of Esquimalt on Vancouver Island, Canada.
After departing Esquimalt, New Zealand headed south, stopping at Mazatlán, Acapulco, Panama City, Callao, Valparaíso and Punta Arenas, before she steamed through the Strait of Magellan and on to Montevideo, Rio de Janeiro, various islands of the Caribbean and finally Halifax, Nova Scotia, in Canada, before arriving in Portsmouth on 8 December 1913, having circumnavigated the globe. She had sailed 45,320 miles, consumed 31,833 tons of coal and had been visited by 500,151 people in what was the longest voyage to date by a vessel of the dreadnought era.
### Assigned to the Grand Fleet
The Admiralty requested that New Zealand return to the United Kingdom when the tour concluded, rather than remain in the Pacific region as originally planned. The New Zealand Government acceded to the request. As a result, upon her return to the United Kingdom, New Zealand joined the 1st Battlecruiser Squadron (1st BCS) of the Grand Fleet. The squadron visited Brest in February 1914, and Riga, Reval and Kronstadt in the Russian Empire the following June. While there, the ships were visited by the Tsar and Tsarina on 27 June, and that evening a formal ball was hosted in conjunction with Lion, which was moored alongside. On 29 June the squadron departed for the United Kingdom. The intention was that New Zealand would decommission on 30 August prior to transferring to the Mediterranean Fleet, where she would become the flagship of Rear Admiral Archibald Moore, but the outbreak of war cancelled that deployment.
### First World War
On 19 August 1914, shortly after the First World War began, New Zealand was transferred to the 2nd Battlecruiser Squadron (2nd BCS).
#### Battle of Heligoland Bight
New Zealand's first wartime action was the Battle of Heligoland Bight on 28 August 1914, as part of the battlecruiser force under the command of Admiral David Beatty. Beatty's ships were originally intended to provide distant support for the British cruisers and destroyers closer to the German coast, in case large units of the High Seas Fleet sortied in response to the British attacks once the tide rose. When the British light forces failed to disengage on schedule at 11:35, the battlecruisers, led by Beatty aboard his flagship, Lion, began to head south at full speed to reinforce the smaller British ships; the rising tide meant that German capital ships would be able to clear the sandbar at the mouth of the Jade estuary.
The brand-new light cruiser Arethusa had been crippled earlier in the battle and was under fire from the German light cruisers SMS Strassburg and SMS Cöln when Beatty's battlecruisers loomed out of the mist at 12:37. By this time, New Zealand had fallen behind the three newer and faster battlecruisers and was not in position to significantly participate in the battle. Strassburg was able to evade fire by hiding in the mists, but Cöln remained visible and was quickly crippled by the British squadron. Before the German ship could be sunk, Beatty was distracted by the sudden appearance of the elderly light cruiser SMS Ariadne off his starboard bow. He turned to pursue, but Ariadne was set afire after only three salvos fired from under 6,000 yards (5,500 m). At 13:10, Beatty turned north and made a general signal to retire. Shortly after turning north, the battlecruisers encountered the crippled Cöln, which was sunk by two salvos from Lion. During the battle, New Zealand's captain, Lionel Halsey, wore the Māori piupiu over his uniform, setting a tradition followed for the duration of the war. Two days after the battle, New Zealand was transferred back to the 1st BCS, when the battlecruiser Inflexible arrived from the Mediterranean.
#### Raid on Scarborough
The German Navy had decided on a strategy of bombarding British towns on the North Sea coast in an attempt to draw out the Royal Navy and destroy elements of it in detail. An earlier raid on Yarmouth on 3 November 1914 had been partially successful, but a larger-scale operation was later devised by Admiral Franz von Hipper. The fast battlecruisers would conduct the bombardment, while the rest of the High Seas Fleet stationed itself east of Dogger Bank, so they could cover the battlecruisers' return and destroy any pursuing British vessels. Having broken the German naval codes, the British were planning to catch the raiding force on its return journey, although they were not aware of the High Seas Fleet's presence. Admiral Beatty's 1st BCS (now reduced to four ships, including New Zealand) and the 2nd Battle Squadron (consisting of six dreadnoughts) were detached from the Grand Fleet in an attempt to intercept the Germans near Dogger Bank.
Admiral Hipper's raiders set sail on 15 December 1914, and successfully bombarded several English towns; British destroyers escorting the 1st BCS had already encountered German destroyers of the High Seas Fleet at 05:15 and fought an inconclusive action with them. Vice Admiral Sir George Warrender, commanding the 2nd Battle Squadron, had received a signal at 05:40 that the destroyer Lynx was engaging enemy destroyers, although Beatty had not. The destroyer Shark spotted the German armoured cruiser SMS Roon and her escorts at about 07:00, but could not transmit the message until 07:25. Admiral Warrender received the signal, as did New Zealand, but Beatty, aboard Lion, did not, even though New Zealand had been specifically tasked to relay messages between the destroyers and the flagship. Warrender attempted to pass on Shark's message to Beatty at 07:36, but did not manage to make contact until 07:55. On receiving the message, Beatty reversed course, and dispatched New Zealand to search for Roon. She was being overhauled by New Zealand when Beatty received messages that Scarborough was being shelled at 09:00. Beatty ordered New Zealand to rejoin the squadron and turned west for Scarborough.
The British forces, heading west to cover the main route through the minefields protecting the coast of England, split up while passing the shallow Southwest Patch of Dogger Bank; Beatty's ships headed to the north, while Warrender passed to the south. This left a 15-nautical-mile (28 km; 17 mi) gap between them, through which the German light forces began to move. At 12:25, the light cruisers of the II Scouting Group began to pass the British forces searching for Hipper. The light cruiser Southampton spotted the light cruiser SMS Stralsund and signalled a report to Beatty. At 12:30, Beatty turned his battlecruisers toward the German ships, which he presumed were the advance screen for Hipper's ships. However, those were some 50 kilometres (31 mi) behind. The 2nd Light Cruiser Squadron, which had been screening for Beatty's ships, detached to pursue the German cruisers, but a misinterpreted signal from the British battlecruisers sent them back to their screening positions. This confusion allowed the German light cruisers to escape, and alerted Hipper to the location of the British battlecruisers. The German battlecruisers wheeled to the north-east of the British forces and also made good their escape.
#### Battle of Dogger Bank
New Zealand became flagship of the 2nd BCS of the Grand Fleet on 15 January 1915. Eight days later, a force of German battlecruisers under the command of Admiral Hipper sortied to clear Dogger Bank of any British fishing boats or small craft that might be there to collect intelligence on German movements. Alerted by decoded German transmissions, a larger force of British battlecruisers, including New Zealand, sailed under the command of Admiral Beatty to intercept. Contact was initiated at 07:20 on the 24th, when Arethusa spotted the German light cruiser SMS Kolberg. By 07:35, the Germans had spotted Beatty's force and Hipper ordered a turn south at 20 knots (37 km/h; 23 mph), believing that this speed would outdistance any British battleships to the north-west; he planned to increase speed to the armoured cruiser SMS Blücher's maximum of 23 knots (43 km/h; 26 mph) if necessary to outrun any battlecruisers.
Beatty ordered his battlecruisers to make all practical speed to catch the Germans before they could escape. New Zealand and Indomitable were the slowest of Beatty's ships, and gradually fell behind the newer battlecruisers, despite New Zealand achieving an indicated speed of 27 knots thanks to the original overdesign of her engines and the efforts of her stokers. Despite dropping behind, New Zealand was able to open fire on Blücher by 09:35, and continued to engage the armoured cruiser after the other British battlecruisers had switched targets to the German battlecruisers. After about an hour, New Zealand had knocked out Blücher's forward turret, and Indomitable began to fire on her as well at 10:31. Two 12-inch shells pierced the German ship's armoured deck and exploded in an ammunition room four minutes later. This started a fire amidships that destroyed her two port 21 cm (8.3 in) turrets, while the concussion damaged her engines so that her speed dropped to 17 knots (31 km/h; 20 mph), and jammed her steering gear. At 10:48, Beatty ordered Indomitable to attack her, but the combination of a signalling error by Beatty's flag lieutenant and heavy damage to Beatty's flagship Lion, which had knocked out her radio and caused enough smoke to obscure her signal halyards, caused the rest of the British battlecruisers, temporarily under the command of Rear Admiral Sir Gordon Moore in New Zealand, to think that the signal applied to them. In response, they turned away from Hipper's main body and engaged Blücher. New Zealand fired 147 shells at Blücher before the German ship capsized and sank at 12:07 after being torpedoed by Arethusa. Other sources dispute the number of shells fired by New Zealand, with Wright stating 151 shells (12 common and 139 high-explosive 12-inch shells) during the action. Halsey had again worn the piupiu over his uniform during the battle, and the lack of damage to New Zealand was once more attributed to its good luck properties.
New Zealand was relieved by Australia as flagship of the 2nd BCS on 22 February. The squadron joined the Grand Fleet in a sortie on 29 March, in response to intelligence that the German fleet was leaving port as the precursor to a major operation. By the next night, the German ships had withdrawn, and the squadron returned to Rosyth. On 11 April, the British fleet was again deployed on intelligence that a German force was planning an operation. The Germans intended to lay mines at the Swarte Bank, but after a scouting Zeppelin located a British light cruiser squadron, they began to prepare for what they thought was a British attack. Heavy fog and the need to refuel caused Australia and the British vessels to return to port on 17 April, and although they were redeployed that night, they were unable to stop two German light cruisers from laying the minefield. In June, Halsey was promoted to Captain of the Fleet, with the rank of Commodore, aboard HMS Iron Duke and was succeeded as captain of New Zealand by J.F.E. (Jimmy) Green. Although the piupiu was his personal property, Halsey left it in Green's care.
#### Collision with HMAS Australia
On the morning of 21 April, the 2nd BCS left Rosyth at 04:00 (accompanied by the 4th Light Cruiser Squadron and destroyers), again bound for the Skagerrak, this time to support efforts to disrupt the transport of Swedish ore to Germany. The planned destroyer sweep of the Kattegat was cancelled when word came that the High Seas Fleet was mobilising for an operation of its own (later learned to be timed to coincide with the Irish Easter Rising), and the British ships were ordered to a rendezvous point in the middle of the North Sea with the 1st and 3rd Battlecruiser Squadrons, while the rest of the Grand Fleet made for the south-eastern end of the Long Forties. At 15:30 on the afternoon of 22 April, the three squadrons of battlecruisers were patrolling together to the north-west of Horn Reefs when heavy fog came down; the ships were steaming abreast at 19.5 knots, with Australia on the port flank. Concerned about a possible submarine attack, Beatty issued instructions at 15:35 for the fleet to commence zigzagging. It took some time for the instruction to be relayed by signal flag down the line, so it was not until 15:40 that Australia, with a cruiser to her port side, commenced her first zigzag and swung to starboard. Her crew were aware that New Zealand was on that side, about five cables (926 m) away, but in the poor visibility they did not see her until it was too late; the two ships collided at 15:43, despite Australia attempting to turn away to port. Australia's side was torn open from frames 59 to 78 by the armour plate on the hull below her sister ship's 'P' turret, while as New Zealand turned away her outer port propeller damaged Australia's hull below her 'Q' turret.
Australia slowed to half-speed as the mist hid her sister ship, but the damage to New Zealand's propeller caused a temporary loss of control and she swung back in front of Australia, which, despite turning to port, had her stem crushed at 15:46 as she scraped the side of New Zealand just behind her 'P' turret. Both ships came to a complete stop about 30–40 yd (27–37 m) apart while their respective officers assessed the damage. The damage control teams on Australia were soon busy shoring up bulkheads and sealing off the damaged sections to prevent any more water entering the ship. Meanwhile, off-watch Australian sailors took advantage of a convenient potato locker to hurl both its contents and insults at the crew of their nearby sister ship. New Zealand was soon underway, returning to Rosyth with the rest of the squadron. The same fog caused the battleship Neptune to collide with a merchant ship, and the destroyers Ambuscade, Ardent and Garland to collide with one another.
Once it was safe to proceed, Australia, with her speed restricted initially to 12 knots and later to 16 knots, arrived back at Rosyth to find both drydocks occupied, one by New Zealand and the other by HMS Dreadnought, so she departed for Newcastle-on-Tyne, where she was further damaged trying to dock during strong winds. As this facility could not handle all of the repairs she needed, the battlecruiser was ordered to Devonport. Australia was not able to return to sea until 31 May, thus missing the Battle of Jutland. Meanwhile, New Zealand replaced her damaged propeller with Australia's spare, which was in store at Rosyth, and returned to the fleet on 30 May, a day before the start of the Battle of Jutland. Due to the continued absence of Australia, Rear Admiral William Christopher Pakenham transferred his flag from Indefatigable to New Zealand.
#### Battle of Jutland
On 31 May 1916, the 2nd BCS consisted of its flagship New Zealand and Indefatigable; Australia was still under repair following her collision with New Zealand. The squadron was assigned to Admiral Beatty's Battlecruiser Fleet, which had put to sea to intercept a sortie by the High Seas Fleet into the North Sea. The British were able to decode the German radio messages and left their bases before the Germans put to sea. Hipper's battlecruisers spotted the Battlecruiser Fleet to their west at 15:20, but Beatty's ships did not spot the Germans to their east until 15:30. Two minutes later, he ordered a course change to east-south-east to position himself astride the Germans' line of retreat and called his ships' crews to action stations. He also ordered the 2nd BCS, which had been leading, to fall in astern of the 1st BCS. Hipper ordered his ships to turn to starboard, away from the British, to assume a south-easterly course, and reduced speed to 18 knots (33 km/h; 21 mph) to allow three light cruisers of the 2nd Scouting Group to catch up. With this turn, Hipper was falling back on the High Seas Fleet, then about 60 miles (97 km) behind him. Around this time, Beatty altered course to the east as it was quickly apparent that he was still too far north to cut off Hipper.
Thus began the so-called "Run to the South" as Beatty changed course to steer east-south-east at 15:45, paralleling Hipper's course, now that the range had closed to under 18,000 yards (16,000 m). The Germans opened fire first at 15:48, followed by the British. The British ships were still in the process of making their turn, and only the two leading ships, Lion and Princess Royal, had steadied on their course when the Germans opened fire. The British formation was echeloned to the right with Indefatigable in the rear and the furthest to the west, and New Zealand ahead of her and slightly further east. The German fire was accurate from the beginning, but the British overestimated the range as the German ships blended into the haze. Indefatigable aimed at Von der Tann, while New Zealand, herself unengaged, targeted SMS Moltke. By 15:54, the range was down to 12,900 yards (11,800 m) and Beatty ordered a course change two points to starboard to open up the range at 15:57. Indefatigable was destroyed at about 16:03, when her magazines exploded.
After Indefatigable's loss, New Zealand shifted her fire to Von der Tann in accordance with Beatty's standing instructions. The range had grown too great for accurate shooting, so Beatty altered course four points to port to close the range again between 16:12 and 16:15. By this time, the 5th Battle Squadron, consisting of four Queen Elizabeth-class battleships, had closed up and was engaging Von der Tann and Moltke. At 16:23, a 13.5-inch (340 mm) shell from Tiger struck near Von der Tann's rear turret, starting a fire among the practice targets stowed there that completely obscured the ship and caused New Zealand to shift fire to Moltke. At 16:26, the ship was hit by a 28-centimetre (11 in) shell, fired by Von der Tann, on 'X' barbette that detonated on contact and knocked loose a piece of armour that briefly jammed 'X' turret and blew a hole in the upper deck. Four minutes later, Southampton, scouting in front of Beatty's ships, spotted the lead elements of the High Seas Fleet charging north at top speed. Three minutes later, she sighted the topmasts of Vice-Admiral Reinhard Scheer's battleships, but did not transmit a message to Beatty for another five minutes. Beatty continued south for another two minutes to confirm the sighting himself before ordering a sixteen-point turn to starboard in succession. New Zealand, the last ship in the line, turned prematurely to stay outside the range of the oncoming battleships.
New Zealand was straddled several times by the battleship SMS Prinzregent Luitpold but was not hit. Beatty's ships maintained full speed in an attempt to increase the distance between them and the High Seas Fleet, and gradually moved out of range. They turned north and then north-east to try to rendezvous with the main body of the Grand Fleet. At 17:40, they opened fire again on the German battlecruisers. The setting sun blinded the German gunners, and as they could not make out the British ships, they turned away to the north-east at 17:47. Beatty gradually turned more towards the east to allow him to cover the deployment of the Grand Fleet into battle formation and to move ahead of it, but he mistimed his manoeuvre and forced the leading division to fall off towards the east, further away from the Germans. By 18:35, Beatty was following Indomitable and Inflexible of the 3rd BCS as they steered east-south-east, leading the Grand Fleet and continuing to engage Hipper's battlecruisers to their south-west. A few minutes earlier, Scheer had ordered a simultaneous 180° starboard turn and Beatty lost sight of the High Seas Fleet in the haze. Twenty minutes later, Scheer ordered another 180° turn, which put his ships on a converging course with the British, who had altered course to the south. This allowed the Grand Fleet to cross Scheer's "T", forming a battle line that cut across the head of his line and badly damaged his leading ships. Scheer ordered yet another 180° turn at 19:13 in an attempt to extricate the High Seas Fleet from the trap into which he had sent it.
This was successful, and the British lost sight of the Germans until 20:05, when Castor spotted smoke bearing west-north-west. Ten minutes later, she had closed the range enough to identify German torpedo boats, and engaged them. Beatty turned west upon hearing gunfire and spotted the German battlecruisers only 8,500 yards (7,800 m) away. Inflexible opened fire at 20:20, followed by the rest of Beatty's battlecruisers. New Zealand and Indomitable concentrated their fire on SMS Seydlitz, and hit her five times before she turned west to disengage. Shortly after 20:30, the pre-dreadnought battleships of Rear Admiral Mauve's II Battle Squadron were spotted and fire switched to them. The Germans had poor visibility and were able to fire only a few rounds in reply before turning away to the west. The British battlecruisers hit the German ships several times before they blended into the haze around 20:40. After this, Beatty changed course to south-south-east and maintained that course, ahead of both the Grand Fleet and the High Seas Fleet, until 02:55 the next morning, when the order was given to reverse course and head home.
New Zealand arrived back in Rosyth on 2 June and dropped anchor at 09:55. The crew had approximately 50 minutes' rest before, given the possibility that she might have to put to sea again, they began the task of refuelling with 1,178 tons of coal and then replenishing the ammunition with 480 twelve-inch shells, work which continued until 03:30 the following morning.
New Zealand fired 430 twelve-inch shells during the battle – 100 from 'A' turret, 129 from 'P' turret, 105 from 'Q' turret and 96 from 'X' turret – more than any other ship on either side. Despite this rate of fire, only four hits were credited to her: three on Seydlitz and one on the pre-dreadnought SMS Schleswig-Holstein, a hit rate of less than one per cent. Other than the single hit on 'X' turret, the only other damage was from near misses and was minimal: a shell hole through the silk jack, a splinter strike on the ensign staff, some damage to the bow of the No. 3 cutter, and three hits on the No. 2 picket boat. This confirmed to the crew that the piupiu and hei-tiki worn by Captain Green brought good luck.
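As a minimal arithmetic check, based only on the figures quoted above, the per-turret totals and the sub-one-per-cent hit rate can be verified directly:

```latex
% Check of the shell totals and hit rate quoted in the text.
\[
  100_{\mathrm{A}} + 129_{\mathrm{P}} + 105_{\mathrm{Q}} + 96_{\mathrm{X}} = 430 \ \text{shells fired},
  \qquad
  \frac{4 \ \text{hits}}{430 \ \text{shells}} \approx 0.93\% < 1\%.
\]
```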
#### Post-Jutland career
New Zealand was relieved by Australia as flagship on 9 June and temporarily attached to the 1st Battlecruiser Squadron, until HMS Renown relieved her in September. On the evening of 18 August, the Grand Fleet put to sea in response to a message deciphered by Room 40 that indicated that the High Seas Fleet, minus II Squadron, would be leaving harbour that night. The German objective was to bombard Sunderland on 19 August, based on extensive reconnaissance provided by airships and submarines. The Grand Fleet sailed with 29 dreadnought battleships and six battlecruisers. Throughout the next day, Jellicoe and Scheer received conflicting intelligence; after reaching the location in the North Sea where the British expected to encounter the High Seas Fleet, they turned north in the erroneous belief that they had entered a minefield. Scheer turned south again, then steered south-eastward to pursue a lone British battle squadron sighted by an airship, which was in fact the Harwich Force of cruisers and destroyers under Commodore Tyrwhitt. Realising their mistake, the Germans changed course for home. The only contact came in the evening when Tyrwhitt sighted the High Seas Fleet but was unable to achieve an advantageous attack position before dark, and broke off contact. The British and the German fleets returned home; the British lost two cruisers to submarine attacks, and one German dreadnought had been torpedoed. New Zealand underwent a refit at Rosyth in November 1916. She temporarily replaced Australia as squadron flagship between 29 November and 7 January 1917.
On 1 October 1917 Green gave up command of the ship following his promotion to Rear-Admiral, but it was not until 13 December 1917 that Captain Edward Kennedy assumed temporary command, which he held until 17 January 1918, when Richard Webb took over as permanent captain. Webb remained captain until September, when he was made a rear-admiral and left to take up the role of Assistant High Commissioner at Constantinople. In the latter stages of the war a number of New Zealand soldiers on leave were able to take advantage of the open invitation extended to them by New Zealand's captain to visit the ship.
German minesweepers and escorting light cruisers were attempting to clear British-laid minefields in the Heligoland Bight in late 1917. The Admiralty planned a large operation for 17 November to destroy the ships, and allocated two light cruiser squadrons and the 1st Cruiser Squadron covered by the reinforced 1st Battlecruiser Squadron and, more distantly, the 1st Battle Squadron of battleships. New Zealand was attached to the 1st BCS for this operation, which became known as the Second Battle of Heligoland Bight. New Zealand did not fire her guns during the battle. As in previous engagements, Captain Green wore the piupiu and tiki for luck.
During 1918, New Zealand and the Grand Fleet's other capital ships were used on occasion to escort convoys between the United Kingdom and Norway. The 2nd BCS spent the period from 8 to 21 February covering these convoys in company with battleships and destroyers, and put to sea on 6 March in company with the 1st BCS to support minelayers. The 2nd BCS again supported minelayers in the North Sea from 25 June or 26 June to the end of July. During September and October, New Zealand and the 2nd BCS supervised and protected minelaying operations north of Orkney. In the former month Leonard Andrew Boyd Donaldson took over command of the ship and remained in command until 11 February 1919.
By the time of the 1918 Armistice, New Zealand had, since August 1914, sailed 84,458 nautical miles, consumed 97,034 tons of coal and fired a total of 664 twelve-inch shells in action. As a member of the 2nd BCS, the battlecruiser was present at the surrender of the High Seas Fleet in November 1918. To witness the event, New Zealand embarked five soldiers from the New Zealand Division and a New Zealand newspaper reporter. New Zealand was assigned responsibility for checking the compliance of SMS Derfflinger with the terms of its internment.
### Post-war
In December 1918 New Zealand was used to convey Queen Maud and Prince Olav from Norway for their state visit to the United Kingdom. With the war at an end, most of the United Kingdom's older capital ships were put into reserve; they were by now obsolete, and with the government wishing to make significant cuts in military expenditure there was little chance of their returning to full service, especially once the formal peace treaty was signed with Germany in mid-1919. One exception was New Zealand, which it was decided would be used to transport Admiral Jellicoe on an expected year-long visit to India and the dominions of Australia, Canada and New Zealand, to assist with planning and coordinating their naval policies and defences. To prepare her for the voyage, the battlecruiser was refitted between December 1918 and 11 February 1919, at the end of which she was recommissioned with a virtually all-new crew under the command of Captain Oliver Elles Leggett.
The battlecruiser departed Portsmouth on 21 February 1919 and, while crossing the Bay of Biscay, encountered a storm that forced the evacuation of the newly constructed accommodation for Jellicoe and his staff when it became apparent that the dockyard had failed to seal the holes in the structure. After a 24-hour stop at Gibraltar for Jellicoe to make his first official visit, the battlecruiser continued on to Port Said to take on approximately 2,000 tons of coal before passing through the Suez Canal and making a brief stop at Suez, where Jellicoe rejoined the ship (having left her at Port Said to visit Cairo), before crossing the Arabian Sea to reach Bombay on 14 March. While Jellicoe was engaged in a week of consultations in Delhi, 1,740 tons of coal was taken on board and the opportunity was taken to have the battlecruiser painted in the dockyard. This was completed on 22 March, in time for the ship to host a ball three days later. The battlecruiser then made a two-day visit to Karachi before returning to Bombay. While in Karachi a sailor was killed after falling off a balcony while ashore. Once back in Bombay some of the crew got into trouble while on shore leave, which was cancelled in response.
New Zealand departed Bombay on 1 May for Colombo, which was reached two days later, where 1,800 tons of coal and 700 tons of oil were taken on board in preparation for the journey across the Indian Ocean. By 9 May the battlecruiser was in the vicinity of the Cocos (Keeling) Islands, and the opportunity was taken to divert so that the crew could see the remains of the light cruiser SMS Emden.
The ship arrived at Albany, Western Australia, on 15 May, where Jellicoe and his staff disembarked to travel overland across the country. New Zealand sailed via Perth, Outer Harbor (near Adelaide), Melbourne and Hobart, taking the opportunity to exercise with Australia and other units of the RAN, before reaching Sydney. There she was drydocked in the Sutherland Dock at Cockatoo Island, where her bottom was scraped and painted, before being refloated and coaled. The battlecruiser left Sydney on 16 August for New Zealand.
Wellington was reached on 20 August, at a time when the influenza pandemic was rampant. As a result, the crew was subjected to a medical inspection before anyone was allowed to disembark. While in Wellington the ship was visited by approximately 50,000 New Zealanders prior to 24 August, before she proceeded south to Lyttelton, which was reached on 1 September. The ship then proceeded north to anchor off Picton on 13 September, where she spent two days, and then, after a stop in Wellington, sailed up the east coast of the North Island to reach Auckland on 22 September. During the next six weeks, as he visited ports throughout the country, Jellicoe prepared a three-volume report for the government. The ship was particularly popular in New Zealand, with Jellicoe, the officers and the crew attending numerous social engagements, and crowds flocked to visit the battlecruiser as they had done in 1913. The tour also allowed Jellicoe and his staff to familiarize themselves with the country as they prepared recommendations for the New Zealand government on its naval policy. Jellicoe himself was popular, and he later returned to New Zealand to serve as Governor-General from 1920 to 1924.
The battlecruiser left Auckland on 3 October, stopping briefly with mail at Suva in Fiji and at Samoa, where her 12-inch guns were fired to entertain the local chiefs, then at Fanning Island (for six hours) and Hawaii. En route the ship called at Christmas Island (Kiritimati), southeast of Fanning Island, thinking it uninhabited. Instead, the crew were greeted by Joe English, of Medford, Massachusetts, who had been manager of a copra plantation on the island but had become marooned with two others when the war broke out. The men were evacuated.
The battlecruiser arrived in Canada, the final country to be assessed, docking at Esquimalt on Vancouver Island on 8 November. The Jellicoes left the ship on 20 November to tour Canada and the United States by train before re-joining the ship at Key West. On 11 November two rugby teams from the ship competed against local teams from Victoria: the officers played the Wanderers and the crew played the V.I.A.A. (Vancouver Island Athletic Association).
After leaving Vancouver the ship stopped at San Diego before passing through the Panama Canal into the Caribbean, where, as well as visiting Havana, time was spent in Jamaica exercising the main armament. During a stop at Port of Spain on the island of Trinidad a petty officer fell off a wharf and drowned. Heading north, the battlecruiser picked up Jellicoe at Key West on 8 January 1920. The battlecruiser reached Portsmouth on 3 February, having covered 33,514 nautical miles. As Jellicoe had been promoted to Admiral of the Fleet while overseas, the ship was greeted by the appropriate 19-gun salute from HMS Victory.
### Put into reserve
On 6 February New Zealand was towed by tugs to a mooring on the Hamoaze. Most of the crew were sent on six weeks' leave, with a skeleton crew of 250 remaining behind under the command of Lieutenant Commander Alexander David Boyle. Leggett gave up command of New Zealand and was succeeded by Captain Hartley Russell Gwennap Moore (1881–1953) on 11 March. Moore remained in that position until July 1921.
New Zealand was paid off into reserve on 15 March 1920. By this time the battlecruiser was regarded as obsolete by the Royal Navy, as she was coal powered and her 12-inch guns were inferior to the 15-inch (381 mm) guns deployed on the latest generation of capital ships. She was briefly recommissioned on 1 July 1921 with a reserve crew to replace HMS Hercules as flagship at Rosyth under the command of Captain Ralph Eliot (1881–1958), who had previously been in command of Hercules. Eliot was to be the ship's last captain, and remained in command until 1 September.
### Scrapping
It was agreed that New Zealand, along with all of the other British 12-inch gunned battleships and battlecruisers, would be scrapped to meet the tonnage restrictions set on the British Empire by the Washington Naval Treaty.
New Zealand was sold for scrap together with Agincourt and Princess Royal to the Exeter-based electrical engineering firm of J&W Purves, with the proviso that they had to be demolished within 18 months of the Washington Naval Treaty being ratified. To meet the Admiralty's desire to provide work for unemployed dock workers at Rosyth Dockyard, the contract was immediately transferred to a new entity chaired by A. Wallace Cowan (1877–1964), the Rosyth Shipbreaking Company, which would undertake the scrapping of the vessels at Rosyth. It took until 19 December 1922 to legally organize the transfer of the ships from the Royal Navy to the new company, which had among its directors Admiral J.F.E. Green, who had commanded New Zealand at the Battle of Jutland. Leased facilities were set up adjacent to where the vessels were lying alongside a wharf on the south side of the main basin in the naval dockyard at Rosyth. The vessels were taken over on 25 January 1923, with work commencing first on New Zealand. By March 1923 her superstructure had been removed and she was moved out of the basin and beached above the low-tide mark outside the wall of the northwest dockyard. A large portion of New Zealand's hull was still being dismantled in July 1924, and it was not until September 1924 that the last components of the ship were removed from the site, her place on the beach being taken over by Princess Royal. Between them the three vessels yielded 40,000 tons of steel, approximately 10,000 tons of armour plate and even 3,000 tons of coal still in their bunkers.
The New Zealand government received £20,000 from the sale of the vessel and completed paying off the loan used to fund the ship in the 1944/45 financial year.
## Artifacts
By the time of the decision to scrap her, New Zealand had an impressive collection of silverware and trophies (officially listed at 47 in January 1919).
As well as the above-mentioned silverware and trophies, numerous other items were removed from the vessel prior to scrapping and sent back to New Zealand. Among the items were the ship's bell, a boomerang, two greenstone mere (clubs), silver cups, gunnery shields, two hei-tiki, a complete laundry, a 42-foot-long motor launch, the ship's flags, some searchlights, a steering wheel, and four 4-inch QF guns and their associated rangefinders. Some furniture was sent to the High Commission in London, though the Commission missed out on the wardroom buffet, which ended up in New Zealand's Parliament restaurant, Bellamy's. Most of these items arrived in New Zealand in late 1923. The ship's former captains were sent furniture from the captain's cabin.
The 4-inch guns, a rangefinder and the laundry equipment were used by military units. During the Second World War, the 4-inch guns were the main armament of the land batteries which protected the entrances to the harbours at Auckland, Wellington and Lyttelton. Two of these guns had stood outside the northern entrance to the Auckland War Memorial Museum since 23 November 1929; at the outbreak of the war they were removed, with one returned to service while the other, which was too damaged to repair, was placed in storage at the museum. Two guns were returned to display outside the museum in 1959.
On 12 December 1924 A. Wallace Cowan presented an ink stand and cigar boxes made from the ship's timbers to the New Zealand High Commissioner Sir James Allen and the then New Zealand Prime Minister William Massey (who was in the United Kingdom at the time), while a third cigar box was sent to Ward. One of these cigar boxes is currently held by the Auckland Museum. Teak from the ship was used as flooring in Cowan's house. A photo album of the breaking up of the vessel was presented by Cowan's daughter to the Royal New Zealand Navy in 1968 and is now held by National Archives New Zealand.
Auckland War Memorial Museum has among its collection Pelorus Jack's silver collar (a gift received from the New Zealanders of Transvaal), another brass-studded collar and his harness. Another collar, gifted by the Pretoria Public Works Department, is held by the Royal New Zealand Navy Museum, Devonport.
The other artifacts are on display in various museums in New Zealand. The hei-tiki donated by C. J. Sloman has been in the Canterbury Museum since 1932. The auxiliary steering wheel and an engine telegraph, once displayed in the Wellington Maritime Museum, are now held, together with other items, by the Museum of New Zealand Te Papa Tongarewa in Wellington.
Apart from when it was lent for display at the 1940 Centennial Exhibition in New Zealand, the captain's piupiu remained with Halsey until his death in 1949. His daughter Ruth bequeathed it to New Zealand upon her death in 2002, and since 2005 it has been on display at the Torpedo Bay Navy Museum in Auckland alongside the ship's bell, the wardroom buffet and other artifacts, including the piece of armour knocked off X-turret at the Battle of Jutland. When HMS Queen Mary exploded at Jutland, debris from the ship fell on New Zealand, among which was a ring-bolt; this is also now in the collection of the Torpedo Bay Navy Museum.
The South Canterbury Museum in Timaru, New Zealand, holds the naval ensign which flew from New Zealand during all of her naval engagements in World War I. The ensign and a union jack had been purchased by the women's branch of the Navy League in Timaru and presented to the ship when she visited Timaru in May 1913.
## Ship's mascot
The ship's first mascot was a bulldog, Pelorus Jack, donated by a New Zealander resident in London and named after the famous dolphin that greeted ships in the Marlborough Sounds of New Zealand. He was "discharged dead" from the Navy on 24 April 1916 after falling down the forward funnel. His will requested not only that his successor be a "bull pup of honest parentage, clean habits, and moral tendencies", but also that "no Dachshound or other dog of Teutonic extraction" be permitted on board.
His successor's service at the Battle of Jutland left him frightened of gunfire, and when it was considered unlikely that he could survive the ship's return voyage to the United Kingdom he was discharged with the rank of leading sea dog and given to the people of Auckland in October 1919. Following a six-month quarantine, Jack was taken into the care of the superintendent of parks.
# Japanese aircraft carrier Sōryū
Sōryū (Japanese: 蒼龍, meaning "Blue (or Green) Dragon") was an aircraft carrier built for the Imperial Japanese Navy (IJN) during the mid-1930s. A sister ship, Hiryū, was intended to follow Sōryū, but Hiryū's design was heavily modified and she is often considered to be a separate class. Sōryū's aircraft were employed in operations during the Second Sino-Japanese War in the late 1930s and supported the Japanese invasion of French Indochina in mid-1940. During the first months of the Pacific War, she took part in the attack on Pearl Harbor, the Battle of Wake Island, and supported the conquest of the Dutch East Indies. In February 1942, her aircraft bombed Darwin, Australia, and she continued on to assist in the Dutch East Indies campaign. In April, Sōryū's aircraft helped sink two British heavy cruisers and several merchant ships during the Indian Ocean raid.
After a brief refit, Sōryū and three other carriers of the 1st Air Fleet (Kidō Butai) participated in the Battle of Midway in June 1942. After bombarding American forces on Midway Atoll, the carriers were attacked by aircraft from the island and the carriers Enterprise, Hornet, and Yorktown. Dive bombers from Yorktown crippled Sōryū and set her afire. Japanese destroyers rescued the survivors but the ship could not be salvaged and was ordered to be scuttled so as to allow her attendant destroyers to be released for further operations. She sank with the loss of 711 officers and enlisted men of the 1,103 aboard. The loss of Sōryū and three other IJN carriers at Midway was a crucial strategic defeat for Japan and contributed significantly to the Allies' ultimate victory in the Pacific.
## Design and description
Sōryū was one of two large carriers approved for construction under the Imperial Japanese Navy's 1931–32 Supplementary Program (the other being her near-sister Hiryū). In contrast to some earlier Japanese carriers, such as Akagi and Kaga, which were conversions of battlecruiser and battleship hulls respectively, Sōryū was designed from the keel up as an aircraft carrier and incorporated lessons learned from the light carrier Ryūjō.
The ship had a length of 227.5 meters (746 ft 5 in) overall, a beam of 21.3 meters (69 ft 11 in) and a draught of 7.6 meters (24 ft 11 in). She displaced 16,200 tonnes (15,900 long tons) at standard load and 19,100 tonnes (18,800 long tons) at normal load. Her crew consisted of 1,100 officers and ratings.
### Machinery
Sōryū was fitted with four geared steam turbine sets with a total of 152,000 shaft horsepower (113,000 kW), each driving one propeller shaft using steam provided by eight Kampon water-tube boilers. The turbines and boilers were the same as those used in the Mogami-class cruisers. The ship's power and slim, cruiser-type hull, with a length-to-beam ratio of 10:1, gave her a speed of 34.5 knots (63.9 km/h; 39.7 mph) and made her the fastest carrier in the world at the time of her commissioning. Sōryū carried 3,730 metric tons (3,670 long tons) of fuel oil which gave her a range of 7,750 nautical miles (14,350 km; 8,920 mi) at 18 knots (33 km/h; 21 mph). The boiler uptakes were trunked together to the ship's starboard side amidships and exhausted just below flight deck level through two funnels curved downwards.
### Flight deck and hangars
The carrier's 216.9-meter (711 ft 7 in) flight deck was 26 meters (85 ft 4 in) wide and overhung her superstructure at both ends, supported by pairs of pillars. Sōryū's island was built on a starboard-side extension that protruded beyond the side of the hull so that it did not encroach on the width of the flight deck. Nine transverse arrestor wires were installed on the flight deck and could stop a 6,000 kg (13,000 lb) aircraft. The flight deck was only 12.8 meters (42 ft) above the waterline and the ship's designers kept this distance low by reducing the height of the hangars. The upper hangar was 171.3 by 18.3 metres (562 by 60 ft) and had an approximate height of 4.6 meters (15 ft 1 in); the lower was 142.3 by 18.3 metres (467 by 60 ft) and had an approximate height of 4.3 meters (14 ft 1 in). Together they had an approximate total area of 5,736 square metres (61,742 sq ft). This caused problems in handling aircraft because the wings of a Nakajima B5N "Kate" torpedo bomber could neither be spread nor folded in the upper hangar.
Aircraft were transported between the hangars and the flight deck by three elevators, the forward one abreast the island on the centerline and the other two offset to starboard. The forward platform measured 16 by 11.5 meters (52 ft 6 in × 37 ft 9 in), the middle one 11.5 by 12 meters (37 ft 9 in × 39 ft 4 in), and the rear 11.8 by 10 meters (38 ft 9 in × 32 ft 10 in). They were capable of transferring aircraft weighing up to 5,000 kilograms (11,000 lb). Sōryū had an aviation gasoline capacity of 570,000 liters (130,000 imp gal; 150,000 U.S. gal) for her planned aircraft capacity of sixty-three plus nine spares.
### Armament
Sōryū's primary anti-aircraft (AA) armament consisted of six twin-gun mounts equipped with 50-caliber 12.7-centimeter Type 89 dual-purpose guns mounted on projecting sponsons, three on either side of the carrier's hull. The guns had a range of 14,700 meters (16,100 yd), and a ceiling of 9,440 meters (30,970 ft) at an elevation of +90 degrees. Their maximum rate of fire was fourteen rounds a minute, but their sustained rate of fire was around eight rounds per minute. The ship was equipped with two Type 94 fire-control directors to control the 12.7-centimeter (5.0 in) guns, one for each side of the ship, although the starboard director on the island could control all of the Type 89 guns.
The ship's light AA armament consisted of fourteen twin-gun mounts for license-built Hotchkiss 25 mm (1 in) Type 96 AA guns. Three of these were sited on a platform just below the forward end of the flight deck. The gun was the standard Japanese light AA weapon during World War II, but it suffered from severe design shortcomings that rendered it largely ineffective. According to historian Mark Stille, the weapon had many faults including an inability to "handle high-speed targets because it could not be trained or elevated fast enough by either hand or power, its sights were inadequate for high-speed targets, it possessed excessive vibration and muzzle blast". These guns had an effective range of 1,500–3,000 meters (1,600–3,300 yd), and a ceiling of 5,500 meters (18,000 ft) at an elevation of +85 degrees. The effective rate of fire was only between 110 and 120 rounds per minute because of the frequent need to change the 15-round magazines. The Type 96 guns were controlled by five Type 95 directors, two on each side and one in the bow.
### Armor
To save weight Sōryū was minimally armored; her waterline belt of 41 millimeters (1.6 in) of Ducol steel only protected the machinery spaces and the magazines. Comparable figures for Hiryū were 90 millimeters (3.5 in) over the machinery spaces and the aviation gasoline storage tanks increasing to 150 millimeters (5.9 in) over the magazines. Sōryū's waterline belt was backed by an internal anti-splinter bulkhead. The ship's deck was only 25 mm thick over the machinery spaces and 55 millimeters (2.2 in) thick over the magazines and aviation gasoline storage tanks.
## Construction and service
Following the Japanese ship-naming conventions for aircraft carriers, Sōryū was named "Blue (or Green) Dragon". The ship was laid down at the Kure Naval Arsenal on 20 November 1934, launched on 23 December 1935 and commissioned on 29 December 1937. She was assigned to the Second Carrier Division after commissioning. Her air group was initially intended to consist of eighteen Mitsubishi A5M ("Claude") monoplane fighters, twenty-seven Aichi D1A2 ("Susie") Type 96 dive bombers, and twelve Yokosuka B4Y ("Jean") Type 96 torpedo bombers, but the A5Ms were in short supply and Nakajima A4N1 biplanes were issued instead. On 25 April 1938, nine A4Ns, eighteen D1A2s, and nine B4Ys transferred to Nanjing to support forces advancing up the Yangtze River. The air group advanced with the successful Japanese offensive, despite the defense by the Chinese Air Force and the Soviet Volunteer Group; it was transferred to Wuhu in early June and then to Anqing. Little is known of its operations there, but its primary role during this time was air defense. One fighter pilot of the group was killed after he shot down a Chinese aircraft. Leaving a few fighters and their pilots behind to serve as the nucleus of a new fighter unit, the air group returned to Sōryū on 10 July. The ship supported operations over Canton in September, but her aircraft saw no aerial combat. She returned home in December and spent most of the next year and a half training.
In September–October 1940, the ship was based at Hainan Island to support the Japanese invasion of French Indochina. In February 1941, Sōryū moved to Taiwan to reinforce the blockade of southern China. Two months later, the 2nd Carrier Division was assigned to the First Air Fleet, or Kido Butai, on 10 April. Sōryū's air group was detached in mid-July and transferred to Hainan Island to support the occupation of southern Indochina. Sōryū returned to Japan on 7 August and became flagship of the 2nd Division. She was relieved of that role on 22 September as she began a short refit that was completed on 24 October. The ship arrived at Kagoshima two days later and she resumed her former role as flagship of the Division.
### Pearl Harbor and subsequent operations
In November 1941 the IJN's Combined Fleet, under Admiral Isoroku Yamamoto, prepared to participate in Japan's initiation of war with the United States by conducting a preemptive strike against the US Navy's Pacific Fleet base at Pearl Harbor, Hawaii. On 22 November, Sōryū, commanded by Captain Ryusaku Yanagimoto, and the rest of the Kido Butai under Vice Admiral Chuichi Nagumo, including six fleet carriers from the First, Second, and Fifth Carrier Divisions, assembled in Hitokappu Bay at Etorofu Island. The fleet departed Etorofu on 26 November and followed a course across the north-central Pacific to avoid commercial shipping lanes. At this time Sōryū embarked 21 Mitsubishi A6M Zero fighters, 18 Aichi D3A "Val" dive bombers, and 18 Nakajima B5N torpedo bombers. From a position 230 nautical miles (430 km; 260 mi) north of Oahu, Sōryū and the other five carriers launched two waves of aircraft on the morning of 8 December 1941 (Japan time).
In the first wave, eight of Sōryū's B5Ns were supposed to attack the aircraft carriers that normally berthed on the northwest side of Ford Island, but none were in Pearl Harbor that day; six B5Ns attacked the ships that were present, torpedoing the target ship Utah, causing her to capsize, and the elderly light cruiser Raleigh, damaging her. Two of the B5N pilots diverted to their secondary target, ships berthed alongside "1010 Pier", where the fleet flagship was usually moored. That battleship was in drydock and her position was occupied by the light cruiser Helena and the minelayer Oglala. One torpedo passed underneath Oglala and struck Helena in one of her engine rooms; the other pilot rejected these targets and attacked the battleship California. Her other ten B5Ns were tasked to drop 800-kilogram (1,800 lb) armor-piercing bombs on the battleships berthed on the southeast side of Ford Island ("Battleship Row") and may have scored one or two hits on them. Her eight A6M Zeros strafed parked aircraft at Marine Corps Air Station Ewa, claiming twenty-seven aircraft destroyed in addition to five aircraft shot down.
Sōryū's second wave consisted of nine A6M Zeros and seventeen D3As. The former attacked Naval Air Station Kaneohe Bay, losing one Zero to American anti-aircraft guns. On the return trip, the Zero pilots claimed to have shot down two American aircraft while losing two of their own. The D3As attacked various ships in Pearl Harbor, but it is not possible to identify which aircraft attacked which ship. Two of them were shot down during the attack.
While the fleet was returning to Japan, Nagumo ordered that Sōryū and Hiryū be detached on 16 December to attack the defenders of Wake Island, who had already defeated the first Japanese attack on the island. The two carriers reached the vicinity of the island on 21 December and launched twenty-nine D3As and two B5Ns, escorted by eighteen Zeros, to attack ground targets. They encountered no aerial opposition and launched thirty-five B5Ns and six A6M Zeros the following day. These were intercepted by the two surviving Grumman F4F Wildcat fighters of Marine Fighter Squadron VMF-211, which shot down two B5Ns before being shot down themselves by the Zeros. The garrison surrendered the next day after Japanese troops were landed.
The carriers arrived at Kure on 29 December. They were assigned to the Southern Force on 8 January 1942 and departed four days later for the Dutch East Indies. The ships supported the invasion of the Palau Islands and the Battle of Ambon, attacking Allied positions on the island on 23 January with fifty-four aircraft. Four days later the carriers detached eighteen Zeros and nine D3As to operate from land bases in support of Japanese operations in the Battle of Borneo. On 30 January they destroyed two aircraft on the ground and shot down a Qantas Short Empire flying boat flying to Surabaya to pick up refugees.
Sōryū and Hiryū arrived at Palau on 28 January and waited for the arrival of the carriers Kaga and Akagi. All four carriers departed Palau on 15 February and launched air strikes against Darwin, Australia, four days later. Sōryū contributed eighteen B5Ns, eighteen D3As, and nine Zeros to the attack, while also keeping fighters on Combat Air Patrol (CAP) over the carriers. Her aircraft attacked the ships in the port and its facilities, sinking or setting on fire eight ships and causing three others to be beached lest they sink. The Zeros destroyed a single Consolidated PBY Catalina flying boat; one D3A was lost. The Japanese aircraft spotted a ship on the return trip but had expended all their ordnance and had to be rearmed and refueled before they could attack the vessel. Several hours later, nine of Sōryū's D3As located and bombed an American supply ship of 3,200 gross register tons (GRT), Don Isidro, hitting her five times but failing to sink her. Sōryū and the other carriers arrived at Staring Bay on Celebes Island on 21 February to resupply and rest before departing four days later to support the invasion of Java. On 1 March 1942, the ship's D3As damaged the destroyer USS Edsall badly enough for her to be caught and sunk by Japanese cruisers. Later that day the dive bombers sank the oil tanker USS Pecos. The four carriers launched an airstrike of 180 aircraft against Tjilatjap on 5 March, sinking five small ships, damaging another nine badly enough that they had to be scuttled, and setting the town on fire. Two days later they attacked Christmas Island before returning to Staring Bay on 11 March to resupply and train for the impending Indian Ocean raid. This raid was intended to secure newly conquered Burma, Malaya, and the Dutch East Indies against Allied attack by destroying base facilities and forces in the eastern Indian Ocean.
### Indian Ocean raid
On 26 March 1942, the five carriers of the First Air Fleet departed from Staring Bay; they were spotted by a Catalina about 350 nautical miles (650 km; 400 mi) southeast of Ceylon on the morning of 4 April. Nagumo closed to within 120 nautical miles (220 km; 140 mi) of Colombo before launching an airstrike the next morning. Sōryū contributed eighteen B5Ns and nine Zeros to the force. The pilots of the latter aircraft claimed to have shot down a single Fairey Fulmar of 806 Naval Air Squadron, plus seven other fighters while losing one of their own. The D3As and B5Ns inflicted some damage to the port facilities, but a day's warning had allowed most of the shipping in the harbor to be evacuated. Later that morning the British heavy cruisers Cornwall and Dorsetshire were spotted and Sōryū launched eighteen D3As. They were the first to attack and claimed to have made fourteen hits on the two ships, sinking both in combination with the dive bombers from the other carriers.
On 9 April, Sōryū contributed eighteen B5Ns, escorted by nine Zeros, to the attack on Trincomalee. Her B5Ns were the first to bomb the port and her fighters did not encounter any British fighters. Meanwhile, a floatplane from the battleship Haruna spotted the small aircraft carrier Hermes, escorted by the Australian destroyer Vampire, and every available D3A was launched to attack the ships. Sōryū contributed eighteen dive bombers, but they arrived too late and instead found three other ships further north. They sank the oil tanker British Sergeant and the Norwegian cargo ship Norviken before they were attacked by eight Fulmars of 803 and 806 Naval Air Squadrons. The Royal Navy pilots claimed three D3As shot down for the loss of a pair of Fulmars; the Japanese actually lost four D3As with another five damaged. While this was going on, Akagi narrowly escaped damage when nine British Bristol Blenheim bombers from Ceylon penetrated the CAP and dropped their bombs from 11,000 feet (3,400 m). Sōryū had six Zeros aloft, along with fourteen more from the other carriers, and they collectively accounted for five of the British bombers for the loss of one of Hiryū's Zeros. After launching the D3As that sank Hermes and the other ships, the First Air Fleet reversed course and headed southeast for the Malacca Strait before recovering their aircraft; they then proceeded to Japan.
On 19 April, while transiting the Bashi Straits between Taiwan and Luzon en route to Japan, Akagi, Sōryū, and Hiryū were sent in pursuit of the American carriers Hornet and Enterprise, which had launched the Doolittle Raid against Tokyo. They found only empty ocean, for the American carriers had immediately departed the area to return to Hawaii. The carriers quickly abandoned the chase and dropped anchor at Hashirajima anchorage on 22 April. Having been engaged in constant operations for four and a half months, Sōryū, along with the other three carriers of the First and Second Carrier Divisions, was hurriedly refitted and replenished in preparation for the Combined Fleet's next major operation, scheduled to begin one month hence. While at Hashirajima, Sōryū's air group was based ashore at nearby Kasanohara, near Kagoshima, and conducted flight and weapons training with the other First Air Fleet carrier units.
### Midway
Concerned by the US carrier strikes in the Marshall Islands, Lae-Salamaua, and the Doolittle raids, Yamamoto was determined to force the US Navy into a showdown to eliminate the American carrier threat. He decided to invade and occupy Midway Island, an action that he was sure would draw out the American carriers. The Japanese codenamed the Midway invasion Operation MI.
On 25 May 1942, Sōryū set out with the Combined Fleet's carrier striking force in the company of Kaga, Akagi, and Hiryū, which constituted the First and Second Carrier Divisions, for the attack on Midway Island. Her aircraft complement consisted of eighteen Zeros, sixteen D3As, eighteen B5Ns, and two preproduction reconnaissance variants (D4Y1-C) of the new Yokosuka D4Y dive bomber. Also aboard were three A6M Zeros of the 6th Kōkūtai intended as a portion of the aerial garrison for Midway.
With the fleet positioned 250 nautical miles (460 km; 290 mi) northwest of Midway at dawn (04:45 local time) on 4 June 1942, Sōryū's part in the 108-plane combined air raid was a strike on the airfield on Eastern Island with eighteen torpedo bombers escorted by nine Zeros. The air group suffered heavily during the attack; a single B5N was shot down by fighters, two more were forced to ditch on the return (both crews rescued), and five (including one that landed aboard Hiryu) were damaged beyond repair. The Japanese did not know that the US Navy had discovered their MI plan by breaking their cipher, and had prepared an ambush using its three available carriers, positioned northeast of Midway.
The carrier also contributed three Zeros to the total of eleven assigned to the initial combat air patrol (CAP) over the four carriers. By 07:00 the carrier had six fighters with the CAP, which helped to defend the Kido Butai from the first US attackers from Midway Island at 07:10. At this time, Nagumo's carriers were attacked by six US Navy Grumman TBF Avengers from Torpedo Squadron 8 (VT-8), which had been temporarily detached from the Hornet to Midway, and four United States Army Air Forces (USAAF) Martin B-26 Marauders, all carrying torpedoes. The Avengers went after Hiryū while the Marauders attacked Akagi. The thirty CAP Zeros in the air at this time, including the six from Sōryū, immediately attacked the American aircraft, shooting down five of the Avengers and two of the B-26s. The surviving aircraft dropped their torpedoes, but all missed. Sōryū launched three more Zeros at 07:10 to reinforce the CAP.
At 07:15 Admiral Nagumo ordered the B5Ns on Kaga and Akagi rearmed with bombs for another attack on Midway itself. This process was limited by the number of ordnance carts (used to handle the bombs and torpedoes) and ordnance elevators, preventing torpedoes from being stowed belowdeck until after all the bombs were moved up from their magazine, assembled, and mounted on the aircraft. The process normally took about an hour and a half; more time would be required to bring the aircraft up to the flight deck, and to warm up and launch the strike group. Around 07:40 Nagumo reversed his order when he received a message from one of his scout aircraft that American warships had been spotted. Depleted of ammunition, the first six of Sōryū's CAP Zeros landed aboard the carrier at 07:30.
At 07:55, the next American strike from Midway arrived in the form of sixteen Douglas SBD Dauntless bombers of Marine Scout Bomber Squadron 241 (VMSB-241) under Major Lofton R. Henderson. Sōryū's three CAP fighters were among the nine still aloft that attacked Henderson's planes, shooting down six of them as they executed a fruitless glide-bombing attack on Hiryū. At roughly the same time, a dozen USAAF Boeing B-17 Flying Fortresses attacked the Japanese carriers, bombing from 20,000 feet (6,100 m). The high altitude of the B-17s gave the Japanese captains enough time to anticipate where the bombs would land and successfully maneuver their ships out of the impact area. Four B-17s attacked Sōryū, but they all missed.
The CAP defeated the next American air strike from Midway, shooting down three of the eleven Vought SB2U Vindicator dive bombers from VMSB-241, which attacked the battleship Haruna unsuccessfully, starting at around 08:30. Although all the American air strikes had thus far caused negligible damage, they kept the Japanese carrier forces off-balance as Nagumo endeavored to prepare a response to news, received at 08:20, of the sighting of American carrier forces to his northeast. Around 08:30 Sōryū launched one of her D4Ys on a mission to confirm the location of the American carriers.
Sōryū began recovering her Midway strike force at around 08:40 and had finished by about 09:10. The landed aircraft were quickly struck below, while the carriers' crews began preparations to spot aircraft for the strike against the American carrier forces. The preparations were interrupted at 09:18 when the first American carrier aircraft to attack were sighted. These consisted of fifteen Douglas TBD Devastator torpedo bombers of VT-8 from Hornet, led by Lieutenant Commander John C. Waldron. The three airborne CAP Zeros were landing aboard at 09:30 when the Americans unsuccessfully attempted a torpedo attack on Sōryū, but three of the morning's escort fighters were still airborne and joined the eighteen CAP fighters in destroying Waldron's planes. All of the American planes were shot down, leaving George H. Gay Jr., the only surviving aviator, treading water.
Shortly afterwards, fourteen Devastators from Torpedo Squadron 6 (VT-6) from the Enterprise, led by Lieutenant Commander Eugene E. Lindsey, attacked. Lindsey's aircraft tried to sandwich Kaga, but the CAP, reinforced by three more Zeros launched by Sōryū at 09:45, shot down all but four of the Devastators, and Kaga dodged the torpedoes. Sōryū launched another trio of CAP Zeros at 10:00 and another three at 10:15 after Torpedo Squadron 3 (VT-3) from Yorktown was spotted. A Wildcat escorting VT-3 shot down one of her Zeros.
While VT-3 was still attacking Hiryū, American dive bombers arrived over the Japanese carriers almost undetected and began their dives. It was at this time, around 10:20, that, in the words of Jonathan Parshall and Anthony Tully, the "Japanese air defenses would finally and catastrophically fail". At 10:25, Sōryū was attacked by thirteen Dauntlesses from Yorktown's Bombing Squadron 3 (VB-3). The carrier received three direct hits from 1,000 lb (454 kg) bombs: one penetrated to the lower hangar deck amidships, and the other two exploded in the upper hangar deck fore and aft. The hangars contained armed and fueled aircraft preparing for the upcoming strike, resulting in secondary explosions and rupturing the steam pipes in the boiler rooms. Within a very short time the fires on the ship were out of control. At 10:40 she stopped and her crew was ordered to abandon ship five minutes later. The destroyers Isokaze and Hamakaze rescued the survivors. Sōryū was still afloat by early evening and showed no signs of sinking, so Isokaze was ordered to scuttle her with torpedoes so as to allow the destroyers to be used for possible operations that night. The destroyer reported at 19:15 that Sōryū had sunk. Losses were 711 crew of her complement of 1,103, including Captain Yanagimoto, who chose to remain on board. This was the highest mortality percentage of all the Japanese carriers lost at Midway, due largely to the devastation in both hangar decks.
The loss of Sōryū and the three other IJN carriers at Midway, comprising two-thirds of Japan's total number of fleet carriers and the experienced core of the First Air Fleet, was a crucial strategic defeat and contributed significantly to the ultimate Allied victory. In an effort to conceal the defeat, the ship was not immediately removed from the Navy's registry of ships, awaiting a "suitable opportunity" before finally being struck from the registry on 10 August 1942.
# Capture of Fort Ticonderoga
The capture of Fort Ticonderoga occurred during the American Revolutionary War on May 10, 1775, when a small force of Green Mountain Boys led by Ethan Allen and Colonel Benedict Arnold surprised and captured the fort's small British garrison. The cannons and other armaments at Fort Ticonderoga were later transported to Boston by Colonel Henry Knox in the noble train of artillery and used to fortify Dorchester Heights and break the standoff at the siege of Boston.
The capture of the fort marked the beginning of offensive action taken by the Americans against the British. After seizing Ticonderoga, a small detachment captured nearby Fort Crown Point on May 11. Seven days later, Arnold and 50 men raided Fort Saint-Jean on the Richelieu River in southern Quebec, seizing military supplies, cannons, and the largest military vessel on Lake Champlain.
Although the scope of this military action was relatively minor, it had significant strategic importance. It impeded communication between northern and southern units of the British Army, and gave the nascent Continental Army a staging ground for the invasion of Quebec later in 1775. It also involved two larger-than-life personalities in Allen and Arnold, each of whom sought to gain as much credit and honor as possible for these events. Most significantly, in an effort led by Henry Knox, artillery from Ticonderoga was dragged across Massachusetts to the heights commanding Boston Harbor, forcing the British to withdraw from that city.
## Background
In 1775, Fort Ticonderoga's location did not appear to be as strategically important as it had been in the French and Indian War, when the French famously defended it against a much larger British force in the 1758 Battle of Carillon, and when the British captured it in 1759. After the 1763 Treaty of Paris, in which the French ceded their North American territories to the British, the fort was no longer on the frontier of two great empires, guarding the principal waterway between them. The French had blown up the fort's powder magazine when they abandoned the fort, and it had fallen further into disrepair since then. In 1775 it was garrisoned by only a small detachment of the 26th Regiment of Foot, consisting of two officers and forty-six men, with many of them "invalids" (soldiers with limited duties because of disability or illness). Twenty-five women and children lived there as well. Because of its former significance, Fort Ticonderoga still had a high reputation as the "gateway to the continent" or the "Gibraltar of America", but in 1775 it was, according to historian Christopher Ward, "more like a backwoods village than a fort."
Even before shooting started in the American Revolutionary War, American Patriots were concerned about Fort Ticonderoga. The fort was a valuable asset for several reasons. Within its walls was a collection of heavy artillery including cannons, howitzers, and mortars, armaments that the Americans had in short supply. The fort was situated on the shores of Lake Champlain, a strategically important route between the Thirteen Colonies and the British-controlled northern provinces. British forces placed there would expose the colonial forces in Boston to attack from the rear. After the war began with the Battles of Lexington and Concord on April 19, 1775, the British General Thomas Gage realized the fort would require fortification, and several colonists had the idea of capturing the fort.
Gage, writing from the besieged city of Boston following Lexington and Concord, instructed Quebec's governor, General Guy Carleton, to rehabilitate and refortify the forts at Ticonderoga and Crown Point. Carleton did not receive this letter until May 19, well after the fort had been captured.
Benedict Arnold had frequently traveled through the area around the fort, and was familiar with its condition, manning, and armaments. En route to Boston following news of the events of April 19, he mentioned the fort and its condition to members of Silas Deane's militia. The Connecticut Committee of Correspondence acted on this information; money was "borrowed" from the provincial coffers and recruiters were sent into northwestern Connecticut, western Massachusetts, and the New Hampshire Grants (now Vermont) to raise volunteers for an attack on the fort.
John Brown, an American spy from Pittsfield, Massachusetts who had carried correspondence between revolutionary committees in the Boston area and Patriot supporters in Montreal, was well aware of the fort and its strategic value. Ethan Allen and other Patriots in the disputed New Hampshire Grants territory also recognized the fort's value, as it played a role in the dispute over that area between New York and New Hampshire. Whether either took or instigated action prior to the Connecticut Colony's recruitment efforts is unclear. Brown had notified the Massachusetts Committee of Safety in March of his opinion that Ticonderoga "must be seized as soon as possible should hostilities be committed by the King's Troops."
When Arnold arrived outside Boston, he told the Massachusetts Committee of Safety about the cannons and other military equipment at the lightly defended fort. On May 3, the Committee gave Arnold a colonel's commission and authorized him to command a "secret mission", which was to capture the fort. He was issued £100, some gunpowder, ammunition, and horses, and instructed to recruit up to 400 men, march on the fort, and ship back to Massachusetts anything he thought useful.
## Colonial forces assemble
Arnold departed immediately after receiving his instructions. He was accompanied by two captains, Eleazer Oswald and Jonathan Brown, who were charged with recruiting the necessary men. Arnold reached the border between Massachusetts and the Grants on May 6, where he learned of the recruitment efforts of the Connecticut Committee, and that Ethan Allen and the Green Mountain Boys were already on their way north. Riding furiously northward (his horse was subsequently killed), he reached Allen's headquarters in Bennington the next day. Upon arrival, Arnold was told that Allen was in Castleton, 50 miles (80 km) to the north, awaiting supplies and more men. He was also warned that, although Allen's effort had no official sanction, his men were unlikely to serve under anyone else. Leaving early the next day, Arnold arrived in Castleton in time to join a war council, where he made a case to lead the expedition based on his formal authorization to act from the Massachusetts Committee.
The force that Allen had assembled in Castleton included about 100 Green Mountain Boys, about 40 men raised by James Easton and John Brown at Pittsfield, and an additional 20 men from Connecticut. Allen was elected colonel, with Easton and Seth Warner as his lieutenants. When Arnold arrived on the scene, Samuel Herrick had already been sent to Skenesboro and Asa Douglas to Panton with detachments to secure boats. Captain Noah Phelps, a member of the "Committee of War for the Expedition against Ticonderoga and Crown Point", had reconnoitered the fort disguised as a peddler seeking a shave. He saw that the fort walls were dilapidated, learned from the garrison commander that the soldiers' gunpowder was wet, and that they expected reinforcements at any time. He reported this intelligence to Allen, following which they planned a dawn raid.
Many of the Green Mountain Boys objected to Arnold's wish to command, insisting that they would go home rather than serve under anyone other than Ethan Allen. Arnold and Allen worked out an agreement, but no documented evidence exists concerning the deal. According to Arnold, he was given joint command of the operation. Some historians have supported Arnold's contention, while others suggest he was merely given the right to march next to Allen.
## Capture of the fort
By 11:30 pm on May 9, the men had assembled at Hand's Cove (in what is now Shoreham, Vermont) and were ready to cross the lake to Ticonderoga. Boats did not arrive until 1:30 am, and they were inadequate to carry the whole force. Eighty-three of the Green Mountain Boys made the first crossing with Arnold and Allen, with Major Asa Douglas going back for the rest. As dawn approached, Allen and Arnold became fearful of losing the element of surprise, so they decided to attack with the men at hand. The only sentry on duty at the south gate fled his post after his musket misfired, and the Americans rushed into the fort. The Patriots then roused the small number of sleeping troops at gunpoint and began confiscating their weapons. Allen, Arnold, and a few other men charged up the stairs toward the officers' quarters. Lieutenant Jocelyn Feltham, the assistant to Captain William Delaplace, was awakened by the noise and called to wake the captain. Stalling for time, Feltham demanded to know by what authority the fort was being entered. Allen, who later claimed that he said it to Captain Delaplace, replied, "In the name of the Great Jehovah and the Continental Congress!" Delaplace finally emerged from his chambers (fully clothed, not with "his breeches in his hand", as Allen would later say) and surrendered his sword.
Nobody was killed in the battle. The only casualty was one American, Gideon Warren, who was slightly wounded by a sentry with a bayonet. Eventually, as many as 400 men arrived at the fort, which they plundered for liquor and other provisions. Arnold, whose authority was not recognized by the Green Mountain Boys, was unable to stop the plunder. Frustrated, he retired to the captain's quarters to await forces that he had recruited, reporting to the Massachusetts Provincial Congress that Allen and his men were "governing by whim and caprice" at the fort, and that the plan to strip the fort and send armaments to Boston was in peril. When Delaplace protested the seizure of his private liquor stores, Allen issued him a receipt for the stores, which he later submitted to Connecticut for payment. Arnold's disputes with Allen and his unruly men were severe enough that there were times when some of Allen's men drew weapons.
On May 12, Allen sent the prisoners to Connecticut's Governor Jonathan Trumbull with a note saying "I make you a present of a Major, a Captain, and two Lieutenants of the regular Establishment of George the Third." Arnold busied himself over the next few days with cataloging the military equipment at Ticonderoga and Crown Point, a task made difficult by the fact that walls had collapsed on some of the armaments.
## Crown Point and the raid on Fort Saint-Jean
Seth Warner sailed a detachment up the lake and captured nearby Fort Crown Point, garrisoned by only nine men. It is widely recorded that this capture occurred on May 10; this is attributed to a letter Arnold wrote to the Massachusetts Committee of Safety on May 11, claiming that an attempt to sail up to Crown Point was frustrated by headwinds. However, Warner claimed, in a letter dated May 12 from "Head Quarters, Crown Point", that he "took possession of this garrison" the day before. It appears likely that, having failed on May 10, the attempt was repeated the next day with success, as reported in Warner's memoir. A small force was also sent to capture Fort George on Lake George, which was held by only two soldiers.
Troops recruited by Arnold's captains began to arrive, some after seizing Philip Skene's schooner Katherine and several bateaux at Skenesboro. Arnold rechristened the schooner Liberty. The prisoners had reported that the lone British warship on Lake Champlain was at Fort Saint-Jean, on the Richelieu River north of the lake. Arnold, uncertain whether word of Ticonderoga's capture had reached Saint-Jean, decided to attempt a raid to capture the ship. He had Liberty outfitted with guns, and sailed north with 50 of his men on May 14. Allen, not wanting Arnold to get the full glory for that capture, followed with some of his men in bateaux, but Arnold's small fleet had the advantage of sail, and pulled away from Allen's boats. By May 17, Arnold's small fleet was at the northern end of the lake. Seeking intelligence, Arnold sent a man to reconnoiter the situation at Fort Saint-Jean. The scout returned later that day, reporting that the British were aware of the fall of Ticonderoga and Crown Point, and that troops were apparently on the move toward Saint-Jean. Arnold decided to act immediately.
Rowing all night, Arnold and 35 of his men brought their bateaux near the fort. After a brief scouting excursion, they surprised the small garrison at the fort, and seized supplies there, along with HMS Royal George, a seventy-ton sloop-of-war. Warned by their captives that several companies were on their way from Chambly, they loaded the more valuable supplies and cannons on the George, which Arnold renamed the Enterprise. Boats that they could not take were sunk, and the enlarged fleet returned to Lake Champlain. This activity was observed by Moses Hazen, a retired British officer who lived near the fort. Hazen rode to Montreal to report the action to the local military commander, and then continued on to Quebec City, where he reported the news to General Carleton on May 20. Major Charles Preston and 140 men were immediately dispatched from Montreal to Saint-Jean in response to Hazen's warning.
Fifteen miles out on the lake, Arnold's fleet met Allen's, which was still heading north. After an exchange of celebratory gunfire, Arnold opened his stores to feed Allen's men, who had rowed 100 miles (160 km) in open boats without provisions. Allen, believing he could seize and hold Fort Saint-Jean, continued north, while Arnold sailed south. Allen arrived at Saint-Jean on May 19, where he was warned that British troops were approaching by a sympathetic Montreal merchant who had raced ahead of those troops on horseback. Allen, after penning a message for the merchant to deliver to the citizens of Montreal, returned to Ticonderoga on May 21, leaving Saint-Jean just as the British forces arrived. In Allen's haste to escape the arriving troops, three men were left behind; one was captured, but the other two eventually returned south by land.
## Aftermath
Ethan Allen and his men eventually drifted away from Ticonderoga, especially once the alcohol began to run out, and Arnold largely controlled affairs from a base at Crown Point. He oversaw the fitting of the two large ships, eventually taking command of Enterprise because of a lack of knowledgeable seamen. His men began rebuilding Ticonderoga's barracks, and worked to extract armaments from the rubble of the two forts and build gun carriages for them.
Connecticut sent about 1,000 men under Colonel Benjamin Hinman to hold Ticonderoga, and New York also began to raise militia to defend Crown Point and Ticonderoga against a possible British attack from the north. When Hinman's troops arrived in June, there was once again a clash over leadership. None of the communications to Arnold from the Massachusetts committee indicated that he was to serve under Hinman; when Hinman attempted to assert authority over Crown Point, Arnold refused to accept it, as Hinman's instructions only included Ticonderoga. The Massachusetts committee eventually sent a delegation to Ticonderoga. When they arrived on June 22 they made it clear to Arnold that he was to serve under Hinman. Arnold, after considering for two days, disbanded his command, resigned his commission, and went home, having spent more than £1,000 of his own money in the effort to capture the fort.
When Congress received news of the events, it drafted a second letter to the inhabitants of Quebec, which was sent north in June with James Price, another sympathetic Montreal merchant. This letter, and other communications from the New York Congress, combined with the activities of vocal American supporters, stirred up the Quebec population in the summer of 1775.
When news of the fall of Ticonderoga reached England, Lord Dartmouth wrote that it was "very unfortunate; very unfortunate indeed".
### Repercussions in Quebec
News of the capture of Ticonderoga and Crown Point, and especially the raids on Fort Saint-Jean, electrified the Quebec population. Colonel Dudley Templer, in charge of the garrison at Montreal, issued a call on May 19 to raise a militia for defense of the city, and requested Indians living nearby to also take up arms. Only 50 men, mostly French-speaking landowning seigneurs and petty nobility, were raised in and around Montreal, and they were sent to Saint-Jean; no Indians came to their aid. Templer also prevented merchants sympathetic to the American cause from sending supplies south in response to Allen's letter.
General Carleton, notified by Hazen of the events on May 20, immediately ordered the garrisons of Montreal and Trois-Rivières to fortify Saint-Jean. Some troops garrisoned at Quebec were also sent to Saint-Jean. Most of the remaining Quebec troops were dispatched to a variety of other points along the Saint Lawrence, as far west as Oswegatchie, to guard against potential invasion threats. Carleton then traveled to Montreal to oversee the defense of the province from there, leaving the city of Quebec in the hands of Lieutenant Governor Hector Cramahé. Before leaving, Carleton prevailed on Monsignor Jean-Olivier Briand, the Bishop of Quebec, to issue his own call to arms in support of the provincial defense, which was circulated primarily in the areas around Montreal and Trois-Rivières.
### Later actions near Ticonderoga
In July 1775, General Philip Schuyler began using the fort as the staging ground for the invasion of Quebec that was launched in late August. In the winter of 1775–1776, Henry Knox directed the transportation of the guns of Ticonderoga to Boston. The guns were placed upon Dorchester Heights overlooking the besieged city and the British ships in the harbor, prompting the British to evacuate their troops and Loyalist supporters from the city in March 1776.
Benedict Arnold again led a fleet of ships at the Battle of Valcour Island, and played other key roles in thwarting Britain's attempt to recapture the fort in 1776. The British did recapture the fort in July 1777 during the Saratoga campaign, but had abandoned it by November after Burgoyne's surrender at Saratoga.
### Broken communications
Although Fort Ticonderoga was not at the time an important military post, its capture had several important results. Rebel control of the area meant that overland communications and supply lines between British forces in Quebec and those in Boston and later New York were severed, so the British military command made an adjustment to their command structure. This break in communication was highlighted by the fact that Arnold, on his way north to Saint-Jean, intercepted a message from Carleton to Gage, detailing the military troop strengths in Quebec. Command of British forces in North America, previously under a single commander, was divided into two commands. General Carleton was given independent command of forces in Quebec and the northern frontier, while General William Howe was appointed Commander-in-Chief of forces along the Atlantic coast, an arrangement that had worked well between Generals Wolfe and Amherst in the French and Indian War. In this war, however, cooperation between the two forces would prove to be problematic and would play a role in the failure of the Saratoga campaign in 1777, as General Howe apparently abandoned an agreed-upon northern strategy, leaving General John Burgoyne without southern support in that campaign.
### War of words between Allen and Arnold
Beginning on the day of the fort's capture, Allen and Arnold began a war of words, each attempting to garner for himself as much credit for the operation as possible. Arnold, unable to exert any authority over Allen and his men, began to keep a diary of events and actions, which was highly critical and dismissive of Allen. Allen, in the days immediately after the action, also began to work on a memoir. Published several years later (see Further reading), the memoir fails to mention Arnold at all. Allen also wrote several versions of the events, which John Brown and James Easton brought to a variety of Congresses and committees in New York, Connecticut, and Massachusetts. Randall (1990) claims that Easton took accounts written by both Arnold and Allen to the Massachusetts committee, but conveniently lost Arnold's account on the way, ensuring that Allen's version, which greatly glorified his role in the affair, would be preferred. Smith (1907) indicates that it was highly likely that Easton was interested in claiming Arnold's command for himself. There was clearly no love lost between Easton and Arnold. Allen and Easton returned to Crown Point on June 10 and called a council of war while Arnold was with the fleet on the lake, a clear breach of military protocol. When Arnold, whose men now dominated the garrison, asserted his authority, Easton insulted Arnold, who responded by challenging Easton to a duel. Arnold later reported, "On refusing to draw like a gentleman, he having a [sword] by his side and cases of loaded pistols in his pockets, I kicked him very heartily and ordered him from the Point."
## See also
- List of American Revolutionary War battles
## Explanatory notes
# The Tower House
The Tower House, 29 Melbury Road, is a late-Victorian townhouse in the Holland Park district of Kensington and Chelsea, London, built by the architect and designer William Burges as his home. Designed between 1875 and 1881, in the French Gothic Revival style, it was described by the architectural historian J. Mordaunt Crook as "the most complete example of a medieval secular interior produced by the Gothic Revival, and the last". The house is built of red brick, with Bath stone dressings and green roof slates from Cumbria, and has a distinctive cylindrical tower and conical roof. The ground floor contains a drawing room, a dining room and a library, while the first floor has two bedrooms and an armoury. Its exterior and the interior echo elements of Burges's earlier work, particularly Park House in Cardiff and Castell Coch. It was designated a Grade I listed building in 1949.
Burges bought the lease on the plot of land in 1875. The house was built by the Ashby Brothers, with interior decoration by members of Burges's long-standing team of craftsmen such as Thomas Nicholls and Henry Stacy Marks. By 1878 the house was largely complete, although interior decoration and the designing of numerous items of furniture and metalwork continued until Burges's death in 1881. The house was inherited by his brother-in-law, Richard Popplewell Pullan. It was later sold to Colonel T. H. Minshall and then, in 1933, to Colonel E. R. B. Graham. The poet John Betjeman inherited the remaining lease in 1962 but did not extend it. Following a period when the house stood empty and suffered vandalism, it was purchased and restored, first by Lady Jane Turnbull, later by the actor Richard Harris and then by the musician Jimmy Page.
The house retains most of its internal structural decoration, but much of the furniture, fittings and contents that Burges designed has been dispersed. Many items, including the Great Bookcase, the Zodiac settle, the Golden Bed and the Red Bed, are now in museums such as the Ashmolean in Oxford, the Higgins in Bedford and the Victoria and Albert in London, while others are in private collections.
## Location and setting
The Tower House is on a corner of Melbury Road, just north of Kensington High Street, in the district of Holland Park. It stands opposite Stavordvale Lodge and next to Woodland House, built for the artist Luke Fildes. The development of Melbury Road in the grounds of Little Holland House created an art colony in Holland Park, the Holland Park Circle. Its most prominent member, Frederic, Lord Leighton, lived at Leighton House, 12 Holland Park Road, and at the time of Leighton's death in 1896 six Royal Academicians, as well as one associate member, were living in Holland Park Road and Melbury Road.
## History
### Design, construction and craftsmanship, 1875–78
In 1863, at the age of 35, William Burges gained his first major architectural commission, Saint Fin Barre's Cathedral, Cork. In the following twelve years, his architecture, metalwork, jewellery, furniture and stained glass led his biographer, J. Mordaunt Crook, to suggest that Burges rivalled Pugin as "the greatest art-architect of the Gothic Revival". But by 1875, his short career was largely over. Although he worked to finalise earlier projects, he received no further major commissions, and the design, construction, decoration and furnishing of the Tower House occupied much of the last six years of his life. In December 1875, after rejecting plots in Victoria Road, Kensington and Bayswater, Burges purchased the leasehold of the plot in Melbury Road from the Earl of Ilchester, the owner of the Holland Estate. The ground rent was £100 per annum. Initial drawings for the house had been undertaken in July 1875 and the final form was decided upon by the end of the year. Building began in 1876, contracted to the Ashby Brothers of Kingsland Road at a cost of £6,000.
At the Tower House Burges drew on his own "experience of twenty years learning, travelling and building", and used many of the artists and craftsmen who had worked with him on earlier buildings. An estimate book compiled by him, and now in the Victoria and Albert Museum, contains the names of the individuals and companies that worked at the house. Thomas Nicholls was responsible for the stone carving, including the capitals, corbels and the chimneypieces. The mosaic and marble work was contracted to Burke and Company of Regent Street, while the decorative tiles were supplied by W. B. Simpson and Sons Ltd of the Strand. John Ayres Hatfield crafted the bronze decorations on the doors, while the woodwork was the responsibility of John Walden of Covent Garden. Henry Stacy Marks and Frederick Weekes were employed to decorate the walls with murals, and Campbell and Smith of Southampton Row had responsibility for most of the painted decoration. Marks painted birds above the frieze in the library, and the illustrations of famous lovers in the drawing room were by Weekes. They also painted the figures on the bookcases in the library. The stained glass was by Saunders and Company of Long Acre, with initial designs by Horatio Walter Lonsdale.
### Burges to Graham, 1878–1962
Burges spent his first night at the house on 5 March 1878. It provided a suitable backdrop for entertaining his range of friends, "the whole gamut of Pre-Raphaelite London." His dogs, Dandie, Bogie and Pinkie, are immortalised in paintings on various pieces of furniture such as the Dog Cabinet and the foot of The Red Bed. Burges displayed his extensive collection of armour in the armoury. The decoration of his bedroom hints at another of his passions: a fondness for opium. Stylised poppies cover the panels of a cupboard which was set next to his bed.
In 1881, after catching a chill while overseeing work at Cardiff, Burges returned, half paralysed, to the house where he lay dying for some three weeks. Among his last visitors were Oscar Wilde and James Whistler. Burges died in the Red Bed on 20 April 1881, just over three years after moving into the Tower House; he was 53 years old. He was buried in West Norwood Cemetery.
The lease on the house was inherited by Burges's brother-in-law, Richard Popplewell Pullan. Pullan completed some of Burges's unfinished projects and wrote two studies of his work. The lease was then purchased by Colonel T. H. Minshall, author of What to Do with Germany and Future Germany, and father of Merlin Minshall. Minshall sold his lease to Colonel E. R. B. and Mrs. Graham in 1933. The Tower House was designated a Grade I listed building on 29 July 1949.
### Betjeman to Turnbull, 1962–69
John Betjeman was a friend of the Grahams and was given the remaining two-year lease on the house, together with some of the furniture, on Mrs. Graham's death in 1962. Betjeman, a champion of Victorian Gothic Revival architecture, was an early admirer of Burges. In 1957, the Tower House had featured in the fifth episode of his BBC television series, An Englishman's Castle. In a 1952 radio interview about Cardiff Castle, Betjeman spoke of the architect and his foremost work: "a great brain has made this place. I don't see how anyone can fail to be impressed by its weird beauty ... awed into silence from the force of this Victorian dream of the Middle Ages."
Because of a potential liability for £10,000 of renovation work upon the expiry of the lease, Betjeman considered the house too costly to maintain, and subsequently vacated it. From 1962 to 1966, the house stood empty and suffered vandalism and neglect. A survey undertaken in January 1965 revealed that the exterior stonework was badly decayed, dry rot had eaten through the roof and the structural floor timbers, and the attics were infested with pigeons. Vandals had stripped the lead from the water tanks and had damaged the mirrors, fireplaces and carving work. The most notable loss was the theft of the carved figure of Fame from the Dining Room chimneypiece. Betjeman suggested that the owner's agents had deliberately refused to let the house, and allowed it to decline, intending to demolish it and redevelop the site. Writing in Country Life in 1966, Charles Handley-Read took a different view saying that "the Ilchester Estate, upon which the house is situated, are anxious that it should be preserved and [have] entered into a long lease conditional upon the house being put into a state of good repair." In March 1965, the Historic Buildings Council obtained a preservation order on the house, enabling the purchaser of the lease, Lady Jane Turnbull, daughter of William Grey, 9th Earl of Stamford, to initiate a programme of restoration the following July. These renovations were supported by grants of £4,000 from the Historic Buildings Council and £3,000 from the Greater London Council. The lease was sold in 1969.
### Harris and Page, 1969 onwards
The actor Richard Harris bought the lease for £75,000 in 1969 after discovering that the American entertainer Liberace had made an offer but had not put down a deposit. Reading of the intended sale in the Evening Standard, Harris bought it the following day, describing his purchase as the biggest gift he had ever given himself. In his autobiography, the entertainer Danny La Rue recalled visiting the house with Liberace, writing, "It was a strange building and had eerie murals painted on the ceiling [...] I sensed evil". Meeting La Rue later, Harris said he had found the house haunted by the ghosts of children from an orphanage that he believed had previously occupied the site and that he had placated them by buying them toys. Harris employed the original decorators, Campbell Smith & Company Ltd., to carry out restoration, using Burges's drawings from the Victoria and Albert Museum.
Jimmy Page, the Led Zeppelin guitarist, bought the house from Harris in 1972 for £350,000, outbidding the musician David Bowie. Page, an enthusiast for Burges and the Pre-Raphaelite Brotherhood, commented in an interview in 2012: "I was still finding things 20 years after being there – a little beetle on the wall or something like that; it's Burges's attention to detail that is so fascinating." In 2015, Page successfully challenged a planning application lodged by the pop star Robbie Williams, who had purchased the adjacent Woodland House in 2013 and planned extensive renovations. Page argued that the alterations, particularly the intended underground excavations, would threaten the structure of the Tower House. Ongoing disagreements between Williams and Page over Williams's development plans continue to feature in Britain's press.
## Architecture
### Exterior and design
The cultural historian Caroline Dakers wrote that the Tower House was a "pledge to the spirit of Gothic in an area given over to Queen Anne". Burges loathed the Queen Anne style prevalent in Holland Park, writing that it, "like other fashions [...] will have its day, I do not call it Queen Anne art, for, unfortunately I see no art in it at all". His inspirations were French Gothic domestic architecture of the thirteenth century and more recent models drawn from the work of the 19th-century French architect Viollet-le-Duc. Architectural historians Gavin Stamp and Colin Amery considered that the building "sums up Burges in miniature. Although clearly a redbrick suburban house, it is massive, picturesquely composed, with a prominent tourelle for the staircase which is surmounted by a conical roofed turret." Burges's neighbour Luke Fildes described the house as a "model modern house of moderately large size in the 13th-century style built to show what may be done for 19th-century everyday wants".
The house has an L-shaped plan, and the exterior is plain, of red brick, with Bath stone dressings and green roof slates from Cumberland. With a floor plan of 2,500 square feet (230 m<sup>2</sup>), Burges went about its construction on a grand scale. The architect R. Norman Shaw remarked that the concrete foundations were suitable "for a fortress". This approach, combined with Burges's architectural skills and the minimum of exterior decoration, created a building that Crook described as "simple and massive". Following his usual pattern, Burges re-worked many elements of earlier designs, adapting them as appropriate. The frontages come from the other townhouse he designed, Park House, Cardiff, then known as the McConnochie House after its owner, although they have been reversed, with the arcaded street front from Park House forming the garden front of the Tower House. The staircase is consigned to the conical tower, avoiding the error Burges made at the earlier house, where he placed the staircase in the middle of the hall. The cylindrical tower and conical roof derive from Castell Coch, and the interiors are inspired by examples at Cardiff Castle. The house has two main floors, with a basement below and a garret above. The ground floor contains a drawing room, a dining room and a library, while the first floor has two bedrooms and an armoury.
### Interior
The architectural writer Bridget Cherry wrote that "the sturdy exterior gives little hint of the fantasy [Burges] created inside", interiors which the art historian and Burges scholar Charles Handley-Read described as "at once opulent, aggressive, obsessional, enchanting, their grandeur border[ing] on grandiloquence". Each room has a complex iconographic scheme of decoration: in the hall it is Time; in the drawing room, Love; in Burges's bedroom, the Sea. Massive fireplaces with elaborate overmantels were carved and installed, described by Crook as "veritable altars of art [...] some of the most amazing pieces of decoration Burges ever designed". Handley-Read considered that Burges's decorations were "unique, almost magical [and] quite unlike anything designed by his contemporaries".
#### Ground floor
A bronze-covered door, with relief panels depicting figures, opens onto the entrance hall. In Burges's time the door had a letterbox, in the form of Mercury, the messenger of the gods. The letterbox is now lost, but a contemporary copy is in the collection of The Higgins Art Gallery & Museum. The porch contains a white marble seat and column, and on the floor is a mosaic of Pinkie, a favourite poodle of Burges. Cartooned by H. W. Lonsdale, it resembles the cave canem floor at Pompeii.
The interior centres on the double-height entrance hall, with the theme of Time. The painted ceiling depicts the astrological signs of the constellations, arranged in the positions they held when the house was first occupied. A large stained glass window contains four female figures representing Dawn, Noon, Twilight and Night. A mosaic floor in the entrance hall contains a labyrinth design, with the centre depicting the myth of Theseus slaying the Minotaur. The garden's entrance door, also covered in bronze, is decorated with a relief of the Madonna and Child. As elsewhere, Burges incorporated earlier designs, the bronze doors echoing those at Cork Cathedral, and the maze floor recalling an earlier ceiling at Burges's office at 15 Buckingham Street. Emblems adorn the five doors on the ground floor, each one relevant to their respective room. A flower marked the door to the garden, with the front door marked by a key. The library is indicated by an open book, the drawing or music room by musical instruments, and the dining room by a bowl and flask of wine.
The library, its walls lined with bookcases, features a sculptured mantelpiece resembling the Tower of Babel. The hooded chimneypiece represents the "dispersion of languages", with figures depicting Nimrod ruling over the elements of speech. Two trumpeters represent the pronouns, a queen embodies the verb, a porter the noun, and numerous other gilded and painted figures are displayed. The ceiling is divided into eight compartments, with depictions of the six founders of law and philosophy: Moses, St. Paul, Luther, Mahomet, Aristotle and Justinian. An illuminated alphabet frieze of architecture and the visual arts, running around the bookcases, completes the scheme; its letters include an "H" that falls below the cornice. Because H-dropping was a social taboo in Victorian times, Handley-Read described it as the "most celebrated of all Burges's jokes". Artists and craftsmen are featured at work on each lettered door of the bookcases that surround the room. In a panel in one of the glazed doors which open onto the garden, Burges is shown standing in front of a model of the Tower House. He features as Architect, the 'A' forming the first letter of the alphabet frieze. Both the Architecture Cabinet and the Great Bookcase stood in this room. The stained glass windows in the room represent painting, architecture and sculpture, and were painted by Weekes.
On the wall opposite the library fireplace is an opening into the drawing room. Inside there are three stained glass windows which are set in ornamented marble linings. Opposite the windows stood the Zodiac Settle, which Burges moved from Buckingham Street. Love is the central decorative scheme to the room, with the ceiling painted with medieval cupids, and the walls covered with mythical lovers. Carved figures from the Roman de la Rose decorate the chimneypiece, which Crook considered "one of the most glorious that Burges and Nicholls ever produced". Echoing Crook, Charles Handley-Read wrote, "Working together, Burges and Nicholls had transposed a poem into sculpture with a delicacy that is very nearly musical. The Roman de la Rose has come to life."
The dining room is devoted to Geoffrey Chaucer's The House of Fame and the art of story-telling, Crook explaining that "tall stories are part of the dining room rite". The hooded chimneypiece, of Devonshire marble, contained a bronze figure above the fireplace representing the Goddess of Fame; its hands and face were made of ivory, with sapphires for eyes. It was later stolen. The tiles on the walls depict fairy stories, including Reynard the Fox, Jack and the Beanstalk and Little Red Riding Hood. The room also shows Burges's innovative use of materials: Handley-Read observed that the Victorians had "a horror of food smells" and therefore the room was constructed using materials that did not absorb odours and could be washed. The walls are covered with Devonshire marble, surmounted by glazed picture tiles, while the ceiling is of sheet metal. The ceiling is divided into coffered compartments by square beams and features symbols of the Sun, the planets and the signs of the Zodiac. Burges designed most of the cutlery and plates used in this room, which display his skills as a designer of metalwork, including the claret jug and Cat Cup chosen by Lord and Lady Bute as mementoes from Burges's collection after his death. The panels of the wine cupboard were decorated by Dante Gabriel Rossetti.
#### First floor and garret
The windows of the stair turret represent "the Storming of the Castle of Love". On the first floor are two bedrooms and an armoury. Burges's bedroom, with a theme of sea creatures, overlooks the garden. Its elaborate ceiling is segmented into panels by gilded and painted beams, studded with miniature convex mirrors set into gilt stars. Fish and eels swim in a frieze of waves painted under the ceiling, and fish are also carved in relief on the chimneypiece. On the fire-hood, a sculpted mermaid gazes into a looking-glass, with seashells, coral, seaweed and a baby mermaid also represented. Charles Handley-Read described the frieze around the Mermaid fireplace as "proto-Art Nouveau" and noted "the debt of international art nouveau to Victorian Gothic designers, Burges included". In this room, Burges placed two of his most personal pieces of furniture, the Red Bed, in which he died, and the Narcissus washstand, both of which originally came from Buckingham Street. The bed is painted blood red and features a panel depicting Sleeping Beauty. The washstand is red and gold; its tip-up basin of marble inlaid with fishes is silver and gold.
"The Earth and its productions" is the theme of the guest room facing the street. Its ceiling is adorned with butterflies and fleurs-de-lis, and at the crossing of the main beams is a convex mirror in a gilded surround. Along the length of the beams are paintings of frogs and mice. A frieze of flowers, once painted over, has since been restored. The Golden Bed and the Vita Nuova Washstand designed for this room are now in the Victoria and Albert Museum.
Burges designated the final room on the first floor an armoury and used it to display his large collection of armour. The collection was bequeathed to the British Museum upon his death. A carved chimneypiece in the armoury has three roundels carved with the goddesses Minerva, Venus and Juno in medieval attire.
The garret originally contained day and night nurseries, which the author James Stourton considers a surprising choice of arrangement for the "childless bachelor Burges". They contain a pair of decorated chimneypieces featuring the tale of Jack and the Beanstalk and three monkeys at play.
### Garden
The garden at the rear of the house featured raised flowerbeds which Dakers described as being "planned according to those pleasances depicted in medieval romances; beds of scarlet tulips, bordered with stone fencing". On a mosaic terrace, around a statue of a boy holding a hawk, sculpted by Thomas Nicholls, Burges and his guests would sit on "marble seats or on Persian rugs and embroidered cushions." The garden, and that of the adjacent Woodland House, contain trees from the former Little Holland House.
## Furniture
In creating the interior of the house, Burges demonstrated his skill as a jeweller, metalworker and designer. He included some of his best pieces of furniture such as the Zodiac Settle, the Dog Cabinet and the Great Bookcase, the last of which Charles Handley-Read described as "occupying a unique position in the history of Victorian painted furniture". The fittings were as elaborate as the furniture: the tap for one of the guest washstands was in the form of a bronze bull from whose throat water poured into a sink inlaid with silver fish. Within the Tower House Burges placed some of his finest metalwork; the artist Henry Stacy Marks wrote, "he could design a chalice as well as a cathedral ... His decanters, cups, jugs, forks and spoons were designed with an equal ability to that with which he would design a castle."
Burges's furniture did not receive universal acclaim. In his major study of English domestic architecture, Das englische Haus, published some twenty years after Burges's death, Hermann Muthesius wrote of The Tower House, "Worst of all, perhaps, is the furniture. Some of it is in the earlier manner, some of it box-like and painted all over. This style had now become fashionable, though with what historical justification it is not easy to say".
Many of the early pieces of furniture, such as the Narcissus Washstand, the Zodiac Settle and the Great Bookcase, were originally made for Burges's office at Buckingham Street and were later moved to the Tower House. The Great Bookcase was also part of Burges's contribution to the Medieval Court at the 1862 International Exhibition. Later pieces, such as the Crocker Dressing Table and the Golden Bed, and its accompanying Vita Nuova Washstand, were made specifically for the house. John Betjeman located the Narcissus Washstand in a junk shop in Lincoln and gave it to Evelyn Waugh, a fellow enthusiast for Victorian art and architecture, who featured it in his 1957 novel, The Ordeal of Gilbert Pinfold. Betjeman later gave Waugh both the Zodiac Settle and the Philosophy Cabinet.
Many of the decorative items Burges designed for the Tower House were dispersed in the decades following his death. Kenneth Clark purchased the Great Bookcase for the Ashmolean Museum. Several pieces purchased by Charles Handley-Read, who was instrumental in reviving interest in Burges, were acquired by The Higgins Art Gallery & Museum, Bedford. The museum also bought the Zodiac Settle from the Waugh family in 2011.
### Dispersed furniture and locations
The table below lists the known pieces of furniture originally in situ, with their dates of construction and their current location where known.
## Architectural coverage
Edward Godwin, fellow architect and close friend of Burges, wrote of Burges's unified approach to building and design in an article in The British Architect, published in 1878: "the interior and the exterior, the building and its furniture, the enclosed as well as the enclosure, are in full accord." Richard Popplewell Pullan described the house in detail in the second of two works he wrote about his brother-in-law, The House of William Burges, A.R.A., published in 1886. The book contains photographs of the interior of the house by Francis Bedford. In 1893, the building was the only private house to be recorded in an article in The Builder, which gave an overview of the architecture of the previous fifty years. It was referenced again a decade later by Muthesius, who described it as "the most highly developed Gothic house to have been built in the 19th century (and) the last to be built in England". It was then largely ignored, with James Stourton describing its early twentieth-century decline as "a paradigm of the reputation of the Gothic Revival".
A renewed understanding and appreciation of the building, and of Burges himself, began with Charles Handley-Read's essay on Burges in Peter Ferriday's collection Victorian Architecture, published in 1963. In 1966 Handley-Read followed this with a substantial article on the house for Country Life, "Aladdin's Palace in Kensington". His notes on Burges formed the basis of Mordaunt Crook's centenary volume, William Burges and the High Victorian Dream, published in 1981 and revised and reissued in 2013, in which Crook wrote at length on both the Tower House and its contents.
More recent coverage was given in London 3: North West, the revision to the Buildings of England guide to London written by Nikolaus Pevsner and Bridget Cherry, published in 1991 (revised 2002). The house is referenced in Matthew Williams's William Burges (2004), and in Panoramas of Lost London by Philip Davies, published in 2011, which includes some of Francis Bedford's photographs of the house from 1885. In a chapter on the building in Great Houses of London (2012), the author James Stourton called The Tower House "the most singular of London houses, even including the Soane Museum."
# Evita (1996 film)
Evita is a 1996 American biographical musical drama film based on the 1976 concept album of the same name produced by Tim Rice and Andrew Lloyd Webber, which also inspired a 1978 musical. The film depicts the life of Eva Perón, detailing her beginnings, rise to fame, political career and death at the age of 33. Directed by Alan Parker, and written by Parker and Oliver Stone, Evita stars Madonna as Eva, Jonathan Pryce as Eva's husband Juan Perón, and Antonio Banderas as Ché, an everyman who acts as the film's narrator.
Following the release of the 1976 album, a film adaptation of the musical became mired in development hell for more than fifteen years, as the rights were passed on to several major studios, and various directors and actors considered. In 1993, producer Robert Stigwood sold the rights to Andrew G. Vajna, who agreed to finance the film through his production company Cinergi Pictures, with Buena Vista Pictures distributing the film through Hollywood Pictures. After Stone stepped down from the project in 1994, Parker agreed to write and direct the film. Recording sessions for the songs and soundtrack took place at CTS Studios in London, England, roughly four months before filming. Parker worked with Rice and Lloyd Webber to compose the soundtrack, reworking the original songs by creating the music first and then the lyrics. They also wrote a new song, "You Must Love Me", for the film. Principal photography commenced in February 1996 with a budget of $55 million, and concluded in May of that year. Filming took place on locations in Buenos Aires and Budapest as well as on soundstages at Shepperton Studios. The film's production in Argentina was met with controversy, as the cast and crew faced protests over fears that the project would tarnish Eva's image.
Evita premiered at the Shrine Auditorium in Los Angeles, California, on December 14, 1996. Hollywood Pictures gave the film a platform release, which involved releasing it in select cities before expanding distribution in the following weeks. The film had a limited release on December 25, 1996, before opening nationwide on January 10, 1997. It grossed over $141 million worldwide. The film received a mixed critical response; reviewers praised Madonna's performance, the music, costume designs and cinematography, while criticism was aimed at the pacing and direction. Evita received many awards and nominations, including the Academy Award for Best Original Song ("You Must Love Me"), and three Golden Globe Awards for Best Picture – Comedy or Musical, Best Original Song ("You Must Love Me") and Best Actress – Comedy or Musical (Madonna).
## Plot
At a cinema in Buenos Aires on July 26, 1952, a film is interrupted when news breaks of the death of Eva Perón, Argentina's First Lady, at the age of 33. As the nation goes into public mourning, Ché, a member of the public, marvels at the spectacle and promises to show how Eva did "nothing for years". The rest of the film follows Eva (née Duarte) from her beginnings as an illegitimate child of a lower-class family to her rise to become First Lady; Ché assumes many different guises throughout Eva's story.
At the age of 15, Eva lives in the provincial town of Junín, and longs for a better life in Buenos Aires. She persuades a tango singer, Agustín Magaldi, with whom she is having an affair, to take her to the city. After Magaldi leaves her, she goes through several relationships with increasingly influential men, becoming a model, actress and radio personality. She meets Colonel Juan Perón at a fundraiser following the 1944 San Juan earthquake. Perón's connection with Eva adds to his populist image, since they are both from the working class. Eva has a radio show during Perón's rise and uses all of her skills to promote him, even when the controlling administration has him jailed in an attempt to stunt his political momentum. The groundswell of support that Eva generates forces the government to release Perón, and he finds the people enamored of him and Eva. Perón wins election to the presidency and marries Eva, who promises that the new government will serve the descamisados.
At the start of the Perón government, Eva dresses glamorously, enjoying the privileges of being the First Lady. Soon after, she embarks on what is called her "Rainbow Tour" of Europe. While there, she receives a mixed reception. The people of Spain adore her, the people of Italy call her a whore and throw things at her, and Pope Pius XII gives her a small, meager gift. Upon returning to Argentina, Eva establishes a foundation to help the poor. The film suggests the Perónists otherwise plunder the public treasury.
Eva is hospitalized and learns that she has terminal cancer. She declines the position of Vice President due to her failing health, and makes one final broadcast to the people of Argentina. She understands that her life was short because she shone like the "brightest fire", and helps Perón prepare to go on without her. A large crowd surrounds the Unzué Palace in a candlelight vigil praying for her recovery when the light of her room goes out, signifying her death. At Eva's funeral, Ché is seen at her coffin, marveling at the influence of her brief life. He walks up to her glass coffin, kisses it, and joins the crowd of passing mourners.
## Cast
Cast taken from Turner Classic Movies listing of Evita.
## Production
### Failed projects: 1976–1986
Following the release of Evita (1976), a sung-through concept album by Tim Rice and Andrew Lloyd Webber detailing the life of Eva Perón, director Alan Parker met with their manager David Land, asking if Rice and Lloyd Webber had thought of making a film version. He understood that they were more interested in creating a stage version with the album's original lyrics. The original West End theatre production of Evita opened at the Prince Edward Theatre on June 21, 1978, and closed on February 18, 1986. The subsequent Broadway production opened at the Broadway Theatre on September 25, 1979, and closed on June 26, 1983, after 1,567 performances and 17 previews. Robert Stigwood, producer of the West End production, wanted Parker to direct Evita as a film but, after completing work on the musical Fame (1980), Parker turned down the opportunity to helm Evita, telling Stigwood that he "didn't want to do back-to-back musicals".
The film rights to Evita became the subject of a bidding war among Warner Bros., Metro-Goldwyn-Mayer (MGM) and Paramount Pictures. Stigwood sold the rights to EMI Films for over $7.5 million. He also discussed the project with Jon Peters, who promised that he would convince his girlfriend Barbra Streisand to play the lead role if he were allowed to produce. Streisand, however, was not interested in the project because she had seen the stage version in New York and had not liked it. Stigwood turned down the offer, opting to stay involved as the film's sole producer. EMI ultimately dropped the project after merging with Thorn Electrical Industries to form Thorn EMI and producing several box-office flops under the new banner.
In May 1981, Paramount Pictures acquired the film rights, with Stigwood attached as a producer. Paramount allocated a budget of $15 million, and the film was scheduled to go into production by year-end. To avoid higher production costs, Stigwood, Rice and Lloyd Webber each agreed to take a smaller salary but a higher percentage of the film's gross. Stigwood hired Ken Russell to direct the film, based on the success of their previous collaboration Tommy (1975).
Stigwood and Russell decided to hold auditions with the eight actresses portraying Eva in the musical's worldwide productions, with an undisclosed number performing screen tests in New York and London. In November 1981, Russell continued holding screen tests at Elstree Studios. Karla DeVito was among those who auditioned for the role of Eva. Russell also travelled to London, where he screen tested Liza Minnelli wearing a blonde wig and custom-period gowns. He felt that Minnelli, a more established film actress, would be better suited for the role, but Rice, Stigwood and Paramount wanted Elaine Paige, the first actress to play Eva in the London stage production. Russell began working on his own screenplay without Stigwood, Rice or Lloyd Webber's approval. His script followed the outlines of the stage production, but established the character of Ché as a newspaper reporter. The script also contained a hospital montage for Eva and Ché, in which they pass each other on gurneys in white corridors as she is being treated for cancer, while Ché is beaten and injured by rioters. Russell was ultimately fired from the project after telling Stigwood he would not do the film without Minnelli.
As Paramount began scouting locations in Mexico, Stigwood began the search for a new director. He met with Herbert Ross, who declined in favor of directing Footloose (1984) for Paramount. Stigwood then met with Richard Attenborough, who deemed the project impossible. Stigwood also approached directors Alan J. Pakula and Hector Babenco, who both declined. In 1986, Madonna visited Stigwood in his office, dressed in a gown and 1940s-style hairdo to show her interest in playing Eva. She also campaigned briefly for Francis Ford Coppola to helm the film. Stigwood was impressed, stating that she was "perfect" for the part.
### Oliver Stone: 1987–1994
In 1987, Jerry Weintraub's independent film company Weintraub Entertainment Group (WEG) obtained the film rights from Paramount. Oliver Stone, a fan of the musical, expressed interest in the film adaptation and contacted Stigwood's production company RSO Films to discuss the project. After he was confirmed as the film's writer and director in April 1988, Stone travelled to Argentina, where he visited Eva's birthplace and met with the newly elected President Carlos Menem, who agreed to provide 50,000 extras for the production as well as allowing freedom of speech.
Madonna met with Stone and Lloyd Webber in New York to discuss the role. Plans fell through after she requested script approval and told Lloyd Webber that she wanted to rewrite the score. Stone then approached Meryl Streep for the lead role and worked closely with her, Rice and Lloyd Webber at a New York City recording studio to do preliminary dubbings of the score. Stigwood said of Streep's musical performance: "She learned the entire score in a week. Not only can she sing, but she's sensational – absolutely staggering."
WEG allocated a budget of $29 million, with filming set to begin in early 1989, but production was halted due to the 1989 riots in Argentina. Concerned for the safety of the cast and crew, Stigwood and Weintraub decided against shooting there. The filmmakers then scouted locations in Brazil and Chile, before deciding on Spain, with a proposed budget of $35 million; the poor box office performances of WEG's films resulted in the studio dropping the project. Stone took Evita to Carolco Pictures shortly after, and Streep remained a front-runner for the lead role. However, Streep began increasing her compensation requests; she demanded a pay-or-play contract with a 48-hour deadline. Although an agreement was reached, Streep's agent contacted Carolco and RSO Films, advising them that she was stepping down from the project for "personal reasons". Streep renewed her interest after 10 days, but Stone and his creative team had left the project in favor of making The Doors (1991).
In 1990, the Walt Disney Studios acquired the film rights to Evita, and Glenn Gordon Caron was hired to direct the film, with Madonna set to appear in the lead role. Disney was to produce the film under its film label Hollywood Pictures. Although Disney had spent $2–3 million in development costs, it canceled the plans in May 1991 when the budget climbed to $30 million. Disney chairman Jeffrey Katzenberg did not want to spend more than $25.7 million on the film. In November 1993, Stigwood sold the film rights to Andrew G. Vajna's production company Cinergi Pictures. Vajna later enlisted Arnon Milchan of Regency Enterprises as a co-financier, and Stone returned as the film's director after meeting with Dan Halsted, the senior vice president of Hollywood Pictures. Production was set to begin sometime in 1995 after Stone and Milchan concluded filming of Noriega, a film chronicling the life of Panamanian general and dictator Manuel Noriega. Stone and Milchan disagreed over the high production costs of Evita, Noriega (which was never filmed) and Nixon (1995), and Stone left the project in July 1994.
### Development
In December 1994, Alan Parker signed on to write and direct the film after being approached by Stigwood and Vajna. Parker also produced the film, with his Dirty Hands Productions banner enlisted as a production company. While drafting his own script, Parker researched Eva's life, compiling newspaper articles, documentaries and English-language books. He refused to borrow elements from Stone's script or the stage play, instead opting to model his script after Rice and Lloyd Webber's concept album. Stone had a falling out with Parker over the content of the script, claiming that he had made significant contributions. A legal dispute and arbitration by the Writers Guild of America resulted in Parker and Stone sharing a screenwriting credit.
> "While Evita is a story of people whose lives were in politics, it is not a political story. It is a Cinderella story about the astonishing life of a girl from the most mundane of backgrounds, who became the most powerful woman her country (and indeed Latin America) had ever seen, a woman never content to be a mere ornament at the side of her husband, the president."
Parker's finished script included 146 changes to the concept album's music and lyrics. In May 1995, he and Rice visited Lloyd Webber at his home in France, where Parker tried to persuade the pair to work together on the film. Rice and Lloyd Webber had not worked together for many years, and the script for Evita required that they compose new music. In June 1995, with assistance from the United States Department of State and United States Senator Chris Dodd, Parker arranged a private meeting with Menem in Argentina to discuss the film's production and request permission to film at the Casa Rosada, the executive mansion. Although he expressed his discontent with the production, Menem granted the filmmakers creative freedom to shoot in Argentina, but not in the Casa Rosada. He also advised Parker to be prepared to face protests against the film. Parker had the film's production designer Brian Morris take photographs of the Casa Rosada, so that the production could construct a replica at Shepperton Studios in England. The director visited seven other countries before deciding to film on location in Buenos Aires and Budapest.
### Casting
Antonio Banderas was the first actor to secure a role in the film. He was cast as Ché when Glenn Gordon Caron was hired to direct the film, and remained involved when Stone returned to the project. Before he left the project, Stone had considered casting Michelle Pfeiffer in the lead role of Eva, and this was confirmed in July 1994. Pfeiffer left the production when she became pregnant with her second child. Parker also considered Glenn Close, along with Meryl Streep, to play Eva.
In December 1994, Madonna sent Parker a four-page letter explaining that she was the best person to portray Eva and would be fully committed to the role. She also sent him a copy of her "Take a Bow" music video as a way of "auditioning". Parker insisted that if Madonna was to be his Evita, she must understand who was in charge. "The film is not a glorified Madonna video," said Parker. "I controlled it and she didn't." Rice believed that Madonna suited the title role since she could "act so beautifully through music". Lloyd Webber was wary about her singing. Since the film required the actors to sing their own parts, Madonna underwent vocal training with coach Joan Lader to increase her own confidence in singing the unusual songs, and project her voice in a much more cohesive manner. Lader noted that the singer "had to use her voice in a way she's never used it before. Evita is real musical theater — it's operatic, in a sense. Madonna developed an upper register that she didn't know she had."
In January 1996, Madonna travelled to Buenos Aires to research Eva's life, and met with several people who had known her before her death. During filming, she fell sick many times due to the intense emotional effort required, and midway through production, she discovered she was pregnant. Her daughter Lourdes was born on October 14, 1996. Madonna published a diary of the film shoot in Vanity Fair. She said of the experience, "This is the role I was born to play. I put everything of me into this because it was much more than a role in a movie. It was exhilarating and intimidating at the same time ... And I am prouder of Evita than anything else I have done."
Parker decided to keep Banderas in the supporting role of Ché after checking the actor's audition tape. While writing the script, the director chose not to identify the character with Ernesto "Che" Guevara, which had been done in several versions of the musical. "In the movie Ché tells the story of Eva", Banderas said. "He takes a very critical view of her and he's sometimes cynical and aggressive but funny, too. At the same time he creates this problem for himself because, for all his principles, he gets struck by the charm of the woman." For the role of Juan Perón, Parker approached film and stage actor Jonathan Pryce, who secured the part after meeting with the director.
### Filming
#### Principal photography
The film's production in Argentina was met with controversy and sparked significant media attention. The cast and crew faced protests over fears that the project would tarnish Eva's image. Members of the Peronist Party launched a hate campaign, condemning the film's production, Madonna and Parker. Evita also prompted the government of Argentina to produce its own film, Eva Perón: The True Story (1996), to counter any misconceptions or inaccuracies caused by the film. In response to the controversy surrounding the project, the production held a press conference in Buenos Aires on February 6, 1996.
Principal photography began on February 8, 1996, with a budget of $55 million. Production designer Brian Morris constructed 320 different sets. Costume designer Penny Rose was given special access to Eva's wardrobe in Argentina, and she modeled her own costume designs after Eva's original outfits and shoes. She clothed 40,000 extras in period dresses. The production used more than 5,500 costumes from 20 costume houses located in Paris, Rome, London, New York City, San Francisco, Los Angeles, Buenos Aires, and Budapest as well as 1,000 military uniforms. Madonna's wardrobe included 85 costume changes, 39 hats, 45 pairs of shoes, and 56 pairs of earrings. She broke the Guinness World Record for "Most Costume Changes in a Film".
Filming began in Buenos Aires with scenes depicting Eva's childhood in 1936. Locations included Los Toldos, the town of Junín, where Eva was raised, and Chivilcoy, where her father's funeral was held. On February 23, 1996, Menem arranged a meeting with Parker, Madonna, Pryce and Banderas, and granted the crew permission to film in the Casa Rosada shortly before they were scheduled to leave Buenos Aires. On March 9, the production filmed the musical number "Don't Cry for Me Argentina" there, utilizing 4,000 extras for two days. Filming in Buenos Aires concluded after five weeks.
The cast and crew then moved to Budapest, Hungary, where 23 locations were used for scenes set in Buenos Aires. The production spent two days re-enacting Eva's state funeral, which required 4,000 extras to act as citizens, police officials and military personnel. The filmmakers shot exterior scenes outside St. Stephen's Basilica, but were denied access to film inside the building. For the musical numbers "Your Little Body's Slowly Breaking Down" and "Lament", Parker had Madonna and Pryce record the songs live on set, due to the emotional effort required from their performances. After five weeks of shooting in Hungary, the remainder of filming took place on sound stages at Shepperton Studios in England. Principal photography concluded on May 30, 1996, after 84 days of filming.
#### Cinematography
Director of photography Darius Khondji was initially reluctant about working on a musical but was inspired by Parker's passion for the project. For the film's visual style, Khondji and Parker were influenced by the works of American realist painter George Bellows. Khondji shot Evita using Moviecam cameras, with Cooke anamorphic lenses. He used Eastman EXR 5245 film stock for exteriors in Argentina, 5293 for the Argentinean interiors, and 5248 for any scenes shot during overcast days and combat sequences.
Khondji employed large tungsten lighting units, including 18K HMIs, dino and Wendy lights. He used Arriflex's VariCon, which functions as an illuminated filter, and incorporated much more lens filtration than he had on previous projects. Technicolor's ENR silver retention, when combined with the VariCon, was used to control the contrast and black density of the film's release prints. The finished film features 299 scenes and 3,000 shots from 320,000 feet (98,000 m) of film.
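The footage figure quoted above lends itself to a quick sense of scale. The short Python sketch below is a reader's illustration, not part of the source; it assumes standard 4-perf 35mm stock (16 frames per foot) projected at 24 frames per second, neither of which is stated in the text.

```python
# Illustrative only: rough scale of the 320,000 feet of film quoted above.
# Assumptions (not from the source): 4-perf 35mm at 16 frames per foot, 24 fps.

feet_shot = 320_000
metres_shot = feet_shot * 0.3048                 # ~97,500 m, matching the quoted ~98,000 m
frames_exposed = feet_shot * 16                  # total frames exposed
hours_of_footage = frames_exposed / 24 / 3600    # roughly 59 hours of raw material

print(f"{metres_shot:,.0f} m shot, roughly {hours_of_footage:.0f} hours of footage")
```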
### Music and soundtrack
Recording sessions for the film's songs and soundtrack began on October 2, 1995, at CTS Studios in London. It took almost four months to record all the songs, which involved creating the music first and then the lyrics. Parker declared the first day of recording as "Black Monday", and recalled it as a worrisome and nervous day. He said, "All of us came from very different worlds—from popular music, from movies, and from musical theater—and so we were all very apprehensive." The cast was also nervous; Banderas found the experience "scary", while Madonna was "petrified" when it came to recording the songs. "I had to sing 'Don't Cry for Me Argentina' in front of Andrew Lloyd Webber ... I was a complete mess and was sobbing afterward. I thought I had done a terrible job", the singer recalled.
According to the film's music producer Nigel Wright, the lead actors would first sing the numbers backed by a band and orchestra before working with Parker and music supervisor David Caddick "in a more intimate recording environment [to] perfect their vocals". More trouble arose as Madonna was not completely comfortable with "laying down a guide vocal simultaneously with an 84-piece orchestra" in the studio. She was used to singing over a pre-recorded track and not having musicians listen to her. Also, unlike her previous soundtrack releases, she had little to no creative control. "I'm used to writing my own songs and I go into a studio, choose the musicians and say what sounds good or doesn't ... To work on 46 songs with everyone involved and not have a big say was a big adjustment," she recalled. An emergency meeting was held between Parker, Lloyd Webber and Madonna, where it was decided that the singer would record her part at Whitfield Street Studios, a contemporary studio, while the orchestration would take place elsewhere. She also had alternate days off from the recording to preserve and strengthen her voice.
By the end of recording, Parker noticed that Rice and Lloyd Webber did not have a new song in place. They arranged a meeting at Lloyd Webber's county estate in Berkshire, where they began work on the music and lyrics for "You Must Love Me". Madonna's reaction to the lyrics was negative since she wanted Eva to be portrayed sympathetically, rather than as the "shrewd manipulator" that Parker had in mind. Although Madonna was successful in getting many portions in the script altered, Rice declined to change the song. He recalled, "I remember taking the lyrics to Madonna and she was trying to change them... The scene can be interpreted in different ways, but my lyrics were kept, thank God\!"
The soundtrack for Evita was released in the United States on November 12, 1996. Warner Bros. Records released two versions: a two-disc edition entitled Evita: The Complete Motion Picture Music Soundtrack, which featured all the tracks from the film, and Evita: Music from the Motion Picture, a single-disc edition. AllMusic's Stephen Thomas Erlewine described the soundtrack as "unengaging", while Hartford Courant's Greg Morago praised Madonna's singing abilities. The soundtrack was a commercial success, reaching number one in Austria, Belgium, Scotland, Switzerland and the United Kingdom, as well as selling over seven million copies worldwide.
## Release
In May 1996, Parker constructed a 10-minute trailer of Evita that was shown at the 1996 Cannes Film Festival for reporters, film distributors and critics. Despite a minor technical issue with the film projector's synchronization of the sound and picture, the trailer received a positive response. Roger Ebert of the Chicago Sun-Times wrote "If the preview is representative of the finished film, Argentina can wipe away its tears." Barry Walters of The San Francisco Examiner stated "Rather than showing the best moments from every scene, the trailer concentrates on a few that prove what Madonna, Banderas and Pryce can do musically. The results are impressive." Evita premiered at the Shrine Auditorium in Los Angeles on December 14, 1996, the Savoy Cinema in Dublin, Ireland, on December 19, 1996, and the Empire Theatre in London on December 20, 1996.
Hollywood Pictures gave the film a platform release, showing it in a few cities before expanding distribution in the following weeks. Evita opened in limited release in New York and Los Angeles on December 25, 1996, before being released nationwide on January 10, 1997. The film was distributed by Buena Vista Pictures in North America and Latin America. Cinergi handled distribution in other countries, with Paramount Pictures releasing the film in Germany and Japan (through United International Pictures), Summit Entertainment in other regions and Entertainment Film Distributors in the United Kingdom and Ireland. A book detailing the film's production, The Making of Evita, was written by Parker and released on December 10, 1996 by Collins Publishers. In 2002, Evita became the first and only American film to be screened at the Pyongyang International Film Festival.
### Home media
Evita was released on VHS on August 5, 1997, and on LaserDisc on August 20, 1997. A DTS LaserDisc version and a "Special Edition" LaserDisc by the Criterion Collection were both released on September 17, 1997. Special features on the Criterion LaserDisc include an audio commentary by Parker, Madonna's music videos for "Don't Cry for Me Argentina" and "You Must Love Me", two theatrical trailers and five TV spots. The film was released on DVD on March 25, 1998. A 15th Anniversary Edition was released on Blu-ray on June 19, 2012. The Blu-ray presents the film in 1080p high definition, and features a theatrical trailer, the music video for "You Must Love Me" and a behind-the-scenes documentary entitled "The Making of Evita".
## Reception
### Box office
Evita grossed $71,308 on its first day of limited release, an average of $35,654 per theater. By the end of its first weekend, the film had grossed $195,085, with an overall North American gross of $334,440. More theatres were added on the following weekend, and the film grossed a further $1,064,660 in its second weekend, with an overall gross of $2,225,737.
Released to 704 theaters in the United States and Canada, Evita grossed $2,551,291 on its first day of wide release. By the end of its opening weekend, it had grossed $8,381,055, securing the number two position at the domestic box office behind the science-fiction horror film The Relic. Evita saw a small increase in attendance in its second weekend of wide release. During the four-day Martin Luther King Jr. Day weekend, the film moved to third place on domestic box office charts, and earned $8,918,183—a 6.4% overall increase from the previous weekend. It grossed another $5,415,891 during its fourth weekend, moving to fifth place in the top 10 rankings. Evita moved to fourth place the following weekend, grossing a further $4,374,631—a 19.2% decrease from the previous weekend. By its sixth weekend, the film moved from fourth to sixth place, earning $3,001,066.
Evita completed its theatrical run in North America on May 8, 1997, after 135 days (19.3 weeks) of release. It grossed $50,047,179 in North America, and $91,000,000 in other territories, for a worldwide total of $141,047,179.
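The grosses quoted in this section are mutually consistent. The following short Python sketch is a reader's cross-check rather than anything from the source, using only the figures stated above.

```python
# Cross-check of the box-office figures quoted above (all values from the text).

first_day_limited = 71_308
per_theater_average = 35_654
print(first_day_limited / per_theater_average)            # 2.0, consistent with a two-theater opening

wide_opening = 8_381_055       # first weekend of wide release
mlk_weekend = 8_918_183        # four-day Martin Luther King Jr. Day weekend
print((mlk_weekend - wide_opening) / wide_opening)        # ~0.064, the quoted 6.4% increase

fourth_weekend = 5_415_891
fifth_weekend = 4_374_631
print((fourth_weekend - fifth_weekend) / fourth_weekend)  # ~0.192, the quoted 19.2% decrease

domestic = 50_047_179
international = 91_000_000
print(domestic + international)                           # 141,047,179, the quoted worldwide total
```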
### Critical response
Evita received a mixed response from critics. Rotten Tomatoes sampled 39 reviews, and gave the film a score of 64%, with an average score of 6.7/10. The site's consensus reads: "Evita sometimes strains to convince on a narrative level, but the soundtrack helps this fact-based musical achieve a measure of the epic grandeur to which it aspires." Another review aggregator, Metacritic, assigned the film a weighted average score of 45 out of 100 based on 23 reviews from critics, indicating "mixed or average reviews".
Writing for the Hartford Courant, Malcolm Johnson stated "Against all odds, this long-delayed film version turns out to be a labor of love for director Alan Parker and for his stars, the reborn Madonna, the new superstar Antonio Banderas, the protean veteran Jonathan Pryce." Roger Ebert of the Chicago Sun-Times gave the film three-and-a-half out of four stars, writing "Parker's visuals enliven the music, and Madonna and Banderas bring it passion. By the end of the film we feel like we've had our money's worth, and we're sure Evita has." On the syndicated television program Siskel & Ebert & the Movies, Ebert and his colleague Gene Siskel gave the film a "two thumbs up" rating. Siskel, in his review for the Chicago Tribune, wrote, "Director Alan Parker has mounted this production well, which is more successful as spectacle than anything else." According to Time magazine's Richard Corliss, "This Evita is not just a long, complex music video; it works and breathes like a real movie, with characters worthy of our affection and deepest suspicions." Critic Zach Conner commented, "It's a relief to say that Evita is pretty damn fine, well-cast, and handsomely visualized. Madonna once again confounds our expectations. She plays Evita with a poignant weariness and has more than just a bit of star quality. Love or hate Madonna-Eva, she is a magnet for all eyes."
Carol Buckland of CNN considered that "Evita is basically a music video with epic pretensions. This is not to say it isn't gorgeous to look at or occasionally extremely entertaining. It's both of those things. But for all the movie's grand style, it falls short in terms of substance and soul." Newsweek's David Ansen wrote "It's gorgeous. It's epic. It's spectacular. But two hours later, it also proves to be emotionally impenetrable." Giving the film a C− rating, Owen Gleiberman of Entertainment Weekly criticized Parker's direction, stating, "Evita could have worked had it been staged as larger-than-life spectacle ... The way Alan Parker has directed Evita, however, it's just a sluggish, contradictory mess, a drably "realistic" Latin-revolution music video driven by a soundtrack of mediocre '70s rock." Janet Maslin from The New York Times praised Madonna's performance as well as the costume design and cinematography, but wrote that the film was "breathless and shrill, since Alan Parker's direction shows no signs of a moral or political compass and remains in exhausting overdrive all the time." Jane Horwitz of the Sun-Sentinel stated, "Madonna sings convincingly and gets through the acting, but her performance lacks depth, grace and muscle. Luckily, director Alan Parker's historic-looking production with its epic crowd scenes and sepia-toned newsreels shows her off well." Negative criticism came from the San Francisco Chronicle's Barbara Shulgasser, who wrote: "This movie is supposed to be about politics and liberation, the triumph of the lower classes over oppression, about corruption. But it is so steeped in spectacle, in Madonna-ness, in bad rock music and simple-minded ideas, that in the end it isn't about anything".
### Accolades
Evita received various awards and nominations, with particular recognition for Madonna, Parker, Rice, Lloyd Webber and the song "You Must Love Me". It received five Golden Globe Award nominations, and won three for Best Motion Picture – Musical or Comedy, Best Actress – Musical or Comedy (Madonna) and Best Original Song ("You Must Love Me"). At the 69th Academy Awards, the film won the Academy Award for Best Original Song ("You Must Love Me"), and was nominated in four other categories: Best Film Editing, Best Cinematography, Best Art Direction and Best Sound. Madonna appeared during the Academy Awards and performed "You Must Love Me". The National Board of Review named Evita one of the "Top 10 Films of 1996", ranking it at number four. At the 50th British Academy Film Awards, Evita garnered eight nominations, but did not win in any category. At the 1st Golden Satellite Awards, it received five nominations, and won three for Best Film, Best Original Song ("You Must Love Me"), and Best Costume Design (Penny Rose).
# Nicoll Highway collapse
The Nicoll Highway collapse occurred in Singapore on 20 April 2004 at 3:30 pm local time when a Mass Rapid Transit (MRT) tunnel construction site caved in, leading to the collapse of the Nicoll Highway near the Merdeka Bridge. Four workers were killed and three were injured, delaying the construction of the Circle Line (CCL).
The collapse was caused by a poorly designed strut-waler support system, a lack of monitoring and proper management of data caused by human error, and organisational failures of the Land Transport Authority (LTA) and construction contractors Nishimatsu and Lum Chang. The Singapore Civil Defence Force extracted three bodies from the site but was unable to retrieve the last due to unstable soil. An inquiry was conducted by Singapore's Manpower Ministry from August 2004 to May 2005, after which three Nishimatsu engineers and an LTA officer were charged under the Factories Act and Building Control Act respectively, and all four defendants were fined. The contractors gave S$30,000 each to the families of the victims as unconditional compensation.
Following the incident, the collapsed site was refilled, and Nicoll Highway was rebuilt and reopened to traffic on 4 December 2004. Heng Yeow Pheow, an LTA foreman whose body was never recovered, was posthumously awarded the Pingat Keberanian (Medal of Valour) for helping his colleagues to safety ahead of himself. In response to inquiry reports, the LTA and the Building and Construction Authority (BCA) revised their construction safety measures so they were above industry standards. The CCL tunnels were realigned, with Nicoll Highway station rebuilt to the south of the original site underneath Republic Avenue. The station and tunnels opened on 17 April 2010, three years later than planned.
## Background
### Nicoll Highway and Merdeka Bridge
The Singapore Improvement Trust first planned Nicoll Highway in the late 1940s to relieve the heavy rush-hour traffic along Kallang Road and provide an alternative route from Singapore's city centre to Katong and Changi. These plans were finalised in July 1953; they included construction of a bridge spanning the Kallang and Rochor rivers. The construction contract for Kallang Bridge was awarded to Paul Y. Construction Company in association with Messrs Hume Industries and Messrs Sime Darby for $4.485 million in December 1954. On 22 June 1956, Kallang Bridge was renamed Merdeka Bridge to reflect "the confidence and aspiration of the people of Singapore". Merdeka Bridge and Nicoll Highway opened on 17 August that year; crowds gathered on both ends of the bridge to witness the opening ceremony. By August 1967, the highway and the bridge had been widened to accommodate seven lanes.
### Nicoll Highway station
Nicoll Highway station was first announced in November 1999 as part of the Mass Rapid Transit's (MRT) Marina Line (MRL), which consisted of six stations from Dhoby Ghaut to Stadium. In 2001, Nicoll Highway station became part of Circle Line (CCL) Stage 1 when the MRL was incorporated into the CCL. The contract for the construction of Nicoll Highway station and tunnels was awarded to a joint venture between Nishimatsu Construction Co Ltd and Lum Chang Building Contractors Pte Ltd at S$270 million on 31 May 2001. In 1996, the joint venture was investigated for breaching safety rules in a previous project; infringements included loose planks on its scaffolding. In 1997, the companies damaged underground telecommunications cables in another underpass construction project.
The site was on land reclaimed during the 1970s and consisted of silty old alluvium and a 40 m (130 ft) layer of marine clay resulting from sea-level changes of the Kallang Basin. The station and tunnels were constructed from the "bottom-up": cut-and-cover excavation was supported by a network of steel king posts, walers, and struts to keep the site open.
## Incident
At about 3:30 pm local time on 20 April 2004, tunnels linking to Nicoll Highway station caved in along with a 100-metre (300-foot) stretch of Nicoll Highway near the abutment of Merdeka Bridge. The incident happened when most of the workers were on a tea break. The collapse of a tunnel's retaining wall created a hole 100 m long, 130 m wide, and 30 m deep (330 by 430 by 100 ft). One person was found dead, and three others, who were working on driving machinery at the bottom of the site, were initially reported missing. They included a foreman who had helped evacuate his workers to safety when the site collapsed but did not escape in time because a flight of exit stairs collapsed. Three injured workers were taken to hospital for treatment; two of them were discharged the same day. No motorists were on the stretch of road when it collapsed, and approaching drivers stopped in time.
Three power cables were severed, resulting in a 15-minute blackout in the Esplanade, Suntec City, and Marina Square areas. The collapse of the highway also damaged a gas service line. According to initial reports, eyewitnesses heard explosions and saw flames flashing across the highway; the Land Transport Authority (LTA), Singapore's transport agency, said it had no evidence of an explosion and that the witnesses might have mistaken the loud sound of the collapse for an explosion. As a precaution, the gas supply to the damaged pipe was shut off.
## Rescue and safety measures
The Singapore Civil Defence Force (SCDF) arrived at the site at 3:42 pm. After rescuing the three injured people, specialist SCDF units, such as the Disaster Assistance and Rescue Team and the Search Platoon, arrived as reinforcements to search for the missing workers. The first dead victim was found at 6:07 pm. All machinery was turned off while the SCDF used a life-detection device at the collapse site; when nothing was detected, sniffer dogs were brought into the search. The second body was recovered at 11:42 pm on 21 April.
Prime Minister Goh Chok Tong visited the site on 21 April; he praised the coordination between the SCDF and the Public Utilities Board (PUB) for the ongoing rescue efforts and expressed relief at the small number of fatalities. Goh extended his condolences to the families of the victims and said the rescue efforts should be the priority rather than apportioning blame. He added the government would convene a public inquiry. President S R Nathan visited the site on 22 April to pay tribute to the rescue workers.
A third body was recovered from the site at 12:15 am on 22 April. The SCDF had to excavate vertically through a pile of rubble and debris located within three cavities, two of which were flooded and blocked by twisted steel beams and struts. The operation was made significantly more difficult by the limited space for manoeuvring within the cavities and the lack of visibility in the flooded areas. The LTA detected stability problems at 1:05 am on 23 April; grouting was implemented to stabilise the soil while water was pumped out of the cavities, allowing rescuers to investigate further. Heavy rain in the afternoon caused soil erosion and halted the search. Because the collapsed area was unstable and could bury rescue workers and cause more damage to the surrounding area, the search for the foreman, Heng Yeow Pheow, was called off at 3:30 pm.
The Nicoll Highway collapse led to the deaths of four people:
- Vadivil s/o Nadesan, crane operator: A 45-year-old Malaysian of Indian descent, he was the first victim to be recovered. He had tried to escape by jumping out of his crane when the incident occurred and was found caught between a pick-up truck and a container.
- Liu Rong Quan, construction worker: The body of the 36-year-old Chinese national was found wedged between the wheel and chassis of a 10-tonne (22,000 lb) truck. Liu had started working at the site ten days before the incident.
- John Tan Lock Yong, LTA engineer: Tan, the third victim to be found, was found between a tipper truck and a container. Tan had been working on the station construction project for two years.
- Heng Yeow Pheow, LTA foreman: Heng's body was never recovered. According to survivor accounts, he had hurried his workers to safety, saving eight of them, but was trapped when the collapse occurred.
Safety measures were implemented after the collapse to minimise further damage to the collapsed area. A damaged canal had to be blocked up to prevent water from the Kallang River from entering the site, and canvas sheets were laid on slopes within the site to protect the soil. Although the surrounding buildings were assessed to be safe, they were monitored for stability; additional settlement markers and electro-level beams were installed at the nearby Golden Mile Complex. The LTA halted work at 16 of the 24 CCL excavation sites so they could be reviewed.
Near the incident site, the approach slab before the abutment of Merdeka Bridge had collapsed. To prevent displacement of the first span from triggering the collapse of the 610-metre (2,000 ft) bridge, the first and second spans were cut apart to isolate the first span; this also allowed Crawford Underpass beneath the bridge to be reopened. The work began on 23 April and was completed on 28 April. Eight prism points and five tiltmeters were installed to monitor any movement of the bridge.
The collapsed site was quickly stabilised through the injection of concrete into areas that were vulnerable to movement or further collapse. Several vehicles, equipment and construction materials were retrieved using a specialised crane. The remaining equipment and materials at the site were buried under infill to avoid further collapse. Access to the collapsed site via the completed parts of the tunnel and the shaft was sealed off.
## Committee of Inquiry
Singaporean authorities dismissed terrorism and sabotage as causes of the incident. On 22 April, Singapore's Ministry of Manpower established a Committee of Inquiry (COI) to investigate the cause of the incident. Senior District Judge Richard Magnus was appointed Chairman; he was assisted by assessors Teh Cee Ing from the Nanyang Technological University (NTU) and Lau Joo Ming from the Housing and Development Board. The COI called for 143 witnesses to provide evidence, including 14 experts. The COI visited the site on 23 April and the inquiry was originally scheduled for 1 June. Because all parties involved would need two-and-a-half months to prepare due to complex technical content, the inquiry was postponed to 2 August.
### Inquiry
At the first hearing, the inquiry panel established that there were "fundamental" design flaws in the worksite due to incorrect analysis of soil conditions by the contractors, which put more pressure on the retaining walls. In April, the LTA had said the collapse happened without warning, but it had already found flaws in Nishimatsu–Lum Chang's design in October 2001: the contractor had used a design-software simulator with incorrect parameters. An alternative design had been proposed in consultation with an NTU professor, but the contractor had rejected it. The LTA's technical advisor for design management had advised against excavation of the site due to incorrect data.
In the two months before the cave-in, the tunnel's retaining walls had moved more than the maximum allowed. The contractors had petitioned the LTA to increase the agreed maximum threshold of movement. They had miscalculated the amount of stress on the retaining walls but gave the LTA repeated assurances that their calculations were in order. Nishimatsu's senior on-site supervisor Teng Fong Sin claimed ignorance of the significance of the trigger values taken from the retaining wall; he said that even if he had been aware of it, he lacked the authority to halt the ongoing work.
No readings were taken in the two days leading up to the collapse: the soil-monitoring instruments, which were placed roughly in the centre of the collapsed area, had been buried, and the site supervisor Chakkarapani Balasubramani did not take the readings, although he raised the issue with the main contractor and was told the instruments would be dug out. Nishimatsu engineer Arumaithurai Ahilan said he saw "nothing alarming" in the soil-movement readings and accused Balasubramani of lying in testimony. Ahilan was also alerted to cracks caused by other ground movements; Nishimatsu addressed these by applying cement patches, and no further corrective action was taken because the buildings did not suffer any structural damage. According to a system analyst from Monosys, the project's subcontractor, the strain-detecting sensors recorded readings that were still below trigger values at 3 pm; these were the last readings obtained before the collapse at 3:30 pm.
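The "trigger values" referred to in the inquiry describe a standard instrumentation practice in deep excavations: each reading is compared against pre-agreed alert thresholds and escalating actions follow when a threshold is crossed. The minimal sketch below illustrates that practice in general terms only; the threshold figures, readings, and function name are illustrative assumptions and are not values or software from the Nicoll Highway project.

```python
# Illustrative sketch of a trigger-value check for wall-deflection readings.
# All figures below are hypothetical examples, not project data.

REVIEW_LEVEL_MM = 70    # assumed level at which readings should be reviewed
TRIGGER_LEVEL_MM = 100  # assumed level at which work should stop pending assessment


def classify_reading(deflection_mm: float) -> str:
    """Return the alert status for a single wall-deflection reading."""
    if deflection_mm >= TRIGGER_LEVEL_MM:
        return "trigger level exceeded: halt excavation and reassess the design"
    if deflection_mm >= REVIEW_LEVEL_MM:
        return "review level exceeded: investigate and increase monitoring"
    return "within limits"


if __name__ == "__main__":
    for reading in (45.0, 82.5, 112.0):  # hypothetical daily readings in millimetres
        print(f"{reading} mm -> {classify_reading(reading)}")
```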
The steel beams meant to hold up the walls had not been installed when workers dug further into the site. LTA supervisor Phang Kok Pin, whose duty was to confirm the correct installation of the support beams, said he typically visited the pit once or twice a day. He conducted only sporadic inspections and relied heavily on reports from Nishimatsu contractors to confirm that the beams had been installed correctly.
Nishimatsu supervisors were warned about failing support structures on the day of the collapse but instructed the subcontracting site supervisor Nallusamy Ramadoss to continue installing struts and pouring cement onto the buckled struts to strengthen the wall. The struts continued to bend before the collapse; Ramadoss warned his workers of the danger and evacuated them to safety. Some workers said they were not warned of any danger or given any safety briefings but escaped in time. Other workers reported hearing the "thungs" of bending walers before the cable bridge swayed and everything around them trembled and collapsed.
### Resumption and conclusion
The inquiry was adjourned on 30 August and resumed on 6 September. An interim report released to the government on 13 September noted "glaring and critical shortcomings" in the construction project that were also seen in other ongoing construction projects. Additionally, inexperienced personnel had been appointed to monitor the safety of the retaining wall system. The report recommended a more effective safety management system, an industry standard for the safety of temporary works, and a higher standard of reliability and accuracy in monitoring data; it was released so "corrective measures" could be implemented for other construction projects.
The LTA project manager Wong Hon Peng, who had been informed of the deflection readings four days before the collapse, admitted his lack of regard for safety, that his initial response was that "any solution adopted should not bring about claims against LTA", and that he had failed to heed the warnings. The project manager from Nishimatsu, Yoshiaki Chikushi, said he was unaware of the extent to which the struts supporting the construction site had buckled, and was consulting with the LTA on the day of the collapse after being alerted to the failing struts. To meet deadlines, Chikushi had accelerated the hacking of a wall, which led to the removal of support beams in the excavation, and had approved the grouting method that left gaps under some cables running across the site. He did not consider whether these methods could cause problems. The final phase of the hearing, which involved the consultation of experts on the causes of the weakening of the retaining wall, began on 24 January 2005 and concluded on 2 February. More than 170 witnesses were brought in during the 80 days of the inquiry.
The COI released its final report on 13 February 2005; it concluded the incident was preventable and had been caused by human error and organisational failures. The strut-waler support system was poorly designed and weaker than it should have been, and there was a lack of monitoring and proper management of data. The report said "warning signs", such as excessive wall deflections and surging inclinometer readings, were not seriously addressed, and blamed the collapse on the contractor. The people responsible were accused of indifference and laxity towards worksite safety. To address the lack of safety culture identified in the report, the COI restated several recommendations from its interim report to improve the safety of construction projects. The government accepted the report's recommendations.
## Aftermath
### Compensation to the victims
The families of the victims were each given S$30,000 as unconditional, ex gratia compensation by Nishimatsu and Lum Chang. Heng's family received an additional S$380,000 in settlements from the three construction firms involved with the collapse and S$630,000 in public donations. The public donations were placed into a trust fund set up by Heng's Member of Parliament, Irene Ng, from which expenses for his children's upkeep could be drawn until 2019.
### Honours and awards
Nine SCDF officers who were involved in the search and rescue efforts were awarded the Pingat Keberanian (Medal of Valour). SCDF Commissioner James Tan, who was in charge of the rescue team, was awarded the Pingat Pentadbiran Awam – Emas (Public Administration Medal – Gold), and 18 other SCDF officers received other State medals. In May 2004, Heng was posthumously honoured with the Pingat Keberanian for prioritising the safety of his colleagues over his own escape. In 2014, three former colleagues whom Heng rescued inaugurated a memorial bench dedicated to the foreman at Tampines Tree Park. The ceremony, initiated by MP Irene Ng, was attended by Heng's wife and his two children. The bench was funded by the Tampines Changkat Citizens' Consultative Committee. A commemorative stone and plaque were also erected at the former worksite, marking where Heng was believed to be buried. On every anniversary of the collapse, workers from Kori Construction visit the site to offer prayers and incense in Heng's honour.
### Criminal trials
The COI determined that Nishimatsu, L&M Geotechnic, Monosys and thirteen professionals from the LTA and Nishimatsu were responsible for the collapse. Those who received warnings included Nishimatsu personnel, an LTA engineer, soil engineers, and L&M Geotechnic and Monosys, which had been engaged for soil analysis. Three others were given counselling by the Manpower Ministry. Nishimatsu and three of its personnel faced criminal charges under the Factories Act. A qualified person from the LTA, who was project director of the CCL and responsible for monitoring the site's readings, faced charges under the Building Control Act.
The CCL project director's trial began on 3 October 2005; he was found guilty and fined S$8,000 on 24 November. On 28 April 2006, three senior executives from Nishimatsu were fined; the company's project director was fined S$120,000 for his failure to take appropriate measures concerning the buckling walls and for compromising safety due to flawed monitoring of instruments. The company's design manager and project coordinator were each fined S$200,000 for giving "blind approval" to the flawed designs.
### Construction safety reforms
The LTA and the Building and Construction Authority (BCA) introduced new safety protocols, including a Project Safety Review that identifies and reduces hazard risks. Safety requirements are now set above industry standards; these include doubled scaffold access to provide evacuation routes in an emergency and a man-cage at each excavation area for rescuers. The LTA no longer allows contractors to engage their own geotechnical firms; instead, it appoints an independent monitoring firm to check the instruments. Contractors are also no longer permitted to design and supervise their own temporary works, which are instead handled by independent consultants. Under the Safety Performance Scheme, contractors now face incentives or penalties and are required to maintain a Risk Register that identifies all hazards. The contractors and the LTA meet every six months to review safety performance and to identify and mitigate potential risks as works progress. These new regulations were reported to have driven up the cost of CCL construction works, alongside inflation and the rising cost of concrete.
### Highway reinstatement
Following the collapse, the LTA closed the stretch of Nicoll Highway from Middle Road to Mountbatten Road. Alternative roads leading into the city, including the junction of Kallang Road and Crawford Street, were widened to accommodate diverted traffic. The LTA also converted a bus-only lane at Lorong 1 Geylang towards Mountbatten Road into a regular traffic lane. On 25 April 2004, the part of Nicoll Highway running from Mountbatten Road to Stadium Drive was reopened for motorists accessing the area around the National Stadium of Singapore. Crawford Underpass, which runs under Merdeka Bridge, reopened on 29 April.
After the collapsed site was refilled, the highway was rebuilt on bored piles so the rebuilt stretch would not be affected by future excavation works. Reconstruction of the highway began on 24 August 2004 and the new stretch of highway reopened on 4 December.
### Station relocation and opening
On 4 February 2005, the LTA announced Nicoll Highway station would be relocated 100 m (330 ft) south of the original site along Republic Avenue, with a new tunnel alignment between the Millenia (now Promenade) and Boulevard stations. The LTA decided against rebuilding at the original site due to higher costs and the engineering challenges posed by debris left there. Prior to the collapse, Nicoll Highway and the adjacent Promenade station had been planned to have a cross-platform interchange with an unspecified future line; that line had to be realigned because the new Nicoll Highway station had no provision to be an interchange. The new tunnels were designed by Aecom consultants, and the tunnels to the previous site were demolished with special machinery from Japan.
The new station was built using the top-down method while the 1.8 km (1.1 miles) of tunnels were bored, minimising their impact on the environment. Retaining walls for the new station site were 1.5 m (4.9 ft) thick and entrenched 60 m (200 ft) underground – twice the previous depth. To reduce ground movement, the walls would be embedded into hard layers of soil. To ensure stability and prevent movement of the bored tunnels, the contractor implemented perforated vertical drains, and ground improvement efforts were undertaken in the vicinity of tunnel drainage sumps and cross-passages.
On 29 September 2005, the LTA marked the start of the new Nicoll Highway station's construction with a groundbreaking ceremony, during which the diaphragm walls were first installed. Due to the tunnel collapse, the completion date of CCL Stage 1 was initially delayed from 2007 to 2009, and further postponed until 2010. Nicoll Highway station opened on 17 April 2010, along with the stations on CCL Stages 1 and 2. |
# Hurricane Alex (2016)
Hurricane Alex was the first Atlantic hurricane to occur in January since Hurricane Alice of 1954–55. Alex originated as a non-tropical low near the Bahamas on January 7, 2016. Initially traveling northeast, the system passed by Bermuda on January 8 before turning southeast and deepening. It briefly acquired hurricane-force winds by January 10, then weakened slightly before curving towards the east and later northeast. Acquiring more tropical weather characteristics over time, the system transitioned into a subtropical cyclone well south of the Azores on January 12, becoming the first North Atlantic tropical or subtropical cyclone in January since Tropical Storm Zeta of 2005–2006. Alex continued to develop tropical features while turning north-northeast, and transitioned into a fully tropical cyclone on January 14. The cyclone peaked in strength as a Category 1 hurricane on the Saffir–Simpson scale (SSHWS), with maximum sustained winds of 85 mph (140 km/h) and a central pressure of 981 mbar (hPa; 28.97 inHg). Alex weakened to a high-end tropical storm before making landfall on Terceira Island on January 15. By that time, the storm was losing its tropical characteristics; it fully transitioned back into a non-tropical cyclone several hours after moving away from the Azores. Alex ultimately merged with another cyclone over the Labrador Sea on January 17.
The precursor cyclone to Hurricane Alex brought stormy conditions to Bermuda from January 7 to 9. On its approach to the Azores, the hurricane prompted hurricane and tropical storm warnings and the closure of schools and businesses. Alex brought gusty winds and heavy rain to the archipelago, though structural damage was generally minor. One person died of a heart attack because the inclement weather prevented them from being transported to hospital in time.
## Meteorological history
In early January 2016, a stationary front spanned the western Caribbean, spawning a non-tropical low along its boundary over northwestern Cuba by January 6. The low moved northeast ahead of the subtropical jet stream the following day, when its interaction with a shortwave trough produced a cyclonic disturbance at the lower atmospheric levels northeast of the Bahamas. This system proceeded northeast toward Bermuda, where unfavorable atmospheric conditions such as strong wind shear, low sea surface temperatures, and dry air initially inhibited tropical or subtropical cyclone formation. The system featured a large field of gale-force winds, with maximum sustained winds of 60–65 mph (95–100 km/h). On January 8, it passed about 75 mi (120 km) north of Bermuda, bringing strong winds and heavy rain to the islands. The next day, an unseasonable air pattern blocked the disturbance from continuing along its northeasterly path. Instead, the system turned east-southeast into a region slightly more favorable for subtropical development. On January 10, surface pressures below the system's core deepened to 979 mbar (hPa; 28.91 inHg) as the surrounding winds reached hurricane force. Concurrently, a warm-core seclusion at the upper levels marked the transition to a more symmetric structure, although convective activity near the center remained sparse. Once separated from the jet stream, the cyclone turned sharply to the south-southeast in response to a mid-latitude trough over the central Atlantic, entering a region with waters 2.7 °F (1.5 °C) warmer than average for January.
The system underwent substantial changes to its cyclonic structure on January 11–12: frontal boundaries separated from the core of the cyclone, its core became symmetric, it became co-located with an upper-level low, and intense albeit shallow convection developed atop the circulation. All these factors indicated the storm's transition into a subtropical cyclone by 18:00 UTC on January 12, at which point it was situated 1,150 mi (1,850 km) west-southwest of the Canary Islands and received the name Alex from the National Hurricane Center. The cyclone proceeded to the east-northeast and eventually north-northeast over the next day, steered by the same trough that had enabled the previous southward turn.
An eye feature soon appeared at the center of the cyclone's spiral bands, marking intensification. The 20 mi (25 km) wide feature cleared out early on January 14 and was surrounded by a largely symmetric ring of −75 °F (−60 °C) cloud tops. Alex then began to move away from the upper-level low it had been situated under and entered a region of lower wind shear, allowing the cyclone to acquire a deeper warm core with upper-level outflow typical of more tropical systems. Despite moving over 72 °F (22 °C) waters, Alex continued to deepen and became a fully tropical cyclone by 06:00 UTC on January 14 – a transition supported by greater instability than usual for tropical cyclones due to colder upper-tropospheric temperatures than those around the equator. Upon the storm's transition, Dvorak satellite estimates indicated that Alex had achieved hurricane strength. The hurricane achieved its peak intensity as a tropical cyclone with winds of 85 mph (140 km/h) and a minimum barometric pressure of 981 mbar (hPa; 28.97 inHg) soon thereafter, classifying it as a Category 1 on the Saffir–Simpson scale.
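Because a Saffir–Simpson category is determined solely by maximum sustained wind, Alex's classification follows directly from its 85 mph peak. The short sketch below applies the standard SSHWS wind thresholds; the function name and structure are illustrative choices rather than the API of any meteorological library.

```python
# Minimal sketch: map a maximum sustained wind (mph) to its Saffir-Simpson category.
# Thresholds are the standard SSHWS boundaries; the helper name is illustrative.

def sshws_category(wind_mph: float) -> str:
    """Classify a maximum sustained wind speed (mph) on the Saffir-Simpson scale."""
    if wind_mph < 39:
        return "tropical depression"
    if wind_mph < 74:
        return "tropical storm"
    for category, lower_bound in ((5, 157), (4, 130), (3, 111), (2, 96), (1, 74)):
        if wind_mph >= lower_bound:
            return f"Category {category}"


print(sshws_category(85))  # peak intensity of 85 mph  -> Category 1
print(sshws_category(65))  # landfall intensity of 65 mph -> tropical storm
```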
As Alex moved north toward the Azores, decreasing sea surface temperatures and increasing wind shear caused the cyclone to weaken through January 14 and 15. A deterioration of the convection around the hurricane's eye feature marked the start of its transition back to an extratropical cyclone. Becoming increasingly disorganized due to shear, Alex weakened to a tropical storm before making landfall over Terceira Island at 13:15 UTC with winds of 65 mph (100 km/h). Less than five hours later, the system completed its transition into an extratropical cyclone, featuring a more elongated circulation, an expanding radius of maximum winds, and frontal boundaries. Furthermore, the overall structure became more "comma-shaped" as a consequence of the frontal systems. The system deepened slightly to 978 mbar (hPa; 28.88 inHg) as it turned northwest towards Greenland. On its passage, the cyclone interacted with the mountainous southeastern coast of the island, generating hurricane-force winds over that region. Around 06:00 UTC on January 17, the remnants of Alex were absorbed into a larger extratropical low over the North Atlantic.
## Preparations and impact
### Bermuda
The precursor to Alex brought stormy conditions to Bermuda between January 7 and 9, dropping 1.33 in (34 mm) of rain at Bermuda International Airport over the course of these three days. Gusts to 60 mph (97 km/h) disrupted air travel, downed trees, and left 753 customers without power on January 8. Waves as high as 20 ft (6 m) prompted small craft advisories and the suspension of ferry services between the islands.
### Azores
When Alex was classified as a hurricane on January 14, the Azores Meteorological Service issued a hurricane warning for the islands of Faial, Pico, São Jorge, Graciosa, and Terceira, and a tropical storm warning for São Miguel and Santa Maria. A red alert – the highest level of weather warnings – was declared for central and eastern islands. Anticipating strong winds and heavy rain, homeowners stacked sandbags to protect their properties from flooding and boarded up doors and windows. Officials closed schools and administrative buildings for the duration of the storm. SATA Air Açores cancelled 33 domestic and international flights for the morning of January 15, stranding more than 700 passengers. The hurricane and tropical storm warnings were discontinued on January 15 after Alex had passed.
Traversing the archipelago on January 15, Alex brought heavy rain and gusty winds to several islands. Rainfall totaled 4.04 in (103 mm) in Lagoa, São Miguel, and 3.71 in (94 mm) in Angra do Heroísmo, Terceira. Wind gusts exceeded 50 mph (80 km/h) on Santa Maria Island and peaked at 57 mph (92 km/h) in Ponta Delgada, São Miguel. The strong winds brought down trees, damaged some roofs, and triggered scattered power outages. The storm caused minor flooding; six homes in Ponta Delgada sustained damage, while the winds destroyed the roof of another. Landslides occurred across the central islands, though their damage was limited. Overall, the storm's effects were milder than initially feared, possibly because the strongest winds were located far from the center of Alex as the system underwent an extratropical transition. One person suffering a heart attack died as an indirect result of Alex when turbulence from the storm hindered their emergency helicopter from taking off in time.
## Distinctions
Alex was only the sixth January tropical or subtropical cyclone on record, and only the third hurricane in January, along with an unnamed hurricane in 1938 and Hurricane Alice in 1955. Alex's landfall on Terceira as a strong tropical storm marked only the second such occurrence for an Atlantic tropical cyclone in January, after Hurricane Alice, which made landfall on the islands of Saint Martin and Saba. Alex was also only the second hurricane on record to form north of 30°N and east of 30°W.
The formation of Alex coincided with that of Hurricane Pali several thousand miles away in the Central Pacific in January 2016, marking the first recorded occurrence of simultaneous January tropical cyclones in the two basins.
## See also
- Other storms of the same name
- List of off-season Atlantic hurricanes
- Timeline of the 2016 Atlantic hurricane season
- List of Bermuda hurricanes
- List of Azores hurricanes
- Hurricane Ophelia (2017) – A Category 3 hurricane that attained that intensity farther east than any other storm on record
# October 1 (film)
October 1 is a 2014 Nigerian thriller film written by Tunde Babalola, produced and directed by Kunle Afolayan, and starring Sadiq Daba, Kayode Olaiya, and Demola Adedoyin. The film is set in the last months of Colonial Nigeria in 1960. It recounts the fictional story of Danladi Waziri (Daba), a police officer from Northern Nigeria, investigating a series of killings of young women in the remote Western Nigeria village of Akote just before 1 October 1960 – the date Nigeria gained independence from British colonial rule.
October 1 was produced with a budget of US$2 million (₦315 million in 2013) in Lagos, Ilara-Mokin, Akure, and villages neighbouring Akure, using period costumes and props, from August to September 2013. The film premiered on 28 September 2014 and opened to international audiences on 3 October. The film earned just over ₦100 million within six months of its release; Afolayan blamed film piracy for the film's low earnings.
October 1 deals with several themes, including the sexual abuse of children by religious authority figures, religious and ethnic conflict, politics in Colonial Nigeria, and Nigeria's unification and independence. Critics reviewed the film positively, praising its cinematography, production design and costuming, writing, and acting. The film also won several awards, including Best Feature Film, Best Screenplay, and Best Actor at the 2014 Africa International Film Festival.
## Plot
Police inspector Danladi Waziri is summoned by the British colonial authorities to present his findings on a series of rapes and murders of young women in Akote, a remote village in Western Nigeria. Upon his arrival in Akote, he is received by Sergeant Afonja, who tells him that a man on horseback being admired by several villagers is Prince Aderopo, the first of their community to graduate from university. As he begins his investigation, Waziri notices a pattern in the killings and concludes that the rapes and murders are the work of a serial killer. In the evening, while Aderopo is meeting with his childhood friends Tawa and Agbekoya in the village bar, one of his guards deserts his post to spend time with his lover. At the bar, Baba Ifa, the town's chief priest, warns Waziri and Afonja that the killings will continue until the murderer is satisfied. The next day, the dead body of the guard's lover is discovered.
Waziri orders the arrest of Baba Ifa, but Afonja refuses. Waziri suspends him and replaces him with his deputy, Corporal Omolodun. The body of an Igbo girl is discovered and Omolodun trails the killer along a bush path; the killer then kills Omolodun. Okafor, the girl's father, and his fellow tribesmen capture a travelling Hausa man, claiming that he is the serial killer. The accused man is taken into custody, but he maintains his innocence and tells Waziri that the actual perpetrator was whistling a tune. Waziri informs his superiors that he has found the murderer and will be closing the case. Okafor throws a machete at the man during his transfer, piercing his heart; as he is dying, the man continues to insist that he did not kill the girl.
After leaving a celebration of the investigation's closure, Waziri hears whistling and is assaulted by the killer. Although he is too drunk to identify him, he slowly remembers the killer's face as he recovers at Afonja's home. The next morning, he goes to the market square to observe the body language of Aderopo. Waziri visits Tawa and discovers that Aderopo and Agbekoya both received the same scholarship from Reverend Dowling, the village priest. Waziri visits Agbekoya, who reveals that Dowling molested him and Aderopo.
At an independence celebration, Aderopo invites Tawa to their childhood hideout, which has been renovated. Waziri and Afonja attempt to trail them, but are unsuccessful; Agbekoya, the only other person who knows the location of the hideout, leads them there. As they arrive, Aderopo is about to make Tawa his sixth victim, symbolizing the six years that he was abused by Dowling. Waziri and Afonja save Tawa. Waziri subsequently presents his account of the investigation to the British, who instruct him to withhold Aderopo's identity; he reluctantly agrees to do so for the sake of a peaceful independence.
## Cast
## Production
The idea for October 1 came from Kunle Afolayan's desire to direct a story set in a small community. Several writers submitted scripts before he met Tunde Babalola, who was eventually hired to write the screenplay, originally titled Dust. Afolayan also contributed to the script. Although Afolayan did not want to film a big-budget production, he eventually concluded that the script required one because he wanted to produce a "national film" that appealed to both younger and older audiences: "For the older generation, especially those who were part of independence, they will be able to see themselves in this film. For the younger generation it's a platform – many of them who don't know the story of Nigeria."
October 1's US$2 million budget (₦315 million in 2013) came from several governmental and corporate entities, including the Lagos State Government, Toyota Nigeria, Elizade Motors, Guinness, and Sovereign Trust Insurance. Over a thousand actors auditioned for the film. Afolayan stated that he selected Sadiq Daba to play Waziri because he wanted someone from the north of Nigeria who could speak Hausa and had a "look" that matched the film's 1960s aesthetics. Afolayan cast Deola Sagoe, the film's costume designer, as Funmilayo Ransome-Kuti – a Nigerian independence activist – because her features were similar to members of the Ransome-Kuti family. Afolayan also starred as Agbekoya.
Filming began in August 2013 in Lagos, with additional shooting in Ilara-Mokin, Akure, and villages neighbouring Akure. The production team was made up of approximately a hundred people and the film was shot using Red cameras. Modern inventions captured during principal photography were digitally removed during post-production. Shooting ended in September after 42 days.
Almost half of the props used in October 1 were made by art director Pat Nebo. Others were sourced from the United States and the United Kingdom, including television sets and shotguns from the 1950s, as the Nigeria Police Force had not kept old gear and such items were otherwise unavailable domestically. Some of the antique vehicles used in the film were obtained in Nigeria; many were refurbished. Sagoe designed the period costumes. The costume designers watched documentaries and researched archival material to capture the style of 1960s Nigeria. Cinematographer Yinka Edward said he used natural-looking lighting to capture a realistic look because he wanted to tie the cinematography to Aderopo's emotional state.
October 1 featured a score by Kulanen Ikyo, as well as the songs "Mama E" and "Bo Ko Daya" by Victor Olaiya, and "Sunny Sunny Day", written by Yvonne Denobis and produced by Ikyo.
## Release
October 1's first poster was released in June 2013, showing the Nigerian and British flags flying over a dusty town. In September, the filmmakers unveiled a set of character posters. The first trailer was released on 1 October 2013, Nigeria's 53rd Independence Day. The trailer won the "Best Fiction Film Trailer" award at the 2013 International Movie Trailers Festival Awards.
October 1 was originally set for release on 1 October 2013. In March 2014, Afolayan was unable to specify a release date, stating that he was avoiding releasing it at the same time as Render to Caesar, Half of a Yellow Sun, '76, and Dazzling Mirage. The filmmakers also announced that several versions would be released, including one for Nigerian audiences, one for a wider African audience, one for film festivals, and an international release version. Terra Kulture, a Nigerian arts promoter, organized private screenings prior to the film's wider release.
October 1 premiered at the Eko Hotels and Suites in Lagos on 28 September 2014. The event had a 1960s theme, featuring an exhibition with tours of the sets and displays of the film's costumes and props. Advance screenings began on 1 October 2014 and the film opened to a general audience on 3 October 2014. October 1 was screened at the 2014 Cultural Confidence event, hosted by the Nollywood Diaspora Film Series at New York University on 11 October 2014. The film was also officially selected for the 2014 Africa International Film Festival. The film's European premiere was in London on 3 November 2014 at the 2014 Film Africa Festival. October 1 also opened the 4th Africa Film Week in Greece. The Nation estimated that October 1 had grossed approximately ₦60 million as of January 2015. In an interview with The Netng in February 2015, Afolayan disclosed that the film had earned just over ₦100 million in six months. In April 2015, Afolayan learned that the film had been pirated, causing him to lash out on Twitter against the Igbo people in Eastern Nigeria, whom he perceived to be the source of the piracy; Afolayan quickly apologised for his remarks. He also questioned whether piracy would affect the film's profits going forward.
In December 2014, October 1 was released on DStv Explora's video-on-demand service. The following month, Afolayan announced that Netflix had acquired online distribution rights for the film, making it one of the first Nollywood films to be featured on Netflix. A behind-the-scenes documentary aired on DStv's Africa Magic channel in September 2014.
## Themes
Critics noted that October 1 addresses several themes, including the sexual abuse of children by religious authority figures, religious and ethnic conflict, politics and human rights in Colonial Nigeria, and Nigeria's unification and independence.
Many critics observed that October 1 critiques colonial rule in Nigeria through a variety of lenses. Filmmaker Onyeka Nwelue described the film as "sharpen[ing] the veracity of a society torn apart by its tribalism". Wilfred Okiche of YNaija linked the film's character study of psychological abuse with the nation's political dysfunction, which is rooted in a colonial logic of consolidating several tribal groups in one country. In The Nation, Victor Akande highlighted the film's commentary on the colonial mentality, pointing to Aderopo and Agbekoya's belief that Western education would improve them, while actually moving them away from tradition. Akande and Yishau Olukorede have noted that audiences would recognize parallels between those themes and the Boko Haram insurgency's criticism of Western education. Additionally, Jane Agouye, writing in The Punch, described the serial murders as a metaphor for the "rape of the country's natural resources by the white men". Toni Kan, reviewing the film in This Day, concluded that the film captured a "collective" anticipation about "the coming of independence, the beginning of a new era".
Scholars have likewise addressed October 1 from the perspective of the collective trauma that colonialism has imposed on Nigeria. Ezinne Michaelia Ezepue and Chidera G. Nwafor have argued that Afolayan "advocates for decolonization" by using the film's characters as stand-ins for the psychosocial effects of British colonial rule on Nigeria. Azeez Akinwumi Sesan has focused on the film's "rhetoric of return" to "selfhood and nationhood ... through the characterization and representation of collective catharsis as a product of the collective unconscious of a people or race". Osakue Stevenson Omoera has addressed the film from a human rights perspective, linking the film's exploration of sexual violence and ethnic tensions to contemporary sociopolitical issues in Nigeria.
The cast and crew of October 1 found similar themes in the film. Babalola said that "[t]he story depicted how independence affected the tribes in Nigeria" and that the film "is a metaphor of Nigeria and the many discriminatory things that happened to the land". Daba stated that the film "cuts across the whole of Nigeria and back to our colonial days. It talks about our ethnic intra-relationships and many more". Afolayan described the moral of the film as "the last line by one of the colonialists ... who said, 'Good or bad it is your country now'".
## Reception
October 1 received generally positive reviews. Amarachukwu Iwuala, writing for Pulse, applauded the cinematography, production design and costuming, writing, and acting. In This Day, Toni Kan commended the writing and casting, praising Kehinde Bankole for her portrayal of Tawa. Kan also praised the direction and plot, noting that although the killer is presented from the beginning of the film, Afolayan leaves the audience doubting whether they have actually interpreted the evidence correctly. Nollywood Reinvented awarded the film a rating of 72%, praising its writing, but criticising the first half of the film for making it "too easy" for the audience to guess the killer's identity. Onyeka Nwelue likewise praised the writing for its crisp dialogue and the film's historical accuracy, calling the film "the work of a genius", but he wrote that he would have liked the film to feature historical figures involved in Nigeria's independence movement. Augustine Ogwo praised the film's cinematography, casting, and set design. He concluded that the film "will definitely stand the test of time" and predicted that it would generate continuing discussions on national issues. Wilfred Okiche of YNaija praised the film's production design, but noted "some niggling issues with live action scenes and vivid stunts". Isabella Akinseye of Nolly Silver Screen rated the film 3.4 out of 5 stars, stating that the film attempted to do too much, distracting from its cinematography, costume, production design and acting, which she praised. Babatunde Onikoyi said that the film exemplifies Afolayan's status as a director of New Nigerian Cinema.
### Accolades
October 1 won Best Feature Film and Best Screenplay at the 2014 Africa International Film Festival; Sadiq Daba won Best Actor. The film also received 12 nominations at the 2015 Africa Magic Viewers Choice Awards and won nine, including Best Movie of the Year and Best Movie Director; Kehinde Bankole won Best Actress.
## See also
- List of Nigerian films of 2014
- Cinema of Nigeria
- Decolonisation of Africa |
# Newton's parakeet
Newton's parakeet (Psittacula exsul), also known as the Rodrigues parakeet or Rodrigues ring-necked parakeet, is an extinct species of parrot that was endemic to the Mascarene island of Rodrigues in the western Indian Ocean. Several of its features diverged from related species, indicating long-term isolation on Rodrigues and subsequent adaptation. The rose-ringed parakeet of the same genus is a close relative and probable ancestor. Newton's parakeet may itself have been ancestral to the endemic parakeets of nearby Mauritius and Réunion.
Around 40 centimetres (16 in) long, Newton's parakeet was roughly the size of a rose-ringed parakeet. Its plumage was mostly greyish or slate blue in colour, which is unusual in Psittacula, a genus containing mostly green species. The male had stronger colours than the female and possessed a reddish instead of black beak, but details of a mature male's appearance are uncertain; only one male specimen is known, and it is believed to be immature. Mature males might have possessed red patches on the wing like the related Alexandrine parakeet. Both sexes had a black collar running from the chin to the nape, but this was clearer in the male. The legs were grey and the iris yellow. Some 17th-century accounts indicate that some members of the species were green, which would suggest that both blue and green colour morphs occurred, but no definitive explanation exists for these reports. Little is known about its behaviour in life, but it may have fed on the nuts of the bois d'olive tree, along with leaves. It was very tame and was able to mimic speech.
Newton's parakeet was first written about by the French Huguenot François Leguat in 1708 and was only mentioned a few times by other writers afterwards. The specific name "exsul" is a reference to Leguat, who was exiled from France. Only two life drawings exist, both of a single specimen held in captivity in the 1770s. The species was scientifically described in 1872, with a female specimen as the holotype. A male, the last specimen recorded, was collected in 1875, and these two specimens are the only ones that exist today. The bird became scarce due to deforestation and perhaps hunting, but it was thought to have been finally wiped out by a series of cyclones and storms that hit Rodrigues in the late 19th century. Speculation about the possible survival of the species, though unfounded, lasted as late as 1967.
## Taxonomy
Newton's parakeet was first recorded by François Leguat in his 1708 memoir A New Voyage to the East Indies. Leguat was the leader of a group of nine French Huguenot refugees who colonised Rodrigues between 1691 and 1693 after they were marooned there. Subsequent accounts were by Julien Tafforet, who was marooned on the island in 1726, in his Relation de l'Île Rodrigue, and then by the French mathematician Alexandre Pingré, who travelled to Rodrigues to view the 1761 transit of Venus.
In 1871, George Jenner, the British magistrate of Rodrigues, collected a female specimen; it was preserved in alcohol and given to Edward Newton, a British colonial administrator in Mauritius, who sent it to his brother, the ornithologist Alfred Newton. A. Newton scientifically described the species in 1872 and gave it the scientific name Palaeornis exsul. "Exsul" ("exiled") refers to Leguat, in that he was exiled from France when he gave the first description of the bird. Newton had tried to find a more descriptive name, perhaps based on colouration, but found it difficult. He refrained from publishing a figure of the female in his original description, though the journal Ibis had offered him the space. He instead wanted to wait until a male specimen could be procured since he imagined it would be more attractive. The female, which is the holotype specimen of the species, is housed in the Cambridge University Museum as specimen UMZC 18/Psi/67/h/1.
A. Newton requested further specimens, especially males, but in 1875 he finally published a plate of the female, lamenting that no male specimens could be found. Tafforet's 1726 account had been rediscovered the previous year, and A. Newton noted that it confirmed his assumption that the male would turn out to be much more colourful than the female. Newton's collector, the English naturalist Henry H. Slater, had seen a live Newton's parakeet the year before, but was not carrying a gun at the time. On 14 August 1875, William Vandorous shot a male specimen. It may have been the same specimen Slater had observed. It was subsequently sent to E. Newton by William J. Caldwell. This is the paratype of the species, numbered UMZC 18/Psi/67/h/2 and housed in the Cambridge Museum.
In 1876, the Newton brothers noted that they had expected the male would be adorned with a red patch on the wing, but that the absence of this indicated it was immature. They still found it more beautiful than the female. These two specimens are the only preserved individuals of the species. The mandible and sternum were extracted from the female specimen, and subfossil remains have since been found in the Plaine Corail caverns on Rodrigues. The American ornithologist James L. Peters used the name Psittacula exsul for Newton's parakeet in his 1937 checklist of birds, replacing the genus name Palaeornis with Psittacula, wherein he also classified other extant parakeets of Asia and Africa.
### Evolution
Based on morphological features, the Alexandrine parakeet (Psittacula eupatria) has been proposed as the founder population for all Psittacula species on Indian Ocean islands, with new populations settling during the species' southwards colonisation from its native South Asia. Features of that species gradually disappear in species further away from its range. Subfossil remains of Newton's parakeet show that it differed from other Mascarene Psittacula species in some osteological features, but also had similarities, such as a reduced sternum, which suggests a close relationship. Skeletal features indicate an especially close relationship with the Alexandrine parakeet and the rose-ringed parakeet (Psittacula krameri), but the many derived features of Newton's parakeet indicate it had long been isolated on Rodrigues.
Many endemic Mascarene birds, including the dodo, are descended from South Asian ancestors, and the British palaeontologist Julian Hume has proposed that this may also be the case for all parrots there. Sea levels were lower during the Pleistocene, so it was possible for species to colonise some of these less isolated islands. Although most extinct parrot species of the Mascarenes are poorly known, subfossil remains show that they shared common features such as enlarged heads and jaws, reduced pectoral bones, and robust leg bones. Hume has suggested that they all have a common origin in the radiation of the tribe Psittaculini, members of which are known as Psittaculines, basing this theory on morphological features and the fact that parrots from that group have managed to colonise many isolated islands in the Indian Ocean. The Psittaculini could have invaded the area several times, as many of the species were so specialised that they may have evolved significantly on hotspot islands before the Mascarenes emerged from the sea. Other members of the genus Psittacula from the Mascarenes include the extant echo parakeet (Psittacula eques echo) of Mauritius, as well as its extinct Réunion subspecies (Psittacula eques eques), and the Mascarene grey parakeet (Psittacula bensoni) of both Mauritius and Réunion.
A 2011 genetic study of parrot phylogeny was unable to include Newton's parakeet, as no viable DNA could be extracted. A 2015 genetic study by the British geneticist Hazel Jackson and colleagues included viable DNA from the toe-pad of the female Newton's parakeet specimen. It was found to group within a clade of rose-ringed parakeet subspecies (from Asia and Africa), which it had diverged from 3.82 million years ago. Furthermore, Newton's parakeet appeared to be ancestral to the parakeets of Mauritius and Réunion.
In 2018, the American ornithologist Kaiya L. Provost and colleagues found the Mascarene parrot (Mascarinus mascarinus) and Tanygnathus species to group within Psittacula, making that genus paraphyletic (an unnatural grouping), and stated this argued for breaking up the latter genus. To solve the issue, the German ornithologist Michael P. Braun and colleagues proposed in 2019 that Psittacula should be split into multiple genera. They placed Newton's parakeet in the new genus Alexandrinus, along with its closest relatives, the echo parakeet and the rose-ringed parakeet.
A 2022 genetic study by the Brazilian ornithologist Alexandre P. Selvatti and colleagues confirmed the earlier studies in regard to the relationship between Psittacula, the Mascarene parrot, and Tanygnathus. They suggested that Psittaculinae originated in the Australo–Pacific region (then part of the supercontinent Gondwana), and that the ancestral population of the Psittacula–Mascarinus lineage were the first psittaculines in Africa by the late Miocene (8–5 million years ago), and colonised the Mascarenes from there.
## Description
Newton's parakeet was about 40 cm (16 in) long – roughly the size of the rose-ringed parakeet. The wing of the male specimen was 198 mm (7.8 in), the tail 206 mm (8.1 in), the culmen 25 mm (0.98 in), and the tarsus was 22 mm (0.87 in). The wing of the female specimen was 191 mm (7.5 in), the tail 210 mm (8.3 in), the culmen 24 mm (0.94 in), and the tarsus was 22 mm (0.87 in). The male specimen was greyish blue (also described as "slatey blue") tinged with green, and darker above. The head was bluer, with a dark line running from the eye to the cere. It had a broad black collar running from the chin to the nape, where it became gradually narrower. The underside of the tail was greyish, the upper beak was dark reddish brown, and the mandible was black. The legs were grey and the iris yellow. The female was similar but had a greyer head and a black beak. The black collar was not so prominent as that of the male and did not extend to the back of the neck.
The general appearance of Newton's parakeet was similar to the extant Psittacula species, including the black collar, but the bluish grey colouration set it apart from other members of its genus, which are mostly green. It differed from its Mascarene relatives in some skeletal details, including in that the internal margin of the mandibular symphysis (where the two halves of the lower jaw connected) was oval instead of square-shaped when seen from above, and in that the upper end of the humerus (upper arm bone) was less expanded than in the Mascarene grey parakeet and the echo parakeet.
### Possible colour variation
The French naturalist Philibert Commerson received a live specimen on Mauritius in the 1770s and described it as "greyish blue". French artist Paul Philippe Sanguin de Jossigny made two illustrations of this specimen, the only known depictions of Newton's parakeet in life, unpublished until 2007. Though both existing specimens are blue, some early accounts from Rodrigues have caused confusion over the colouration of the plumage. One of these is Leguat's following statement:
> There are abundance of green and blew Parrets, they are of a midling and equal bigness; when they are young, their Flesh is as good as young Pigeons.
If the green parrots Leguat referred to were not the Rodrigues parrot (Necropsittacus rodericanus), they might perhaps have been a green colour morph of Newton's parakeet, as Hume has suggested. As A. Newton observed in his original description, some feathers of the female specimen display both blue and green tinges, depending on the light. This may explain some of the discrepancies. According to the British writer Errol Fuller, the green parrots mentioned could instead have been storm-blown members of Psittacula species from other islands that survived on Rodrigues for a short time.
The two existing specimens were originally preserved in alcohol, but though this can discolour specimens, it is not probable that it could turn green to blue. Hume and the Dutch ornithologist Hein van Grouw have also suggested that due to an inheritable mutation, some Newton's parakeets may have lacked psittacin, a pigment that together with eumelanin produces green colouration in parrot feathers. Complete lack of psittacin produces blue colouration, whereas reduced psittacin can produce a colour between green and blue called parblue, which corresponds to the colour of the two preserved Newton's parakeet specimens. The reason why only parblue specimens are known today may be due to collecting bias, as unusually coloured specimens are more likely to be collected than those of normal colour.
Tafforet also described what appears to be green Newton's parakeets, but the issue of colouration was further complicated by the mention of red plumage:
> The parrots are of three kinds, and in quantity ... The second species [mature male Newton's parakeet?] is slightly smaller and more beautiful, because they have their plumage green like the preceding [Rodrigues Parrot], a little more blue, and above the wings a little red as well as their beak. The third species [Newton's parakeet] is small and altogether green, and the beak black.
In 1987, the British ecologist Anthony S. Cheke proposed that the last two types mentioned were male and female Newton's parakeets, and that the differences between them were due to sexual dimorphism. The last bird mentioned had earlier been identified by A. Newton as introduced grey-headed lovebirds (Agapornis canus), but Cheke did not find this likely, as their beaks are grey. Pingré also mentioned green birds, perhaps with some red colours, but his account is partially unintelligible and therefore ambiguous. A red shoulder patch is also present on the related Alexandrine parakeet, but none of the existing Newton's parakeet specimens have red patches. Fuller suggested the single known male specimen may have been immature, judging by the colour of its beak, which may also explain the absence of the red patch. When Psittacula are bred by aviculturists, blue is easily produced from green; the production of blue may suppress red colouration, so blue morphs may have lacked the red patch.
## Behaviour and ecology
Almost nothing is known about the behaviour of Newton's parakeet, but it is probable that it was similar to that of other members of its genus. Leguat mentioned that the parrots of the island ate the nuts of the bois d'olive tree (Cassine orientalis). Tafforet also stated that the parrots ate the seeds of the bois de buis shrub (Fernelia buxifolia), which is endangered today, but was common all over Rodrigues and nearby islets during his visit. Newton's parakeet may have fed on leaves as the related echo parakeet does. The fact that it survived long after Rodrigues had been heavily deforested shows that its ecology was less vulnerable than that of, for example, the Rodrigues parrot.
Leguat and his men were hesitant to hunt the parrots of Rodrigues because they were so tame and easy to catch. Leguat's group took a parrot as a pet and were able to teach it to speak:
> Hunting and Fishing were so easie to us, that it took away from the Pleasure. We often delighted ourselves in teaching the Parrots to speak, there being vast numbers of them. We carried one to Maurice Isle [Mauritius], which talk'd French and Flemish.
The authors of the 2015 study which resolved the phylogenetic placement of the Mascarene island parakeets suggested that the echo parakeet of Mauritius would be a suitable ecological replacement for the Réunion parakeet and Newton's parakeet, due to their close evolutionary relationship. The echo parakeet was itself close to extinction in the 1980s, numbering only twenty individuals, but has since recovered, so introducing it to the nearby islands could also help secure the survival of this species.
Many other species endemic to Rodrigues became extinct after humans arrived, and the island's ecosystem remains heavily damaged. Forests had covered the entire island before humans arrived, but very little forest cover remains today. Newton's parakeet lived alongside other recently extinct birds such as the Rodrigues solitaire, the Rodrigues parrot, the Rodrigues rail, the Rodrigues starling, the Rodrigues scops owl, the Rodrigues night heron, and the Rodrigues pigeon. Extinct reptiles include the domed Rodrigues giant tortoise, the saddle-backed Rodrigues giant tortoise, and the Rodrigues day gecko.
## Extinction
Of the roughly eight parrot species endemic to the Mascarenes, only the echo parakeet has survived. The others were likely all made extinct by a combination of excessive hunting and deforestation by humans. Leguat stated that Newton's parakeet was abundant during his stay. It was still common when Tafforet visited in 1726, but when Pingré mentioned it in 1761, he noted that the bird had become scarce. It was still present on southern islets off Rodrigues (Isle Gombrani), along with the Rodrigues parrot. After this point, much of Rodrigues was severely deforested and used for livestock.
Early accounts praising its flavour suggest that visitors commonly ate Newton's parakeet; owing to the bird's small size, several individuals would likely have been needed to provide a single meal. Pingré stated:
> The perruche [Newton's parakeet] seemed to me much more delicate [than the flying-fox]. I would not have missed any game from France if this one had been commoner in Rodrigues; but it begins to become rare. There are even fewer perroquets [Rodrigues parrots], although there were once a big enough quantity according to François Leguat; indeed a little islet south of Rodrigues still retains the name Isle of Parrots [Isle Pierrot].
According to government surveyor Thomas Corby, Newton's parakeet may still have been fairly common in 1843. Slater reported that he saw a single specimen in southwestern Rodrigues during his three-month stay to observe the 1874 Transit of Venus, and assistant colonial secretary William J. Caldwell saw several specimens in 1875 during his own three-month visit. The male that he received in 1875 and gave to Newton is the last recorded member of the species. A series of cyclones struck the following year and may have devastated the remaining population. Further severe storms hit in 1878 and 1886, and since few forested areas were left by this time, there was little cover to protect any remaining birds. The male could, therefore, have been the last of the species alive.
There were unfounded rumours of its continued existence until the beginning of the 20th century. In 1967, the American ornithologist James Greenway stated that an extremely small population might still survive on small offshore islets, since such islets are often the last refuge of endangered birds. The Mauritian ornithologist France Staub stated in 1973 that his visits to Rodrigues over the previous seven years confirmed the bird was extinct. Hume concluded in 2007 that the islets were probably too small to sustain a population.
# Hurricane Nadine
Hurricane Nadine was an erratic Category 1 hurricane that became the fourth-longest-lived Atlantic hurricane on record. The fourteenth tropical cyclone and named storm of the 2012 Atlantic hurricane season, Nadine developed from a tropical wave traveling west of Cape Verde on September 10. On the following day, it strengthened into Tropical Storm Nadine. After initially tracking northwestward, Nadine turned northward, well away from any landmass. Early on September 15, Nadine reached hurricane status as it was curving eastward. Soon after, an increase in vertical wind shear weakened Nadine, and by September 16 it was back to a tropical storm. On the following day, the storm began moving northeastward and threatened the Azores, but late on September 19 Nadine veered east-southeastward before reaching the islands. Nonetheless, the storm produced tropical storm force winds on a few islands. On September 21, the storm curved south-southeastward while south of the Azores. Later that day, Nadine transitioned into an extratropical low pressure area.
Due to favorable conditions, the remnants of Nadine regenerated into a tropical cyclone on September 24. After re-developing, the storm executed a cyclonic loop and meandered slowly across the eastern Atlantic. Eventually, Nadine turned south-southwestward, at which time it became nearly stationary. By September 28, the storm curved northwestward and re-strengthened into a hurricane. The tenacious cyclone intensified further and peaked with winds of 90 mph (150 km/h) on September 30. By the following day, however, Nadine weakened back to a 65 mph (105 km/h) tropical storm, as conditions became increasingly unfavorable. Strong wind shear and decreasing sea surface temperatures significantly weakened the storm. Nadine transitioned into an extratropical cyclone on October 3, and merged with an approaching cold front northeast of the Azores soon after. The remnants of Nadine passed through the Azores on October 4 and again brought relatively strong winds to the islands.
## Origin and meteorological history
A large tropical wave emerged into the Atlantic Ocean from the west coast of Africa on September 7. The system passed south of Cape Verde on September 8, bringing disorganized showers and thunderstorms. Around that time, the National Hurricane Center gave the system a medium chance of tropical cyclogenesis within 48 hours. A low pressure area developed along the axis of the tropical wave on September 9, which further increased convective activity. The system was assessed with a high chance of tropical cyclone formation on September 10. Based on satellite intensity estimates, the National Hurricane Center designated the disturbance Tropical Depression Fourteen at 1200 UTC on September 10, while the system was about 885 miles (1,424 km) west of Cape Verde.
Although thunderstorm activity was initially minimal around the center of circulation, a convective band associated with the depression was becoming more organized. Late on September 10, convection began to increase slightly near the center, but because Dvorak intensity T-numbers were between 2.0 and 2.5, the depression was not upgraded to a tropical storm. However, dry air briefly caused showers and thunderstorms to decrease later that day. Initially, it headed just north of due west around the southern periphery of a large subtropical ridge. However, by September 11, the depression re-curved northwestward. Later that day, the depression began to regain deep convection. Geostationary satellite imagery and scatterometer data indicated that the depression strengthened into Tropical Storm Nadine at 0000 UTC on September 12.
## Strengthening and initial peak intensity
By September 12, a central dense overcast had developed, and due to favorable conditions the National Hurricane Center noted the possibility of rapid deepening. Intensification continued on September 12 at a brisk, though not rapid, pace. Later that day, sustained winds reached 65 mph (105 km/h). By early on September 13, convective banding wrapped almost completely around the center and cloud tops reached temperatures as low as −112 °F (−80 °C). However, because microwave satellite data could not determine whether an eye had developed, Nadine's intensity was held at 70 mph (110 km/h), just below the threshold of hurricane status. The National Hurricane Center noted that "the window for Nadine to strengthen may be closing", citing computer model consensus of an increase in wind shear and little change in structure. The storm then began experiencing moderate southwesterly wind shear on September 13, generated by a mid- to upper-level trough and a shear axis a few hundred miles to the west of Nadine. As a result, the storm struggled to develop an eye and the center became more difficult to locate.
Although the storm was disorganized, a scatterometer pass indicated tropical storm force winds extended outward up to 230 miles (370 km). The satellite appearance of Nadine became more ragged by September 14. Despite this, the storm remained just below hurricane status and the National Hurricane Center noted the possibility of intensification if wind shear decreased in the next few days. Nadine turned northward on September 14 as it tracked along the periphery of a subtropical ridge. Soon after, a Tropical Rainfall Measuring Mission (TRMM) pass indicated that core convection began re-organizing. However, because wind shear displaced the mid-level circulation to the north of the low-level circulation, Nadine was not upgraded to a hurricane. Because Nadine would approach colder sea surface temperatures, significant strengthening was considered unlikely. Due to an increase in satellite intensity estimates and re-organization, Nadine was upgraded to a hurricane at 1800 UTC on September 14. Six hours later, Nadine reached an initial peak intensity with winds of 80 mph (130 km/h). Satellite imagery indicated that a ragged eye feature was attempting to develop late on September 15.
## Weakening and initial post-tropical transition
Late on September 15, National Hurricane Center forecaster Robbie Berg noted that Nadine had begun "to look a little more ragged", as microwave observations indicated shearing of deep convection to the northeast of the center. Late on September 16, the eye became tilted and disappeared, the convective bands became disorganized, and the overall shower and thunderstorm activity had waned since early that day. Nadine weakened back to a tropical storm on September 17, and a trough further degraded its satellite presentation.
Dry air began impacting Nadine on September 17, though outflow from the storm prevented significant weakening. Despite a large flare of deep convection over the northern semicircle, Nadine weakened slightly later that day. Further weakening occurred on the following day, after the burst in deep convection on September 17 deteriorated. Later on September 18, most of the deep convection dissipated. The strongest showers and thunderstorms that remained were in a band to the west and northwest of Nadine's center.
Nadine threatened the Azores while moving northeastward and then northward between September 18 and September 19, though a blocking ridge prevented the storm from approaching the islands any closer. Its closest approach to the Azores was about 150 miles (240 km) south-southwest of Flores Island on September 19. The storm then re-curved east-southeastward on September 20, after the ridge weakened and the mid- to upper-level trough deepened. By late on September 21, much of the remaining deep convection was reduced to a ragged convective band with warming cloud tops. Operationally, the National Hurricane Center re-classified Nadine as a subtropical storm at 2100 UTC on September 21, due to a larger-than-average, asymmetrical wind field and an upper-level low pressure area near the center. However, post-season analysis concluded that Nadine had degenerated into a non-tropical low pressure area three hours earlier.
## Regeneration, peak intensity, and demise
Early on September 22, the National Hurricane Center noted that regeneration into a tropical cyclone was a distinct possibility. The remnant low pressure area soon moved over warmer seas and into a low-shear environment, causing deep convection to re-develop. Thus, Nadine became a tropical storm again at 0000 UTC on September 23. Another blocking ridge over the Azores forced Nadine to move west-northwestward on September 24, causing it to execute a small cyclonic loop. Although winds had increased to 60 mph (95 km/h), the storm weakened again, to 45 mph (72 km/h), on September 25. Despite this regression, satellite imagery suggested that Nadine had developed an eye-like feature; the National Hurricane Center later determined that this was merely a cloud-free area near the center of the storm. By September 26, Nadine curved south-southwestward to southwestward around the southeastern portion of a mid- to upper-level ridge over the western Atlantic.
After minimal change in strength for several days, Nadine finally began to intensify on September 27, aided by sea surface temperatures warmer than 79 °F (26 °C). At 1200 UTC on September 28, Nadine re-strengthened into a Category 1 hurricane on the Saffir–Simpson hurricane wind scale. Around that time, satellite imagery indicated that the storm had re-developed an eye feature. After the storm became disorganized, the National Hurricane Center erroneously downgraded Nadine to a tropical storm on September 29, before upgrading it to a hurricane again six hours later; Nadine had in fact remained a hurricane and was intensifying further. Winds increased to 85 mph (140 km/h) on September 30, after the eye became more distinct. At 1200 UTC, the storm attained its peak intensity with maximum sustained winds of 90 mph (150 km/h) and a minimum barometric pressure of 978 mbar (28.9 inHg).
After peak intensity, Nadine began weakening once again and deteriorated to a tropical storm at 1200 UTC on October 1. Northwesterly wind shear began to increase on October 3, after an upper-level trough that had been keeping shear low moved eastward. A few hours later, the low-level center became partially exposed, before becoming fully separated from the convection by 1500 UTC. Due to strong wind shear and cold sea surface temperatures, showers and thunderstorms rapidly diminished, and by late on October 3, Nadine was devoid of any deep convection. At 0000 UTC on October 4, Nadine transitioned into an extratropical low-pressure area, while about 195 miles (314 km) southwest of the central Azores. The low rapidly moved northeastward, degenerated into a trough of low pressure, and was absorbed by a cold front later that day.
## Impact and records
Tropical cyclone warnings and watches were issued on two separate occasions as Nadine approached the Azores. At 1000 UTC on September 18, a tropical storm watch was issued for the islands of Flores and Corvo. Although the tropical storm watch was discontinued at 2100 UTC, a tropical storm warning was implemented at that time for the islands of Corvo, Faial, Flores, Graciosa, Pico, São Jorge, and Terceira. At 1500 UTC on September 19, a tropical storm warning was also issued for São Miguel and Santa Maria. All watches and warnings were discontinued by late on September 21. After re-generating, Nadine posed a threat to the Azores again, which resulted in a tropical storm watch for the entire archipelago at 1500 UTC on October 1. Nine hours later, at 0000 UTC on the following day, the watch was upgraded to a tropical storm warning. After Nadine became extratropical, the warning was discontinued. On the storm's second approach toward the Azores, schools were closed and flights were cancelled.
Late on September 20, Flores reported a wind gust of 46 mph (74 km/h). A sustained wind speed of 62 mph (100 km/h) and a gust up to 81 mph (130 km/h) were reported at Horta on the island of Faial, as Nadine passed to the south on September 21. During the second Azores impact on October 4, the highest sustained wind speed reported was 38 mph (61 km/h) on São Miguel, while the strongest gust was 87 mph (140 km/h) at the Wind Power Plant on Santa Maria. On Pico Island, the pavement of the sports hall of the primary and secondary school in Lajes do Pico was destroyed. The remnants of Nadine produced a plume of moisture that dropped heavy rainfall over the United Kingdom, particularly in England and Wales, reaching 5.12 in (130 mm) at Ravensworth in the former. The rains flooded houses and disrupted roads and rails.
Nadine lasted a total of 24 days as a tropical, subtropical, and post-tropical cyclone, including 22.25 days as a tropical system. This makes it the fourth-longest-lasting Atlantic tropical cyclone on record, behind only the 1899 San Ciriaco hurricane at 28 days, Hurricane Ginger in 1971 at 27.25 days, and Hurricane Inga in 1969 at 24.75 days. When only counting time spent as a tropical storm or hurricane – 20.75 days – Nadine is the third-longest-lasting, behind only Hurricane Ginger in 1971 and the 1899 San Ciriaco hurricane. When Nadine was upgraded to a hurricane at 1800 UTC on September 14, it became the third-earliest-forming eighth hurricane of an Atlantic season, behind only an unnamed system in 1893 and Ophelia in 2005.
## See also
- List of Azores hurricanes
- Hurricane Alberto (2000)
- Hurricane Gordon (2006)
- Hurricane Leslie (2018) |
# Glacier National Park (U.S.)
Glacier National Park is a national park of the United States located in northwestern Montana, on the Canada–United States border. The park encompasses more than 1 million acres (4,100 km<sup>2</sup>) and includes parts of two mountain ranges (sub-ranges of the Rocky Mountains), more than 130 named lakes, more than 1,000 different species of plants, and hundreds of species of animals. This vast pristine ecosystem is the centerpiece of what has been referred to as the "Crown of the Continent Ecosystem", a region of protected land encompassing 16,000 sq mi (41,000 km<sup>2</sup>).
The region that became Glacier National Park was first inhabited by Native Americans. Upon the arrival of European explorers, it was dominated by the Blackfeet in the east and the Flathead in the western regions. Under pressure, the Blackfeet ceded the mountainous parts of their treaty lands to the federal government in 1895; these lands later became part of the park. Soon after the establishment of the park on May 11, 1910, a number of hotels and chalets were constructed by the Great Northern Railway. These historic hotels and chalets are listed as National Historic Landmarks, and a total of 350 locations are on the National Register of Historic Places. By 1932, work was completed on the Going-to-the-Sun Road, later designated a National Historic Civil Engineering Landmark, which provided motorists easier access to the heart of the park.
Glacier National Park's mountains began forming 170 million years ago when ancient rocks were forced eastward up and over much younger rock strata. Known as the Lewis Overthrust, these sedimentary rocks are considered to have some of the finest examples of early life fossils on Earth. The current shapes of the Lewis and Livingston mountain ranges and positioning and size of the lakes show the telltale evidence of massive glacial action, which carved U-shaped valleys and left behind moraines that impounded water, creating lakes. Of the estimated 150 glaciers over 25 acres (10 ha) in size which existed in the park in the mid-19th century during the late Little Ice Age, only 25 active glaciers remained by 2010. Scientists studying the glaciers in the park have estimated that all the active glaciers may disappear by 2030 if current climate patterns persist.
Glacier National Park retains almost all of the native plant and animal species that were present when European explorers first arrived. Large mammals such as American black bear, grizzly bear, bighorn sheep, elk, moose, mountain lion and mountain goats, as well as gray wolf, wolverine and Canadian lynx, inhabit the park. Hundreds of species of birds, more than a dozen fish species, and a few reptile and amphibian species have been documented. Species of butterflies, pollinating insects and other invertebrates number in the thousands.
The park has numerous ecosystems, ranging from prairie to tundra. The easternmost forests of western redcedar and hemlock grow in the southwest portion of the park. Forest fires occur in the park nearly every year; 1964 was the only year in the park's existence without one, while a record 64 fires occurred in 1936 alone. In 2003, six fires burned approximately 136,000 acres (550 km<sup>2</sup>), more than 13% of the park.
Glacier National Park borders Waterton Lakes National Park in Canada—the two parks are known as the Waterton-Glacier International Peace Park and were designated as the world's first International Peace Park in 1932. Both parks were designated by the United Nations as Biosphere Reserves in 1976, and in 1995 as World Heritage Sites. In April 2017, the joint park received a provisional Gold Tier designation as Waterton-Glacier International Dark Sky Park through the International Dark Sky Association, the first transboundary dark sky park.
## History
According to archeological evidence, Native Americans first arrived in the Glacier area some 10,000 years ago. The earliest occupants with lineage to current tribes were the Flathead (Salish) and Kootenai, Shoshone, and Cheyenne. The Blackfeet lived on the eastern slopes of what later became the park, as well as the Great Plains immediately to the east. The park region provided the Blackfeet shelter from the harsh winter winds of the plains, allowing them to supplement their traditional bison hunts with other game meat. The Blackfeet Indian Reservation borders the park in the east, while the Flathead Indian Reservation is located west and south of the park. When the Blackfeet Reservation was first established in 1855 by the Lame Bull Treaty, it included the eastern area of the current park up to the Continental Divide. To the Blackfeet, the mountains of this area, especially Chief Mountain and the region in the southeast at Two Medicine, were considered the "Backbone of the World" and were frequented during vision quests. In 1895 Chief White Calf of the Blackfeet authorized the sale of the mountain area, some 800,000 acres (3,200 km<sup>2</sup>), to the U.S. government for $1.5 million, with the understanding that they would maintain usage rights to the land for hunting as long as the ceded strip remained "public land of the United States". This established the current boundary between the park and the reservation.
While exploring the Marias River in 1806, the Lewis and Clark Expedition came within 50 mi (80 km) of the area that is now the park. A series of explorations after 1850 helped to shape the understanding of the area that later became the park. In 1885 George Bird Grinnell hired the noted explorer (and later well-regarded author) James Willard Schultz to guide him on a hunting expedition into what would later become the park. After several more trips to the region, Grinnell became so inspired by the scenery that he spent the next two decades working to establish a national park. In 1901 Grinnell wrote a description of the region in which he referred to it as the "Crown of the Continent". His efforts to protect the land made him the premier contributor to this cause. A few years after Grinnell first visited, Henry L. Stimson and two companions, including a Blackfoot, climbed the steep east face of Chief Mountain in 1892.
In 1891, the Great Northern Railway crossed the Continental Divide at Marias Pass, 5,213 ft (1,589 m), along the southern boundary of the park. In an effort to attract passengers, the Great Northern soon advertised the splendors of the region to the public and lobbied the United States Congress; in 1897 the area was designated a forest preserve. Under the forest designation, mining was still allowed but was not commercially successful. Meanwhile, proponents of protecting the region kept up their efforts. In 1910, under the influence of the Boone and Crockett Club, and spearheaded by George Bird Grinnell and Louis W. Hill, president of the Great Northern, a bill was introduced into the U.S. Congress which designated the region a national park. President William Howard Taft signed the bill into law on May 11, 1910. In 1910 Grinnell wrote, "This Park, the country owes to the Boone and Crockett Club, whose members discovered the region, suggested it being set aside, caused the bill to be introduced into congress and awakened interest in it all over the country".
From May until August 1910, the forest reserve supervisor, Fremont Nathan Haines, managed the park's resources as the first acting superintendent. In August 1910, William Logan was appointed the park's first superintendent. While the forest reserve designation had confirmed the traditional usage rights of the Blackfeet, the enabling legislation of the national park did not mention the guarantees to the Native Americans. The United States government's position was that, with the special designation as a national park, the mountains lost their multi-purpose public land status and the former rights ceased to exist, a position the Court of Claims confirmed in 1935. Some Blackfeet held that their traditional usage rights still existed de jure. In the 1890s, armed standoffs were narrowly avoided several times.
The Great Northern Railway, under the supervision of president Louis W. Hill, built a number of hotels and chalets throughout the park in the 1910s to promote tourism. These buildings, constructed and operated by a Great Northern subsidiary called the Glacier Park Company, were modeled on Swiss architecture as part of Hill's plan to portray Glacier as "America's Switzerland". Hill was especially interested in sponsoring artists to come to the park, building tourist lodges that displayed their work. His hotels in the park never made a profit but they attracted thousands of visitors who came via the Great Northern. Vacationers commonly took pack trips on horseback between the lodges or utilized the seasonal stagecoach routes to gain access to the Many Glacier areas in the northeast.
The chalets, built between 1910 and 1915, included Belton, St. Mary, Going-to-the-Sun, Many Glacier, Two Medicine, Sperry, Granite Park, Cut Bank, and Gunsight Lake. The railway also built Glacier Park Lodge, adjacent to the park on its east side, and the Many Glacier Hotel on the east shore of Swiftcurrent Lake. Louis Hill personally selected the sites for all of these buildings, choosing each for their dramatic scenic backdrops and views. Another developer, John Lewis, built the Lewis Glacier Hotel on Lake McDonald in 1913–1914. The Great Northern Railway bought the hotel in 1930 and it was later renamed Lake McDonald Lodge. The Great Northern Railway also established four tent camps at Red Eagle Lake, Cosley Lake, Fifty Mountain and Goat Haunt. The chalets and tent camps were located roughly 10–18 miles apart, and were connected by a network of trails that allowed visitors to tour Glacier's backcountry on foot or horseback. These trails were also constructed by the railroad. "Because of a lack of federal funds Great Northern assumed financial responsibility for all trail construction during this period, but was eventually reimbursed as funding became available." Today, only Sperry, Granite Park, and Belton Chalets are still in operation, while a building formerly belonging to Two Medicine Chalet is now Two Medicine Store. The surviving chalet and hotel buildings within the park are now designated as National Historic Landmarks. In total, 350 buildings and structures within the park are listed on the National Register of Historic Places, including ranger stations, backcountry patrol cabins, fire lookouts, and concession facilities. In 2017, Sperry Chalet closed early for the season due to the Sprague Fire, which subsequently burned the entire interior of the structure, leaving only the stone exterior standing. Due to the damage, the chalet was closed indefinitely, though the exterior stonework was stabilized in the fall of 2017. The rebuilding process was completed during the summers of 2018 and 2019, and a reopening ceremony was held in February 2020.
After the park was well established and visitors began to rely more on automobiles, work was begun on the 53-mile (85 km) long Going-to-the-Sun Road, completed in 1932. Also known simply as the Sun Road, the road bisects the park and is the only route that ventures deep into the park, going over the Continental Divide at Logan Pass, 6,646 ft (2,026 m) at the midway point. The Sun Road is also listed on the National Register of Historic Places and in 1985 was designated a National Historic Civil Engineering Landmark. Another route, along the southern boundary between the park and National Forests, is US Route 2, which crosses the Continental Divide at Marias Pass and connects the towns of West Glacier and East Glacier.
The Civilian Conservation Corps (CCC), a New Deal relief agency for young men, played a major role between 1933 and 1942 in developing both Glacier National Park and Yellowstone National Park. CCC projects included reforestation, campground development, trail construction, fire hazard reduction, and fire-fighting work. The increase in motor vehicle traffic through the park during the 1930s resulted in the construction of new concession facilities at Swiftcurrent and Rising Sun, both designed for automobile-based tourism. These early auto camps are now also listed on the National Register.
## Park management
Glacier National Park is managed by the National Park Service, with the park's headquarters in West Glacier, Montana. Glacier National Park received about 3.5 million visitors in 2019, surpassing its previous peak of 3.31 million in 2017. Glacier has had at least 2 million annual visitors consistently since 2012 and broke annual attendance records each year from 2014 to 2018.
In anticipation of the 100th anniversary of the park in 2010, major reconstruction of the Going-to-the-Sun Road was completed. The Federal Highway Administration managed the reconstruction project in cooperation with the National Park Service. Rehabilitation of major structures such as visitor centers and historic hotels, as well as improvements in wastewater treatment facilities and campgrounds, was expected to be completed by the anniversary date. The National Park Service also engaged in fishery studies of Lake McDonald to assess the status of native fish populations and develop programs to protect and enhance them. The restoration of park trails, education and youth programs, park improvements and many community programs were also planned.
The National Park Service mandate is to "... preserve and protect natural and cultural resources". The Organic Act of August 25, 1916 established the National Park Service as a federal agency. One major section of the Act has often been summarized as the "Mission", "... to promote and regulate the use of the ... national parks ... which purpose is to conserve the scenery and the natural and historic objects and the wildlife therein and to provide for the enjoyment of the same in such manner and by such means as will leave them unimpaired for the enjoyment of future generations." In keeping with this mandate, hunting is illegal in the park, as are mining, logging, and the removal of natural or cultural resources. Additionally, oil and gas exploration and extraction are not permitted. These restrictions caused considerable conflict with the adjoining Blackfeet Indian Reservation. When the Blackfeet sold the land to the United States government, they stipulated that they would maintain their usage rights to the area, many of which (such as hunting) later came into conflict with these regulations.
In 1974, a wilderness study was submitted to Congress which identified 95% of the area of the park as qualifying for wilderness designation. Unlike a few other parks, Glacier National Park has yet to be protected as wilderness, but National Park Service policy requires that identified areas listed in the report be managed as wilderness until Congress renders a full decision. Ninety-three percent of Glacier National Park is managed as wilderness, even though it has not been officially designated.
## Geography and geology
The park is bordered on the north by Waterton Lakes National Park in Alberta, and the Flathead Provincial Forest and Akamina-Kishinena Provincial Park in British Columbia. To the west, the north fork of the Flathead River forms the western boundary, while its middle fork is part of the southern boundary. The Blackfeet Indian Reservation provides most of the eastern boundary. The Lewis and Clark and the Flathead National Forests form the southern and western boundary. The remote Bob Marshall Wilderness Complex is located in the two forests immediately to the south.
The park contains over 700 lakes, but only 131 have been named as of 2016. Lake McDonald on the western side of the park is the longest at 10 mi (16 km) and the deepest at 464 ft (141 m). Numerous smaller lakes, known as tarns, are located in cirques formed by glacial erosion. Some of these lakes, like Avalanche Lake and Cracker Lake, are colored an opaque turquoise by suspended glacial silt, which also causes a number of streams to run milky white. Glacier National Park lakes remain cold year-round, with temperatures rarely above 50 °F (10 °C) at their surface. Cold water lakes such as these support little plankton growth, ensuring that the lake waters are remarkably clear. However, the lack of plankton lowers the rate of pollution filtration, so pollutants tend to linger longer. Consequently, the lakes are considered environmental bellwethers as they can be quickly affected by even minor increases in pollutants.
Two hundred waterfalls are scattered throughout the park. However, during drier times of the year, many of these are reduced to a trickle. The largest falls include those in the Two Medicine region, McDonald Falls in the McDonald Valley, and Swiftcurrent Falls in the Many Glacier area, which is easily observable and close to the Many Glacier Hotel. One of the tallest waterfalls is Bird Woman Falls, which drops 492 ft (150 m) from a hanging valley beneath the north slope of Mount Oberlin.
### Geology
The rocks found in the park are primarily sedimentary rocks of the Belt Supergroup. They were deposited in shallow seas between 1.6 billion and 800 million years ago. During the formation of the Rocky Mountains 170 million years ago, one region of rocks now known as the Lewis Overthrust was forced eastward 50 mi (80 km). This overthrust was several miles (kilometers) thick and hundreds of miles (kilometers) long. This resulted in older rocks being displaced over newer ones, so the overlying Proterozoic rocks are between 1.4 and 1.5 billion years older than the Cretaceous-age rocks they now rest on.
One of the most dramatic pieces of evidence of this overthrust is visible in the form of Chief Mountain, an isolated peak on the edge of the eastern boundary of the park rising 2,500 ft (800 m) above the Great Plains. There are six mountains in the park over 10,000 ft (3,000 m) in elevation, with Mount Cleveland at 10,466 ft (3,190 m) being the tallest. Aptly named Triple Divide Peak sends waters toward the Pacific Ocean, Hudson Bay, and the Gulf of Mexico. This peak can effectively be considered the apex of the North American continent, although the mountain is only 8,020 ft (2,444 m) above sea level.
The rocks in Glacier National Park are the best preserved Proterozoic sedimentary rocks in the world, with some of the world's most fruitful sources for records of early life. Sedimentary rocks of similar age located in other regions have been greatly altered by mountain building and other metamorphic changes; consequently, fossils are less common and more difficult to observe. The rocks in the park preserve such features as millimeter-scale lamination, ripple marks, mud cracks, salt-crystal casts, raindrop impressions, oolites, and other sedimentary bedding characteristics. Six fossilized species of stromatolites, early organisms consisting of primarily blue-green algae, have been documented and dated at about 1 billion years. The discovery of the Appekunny Formation, a well-preserved rock stratum in the park, pushed back the established date for the origination of animal life a full billion years. This rock formation has bedding structures which are believed to be the remains of the earliest identified metazoan (animal) life on Earth.
### Glaciers
Glacier National Park is dominated by mountains which were carved into their present shapes by the huge glaciers of the last ice age. These glaciers have largely disappeared over the last 12,000 years. Evidence of widespread glacial action is found throughout the park in the form of U-shaped valleys, cirques, arêtes, and large outflow lakes radiating like fingers from the base of the highest peaks. Since the end of the ice ages, various warming and cooling trends have occurred. The last recent cooling trend was during the Little Ice Age, which took place approximately between 1550 and 1850. During the Little Ice Age, the glaciers in the park expanded and advanced, although to nowhere near as great an extent as they had during the Ice Age.
During the middle of the 20th century, examination of maps and photographs from the previous century provided clear evidence that the 150 glaciers known to have existed in the park a hundred years earlier had greatly retreated, and in many cases disappeared altogether. Repeat photography of the glaciers, such as the pictures taken of Grinnell Glacier between 1938 and 2015, helps to provide visual confirmation of the extent of glacier retreat.
In the 1980s, the U.S. Geological Survey began a more systematic study of the remaining glaciers, which has continued to the present day. By 2010, 37 glaciers remained, but only 25 of them were at least 25 acres (0.10 km<sup>2</sup>) in area and therefore still considered active. Based on the warming trend of the early 2000s, scientists had estimated that the park's remaining glaciers would melt by 2020; however, a later estimate stated that the glaciers may be gone by 2030. This glacier retreat follows a worldwide pattern that has accelerated even more since 1980. Without a major climatic change in which cooler and moister weather returns and persists, the mass balance of the glaciers (the difference between their accumulation and ablation, or melting, rates) will remain negative, and the glaciers are projected to eventually disappear, leaving behind only barren rock.
After the end of the Little Ice Age in 1850, the glaciers in the park retreated moderately until the 1910s. Between 1917 and 1941, the retreat rate accelerated and was as high as 330 ft (100 m) per year for some glaciers. A slight cooling trend from the 1940s until 1979 helped to slow the rate of retreat, and in a few cases glaciers even advanced more than ten meters. However, during the 1980s, the glaciers in the park began a steady period of loss of glacial ice, which continues as of 2010. In 1850, the glaciers in the region near Blackfoot and Jackson Glaciers covered 5,337 acres (21.6 km<sup>2</sup>), but by 1979, the same region of the park had glacier ice covering only 1,828 acres (7.4 km<sup>2</sup>). Between 1850 and 1979, 73% of the glacial ice had melted away. At the time the park was created, Jackson Glacier was part of Blackfoot Glacier, but the two have since separated into individual glaciers.
It is unknown how glacial retreat may affect the park's ecosystems beyond the broad expectation that it will create new problems and intensify existing challenges over time. There is concern over negative impacts, such as the loss of habitat for plant and animal species that are dependent on cold water. Less glacial melt reduces stream flow during the dry summer and fall seasons and lowers water table levels overall, increasing the risk of forest fires. The loss of glaciers will also reduce the aesthetic appeal that glaciers provide to visitors. Amid the uncertainty of the emerging science, misinformation began to circulate in the news media and on social media in early to mid-2019, claiming that the Park Service had discreetly removed or changed placards, movies, brochures, and other literature warning that the park's glaciers would be gone by 2020. The claims apparently arose when the Park Service began updating its on-site placards to reflect the latest scientific findings. The "gone by 2020" date on one placard was replaced with, "When they will completely disappear, however, depends on how and when we act." Another placard states, "Some glaciers melt faster than others, but one thing is consistent: the glaciers in the park are shrinking."
### Climate
As the park spans the Continental Divide, and has more than 7,000 ft (2,100 m) in elevation variance, many climates and microclimates are found in the park. As with other alpine systems, average temperature usually drops as elevation increases. The western side of the park, in the Pacific watershed, has a milder and wetter climate, due to its lower elevation. Precipitation is greatest during the winter and spring, averaging 2 to 3 in (50 to 80 mm) per month. Snowfall can occur at any time of the year, even in the summer, and especially at higher altitudes. The winter can bring prolonged cold waves, especially on the eastern side of the Continental Divide, which has a higher elevation overall. Snowfalls are significant over the course of the winter, with the largest accumulation occurring in the west. During the tourist season, daytime high temperatures average 60 to 70 °F (16 to 21 °C), and nighttime lows usually drop into the 40 °F (4 °C) range. Temperatures in the high country may be much cooler. In the lower western valleys, daytime highs in the summer may reach 90 °F (32 °C).
Rapid temperature changes have been noted in the region. In Browning, Montana, just east of the park in the Blackfeet Reservation, a world record temperature drop of 100 °F (56 °C) in only 24 hours occurred on the night of January 23–24, 1916, when thermometers plunged from 44 to −56 °F (7 to −49 °C).
Glacier National Park has a highly regarded global climate change research program. Based in West Glacier, with its main headquarters in Bozeman, Montana, the U.S. Geological Survey has conducted climate change research in the park since 1992. In addition to the study of the retreating glaciers, the research includes forest modeling studies in which fire ecology and habitat alterations are analyzed; documentation of changes in alpine vegetation patterns; watershed studies in which stream flow rates and temperatures are recorded at fixed gauging stations; and atmospheric research in which UV-B radiation, ozone, and other atmospheric gases are analyzed over time. The research compiled contributes to a broader understanding of climate changes in the park. The data collected, when compared with data from other facilities around the world, help to correlate these climatic changes on a global scale.
Glacier is considered to have excellent air and water quality. No major areas of dense human population exist anywhere near the region and industrial effects are minimized due to a scarcity of factories and other potential contributors of pollutants. However, the sterile and cold lakes found throughout the park are easily contaminated by airborne pollutants that fall whenever it rains or snows, and some evidence of these pollutants has been found in park waters. Wildfires could also impact the quality of water. However, the pollution level is currently viewed as negligible, and the park lakes and waterways have a water quality rating of A-1, the highest rating given by the state of Montana.
## Wildlife and ecology
### Flora
Glacier is part of a large preserved ecosystem collectively known as the "Crown of the Continent Ecosystem", all of which is a primarily untouched wilderness of a pristine quality. Virtually all the plants and animals which existed at the time European explorers first entered the region are present in the park today.
Over 1,132 plant species have been identified parkwide. The predominantly coniferous forest is home to various species of trees such as the Engelmann spruce, Douglas fir, subalpine fir, limber pine and western larch, which is a deciduous conifer, producing cones but losing its needles each fall. Cottonwood and aspen are the more common deciduous trees and are found at lower elevations, usually along lakes and streams. The timberline on the eastern side of the park is almost 800 ft (244 m) lower than on the western side of the Continental Divide, due to exposure to the colder winds and weather of the Great Plains. West of the Continental Divide, the forest receives more moisture and is more protected from the winter, resulting in a more densely populated forest with taller trees. Above the forested valleys and mountain slopes, alpine tundra conditions prevail, with grasses and small plants eking out an existence in a region that enjoys as few as three months without snow cover. Thirty species of plants are found only in the park and surrounding national forests. Beargrass, a tall flowering plant, is commonly found near moisture sources, and is relatively widespread during July and August. Wildflowers such as monkeyflower, glacier lily, fireweed, balsamroot and Indian paintbrush are also common.
The forested sections fall into three major climatic zones. The west and northwest are dominated by spruce and fir and the southwest by red cedar and hemlock; the areas east of the Continental Divide are a combination of mixed pine, spruce, fir and prairie zones. The cedar-hemlock groves along the Lake McDonald valley are the easternmost examples of this Pacific climatic ecosystem.
Whitebark pine communities have been heavily damaged due to the effects of blister rust, a non-native fungus. In Glacier and the surrounding region, 30% of the whitebark pine trees have died and over 70% of the remaining trees are currently infected. The whitebark pine produces a high-fat seed, commonly known as a pine nut, that is a favorite food of red squirrels and Clark's nutcracker. Both grizzlies and black bears are known to raid squirrel caches of pine nuts, one of the bears' favorite foods. Between 1930 and 1970, efforts to control the spread of blister rust were unsuccessful, and continued destruction of whitebark pines appears likely, with attendant negative impacts on dependent species.
### Fauna
Virtually all the historically known plant and animal species, with the exception of the bison and woodland caribou, are still present, providing biologists with an intact ecosystem for plant and animal research. Two threatened species of mammals, the grizzly bear and the lynx, are found in the park. Although their numbers remain at historical levels, both are listed as threatened because in nearly every other region of the U.S. outside of Alaska, they are either extremely rare or absent from their historical range. On average, one or two bear attacks on humans occur each year. There have been 11 bear-related deaths since 1971, and 20 non-fatal injuries since 2001. The exact number of grizzlies and lynx in the park is unknown; however, the first ever scientific survey of the lynx population in the park was completed in 2021. The collected data will help researchers determine the number of individual lynx that populate certain areas of the park. Reports from state and federal resource agencies, such as the Montana Department of Fish, Wildlife and Parks, indicate that as of 2021, the grizzly population throughout the millions of acres in and around Glacier Park has climbed to around 1,051, more than triple the estimated 300 or so in 1975, when grizzlies were first listed as a threatened species. While exact population numbers for grizzlies and the smaller black bear are still unknown, biologists have implemented a variety of methods in their efforts to achieve more accuracy in determining population range. Another study has indicated that the wolverine, another very rare mammal in the lower 48 states, also lives in the park. There were only three or four wolf packs remaining in the park when it was established. Early rangers used guns, traps, and poison, and had eliminated the species from the park by 1936. Wolves recolonized Glacier National Park naturally during the 1980s. Sixty-two species of mammals have been documented, including badger, river otter, porcupine, mink, marten, fisher, two species of marmots, six species of bats, and numerous other small mammals. Other mammals such as the mountain goat (the official park symbol), bighorn sheep, moose, elk, mule deer, skunk, white-tailed deer, bobcat, coyote, and cougar are either plentiful or common.
Over 260 species of birds have been recorded, with raptors such as the bald eagle, golden eagle, peregrine falcon, osprey and several species of hawks residing year round. The harlequin duck is a colorful species of waterfowl found in the lakes and waterways. The great blue heron, tundra swan, Canada goose and American wigeon are species of waterfowl more commonly encountered in the park. Great horned owl, Clark's nutcracker, Steller's jay, pileated woodpecker and cedar waxwing reside in the dense forests along the mountainsides, and in the higher altitudes, the ptarmigan, timberline sparrow and rosy finch are the most likely to be seen. The Clark's nutcracker is less plentiful than in past years due to the decline in the number of whitebark pines.
Because of the colder climate, ectothermic reptiles are all but absent; only three reptile species have been documented: two species of garter snake and the western painted turtle. Similarly, only six species of amphibians are documented, although those species exist in large numbers. After a forest fire in 2001, a few park roads were temporarily closed the following year to allow thousands of western toads to migrate to other areas.
A total of 23 species of fish reside in park waters, and native game fish species found in the lakes and streams include the westslope cutthroat trout, northern pike, mountain whitefish, kokanee salmon and Arctic grayling. Glacier is also home to the threatened bull trout, which is illegal to possess and must be returned to the water if caught inadvertently. The introduction in previous decades of lake trout and other non-native fish species has greatly impacted some native fish populations, especially the bull trout and westslope cutthroat trout.
### Fire ecology
Forest fires were viewed for many decades as a threat to protected areas such as forests and parks. As a better understanding of fire ecology developed after the 1960s, forest fires were understood to be a natural part of the ecosystem. The earlier policies of suppression resulted in the accumulation of dead and decaying trees and plants, which would normally have been reduced had fires been allowed to burn. Many species of plants and animals actually need wildfires to help replenish the soil with nutrients and to open up areas that allow grasses and smaller plants to thrive. Glacier National Park has a fire management plan which ensures that human-caused fires are generally suppressed. In the case of natural fires, the fire is monitored and suppression is dependent on the size and threat the fire may pose to human safety and structures.
Increased population and the growth of suburban areas near parklands have led to the development of what is known as Wildland Urban Interface Fire Management, in which the park cooperates with adjacent property owners in improving safety and fire awareness. This approach is common to many other protected areas. As part of this program, houses and structures near the park are designed to be more fire resistant. Dead and fallen trees are removed from near places of human habitation, reducing the available fuel load and the risk of a catastrophic fire, and advance warning systems are developed to help alert property owners and visitors about forest fire potential during a given period of the year. Glacier National Park averages 14 fires burning 5,000 acres (20 km<sup>2</sup>) each year. In 2003, 136,000 acres (550 km<sup>2</sup>) burned in the park after a five-year drought and a summer season of almost no precipitation. This was the largest area burned in the park since its creation in 1910.
## Recreation
Glacier is distant from major cities. The closest airport is in Kalispell, Montana, southwest of the park. Amtrak's Empire Builder stops seasonally at East Glacier, and year-round at West Glacier and Essex. A fleet of restored 1930s White Motor Company coaches, called Red Jammers, offers tours on all the main roads in the park. The drivers of the buses are called "Jammers", due to the gear-jamming that formerly occurred during the vehicles' operation. The tour buses were rebuilt in 2001 by Ford Motor Company. The bodies were removed from their original chassis and built on modern Ford E-Series van chassis. They were also converted to run on propane to lessen their environmental impact. Later, new hybrid engines were adopted. As of 2017, 33 of the original 35 are still in operation.
Historic wooden tour boats, some dating back to the 1920s, operate on some of the larger lakes. Several of these boats have been in continuous seasonal operation at Glacier National Park since 1927 and carry up to 80 passengers. Three of these decades-old boats were added to the National Register of Historic Places in January 2018.
Hiking is popular in the park. Over half of the visitors to the park report taking a hike on the park's nearly 700 mi (1,127 km) of trails. A 110 mi (177 km) section of the Continental Divide National Scenic Trail spans most of the park from north to south, with a few alternative routes at lower elevations if high altitude passes are closed due to snow. The Pacific Northwest National Scenic Trail crosses the park for 52 mi (84 km) from east to west.
Dogs are not permitted on any trails in the park due to the presence of bears and other large mammals. Dogs are permitted at front country campsites that can be accessed by a vehicle and along paved roads.
Many day hikes can be taken in the park. Back-country camping is allowed at campsites along the trails. A permit is required and can be obtained from certain visitor centers or arranged for in advance. Much of Glacier's backcountry is usually inaccessible to hikers until early June due to accumulated snowpack and avalanche risk, and many trails at higher altitudes remain snow-packed until July. Campgrounds that allow vehicle access are found throughout the park, most of which are near one of the larger lakes. The campgrounds at St. Mary and at Apgar are open year-round, but conditions are primitive in the off-season, as the restroom facilities are closed and there is no running water. All campgrounds with vehicle access are usually open from mid-June until mid-September. Guide and shuttle services are also available.
The park attracts many climbers, though the rock of the Lewis Overthrust fault structure is old and loose. The seminal literature on climbing in the park, A Climber's Guide to Glacier National Park, was written by J. Gordon Edwards in 1961, with the latest edition published in 1995. The Glacier Mountaineering Society sponsors climbing in the park, issuing awards to climbers who summit all of the 10,000 ft (3,000 m) peaks or all five technical peaks.
The park is a popular destination for kayaking and fly fishing. A permit is not required to fish in park waters. The threatened bull trout must be released immediately back to the water if caught; otherwise, the regulations on limits of catch per day are liberal.
Winter recreation in Glacier is limited. Snowmobiling is illegal throughout the park. Cross-country skiing is permitted in the lower altitude valleys away from avalanche zones.
## See also
- Outline of Glacier National Park
- List of national parks of the United States
# Annie Dove Denmark
Annie Dove Denmark (September 29, 1887 – January 16, 1974) was an American music educator and academic administrator who was the fifth president of Anderson College (now Anderson University) in Anderson, South Carolina, from 1928 to 1953.
A talented musician in her youth, Denmark attended the Baptist University for Women (now Meredith College) and graduated with an artist's diploma in piano in 1908. She began her teaching career later the same year. For a period of eight years thereafter, she taught piano at Buies Creek Academy, the Tennessee College for Women, and Shorter College. She continued her studies as her career began; she spent the summer of 1909 in New York City studying under Rafael Joseffy, the 1916–1917 academic year studying under Alberto Jonás, and many successive summers during her time at Anderson attending the Chautauqua Institute. She began teaching at Anderson at the start of the 1917–1918 academic year. After the resignation of Anderson president John E. White in September 1927, her name was put forward as a potential successor and she had gained the full support of the trustees by December of that year.
Denmark took office as Anderson's fifth president in January 1928; she is commonly cited as the first woman president of a college or university in South Carolina, though this claim is incorrect. Taking on the school's substantial debt, she guided the school through the Great Depression and oversaw Anderson's transition from a four-year college to a two-year junior college, the first of its kind in the state. The remainder of the college's debt was paid off by the South Carolina Baptist Convention in May 1938, and attendance increased as World War II ended and the school enrolled more men than it ever had since becoming co-educational in 1931. She announced her resignation in April 1952 and ultimately left office in May 1953 following that year's commencement, concluding a 25-year presidency that remains the longest in Anderson's history. She was promptly elected president emeritus by the trustees and given an apartment on campus, though she instead retired to her hometown of Goldsboro, North Carolina, where she lived until her death in 1974.
She was the recipient of multiple honors during her life and following her death: Furman University awarded her an honorary degree in 1941; Anderson established the Denmark Society and the Annie Dove Denmark Award in 1944 and 1976, respectively; she was made the namesake of a dormitory building on campus in 1966; and she was inducted into the Anderson County Museum Hall of Fame in 2004.
## Early life and education
Annie Denmark was born in Goldsboro, North Carolina, on September 29, 1887, the fourth of five children born to Sara Emma and Willis Arthur Denmark. Her family had lived in Goldsboro for some time before her birth; her father moved there several years prior to the Civil War and was the Wayne County tax collector for 33 years. In addition to being an alderman in the town, he was co-founder of a church where he was superintendent of the Sunday school and a deacon. Sara was Willis's second wife; his first wife, Clarissa Boyette, was Sara's sister and had died about two years after the birth of their first and only child. Willis and Sara married eleven months following Clarissa's death. Annie was raised with close ties to the church; she was later described by the Anderson University historian Hubert Hester as a "gifted student" in music, even playing the organ at her church from 1897, at the age of ten, until 1908. She received her high school diploma in 1904 from the Goldsboro public schools and enrolled at the Baptist University for Women (BUW) in Raleigh, North Carolina, later the same year.
While at BUW, which changed its name to Meredith College the year after Denmark graduated, she was the president of a literary society for a year and was a member of the student council. One of her instructors there was Grace Louise Cronkhite, who later became her close friend and who, during Denmark's presidency, was dean of music at Anderson in addition to teaching piano, organ, and music theory. Denmark gave her graduating piano recital on April 22, 1908, and received an artist's diploma in piano a short time later. She continued to take instruction from Cronkhite for a year following her graduation. She also took graduate courses at Columbia University.
## Career
### Teaching career and start at Anderson, 1908–1927
Denmark accepted her first teaching position in 1908, shortly after her graduation from college, and taught during the 1908–1909 academic year at Buies Creek Academy—now Campbell University—in Buies Creek, North Carolina. One student that she taught at Buies Creek was Bessie Campbell, the daughter of J. A. Campbell, later made the namesake of the school. Denmark received a monthly salary of $45 in this position and spent $9 monthly on board. This salary was sufficient to send her to New York City during the summer of 1909, where she studied under pianist and teacher Rafael Joseffy. She then moved to Murfreesboro, Tennessee, where she taught for one year at the Tennessee College for Women as the piano instructor, and afterwards took the same position at Shorter College—now Shorter University—in Rome, Georgia, where she stayed from 1910 to 1916. In addition to teaching piano at Shorter, she taught a Sunday school class for young women at the Fifth Avenue Baptist Church in Rome. During the 1916–1917 academic year, she traveled back to New York to study at the Virgil Piano School under Alberto Jonás. She joined the faculty of Anderson College—now Anderson University—in Anderson, South Carolina, as the instructor of piano and harmony in 1917. For her first eight years at Anderson, she held a role as director of religious activities in addition to teaching. While teaching at Anderson, she continued her own studies. For many summers she traveled to Chautauqua, New York, to attend the Chautauqua Institute, and she took classes outside of her teaching schedule at Anderson during the school year. She eventually earned her Bachelor of Arts degree from Anderson in 1925. That same year, she was appointed dean of women by President John E. White, a position she kept for three years.
White resigned as president of Anderson effective September 1, 1927, leaving the position vacant. A committee of three members of the school's Board of Trustees was created to name his successor. R. H. Holliday, the business manager of the school, was named acting president in the intervening three months while the new permanent president was being selected. Denmark was not the first choice of the Board of Trustees: Charles E. Burts and R. C. Burts, brothers who were both from Newberry, South Carolina, each refused the job, and A. J. Barton, from Nashville, Tennessee, could not agree to terms with the board and therefore did not take the job either. Though little is known about the exact events that led to Denmark's election, it is known that her name was put forward for consideration by college trustee J. Dexter Brown and that the Board of Trustees was in unanimous support of her appointment to the presidency when asked at its meeting on December 15, 1927.
### President of Anderson College, 1928–1953
Denmark took office and became Anderson's fifth president on January 1, 1928. In doing so, she became the school's second lay president. She is sometimes referred to as the first woman college president in South Carolina, though she was predated in this distinction by Euphemia McClintock some 26 years earlier. Denmark was formally inaugurated as president just over a year later, on February 14, 1929. Her inauguration ceremony was well-attended by leaders in higher education throughout the southeast. She marked the occasion of her inauguration by declaring that day to be the inaugural observation of the college's annual Founders Day, recognizing the anniversary of the granting of the college charter on February 14, 1911. The Founders Day ceremony was often accompanied by a guest speaker, including people such as Clemson University President E. Walter Sikes and South Carolina First Lady Gladys Atkinson Johnston, an Anderson graduate and the wife of Governor Olin D. Johnston. Denmark inherited the college's debt of $60,000 upon taking office; the school also lacked an endowment. She spoke three times to the Board of Trustees at meetings between January and May 1928, concluding one speech with the line, "What are you trustees willing to sacrifice for Anderson College?", and then pledging a gift of $5,000 to the school, to be paid over the next few years.
As the school entered the 1930s and the Great Depression, it fought to maintain membership in the Southern Association of Colleges and Schools, which required members to have an endowment of no less than $500,000, of which Anderson at this point had built up not even one-fifth. This, among other reasons, contributed to a decision by Denmark and the trustees to convert Anderson to a junior college. The trustees voted in favor of this plan, and it was brought before the Baptist State Convention on December 4, 1929, at their meeting in Spartanburg. It was approved after much debate. Anderson opened its doors as a junior college for the first time at the start of the 1930–1931 academic year, making it the first junior college in the state. The college became co-educational the following year, admitting its first male students in 1931. Two years following the switch, enrollment had increased by 27 percent. The college reported a total of 199 enrolled students in 1931–1932. Around this time she published White Echoes, a collection of sermons preached by her predecessor, John E. White, during his time in charge of Anderson's First Baptist Church.
For the next several years, much of the administration's attention was focused on the school's financial troubles and the establishment of a college radio broadcasting station. The debt grew larger as the middle of the decade neared, and in 1935–1936 the college was paying $3,600 yearly in interest alone to the Hibernia Trust Company. Denmark and the trustees planned a large dinner to spur the fundraising campaign on April 6, 1936, though the dinner was canceled after an F2 tornado—part of a larger outbreak over the course of that and the previous day—struck the city of Anderson that afternoon, resulting in thirty injuries in addition to the loss of two mills and several homes and farms in town. The college did, however, collect $20,000 as a result of its storm insurance policy, which went towards restoring the heating plant and other general refurbishments. The school administration won a significant victory two years later when, on May 23, 1938, the South Carolina Baptist Convention assisted in paying the remainder of Anderson's debt, bringing many of the school's financial woes to a close.
Enrollment climbed over the next few years and spiked noticeably after World War II; 42 of the 53 men who enrolled at Anderson in 1946–1947 were veterans, largely a result of the G.I. Bill, and the school enrolled a record 409 students that academic year in total. Denmark helped to increase pay for Anderson faculty on multiple occasions: in May 1944, she recommended a "slight increase" in their salaries, and she introduced a salary bonus in March 1946. The college, still in some need of funds and a stable endowment, received $60,000 from the Baptist State Convention sometime between 1946 and 1947, which was used to modernize some of the campus's buildings. In 1944, she worked with the school administration to implement an honor code for the college under which students would be tried by their classmates, though some infractions (such as alcohol possession) meant a student would be subject to expulsion with no debate. Throughout her presidency, she maintained the college's close ties with the church; chapel attendance remained a requirement for all students, five days a week, up to and through her resignation.
At a meeting of the Board of Trustees on April 23, 1952, Denmark announced her resignation as president of the college, saying,
> I am herewith tendering to you as representing the Baptist State Convention of South Carolina, my resignation to take effect on January 1, 1953, or as soon thereafter as my successor can be found.
This date represented the 25th anniversary, to the day, of the beginning of her term, though her successor was ultimately not found until several months later. In her letter, she referenced the school's freedom from debt and good prospects for future financial support as well as her desire to allow the new president enough time to prepare for the next academic year. The trustees were quite surprised by this request and did not accept her resignation until the conclusion of the meeting, when she insisted that they do so. She gave her final president's report on January 22, 1953; at the same meeting, president-elect Elmer Francis Haight was introduced to the trustees. "Denmark Day" was celebrated on Founders Day of that year—February 14, 1953—during which the retiring president was honored by many former students and other guests of the college. Her official duties as president came to a close following the commencement exercises of May 22, 1953. Haight began his duties as Anderson's sixth president the following month.
During her presidency, Denmark held a number of other positions within higher education: she led the Southern Association of Colleges for Women as its president from 1934 to 1935, was a member of the Board of Trustees of the Southern Baptist Women's Missionary Union Training School in Louisville, Kentucky, and was the first woman to hold an office in the Baptist State Convention when she was its vice president in 1950.
## Later life and death
On Denmark's final day as president, the college's trustees elected her president emeritus and extended her an invitation to remain living on campus for the rest of her life. While she accepted the position, she opted to return to her hometown of Goldsboro. She took up several hobbies in her retirement, including collecting Madonnas and watching baseball. After suffering declining health for several months, she died on the morning of January 16, 1974, at Wayne Memorial Hospital in Goldsboro, at the age of 86. She never married and left no immediate family. Memorial services were held the following day at First Baptist Church in Goldsboro and in Anderson's auditorium. She was buried in Goldsboro's Willow Dale Cemetery.
## Legacy
On June 2, 1941, Furman University conferred upon Denmark the honorary Doctor of Letters degree at their commencement exercises. In 1944, during her presidency, the Denmark Society was established, which honored "outstanding graduates" of the college. Similarly, the Annie Dove Denmark Award bears her name; it is bestowed as Anderson's highest honor to non-alumni and was established in 1976. She received a certificate of service at Anderson's commencement in May 1961, along with her successor Elmer Francis Haight, as part of the school's fiftieth anniversary celebrations. West Dormitory, a dormitory building on Anderson's campus originally built in 1911 and in which Denmark resided during her tenure, was renamed Denmark Hall in her honor in 1966. She was the subject of an original biographical play produced by Anderson entitled The Denmark Story. It was supported by a 2010 grant from South Carolina Humanities and was staged in September 2010 at Anderson's Daniel Recital Hall, after which it toured around the state during winter 2011. She was honored as an inductee into the Anderson County Museum's Hall of Fame in 2004, alongside five others. Due to her contributions to the life of the college and the city as a whole, she was sometimes referred to as "the first citizen of Anderson"; many letters written to her were addressed to "Dr. Anderson". As of 2024, her 25-year presidency remains the longest in the college's history.
# Tropical Storm Hermine (1998)
Tropical Storm Hermine was the eighth tropical cyclone and named storm of the 1998 Atlantic hurricane season. Hermine developed from a tropical wave that emerged from the west coast of Africa on September 5. The wave moved westward across the Atlantic Ocean, and on entering the northwest Caribbean interacted with other weather systems. The resultant system was declared a tropical depression on September 17 in the central Gulf of Mexico. The storm meandered slowly north, and after being upgraded to a tropical storm made landfall in Louisiana, where it quickly deteriorated into a tropical depression again on September 20.
Before the storm's arrival, residents of Grand Isle, Louisiana, were evacuated. Because Hermine remained a weak tropical storm, damage was light. Rainfall spread from Louisiana through Georgia, causing isolated flash flooding. In some areas, the storm tide prolonged coastal flooding left by an earlier tropical cyclone. Gusty winds were reported. Associated tornadoes in Mississippi damaged mobile homes and vehicles, and inflicted one injury. While Hermine was not of itself a particularly damaging storm, its effects combined with those of other tropical cyclones to cause agricultural damage.
## Meteorological history
On September 5, 1998, a tropical wave emerged from the west coast of Africa and entered the Atlantic Ocean. The wave was not associated with any thunderstorm activity until it reached the Windward Islands, when cloud and shower activity began to increase. Continuing westward, the disturbance approached the South American coastline and turned into the northwest Caribbean. The wave interacted with an upper-level low-pressure system and another tropical wave that entered the region. At the time, a large monsoon-type flow prevailed over Central America, part of the Caribbean Sea, and the Gulf of Mexico. An area of low pressure developed over the northwestern Caribbean, and at about 1200 UTC on September 17, the system was sufficiently organized to be declared a tropical depression in the central Gulf of Mexico.
Initially, the cloud pattern associated with the system featured a tight and well-defined circulation, as well as clusters of deep convection south of the center. Due to the proximity of a large upper-level low-pressure area in the southern Gulf of Mexico, the surrounding environment did not favor intensification. Influenced by the low, the depression moved southward. The system completed a cyclonic loop in the central gulf, and by early on September 18 was drifting northward. As a result of wind shear, the center of circulation was separated from the deep convective activity. Early the next day, deep convection persisted in a small area northeast of the center. The system was nearly stationary, drifting gradually east-southeastward. Despite the wind shear, the depression attained tropical storm status at 1200 UTC on September 19; as such, it was named Hermine by the National Hurricane Center.
Shortly after being upgraded to a tropical storm, Hermine reached its peak intensity with maximum sustained winds of 45 mph (75 km/h). The tropical storm-force winds were confined to the eastern semicircle of the cyclone. Hermine tracked northward and approached the coast, where it nearly stalled. A continually weakening storm, it moved ashore near Cocodrie, Louisiana at 0500 UTC on September 20 with winds of 40 miles per hour (64 km/h), and then deteriorated into a tropical depression. At landfall, its associated rain bands were deemed "not very impressive", although there was a rapid increase in thunderstorm activity east of the center. The thunderstorms produced heavy rainfall in parts of southeastern Louisiana and southern Mississippi. The storm progressively weakened as the circulation moved northeastward, and dissipated at 1800 UTC. Initially, it was believed that Hermine's remnants contributed to the development of Hurricane Karl; however, this belief was not confirmed.
## Preparations
On September 17, the National Hurricane Center issued a tropical storm watch from Sargent, Texas to Grand Isle, Louisiana. The following day, the watch was extended southward from Sargent to Matagorda, Texas, and eastward to Pascagoula, Mississippi. A tropical storm warning was posted from Morgan City, Louisiana, eastward to Pensacola, Florida on September 19. The warning was promptly extended westward from Morgan City to Intracoastal City, Louisiana, and by 1200 UTC on September 20 all tropical cyclone watches and warnings were discontinued. As the storm moved inland, flood advisories were issued for southern Mississippi. On Grand Isle, a mandatory evacuation order was declared for the third time in three weeks, and residents in low-lying areas of Lafourche Parish were ordered to leave. Shelters were opened, but few people used them. Only fifteen people entered the American Red Cross shelter in Larose, Louisiana, which had been designed to hold 500. Workers were evacuated from oil rigs in the Gulf of Mexico, and energy futures rose substantially in anticipation of the storm, though when Hermine failed to cause significant damage, they retreated. The Coast Guard evacuated its Grand Isle station in preparation.
## Impact
In southern Florida, the combination of rainbands from Hermine and a separate upper-level cyclone in its vicinity produced up to 14.14 inches (359 mm) of rainfall. Hermine's remnants spread showers and thunderstorms across northern parts of the state. The heavy rainfall downed a tree in Orlando and led to several traffic accidents. A man died on U.S. Route 441 after losing control of his vehicle.
Upon landfall in Louisiana, winds were primarily of minimal tropical storm-force and confined to squalls. Offshore, a wind gust of 46 miles per hour (74 km/h) was reported near the mouth of the Mississippi River, and near New Orleans, wind gusts peaked at 32 miles per hour (51 km/h). Along the coast, storm tides generally ran 1 to 3 feet (0.30 to 0.91 m) above normal, which prolonged coastal flooding in some areas left by the earlier Tropical Storm Frances. Winds on Grand Isle reached 25 miles per hour (40 km/h), and storm tides on the island averaged 1 foot (0.30 m). Hermine brought 3 to 4 inches (76 to 102 mm) of rainfall to the state, triggering isolated flash flooding. Near Thomas, part of Louisiana Highway 438 was submerged under flood waters. An oil rig in the Gulf of Mexico reported sustained winds of 48 miles per hour (77 km/h) with gusts to 59 miles per hour (95 km/h).
At around 8:30 AM on September 20, a man was presumed drowned in Lake Cataouatche, southwest of New Orleans. The man had been shrimping in the lake in choppy waters caused by the storm, and dove into the water without a life vest to untangle a net from his boat's propeller. After he freed the propeller, the boat was carried away by the current in the lake, and he was last seen swimming after the boat. After the disappearance, the Coast Guard launched a search with rescue boats and search dogs but could not locate him. His body was eventually found on the morning of September 22. Captain Pat Yoes, of the St. Charles Parish Sheriff's Office, said that the storm "obviously ... played a part" in the man's death, but Lieutenant Commander William Brewer of the United States Coast Guard told the press that he did not "think it was directly storm-related."
Hermine spawned two tornadoes in Mississippi. One destroyed two mobile homes, damaged seven cars, and resulted in one minor injury; the other caused only minor damage. Rainfall of 4 to 5 inches (100 to 130 mm) caused localized flooding; in southern Walthall County, parts of Mississippi Highway 27 were under 1 foot (0.30 m) of water. Over 6 inches (150 mm) of rainfall was reported in Alabama, resulting in the flooding of apartments and several roads and the closure of several highways. Numerous cars were damaged, and motorists were stranded on Bibb County Route 24. Floodwaters also covered U.S. Route 11 near Tuscaloosa, Alabama, stranding several motorists and a milk truck. Flash flood warnings were issued in Bibb and Shelby counties as northern Alabama experienced its first rainfall in the month of September. The rainfall extended eastward into Georgia, where the rains led the state to lift a fire alert for three northern counties, and into South Carolina and North Carolina. The remnants of the storm dumped 10.5 inches (27 cm) of rain on Charleston, South Carolina, and rainfall of up to one foot was reported in other parts of the state. The rain in Charleston led to over five feet of standing water in some neighborhoods, forcing several families to evacuate their mobile homes and stranding a number of vehicles. As a result, the local police closed several roads, including sections of Interstate 526.
Overall, damage totaled $85,000 (1998 USD), and the effects were described as minor. Although the effects from Hermine were small, counties in Louisiana and Texas were declared disaster areas due to damage associated with the earlier Tropical Storm Frances, and the Louisiana Office of Emergency Preparedness extended these funds to cover damage from Hermine as well.
## Aftermath
The heavy rains from Hermine, combined with those from Frances, caused major fish kills in southern Louisiana, the first since those caused by Hurricane Andrew in 1992. The rain from the two storms flooded the swamps in south Louisiana, where it rapidly lost oxygen due to decaying plant matter. After the swamps began to drain, the low-oxygen water flowed into streams, canals, and bayous in the area, and testing in the days following the storm showed that the water was "almost devoid of oxygen." Without sufficient oxygen, local fish populations died quickly, filling waterways, particularly in the area of Lake Charles and Lafayette, according to the Louisiana Department of Wildlife and Fisheries. In total, the fish kills affected at least a dozen separate lakes and bayous in the state.
The combined effects of Hermine and other storms caused significant damage to Louisiana agriculture. The standing water after Hermine provided ideal hatching conditions for mosquitoes, which formed swarms large enough to kill livestock in the days after the storm. At least twelve bulls and horses were killed by mosquito bites in the next week, including bulls that drowned after wading into deep water to escape the insects. The rains and standing water from the storm also prevented farmers from drying out soybeans for harvest and ruined sugar cane. According to Louisiana Agriculture Commissioner Bob Odom, the combined effects of Hurricane Earl, Tropical Storm Frances, and Tropical Storm Hermine caused $420 million in direct and indirect losses for Louisiana farmers.
## See also
- Other storms of the same name
- Hurricane Georges
- List of Florida hurricanes
- List of North Carolina hurricanes (1980–1999)
# Painted turtle
The painted turtle (Chrysemys picta) is the most widespread native turtle of North America. It lives in relatively slow-moving fresh waters, from southern Canada to northern Mexico, and from the Atlantic to the Pacific. Painted turtles have been shown to prefer large wetlands with long periods of inundation and emergent vegetation. The species is one of the few specially adapted to tolerate freezing temperatures for extended periods, owing to an antifreeze-like substance in its blood that keeps its cells from freezing. This turtle is a member of the genus Chrysemys, which is part of the pond turtle family Emydidae. Fossils show that the painted turtle existed 15 million years ago. Three regionally based subspecies (the eastern, midland, and western) evolved during the last ice age. The southern painted turtle (C. dorsalis) is alternately considered the only other species in Chrysemys, or another subspecies of C. picta.
The adult painted turtle is 13–25 cm (5.1–9.8 in) long; the male is smaller than the female. The turtle's top shell is dark and smooth, without a ridge. Its skin is olive to black with red, orange, or yellow stripes on its extremities. The subspecies can be distinguished by their shells: the eastern has straight-aligned top shell segments; the midland has a large gray mark on the bottom shell; the western has a red pattern on the bottom shell.
The turtle eats aquatic vegetation, algae, and small water creatures including insects, crustaceans, and fish. Painted turtles primarily feed while in water and are able to locate and subdue prey even in heavily clouded conditions. Although they are frequently consumed as eggs or hatchlings by rodents, canines, and snakes, the adult turtles' hard shells protect them from most predators. Reliant on warmth from its surroundings, the painted turtle is active only during the day when it basks for hours on logs or rocks. During winter, the turtle hibernates, usually in the mud at the bottom of water bodies. The turtles mate in spring and autumn. Females dig nests on land and lay eggs between late spring and mid-summer. Hatched turtles grow until sexual maturity: 2–9 years for males, 6–16 for females.
In the traditional tales of Algonquian tribes, the colorful turtle played the part of a trickster. In modern times, four U.S. states (Colorado, Illinois, Michigan, and Vermont) have named the painted turtle their official reptile. While habitat loss and road killings have reduced the turtle's population, its ability to live in human-disturbed settings has helped it remain the most abundant turtle in North America. Adults in the wild can live for more than 55 years.
## Taxonomy and evolution
The painted turtle (C. picta) is the only species in the genus Chrysemys. The parent family for Chrysemys is Emydidae: the pond turtles. Emydidae is split into two subfamilies; Chrysemys is part of the Deirochelyinae (Western Hemisphere) branch. The four subspecies of the painted turtle are the eastern (C. p. picta), midland (C. p. marginata), southern (C. p. dorsalis), and western (C. p. bellii).
The painted turtle's generic name is derived from the Ancient Greek words for "gold" (chryso) and "freshwater tortoise" (emys); the species name originates from the Latin for "colored" (pictus). The subspecies name, marginata, derives from the Latin for "border" and refers to the red markings on the outer (marginal) part of the upper shell; dorsalis is from the Latin for "back", referring to the prominent dorsal stripe; and bellii honors English zoologist Thomas Bell, a collaborator of Charles Darwin. An alternate East Coast common name for the painted turtle is "skilpot", from the Dutch for turtle, schildpad.
### Classification
Originally described in 1783 by Johann Gottlob Schneider as Testudo picta, the painted turtle was called Chrysemys picta first by John Edward Gray in 1855. Four subspecies were then recognized: the eastern by Schneider in 1783, the western by Gray in 1831, and the midland and southern by Louis Agassiz in 1857, though the southern painted turtle is now generally considered a full species.
### Subspecies
Although the subspecies of painted turtle intergrade (blend together) at range boundaries they are distinct within the hearts of their ranges.
- The male eastern painted turtle (C. p. picta) is 13–17 cm (5–7 in) long, while the female is 14–17 cm (6–7 in). The upper shell is olive green to black and may possess a pale stripe down the middle and red markings on the periphery. The segments (scutes) of the top shell have pale leading edges and occur in straight rows across the back, unlike all other North American turtles, including the other three subspecies of painted turtle, which have alternating segments. The bottom shell is plain yellow or lightly spotted, sometimes with only a single dark grey spot near the lower center of the shell.
- The midland painted turtle (C. p. marginata) is 10–25 cm (4–10 in) long. The centrally located midland is the hardest to distinguish from the other three subspecies. Its bottom shell has a characteristic symmetrical dark shadow in the center which varies in size and prominence.
- The largest subspecies is the western painted turtle (C. p. bellii), which grows up to 26.6 cm (10 in) long. Its top shell has a mesh-like pattern of light lines, and the top stripe present in other subspecies is missing or faint. Its bottom shell has a large colored splotch that spreads to the edges (further than the midland) and often has red hues.
Until the 1930s, many of the subspecies of the painted turtle were labeled by biologists as full species within Chrysemys, but this varied by the researcher. The painted turtles in the border region between the western and midland subspecies were sometimes considered a full species, treleasei. In 1931, Bishop and Schmidt defined the current "four in one" taxonomy of species and subspecies. Based on comparative measurements of turtles from throughout the range, they subordinated species to subspecies and eliminated treleasei.
Since at least 1958, the subspecies were thought to have evolved in response to geographic isolation during the last ice age, 100,000 to 11,000 years ago. At that time painted turtles were divided into three different populations: eastern painted turtles along the southeastern Atlantic coast; southern painted turtles around the southern Mississippi River; and western painted turtles in the southwestern United States. The populations were not completely isolated for long enough, so wholly different species never evolved. When the glaciers retreated, about 11,000 years ago, all three subspecies moved north. The western and southern subspecies met in Missouri and hybridized to produce the midland painted turtle, which then moved east and north through the Ohio and Tennessee river basins.
Biologists have long debated the genera of closely related subfamily-mates Chrysemys, Pseudemys (cooters), and Trachemys (sliders). After 1952, some combined Pseudemys and Chrysemys because of similar appearance. In 1964, based on measurements of the skull and feet, Samuel B. McDowell proposed all three genera be merged into one. However, further measurements, in 1967, contradicted this taxonomic arrangement. Also in 1967, J. Alan Holman, a paleontologist and herpetologist, pointed out that, although the three turtles were often found together in nature and had similar mating patterns, they did not crossbreed. In the 1980s, studies of turtles' cell structures, biochemistries, and parasites further indicated that Chrysemys, Pseudemys, and Trachemys should remain in separate genera.
In 2003, Starkey et al. proposed that Chrysemys dorsalis, formerly considered a subspecies of C. picta, be recognized as a distinct species sister to all subspecies of C. picta. Although this proposal went largely unrecognized at the time due to evidence of hybridization between dorsalis and picta, the Turtle Taxonomy Working Group and the Reptile Database have since followed it, although both the subspecific and specific names have been recognized.
### Fossils
Although its evolutionary history—what the forerunner to the species was and how the close relatives branched off—is not well understood, the painted turtle is common in the fossil record. The oldest samples, found in Nebraska, date to about 15 million years ago. Fossils from 15 million to about 5 million years ago are restricted to the Nebraska-Kansas area, but more recent fossils are gradually more widely distributed. Fossils newer than 300,000 years old are found in almost all the United States and southern Canada.
### DNA
The turtle's karyotype (nuclear DNA, rather than mitochondrial DNA) consists of 50 chromosomes, the same number as the rest of its subfamily-mates and the most common number for Emydidae turtles in general. More distantly related turtles have from 26 to 66 chromosomes. Little systematic study of variation in the painted turtle's karyotype among populations has been done. (However, in 1967, research on the protein structure of offshore island populations in New England showed differences from mainland turtles.)
Comparison of subspecies chromosomal DNA has been discussed, to help address the debate over Starkey's proposed taxonomy, but as of 2009 had not been reported. The complete sequencing of the genetic code for the painted turtle was at a "draft assembled" state in 2010. The turtle was one of two reptiles chosen to be first sequenced.
## Description
Adult painted turtles can grow to 13–25 cm (5–10 in) long, with males being smaller. The shell is oval, smooth with little grooves where the large scale-like plates overlap, and flat-bottomed. The color of the top shell (carapace) varies from olive to black. Darker specimens are more common where the bottom of the water body is darker. The bottom shell (plastron) is yellow, sometimes red, sometimes with dark markings in the center. Similar to the top shell, the turtle's skin is olive to black, but with red and yellow stripes on its neck, legs, and tail. As with other pond turtles, such as the bog turtle, the painted turtle's feet are webbed to aid swimming.
The head of the turtle is distinctive. The face has only yellow stripes, with a large yellow spot and streak behind each eye, and on the chin two wide yellow stripes that meet at the tip of the jaw. The turtle's upper jaw is shaped into an inverted "V" (philtrum), with a downward-facing, tooth-like projection on each side.
The hatchling has a proportionally larger head, eyes, and tail, and a more circular shell than the adult. The adult female is generally longer than the male, 10–25 cm (4–10 in) versus 7–15 cm (3–6 in). For a given length, the female has a higher (more rounded, less flat) top shell. The female weighs around 500 g (18 oz) on average, against the males' average adult weight of roughly 300 g (11 oz). The female's greater body volume supports her egg-production. The male has longer foreclaws and a longer, thicker tail, with the anus (cloaca) located further out on the tail.
### Similar species
The painted turtle has a very similar appearance to the red-eared slider (the most common pet turtle) and the two are often confused. The painted turtle can be distinguished because it is flatter than the slider. Also, the slider has a prominent red marking on the side of its head (the "ear") and a spotted bottom shell, both features missing in the painted turtle.
## Distribution
### Range
The most widespread North American turtle, the painted turtle is the only turtle whose native range extends from the Atlantic to the Pacific. It is native to eight of Canada's ten provinces, forty-five of the fifty United States, and one of Mexico's thirty-one states. On the East Coast, it lives from the Canadian Maritimes to the U.S. state of Georgia. On the West Coast, it lives in British Columbia, Washington, and Oregon and offshore on southeast Vancouver Island. The northernmost American turtle, its range includes much of southern Canada. To the south, its range reaches the U.S. Gulf Coast in Louisiana and Alabama. In the southwestern United States there are only dispersed populations. It is found in one river in extreme northern Mexico. It is absent in a part of southwestern Virginia and the adjacent states as well as in north-central Alabama. The divide between the midland and eastern painted turtles is sharper in the southeast, where they are separated by the Appalachian Mountains, but the two subspecies tend to mix in the northeast.
The borders between the four subspecies are not sharp, because the subspecies interbreed. Many studies have been performed in the border regions to assess the intermediate turtles, usually by comparing the anatomical features of hybrids that result from intergradation of the classical subspecies. Despite the imprecision, the subspecies are assigned nominal ranges.
#### Eastern painted turtle
The eastern painted turtle ranges from southeastern Canada to Georgia with a western boundary at approximately the Appalachians. At its northern extremes, the turtle tends to be restricted to the warmer areas closer to the Atlantic Ocean. It is uncommon in far northern New Hampshire, and in Maine it is common only in a strip about 50 miles (80 km) from the coast. In Canada, it lives in New Brunswick and Nova Scotia but not in Quebec or Prince Edward Island. To the south, it is not found in the coastal lowlands of southern North Carolina, South Carolina, or Georgia, nor in southern Georgia generally, nor anywhere in Florida.
In the northeast, there is extensive mixing with the midland subspecies, and some writers have called these turtles a "hybrid swarm". In the southeast, the border between the eastern and midland subspecies is sharper, as mountain chains separate them into different drainage basins.
#### Midland painted turtle
The midland painted turtle lives from southern Ontario and Quebec, through the eastern U.S. Midwest states, to Kentucky, Tennessee and northwestern Alabama, where it intergrades with the southern painted turtle. It also is found eastward through West Virginia, western Maryland and Pennsylvania. The midland painted turtle appears to be moving east, especially in Pennsylvania. To the northeast it is found in western New York and much of Vermont, and it intergrades extensively with the eastern subspecies.
#### Western painted turtle
The western painted turtle's northern range includes southern parts of western Canada from Ontario through Manitoba, Saskatchewan, Alberta and British Columbia. In Ontario, the western subspecies is found north of Minnesota and directly north of Lake Superior, but there is a 130 km (80 mi) gap to the east of Lake Superior (in the area of harshest winter climate) where no painted turtles of any subspecies occur. Thus Ontario's western subspecies does not intergrade with the midland painted turtle of southeastern Ontario. In Manitoba, the turtle is numerous and ranges north to Lake Manitoba and the lower part of Lake Winnipeg. The turtle is also common in south Saskatchewan, but in Alberta, there may only be 100 individuals, all found very near the U.S. border, mostly in the southeast.
In British Columbia, populations exist in the interior in the vicinity of the Kootenai, Columbia, Okanagan, and Thompson river valleys. At the coast, turtles occur near the mouth of the Fraser and a bit further north, as well as on the southern end of Vancouver Island and some other nearby islands. Within British Columbia, the turtle's range is not continuous and can better be understood as northward extensions of the range from the United States. High mountains present barriers to east–west movement of the turtles within the province or from Alberta. Some literature has shown isolated populations much further north in British Columbia and Alberta, but these were probably pet releases.
In the United States, the western subspecies forms a wide intergrade area with the midland subspecies covering much of Illinois as well as a strip of Wisconsin along Lake Michigan and part of the Upper Peninsula of Michigan (UP). Further west, the rest of Illinois, Wisconsin and the UP are part of the range proper, as are all of Minnesota and Iowa, as well as all of Missouri except a narrow strip in the south. All of North Dakota is within range, all of South Dakota except a very small area in the west, and all of Nebraska. Almost all of Kansas is in range; the border of that state with Oklahoma is roughly the species range border, but the turtle is found in three counties of north central Oklahoma.
To the northwest, almost all of Montana is in range. Only a narrow strip in the west, along most of the Idaho border (which is at the Continental Divide), lacks turtles. Wyoming is almost entirely out of range; only the lower elevation areas near the eastern and northern borders have painted turtles. In Idaho, the turtles are found throughout the far north (upper half of the Idaho Panhandle). Recently, separate Idaho populations have been observed in the southwest (near the Payette and Boise rivers) and the southeast (near St. Anthony). In Washington, turtles are common throughout the state within lower elevation river valleys. In Oregon, the turtle is native to the northern part of the state throughout the Columbia River Valley as well as the Willamette River Valley north of Salem.
To the southwest, the painted turtle's range is fragmented. In Colorado, the range is continuous across the eastern, prairie half of the state, but the turtle is absent from most of the western, mountainous half. However, it is confirmed present in the lower-elevation southwest part of the state (Archuleta and La Plata counties), where a population ranges into northern New Mexico in the San Juan River basin. In New Mexico, the main distribution follows the Rio Grande and the Pecos River, two waterways that run in a north–south direction through the state. Along these rivers, it is also found in the northern part of Far West Texas. In Utah, the painted turtle lives in an area to the south (Kane County) in streams draining into the Colorado River, although it is disputed whether they are native. In Arizona, the painted turtle is native to an area in the east, Lyman Lake. The painted turtle is not native to Nevada or California.
In Mexico, painted turtles have been found about 50 miles (80 km) south of New Mexico near Galeana in the state of Chihuahua. There, two expeditions found the turtles in the Rio Santa Maria, which lies in a closed basin.
#### Human-introduced range
Pet releases are starting to establish the painted turtle outside its native range. It has been introduced into waterways near Phoenix, Arizona, and to Germany, Indonesia, the Philippines, and Spain.
### Habitat
To thrive, painted turtles need fresh waters with soft bottoms, basking sites, and aquatic vegetation. They find their homes in shallow waters with slow-moving currents, such as creeks, marshes, ponds, and the shores of lakes. The subspecies have evolved different habitat preferences.
- The eastern painted turtle is very aquatic, leaving the immediate vicinity of its water body only when forced by drought to migrate. Along the Atlantic, painted turtles have appeared in brackish waters. They can be found in wetland areas like swamps and marshes with a thick layer of mud as well as sandy bottoms with lots of vegetation.
- The midland and southern painted turtles seek especially quiet waters, usually shores and coves. They favor shallows that contain dense vegetation and have an unusual tolerance of pollution.
- The western painted turtle lives in streams and lakes, similar to the other painted turtles, but also inhabits pasture ponds and roadside pools. It is found as high as 1,800 m (5,900 ft).
### Population features
Within much of its range, the painted turtle is the most abundant turtle species. Population densities range from 10 to 840 turtles per hectare (2.5 acres) of water surface. Warmer climates produce higher relative densities among populations, and habitat desirability also influences density. Rivers and large lakes have lower densities because only the shore is desirable habitat; the central, deep waters skew the surface-based estimates. Also, lake and river turtles have to make longer linear trips to access equivalent amounts of foraging space.
Adults outnumber juveniles in most populations, but gauging the ratios is difficult because juveniles are harder to catch; with current sampling methods, estimates of age distribution vary widely. The annual survival rate of painted turtles increases with age. The probability of a painted turtle surviving from the egg to its first birthday is only 19%. For females, the annual survival rate rises to 45% for juveniles and 95% for adults. The male survival rates follow a similar pattern, but are probably lower overall than those of females, as evidenced by the average male age being lower than that of the female. Natural disasters can confound age distributions. For instance, a hurricane can destroy many nests in a region, resulting in fewer hatchlings the next year. Age distributions may also be skewed by migrations of adults.
To understand painted turtle adult age distributions, researchers require reliable methods. Turtles younger than four years (up to 12 years in some populations) can be aged based on "growth rings" in their shells. For older turtles, some attempts have been made to determine age based on size and shape of their shells or legs using mathematical models, but this method is more uncertain. The most reliable method to study the long-lived turtles is to capture them, permanently mark their shells by notching with a drill, release the turtles, and then recapture them in later years. The longest-running study, in Michigan, has shown that painted turtles can live more than 55 years.
Adult sex ratios of painted turtle populations average around 1:1. Many populations are slightly male-heavy, but some are strongly female-imbalanced; one population in Ontario has a female to male ratio of 4:1. Hatchling sex ratio varies based on egg temperature. During the middle third of incubation, temperatures of 23–27 °C (73–81 °F) produce males, and anything above or below that, females. It does not appear that females choose nesting sites to influence the sex of the hatchlings; within a population, nests will vary sufficiently to give both male and female-heavy broods.
## Ecology
### Diet
The painted turtle is a bottom-dwelling hunter. It quickly juts its head into and out of vegetation to stir potential victims out into the open water, where they are pursued. Large prey is ripped apart with the forefeet as the turtle holds it in its mouth. It also consumes plants and skims the surface of the water with its mouth open to catch small particles of food.
Although all subspecies of painted turtle eat both plants and animals (in the form of leaves, algae, fish, crustaceans, aquatic insects and carrion), their specific diets vary. Young painted turtles are mostly carnivorous and as they mature they become more herbivorous.
Painted turtles obtain coloration from carotenoids in their natural diet by eating algae and a variety of aquatic plants from their environment. In turtles with large amounts of carotenoids in their diet, the stripes and spots show increased red and yellow chroma and decreased UV chroma and brightness compared with the stripes and spots of turtles with only moderate amounts of dietary carotenoids.
- The eastern painted turtle's diet is the least studied. It prefers to eat in the water, but has been observed eating on land. The fish it consumes are typically dead or injured.
- The midland painted turtle eats mostly aquatic insects and both vascular and non-vascular plants.
- The western painted turtle's consumption of plants and animals changes seasonally. In early summer, insects make up 60% of its diet; in late summer, plants make up 55%. Of note, the western painted turtle aids in the dispersal of white water-lily seeds. The turtle consumes the hard-coated seeds, which remain viable after passing through the turtle, and disperses them through its feces.
### Predators
Painted turtles are most vulnerable to predators when young. Nests are frequently ransacked and the eggs eaten by garter snakes, crows, chipmunks, thirteen-lined ground squirrels, gray squirrels, skunks, groundhogs, raccoons, badgers, gray and red foxes, and humans. The numerous, sometimes bite-sized hatchlings fall prey to water bugs, bass, catfish, bullfrogs, snapping turtles, three types of snakes (copperheads, racers, and water snakes), herons, rice rats, weasels, muskrats, minks, and raccoons. As adults, the turtles' armored shells protect them from many potential predators, but they still occasionally fall prey to alligators, ospreys, crows, red-shouldered hawks, bald eagles, and especially raccoons.
Painted turtles defend themselves by kicking, scratching, biting, or urinating. In contrast to land tortoises, painted turtles can right themselves if they are flipped upside down.
## Life cycle
### Mating
The painted turtles mate in spring and fall in waters of 10–25 °C (50–77 °F). Males start producing sperm in early spring, when they can bask to an internal temperature of 17 °C (63 °F). Females begin their reproductive cycles in mid-summer, and ovulate the following spring.
Courtship begins when a male follows a female until he meets her face-to-face. He then strokes her face and neck with his elongated front claws, a gesture returned by a receptive female. The pair repeat the process several times, with the male retreating from and then returning to the female until she swims to the bottom, where they copulate. As the male is smaller than the female, he is not dominant. Although it has not been directly observed, evidence indicates that the male may injure the female in attempts at coercion, using the tooth-like cusps on his beak and his foreclaws. The female stores sperm, to be used for up to three clutches, in her oviducts; the sperm may remain viable for up to three years. A single clutch may have multiple fathers.
### Egg-laying
Nesting is done, by the females only, between late May and mid-July. The nests are vase-shaped and are usually dug in sandy soil, often at sites with southern exposures. Nests are often within 200 m (220 yd) of water, but may be as far away as 600 m (660 yd), with older females tending to nest further inland. Nest sizes vary depending on female sizes and locations but are about 5–11 cm (2–4 in) deep. Females may return to the same sites several consecutive years, but if several females make their nests close together, the eggs become more vulnerable to predators. Female eastern painted turtles have been shown to nest together, possibly even participating in communal nesting.
The female's optimal body temperature while digging her nest is 29–30 °C (84–86 °F). If the weather is unsuitable, for instance a too hot night in the Southeast, she delays the process until later at night. Painted turtles in Virginia have been observed waiting three weeks to nest because of a hot drought.
While preparing to dig her nest, the female sometimes exhibits a mysterious preliminary behavior. She presses her throat against the ground of different potential sites, perhaps sensing moisture, warmth, texture, or smell, although her exact motivation is unknown. She may further temporize by excavating several false nests, as wood turtles also do.
The female relies on her hind feet for digging. She may accumulate so much sand and mud on her feet that her mobility is reduced, making her vulnerable to predators. To lighten her labors, she lubricates the area with water from her bladder. Once the nest is complete, the female deposits her eggs into the hole. The freshly laid eggs are white, elliptical, porous, and flexible. From start to finish, the female's work may take four hours. Sometimes she remains on land overnight afterwards, before returning to her home water.
Females can lay five clutches per year, but two is a normal average after including the 30–50% of a population's females that do not produce any clutches in a given year. In some northern populations, no females lay more than one clutch per year. Bigger females tend to lay bigger eggs and more eggs per clutch. Clutch sizes of the subspecies vary, although the differences may reflect different environments, rather than different genetics. The two more northerly subspecies, western and midland, are larger and have more eggs per clutch—11.9 and 7.6, respectively—than the eastern (4.9). Within subspecies, also, the more northerly females lay larger clutches.
### Growth
Incubation lasts 72–80 days in the wild and for a similar period in artificial conditions. In August and September, the young turtle breaks out from its egg, using a special projection of its jaw called the egg tooth. Not all offspring leave the nest immediately, though. Hatchlings north of a line from Nebraska to northern Illinois to New Jersey typically arrange themselves symmetrically in the nest and overwinter to emerge the following spring.
The hatchling's ability to survive winter in the nest has allowed the painted turtle to extend its range farther north than any other American turtle. The painted turtle is genetically adapted to survive extended periods of subfreezing temperatures with blood that can remain supercooled and skin that resists penetration from ice crystals in the surrounding ground. The hardest freezes nevertheless kill many hatchlings.
Immediately after hatching, turtles are dependent on egg yolk material for sustenance. About a week to a week and a half after emerging from their eggs (or the following spring if emergence is delayed), hatchlings begin feeding to support growth. The young turtles grow rapidly at first, sometimes doubling their size in the first year. Growth slows sharply at sexual maturity and may stop completely. Likely owing to differences in habitat and food among water bodies, growth rates often differ from population to population in the same area. Among the subspecies, the western painted turtles are the quickest growers.
Females grow faster than males overall, and must be larger to mature sexually. In most populations males reach sexual maturity at 2–4 years old, and females at 6–10. Size and age at maturity increase with latitude; at the northern edge of their range, males reach sexual maturity at 7–9 years of age and females at 11–16.
## Behavior
### Daily routine and basking
A cold-blooded reptile, the painted turtle regulates its temperature through its environment, notably by basking. All ages bask for warmth, often alongside other species of turtle. Sometimes more than 50 individuals are seen on one log together. Turtles bask on a variety of objects, often logs, but have even been seen basking on top of common loons that were covering eggs.
The turtle starts its day at sunrise, emerging from the water to bask for several hours. Warmed for activity, it returns to the water to forage. After becoming chilled, the turtle re-emerges for one to two more cycles of basking and feeding. At night, the turtle drops to the bottom of its water body or perches on an underwater object and sleeps.
To be active, the turtle must maintain an internal body temperature between 17–23 °C (63–73 °F). When fighting infection, it manipulates its temperature up to 5 °C (9.0 °F) higher than normal.
### Seasonal routine and hibernation
In the spring, when the water reaches 15–18 °C (59–64 °F), the turtle begins actively foraging. However, if the water temperature exceeds 30 °C (86 °F), the turtle will not feed. In fall, the turtle stops foraging when temperatures drop below the spring set-point.
During the winter, the turtle hibernates. In the north, the inactive season may be as long as from October to March, while the southernmost populations may not hibernate at all. While hibernating, the body temperature of the painted turtle averages 6 °C (43 °F). Periods of warm weather bring the turtle out of hibernation, and even in the north, individuals have been seen basking in February.
The painted turtle hibernates by burying itself, either on the bottom of a body of water, near water in the shore-bank or the burrow of a muskrat, or in woods or pastures. When hibernating underwater, the turtle prefers shallow depths, no more than 2 m (7 ft). Within the mud, it may dig down an additional 1 m (3 ft). In this state, the turtle does not breathe, although if surroundings allow, it may get some oxygen through its skin. The species is one of the best-studied vertebrates able to survive long periods without oxygen. Adaptations of its blood chemistry, brain, heart, and particularly its shell allow the turtle to survive extreme lactic acid buildup while oxygen-deprived.
### Anoxia tolerance
During the winter months, painted turtles become ice-locked and spend their time in either hypoxic (low oxygen) or anoxic (no oxygen) regions of the pond or lake. Painted turtles essentially hold their breath until the following spring when the ice melts. As a result, painted turtles rely on anaerobic respiration, which leads to the production of lactic acid. However, painted turtles can tolerate long periods of anoxia due to three factors: a depressed metabolic rate, large glycogen stores in the liver, and sequestering lactate in the shell and releasing carbonate buffers to the extracellular fluid.
The shell of an adult painted turtle has the largest concentration of carbonate recorded among animals. This large carbonate content helps the painted turtle buffer the accumulation of lactic acid during anoxia. Both the shell and skeleton release calcium and magnesium carbonates to buffer extracellular lactic acid. A painted turtle can also sequester 44% of its total body lactate in its shell. Despite the shell's large buffering contribution, it does not experience any significant decrease in mechanical properties under natural conditions.
The duration of anoxia tolerance varies depending on the subspecies of painted turtle. The western painted turtle (C. picta bellii) can survive 170 days of anoxia, followed by the midland painted turtle (C. picta marginata), which can survive 150 days, and finally the eastern painted turtle (C. picta picta), which can survive 125 days. Differences in anoxia tolerance are partially attributed to the rate of lactate production and buffering capability in painted turtles. Furthermore, northern populations of painted turtles have a higher anoxia tolerance than southern populations.
Other anoxia-tolerant freshwater turtles include the southern painted turtle (Chrysemys dorsalis), which can survive 75–86 days of anoxia; the snapping turtle (Chelydra serpentina), which can survive 100 days; and the map turtle (Graptemys geographica), which can survive 50 days. One reason for the difference between more and less anoxia-tolerant species is the turtle's ability to buffer lactic acid accumulation during anoxia.
Painted turtle hatchlings can survive only about 40 days of anoxia, far less than adults, but they still exhibit high anoxia and freeze tolerance compared with hatchlings of other species (30 days for Chelydra serpentina and 15 days for Graptemys geographica), an adaptation to cold winters.
### Movement
Searching for water, food, or mates, the painted turtles travel up to several kilometers at a time. During summer, in response to heat and water-clogging vegetation, the turtles may vacate shallow marshes for more permanent waters. Short overland migrations may involve hundreds of turtles together. If heat and drought are prolonged, the turtles will bury themselves and, in extreme cases, die.
Foraging turtles frequently cross lakes or travel linearly down creeks. Daily crossings of large ponds have been observed. Tag and release studies show that sex also drives turtle movement. Males travel the most, up to 26 km (16 mi), between captures; females the second most, up to 8 km (5 mi), between captures; and juveniles the least, less than 2 km (1.2 mi), between captures. Males move the most and are most likely to change wetlands because they seek mates.
Painted turtles have homing capabilities based on visual recognition. Many individuals can return to their collection points after being released elsewhere, trips that may require them to traverse land. One experiment displaced 98 turtles several kilometers from their home wetland; 41 returned. When living in a single large body of water, painted turtles can home from up to 6 km (4 mi) away. Another experiment found that turtles placed far enough from water walked in straight paths and did not orient towards water or in any specific direction, indicating a lack of homing ability under those conditions. Females may use homing to help locate suitable nesting sites.
Eastern painted turtle movements may contribute to aquatic plant seed dispersal. A study in Massachusetts found that eastern painted turtles can defecate large quantities of intact macrophyte seeds, and that seeds of Nymphaea odorata found in the feces were capable of moderate to high levels of germination. As turtles move between ponds and habitats, they carry seeds along with them to new locations.
## Interaction with humans
### Conservation
The species is currently classified as least concern by the IUCN, but local populations have declined.
The decline in painted turtle populations is not a simple matter of dramatic range reduction, like that of the American bison. Instead the turtle is classified as G5 (demonstrably widespread) in its Natural Heritage Global Rank, and the IUCN rates it as a species of least concern. The painted turtle's high reproduction rate and its ability to survive in polluted wetlands and artificially made ponds have allowed it to maintain its range, but the post-Columbus settlement of North America has reduced its numbers.
Only within the Pacific Northwest is the turtle's range eroding. Even there, in Washington, the painted turtle is designated S5 (demonstrably widespread). However, in Oregon, the painted turtle is designated S2 (imperiled), and in British Columbia, the turtle's populations in the Coast and Interior regions are labeled "endangered" and "of special concern", respectively.
Much has been written about the different factors that threaten the painted turtle, but they are largely unquantified, and their relative importance can only be inferred. A primary threat category is habitat loss in various forms. Related to water habitat, there is drying of wetlands, clearing of aquatic logs or rocks (basking sites), and clearing of shoreline vegetation, which allows more predator access or increased human foot traffic. Related to nesting habitat, urbanization or planting can remove the sunny soils the turtles need.
Another significant human impact is roadkill—dead turtles, especially females, are commonly seen on summer roads. In addition to direct killing, roads genetically isolate some populations. Localities have tried to limit roadkill by constructing underpasses, highway barriers, and crossing signs. Oregon has introduced public education on turtle awareness, safe swerving, and safely assisting turtles across the road.
In the West, human-introduced bass, bullfrogs, and especially snapping turtles, have increased the predation of hatchlings. Outside the Southeast, where sliders are native, released pet red-eared slider turtles increasingly compete with painted turtles. In cities, increased urban predators (raccoons, canines, and felines) may impact painted turtles by eating their eggs.
Other factors of concern for the painted turtle include over-collection from the wild, released pets introducing diseases or reducing genetic variability, pollution, boating traffic, anglers' hooks (the turtles are noteworthy bait-thieves), wanton shooting, and crushing by agricultural machines, golf course lawnmowers, or all-terrain vehicles. Gervais and colleagues note that research itself impacts the populations and that much funded turtle-trapping work has gone unpublished. They advocate being more selective about which studies are undertaken, thereby putting fewer turtles into scientists' traps. Global warming represents an uncharacterized future threat.
As the most common turtle in Nova Scotia, the eastern painted turtle is not listed under the Species at Risk Act for conservation requirements.
### Pets and other uses
According to a trade data study, painted turtles were the second most popular pet turtles after red-eared sliders in the early 1990s. As of 2010, most U.S. states allow, but discourage, painted turtle pets, although Oregon forbids keeping them as pets, and Indiana prohibits their sale. U.S. federal law prohibits the sale or transport of any turtle less than 10 cm (4 in) long, to limit human exposure to salmonella. However, a loophole for scientific samples allows some small turtles to be sold, and illegal trafficking also occurs.
Painted turtle pet-keeping requirements are similar to those of the red-eared slider. Keepers are urged to provide them with adequate space and a basking site, and water that is regularly filtered and changed. Aquatic turtles are generally unsuitable pets for children, as they do not enjoy being held. Hobbyists have maintained turtles in captivity for decades. Painted turtles are long-lived pets, and have a lifespan of up to 40 years in captivity.
The painted turtle is sometimes eaten but is not highly regarded as food, as even the largest subspecies, the western painted turtle, is inconveniently small and larger turtles are available. Schools frequently dissect painted turtles, which are sold by biological supply companies; specimens often come from the wild but may be captive-bred. In the Midwest, turtle racing is popular at summer fairs.
### Capture
Commercial harvesting of painted turtles in the wild is controversial and, increasingly, restricted. Wisconsin formerly had virtually unrestricted trapping of painted turtles but based on qualitative observations forbade all commercial harvesting in 1997. Neighboring Minnesota, where trappers collected more than 300,000 painted turtles during the 1990s, commissioned a study of painted turtle harvesting. Scientists found that harvested lakes averaged half the painted turtle density of off-limit lakes, and population modeling suggested that unrestricted harvests could produce a large decline in turtle populations. In response, Minnesota forbade new harvesters in 2002 and limited trap numbers. Although harvesting continued, subsequent takes averaged half those of the 1990s. In 2023, Minnesota banned the practice of commercial turtle trapping. As of 2009, painted turtles faced virtually unlimited harvesting in Arkansas, Iowa, Missouri, Ohio, and Oklahoma; since then, Missouri has prohibited their harvesting.
Individuals who trap painted turtles typically do so to earn additional income, selling a few thousand a year at $1–2 each. Many trappers have been involved in the trade for generations, and value it as a family activity. Some harvesters disagree with limiting the catch, saying the populations are not dropping.
Many U.S. state fish and game departments allow non-commercial taking of painted turtles under a creel limit, and require a fishing (sometimes hunting) license; others completely forbid the recreational capture of painted turtles. Trapping is not allowed in Oregon, where western painted turtle populations are in decline, and in Missouri, where there are populations of both southern and western subspecies. In Canada, Ontario protects both subspecies present, the midland and western, and British Columbia protects its dwindling western painted turtles.
Capture methods are also regulated by locality. Typically trappers use either floating "basking traps" or partially submerged, baited "hoop traps". Trapper opinions, commercial records, and scientific studies show that basking traps are more effective for collecting painted turtles, while the hoop traps work better for collecting "meat turtles" (snapping turtles and soft-shell turtles). Nets, hand capture, and fishing with set lines are generally legal, but shooting, chemicals, and explosives are forbidden.
### Culture
Native American tribes were familiar with the painted turtle—young braves were trained to recognize its splashing into water as an alarm—and incorporated it in folklore. A Potawatomi myth describes how the talking turtles, "Painted Turtle" and allies "Snapping Turtle" and "Box Turtle", outwit the village women. Painted Turtle is the star of the legend and uses his distinctive markings to trick a woman into holding him so he can bite her. An Illini myth recounts how Painted Turtle put his paint on to entice a chief's daughter into the water.
As of 2010, four U.S. states designated the painted turtle as official reptile. Vermont honored the reptile in 1994, following the suggestion of Cornwall Elementary School students. In 1995, Michigan followed, based on the recommendation of Niles fifth graders, who discovered the state lacked an official reptile. On February 2, 2005, Representative Bob Biggins introduced a bill to make the tiger salamander the official state amphibian of Illinois and to make the painted turtle the official state reptile. The bill was signed into law by Governor Rod Blagojevich on July 19, 2005. Colorado chose the western painted turtle in 2008, following the efforts of two succeeding years of Jay Biachi's fourth grade classes. In New York, the painted turtle narrowly lost (5,048 to 5,005, versus the common snapping turtle) a 2006 statewide student election for state reptile.
In the border town of Boissevain, Manitoba, a 10,000 lb (4,500 kg) western painted turtle, Tommy the Turtle, is a roadside attraction. The statue was built in 1974 to celebrate the Canadian Turtle Derby, a festival including turtle races that ran from 1972 to 2001.
Another Canadian admirer of the painted turtle is Jon Montgomery, who won the 2010 Olympic gold medal in skeleton (a form of sled) racing while wearing a painting of a painted turtle on the crown of his helmet, prominently visible as he slid downhill. Montgomery, who also iconically tattooed a maple leaf on his chest, explained his visual promotion of the turtle by saying that he had assisted one to cross the road. BC Hydro referred to Montgomery's action when describing its own sponsorship of conservation research for the turtle in British Columbia.
Several private entities use the painted turtle as a symbol. Wayne State University Press operates an imprint "named after the Michigan state reptile" that "publishes books on regional topics of cultural and historical interest". In California, The Painted Turtle is a camp for ill children, founded by Paul Newman. Painted Turtle Winery of British Columbia trades on the "laid back and casual lifestyle" of the turtle with a "job description to bask in the sun". Also, there is an Internet company in Michigan, a guesthouse in British Columbia, and a café in Maine that use the painted turtle commercially.
In children's books, the painted turtle is a popular subject, with at least seven books published between 2000 and 2010.
# Winter service vehicle
A winter service vehicle (WSV), or snow removal vehicle, is a vehicle specially designed or adapted to clear thoroughfares of ice and snow. Winter service vehicles are usually based on a dump truck chassis, with adaptations allowing them to carry specially designed snow removal equipment. Many authorities also use smaller vehicles on sidewalks, footpaths, and cycleways. Road maintenance agencies and contractors in temperate or polar areas often own several winter service vehicles, using them to keep the roads clear of snow and ice and safe for driving during winter. Airports use winter service vehicles to keep both aircraft surfaces and runways and taxiways free of snow and ice, which, besides endangering takeoff and landing, can interfere with the aerodynamics of the aircraft.
The earliest winter service vehicles were snow rollers, designed to maintain a smooth, even road surface for sleds, although horse-drawn snowplows and gritting vehicles are recorded in use as early as 1862. The increase in motor car traffic and aviation in the early 20th century led to the development and popularisation of large motorised winter service vehicles.
## History
Although snow removal dates back to at least the Middle Ages, early attempts merely involved using a shovel or broom to remove snow from walkways and roads. Before motorised transport, snow removal was seen as less of a concern; unpaved roads in rural areas were dangerous and bumpy, and snow and ice made the surface far smoother. Most farmers could simply replace their wagons with sleds, allowing the transport of heavy materials such as timber with relative ease. Early communities in the northern regions of the United States and Canada even used animal-drawn snow rollers, the earliest winter service vehicles, to compress the snow covering roads. The compression increased the life of the snow and eased passage for sleds. Some communities even employed snow wardens to spread or "pave" snow onto exposed areas such as bridges, to allow sleds to use these routes.
However, with the increase in paved roads and the increasing size of cities, snow-paving fell out of favour, as the resultant slippery surfaces posed a danger to pedestrians and traffic. The earliest patents for snowplows date back to 1840, but there are no records of their actual use until 1862, when the city of Milwaukee began operating horse-drawn carts fitted with snowplows. The horse-drawn snowplow quickly spread to other cities, especially those in areas prone to heavy snowfall.
The first motorised snowplows were developed in 1913, based on truck and tractor bodies. These machines allowed the mechanisation of the snow clearing process, reducing the labor required for snow removal and increasing the speed and efficiency of the process. The expansion of the aviation industry also acted as a catalyst for the development of winter service vehicles during the early 20th century. Even a light dusting of snow or ice could cause an aeroplane to crash, so airports erected snow fences around airfields to prevent snowdrifts, and began to maintain fleets of vehicles to clear runways in heavy weather.
With the popularisation of the motor car, it was found that plowing alone was insufficient for removing all snow and ice from the roadway, leading to the development of gritting vehicles, which used sodium chloride to accelerate the melting of the snow. Early attempts at gritting were resisted, as the salt used encouraged rusting, causing damage to the metal structures of bridges and the shoes of pedestrians. However, as the number of motoring accidents increased, the protests subsided and by the end of the 1920s, many cities in the United States used salt and sand to clear the roads and increase road safety. As environmental awareness increased through the 1960s and 1970s, gritting once again came under criticism due to its environmental impact, leading to the development of alternative de-icing chemicals and more efficient spreading systems.
## Design
Winter service vehicles are usually based on a dump truck chassis, which is then converted into a winter service vehicle either by the manufacturer or by an aftermarket third party. A typical conversion involves replacing steel components of the vehicle with corrosion-resistant aluminium or fibreglass, waterproofing any exposed electronic components, replacing the stock hopper with a specially designed gritting body, adding a plow frame, reinforcing the wheels and bumpers to support the heavy blade, and adding extra headlamps, a light bar, and retroreflectors for visibility.
Other common changes include replacing the factory stock tires with rain tires or mud and snow tires and shortening the vehicle's wheelbase to improve maneuverability. Smaller trucks are used for lighter applications; in Canada, pickup trucks are used for snow removal, with a blade mounted at the front and optional de-icing equipment installed in the rear. Some agencies also use underbody scrapers, which are mounted between the axles and distribute plowing stresses more evenly across the chassis.
In most countries, winter service vehicles usually have amber light bars, which are activated to indicate that the vehicle is operating below the local speed limit or otherwise poses a danger to other traffic, either by straddling lanes or by spreading grit or de-icer. In some areas, such as the Canadian province of Ontario, winter service vehicles use the blue flashing lights associated with emergency service vehicles, rather than the amber or orange used elsewhere. In Michigan, green flashing lights are used. Many agencies also paint their vehicles in high-contrast orange or yellow to allow the vehicles to be seen more clearly in whiteout conditions.
Some winter service vehicles, especially those designed for use on footpaths or pedestrian zones, are built on a far smaller chassis using small tractors or custom made vehicles. These vehicles are often multi-purpose, and can be fitted with other equipment such as brushes, lawnmowers or cranes—as these operations are generally unable to run during heavy snowfalls, there is generally little overlap between the different uses, reducing the size of the fleet required by the agency or contractor.
Modern winter service vehicles will usually also have a satellite navigation system connected to a weather forecast feed, allowing the driver to choose the best areas to treat and to avoid areas in which rain is likely, which can wash away the grit used—the most advanced can even adapt to changing conditions, ensuring optimal gritter and plow settings. Most run on wheels, often with snow chains or studded tires, but some are mounted on caterpillar tracks, with the tracks themselves adapted to throw the snow towards the side of the road. Off-road winter service vehicles mounted on caterpillar tracks are known as snowcats. Snowcats are commonly fitted with snowplows or snow groomers, and are used by ski resorts to smooth and maintain pistes and snowmobile runs, although they can also be used as a replacement for chairlifts.
Military winter service vehicles are heavily armoured to allow for their use in combat zones, especially in Arctic and mountain warfare, and are often based on combat bulldozers or Humvees. Military winter service vehicles have been used by the United Nations, Kosovo Force, and the US Army in Central Europe during the Kosovo War, while during the Cold War, the Royal Marines and Royal Corps of Signals deployed a number of tracked vehicles in Norway to patrol the NATO border with the Soviet Union.
## Operation
Winter service vehicles are operated by both government agencies and private subcontractors. Public works departments in areas that regularly receive snowfall usually maintain a fleet of their own vehicles or pay retainers to contractors for priority access to vehicles in winter, while cities where snow is a less regular occurrence may simply hire the vehicles as needed. Winter service vehicles in the United Kingdom are the only road-going vehicles entitled to use red diesel. Though the vehicles still use public highways, they are used to keep the road network operational, and forcing them to pay extra tax to do so would discourage private contractors from assisting with snow removal on public roads. Winter service vehicle drivers in the United States must hold a Class A or Class B commercial driver's license. Although some agencies, such as those in the US state of Minnesota, allow winter service vehicle drivers to operate without extra training, most provide supplemental lessons to teach drivers the most effective and safe methods of snow removal. Many require that trainee drivers ride along with more experienced drivers, and some even operate specially designed driving simulators, which can safely replicate dangerous winter driving conditions. Other organisations require that all staff hold a recognised additional licence or certificate; the United Kingdom Highways Agency, for example, requires that all staff have both a City & Guilds qualification and a supplemental Winter Maintenance Licence.
Winter service vehicle drivers usually work part-time, before and during inclement weather only, typically in 12- to 16-hour shifts. Main roads are usually gritted in advance, to reduce disruption to the network. Salt barns are provided at regular intervals for drivers to collect more grit, and bedding is provided at road maintenance depots for drivers to use between shifts in heavy or prolonged storms.
Weather conditions typically vary greatly with altitude; hot countries can experience heavy snowfall in mountainous regions yet receive very little in low-lying areas, increasing the accident rate among drivers inexperienced in winter driving. In addition, road surface temperatures can fall rapidly at higher altitudes, precipitating rapid frost formation. As a result, gritting and plowing runs are often prioritised to clear these mountain roads, especially at the start and end of the snow season. The hazardous roads through mountain passes pose additional problems for the large winter service vehicles. The heavy metal frame and bulky grit make hill climbing demanding for the vehicle, so vehicles have extremely high-torque transmission systems to provide enough power to make the climb. Furthermore, because the tight hairpin turns found on mountain slopes are difficult for long vehicles to navigate, winter service vehicles for use in mountainous areas are shortened, usually from six wheels to four.
## Equipment
### De-icer
De-icers spray heated de-icing fluid, most often based on propylene glycol (ethylene glycol was formerly used), onto icy surfaces such as aircraft bodies and roads. On aircraft, the fluid prevents ice from forming on the body while the aircraft is on the ground. Ice makes the surface of the wings rougher, reducing the amount of lift they provide while increasing drag. The ice also increases the weight of the aircraft and can affect its balance.
Aircraft de-icing vehicles usually consist of a large tanker truck, containing the concentrated de-icing fluid, with a water feed to dilute the fluid according to the ambient temperature. The vehicle also normally has a cherry picker crane, allowing the operator to spray the entire aircraft in as little time as possible; an entire Boeing 737 can be treated in under 10 minutes by a single de-icing vehicle.
In road snow and ice control, brine is often used as an anti-icer rather than a de-icer. A vehicle carries a tank of brine, which is sprayed on the road surface before or at the onset of the storm. This keeps snow and ice from adhering to the surface and makes mechanical removal by plows easier. Solid salt is also wetted with brine or other liquid deicer. This speeds de-icing action and helps keep it from bouncing off the pavement into the gutter or ditch. Brine acts faster than solid salt and does not require compression by passing traffic to become effective. The brine is also more environmentally friendly, as less salt is required to treat the same length of road.
Airport runways are also de-iced by sprayers fitted with long spraying arms. These arms are wide enough to cross the entire runway, and allow de-icing of the entire airstrip to take place in a single pass, reducing the length of time that the runway is unavailable.
### Front-end loader
Front-end loaders are commonly used to remove snow, especially from sidewalks, parking lots, and other areas too small for snowplows and other heavy equipment. They are sometimes used as snowplows with a snowplow attachment, but more commonly have a bucket or snowbasket, which can also be used to load snow into the rear compartment of a snowplow or dump truck. Front-end loaders with a large box-like front attachment are used to clear snow from parking lots at malls and other institutions.
### Gritter
A gritter, also known as a sander, salt spreader or salt truck, is found on most winter service vehicles. Indeed, the gritter is so commonly seen on winter service vehicles that the terms are sometimes used synonymously. Gritters are used to spread grit (usually rock salt), onto roads. The grit is stored in the large hopper on the rear of the vehicle, with a wire mesh over the top to prevent foreign objects from entering the spreading mechanism and hence becoming jammed. The salt is generally spread across the roadway by an impeller, attached by a hydraulic drive system to a small onboard engine. However, until the 1970s, the grit was often spread manually using shovels by men riding on the back of the truck, and some older spreading mechanisms still require grit be manually loaded into the impeller from the hopper.
Salt reduces the melting point of ice by freezing-point depression, causing it to melt at lower temperatures and run off to the edge of the road, while sand increases traction by increasing friction between car tires and the roadway. The amount of salt dropped varies with the condition of the road; to prevent the formation of light ice, approximately 10 g/m<sup>2</sup> (2.0 lb/1000 sq ft; 0.018 lb/sq yd) is dropped, while thick snow can require up to 40 g/m<sup>2</sup> (8.2 lb/1000 sq ft; 0.074 lb/sq yd) of salt, independent of the volume of sand dropped. The grit is sometimes mixed with molasses to help it adhere to the road surface; however, the sweet molasses often attracts livestock, which lick the road.
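As a rough worked example of the application rates quoted above, the following sketch estimates the total mass of salt needed to treat a stretch of road; the road length and carriageway width are hypothetical figures chosen purely for illustration and are not taken from any highway agency.

```python
# Rough estimate of the salt needed for a gritting run, using the
# application rates quoted above: about 10 g/m^2 to prevent light ice
# and up to 40 g/m^2 for thick snow. Road dimensions are hypothetical.

LIGHT_ICE_RATE_G_PER_M2 = 10.0
THICK_SNOW_RATE_G_PER_M2 = 40.0

def salt_required_kg(road_length_m: float, road_width_m: float,
                     rate_g_per_m2: float) -> float:
    """Return the mass of salt, in kilograms, for the given area and rate."""
    area_m2 = road_length_m * road_width_m
    return area_m2 * rate_g_per_m2 / 1000.0  # grams -> kilograms

# Example: a 10 km run over a 7 m wide carriageway (hypothetical figures).
print(salt_required_kg(10_000, 7.0, LIGHT_ICE_RATE_G_PER_M2))   # 700.0 kg
print(salt_required_kg(10_000, 7.0, THICK_SNOW_RATE_G_PER_M2))  # 2800.0 kg
```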
Gritters are among the winter service vehicles also used in airports, to keep runways free of ice. However, the salt normally used to clear roads can damage the airframe of aircraft and interferes with the sensitive navigation equipment. As a result, airport gritters spread less dangerous potassium acetate or urea onto the runways instead, as these do not corrode the aircraft or the airside equipment.
Gritters can also be used in hot weather, when temperatures are high enough to melt the bitumen used in asphalt. The grit is dropped to provide a protective layer between the road surface and the tires of passing vehicles, which would otherwise damage the road surface by "plucking out" the bitumen-coated aggregate from the road surface.
### Snow blower
Snow blowers, also known as rotating snowplows or snow cutters, can be used in place of snowplows on winter service vehicles. A snow blower consists of a rapidly spinning auger which cuts through the snow, forcing it out of a funnel attached to the top of the blower. Snow blowers typically clear much faster than plows, with some clearing in excess of 5,000 tonnes (4,900 long tons; 5,500 short tons) of snow per hour, and can cut through far deeper snow drifts than a snowplow can. In addition, snow blowers can remove snow from the roadway completely, rather than piling it at the side of the road, making passage easier for other road users and preventing the windrow from blocking driveways.
### Jet-powered snow blower
Some railroads occasionally use air-blowing machines, each powered by a jet engine, to clear snow from tracks and switches. In addition to physically blowing snow with the force of the air, they melt recalcitrant precipitation with exhaust temperatures over 1,000 degrees Fahrenheit (538 °C). Approximately 100 are believed to have been manufactured in the 1960s, 1970s, and 1980s; they are used so rarely that they are generally maintained indefinitely rather than being replaced. For example, in the Boston area the MBTA uses two model RP-3 Portec RMC Hurricane Jet Snow Blowers, nicknamed "Snowzilla", to clear heavy snows from the Ashmont–Mattapan High Speed Line and Wellington Yard. The jet snow blowers can be faster and gentler than conventional removal methods, but they consume a large amount of fuel.
### Snow groomer
A snow groomer is a machine designed to smooth and compact the snow, rather than removing it altogether. Early snow groomers were used by residents of rural areas to compress the snow close to their homes, and consisted of a heavy roller hauled by oxen which compacted the snow to make a smooth surface for sledging. With the invention of the motor car, snow groomers were replaced by snowplows and snow blowers on public thoroughfares, but remained in use at ski resorts, where they are used to maintain smooth, safe trails for various wintersports, including skiing, snowboarding and snowmobiling. Snow groomers remained largely unchanged throughout the 20th century, with most consisting of a heavy roller that could be attached to a tractor or snowcat and then hauled across the area to be groomed.
The development of more advanced electronic systems in the 1980s allowed manufacturers to produce snow groomers which could work on and replicate a much wider range of terrains, with the most modern even able to produce half-pipes and ramps for snowboarding. Snow groomers are also used in conjunction with snow cannons, to ensure that the snow produced is spread evenly across the resort. However, snow groomers have a detrimental effect on the environment within the resort. Regular pressure from the grooming vehicle increases the infiltration rate of the soil while decreasing the field capacity. This increases the rate at which water can soak through the soil, making it more prone to erosion.
### Snow melter
A snow-melting vehicle works by scooping snow into a melting pit located in a large tank at the rear of the vehicle. Around the melting pit is a thin jacket full of warm water, heated by a powerful burner. The gases from the burner are bubbled through the water, causing some of the heated water to spill over into the melting pit, melting the snow instantly. The meltwater is discharged into the storm drains.
Because they have to carry the large water tank and fuel for the burner, snow-melting machines tend to be much larger and heavier than most winter service vehicles, at around 18 metres (59 ft) long, with the largest being hauled by semi-trailer tractor units. In addition, the complicated melting process means that snow-melting vehicles have a much lower capacity than the equivalent plow or blower vehicle; the largest snow melter can remove 500 metric tons of snow per hour, compared with the 5,000 metric tons per hour capacity of a large snow blower.
Snow melters are in some ways more environmentally friendly than gritters, as they do not spray hazardous materials, and pollutants from the road surface can be separated from the meltwater and disposed of safely. In addition, because the snow is melted on board, the costs of transporting snow from the site are eliminated. On the other hand, snow melting can require large amounts of energy, which has its own costs and environmental impact.
### Snowplow
Many winter service vehicles can be fitted with snowplows, to clear roads which are blocked by deep snow. In most cases, the plows are mounted on hydraulically actuated arms, allowing them to be raised, lowered, and angled to better move snow. Most winter service vehicles include either permanently fixed plows or plow frames: 75% of the UK's Highways Agency vehicles include a plow frame to which a blade can be attached. Winter service vehicles with both a plow frame and a gritting body are known as "all purpose vehicles", and while these are more efficient than using dedicated vehicles, the weight of the hopper often decreases the range of the vehicle. Therefore, most operators will keep at least a few dedicated plowing vehicles in store for heavy storms.
In the event that specially designed winter service vehicles are not available for plowing, other service or construction vehicles can be used instead: among those used by various authorities are graders, bulldozers, skid loaders, pickup trucks and rubbish trucks. Front-end loaders can also be used to plow snow. Either a snowplow attachment can be mounted on the loader's arm in place of the bucket, or the bucket or snowbasket can be used to load snow into the rear compartment of a snowplow or dump truck, which then hauls it away. Snowplows are dangerous to overtake; often, the oncoming lane may not be completely free of snow. In addition, the plow blade causes considerable spray of snow on both sides, which can obscure the vision of other road users.
### Snow sweeper
A snow sweeper uses brushes to remove thin layers of snow from the pavement surface. Snow sweepers are used after plowing to remove any remaining material missed by the larger vehicles in areas with very low snow-tolerance, such as airport runways and racing tracks, as the flexible brushes follow the terrain better than the rigid blades of snowplows and snow blowers. These brushes also allow the vehicle to be used on the tactile tiles found at traffic lights and tram stops, without damaging the delicate surface. Unlike other winter service vehicles, snow sweepers do not compress the snow, leaving a rough, high-friction surface behind them. This makes snow sweepers the most efficient method of snow removal for snow depths below 10 centimetres (3.9 in). Snow deeper than this, however, can clog the brushes, and most snow sweepers cannot be used to clear snow deeper than 15 centimetres (5.9 in). A more advanced version of the snow sweeper is the jet sweeper, which adds an air-blower just behind the brushes, in order to blow the swept snow clear of the pavement and prevent the loosened snow from settling.
### Surface friction tester
The surface friction tester is a small fifth wheel attached to a hydraulic system mounted on the rear axle of the vehicle, used to measure road slipperiness. The wheel, allowed to roll freely, is slightly turned relative to the ground so that it partially slides. Sensors attached to the axis of the wheel calculate the friction between the wheel and the pavement by measuring the torque produced by the rotation of the wheel. Surface friction testers are used at airports and on major roadways before ice formation or after snow removal. The vehicle can relay the surface friction data back to the control centre, allowing gritting and clearing to be planned so that the vehicles are deployed most efficiently. Surface friction testers often include a water spraying system, to simulate the effects of rain on the road surface before the rain occurs. The sensors are usually mounted to small compact or estate cars or to a small trailer, rather than the large trucks used for other winter service equipment, as the surface friction tester works best when attached to a lightweight vehicle.
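The torque-to-friction calculation described above can be sketched as follows; the wheel radius, vertical load, and torque reading are hypothetical example values, and a real tester would apply calibrations and filtering not shown here.

```python
# Minimal sketch of the measurement principle described above: the partially
# sliding measuring wheel experiences a friction force F at the road surface,
# producing a torque T = F * r about its axle, so the friction coefficient is
# mu = F / N = T / (r * N), where N is the vertical load on the wheel.
# All numbers below are hypothetical illustration values.

def friction_coefficient(torque_nm: float, wheel_radius_m: float,
                         vertical_load_n: float) -> float:
    """Estimate the road's coefficient of friction from measured axle torque."""
    friction_force_n = torque_nm / wheel_radius_m
    return friction_force_n / vertical_load_n

# Example reading: 120 N*m of torque on a 0.2 m radius wheel carrying 1 kN.
mu = friction_coefficient(torque_nm=120.0, wheel_radius_m=0.2,
                          vertical_load_n=1000.0)
print(round(mu, 2))  # 0.6 -- a dry-pavement value; icy surfaces read far lower
```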
## Materials
To improve traction and melt ice or snow, winter service vehicles spread granular or liquid ice-melting chemicals and grit, such as sand or gravel.
The most common chemical is rock salt, which is effective at melting snow only at relatively high sub-freezing temperatures and has some unwanted side effects. If the salt concentration becomes high enough, it can be toxic to plant and animal life and greatly accelerate corrosion of metals, so operators should limit gritting to an absolute minimum. The dropped salt is eventually washed away and lost, so it cannot be collected and reused after gritting runs. By contrast, the insoluble sand can be collected and recycled by street sweeping vehicles and mixed with new salt crystals to be reused in later batches of grit.
Sea salt cannot be used, as it is too fine and dissolves too quickly, so all salt used in gritting comes from salt mines, a non-renewable source. As a result, some road maintenance agencies run networks of ice prediction stations, to prevent unnecessary gritting, which not only wastes salt but can also damage the environment and disrupt traffic.
The U.S. state of Oregon uses magnesium chloride, a relatively cheap chemical similar in snow-melting effects to sodium chloride, but less reactive, while New Zealand uses calcium magnesium acetate, which avoids the environmentally harmful chloride ion altogether. Urea is sometimes used to grit suspension bridges, as it does not corrode iron or steel at all, but urea is less effective than salt and can cost up to seven times more, weight-for-weight.
In some areas of the world, including Berlin, Germany, dropping salt is prohibited altogether except on the highest-risk roads; plain sand without any melting agents is spread instead. While this may protect the environment, it is more labour-intensive, as more gritting runs are needed; because the sand is insoluble, it tends to accumulate at the sides of the road, making it more difficult for buses to pull in at bus stops.
Grit is often mixed with hydrous sodium ferrocyanide as an anticaking agent which, while harmless in its natural form, can undergo photodissociation in strong sunlight to produce the extremely toxic chemical hydrogen cyanide. Although sunlight is generally not intense enough to cause this in polar and temperate regions, salt deposits must be kept as far as possible from waterways to avert the possibility of cyanide-tainted runoff water entering fisheries or farms.
Gritting vehicles are also dangerous to overtake; as grit is scattered across the entire roadway, loose pieces can damage the paintwork and windows of passing cars. Loose salt does not provide sufficient traction for motorcycles, which can lead to skidding, especially around corners.
## See also
- Snow emergency
- Snowplow
- Snow removal
- Wedge plow
# Zombie Nightmare
Zombie Nightmare is a 1987 Canadian zombie film produced and directed by Jack Bravman, written by John Fasano, and starring Adam West, Tia Carrere, Jon Mikl Thor, and Shawn Levy. The film centres around a baseball player who is killed by a group of teenagers and is resurrected as a zombie by a Haitian voodoo priestess. The zombie goes on to kill the teens, whose deaths are investigated by the police. The film was shot in the suburbs of Montreal, Canada. It was originally written to star mostly black actors but, at the request of investors, the characters' names were changed to more typically white names. While Bravman was credited as director, Fasano directed the majority of the film. Problems occurred between Fasano and the production crew, who believed him to be assistant director and ignored his directions.
Originally planned for theatrical release, Zombie Nightmare was released direct-to-video by New World Video. It grossed C$1.5 million against a budget of $180,000. It received negative reviews from critics, who found the plot predictable and derivative, but was praised for its heavy metal soundtrack featuring Motörhead, Girlschool, and Jon Mikl Thor's band Thor. Zombie Nightmare was shown in a 1994 episode of the comedy television series Mystery Science Theater 3000.
## Plot
Baseball player William Washington is fatally stabbed defending a young black girl from two white teenage boys. Years later, William's son Tony also becomes a baseball player. After Tony disrupts a robbery at a grocery store, he is struck and killed by a car full of teenagers: Bob, Amy, Jim, Peter, and Susie. The teens flee the scene and Tony's body is carried to his home, where his mother Louise mourns over him. She contacts Molly Mekembe, the girl William saved, to repay the favour of her rescue. Now revealed to be a Haitian voodoo priestess, Molly resurrects Tony as a zombie and uses her powers to guide him to the teenagers, aiding him in his revenge.
The next night, Tony tracks Peter and Susie to an academy gymnasium and fatally breaks Peter's neck, then kills Susie by crushing her skull with a baseball bat. Police detective Frank Sorrell investigates the case. Police Captain Tom Churchman tells the press that the killings were a drug-induced murder-suicide. The next night, Tony finds Jim sexually assaulting a waitress and impales Jim with his bat, killing him. Churchman tells Sorrell that they found a suspect responsible for the murders and closes the case. Believing that the case has not truly been solved, Sorrell investigates photos that place Molly at both incidents. He suggests bringing her in for questioning but Churchman dismisses it. After sending Sorrell home to rest, Churchman calls Jim's father Fred and informs him of Molly's involvement in Jim's death. Fred goes to the police station to meet Churchman, but is killed by Tony en route.
Believing that they will be targeted next, Bob and Amy decide to leave town. They steal money from Jim's uncle's garage, but Tony finds them there and kills them. Sorrell is attacked by Tony but survives. While monitoring Tony's actions, Churchman abducts Molly and forces her at gunpoint to show him where Tony is going. Sorrell follows Tony to a cemetery. Molly and Churchman soon arrive, and both tell Sorrell that the priestess resurrected Tony not only to avenge himself but also to avenge Molly, as Churchman and Fred were the teenagers who attacked her years ago and Churchman killed Tony's father. Churchman shoots Tony, having learned that a revived zombie's power fades once it has achieved its goal. Molly tries to cast a spell but is shot and killed by Churchman, who then turns to kill Sorrell as the only surviving witness. However, a second zombie rises from a nearby grave and drags Churchman into the ground. Churchman pleads for Sorrell to kill him, but Sorrell ignores him and walks away.
## Cast
## Production
Director Jack Bravman wanted to transition from adult films to horror, and contacted John Fasano after hearing about his work on Blood Sisters. Bravman asked Fasano to write the script and take an uncredited co-directing role. Fasano accepted and wrote the original script with black teenage characters bearing black-sounding names, setting the film in his hometown of Port Washington, New York, and offering roles to local actors. Sean Brayton, an associate professor at the University of Lethbridge, described the original concept as "a retribution narrative" of a black character getting revenge on the white perpetrators of his death. The script was later changed to give the teens more white-sounding names because investors were worried that a black-centric cast would not sell in the foreign market. Having written the script on an IBM Selectric II, Fasano typed the white-sounding names onto a page, cut them out with a knife, and glued them to their appropriate places in the script. After the name changes, the production received a budget of $180,000 from investors. Unions in New York did not give the production a permit to film in the state, so filming was moved to Montreal, Quebec.
Zombie Nightmare was produced by Montreal-based company Gold-Gems Productions and was the film debut of American actress Tia Carrere. Adam West played the crooked policeman Tom Churchman. West was on the set for two days and glanced at his script during his scenes; fellow cop Frank Sorrell was played by Frank Dietz, a childhood friend of Fasano. Manuska Rigaud, who played voodoo priestess Molly Mokembe, was a professional Tina Turner impersonator. The role of Tony was originally given to bodybuilder Peewee Piemonte. Days into production, Piemonte was fired for eating all the craft services and the meals of crew members. Piemonte was replaced by Jon Mikl Thor, singer for the Canadian rock band Thor. Wrestler Superstar Billy Graham was originally cast to play Tony's father. On the day he arrived in Montreal, no one came to pick him up at the airport and Graham left after waiting ten hours. Fasano took up the role. Scenes with West and Carrere were directed by Bravman while Fasano shot the majority of the film. Problems occurred between Fasano and the Canadian crew, who believed him to be assistant director and not co-director. This resulted in his directions being ignored by the crew, including cinematographer Roger Racine.
Tony Bua and Andy Clement, college friends of Fasano, made the zombie masks and provided the makeup for the film. It took five hours to apply Jon Mikl Thor's makeup, using glue and latex. American cast and crew members were housed in an airport hotel where pornography played on every television channel, and they noticed that Bravman's name appeared in the credits of many of the films. Zombie Nightmare's editor, David Wellington, received the writing credit for the film so that it would qualify for a Canadian tax credit program.
Thor wrote much of the incidental music. This includes heavy metal riffs by his band and synthesizer music played by the band Thorkestra. Several other heavy metal bands contribute to the soundtrack. The Motörhead single "Ace of Spades" plays during the opening credits. Other bands heard on the soundtrack include Virgin Steele, Girlschool, Fist, and Death Mask, and a track by Thor's then-wife and backup singer Rusty Hamilton.
## Reception and legacy
### Release
The film was originally planned for a theatrical release by Filmworld Distributors, but it was instead released direct-to-video by New World Pictures. It was released in the United States on VHS in October 1987 and grossed C$1.5 million worldwide. A special edition DVD was released by Scorpion Releasing in 2010.
### Critical reception
Steve Bissette for Deep Red Magazine criticized the story for being predictable and derivative. Bissette thought the makeup and production values to be competent and noted Rigaud's "absurd over-the-top performance". Fangoria had praise for the performances of the teen leads and recommended the movie for its heavy metal soundtrack. However, the reviewer considered the film boring, criticizing the lack of special effects and the predictable plot. Ian Jane of DVD Talk wrote that the film was horrible but so "deliciously goofy" one couldn't help but have fun with it. Writing in The Zombie Movie Encyclopedia, academic Peter Dendle called it "painful and toilsome". Bloody Disgusting listed the film among the "cheesiest" of heavy metal horror films. Kerrang! considered the soundtrack to be better than the film itself. Jim Craddock, author of VideoHound's Golden Movie Retriever, summarized the film as "cheap and stupid".
### Mystery Science Theater 3000
Zombie Nightmare was featured in a season six episode of Mystery Science Theater 3000 (MST3K), a cult science fiction comedy television series in which the character Mike Nelson and his two robot friends Crow T. Robot and Tom Servo are forced to watch bad films as part of an ongoing scientific experiment. The episode was first showcased during Comedy Central's "Fresh Cheese" fall 1994 tour around college campuses in the United States. It made its television debut on Comedy Central on 24 November 1994. The episode premiered during the channel's annual Turkey Day marathon of MST3K episodes, which West hosted. In The Amazing Colossal Episode Guide, a series guide written by MST3K members, Mary Jo Pehl described the movie as "painful" and said that the members of the show "thoroughly, intensely, and unequivocally hated this movie". In 2009, Shout! Factory released the episode as part of the "Volume XV" box set, and in 2017, the episode was added to Netflix.
# Gagak Item
Gagak Item (Vernacular Malay for Black Raven, also known by the Dutch title De Zwarte Raaf) is a 1939 bandit film from the Dutch East Indies (now Indonesia) directed by Joshua and Othniel Wong for Tan's Film. Starring Rd Mochtar, Roekiah, and Eddy T. Effendi, it follows a masked man known only as "Gagak Item" ("Black Raven"). The black-and-white film, which featured the cast and crew from the 1937 hit Terang Boelan (Full Moon), was a commercial success and received positive reviews upon release. It is likely lost.
## Production
Gagak Item was directed by brothers Joshua and Othniel Wong; filming the work in black-and-white, they also handled sound editing. It was produced by Tan Khoen Yauw of Tan's Film and starred Rd Mochtar, Roekiah, Eddy T. Effendi, and Kartolo. The Wongs and cast had first worked together on Albert Balink's 1937 blockbuster Terang Boelan (Full Moon), before joining Tan's Film in 1938 for the highly successful Fatima; Gagak Item was their second production with the company, which hoped to mirror Terang Boelan's success. Through these prior films Mochtar and Roekiah had become an established screen couple.
Saeroen, a journalist-turned-screenwriter for Terang Boelan and Fatima, returned to write the script for Gagak Item. The film, a love story, followed a girl and a masked man known as "Gagak Item" ("Black Raven") and was set in rural Buitenzorg (now Bogor). The titular bandit was similar to Zorro, a character popular in the Indies at the time; such figures had been a staple of travelling theatre troupes since the early 1930s. When writing the script Saeroen continued the formula he had used in Terang Boelan, including action, music, beautiful vistas and physical comedy. The film had six songs performed by Hugo Dumas' musical troupe Lief Java; the troupe was known for its keroncong performances, mixing traditional music with Portuguese influences. Gagak Item featured vocals by keroncong singer Annie Landouw.
## Release and reception
Gagak Item was released in late 1939 and was screened in Batavia (now Jakarta), the capital of the Indies; Medan, Northern Sumatra; and Surabaya, Eastern Java. Some screenings of the film, also advertised under the Dutch title De Zwarte Raaf, had Dutch-language subtitles. A novelisation of the film, published by the Yogyakarta-based Kolff-Buning, soon followed. Gagak Item was one of four domestic productions released in 1939; the film industry had undergone a significant downturn following the onset of the Great Depression, during which time cinemas mainly showed Hollywood productions, and had only begun recovering following the release of Terang Boelan.
Gagak Item was a commercial and critical success, although not as much as Tan's earlier production. An anonymous review in Bataviaasch Nieuwsblad praised the film, especially its music. The reviewer opined that the film would be a great success and that the film industry in the Indies was showing promising developments. Another review in the same paper found that, although "one may shake one's head over the cultural value of indigenous films", the film was a step forward for the industry. The review praised Roekiah's "demure" acting.
Following the success of Gagak Item the Wongs, Saeroen, Roekiah, and Mochtar continued to work with Tan's Film. Their next production, Siti Akbari (1940), was similarly successful, although again not as profitable as Terang Boelan or Fatima. Saeroen, Joshua Wong, and Mochtar left the company in 1940: Wong and Mochtar after payment disputes, and Saeroen to return to journalism. Through 1941 Tan's Film produced fewer movies than its competitors, and was ultimately shut down following the Japanese occupation in early 1942.
Gagak Item was screened as late as January 1951. The film is likely lost. In common with the rest of the world, movies in the Indies were then shot using highly flammable nitrate film, and after a fire destroyed much of Produksi Film Negara's warehouse in 1952, old Indies films shot on nitrate were deliberately destroyed. The American visual anthropologist Karl G. Heider writes that all Indonesian films from before 1950 are lost. However, JB Kristanto's Katalog Film Indonesia (Indonesian Film Catalogue) records several as having survived at Sinematek Indonesia's archives, and film historian Misbach Yusa Biran writes that several Japanese propaganda films have survived at the Netherlands Government Information Service.
# Albert Stanley, 1st Baron Ashfield
Albert Henry Stanley, 1st Baron Ashfield (8 August 1874 – 4 November 1948), born Albert Henry Knattriess, was a British-American businessman who was managing director, then chairman of the Underground Electric Railways Company of London (UERL) from 1910 to 1933 and chairman of the London Passenger Transport Board (LPTB) from 1933 to 1947.
Although born in Britain, his early career was in the United States, where, at a young age, he held senior positions in the developing tramway systems of Detroit and New Jersey. In 1898, he served in the United States Navy during the short Spanish–American War.
In 1907, his management skills led to his recruitment by the UERL, which was struggling through a financial crisis that threatened its existence. He quickly integrated the company's management and used advertising and public relations to improve profits. As managing director of the UERL from 1910, he led the takeover of competing underground railway companies and bus and tram operations to form an integrated transport operation known as the Combine.
He was Member of Parliament for Ashton-under-Lyne from December 1916 to January 1920. He was President of the Board of Trade between December 1916 and May 1919, reorganising the board and establishing specialist departments for various industries. He returned to the UERL and then chaired it and its successor the LPTB during the organisation's most significant period of expansion in the interwar period, making it a world-respected organisation considered an exemplar of the best form of public administration.
## Early life and career in the United States
Stanley was born on 8 August 1874, in New Normanton, Derbyshire, England, the son of Henry and Elizabeth Knattriess (née Twigg). His father worked as a coachbuilder for the Pullman Company. In 1880, the family emigrated to Detroit in the United States, where his father worked at Pullman's main factory. During the 1890s, the family changed its name to "Stanley".
In 1888, at the age of 14, Stanley left school and went to work as an office boy at the Detroit Street Railways Company, which ran a horse-drawn tram system. He continued to study at evening school and worked long hours, often from 7.30 am to 10.00 pm. His abilities were recognised early and Stanley was responsible for scheduling the services and preparing the timetables when he was 17. Following the tramway's expansion and electrification, he became General Superintendent of the company in 1894.
Stanley was a naval reservist and, during the brief Spanish–American War of 1898, he served in the United States Navy as a landsman in the crew of USS Yosemite alongside many others from Detroit. In 1903, Stanley moved to New Jersey to become assistant general manager of the street railway department of the Public Service Corporation of New Jersey. The company had been struggling, but Stanley quickly improved its organisation and was promoted to department general manager in January 1904. In January 1907, he became general manager of the whole corporation, running a network of almost 1,000 route miles and 25,000 employees.
In 1904, Stanley married Grace Lowrey (1878–1962) of New York. The couple had two daughters: Marian Stanley (1906–92) and Grace Stanley (1907–77).
## Career in Britain
### Rescue of the Underground Electric Railways Company
On 20 February 1907, Sir George Gibb, managing director of the Underground Electric Railways Company of London (UERL), appointed Stanley as its general manager. The UERL was the holding company of four underground railways in central London. Three of these (the District Railway, the Baker Street and Waterloo Railway and the Great Northern, Piccadilly and Brompton Railway) were already in operation and the fourth (the Charing Cross, Euston and Hampstead Railway) was about to open. The UERL had been established by American financier Charles Yerkes, and much of the finance and equipment had been brought from the United States. Hence, Stanley's experience managing urban transit systems in that country made him an ideal candidate. The cost of constructing three new lines in just a few years had put the company in a precarious monetary position, and income needed to be increased to pay the interest on its loans. Stanley's responsibility was to restore its finances.
Only recently promoted to general manager of the New Jersey system, Stanley had been reluctant to take the position in London and took it for one year only, provided he would be free to return to America at the end of the year. He told the company's senior managers that the company was almost bankrupt and got resignation letters from each of them post-dated by six months. Through better integration of the separate companies within the group and by improving advertising and public relations, he was quickly able to turn the fortunes of the company around, while the company's chairman, Sir Edgar Speyer, renegotiated the debt repayments. In 1908, Stanley joined the company's board and, in 1910, he became the managing director.
With Commercial Manager Frank Pick, Stanley devised a plan to increase passenger numbers: developing the "UNDERGROUND" brand and establishing a joint booking system and co-ordinated fares throughout all of London's underground railways, including those not controlled by the UERL. In July 1910, Stanley took the integration of the group further, when he persuaded previously reluctant American investors to approve the merger of the three tube railways into a single company. Further consolidation came with the UERL's take-over of London General Omnibus Company (LGOC) in 1912 and the Central London Railway and the City and South London Railway on 1 January 1913. Of London's underground railways, only the Metropolitan Railway (and its subsidiaries the Great Northern & City Railway and the East London Railway) and the Waterloo & City Railway remained outside of the Underground Group's control. The LGOC was the dominant bus operator in the capital and its high profitability (it paid dividends of 18 per cent compared with Underground Group companies' dividends of 1 to 3 per cent) subsidised the rest of the group. Stanley further expanded the group through shareholdings in London United Tramways and Metropolitan Electric Tramways and the foundation of bus builder AEC. The much enlarged group became known as the Combine. On 29 July 1914, Stanley was knighted in recognition of his services to transport.
Stanley also planned extensions of the existing Underground Group's lines into new, undeveloped districts beyond the central area to encourage the development of new suburbs and new commuter traffic. The first of the extensions, the Bakerloo line to Queen's Park and Watford Junction, opened between 1915 and 1917. The other expansion plans were postponed during World War I.
### Government
In 1915, Stanley was given a wartime role as Director-General of Mechanical Transport at the Ministry of Munitions. In 1916, he was selected by Prime Minister David Lloyd George to become President of the Board of Trade. Lloyd George had previously promised this role to Sir Max Aitken (later Lord Beaverbrook), Member of Parliament for Ashton-under-Lyne. At that time, a member of parliament taking a cabinet post for the first time had to resign and stand for re-election in a by-election. Aitken had made arrangements to do this before Lloyd George decided to appoint Stanley to the position instead. Aitken, a friend of Stanley, was persuaded to continue with the resignation in exchange for a peerage so that Stanley could take his seat. Stanley became President of the Board of Trade and was made a Privy Counsellor on 13 December 1916. He was elected to parliament unopposed on 23 December 1916 as a Conservative. At 42 years old he was the youngest member of Lloyd George's coalition government.
At the 1918 general election, Stanley was opposed by Frederick Lister, the President of the National Federation of Discharged and Demobilized Sailors and Soldiers, in a challenge over the government's policy on war pensions. With the backing of Beaverbrook, who visited his former constituency to speak on his behalf, Stanley won the election.
Stanley's achievements in office were mixed. He established various specialist departments to manage output in numerous industries and reorganised the structure of the Board. However, despite previous successes with unions, his negotiations were ineffective. Writing to Leader of the House of Commons and future Prime Minister Bonar Law in January 1919, Lloyd George described Stanley as having "all the glibness of Runciman and that is apt to take in innocent persons like you and me ... Stanley, to put it quite bluntly, is a funk, and there is no room for funks in the modern world." Stanley left the Board of Trade and the government in May 1919 and returned to the UERL.
### Return to the Underground
Back at the Underground Group, Stanley returned to his role as managing director and also became its chairman, replacing Lord George Hamilton. In the 1920 New Year Honours, he was created Baron Ashfield, of Southwell in the County of Nottingham, ending his term as an MP. He and Pick reactivated their expansion plans, and one of the most significant periods in the organisation's history began, subsequently considered to be its heyday and sometimes called its "Golden Age".
The Central London Railway was extended to Ealing Broadway in 1920, and the Charing Cross, Euston and Hampstead Railway was extended to Hendon in 1923 and to Edgware in 1924. The City and South London Railway was reconstructed with larger diameter tunnels to take modern trains between 1922 and 1924 and extended to Morden in 1926. In addition, a programme of modernising many of the Underground's busiest central London stations was started; providing them with escalators to replace lifts. New rolling stock was gradually introduced with automatic sliding doors along the length of the carriage instead of manual end gates. By the middle of the 1920s, the organisation had expanded to such an extent that a large, new headquarters building was constructed at 55 Broadway over St James's Park station. In this, Ashfield had a panelled office suite on the seventh floor.
Starting in the early 1920s, competition from numerous small bus companies, nicknamed "pirates" because they operated irregular routes and plundered the LGOC's passengers, eroded the profitability of the Combine's bus operations and had a negative impact on the profitability of the whole group. Ashfield lobbied the government for regulation of transport services in the London area. Starting in 1923, a series of legislative initiatives were made in this direction, with Ashfield and Labour London County Councillor (later MP and Minister of Transport) Herbert Morrison, at the forefront of debates as to the level of regulation and public control under which transport services should be brought. Ashfield aimed for regulation that would give the UERL group protection from competition and allow it to take substantive control of the LCC's tram system; Morrison preferred full public ownership. Ashfield's proposal was fraught with controversy, The Spectator noting, "Everybody agrees that Lord Ashfield knows more about transport than anyone else, but people are naturally loth to give, not to him, but to his shareholders, the monopoly of conveying them." After seven years of false starts, a bill was announced at the end of 1930 for the formation of the London Passenger Transport Board (LPTB), a public corporation that would take control of the UERL, the Metropolitan Railway and all bus and tram operators within an area designated as the London Passenger Transport Area. As Ashfield had done with shareholders in 1910 over the consolidation of the three UERL controlled tube lines, he used his persuasiveness to obtain their agreements to the government buy-out of their stock.
> I have read this bill carefully, and I beg you to accept that I know what I am talking about. You cannot conceive I would be guilty of such folly as to suggest to you in a matter in which my whole life has been wrapped, that you should transfer your interests to a board subject to political interference, that could play ducks and drakes with your investments. Acts of Parliament are not treated like scraps of paper. They are scrupulously observed by all parties. I have promised the Minister my support. You may fail to support me, but in that event you will have to find somebody else to manage your undertakings. I have pledged my word and I am not going back on it.
The Board was a compromise – public ownership but not full nationalisation – and came into existence on 1 July 1933. Ashfield served as the organisation's chairman from its establishment in 1933 on an annual salary of £12,500 (approximately £600,000 today), with Pick as Chief Executive.
The opening of extensions of the Piccadilly line to Uxbridge, Hounslow and Cockfosters followed in 1933. On the Metropolitan Railway, Ashfield and Pick instigated a rationalisation of services. The barely used and loss-making Brill and Verney Junction branches beyond Aylesbury were closed in 1935 and 1936. Freight services were reduced and electrification of the remaining steam operated sections of the line was planned. In 1935, the availability of government-backed loans to stimulate the flagging economy allowed Ashfield and Pick to promote system-wide improvements under the New Works Programme for 1935–1940, including the transfer of the Metropolitan line's Stanmore services to the Bakerloo line in 1939, the Northern line's Northern Heights project and extension of the Central line to Ongar and Denham.
Following a reorganisation of public transportation by the Labour government of Clement Attlee, the LPTB was scheduled to be nationalised along with the majority of British railway, bus, road haulage and waterway concerns from 1 January 1948. In advance of this, Ashfield resigned from the LPTB at the end of October 1947 and joined the board of the new British Transport Commission (BTC) which was to operate all of the nationalised public transport systems. At nationalisation, the LPTB was to be abolished and replaced by the London Transport Executive (LTE). Lord Latham, a member of the LPTB and the incoming chairman of the new organisation, acted as temporary chairman for the last two months of the LPTB's existence. The BTC required office space, and as one of his last acts before leaving the LPTB for the BTC, Ashfield offered the use of the eighth and ninth floors of 55 Broadway for the BTC's use. He was in turn allotted a BTC office on one of these floors, but decided to continue using his seventh-floor suite for his BTC duties. Latham, as his replacement at the LPTB (and subsequently the LTE), was not able to occupy this accommodation until Ashfield's death.
### Other activities
In addition to his management of London Underground and brief political career, Ashfield held many directorships in transport undertakings and industry. He helped establish the Institute of Transport in 1919/20 and was one of its first presidents. He was a director of the Mexican Railway Company and two railway companies in Cuba and a member of the 1931 Royal Commission on Railways and Transportation in Canada. He was one of two government directors of the British Dyestuffs Corporation, its chairman from 1924 and was involved in the creation of Imperial Chemical Industries in 1926, of which he was subsequently a non-executive director. Ashfield was a director of the Midland Bank, Amalgamated Anthracite Collieries and chairman of Albany Ward Theatres, Associated Provincial Picture Houses, and Provincial Cinematograph Theatres.
During World War I, he was Colonel of the Territorial Force Engineer and Railway Staff Corps and was Honorary Colonel of the Royal Artillery's 84th Light Anti Aircraft Regiment during World War II.
## Personality
Biographers of Stanley characterise him as having an "immensely active mind, and a strong sense of public duty" and a "great charm of manner and a sense of humour which concealed an almost ruthless determination" that made him a "formidable negotiator". His "intuitive understanding of his fellow men" gave him "presence, which allowed him to dominate meetings effortlessly" and "inspired loyalty, devotion even, among his staff". He was "a dapper ladies' man, something of a playboy tycoon, who was always smartly turned out and enjoyed moving in high society".
## Legacy
Ashfield died on 4 November 1948 at 31 Queen's Gate, South Kensington. During his near forty-year tenure as managing director and chairman of the Underground Group and the LPTB, Ashfield oversaw the transformation of a collection of unconnected, competing railway, bus and tram companies, some in severe financial difficulties, into a coherent and well managed transport organisation, internationally respected for its technical expertise and design style. Transport historian Christian Wolmar considers it "almost impossible to exaggerate the high regard in which LT was held during its all too brief heyday, attracting official visitors from around the world eager to learn the lessons of its success and apply them in their own countries." "It represented the apogee of a type of confident public administration ... with a reputation that any state organisation today would envy ... only made possible by the brilliance of its two famous leaders, Ashfield and Pick."
A memorial to Ashfield was erected at 55 Broadway in 1950 and a blue plaque was placed at his home, 43 South Street, Mayfair in 1984. A large office building at London Underground's Lillie Bridge Depot is named Ashfield House in his honour. It stands to the south of the District line tracks a short distance to the east of West Kensington station and is also visible from West Cromwell Road (A4).
# Oxidative phosphorylation
Oxidative phosphorylation (UK /ɒkˈsɪd.ə.tɪv/, US /ˈɑːk.sɪˌdeɪ.tɪv/) or electron transport-linked phosphorylation or terminal oxidation is the metabolic pathway in which cells use enzymes to oxidize nutrients, thereby releasing chemical energy in order to produce adenosine triphosphate (ATP). In eukaryotes, this takes place inside mitochondria. Almost all aerobic organisms carry out oxidative phosphorylation. This pathway is so pervasive because it releases more energy than alternative fermentation processes such as anaerobic glycolysis.
The energy stored in the chemical bonds of glucose is released by the cell in the citric acid cycle, producing carbon dioxide and the energetic electron donors NADH and FADH<sub>2</sub>. Oxidative phosphorylation uses these molecules and O<sub>2</sub> to produce ATP, which is used throughout the cell whenever energy is needed. During oxidative phosphorylation, electrons are transferred from the electron donors to a series of electron acceptors in a series of redox reactions ending in oxygen, whose reaction releases half of the total energy.
In eukaryotes, these redox reactions are catalyzed by a series of protein complexes within the inner membrane of the cell's mitochondria, whereas, in prokaryotes, these proteins are located in the cell's plasma membrane. These linked sets of proteins are called the electron transport chain. In eukaryotes, five main protein complexes are involved, whereas in prokaryotes many different enzymes are present, using a variety of electron donors and acceptors.
The energy transferred by electrons flowing through this electron transport chain is used to transport protons across the inner mitochondrial membrane, in a process called electron transport. This generates potential energy in the form of a pH gradient and the resulting electrical potential across this membrane. This store of energy is tapped when protons flow back across the membrane and down the potential energy gradient, through a large enzyme called ATP synthase in a process called chemiosmosis. The ATP synthase uses the energy to transform adenosine diphosphate (ADP) into adenosine triphosphate, in a phosphorylation reaction. The reaction is driven by the proton flow, which forces the rotation of a part of the enzyme. The ATP synthase is a rotary mechanical motor.
Although oxidative phosphorylation is a vital part of metabolism, it produces reactive oxygen species such as superoxide and hydrogen peroxide, which lead to propagation of free radicals, damaging cells and contributing to disease and, possibly, aging and senescence. The enzymes carrying out this metabolic pathway are also the target of many drugs and poisons that inhibit their activities.
## Chemiosmosis
Oxidative phosphorylation works by using energy-releasing chemical reactions to drive energy-requiring reactions. The two sets of reactions are said to be coupled. This means one cannot occur without the other. The chain of redox reactions driving the flow of electrons through the electron transport chain, from electron donors such as NADH to electron acceptors such as oxygen and hydrogen (protons), is an exergonic process – it releases energy, whereas the synthesis of ATP is an endergonic process, which requires an input of energy. Both the electron transport chain and the ATP synthase are embedded in a membrane, and energy is transferred from the electron transport chain to the ATP synthase by movements of protons across this membrane, in a process called chemiosmosis. A current of protons is driven from the negative N-side of the membrane to the positive P-side through the proton-pumping enzymes of the electron transport chain. The movement of protons creates an electrochemical gradient across the membrane, which is called the proton-motive force. It has two components: a difference in proton concentration (a H<sup>+</sup> gradient, ΔpH) and a difference in electric potential, with the N-side having a negative charge.
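The two components can be combined into a single figure using the standard relation for the proton-motive force, Δp ≈ Δψ + (2.303RT/F)·ΔpH, where ΔpH here is taken as how much more alkaline the N-side is than the P-side. The short sketch below is only an illustration: the membrane potential and pH difference are assumed, textbook-style mitochondrial values, not figures given in this article.

```python
# Minimal sketch combining the two components of the proton-motive force.
# The membrane potential and pH difference below are assumed, textbook-style
# mitochondrial values, not figures taken from this article.

R = 8.314        # gas constant, J/(mol*K)
F = 96485.0      # Faraday constant, C/mol
T = 310.0        # absolute temperature, K (about 37 degrees Celsius)

def proton_motive_force_mV(delta_psi_mV, delta_pH):
    """Proton-motive force in millivolts.

    delta_psi_mV : electrical potential across the membrane, P-side positive (mV)
    delta_pH     : how much more alkaline the N-side is than the P-side (pH units)
    """
    # 2.303*R*T/F converts one pH unit into its equivalent voltage (here in mV)
    pH_term_mV = 2.303 * R * T / F * 1000 * delta_pH
    return delta_psi_mV + pH_term_mV

print(proton_motive_force_mV(150, 0.5))   # roughly 180 mV for these assumed values
```

At body temperature each pH unit is worth about 60 mV, which is why mitochondria, with only a modest pH difference, rely mostly on the electrical term.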
ATP synthase releases this stored energy by completing the circuit and allowing protons to flow down the electrochemical gradient, back to the N-side of the membrane. The electrochemical gradient drives the rotation of part of the enzyme's structure and couples this motion to the synthesis of ATP.
The two components of the proton-motive force are thermodynamically equivalent: In mitochondria, the largest part of energy is provided by the potential; in alkaliphilic bacteria the electrical energy even has to compensate for a counteracting inverse pH difference. Conversely, chloroplasts operate mainly on ΔpH. However, they also require a small membrane potential for the kinetics of ATP synthesis. In the case of the fusobacterium Propionigenium modestum it drives the counter-rotation of subunits a and c of the F<sub>O</sub> motor of ATP synthase.
The amount of energy released by oxidative phosphorylation is high, compared with the amount produced by anaerobic fermentation. Glycolysis produces only 2 ATP molecules, but somewhere between 30 and 36 ATPs are produced by the oxidative phosphorylation of the 10 NADH and 2 succinate molecules made by converting one molecule of glucose to carbon dioxide and water, while each cycle of beta oxidation of a fatty acid yields about 14 ATPs. These ATP yields are theoretical maximum values; in practice, some protons leak across the membrane, lowering the yield of ATP.
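The yield figures above can be reproduced with some rough bookkeeping. The ATP-per-carrier ratios in the sketch below are commonly quoted assumptions rather than values stated in this article, which is why the total comes out as a range; sources also differ in how they count shuttle costs and substrate-level phosphorylation.

```python
# Rough ATP bookkeeping per molecule of glucose. The ATP-per-NADH and
# ATP-per-FADH2 ratios are commonly quoted assumptions, not exact values.

nadh_per_glucose = 10      # from glycolysis, pyruvate oxidation and the citric acid cycle
fadh2_per_glucose = 2      # succinate oxidation via complex II

for atp_per_nadh, atp_per_fadh2 in [(2.5, 1.5), (3.0, 2.0)]:
    oxphos_atp = nadh_per_glucose * atp_per_nadh + fadh2_per_glucose * atp_per_fadh2
    total = oxphos_atp + 2     # plus 2 ATP from substrate-level phosphorylation in glycolysis
    print(f"ATP/NADH={atp_per_nadh}, ATP/FADH2={atp_per_fadh2}: "
          f"~{oxphos_atp:.0f} ATP from oxidative phosphorylation, ~{total:.0f} in total")
```

With these assumptions the oxidative-phosphorylation contribution comes out at roughly 28 to 34 ATP per glucose, in the same region as the range quoted above; proton leak lowers the yield actually achieved.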
## Electron and proton transfer molecules
The electron transport chain carries both protons and electrons, passing electrons from donors to acceptors, and transporting protons across a membrane. These processes use both soluble and protein-bound transfer molecules. In the mitochondria, electrons are transferred within the intermembrane space by the water-soluble electron transfer protein cytochrome c. This carries only electrons, and these are transferred by the reduction and oxidation of an iron atom that the protein holds within a heme group in its structure. Cytochrome c is also found in some bacteria, where it is located within the periplasmic space.
Within the inner mitochondrial membrane, the lipid-soluble electron carrier coenzyme Q10 (Q) carries both electrons and protons by a redox cycle. This small benzoquinone molecule is very hydrophobic, so it diffuses freely within the membrane. When Q accepts two electrons and two protons, it becomes reduced to the ubiquinol form (QH<sub>2</sub>); when QH<sub>2</sub> releases two electrons and two protons, it becomes oxidized back to the ubiquinone (Q) form. As a result, if two enzymes are arranged so that Q is reduced on one side of the membrane and QH<sub>2</sub> oxidized on the other, ubiquinone will couple these reactions and shuttle protons across the membrane. Some bacterial electron transport chains use different quinones, such as menaquinone, in addition to ubiquinone.
Within proteins, electrons are transferred between flavin cofactors, iron–sulfur clusters and cytochromes. There are several types of iron–sulfur cluster. The simplest kind found in the electron transfer chain consists of two iron atoms joined by two atoms of inorganic sulfur; these are called [2Fe–2S] clusters. The second kind, called [4Fe–4S], contains a cube of four iron atoms and four sulfur atoms. Each iron atom in these clusters is coordinated by an additional amino acid, usually by the sulfur atom of cysteine. Metal ion cofactors undergo redox reactions without binding or releasing protons, so in the electron transport chain they serve solely to transport electrons through proteins. Electrons move quite long distances through proteins by hopping along chains of these cofactors. This occurs by quantum tunnelling, which is rapid over distances of less than 1.4×10<sup>−9</sup> m.
## Eukaryotic electron transport chains
Many catabolic biochemical processes, such as glycolysis, the citric acid cycle, and beta oxidation, produce the reduced coenzyme NADH. This coenzyme contains electrons that have a high transfer potential; in other words, they will release a large amount of energy upon oxidation. However, the cell does not release this energy all at once, as this would be an uncontrollable reaction. Instead, the electrons are removed from NADH and passed to oxygen through a series of enzymes that each release a small amount of the energy. This set of enzymes, consisting of complexes I through IV, is called the electron transport chain and is found in the inner membrane of the mitochondrion. Succinate is also oxidized by the electron transport chain, but feeds into the pathway at a different point.
In eukaryotes, the enzymes in this electron transport system use the energy released by the oxidation of NADH with O<sub>2</sub> to pump protons across the inner membrane of the mitochondrion. This causes protons to build up in the intermembrane space, and generates an electrochemical gradient across the membrane. The energy stored in this potential is then used by ATP synthase to produce ATP. Oxidative phosphorylation in the eukaryotic mitochondrion is the best-understood example of this process. The mitochondrion is present in almost all eukaryotes, with the exception of anaerobic protozoa such as Trichomonas vaginalis that instead reduce protons to hydrogen in a remnant mitochondrion called a hydrogenosome.
### NADH-coenzyme Q oxidoreductase (complex I)
NADH-coenzyme Q oxidoreductase, also known as NADH dehydrogenase or complex I, is the first protein in the electron transport chain. Complex I is a giant enzyme with the mammalian complex I having 46 subunits and a molecular mass of about 1,000 kilodaltons (kDa). The structure is known in detail only from a bacterium; in most organisms the complex resembles a boot with a large "ball" poking out from the membrane into the mitochondrion. The genes that encode the individual proteins are contained in both the cell nucleus and the mitochondrial genome, as is the case for many enzymes present in the mitochondrion.
The reaction that is catalyzed by this enzyme is the two-electron oxidation of NADH by coenzyme Q10 or ubiquinone (represented as Q in the equation below), a lipid-soluble quinone that is found in the mitochondrion membrane:
NADH + Q + 5H<sup>+</sup><sub>matrix</sub> → NAD<sup>+</sup> + QH<sub>2</sub> + 4H<sup>+</sup><sub>intermembrane</sub>
The start of the reaction, and indeed of the entire electron chain, is the binding of a NADH molecule to complex I and the donation of two electrons. The electrons enter complex I via a prosthetic group attached to the complex, flavin mononucleotide (FMN). The addition of electrons to FMN converts it to its reduced form, FMNH<sub>2</sub>. The electrons are then transferred through a series of iron–sulfur clusters: the second kind of prosthetic group present in the complex. There are both [2Fe–2S] and [4Fe–4S] iron–sulfur clusters in complex I.
As the electrons pass through this complex, four protons are pumped from the matrix into the intermembrane space. Exactly how this occurs is unclear, but it seems to involve conformational changes in complex I that cause the protein to bind protons on the N-side of the membrane and release them on the P-side of the membrane. Finally, the electrons are transferred from the chain of iron–sulfur clusters to a ubiquinone molecule in the membrane. Reduction of ubiquinone also contributes to the generation of a proton gradient, as two protons are taken up from the matrix as it is reduced to ubiquinol (QH<sub>2</sub>).
### Succinate-Q oxidoreductase (complex II)
Succinate-Q oxidoreductase, also known as complex II or succinate dehydrogenase, is a second entry point to the electron transport chain. It is unusual because it is the only enzyme that is part of both the citric acid cycle and the electron transport chain. Complex II consists of four protein subunits and contains a bound flavin adenine dinucleotide (FAD) cofactor, iron–sulfur clusters, and a heme group that does not participate in electron transfer to coenzyme Q, but is believed to be important in decreasing production of reactive oxygen species. It oxidizes succinate to fumarate and reduces ubiquinone. As this reaction releases less energy than the oxidation of NADH, complex II does not transport protons across the membrane and does not contribute to the proton gradient.
In some eukaryotes, such as the parasitic worm Ascaris suum, an enzyme similar to complex II, fumarate reductase (menaquinol:fumarate oxidoreductase, or QFR), operates in reverse to oxidize ubiquinol and reduce fumarate. This allows the worm to survive in the anaerobic environment of the large intestine, carrying out anaerobic oxidative phosphorylation with fumarate as the electron acceptor. Another unconventional function of complex II is seen in the malaria parasite Plasmodium falciparum. Here, the reversed action of complex II as an oxidase is important in regenerating ubiquinol, which the parasite uses in an unusual form of pyrimidine biosynthesis.
### Electron transfer flavoprotein-Q oxidoreductase
Electron transfer flavoprotein-ubiquinone oxidoreductase (ETF-Q oxidoreductase), also known as electron transferring-flavoprotein dehydrogenase, is a third entry point to the electron transport chain. It is an enzyme that accepts electrons from electron-transferring flavoprotein in the mitochondrial matrix, and uses these electrons to reduce ubiquinone. This enzyme contains a flavin and a [4Fe–4S] cluster, but, unlike the other respiratory complexes, it attaches to the surface of the membrane and does not cross the lipid bilayer.
In mammals, this metabolic pathway is important in beta oxidation of fatty acids and catabolism of amino acids and choline, as it accepts electrons from multiple acyl-CoA dehydrogenases. In plants, ETF-Q oxidoreductase is also important in the metabolic responses that allow survival in extended periods of darkness.
### Q-cytochrome c oxidoreductase (complex III)
Q-cytochrome c oxidoreductase is also known as cytochrome c reductase, cytochrome bc<sub>1</sub> complex, or simply complex III. In mammals, this enzyme is a dimer, with each subunit complex containing 11 protein subunits, a [2Fe–2S] iron–sulfur cluster and three cytochromes: one cytochrome c<sub>1</sub> and two b cytochromes. A cytochrome is a kind of electron-transferring protein that contains at least one heme group. The iron atoms inside complex III's heme groups alternate between a reduced ferrous (+2) and oxidized ferric (+3) state as the electrons are transferred through the protein.
The reaction catalyzed by complex III is the oxidation of one molecule of ubiquinol and the reduction of two molecules of cytochrome c, a heme protein loosely associated with the mitochondrion. Unlike coenzyme Q, which carries two electrons, cytochrome c carries only one electron.
As only one of the electrons can be transferred from the QH<sub>2</sub> donor to a cytochrome c acceptor at a time, the reaction mechanism of complex III is more elaborate than those of the other respiratory complexes, and occurs in two steps called the Q cycle. In the first step, the enzyme binds three substrates, first, QH<sub>2</sub>, which is then oxidized, with one electron being passed to the second substrate, cytochrome c. The two protons released from QH<sub>2</sub> pass into the intermembrane space. The third substrate is Q, which accepts the second electron from the QH<sub>2</sub> and is reduced to Q<sup>.−</sup>, which is the ubisemiquinone free radical. The first two substrates are released, but this ubisemiquinone intermediate remains bound. In the second step, a second molecule of QH<sub>2</sub> is bound and again passes its first electron to a cytochrome c acceptor. The second electron is passed to the bound ubisemiquinone, reducing it to QH<sub>2</sub> as it gains two protons from the mitochondrial matrix. This QH<sub>2</sub> is then released from the enzyme.
As coenzyme Q is reduced to ubiquinol on the inner side of the membrane and oxidized to ubiquinone on the other, a net transfer of protons across the membrane occurs, adding to the proton gradient. The rather complex two-step mechanism by which this occurs is important, as it increases the efficiency of proton transfer. If, instead of the Q cycle, one molecule of QH<sub>2</sub> were used to directly reduce two molecules of cytochrome c, the efficiency would be halved, with only one proton transferred per cytochrome c reduced.
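The proton bookkeeping behind this efficiency argument follows directly from the two steps just described. A minimal tally for one complete Q cycle (two oxidations at the outer Q site plus one re-reduction at the inner site), written out as a sketch:

```python
# Proton bookkeeping for one complete Q cycle in complex III, following the
# two steps described above.

def q_cycle_tally():
    protons_released_p_side = 0   # into the intermembrane space
    protons_taken_n_side = 0      # from the mitochondrial matrix
    cytochrome_c_reduced = 0

    # Steps 1 and 2: each oxidizes one QH2 at the outer (Qo) site, releasing
    # two protons to the P-side and passing one electron to cytochrome c.
    for _ in range(2):
        protons_released_p_side += 2
        cytochrome_c_reduced += 1

    # At the inner (Qi) site one Q is re-reduced to QH2, taking two protons
    # from the matrix.
    protons_taken_n_side += 2

    return protons_released_p_side, protons_taken_n_side, cytochrome_c_reduced

p_side, n_side, cyt_c = q_cycle_tally()
print(f"{p_side} H+ released per {cyt_c} cytochrome c reduced "
      f"({p_side // cyt_c} per cytochrome c); {n_side} H+ taken from the matrix")
```

The tally gives two protons delivered to the P-side per cytochrome c reduced, twice what a single direct oxidation of QH<sub>2</sub> would achieve.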
### Cytochrome c oxidase (complex IV)
Cytochrome c oxidase, also known as complex IV, is the final protein complex in the electron transport chain. The mammalian enzyme has an extremely complicated structure and contains 13 subunits, two heme groups, as well as multiple metal ion cofactors – in all, three atoms of copper, one of magnesium and one of zinc.
This enzyme mediates the final reaction in the electron transport chain and transfers electrons to oxygen and hydrogen (protons), while pumping protons across the membrane. The final electron acceptor oxygen is reduced to water in this step. Both the direct pumping of protons and the consumption of matrix protons in the reduction of oxygen contribute to the proton gradient. The reaction catalyzed is the oxidation of cytochrome c and the reduction of oxygen:
4 Fe<sup>2+</sup>-cytochrome c + 8H<sup>+</sup><sub>matrix</sub> + O<sub>2</sub> → 4 Fe<sup>3+</sup>-cytochrome c + 2H<sub>2</sub>O + 4H<sup>+</sup><sub>intermembrane</sub>
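Putting the three proton-pumping complexes together gives a rough proton budget for each NADH oxidized. The per-complex counts in the sketch below (4, 4 and 2 protons translocated per electron pair at complexes I, III and IV) are commonly quoted textbook stoichiometries, used here as assumptions; only the complex I figure is stated explicitly earlier in this article.

```python
# Rough proton budget for the oxidation of one NADH by the electron transport
# chain. Per-complex counts are commonly quoted stoichiometries, assumed here
# for illustration.

protons_pumped = {
    "complex I": 4,     # stated in the complex I section above
    "complex III": 4,   # delivered to the P-side per electron pair via the Q cycle
    "complex IV": 2,    # pumped per electron pair (matrix protons are also consumed chemically)
}

total_protons = sum(protons_pumped.values())     # about 10 H+ per NADH
assumed_protons_per_atp = 3.7                    # within the three-to-four range quoted later
print(f"~{total_protons} H+ per NADH -> "
      f"~{total_protons / assumed_protons_per_atp:.1f} ATP per NADH")
```

On these assumptions each NADH supports roughly 2.7 ATP, consistent with modern estimates of about 2.5 once leak and transport costs are included.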
### Alternative reductases and oxidases
Many eukaryotic organisms have electron transport chains that differ from the much-studied mammalian enzymes described above. For example, plants have alternative NADH oxidases, which oxidize NADH in the cytosol rather than in the mitochondrial matrix, and pass these electrons to the ubiquinone pool. These enzymes do not transport protons, and, therefore, reduce ubiquinone without altering the electrochemical gradient across the inner membrane.
Another example of a divergent electron transport chain is the alternative oxidase, which is found in plants, as well as some fungi, protists, and possibly some animals. This enzyme transfers electrons directly from ubiquinol to oxygen.
The electron transport pathways produced by these alternative NADH and ubiquinone oxidases have lower ATP yields than the full pathway. The advantages produced by a shortened pathway are not entirely clear. However, the alternative oxidase is produced in response to stresses such as cold, reactive oxygen species, and infection by pathogens, as well as other factors that inhibit the full electron transport chain. Alternative pathways might, therefore, enhance an organism's resistance to injury, by reducing oxidative stress.
### Organization of complexes
The original model for how the respiratory chain complexes are organized was that they diffuse freely and independently in the mitochondrial membrane. However, recent data suggest that the complexes might form higher-order structures called supercomplexes or "respirasomes". In this model, the various complexes exist as organized sets of interacting enzymes. These associations might allow channeling of substrates between the various enzyme complexes, increasing the rate and efficiency of electron transfer. Within such mammalian supercomplexes, some components would be present in higher amounts than others, with some data suggesting a ratio between complexes I/II/III/IV and the ATP synthase of approximately 1:1:3:7:4. However, the debate over this supercomplex hypothesis is not completely resolved, as some data do not appear to fit with this model.
## Prokaryotic electron transport chains
In contrast to the general similarity in structure and function of the electron transport chains in eukaryotes, bacteria and archaea possess a large variety of electron-transfer enzymes. These use an equally wide set of chemicals as substrates. In common with eukaryotes, prokaryotic electron transport uses the energy released from the oxidation of a substrate to pump ions across a membrane and generate an electrochemical gradient. In the bacteria, oxidative phosphorylation in Escherichia coli is understood in most detail, while archaeal systems are at present poorly understood.
The main difference between eukaryotic and prokaryotic oxidative phosphorylation is that bacteria and archaea use many different substances to donate or accept electrons. This allows prokaryotes to grow under a wide variety of environmental conditions. In E. coli, for example, oxidative phosphorylation can be driven by a large number of pairs of reducing agents and oxidizing agents, which are listed below. The midpoint potential of a chemical measures how much energy is released when it is oxidized or reduced, with reducing agents having negative potentials and oxidizing agents positive potentials.
As shown above, E. coli can grow with reducing agents such as formate, hydrogen, or lactate as electron donors, and nitrate, DMSO, or oxygen as acceptors. The larger the difference in midpoint potential between an oxidizing and reducing agent, the more energy is released when they react. Out of these compounds, the succinate/fumarate pair is unusual, as its midpoint potential is close to zero. Succinate can therefore be oxidized to fumarate if a strong oxidizing agent such as oxygen is available, or fumarate can be reduced to succinate using a strong reducing agent such as formate. These alternative reactions are catalyzed by succinate dehydrogenase and fumarate reductase, respectively.
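The link between the separation in midpoint potential and the energy released can be made explicit with the standard relation ΔG°′ = −nFΔE°′. The midpoint potentials used in the sketch below are typical textbook values, assumed here for illustration rather than taken from this article.

```python
# Free energy released when n electrons pass from a donor couple to an
# acceptor couple: Delta G = -n * F * Delta E. Midpoint potentials (in volts)
# are typical textbook values, assumed for illustration.

F = 96485.0  # Faraday constant, C/mol

midpoint_potential_V = {
    "NADH/NAD+": -0.32,
    "formate/CO2": -0.43,
    "succinate/fumarate": +0.03,
    "nitrate/nitrite": +0.42,
    "O2/H2O": +0.82,
}

def delta_g_kj(donor, acceptor, n=2):
    """Standard free energy change (kJ/mol) for an n-electron transfer."""
    delta_e = midpoint_potential_V[acceptor] - midpoint_potential_V[donor]
    return -n * F * delta_e / 1000.0

print(delta_g_kj("NADH/NAD+", "O2/H2O"))             # about -220 kJ/mol
print(delta_g_kj("formate/CO2", "nitrate/nitrite"))  # about -164 kJ/mol
print(delta_g_kj("succinate/fumarate", "O2/H2O"))    # about -152 kJ/mol
```

The larger the gap in midpoint potential, the more negative ΔG°′ and the more energy is available for pumping protons.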
Some prokaryotes use redox pairs that have only a small difference in midpoint potential. For example, nitrifying bacteria such as Nitrobacter oxidize nitrite to nitrate, donating the electrons to oxygen. The small amount of energy released in this reaction is enough to pump protons and generate ATP, but not enough to produce NADH or NADPH directly for use in anabolism. This problem is solved by using a nitrite oxidoreductase to produce enough proton-motive force to run part of the electron transport chain in reverse, causing complex I to generate NADH.
Prokaryotes control their use of these electron donors and acceptors by varying which enzymes are produced, in response to environmental conditions. This flexibility is possible because different oxidases and reductases use the same ubiquinone pool. This allows many combinations of enzymes to function together, linked by the common ubiquinol intermediate. These respiratory chains therefore have a modular design, with easily interchangeable sets of enzyme systems.
In addition to this metabolic diversity, prokaryotes also possess a range of isozymes – different enzymes that catalyze the same reaction. For example, in E. coli, there are two different types of ubiquinol oxidase using oxygen as an electron acceptor. Under highly aerobic conditions, the cell uses an oxidase with a low affinity for oxygen that can transport two protons per electron. However, if levels of oxygen fall, it switches to an oxidase that transfers only one proton per electron, but has a high affinity for oxygen.
## ATP synthase (complex V)
ATP synthase, also called complex V, is the final enzyme in the oxidative phosphorylation pathway. This enzyme is found in all forms of life and functions in the same way in both prokaryotes and eukaryotes. The enzyme uses the energy stored in a proton gradient across a membrane to drive the synthesis of ATP from ADP and phosphate (P<sub>i</sub>). Estimates of the number of protons required to synthesize one ATP have ranged from three to four, with some suggesting cells can vary this ratio, to suit different conditions.
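The non-integer three-to-four figure drops out of the rotary mechanism described below: each proton crossing the membrane advances the c-ring by one subunit, and one full revolution drives the synthesis of three ATP. The c-ring sizes in the sketch are values reported for various organisms, included here as an illustrative assumption rather than data from this article.

```python
# Protons needed per ATP, assuming one proton per c subunit per revolution
# and three ATP synthesized per revolution of the F1 head. The c-ring sizes
# below are reported values for various organisms, used as illustrative input.

c_ring_sizes = {
    "mammalian mitochondria": 8,
    "yeast mitochondria": 10,
    "E. coli": 10,
    "some alkaliphilic bacteria": 13,
}

for organism, c_subunits in c_ring_sizes.items():
    protons_per_atp = c_subunits / 3
    print(f"{organism}: c-ring of {c_subunits} -> "
          f"{protons_per_atp:.2f} H+ per ATP at the synthase")

# In mitochondria roughly one further proton per ATP is usually added for
# importing phosphate and exchanging ADP for ATP across the inner membrane.
```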
This phosphorylation reaction is an equilibrium, which can be shifted by altering the proton-motive force. In the absence of a proton-motive force, the ATP synthase reaction will run in reverse, hydrolyzing ATP and pumping protons out of the matrix across the membrane. However, when the proton-motive force is high, the reaction is forced to run in the forward direction, allowing protons to flow down their concentration gradient and turning ADP into ATP. Indeed, in the closely related vacuolar type H+-ATPases, the hydrolysis reaction is used to acidify cellular compartments, by pumping protons and hydrolysing ATP.
ATP synthase is a massive protein complex with a mushroom-like shape. The mammalian enzyme complex contains 16 subunits and has a mass of approximately 600 kilodaltons. The portion embedded within the membrane is called F<sub>O</sub> and contains a ring of c subunits and the proton channel. The stalk and the ball-shaped headpiece are called F<sub>1</sub> and are the site of ATP synthesis. The ball-shaped complex at the end of the F<sub>1</sub> portion contains six proteins of two different kinds (three α subunits and three β subunits), whereas the "stalk" consists of one protein: the γ subunit, with the tip of the stalk extending into the ball of α and β subunits. Both the α and β subunits bind nucleotides, but only the β subunits catalyze the ATP synthesis reaction. Reaching along the side of the F<sub>1</sub> portion and back into the membrane is a long rod-like subunit that anchors the α and β subunits into the base of the enzyme.
As protons cross the membrane through the channel in the base of ATP synthase, the F<sub>O</sub> proton-driven motor rotates. Rotation might be caused by changes in the ionization of amino acids in the ring of c subunits causing electrostatic interactions that propel the ring of c subunits past the proton channel. This rotating ring in turn drives the rotation of the central axle (the γ subunit stalk) within the α and β subunits. The α and β subunits are prevented from rotating themselves by the side-arm, which acts as a stator. This movement of the tip of the γ subunit within the ball of α and β subunits provides the energy for the active sites in the β subunits to undergo a cycle of movements that produces and then releases ATP.
This ATP synthesis reaction is called the binding change mechanism and involves the active site of a β subunit cycling between three states. In the "open" state, ADP and phosphate enter the active site. The protein then closes up around the molecules and binds them loosely – the "loose" state. The enzyme then changes shape again and forces these molecules together, with the active site in the resulting "tight" state binding the newly produced ATP molecule with very high affinity. Finally, the active site cycles back to the open state, releasing ATP and binding more ADP and phosphate, ready for the next cycle.
In some bacteria and archaea, ATP synthesis is driven by the movement of sodium ions through the cell membrane, rather than the movement of protons. Archaea such as Methanococcus also contain the A<sub>1</sub>A<sub>o</sub> synthase, a form of the enzyme that contains additional proteins with little similarity in sequence to other bacterial and eukaryotic ATP synthase subunits. It is possible that, in some species, the A<sub>1</sub>A<sub>o</sub> form of the enzyme is a specialized sodium-driven ATP synthase, but this might not be true in all cases.
## Oxidative phosphorylation - energetics
The transport of electrons from the redox pair NAD<sup>+</sup>/NADH to the final redox pair 1/2 O<sub>2</sub>/H<sub>2</sub>O can be summarized as
1/2 O<sub>2</sub> + NADH + H<sup>+</sup> → H<sub>2</sub>O + NAD<sup>+</sup>
The potential difference between these two redox pairs is 1.14 volts, which is equivalent to -52 kcal/mol of NADH oxidized, or about -2600 kJ per 6 mol of O<sub>2</sub>.
When one NADH is oxidized through the electron transfer chain, three ATPs are produced, which is equivalent to 7.3 kcal/mol x 3 = 21.9 kcal/mol.
The efficiency of energy conservation can be calculated with the following formula
Efficiency = (21.9 / 52) x 100% ≈ 42%
So when NADH is oxidized, about 42% of the energy is conserved in the form of three ATPs and the remaining 58% is lost as heat. Because the free energy of ATP hydrolysis under physiological conditions is considerably larger than the standard 7.3 kcal/mol, the fraction of energy actually conserved in the cell is somewhat higher than this estimate.
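The arithmetic above is easy to check, and it is also worth seeing how sensitive the answer is to the value taken for ATP. The sketch below reuses the 7.3 and 52 kcal/mol figures from this section, and adds an assumed physiological free energy of ATP hydrolysis of about 12 kcal/mol purely for comparison.

```python
# Reproducing the efficiency estimate above. The 7.3 and 52 kcal/mol figures
# come from this section; the ~12 kcal/mol physiological value for ATP is a
# commonly quoted assumption, included only to show the sensitivity.

nadh_oxidation_kcal = 52.0    # energy released per NADH oxidized by O2
atp_per_nadh = 3

for label, atp_kcal in [("standard", 7.3), ("assumed physiological", 12.0)]:
    efficiency = atp_per_nadh * atp_kcal / nadh_oxidation_kcal * 100
    print(f"{label} ATP value ({atp_kcal} kcal/mol): {efficiency:.0f}% conserved")
# standard: 42%; assumed physiological: about 69%
```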
## Reactive oxygen species
Molecular oxygen is a good terminal electron acceptor because it is a strong oxidizing agent. The reduction of oxygen does involve potentially harmful intermediates. Although the transfer of four electrons and four protons reduces oxygen to water, which is harmless, transfer of one or two electrons produces superoxide or peroxide anions, which are dangerously reactive.
These reactive oxygen species and their reaction products, such as the hydroxyl radical, are very harmful to cells, as they oxidize proteins and cause mutations in DNA. This cellular damage may contribute to disease and is proposed as one cause of aging.
The cytochrome c oxidase complex is highly efficient at reducing oxygen to water, and it releases very few partly reduced intermediates; however small amounts of superoxide anion and peroxide are produced by the electron transport chain. Particularly important is the reduction of coenzyme Q in complex III, as a highly reactive ubisemiquinone free radical is formed as an intermediate in the Q cycle. This unstable species can lead to electron "leakage" when electrons transfer directly to oxygen, forming superoxide. As the production of reactive oxygen species by these proton-pumping complexes is greatest at high membrane potentials, it has been proposed that mitochondria regulate their activity to maintain the membrane potential within a narrow range that balances ATP production against oxidant generation. For instance, oxidants can activate uncoupling proteins that reduce membrane potential.
To counteract these reactive oxygen species, cells contain numerous antioxidant systems, including antioxidant vitamins such as vitamin C and vitamin E, and antioxidant enzymes such as superoxide dismutase, catalase, and peroxidases, which detoxify the reactive species, limiting damage to the cell.
## Oxidative phosphorylation in hypoxic/anoxic conditions
As oxygen is fundamental for oxidative phosphorylation, a shortage in O<sub>2</sub> level can alter ATP production rates. Under anoxic conditions, ATP synthase will commit 'cellular treason' and run in reverse, forcing protons from the matrix back into the intermembrane space, using up ATP in the process. The proton motive force and ATP production can be maintained by intracellular acidosis. Cytosolic protons that have accumulated with ATP hydrolysis and lactic acidosis can freely diffuse across the mitochondrial outer membrane and acidify the intermembrane space, hence directly contributing to the proton motive force and ATP production.
## Inhibitors
There are several well-known drugs and toxins that inhibit oxidative phosphorylation. Although any one of these toxins inhibits only one enzyme in the electron transport chain, inhibition of any step in this process will halt the rest of the process. For example, if oligomycin inhibits ATP synthase, protons cannot pass back into the mitochondrion. As a result, the proton pumps are unable to operate, as the gradient becomes too strong for them to overcome. NADH is then no longer oxidized and the citric acid cycle ceases to operate because the concentration of NAD<sup>+</sup> falls below the concentration that these enzymes can use.
Many site-specific inhibitors of the electron transport chain have contributed to the present knowledge of mitochondrial respiration. Synthesis of ATP is also dependent on the electron transport chain, so all site-specific inhibitors also inhibit ATP formation. The fish poison rotenone, the barbiturate drug amytal, and the antibiotic piericidin A inhibit the transfer of electrons from NADH to coenzyme Q, blocking complex I.
Carbon monoxide, cyanide, hydrogen sulphide and azide effectively inhibit cytochrome oxidase. Carbon monoxide reacts with the reduced form of the cytochrome while cyanide and azide react with the oxidised form. An antibiotic, antimycin A, and British anti-Lewisite, an antidote used against chemical weapons, are two important inhibitors of the site between cytochromes b and c<sub>1</sub> (complex III).
Not all inhibitors of oxidative phosphorylation are toxins. In brown adipose tissue, regulated proton channels called uncoupling proteins can uncouple respiration from ATP synthesis. This rapid respiration produces heat, and is particularly important as a way of maintaining body temperature for hibernating animals, although these proteins may also have a more general function in cells' responses to stress.
## History
The field of oxidative phosphorylation began with the report in 1906 by Arthur Harden of a vital role for phosphate in cellular fermentation, but initially only sugar phosphates were known to be involved. However, in the early 1940s, the link between the oxidation of sugars and the generation of ATP was firmly established by Herman Kalckar, confirming the central role of ATP in energy transfer that had been proposed by Fritz Albert Lipmann in 1941. Later, in 1949, Morris Friedkin and Albert L. Lehninger proved that the coenzyme NADH linked metabolic pathways such as the citric acid cycle and the synthesis of ATP. The term oxidative phosphorylation was coined in 1939.
For another twenty years, the mechanism by which ATP is generated remained mysterious, with scientists searching for an elusive "high-energy intermediate" that would link oxidation and phosphorylation reactions. This puzzle was solved by Peter D. Mitchell with the publication of the chemiosmotic theory in 1961. At first, this proposal was highly controversial, but it was slowly accepted and Mitchell was awarded a Nobel prize in 1978. Subsequent research concentrated on purifying and characterizing the enzymes involved, with major contributions being made by David E. Green on the complexes of the electron-transport chain, as well as Efraim Racker on the ATP synthase. A critical step towards solving the mechanism of the ATP synthase was provided by Paul D. Boyer, by his development in 1973 of the "binding change" mechanism, followed by his radical proposal of rotational catalysis in 1982. More recent work has included structural studies on the enzymes involved in oxidative phosphorylation by John E. Walker, with Walker and Boyer being awarded a Nobel Prize in 1997.
## See also
- Respirometry
- TIM/TOM Complex
# Mensa (constellation)
Mensa is a constellation in the Southern Celestial Hemisphere near the south celestial pole, one of fourteen constellations drawn up in the 18th century by French astronomer Nicolas-Louis de Lacaille. Its name is Latin for table, though it originally commemorated Table Mountain and was known as "Mons Mensae". One of the eighty-eight constellations designated by the International Astronomical Union (IAU), it covers a keystone-shaped wedge of sky 153.5 square degrees in area. Other than the south polar constellation of Octans, it is the most southerly of constellations and is observable only south of the 5th parallel of the Northern Hemisphere.
One of the faintest constellations in the night sky, Mensa contains no apparently bright stars—the brightest, Alpha Mensae, is barely visible in suburban skies. Part of the Large Magellanic Cloud, several star clusters and a quasar lie in the area covered by the constellation, and at least three of its star systems have been found to have exoplanets.
## History
Originally named "Montagne de la Table" or "Mons Mensae", Mensa was created by Nicolas-Louis de Lacaille out of dim Southern Hemisphere stars in honor of Table Mountain, a South African mountain overlooking Cape Town, near the location of Lacaille's observatory. He recalled that the Magellanic Clouds were sometimes known as Cape clouds, and that Table Mountain was often covered in clouds when a southeasterly stormy wind blew. Hence he made a "table" in the sky under the clouds. Lacaille had observed and catalogued 10,000 southern stars during a two-year stay at the Cape of Good Hope. He devised 14 new constellations in uncharted regions of the Southern Celestial Hemisphere not visible from Europe. Mensa was the only constellation that did not honor an instrument symbolic of the Age of Enlightenment. Sir John Herschel proposed shrinking the name to one word in 1844, noting that Lacaille himself had abbreviated some of his constellations thus.
Although the stars of Mensa do not feature in any ancient mythology, the mountain it is named after has a rich mythology. Called "Tafelberg" in Dutch and German, it has two neighboring mountains called "Devil's Peak" and "Lion's Head". Table Mountain features in the mythology of the Cape of Good Hope, notorious for its storms. Explorer Bartolomeu Dias saw the mountain as a mythical anvil for storms.
## Characteristics
Mensa is bordered by Dorado to the north, Hydrus to the northwest and west, Octans to the south, Chamaeleon to the east and Volans to the northeast. Covering 153.5 square degrees and 0.372% of the night sky, it ranks 75th of the 88 constellations in size. The three-letter abbreviation for the constellation, as adopted by the IAU in 1922, is "Men". The official constellation boundaries, as set by Belgian astronomer Eugène Delporte in 1930, are defined by a polygon of eight segments. In the equatorial coordinate system, the right ascension coordinates of these borders lie between and , while the declination coordinates are between −69.75° and −85.26°. The whole constellation is visible to observers south of latitude 5°N.
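The 0.372% figure follows directly from the constellation's area and the roughly 41,253 square degrees of the whole celestial sphere; a minimal check:

```python
import math

# The celestial sphere covers 4*pi steradians; one steradian is (180/pi)^2
# square degrees, so the whole sky is about 41,253 square degrees.
total_sky_sq_deg = 4 * math.pi * (180 / math.pi) ** 2

mensa_sq_deg = 153.5
print(f"{mensa_sq_deg / total_sky_sq_deg * 100:.3f}% of the sky")   # 0.372%
```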
## Features
### Stars
#### Bright stars
Lacaille gave eleven stars in the constellation Bayer designations, using the Greek alphabet to label them Alpha through to Lambda Mensae (excluding Kappa). Gould later added Kappa, Mu, Nu, Xi and Pi Mensae. Stars as dim as these were not generally given designations; however, Gould felt their closeness to the South Celestial Pole warranted their naming. Alpha Mensae, the brightest star, has a barely visible apparent magnitude of 5.09, making Mensa the only constellation with no star brighter than magnitude 5.0. Overall, there are 22 stars within the constellation's borders brighter than or equal to apparent magnitude 6.5.
- Alpha Mensae is a solar-type star (class G7V) 33.32 ± 0.02 light-years from Earth. It came to within 11 light-years of Earth around 250,000 years ago and would have been considerably brighter back then, at nearly second magnitude (a rough estimate of this brightening is sketched after this list). An infrared excess has been detected around this star, indicating the presence of a circumstellar disk at a radius of over 147 astronomical units (AU). The estimated temperature of this dust is below 22 K. However, data from the Herschel Space Observatory failed to confirm this excess, leaving the finding in doubt. No planetary companions have yet been discovered around it. It has a red dwarf companion star at an angular separation of 3.05 arcseconds, equivalent to a projected separation of roughly 30 AU.
- Gamma Mensae is the second-brightest star in the constellation, at magnitude 5.19. Located 104.9 ± 0.5 light-years from Earth, it is an ageing (10.6 billion year-old) star around 1.04 times as massive as the Sun. It has swollen to around 5 times the solar radius, becoming an orange giant of spectral type K2III.
- Beta Mensae is slightly fainter at magnitude 5.31. Located 660 ± 10 light-years from Earth, it is a yellow giant of spectral type G8III, around 3.6 times as massive and 513 times as luminous as the Sun. It is 270 million years old, and lies in front of the Large Magellanic Cloud.
- Zeta and Eta Mensae have infrared excesses suggesting they too have circumstellar disks of dust. Zeta Mensae is an ageing white giant of spectral type A5 III around 394 ± 4 light-years from Earth, and Eta Mensae is an orange giant of spectral type K4 III, lying 650 ± 10 light-years away from Earth.
- Pi Mensae is a solar-type (G1) star 59.62 ± 0.07 light-years distant. In 2001, a substellar companion was discovered in an eccentric orbit. Incorporating more accurate Hipparcos data yields a mass for the companion of anywhere from 10.27 to 29.9 times that of Jupiter, confirming its substellar nature; the upper limit puts it in the brown dwarf range. The discovery of a second substellar companion—a super-Earth—was announced on 16 September 2018. It takes 6.27 days to complete its orbit and was the first exoplanet detection from the Transiting Exoplanet Survey Satellite (TESS) to be submitted for publication.
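The brightening of Alpha Mensae during its close approach, noted in the first entry above, can be estimated with the distance modulus relation; a rough sketch (using the two distances quoted above and assuming the star's intrinsic luminosity was unchanged):

```latex
% Apparent-magnitude change when the distance shrinks from 33.3 to 11 light-years
\[
\Delta m = 5 \log_{10}\!\left(\frac{d_{\text{now}}}{d_{\text{then}}}\right)
         = 5 \log_{10}\!\left(\frac{33.3~\text{ly}}{11~\text{ly}}\right) \approx 2.4,
\qquad m_{\text{then}} \approx 5.09 - 2.4 \approx 2.7
\]
```

A change of 2.4 magnitudes corresponds to the star appearing roughly nine times brighter than it does today.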
#### Planet-hosting stars
- HD 38283 (Bubup) is a Sun-like star of spectral type F9.5V of magnitude 6.7, located 124.3 ± 0.1 light-years distant. In 2011, a gas giant with an Earth-like orbital period of 363 days and a minimum mass a third that of Jupiter was discovered by the radial velocity method.
- HD 39194 is an orange dwarf of spectral type K0V and magnitude 8.08, located 86.21 ± 0.09 light-years distant. Three planets in close orbit were discovered by the High Accuracy Radial Velocity Planet Searcher (HARPS) in 2011. The three take 5.6, 14 and 34 days to complete an orbit around their star, and have minimum masses 3.72, 5.94 and 5.14 times that of the Earth respectively.
#### Variable stars
- TZ Mensae is an eclipsing binary that varies between magnitude 6.2 and 6.9 every 8.57 days. It is composed of two white main sequence stars in close orbit. One of these is of spectral type A0V, has a radius twice that of the Sun and is 2.5 times as massive. The other, an A8V spectral type, has a radius 1.4 times that of the Sun and is 1.5 times as massive.
- UX Mensae is another eclipsing binary system composed of two young stars around 1.2 times as massive as the Sun and 2.2 ± 0.5 billion years of age, orbiting each other every 4.19 days. The system is 338.2 ± 0.9 light-years distant.
- TY Mensae is another eclipsing binary system classified as a W Ursae Majoris variable; the two components are so close that they share a common envelope of stellar material. The larger star has been calculated to be 1.86 times as massive as the Sun, with 1.85 times its diameter and 13.6 times its luminosity, while the smaller is 0.4 times as massive, with 0.84 times the diameter and 1.7 times the luminosity of the Sun. Their surface temperatures have been calculated at 8164 and 7183 K respectively; the quoted luminosities follow from these radii and temperatures, as sketched after this list.
- YY Mensae is an orange giant of spectral type K1III around 2.2 times as massive as the Sun, with 12.7 times its diameter and 70 times its luminosity. A rapidly rotating star with a period of 9.5 days, it is a strong emitter of X-rays and belongs to a class of star known as FK Comae Berenices variables. These stars are thought to have formed with the merger of two stars in a contact binary system. With an apparent magnitude of 8.05, it is 707 ± 6 light-years distant.
- AH Mensae is a cataclysmic variable star system composed of a white dwarf and a red dwarf that orbit each other every 2 hours 57 minutes. The stars are close enough that the white dwarf strips material off the red dwarf, creating an accretion disc that periodically ignites with a resulting brightening of the system.
- TU Mensae is another cataclysmic variable composed of a red dwarf and white dwarf. The orbital period of 2 hours 49 minutes is one of the longest for cataclysmic variable systems exhibiting brighter outbursts, known as superhumps. The normal outbursts result in an increase in brightness lasting around a day every 37 days, while the superhumps last 5–20 days and take place every 194 days.
- AO Mensae is a faint star of magnitude 9.8. An orange dwarf that has 80% the size and mass of the Sun, it is also a BY Draconis variable. These are a class of stars with starspots prominent enough that the star changes brightness as it rotates. It is a member of the Beta Pictoris moving group, a loose association of young stars moving across the galaxy.
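The luminosities given for the TY Mensae components above follow from their radii and surface temperatures through the Stefan–Boltzmann law; an approximate check (taking the Sun's effective temperature as about 5,772 K, a value not stated in this article):

```latex
% Larger component of TY Mensae: luminosity from radius and temperature
\[
\frac{L}{L_\odot} = \left(\frac{R}{R_\odot}\right)^{2}\left(\frac{T}{T_\odot}\right)^{4}
                  = (1.85)^{2}\left(\frac{8164~\text{K}}{5772~\text{K}}\right)^{4} \approx 13.7
\]
```

The same relation applied to the smaller component gives about 1.7 times the Sun's luminosity, matching the figures quoted above.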
#### Other stars
- WISE 0535−7500 is a binary system composed of two sub-brown dwarfs of spectral class cooler than Y1 located 47 ± 3 light-years away. Unable to be separated by observations to date, they are presumed to be of similar mass—8 to 20 times that of Jupiter—and are less than one AU apart.
### Deep-sky objects
The Large Magellanic Cloud lies partially within Mensa's boundaries, although most of it lies in neighbouring Dorado. It is a satellite galaxy of the Milky Way, located at a distance of 163,000 light-years. Among its stars within Mensa are W Mensae, an unusual yellow-white supergiant that belongs to a rare class of star known as an R Coronae Borealis variable, HD 268835, a blue hypergiant that is girded by a vast circumstellar disk of dust, and R71, a luminous blue variable star that brightened in 2012 to over a million times as luminous as the Sun. Also within the galaxy are NGC 1987, a globular cluster estimated to be around 600 million years old that has a significant number of red ageing stars, and NGC 1848, a 27-million-year-old open cluster. Mensa contains several described open clusters, most of which can only be clearly observed with large telescopes.
PKS 0637-752 is a distant quasar with a calculated redshift of z = 0.651. It was chosen as the first target of the then newly operational Chandra X-ray Observatory in 1999. The resulting images revealed a gas jet approximately 330,000 light-years long. It is visible at radio, optical and X-ray wavelengths.
# Perseus (constellation)
Perseus is a constellation in the northern sky, named after the Greek mythological hero Perseus. It is one of the 48 ancient constellations listed by the 2nd-century astronomer Ptolemy, and among the 88 modern constellations defined by the International Astronomical Union (IAU). It is located near several other constellations named after ancient Greek legends surrounding Perseus, including Andromeda to the west and Cassiopeia to the north. Perseus is also bordered by Aries and Taurus to the south, Auriga to the east, Camelopardalis to the north, and Triangulum to the west. Some star atlases during the early 19th century also depicted Perseus holding the disembodied head of Medusa, whose asterism was named together as Perseus et Caput Medusae; however, this never came into popular usage.
The galactic plane of the Milky Way passes through Perseus, whose brightest star is the yellow-white supergiant Alpha Persei (also called Mirfak), which shines at magnitude 1.79. It and many of the surrounding stars are members of an open cluster known as the Alpha Persei Cluster. The best-known star, however, is Algol (Beta Persei), linked with ominous legends because of its variability, which is noticeable to the naked eye. Rather than being an intrinsically variable star, it is an eclipsing binary. Other notable star systems in Perseus include X Persei, a binary system containing a neutron star, and GK Persei, a nova that peaked at magnitude 0.2 in 1901. The Double Cluster, comprising two open clusters quite near each other in the sky, was known to the ancient Chinese. The constellation gives its name to the Perseus cluster (Abell 426), a massive galaxy cluster located 250 million light-years from Earth. It hosts the radiant of the annual Perseids meteor shower—one of the most prominent meteor showers in the sky.
## History and mythology
In Greek mythology, Perseus was the son of Danaë; he was sent by King Polydectes to bring back the head of Medusa the Gorgon—whose visage caused all who gazed upon her to turn to stone. Perseus slew Medusa in her sleep, and Pegasus and Chrysaor appeared from her body. Perseus continued to the realm of Cepheus, whose daughter Andromeda was to be sacrificed to Cetus the sea monster.
Perseus rescued Andromeda from the monster by killing it with his sword. He turned Polydectes and his followers to stone with Medusa's head and appointed Dictys the fisherman king. Perseus and Andromeda married and had six children. In the sky, Perseus lies near the constellations Andromeda, Cepheus, Cassiopeia (Andromeda's mother), Cetus, and Pegasus.
In Neo-Assyrian Babylonia (911–605 BC), the constellation of Perseus was known as the Old Man constellation (SU.GI), then associated with East in the MUL.APIN, an astronomical text from the 7th century.
### In non-Western astronomy
Four Chinese constellations are contained in the area of the sky identified with Perseus in the West. Tiānchuán (天船), the Celestial Boat, was the third paranatellon (a star or constellation that rises at the same time as another star or object) of the third house of the White Tiger of the West, representing the boats that Chinese people were reminded to build in case of a catastrophic flood season. Incorporating stars from the northern part of the constellation, it contained Mu, Delta, Psi, Alpha, Gamma and Eta Persei. Jīshuǐ (積水), the Swollen Waters, was the fourth paranatellon of the aforementioned house, representing the potential of unusually high floods during the end of August and beginning of September at the beginning of the flood season. Lambda and possibly Mu Persei lay within it. Dàlíng (大陵), the Great Trench, was the fifth paranatellon of that house, representing the trenches where criminals executed en masse in August were interred. It was formed by Kappa, Omega, Rho, 24, 17 and 15 Persei. The pile of corpses prior to their interment was represented by Jīshī (積屍, Pi Persei), the sixth paranatellon of the house. The Double Cluster, h and Chi Persei, had special significance in Chinese astronomy.
In Polynesia, Perseus was not commonly recognized as a separate constellation; the only people who named it were those of the Society Islands, who called it Faa-iti, meaning "Little Valley". Algol may have been named Matohi by the Māori people, but the evidence for this identification is disputed. Matohi ("Split") occasionally came into conflict with Tangaroa-whakapau over which of them should appear in the sky, the outcome affecting the tides. It matches the Māori description of a blue-white star near Aldebaran but does not disappear as the myth would indicate.
## Characteristics
Perseus is bordered by Aries and Taurus to the south, Auriga to the east, Camelopardalis and Cassiopeia to the north, and Andromeda and Triangulum to the west. Covering 615 square degrees, it ranks twenty-fourth of the 88 constellations in size. It appears prominently in the northern sky during the Northern Hemisphere's spring. Its main asterism consists of 19 stars. The constellation's boundaries, as set by Belgian astronomer Eugène Delporte in 1930, are defined by a 26-sided polygon. In the equatorial coordinate system, the right ascension coordinates of these borders lie between and , while the declination coordinates are between and . The International Astronomical Union (IAU) adopted the three-letter abbreviation "Per" for the constellation in 1922.
## Features
### Stars
Algol (from the Arabic رأس الغول Ra's al-Ghul, which means The Demon's Head), also known by its Bayer designation Beta Persei, is the best-known star in Perseus. Representing the head of the Gorgon Medusa in Greek mythology, it was called Horus in Egyptian mythology and Rosh ha Satan ("Satan's Head") in Hebrew. Located 92.8 light-years from Earth, it varies in apparent magnitude from a minimum of 3.5 to a maximum of 2.3 over a period of 2.867 days. The star system is the prototype of a group of eclipsing binary stars named Algol variables, though it has a third member to make up what is actually a triple star system. The brightest component is a blue-white main-sequence star of spectral type B8V, which is 3.5 times as massive and 180 times as luminous as the Sun. The secondary component is an orange subgiant star of type K0IV that has begun cooling and expanding to 3.5 times the radius of the Sun, and has 4.5 times the luminosity and 80% of its mass. These two are separated by only 0.05 astronomical units (AU)—five percent of the distance between the Earth and Sun; the main dip in brightness arises when the larger fainter companion passes in front of the hotter brighter primary. The tertiary component is a main sequence star of type A7, which is located on average 2.69 AU from the other two stars. AG Persei is another Algol variable in Perseus, whose primary component is a B-type main sequence star with an apparent magnitude of 6.69. Phi Persei is a double star, although the two components do not eclipse each other. The primary star is a Be star of spectral type B0.5, possibly a giant star, and the secondary companion is likely a stellar remnant. The secondary has a similar spectral type to O-type subdwarfs.
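The magnitude range quoted for Algol corresponds to roughly a threefold change in brightness, which is why its variability is so conspicuous to the naked eye; in terms of the standard magnitude–flux relation:

```latex
% Brightness ratio between Algol at maximum (m = 2.3) and minimum (m = 3.5)
\[
\frac{F_{\text{max}}}{F_{\text{min}}} = 10^{\,0.4\,(3.5 - 2.3)} = 10^{0.48} \approx 3.0
\]
```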
With the historical name Mirfak (Arabic for elbow) or Algenib, Alpha Persei is the brightest star of this constellation with an apparent magnitude of 1.79. A supergiant of spectral type F5Ib located around 590 light-years away from Earth, Mirfak has 5,000 times the luminosity and 42 times the diameter of the Sun. It is the brightest member of the Alpha Persei Cluster (also known as Melotte 20 and Collinder 39), which is an open cluster containing many luminous stars. Neighboring bright stars that are members include the Be stars Delta (magnitude 3.0), Psi (4.3), and 48 Persei (4.0); the Beta Cephei variable Epsilon Persei (2.9); and the stars 29 (5.2), 30 (5.5), 31 (5.0), and 34 Persei (4.7). Of magnitude 4.05, nearby Iota Persei has been considered a member of the group, but is actually located a mere 34 light-years distant. This star is very similar to the Sun, shining with 2.2 times its luminosity. It is a yellow main sequence star of spectral type G0V. Extensive searches have failed to find evidence of it having a planetary system.
Zeta Persei is the third-brightest star in the constellation at magnitude 2.86. Around 750 light-years from Earth, it is a blue-white supergiant 26–27 times the radius of the Sun and 47,000 times its luminosity. It is the brightest star (as seen from Earth) of a moving group of bright blue-white giant and supergiant stars, the Perseus OB2 Association or Zeta Persei Association. Zeta is a triple star system, with a companion blue-white main sequence star of spectral type B8 and apparent magnitude 9.16 around 3,900 AU distant from the primary, and a white main sequence star of magnitude 9.90 and spectral type A2, some 50,000 AU away, that may or may not be gravitationally bound to the other two. X Persei is a double system in this association; one component is a hot, bright star and the other is a neutron star. With an apparent magnitude of 6.72, it is too dim to be seen with the naked eye even in perfectly dark conditions. The system is an X-ray source and the primary star appears to be undergoing substantial mass loss. Once thought to be a member of the Perseus OB2 Association, Omicron Persei (Atik) is a multiple star system with a combined visual magnitude of 3.85. It is composed of two blue-white stars—a giant of spectral class B1.5 and main sequence star of B3—which orbit each other every 4.5 days and are distorted into ovoids due to their small separation. The system has a third star about which little is known. At an estimated distance of 1,475 light-years from Earth, the system is now thought to lie too far from the center of the Zeta Persei group to belong to it.
GRO J0422+32 (V518 Persei) is another X-ray binary in Perseus. One component is a red dwarf star of spectral type M4.5V, which orbits a mysterious dense and heavy object—possibly a black hole—every 5.1 hours. The system is an X-ray nova, meaning that it experiences periodic outbursts in the X-ray band of the electromagnetic spectrum. If the system does indeed contain a black hole, it would be the smallest one ever recorded. Further analysis in 2012 calculated a mass of 2.1 solar masses, which raises questions as to what the object actually is as it appears to be too small to be a black hole.
GK Persei, also known as Nova Persei 1901, is a bright nova that appeared halfway between Algol and Delta Persei. Discovered on 21 February 1901 by Scottish amateur astronomer Thomas David Anderson, it peaked at magnitude 0.2—almost as bright as Capella and Vega. It faded to magnitude 13 around 30 years after its peak brightness. Xi Persei, traditionally known as Menkhib, a blue giant of spectral type O7III, is one of the hottest bright stars in the sky, with a surface temperature of 37,500 K. It is one of the more massive stars, being between 26 and 32 solar masses, and is 330,000 times as luminous as the Sun.
Named Gorgonea Tertia, Rho Persei varies in brightness like Algol, but is a pulsating rather than eclipsing star. At an advanced stage of stellar evolution, it is a red giant that has expanded for the second time to have a radius around 150 times that of the Sun. Its helium has been fused into heavier elements and its core is composed of carbon and oxygen. It is a semiregular variable star of the Mu Cephei type, whose apparent magnitude varies between 3.3 and 4.0 with periods of 50, 120 and 250 days. The Double Cluster contains three even larger stars, each over 700 solar radii: S, RS, and SU Persei are all semiregular pulsating M-type supergiants. The stars are not visible to the naked eye; SU Persei, the brightest of the three, has an apparent magnitude of 7.9 and thus is visible through binoculars. AX Persei is another binary star; its primary component is a red giant in an advanced phase of stellar evolution that is transferring material onto an accretion disc around a smaller star. The star system is one of the few eclipsing symbiotic binaries, but is unusual because the secondary star is not a white dwarf, but an A-type star. DY Persei is a variable star that is the prototype of DY Persei variables, which are carbon-rich R Coronae Borealis variables that exhibit the variability of asymptotic giant branch stars. DY Persei itself is a carbon star that is too dim to see through binoculars, with an apparent magnitude of 10.6.
Seven stars in Perseus have been found to have planetary systems. V718 Persei is a star in the young open cluster IC 348 that appears to be periodically eclipsed by a giant planet every 4.7 years. This has been inferred to be an object with a maximum mass of 6 times that of Jupiter and an orbital radius of 3.3 AU.
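The orbital radius and period quoted for the companion of V718 Persei are tied together by Kepler's third law; an illustrative check (the implied stellar mass of roughly 1.6 solar masses is an inference made here, not a figure from the text):

```latex
% Kepler's third law with a in AU, T in years and M in solar masses
\[
M \approx \frac{a^{3}}{T^{2}} = \frac{(3.3)^{3}}{(4.7)^{2}} \approx 1.6\;M_{\odot}
\]
```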
### Deep-sky objects
The galactic plane of the Milky Way passes through Perseus, but is much less obvious than elsewhere in the sky as it is mostly obscured by molecular clouds. The Perseus Arm is a spiral arm of the Milky Way galaxy and stretches across the sky from the constellation Cassiopeia through Perseus and Auriga to Gemini and Monoceros. This segment is towards the rim of the galaxy.
Within the Perseus Arm lie two open clusters (NGC 869 and NGC 884) known as the Double Cluster. Sometimes known as h and Chi (χ) Persei, respectively, they are easily visible through binoculars and small telescopes. Both lie more than 7,000 light-years from Earth and are several hundred light-years apart. Both clusters are of approximately magnitude 4 and 0.5 degrees in diameter. The two are Trumpler class I 3 r clusters, though NGC 869 is a Shapley class f and NGC 884 is a Shapley class e cluster. These classifications indicate that they are both quite rich (dense); NGC 869 is the richer of the pair. The clusters are both distinct from the surrounding star field and are clearly concentrated at their centers. The constituent stars, numbering over 100 in each cluster, range widely in brightness.
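The figures above imply that each cluster spans several tens of light-years; using the small-angle approximation with the quoted angular diameter and the minimum distance of 7,000 light-years:

```latex
% Physical diameter from angular diameter (small-angle approximation)
\[
D \approx d\,\theta = 7000~\text{ly} \times \left(0.5^{\circ} \times \frac{\pi}{180^{\circ}}\right) \approx 60~\text{ly}
\]
```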
M34 is an open cluster that appears at magnitude 5.5, and is approximately 1,500 light-years from Earth. It contains about 100 stars scattered over a field of view larger than that of the full moon. M34 can be resolved with good eyesight but is best viewed using a telescope at low magnifications. IC 348 is a somewhat young open cluster that is still contained within the nebula from which its stars formed. It is located about 1,027 light-years from Earth, is about 2 million years old, and contains many stars with circumstellar disks. Many brown dwarfs have been discovered in this cluster due to its age; since brown dwarfs cool as they age, it is easier to find them in younger clusters.
There are many nebulae in Perseus. M76 is a planetary nebula, also called the Little Dumbbell Nebula. It appears two arc-minutes by one arc-minute across and has an apparent brightness of magnitude 10.1. NGC 1499, also known as the California Nebula, is an emission nebula that was discovered in 1884–85 by American astronomer Edward E. Barnard. It is very difficult to observe visually because its low surface brightness makes it appear dimmer than most other emission nebulae. NGC 1333 is a reflection nebula and a star-forming region. Perseus also contains a giant molecular cloud, called the Perseus molecular cloud; it belongs to the Orion Spur and is known for its low rate of star formation compared to similar clouds.
Perseus contains some notable galaxies. NGC 1023 is a barred spiral galaxy of magnitude 10.35, around 30 million light-years (9.2 million parsecs) from Earth. It is the principal member of the NGC 1023 group of galaxies and is possibly interacting with another galaxy. NGC 1260 is either a lenticular or tightly wound spiral galaxy about 250 million ly (76.7 million pc) from Earth. It was the host galaxy of the supernova SN 2006gy, one of the brightest ever recorded. It is a member of the Perseus Cluster (Abell 426), a massive galaxy cluster located 250 million ly (76.6 million pc) from Earth. With a redshift of 0.0179, Abell 426 is the closest major cluster to the Earth. NGC 1275, a component of the cluster, is a Seyfert galaxy containing an active nucleus that produces jets of material, surrounding the galaxy with massive bubbles. These bubbles create sound waves that travel through the Perseus Cluster, sounding a B flat 57 octaves below middle C. This galaxy is a cD galaxy that has undergone many galactic mergers throughout its existence, as evidenced by the "high velocity system"—the remnants of a smaller galaxy—surrounding it. Its active nucleus is a strong source of radio waves. 3C 31 is an active galaxy and radio source in Perseus 237 million light-years from Earth (redshift 0.0173). Its jets, caused by the supermassive black hole at its center, extend several million light-years in opposing directions, making them some of the largest objects in the universe.
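The 250-million-light-year distance is consistent with the cluster's redshift under Hubble's law; a rough estimate (assuming a Hubble constant of about 70 km/s/Mpc, a value not given in this article):

```latex
% Hubble-law distance for Abell 426 (z = 0.0179)
\[
d \approx \frac{cz}{H_{0}} = \frac{299\,792~\text{km/s} \times 0.0179}{70~\text{km/s/Mpc}}
  \approx 77~\text{Mpc} \approx 250~\text{million light-years}
\]
```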
### Meteor showers
The Perseids are a prominent annual meteor shower that appear to radiate from Perseus from mid-July, peaking in activity between 9 and 14 August each year. Associated with Comet Swift–Tuttle, they have been observed for about 2,000 years. The September Epsilon Perseids, discovered in 2012, are a meteor shower with an unknown parent body in the Oort cloud.
## See also
- Perseus (Chinese astronomy)
# Pyramid of Unas
The pyramid of Unas (Egyptian: Nfr swt Wnjs "Beautiful are the places of Unas") is a smooth-sided pyramid built in the 24th century BC for the Egyptian pharaoh Unas, the ninth and final king of the Fifth Dynasty. It is the smallest Old Kingdom pyramid, but significant due to the discovery of Pyramid Texts, spells for the king's afterlife incised into the walls of its subterranean chambers. Inscribed for the first time in Unas's pyramid, the tradition of funerary texts carried on in the pyramids of subsequent rulers, through to the end of the Old Kingdom, and into the Middle Kingdom through the Coffin Texts that form the basis of the Book of the Dead.
Unas built his pyramid between the complexes of Sekhemkhet and Djoser, in North Saqqara. Anchored to the valley temple at a nearby lake, a long causeway was constructed to provide access to the pyramid site. The causeway had elaborately decorated walls covered with a roof which had a slit in one section allowing light to enter, illuminating the images. A long wadi was used as a pathway. The terrain was difficult to negotiate and contained old buildings and tomb superstructures. These were torn down and repurposed as underlay for the causeway. A significant stretch of Djoser's causeway was reused for embankments. Tombs that were on the path had their superstructures demolished and were paved over, preserving their decorations. Two Second Dynasty tombs, presumed from seals found inside to belong to Hotepsekhemwy or Nebra, and to Ninetjer, are among those that lie under the causeway. The site was later used for numerous burials of Fifth Dynasty officials, private individuals from the Eighteenth to Twentieth Dynasties, and a collection of Late Period monuments known as the "Persian tombs".
The causeway joined the temple in the harbour with the mortuary temple on the east face of the pyramid. The mortuary temple was entered on its east side through a large granite doorway, seemingly constructed by Unas's successor, Teti. Just south of the upper causeway are two long boat pits. These may have contained two wooden boats: the solar barques of Ra, the sun god. The temple was laid out in a similar manner to Djedkare Isesi's. A transverse corridor separates the outer from the inner temple. The entry chapel of the inner temple has been completely destroyed, though it once contained five statues in niches. A feature of the inner temple was a single quartzite column that was contained in the antichambre carrée. The room is otherwise ruined. Quartzite is an atypical material to use in architectural projects, though examples of it being used sparingly in the Old Kingdom exist. The material is associated with the sun cult due to its sun-like coloration.
The underground chambers remained unexplored until 1881, when Gaston Maspero, who had recently discovered inscribed texts in the pyramids of Pepi I and Merenre I, gained entry. Maspero found the same texts inscribed on the walls of Unas's pyramid, their first known appearance. The 283 spells in Unas's pyramid constitute the oldest, smallest and best preserved corpus of religious writing from the Old Kingdom. Their function was to guide the ruler through to eternal life and ensure his continued survival even if the funerary cult ceased to function. In Unas's case, the funerary cult may have survived the turbulent First Intermediate Period and up until the Twelfth or Thirteenth Dynasty, during the Middle Kingdom. This is a matter of dispute amongst Egyptologists, where a competing idea is that the cult was revived during the Middle Kingdom, rather than having survived until then.
## Location and excavation
The pyramid is situated on the Saqqara plateau and lies on a line running from the pyramid of Sekhemkhet to the pyramid of Menkauhor. Reaching a nearby lake required the construction of an exceptionally long causeway, suggesting that the location held some significance to Unas.
The pyramid was briefly examined by John Shae Perring, and soon after by Karl Richard Lepsius, who listed the pyramid on his pioneering list as number XXXV. Entry was first gained by Gaston Maspero, who examined its substructure in 1881. He had recently discovered a set of texts in the pyramids of Pepi I and Merenre I. Those same texts were discovered in Unas's tomb, making this their earliest known appearance. From 1899 to 1901, the architect and Egyptologist Alessandro Barsanti conducted the first systematic investigation of the pyramid site, succeeding in excavating part of the mortuary temple, as well as a series of tombs from the Second Dynasty and the Late Period. Later excavations by Cecil Mallaby Firth, from 1929 until his death in 1931, followed by those of the architect Jean-Philippe Lauer from 1936 to 1939, were conducted with little success. The archaeologists Selim Hassan, Muhammed Zakaria Goneim and A. H. Hussein mainly focused on the causeway leading to the pyramid while conducting their investigations from 1937 to 1949. Hussein discovered a pair of limestone-lined boat pits at the upper end of the causeway. In the 1970s, Ahmad Moussa excavated the lower half of the causeway and the valley temple. Moussa and another archaeologist, Audran Labrousse [fr], conducted an architectural survey of the valley temple from 1971 to 1981. The pyramids of Unas, Teti, Pepi I and Merenre were the subjects of a major architectural and epigraphic project in Saqqara, led by Jean Leclant. From 1999 until 2001, the Supreme Council of Antiquities conducted a major restoration and reconstruction project on the valley temple. The three entrances and ramps were restored, and a low limestone wall built to demarcate the temple's plan.
## Mortuary complex
### Layout
Unas's complex is situated between the pyramid of Sekhemkhet and the south-west corner of the pyramid complex of Djoser. It mirrors the pyramid of Userkaf, which is situated at the north-east corner of Djoser's complex in Saqqara. Old Kingdom mortuary complexes consist of five essential components: (1) a valley temple; (2) a causeway; (3) a mortuary temple; (4) a cult pyramid; and (5) the main pyramid. Unas's monument has all of these elements: the main pyramid, constructed six steps high from limestone blocks; a valley temple situated in a natural harbour at the mouth of a wadi; a causeway constructed using the same wadi as a path; a mortuary temple similar in layout to that of Unas's predecessor Djedkare Isesi; and a cult pyramid in the south of the mortuary temple. The pyramid, mortuary temple and cult pyramid were enclosed by a 7 m (23 ft; 13 cu) tall perimeter wall. The perimeter wall from the north-east to north-west corner is about 86 m (282 ft; 164 cu) long, and stretches 76 m (249 ft; 145 cu) from north to south.
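The cubit values quoted alongside the metric figures here and throughout the article correspond to the ancient Egyptian royal cubit of roughly 0.52 m; for example, for the height of the perimeter wall (the precise cubit length used is an assumption made here):

```latex
% Perimeter-wall height expressed in royal cubits (1 royal cubit taken as ~0.524 m)
\[
\frac{7~\text{m}}{0.524~\text{m/cubit}} \approx 13~\text{cubits}
\]
```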
### Main pyramid
Though Unas's reign lasted for around thirty to thirty-three years, his pyramid was the smallest built in the Old Kingdom. Time constraints therefore cannot explain its small size; it is more likely that access to resources constrained the project. Enlarging the monument would also have required extensive additional quarrying, a burden Unas chose to avoid by keeping his pyramid small.
The core of the pyramid was built six steps high, constructed with roughly dressed limestone blocks which decreased in size in each step. The construction material for the core would, ideally, have been locally sourced. This was then encased with fine white limestone blocks quarried from Tura. Some of the casing on the lowest steps has remained intact. The pyramid had a base length of 57.75 m (189.5 ft; 110.21 cu) converging towards the apex at an angle of approximately 56°, giving it a height of 43 metres (141 ft; 82 cu) on completion and a total volume of 47,390 m<sup>3</sup> (61,980 cu yd). The completed pyramid was smooth-sided. It has since been ruined, as have all others of the Fifth Dynasty, a result of its poor construction and materials. The pyramids of the Fifth Dynasty were further systematically dismantled during the New Kingdom to be reused in the construction of new tombs.
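The height and volume quoted above follow from the base length and slope by simple geometry; as an approximate check (small differences reflect rounding of the roughly 56° slope):

```latex
% Height and volume of a square pyramid with base b = 57.75 m and face slope ~56 degrees
\[
h \approx \frac{b}{2}\tan 56^{\circ} \approx 28.9~\text{m} \times 1.48 \approx 43~\text{m},
\qquad V = \frac{b^{2}h}{3} \approx \frac{(57.75~\text{m})^{2} \times 43~\text{m}}{3} \approx 4.8 \times 10^{4}~\text{m}^{3}
\]
```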
Unas abandoned the practice of building pyramids for his consorts; instead, Khenut and Nebet were buried in a double mastaba north-east of the main pyramid. Each queen was accorded separate rooms and an individual entrance, though the layout of the tombs is identical. Khenut owned the western half, and Nebet owned the eastern half. Their chambers were extensively decorated. The chapel for Nebet's mastaba contains four recesses. One bears a cartouche of Unas's name, indicating that it may have contained a statue of the king, whereas the others contained statues of the queen. Directly north of the mastaba were the tombs for Unas's son Unasankh and daughter Iput. Another daughter, Hemetre, was buried in a tomb west of Djoser's complex.
### Substructure
A small chapel, called the "north chapel" or "entrance chapel", was situated adjacent to the pyramid's north face. It consisted of a single room, with an altar and a stela bearing the hieroglyph for "offering table". Only trace elements of the chapel remain. These chapels had a false door and a decoration scheme similar to the offering hall, which the archaeologist Dieter Arnold suggests indicates that the chapel was a "miniature offering chapel".
The entrance into the substructure of the pyramid lay under the chapel's pavement. The substructure of the pyramid is similar to that of Unas's predecessor, Djedkare Isesi. The entry opens into a 14.35 m (47.1 ft) long downward-sloping corridor, inclined at 22°, that leads to a vestibule at its bottom. The vestibule is 2.47 m (8.1 ft) long and 2.08 m (6.8 ft) wide. From the vestibule, a 14.10 m (46.3 ft) long horizontal passage, guarded by three granite slab portcullises in succession, leads to the antechamber, a room measuring 3.75 m (12.3 ft) by 3.08 m (10.1 ft) located under the centre axis of the pyramid. To the east, a doorway leads to a room – called the serdab – with three recesses. The serdab measures 6.75 m (22.1 ft) wide and 2 m (6.6 ft) deep. To the west lay the burial chamber, a room measuring 7.3 m (24 ft) by 3.08 m (10.1 ft), containing the ruler's sarcophagus. The roofs of both the antechamber and the burial chamber were gabled, in a similar fashion to earlier pyramids of the era.
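The depth reached by the entrance corridor can be estimated from its length and incline; a rough figure (an estimate made here, not one stated in the text):

```latex
% Vertical drop of the 14.35 m corridor inclined at 22 degrees
\[
\Delta h \approx 14.35~\text{m} \times \sin 22^{\circ} \approx 5.4~\text{m}
\]
```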
Near the burial chamber's west wall sat Unas's coffin, made from greywacke rather than basalt as was originally presumed. The coffin was undamaged, but its contents had been robbed. A canopic chest had once been buried at the foot of the south-east corner of the coffin. Traces of the burial are fragmentary; all that remain are portions of a mummy, including its right arm, skull and shinbone, as well as the wooden handles of two knives used during the opening of the mouth ceremony. The mummy remains have been displayed in the Egyptian Museum of Cairo.
The walls of the chambers were lined with Tura limestone, while those surrounding Unas's sarcophagus were sheathed in white alabaster incised and painted to represent the doors of the royal palace facade, complementing the eastern passage. Taken as symbolically functional, these allowed the king to depart the tomb in any direction. The walls appear to contain blocks reused from one of Khufu's constructions, possibly his pyramid complex at Giza, as an earlier scene of the king (identified by his Horus name Medjedu) fishing with a harpoon was discovered beneath those carved for Unas.
The ceiling of the burial chamber was painted blue with gold stars to resemble the night sky. The ceilings of the antechamber and corridor were similarly painted. Whereas the stars in the antechamber and the burial chamber pointed northward, the stars in the corridor pointed towards the zenith. The remaining walls of the burial chamber, antechamber, and parts of the corridor were inscribed with a series of vertically written texts, chiselled in bas-relief and painted blue.
#### Pyramid Texts of Unas
The inscriptions, known as the Pyramid Texts, were the central innovation of Unas's pyramid, on whose subterranean walls they were first etched. The Pyramid Texts are the oldest large corpus of religious writing known from ancient Egypt. A total of 283 such spells, out of at least 1,000 known and an indeterminate number of unknown ones, appear in Unas's pyramid. The spells are the smallest and best-preserved collection of Pyramid Texts known from the Old Kingdom. Though they first appeared in Unas's pyramid, many of the texts are significantly older. The texts subsequently appeared in the pyramids of the kings and queens of the Sixth to Eighth Dynasties, until the end of the Old Kingdom. With the exception of a single spell, copies of Unas's texts appeared throughout the Middle Kingdom and later, including a near complete replica of the texts in the tomb of Senwosretankh at El-Lisht.
Ancient Egyptian belief held that the individual consisted of three basic parts: the body, the ka, and the ba. When the person died, the ka would separate from the body and return to the gods from where it had come, while the ba remained with the body. The body of the individual, interred in the burial chamber, never physically left; but the ba, awakened, released itself from the body and began its journey toward new life. Significant to this journey was the Akhet: the horizon, a junction between the earth, the sky, and the Duat. To ancient Egyptians, the Akhet was the place from where the sun rose, and so symbolised a place of birth or resurrection. In the texts, the king is called upon to transform into an akh in the Akhet. The akh, literally "effective being", was the resurrected form of the deceased, attained through individual action and ritual performance. If the deceased failed to complete the transformation, they became mutu, that is, "the dead". The function of the texts, in congruence with all funerary literature, was to enable the reunion of the ruler's ba and ka leading to the transformation into an akh, and to secure eternal life among the gods in the sky.
The writings on the west gable in Unas's burial chamber consist of spells that protect the sarcophagus and mummy within. The north and south walls of the chamber are dedicated to the offering and resurrection rituals respectively, and the east wall contains texts asserting the king's control over his sustenance in the form of a response to the offering ritual. The offering ritual texts continue onto the north and south walls of the passageway splitting the resurrection ritual which concludes on the south wall. In the rituals of the burial chamber, the king is identified both as himself and as the god Osiris, being addressed as "Osiris Unas". The king is also identified with other deities, occasionally several, alongside Osiris in other texts. The Egyptologist James Allen identifies the last piece of ritual text on the west gable of the antechamber:
> Your son Horus has acted for you.
> The great ones will shake, having seen the knife in your arm as you emerge from the Duat.
> Greetings, experienced one! Geb has created you, the Ennead has given you birth.
> Horus has become content about his father, Atum has become content about his years, the eastern and western gods have become content about the great thing that has happened in his embrace – the god's birth.
> It is Unis: Unis, see! It is Unis: Unis, look! It is Unis: hear! It is Unis: Unis, exist! It is Unis: Unis, raise yourself from your side!
> Do my command, you who hate sleep but were made slack. Stand up, you in Nedit. Your good bread has been made in Pe; receive your control of Heliopolis.
> It is Horus (who speaks), having been commanded to act for his father.
> The storm-lord, the one with spittle in his vicinity, Seth – he will bear you: he is the one who will bear Atum.
The antechamber and corridor were inscribed primarily with personal texts. The west, north and south walls of the antechamber contain texts whose primary concern is the transition from the human realm to the next and the king's ascent to the sky. The east wall held a second set of protective spells, starting with the "Cannibal Hymn". In the hymn, Unas consumes the gods to absorb their power for his resurrection. The Egyptologist Toby Wilkinson identifies the hymn as a mythologizing of the "butchery ritual" in which a bull is sacrificed. The serdab remained uninscribed. The southern section of the walls of the corridor contains texts that focus primarily on the resurrection and ascension of the deceased. The mere presence of the spells within the tomb was believed to have efficacy, thus protecting the king even if the funerary cult ceased to function.
Parts of the corpus of Pyramid Texts were passed down into the Coffin Texts, an expanded set of new texts written on non-royal tombs of the Middle Kingdom, some retaining Old Kingdom grammatical conventions and with many formulations of the Pyramid Texts recurring. The transition to the Coffin Texts was begun in the reign of Pepi I and completed by the Middle Kingdom. The Coffin Texts formed the basis for the Book of the Dead in the New Kingdom and Late Period. The texts would resurface in tombs and on papyri for two millennia, finally disappearing around the time that Christianity was adopted.
### Valley temple
Unas's valley temple is situated in a harbour that naturally forms at the point where the mouth of a wadi meets the lake. The same wadi was used as a path for the causeway. The temple sits between those of Nyuserre Ini and Pepi II. Despite a complex plan, the temple did not contain any significant innovations. It was richly decorated – in a fashion similar to the causeway and mortuary temple – and the surviving granite palm columns that stood at the entrance into the temple attest to their high-quality craftsmanship.
The main entrance into the temple was on the east side, consisting of a portico with eight granite palm columns arranged into two rows. A narrow westward corridor led from the entry into a rectangular north–south oriented hall. A second hall was to the south. Two secondary entrances into the halls were built on the north and south sides. Each had a portico with two columns. These were approached by narrow ramps. West of the two halls was the main cult hall. It had a second chamber with three storerooms to the south and a passageway leading to the causeway to the north-west.
### Causeway
The causeway connecting the valley temple to the mortuary temple of Unas's pyramid complex was constructed along the path provided by a natural wadi. The Egyptologist Iorwerth Edwards estimates the walls to be 4 m (13 ft) high, and 2.04 m (6 ft 8 in) thick. The passageway was about 2.65 m (8 ft 8 in) wide. It had a roof constructed from slabs 0.45 m (1 ft 6 in) thick projecting from each wall toward the centre. The causeway, at between 720 m (2,360 ft) and 750 m (2,460 ft) long, was among the longest constructed for any pyramid, comparable to the causeway of Khufu's pyramid. The causeway is also the best preserved of any from the Old Kingdom. Construction of the causeway was complicated and required negotiating uneven terrain and older buildings which were torn down and their stones appropriated as underlay. The causeway was built with two turns, rather than in a straight line. Around 250 m (820 ft) worth of Djoser's causeway was used to provide embankments for Unas's causeway and to plug gaps between it and the wadi. South of the uppermost bend of the causeway were two 45 m (148 ft) long boat pits of white limestone, which might originally have housed wooden boats with curved keels representing the day and night vessels of Ra, the sun god. The boats lay side by side in an east–west orientation.
Tombs in the path of the causeway were built over, preserving their decorations, but not their contents, indicating that the tombs had been robbed either before or during the causeway's construction. Two large royal tombs, dating to the Second Dynasty, are among those that lie beneath the causeway. The western gallery tomb contains seals bearing the names of Hotepsekhemwy and Nebra, and the eastern gallery tomb contains numerous seals inscribed with the name of Ninetjer indicating probable ownership. The superstructures of the tombs were demolished, allowing the mortuary temple and upper end of the causeway to be built over the top of them.
The interior walls of the causeway were highly decorated with painted bas-reliefs, but records of these are fragmentary. The remnants depict a variety of scenes including the hunting of wild animals, the conducting of harvests, scenes from the markets, craftsmen working copper and gold, a fleet returning from Byblos, boats transporting columns from Aswan to the construction site, battles with enemies and nomadic tribes, the transport of prisoners, lines of people bearing offerings, and a procession of representatives from the nomes of Egypt. A slit was left in a section of the causeway roofing, allowing light to enter and illuminate the brightly painted decorations on the walls. The archaeologist Peter Clayton notes that these depictions were more akin to those found in the mastabas of nobles.
The Egyptologist Miroslav Verner highlights one particular scene from the causeway depicting famished desert nomads. The scene had been used as "unique proof" that the living standards of desert dwellers had declined during Unas's reign as a result of climatic changes in the middle of the third millennium BC. The discovery of a similar relief painting on the blocks of Sahure's causeway casts doubt on this hypothesis. Verner contends that the nomads may have been brought in to demonstrate the hardships faced by pyramid builders bringing in higher quality stone from remote mountain areas. Grimal suggested that this scene foreshadowed the nationwide famine that seems to have struck Egypt at the onset of the First Intermediate Period. According to Allen et al., the most widely accepted explanation for the scene is that it was meant to illustrate the generosity of the sovereign in aiding famished populations.
A collection of tombs were found north of the causeway. The tomb of Akhethetep, a vizier, was discovered by a team led by Christiane Ziegler. The other mastabas belong to the viziers Ihy, Iy–nofert, Ny-ankh-ba and Mehu. The tombs are conjectured to belong to Unas's viziers, with the exception of Mehu's tomb, which is associated with Pepi I. Another tomb, belonging to Unas-ankh, son of Unas, separates the tombs of Ihy and Iy-nofert. It may be dated late into Unas's reign.
Ahmed Moussa discovered the rock-cut tombs of Nefer and Ka-hay – court singers during Menkauhor's reign – south of Unas's causeway, containing nine burials along with an extremely well preserved mummy found in a coffin in a shaft under the east wall of the chapel. The Chief Inspector at Saqqara, Mounir Basta, discovered another rock-cut tomb just south of the causeway in 1964, later excavated by Ahmed Moussa. The tombs belonged to two palace officials – manicurists – living during the reigns of Nyuserre Ini and Menkauhor, in the Fifth Dynasty, named Ni-ankh-khnum and Khnum-hotep. A highly decorated chapel for the tomb was discovered the following year. The chapel was located inside a unique stone mastaba that was connected to the tombs through an undecorated open court.
### Mortuary temple
The mortuary temple in Unas's pyramid complex has a layout comparable to his predecessor Djedkare Isesi's, with one notable exception. A pink granite doorway separates the end of the causeway from the entrance hall. It bears the names and titles of Teti, Unas's successor, indicating that he must have had the doorway constructed following Unas's death. The entrance hall had a vaulted ceiling, and a floor paved with alabaster. The walls in the room were decorated with relief paintings that depicted the making of offerings. The entrance hall terminates in an open columned courtyard, where eighteen pink granite palm columns – two more than in Djedkare Isesi's complex – supported the roof of an ambulatory. Some of the columns were reused centuries later in buildings in Tanis, the capital of Egypt during the Twenty-first and Twenty-second Dynasties. Other columns have been displayed in the British Museum, and in the Louvre. Relief decorations that were formerly in the courtyard have also been reused in later projects, as shown by the presence of reliefs of Unas in Amenemhat I's pyramid complex in El-Lisht.
North and south of the entrance hall and columned courtyard were storerooms. These were stocked regularly with offering items for the royal funerary cult, which had expanded influence in the Fifth Dynasty. Their irregular placement resulted in the northern storerooms being twice as numerous as the southern. The rooms were used for burials in the Late Period, as noted by the presence of large shaft tombs. At the far end of the courtyard was a transverse corridor creating an intersection between the columned courtyard at its east and inner temple to its west, with a cult pyramid to the south, and a larger courtyard surrounding the pyramid to the north.
The inner temple is accessed by a small staircase leading into a ruined chapel with five statue niches. The chapel and offering hall were surrounded by storerooms; as elsewhere in the temple, there were more storerooms to the north than south. The antichambre carrée – a square antechamber – separated the chapel from the offering hall. The room measures 4.2 m (14 ft; 8.0 cu) on each side, making it the smallest such chamber from the Old Kingdom, but has been largely destroyed. It was originally entered through a door on its eastern side, and contained two additional doors leading to the offering hall and storeroom. The room contained a single column made of quartzite – fragments of which have been found in the south-west part of the temple – quarried from the Gabel Ahmar stone quarry near Heliopolis. Quartzite, being a particularly hard stone – a 7 on the Mohs hardness scale – was not typically used in architectural projects, but was used sparingly as a building material at some Old Kingdom sites in Saqqara. The hard stone is associated with the sun cult, a natural development caused by the coloration of the material being sun-like. Remnants of a granite false door bearing an inscription concerning the souls of the residents of Nekhen and Buto marks what little of the offering hall has been preserved. A block from the door has been displayed in the Egyptian Museum in Cairo.
### Cult pyramid
The purpose of the cult pyramid remains unclear. It had a burial chamber but was not used for burials, and instead appears to have been a purely symbolic structure. It may have hosted the pharaoh's ka, or a miniature statue of the king. It may have been used for ritual performances centring around the burial and resurrection of the ka spirit during the Sed festival.
The cult pyramid in Unas's complex has identifiable remains, but has otherwise been destroyed. The preserved elements suggest that it had a base length of 11.5 m (38 ft; 22 cu), a fifth of that of the main pyramid. The pyramid's covering slabs were inclined at 69°. This was typical for cult pyramids which had a 2:1 ratio-ed slope, and thus a height equal to the length of the base, i.e. 11.5 m (38 ft; 22 cu). A small channel was dug in front of the pyramid entrance, perhaps to prevent run-off from entering the pyramid. The first slabs of the descending corridor are declined at 30.5°. The pit measures 5.15 m (17 ft; 10 cu) north-south and 8.15 m (27 ft; 16 cu) east-west. The burial chamber was cut 2.03 m (7 ft; 4 cu) deep into the rock, sits 2.12 m (7 ft; 4 cu) beneath the pavement and measures 5 m (16 ft; 10 cu) by 2.5 m (8 ft; 5 cu).
The "great enclosure" of the main pyramid and inner temple has an identifiable anomaly. About 4 m (13 ft; 8 cu) from the cult pyramid's west face, the wall abruptly turns to the north before receding for 12 m (39 ft; 23 cu) toward the main pyramid. It stops 2.6 m (8.5 ft; 5.0 cu) from the main pyramid and turns once more back onto its original alignment. The only explanation for this is the presence of the large Second Dynasty tomb of Hotepsekhemwy, which spans the width of the whole temple and crosses directly under the recess. The architects of the pyramid appear to have preferred for the enclosure wall to run over the tomb's passageway, rather than over the top of the subterranean gallery. The cult pyramid has its own secondary enclosure that runs along the north face of the pyramid and half of its west face. This secondary wall was about 1.04 m (3 ft; 2 cu) thick, and had a double-door 0.8 m (2.6 ft) thick built close to its start.
## Later history
Evidence suggests that Unas's funerary cult survived through the First Intermediate Period and into the Middle Kingdom, an indication that Unas retained prestige long after his death. Two independent pieces of evidence corroborate the existence of the cult in the Middle Kingdom: 1) A stela dated to the Twelfth Dynasty bearing the name Unasemsaf and 2) A statue of a Memphite official, Sermaat, from the Twelfth or Thirteenth Dynasty, with an inscription invoking Unas's name. The Egyptologist Jaromír Málek contends that the evidence only suggests a theoretical revival of the cult, a result of the valley temple serving as a useful entry path into the Saqqara necropolis, but not its persistence from the Old Kingdom. Despite renewed interest in the Old Kingdom rulers at the time, their funerary complexes, including Unas's, were partially reused in the construction of Amenemhat I's and Senusret I's pyramid complexes at El-Lisht. One block used in Amenemhat's complex has been positively identified as originating from Unas's complex, likely taken from the causeway, on the basis of inscriptions containing his name appearing upon it. Several other blocks have their origins speculatively assigned to Unas's complex as well.
The Saqqara plateau witnessed a new era of tomb building in the New Kingdom. Starting with the reign of Thutmose III in the Eighteenth Dynasty and up until possibly the Twentieth Dynasty, Saqqara was used as a site for the tombs of private individuals. The largest concentrations of tombs from the period are found in a large area south of Unas's causeway. This area came to prominent use around the time of Tutankhamun. Unas's pyramid underwent restorative work in the New Kingdom. In the Nineteenth Dynasty, Khaemweset, High Priest of Memphis and son of Ramesses II, had an inscription carved onto a block on the pyramid's south side commemorating his restoration work.
Late Period monuments, colloquially called the "Persian tombs", thought to date to the reign of Amasis II, were discovered near the causeway. These include tombs built for Tjannehebu, Overseer of the Royal Navy; Psamtik, the Chief Physician; and Peteniese, Overseer of Confidential Documents. The Egyptologist John D. Ray explains that the site was chosen because it was readily accessible from both Memphis and the Nile Valley. Traces of Phoenician and Aramaic burials have been reported in the area directly south of Unas's causeway.
## See also
- List of Egyptian pyramids
- List of megalithic sites
# 1926 World Series
The 1926 World Series was the championship series of the 1926 Major League Baseball season. The 23rd edition of the Series, it pitted the National League champion St. Louis Cardinals against the American League champion New York Yankees. The Cardinals defeated the Yankees four games to three in the best-of-seven series, which took place from October 2 to 10, 1926, at Yankee Stadium and Sportsman's Park.
This was the first World Series appearance (and first National League pennant win) for the Cardinals, and would be the first of 11 World Series championships in Cardinals history. The Yankees were playing in their fourth World Series in six years after winning their first American League pennant in 1921 and their first world championship in 1923. They would play in another 36 World Series (and win 26 of those), as of the end of the 2023 season.
In Game 1, Herb Pennock pitched the Yankees to a 2–1 win over the Cards. In Game 2, pitcher Grover Cleveland Alexander evened the Series for St. Louis with a 6–2 victory. Knuckleballer Jesse Haines' shutout in Game 3 gave St. Louis a 2–1 Series lead. In the Yankees' 10–5 Game 4 win, Babe Ruth hit three home runs, a World Series record equaled only four times since. According to newspaper reports, Ruth had promised a sickly boy named Johnny Sylvester to hit a home run for him in Game 4. After Ruth's three-homer game, the boy's condition miraculously improved. The newspapers' account of the story is disputed by contemporary baseball historians, but it remains one of the most famous anecdotes in baseball history. Pennock again won for the Yankees in Game 5, 3–2.
Cards' player-manager Rogers Hornsby chose Alexander to start Game 6, and used him in relief to close out Game 7. Behind Alexander, the Cardinals won the final two games of the series, and thus the world championship. In Game 7, with the Yankees trailing 3–2 in the bottom of the ninth inning and down to their last out, Ruth walked, bringing up Bob Meusel. Ruth, successful in half of his stolen base attempts in his career, took off for second base on the first pitch. Meusel swung and missed, and catcher Bob O'Farrell threw to second baseman Hornsby, who tagged Ruth out, ending Game 7 and thereby crowning his Cardinals as World Series champions for the first time. The 1926 World Series is the only Series to date to have ended with a baserunner being caught stealing.
This World Series also started the Cardinals run of dominance in the National League. Between the years 1926 and 1934, St. Louis captured five National League pennants and won three World Series titles.
In 2020, ESPN ranked the 1926 World Series as the 10th best ever played.
## Season summary
The Cardinals won the 1926 National League pennant with 89 wins and 65 losses, two games ahead of the runner-up Cincinnati Reds, after finishing only fourth in 1925 at 77–76. Before 1926 was half over, they traded outfielder Heinie Mueller to the New York Giants for outfielder Billy Southworth. They also claimed future Hall of Fame pitcher Grover Cleveland Alexander on waivers from the Chicago Cubs. Their starting rotation was led by Flint Rhem with 20 wins and a 3.21 earned run average (ERA), far surpassing his eight wins and 4.92 ERA of 1925. Offensively, the Cardinals were led by Jim Bottomley, Rogers Hornsby (who had batted over .400 in 1925) and catcher Bob O'Farrell, 1926 National League MVP-to-be.
The 1926 NL pennant race was heated. During the second and third weeks of September, both the Cardinals and the Reds had multi-game winning streaks and traded first and second place almost every day. On September 17, the Cards took a one-game lead over the Reds and extended their lead when the Reds lost several games in a row. The Cardinals lost the last game of the season to the Reds on September 26, but still finished in first place, two games ahead of them in the final standings.
The Yankees had the best record in the AL at 91–63, finishing three games ahead of the Cleveland Indians and greatly improving on their 69-win, seventh-place 1925 season. Lou Gehrig played his first full season as the Yankees' starting first baseman, and the team traded for rookie shortstop Tony Lazzeri in the offseason, eventually playing him at second base. Gehrig, Lazzeri and Ruth led the offense, while Pennock and Urban Shocker led the starting rotation with 42 wins between them.
In early September 1926, thousands of Cleveland fans, confident that their Indians would win the pennant even when they trailed the Yankees by six games, made World Series ticket reservations. By September 23, the Indians were only two games behind New York, but then lost three of their final four games to finish the season three games behind.
On September 11, Baseball Commissioner Kenesaw Mountain Landis met with representatives from four of the top teams in each of the two major leagues. The group gave home field to the AL for World Series Games 1–2 (scheduled for October 2 and 3) and 6–7, while the NL would host Games 3–5. Each game was to start at 1:30 p.m. local time.
Some bookmakers made the Yankees a 15-to-1 Series favorite, while others, like New York's top betting commissioners, thought the teams were evenly matched. One New York Times writer found "little justification for installing either team as the favorite". Regardless of the odds, players from both teams were confident of victory. Hornsby said, "We're going to come through winners. We have the better pitching staff, the better hitters and the greater experience. That's what it takes to win. ... We're going to beat the Yankees. Any of my ball players will tell you that, and we expect to do it." Yankee skipper Miller Huggins retorted,
> We're confident we're going to win. It'll be whichever team does the hitting, and we're sure we're going to do it. We're out of our hitting slump. We have a more experienced team and more experienced pitchers. We're about even in the strength of the infields, but ours is steadier. Our outfield is better, stronger and more experienced, and all the boys are cocky and ready to go. There's no doubt in their minds or in mine that the Yankees will win.
## Series summary
## Matchups
### Game 1
Yankee Stadium was filled with 61,658 fans on October 2 for Game 1. Those without tickets gathered at City Hall to watch the game's progress as charted on two large scoreboards. Before the start of the game, New York Supreme Court judge Robert F. Wagner, then a candidate for the United States Senate, threw out the ceremonial first pitch and took his position in the VIP box next to New York City mayor Jimmy Walker. Commissioner Landis and former heavyweight champion of the world Jack Dempsey were also in attendance. Southpaw Bill Sherdel started for the Cardinals, having posted a 16–12 record with 235 innings pitched in the regular season. The Yankees started Pennock, the team's only 20-game winner that season. The future Hall of Fame pitcher, nicknamed "The Knight of Kennett Square" for his hometown, had a 3.62 ERA in 266⅓ innings during the regular season, and had finished third in the American League Most Valuable Player Award balloting behind winner George Burns and runner-up Johnny Mostil.
Taylor Douthit led off Game 1 with a double to left, advanced to third on Southworth's slow grounder to second baseman Tony Lazzeri, stayed there on Hornsby's comebacker right to Pennock but came home on "Sunny Jim" Bottomley's bloop single for the first run of the Series. In the bottom half, Sherdel walked Earle Combs, Babe Ruth and Bob Meusel, to load the bases. Gehrig scored Combs with a fielder's choice grounder for his first World Series run batted in (RBI), reaching first ahead of the relay. The Cardinals and Yankees were tied 1–1 after the first inning.
In the bottom of the third, Ruth singled and Meusel bunted him over, but Ruth split his pants sliding into second, prompting radio announcer Graham McNamee to exclaim, "Babe is the color of a red brick house!" Doc Woods, the team's trainer, ran out and sewed up Ruth's pants, much to the amusement of the crowd.
The score was still tied at one apiece in the bottom of the sixth, just as rain began to fall. Ruth lined a single past third baseman Les Bell. Meusel again sacrificed Ruth to second. Gehrig followed with a single, scoring Ruth and giving the Yankees the lead. Lazzeri lined a shot to left but Gehrig, on a headfirst dive, was tagged out at third by Bell. Lazzeri advanced to second on the throw. Bell bobbled Dugan's grounder for an error to put runners at first and third, but Hank Severeid forced Dugan at second to end the inning. The Yankees maintained their one-run advantage through the end of the eighth inning.
In the top of the ninth, Bottomley singled off Pennock but could not advance, giving the Yankees a 2–1 win in Game 1. Gehrig was their offensive star with both of his team's RBIs. Pennock went the distance, striking out four and yielding but three hits, two in the first and one in the ninth. Hard-luck loser Sherdel gave up only two runs and six hits while striking out one.
### Game 2
The second game was played the next day, October 3, at Yankee Stadium in front of a crowd of 63,600. Urban Shocker was the starting pitcher for the Yankees. With 19 wins and 11 losses, Shocker had the second-best pitching record on the team, behind the Game 1 starter, southpaw Herb Pennock. Shocker had a 3.38 ERA in 258 innings, along with 59 strikeouts in the 1926 season. The Cardinals' Game 2 starter was 39-year-old Grover Cleveland "Old Pete" Alexander, a veteran player in his 16th major league season. That season, he posted numbers considerably lower than the pitching season statistics from his prime in the late 1910s with the Philadelphia Phillies and Chicago Cubs. Alexander had compiled a 12–10 record in 200 innings, while posting a 3.05 ERA and 48 strikeouts, compared to the nearly 250 strikeouts he had in 1915 with the Phillies.
The Cardinals were first to bat in the game. After giving up a double to Rogers Hornsby, Shocker got a groundout from Jim Bottomley to end the run-scoring threat. In the Yankees' half of the inning, Mark Koenig grounded into a double play, and Babe Ruth followed by striking out. The Cardinals threatened again in the second inning, after back-to-back singles by catcher Bob O'Farrell and shortstop Tommy Thevenow. However, Alexander came to the plate and popped up to Koenig to end the inning. The Yankees scored first in the bottom of the second inning. Bob Meusel hit a single into center field, and Lou Gehrig followed by hitting a grounder to Alexander, which advanced Meusel to second base. Tony Lazzeri then hit a single to left field that scored Meusel from second. Joe Dugan followed with a single of his own, moving Lazzeri to third base. On the following play, Yankees catcher Hank Severeid struck out, and Lazzeri then attempted to steal home plate. Alexander made an error on his throw to catcher Bob O'Farrell, and Lazzeri was able to slide into home plate for the second Yankees run of the inning. O'Farrell then threw the ball to Thevenow, but the tag was late and Dugan was called safe at second base. The inning ended when Alexander struck out Shocker.
In the third inning, Taylor Douthit hit an infield single to shortstop Koenig, and Billy Southworth followed with a single to left field, advancing Douthit to second base. Hornsby laid down a sacrifice bunt to Shocker, moving each runner up a base. Bottomley hit a single into left field, scoring both Douthit and Southworth. The next two batters, Les Bell and Chick Hafey, hit into outs to conclude the inning. In the top of the seventh inning, Bob O'Farrell lined a double, and Tommy Thevenow followed with a single into left field. Pitcher Alexander popped up to Lazzeri, and Douthit followed with a fly ball to left field. Southworth then hit a three-run home run, giving the Cardinals a 5–2 advantage over the Yankees. Hornsby then grounded out to Koenig to end the inning. Gehrig, Lazzeri and Dugan all grounded out in the bottom of the seventh inning. In the top of the eighth, Bottomley hit a single into right field. Yankees manager Miller Huggins came out of the dugout and took Shocker out of the game, calling in Bob Shawkey from the bullpen to replace him. Shawkey struck out the first two batters he faced, and Bottomley was tagged out after attempting to steal second base. The Yankees could not produce any runs in their half of the inning.
In the ninth inning, Sad Sam Jones, a veteran American League pitcher, replaced Shawkey. Jones gave up an inside-the-park home run to Thevenow. Thevenow had only two other home runs in his career, both of which were inside-the-park and hit during the 1926 regular season. Jones then walked Douthit and Hornsby and gave up a single to Southworth. With the bases loaded and two outs in the top of the ninth inning, Bottomley hit a fly ball to center fielder Earle Combs. The Yankees did not score in the bottom of the ninth inning, and lost the game to the Cardinals by a 6–2 score. Alexander pitched a complete game, allowed hits in only two of the nine innings and did not allow a Yankee hit after the third inning. He also had a series-high 10 strikeouts, allowing four hits, one earned run and one walk. Meanwhile, the Yankees' starter Shocker allowed ten hits and five earned runs, including a home run, in seven innings of work. Shawkey had a perfect inning with two strikeouts, while Jones gave up two hits and allowed two walks in the ninth inning.
### Game 3
After Game 2 ended on October 3, the Yankees and Cardinals boarded trains to St. Louis, Missouri. The mayor of St. Louis, Victor J. Miller, ordered that the workday end by three the next afternoon so that the city could welcome the Cardinals at Union Station. The Cardinals players were treated like champions by fans and citizens alike. Just outside the station, Mayor Miller stood at a podium and presented club manager and player Rogers Hornsby with a brand new Lincoln sedan priced at US$4,000 and paid for by the city's top businessmen. Each member of the Cardinals' team received a new hat, a new pair of shoes, and an engraved white-gold watch valued at a manufacturer's price of $100. As the Cardinals were receiving special treatment from the people of St. Louis, fans were lining up outside Sportsman's Park with the hope of being able to purchase tickets to Game 3 for a price of $3.30.
Sportsman's Park was filled with 37,708 people on October 5 for Game 3. On the mound for the Cardinals was right-handed knuckleball pitcher Jesse Haines, a future Hall of Famer with a 13–4 record and 3.25 ERA in 183 innings in 1926. Starting for the Yankees was southpaw pitcher Dutch Ruether, who had a 14–9 record with a 4.60 ERA in 1926.
The game was rain delayed for 30 minutes during the top half of the fourth inning. Once the game resumed, the Cardinals came to bat and scored the first runs of the game. Les Bell, a .325 hitter with 17 home runs that season, led off for the Cardinals with a single to center field. Chick Hafey dropped a sacrifice bunt straight to Ruether, who then threw it to second baseman Tony Lazzeri. Bell beat Lazzeri's tag at second base and was called safe by the umpire. Bob O'Farrell was walked, and Tommy Thevenow hit a grounder to Lazzeri, who tossed it to Mark Koenig for the force out on O'Farrell at second base. Koenig then made an error on his throw to first baseman Lou Gehrig, which allowed a run to score. Then, Haines hit a Ruether pitch for a two-run home run.
The Cardinals were leading the Yankees 3–0 by the end of the inning. The Yankees failed to produce any offense in the fifth inning, but the Cardinals added to their lead by picking up a run when Billy Southworth beat the tag at home following a Jim Bottomley grounder to second base. Ruether was then replaced by Bob Shawkey, who closed out the inning by yielding two weak infield groundouts. The Yankees picked up one hit in each of the next two innings, but could not produce any runs. Yankees pitcher Myles Thomas came in to pitch a hitless eighth inning. With one out in the top of the ninth inning, Lou Gehrig hit a line drive single into right field, but Lazzeri grounded into a double play, ending the game as a 4–0 Cardinals victory. Haines pitched a complete-game shutout and gave up only five hits, two of which came from Gehrig.
### Game 4
Future Hall of Famer Waite Hoyt started Game 4 for the Yankees at Sportsman's Park on October 6. Hoyt had a 16–12 record with a 3.85 ERA in 218 innings for the 1926 season. This was Hoyt's fourth World Series with the Yankees, and he entered the 1926 Series with over 35 innings of pitching experience in the championship series. He was opposed by Flint Rhem, the Cardinals' 20-game winner who had led the team with both a .741 winning percentage and 258 innings pitched.
In the first inning, after striking out Earle Combs and Mark Koenig, Rhem gave up a home run to Babe Ruth. Bob Meusel was then walked, but was tagged out at home after attempting to score on a Lou Gehrig single. The Cardinals opened the bottom of the first with two straight singles, putting runners at first and third base. Rogers Hornsby singled in Taylor Douthit to tie the game at 1–1 and moved Billy Southworth to second base. Jim Bottomley flied out to left field, and Les Bell followed with a fly out to center fielder Combs. With the go-ahead run at third base, Hornsby stole second, but Chick Hafey struck out to end the Cardinals' run-scoring threat. Two innings later, Ruth came up to the plate with two outs and hit Rhem's pitch for a home run, his second of the game. Gehrig led off the next inning with a strikeout. Tony Lazzeri followed with a walk, and Joe Dugan hit a run-scoring double. Catcher Hank Severeid hit a single into center field, and Dugan ran towards home. He was tagged out at the plate by catcher Bob O'Farrell. The Yankees' starter Hoyt struck out to end the inning.
The Cardinals responded by scoring three runs in the bottom of the fourth inning. With one out and no runners on the bases, Hafey hit a single. O'Farrell followed and hit a ground ball towards Koenig that he bobbled, enabling O'Farrell and Hafey to reach first and second base, respectively. Tommy Thevenow followed with a double to right field that got by Meusel, scoring Hafey and moving O'Farrell to third base. Cardinals' manager Rogers Hornsby then put in left-handed infielder Specs Toporczer to pinch hit for Rhem, who was done pitching for the game. Toporczer hit a fly ball to Earle Combs in center field, upon which O'Farrell promptly tagged up to score another Cardinal run. With the game tied at three apiece and a runner at second base, Douthit hit a double in the outfield, which scored Thevenow. Southworth followed with a single to left fielder Ruth, and Douthit immediately tried to score. Ruth threw from left field to catcher Hank Severeid, who tagged Douthit out at home plate.
To start the top of the fifth inning, Art Reinhart was put in as pitcher. Reinhart walked Combs and followed by giving up a run-scoring double to Koenig. He then walked Ruth and Meusel in succession to load the bases for Gehrig. Reinhart walked Gehrig, allowing Koenig to score and keeping the bases loaded with no outs. Hi Bell replaced Reinhart as pitcher, but he was not able to suppress the Yankees' offense. Lazzeri hit a sacrifice fly to right field, which scored Ruth and moved Meusel up to third base. Dugan then hit a weak groundball; he was thrown out at first by catcher O'Farrell, but Meusel scored and Gehrig went to second base. Bell then balked, moving Gehrig to third base. Severeid was walked, and pitcher Hoyt ended the inning by hitting into a force play at second base.
The Yankees expanded on their three-run lead in the next inning. After the entire Yankees lineup batted in the fifth inning, Combs was back at the plate to start the sixth. Combs hit an infield single past shortstop Thevenow. Koenig followed by striking out. Ruth, with two home runs already in the game, came up to the plate. The count on Ruth went to three balls and two strikes before he hit a long home run. Ruth's three home runs in a single World Series game were a feat equaled only four times since. As one of the game announcers (either McNamee or Carlin) described the situation:
> The Babe is up. Two home runs today. One ball, far outside. Babe's shoulders look as if there is murder in them down there, the way he is swinging that bat down there. A high foul into the left-field stands. That great big bat of Babe's looks like a toothpick down there, he is so big himself. Here it is. Babe shot a bad one and fouled it. Two strikes and one ball. The outfield have all moved very far towards right. It is coming up now. A little too close. Two strikes and two balls. He has got two home runs and a base on balls so far today. Here it is, and a ball. Three and two. The Babe is waving that wand of his over the plate. Bell is loosening up his arm. The Babe is hit clear into the center-field bleachers for a home run! For a home run! Did you hear what I said? Where is that fellow who told me not to talk about Ruth anymore? Send him up here.
>
> Oh what a shot! Directly over second. The boys are all over him over there. One of the boys is riding on Ruth's back. Oh, what a shot! Directly over second base far into the bleachers out in center field, and almost on a line and then that dumbbell, where is he, who told me not to talk about Ruth! Oh, boy! Not that I love Ruth, but oh, how I love to see a shot like that! Wow! That is a world's series record, three home runs in one world's series game and what a home run! That was probably the longest hit ever in Sportsman's Park. They tell me this is the first ball ever hit in the center-field stand. That is a mile and half from here. You know what I mean.
It was measured at over 430 feet (130 m) and had cleared the 20-foot (6.1 m) wall in center field, crashing through the window of an auto dealer across the street from the stadium. Locals claimed it was the longest home run ever hit in St. Louis. Meusel then hit a single to right field, but was tagged out as he tried to stretch it into a double. Gehrig followed with a double to the opposite field, but could not score when Lazzeri popped up to Thevenow to end the inning.
In the seventh inning, the Yankees faced a new pitcher, this time southpaw Bill Hallahan, who served as both a starter and reliever for the Cardinals. After Severeid singled and subsequently advanced on a sacrifice bunt by Hoyt, he scored on a double hit into left field by Combs. The Yankees led 10–4 and did not get any more runs or hits in the eighth or ninth inning. The Cardinals came up to bat in the bottom of the ninth inning with Hoyt trying to hold on to his six-run lead. Hornsby singled to right field and advanced to second base on the following play. He then ran home to score a run on a Les Bell single to center field. Hafey then popped up in foul territory, and Severeid made the catch. The game ended with a 10–5 score. Waite Hoyt pitched a complete game, allowing two earned runs on 14 hits while striking out eight batters. The Cardinals' five pitchers combined to give up 10 Yankee runs and 14 hits. With the series tied at two games apiece, both teams anticipated Game 5, which featured a rematch between Herb Pennock and Bill Sherdel.
#### Babe Ruth and Johnny Sylvester
The 1926 World Series produced one of the most famous anecdotes in baseball history, involving Babe Ruth and Johnny Sylvester. Sylvester was an 11-year-old boy from Essex Fells, New Jersey, who was supposedly hospitalized after falling off a horse. Sylvester asked his father to get him a baseball autographed by Babe Ruth. Prior to the start of the World Series, the boy's parents sent urgent telegrams to the Yankees in St. Louis, asking for an autographed ball. Soon, the family received an airmail package with two balls, one autographed by the entire St. Louis Cardinals team and the other with signatures from a number of Yankees players and a personal message from Ruth saying, "I'll knock a homer for you on Wednesday". After Ruth hit three home runs in Game 4 on October 6, newspapers reported that Sylvester's condition had miraculously improved. After the World Series had ended, Ruth made a highly publicized visit to Sylvester's home, during which the boy said to Ruth, "I'm sorry the Yanks lost the series". In the spring of 1927, Sylvester's uncle visited Ruth and thanked him for saving the boy's life. Ruth asked how the boy was doing and asked the uncle to give the boy his regards. After the man left, Ruth, who was seated next to a group of baseball writers, said, "Now who the hell is Johnny Sylvester?"
There have been many alternate versions of this event. One version, which was later portrayed in The Babe Ruth Story, claims that Ruth went to Sylvester's hospital bed and promised him in person that he would hit a home run for him. On October 9, Ruth followed up on Sylvester and told him he would "try to knock you another homer, maybe two today". Differing newspaper reports from October 1926 claimed that Sylvester suffered from blood poisoning, a spinal infection, a sinus condition, or a condition requiring a spinal fusion. Contemporary analyses dispute whether Sylvester was ever hospitalized or dying, and whether Ruth's three home runs actually saved the boy's life, as the newspapers claimed.
### Game 5
Game 5, played at Sportsman's Park in St. Louis on October 7, featured a rematch between Game 1 starters Herb Pennock and Bill Sherdel. Pennock had pitched a complete game three-hitter in the 2–1 Yankees victory, while Sherdel had pitched seven innings, giving up two runs and six hits.
Through the first three innings of the fifth game, both pitchers held the opposing team to no runs and a limited number of hits. In the bottom of the fourth inning, the Cardinals cracked through Pennock's tough pitching. Jim Bottomley began by hitting a one-out double past left fielder Babe Ruth. Les Bell followed with a single to right field, scoring Bottomley. Chick Hafey then hit a fly ball caught in foul territory by Ruth, and Bell was called out while attempting to steal second base. In the top of the sixth inning, Pennock hit a line drive double into left field past Hafey. Cardinals' catcher Bob O'Farrell threw to Tommy Thevenow in hopes of picking off Pennock, who was standing a considerable distance away from second base. Thevenow made an error with his tag on Pennock, and Pennock was safe at second base. Earle Combs, the Yankees leadoff hitter, followed by drawing a walk. With runners at first and second base, Koenig hit a single to left fielder Hafey. Pennock scored on the play, and Combs moved to second base. Ruth then struck out, and Bob Meusel followed with a fly out to right fielder Billy Southworth, on which Combs advanced to third base. Lou Gehrig drew a walk to load the bases for Tony Lazzeri, who ended the inning by hitting a fly ball to center fielder Wattie Holm.
The Cardinals came back to take the lead in the bottom of the seventh inning. Bell led the inning by hitting a double into left field. After a Hafey fly out, O'Farrell hit a single to Ruth in left field, and Bell ran from second base to home to score the run and give the Cardinals a 2–1 advantage. In the top of the ninth inning, the Yankees tied up the game. Gehrig lined a double to left field, and Lazzeri bunted a single, advancing Gehrig to third base. Ben Paschal went in as a pinch-hitter for Joe Dugan and singled into center field, scoring Gehrig and advancing Lazzeri to second base. Severeid laid down a weak bunt, and Cardinals catcher O'Farrell threw to third base to make the force out on Lazzeri. With runners at first and second base, Pennock hit a groundball to shortstop Thevenow, who tossed it to second base to get the force out on Severeid. With Pennock at first base and Paschal at third base, Combs grounded to second base, ending the Yankees' hope of taking the lead. The Cardinals could not break the 2–2 tie in the bottom of the ninth inning, so the game went into extra innings.
The Yankees immediately took advantage of Sherdel in the top of the tenth inning. Koenig led things off by singling into left field. Sherdel threw a wild pitch to Ruth, and Koenig advanced to second base. Ruth then walked, and Meusel followed with a sacrifice bunt straight to pitcher Sherdel. Meusel was out at first base, but Ruth and Koenig were safe at second and third base, respectively. Gehrig was intentionally walked, loading the bases. Lazzeri hit a fly ball to left field, and Koenig tagged up on the play to score a run and give the Yankees a one-run lead. Mike Gazella, in place of Joe Dugan at third base, was hit by a pitch from Sherdel. With the bases loaded again, Severeid popped up to second baseman Rogers Hornsby to end the Yankee rally. The Cardinals got a single from Thevenow in the bottom of the tenth inning, but they could not score any runs. The game ended with the Yankees winning by a score of 3–2. Both Pennock and Sherdel pitched ten-inning complete games. Sherdel gave up nine hits and two earned runs, while walking five and striking out two. Pennock finished the game giving up just seven hits and two runs, while striking out four batters.
### Game 6
The teams moved back to Yankee Stadium for Game 6. Over 48,000 fans came into Yankee Stadium on October 9 to see if the Yankees could win their second World Series in franchise history. The game's pitching matchup was between Grover Cleveland Alexander and Bob Shawkey, both of whom had made appearances in previous games in the series. Shawkey had come in as relief in Games 2 and 3, while Alexander had pitched a complete game against the Yankees in the Cardinals' Game 2 victory. In the 1926 season, Shawkey had made most of his pitching appearances in relief, and had been an occasional starter on the Yankees rotation. He started 10 of his 29 total pitching appearances and posted an 8–7 record with a 3.62 earned run average.
The game was lopsided from the start. In the top of the first inning, Shawkey gave up three runs on three hits, with the runs coming from a Jim Bottomley double and Les Bell single. Alexander encountered a minor setback in the fourth inning. To open up the bottom of the inning, Bob Meusel launched a triple into left field and scored on the following ground out by Lou Gehrig. Alexander shut down the Yankees for the rest of the inning, and the Cardinals held on to a 3–1 lead. In the top of the fifth inning, the Cardinals expanded their two-run lead. Tommy Thevenow hit a single to left fielder Babe Ruth. Alexander laid down a sacrifice bunt and was tagged out by first baseman Gehrig, but was successful in advancing Thevenow to second base. Wattie Holm, substituting for Taylor Douthit as center fielder, followed by hitting a single into center field, scoring Thevenow on the play. Billy Southworth and Rogers Hornsby followed with groundouts in the infield to end the inning.
The Cardinals scored again in the top of the seventh inning. Thevenow again led the inning by hitting a single into left field. Alexander bunted right in front of the plate. Yankees catcher Hank Severeid made the throw to second baseman Tony Lazzeri, but Lazzeri made an error on the play, and both runners were safe at their respective bases. Holm followed by hitting a weak grounder that led to a force out of Thevenow at third base. With runners at first and second base, Southworth lined a double right by Ruth, scoring Alexander and sending Holm to third base. Urban Shocker, the starter in Game 2, then came in to relieve Shawkey as pitcher. Shocker gave up a single to Hornsby into center field, allowing Holm and Southworth to score. Bottomley then hit a grounder to shortstop Mark Koenig, who stepped on second base to get Hornsby out on the force play. Bell followed with a two-run home run, extending the Cardinals' lead to 9–1. Chick Hafey lined a double into left field, but Bob O'Farrell ended the inning by striking out. In the bottom of the seventh inning, the Yankees scored one run on an Earle Combs single to cut the Cardinals' lead to seven runs.
In the eighth inning, Myles Thomas came in to relieve Shocker, who had given up three hits and two unearned runs in less than an inning of work. Meanwhile, Alexander shut down the Yankees offense for the rest of the game. In the top of the ninth inning, the Cardinals increased their lead back to eight runs after Hornsby had an RBI groundout, scoring Southworth. Alexander finished with his second complete game of the series and gave up only two runs on eight hits, while striking out six batters. The three Yankee pitchers combined to give up 13 hits, seven earned runs, three unearned runs, and one home run.
### Game 7
The deciding Game 7 was played on October 10, 1926, at Yankee Stadium in front of a crowd of 38,093 people. The game featured two future Hall of Famers, who were both winners in their respective pitching appearances earlier in the series. Jesse Haines took to the mound for the Cardinals; he had pitched in relief in Game 1 and threw a complete-game shutout against the Yankees in Game 3. Waite Hoyt had pitched a complete-game 10–5 Yankees victory in Game 4.
The Yankees scored the first run of the game in the third inning on a Babe Ruth home run into the right field bleachers. In the following half inning, the Cardinals came back to take a 3–1 lead over the Yankees. Jim Bottomley lined a one-out single into left field to start the Cardinals' fourth-inning rally. Les Bell just barely made it to first base after shortstop Mark Koenig accidentally kicked the ball while trying to field it. With runners at first and second base, Chick Hafey hit a bloop single into left field, which loaded the bases for catcher Bob O'Farrell. This time, left fielder Bob Meusel made an error by dropping O'Farrell's fly ball, so Bottomley scored to tie the game, and the bases remained loaded. Tommy Thevenow followed with a two-run single to right fielder Ruth. Hoyt struck out the next batter, and Wattie Holm hit into a force play at second base. All three runs in the inning were unearned, owing to the two Yankee errors.
In the sixth inning, the Yankees cut into the Cardinals' lead. With two outs, Joe Dugan hit a single. Hank Severeid followed with a double, scoring Dugan, before pinch-hitter Ben Paschal grounded to Haines to end the inning. Herb Pennock, the winner of Games 1 and 5, came on in relief of Hoyt in the seventh inning. He yielded only one hit in the inning and kept the Cardinals' lead at 3–2. In the bottom half of the inning, the Yankees loaded the bases with Earle Combs, Ruth and Lou Gehrig. At this point there were two outs, and Haines had developed a blister on his pitching hand and could no longer pitch in the game.
Rogers Hornsby had to determine whom he would put in to replace Haines as pitcher. Although Grover Cleveland Alexander had pitched a complete game the day before and may have spent the night drinking (Alexander later denied this, saying that Hornsby specifically told him to limit his celebrating since he might be needed the next day), Hornsby decided to trust him after Alexander said he "had it easy in there" in Game 6 and would be ready whenever Hornsby needed him. According to the popular legend, Alexander told Hornsby his strategy: after getting a strike on Lazzeri, Alexander would throw an inside fastball. Hornsby warned him that that pitch was Lazzeri's favorite. Alexander responded that if Lazzeri swung at it, he would hit it foul, and Alexander would then throw an outside curve to strike him out. Hornsby then supposedly said, "Who am I to tell you how to pitch?" The first two pitches thrown by Alexander to batter Tony Lazzeri went for a strike and a ball, respectively. On the third pitch, Lazzeri hit a fly ball down the left-field line. The ball initially appeared to be headed into the stands for a grand slam, but at the last second it curved several feet foul into the stands. Alexander then threw a curveball that Lazzeri swung at and missed for strike three, ending the inning and the Yankees' threat.
Alexander retired the Yankees in order in the eighth inning. The Cardinals did nothing offensively in the top of the ninth inning, so it was up to Alexander to preserve the Cardinals' one-run lead in the bottom of the ninth. Alexander got the first two batters of the inning, Combs and Dugan, to ground out to third baseman Bell. With two outs and no runners on base, Alexander faced Ruth. Ruth had hit a solo home run and walked three times in the game. Manager Hornsby walked to the mound to talk with Alexander. Alexander told Hornsby that he would rather face Ruth than intentionally walk him. Alexander's first pitch to Ruth fell in for a solid strike in the middle of the plate. Alexander's next pitch fell outside of the strike zone for ball one. Ruth then fouled off the next pitch, making the count one ball and two strikes. Alexander's next two pitches fell too low for balls two and three, making it a full count. The full-count pitch that followed was described by New York Herald Tribune sportswriter W. O. McGeehan: "The count went to three and two, Ruth was swaying eagerly. The soupbone creaked again. The ball seemed a fraction of an inch from being a strike. Ruth paused a moment. Even he was uncertain. Then he trotted down to first."
With two outs and Ruth at first base, left fielder Bob Meusel came up to the plate, with Lou Gehrig waiting in the on-deck circle behind him. Meusel was a .315 hitter that year and had batted in 81 runs in just over 100 regular-season games. Meusel had also had success against Alexander in Game 6, with a double and a triple. Just as Meusel was about to take his first pitch, Ruth made the bold move of trying to steal second base. Ruth was known as a good but overly aggressive baserunner, with about a 50% success rate at stealing bases in his career, and his attempt surprised many people throughout the stadium. Meusel swung and missed, and Cardinals catcher Bob O'Farrell immediately threw the ball to second baseman Hornsby. Hornsby caught the throw and immediately laid the tag on Ruth. As the game announcer described it, "Ruth is walked again for the fourth time today. One strike on Bob Meusel. Going down to second! The game is over! Babe tried to steal second and is put out catcher to second!"
As Hornsby recalled later, Ruth "didn't say a word. He didn't even look around or up at me. He just picked himself up and walked away". Ruth's failed attempt to steal second base ended the 1926 World Series; it is the only time a World Series has ended with a runner being caught stealing. Ruth explained later that he attempted to steal second base because he thought no one would expect it. He hoped that by getting to second base, he could have an easier chance at scoring if Meusel hit a single into the outfield.
## Aftermath
The Cardinals went back home to St. Louis to a rapturous fan reception, having won their first undisputed world championship. Each member of the championship team collected $5,584.51, while the Yankees' players were given $3,417.75 each.
To date, the Cardinals' 11 world championships are the most won by any National League team, and rank second only to the Yankees' 27. The Cardinals' and Yankees' last wins were within two years of each other (having occurred in 2011 and 2009, respectively). The two teams would meet again in 1928 (which the Yankees swept in four games); 1942 (which the Cardinals won in five games); 1943 (which the Yankees won in five games); and 1964 (which the Cardinals won in seven games).
As for the Yankees, Game 7 of the 1926 series marked the last postseason loss for the team in a decade. The Bronx Bombers would go on to sweep their next three World Series: 1927, 1928, and 1932. Their next World Series loss came in Game 1 of the 1936 World Series, which the Yankees went on to win four games to two.
## Composite line score
## See also
- 1926 Negro World Series
- 1928 World Series, the second World Series match-up between the Cardinals and the Yankees
# George S. Patton slapping incidents
In early August 1943, Lieutenant General George S. Patton slapped two United States Army soldiers under his command during the Sicily Campaign of World War II. Patton's hard-driving personality and lack of belief in the medical condition of combat stress reaction, then known as "battle fatigue" or "shell shock", led to the soldiers' becoming the subject of his ire in incidents on August 3 and 10, when Patton struck and berated them after discovering they were patients at evacuation hospitals away from the front lines without apparent physical injuries.
Word of the incidents spread, eventually reaching Patton's superior, General Dwight D. Eisenhower, who ordered him to apologize to the men. Patton's actions were initially suppressed in the news until journalist Drew Pearson publicized them in the United States. The reactions of the U.S. Congress and the general public were divided between support and disdain for Patton's actions. Eisenhower and Army Chief of Staff George C. Marshall opted not to fire Patton as a commander.
Seizing the opportunity the predicament presented, Eisenhower used Patton as a decoy in Operation Fortitude, sending faulty intelligence to German agents that Patton was leading Operation Overlord, the Allied invasion of Europe. While Patton eventually returned to combat command in the European Theater in mid-1944, the slapping incidents were seen by Eisenhower, Marshall, and other leaders to be examples of Patton's brashness and impulsivity.
## Background
The Allied invasion of Sicily began on July 10, 1943, with Lieutenant General George S. Patton leading 90,000 men of the Seventh United States Army in a landing near Gela, Scoglitti, and Licata to support Bernard Montgomery's British 8th Army landings to the north. Initially ordered to protect the British forces' flank, Patton took Palermo after Montgomery's forces were slowed by heavy resistance from troops of Nazi Germany and the Kingdom of Italy. Patton then set his sights on Messina. He sought an amphibious assault, but it was delayed by lack of landing craft and his troops did not land in Santo Stefano until August 8, by which time the Germans and Italians had already evacuated the bulk of their troops to mainland Italy. Throughout the campaign, Patton's troops were heavily engaged by German and Italian forces as they pushed across the island.
Patton had already developed a reputation in the U.S. Army as an effective, successful, and hard-driving commander, punishing subordinates for the slightest infractions but also rewarding them when they performed well. As a way to promote an image that inspired his troops, Patton created a larger-than-life personality. He became known for his flashy dress, highly polished helmet and boots, and no-nonsense demeanor. General Dwight D. Eisenhower, the commander of the Sicily operation and Patton's friend and commanding officer, had long known of Patton's colorful leadership style, and also knew that Patton was prone to impulsiveness and a lack of self-restraint.
### Battle fatigue
Prior to World War I, the U.S. Army considered the symptoms of battle fatigue to be cowardice or attempts to avoid combat duty, and soldiers who reported them received harsh treatment. "Shell shock" was diagnosed as a medical condition during World War I, but even before that conflict ended, the understanding of the condition was changing, moving away from the idea that it was caused by the physical shock of exploding shells. By World War II, soldiers were usually diagnosed with "psychoneurosis" or "combat fatigue", although "shell shock" remained in the popular vocabulary, and the symptoms covered by combat fatigue were broader than those that had constituted shell shock in World War I. At the time of the invasion of Sicily, the U.S. Army initially classified all psychological casualties as "exhaustion", which many still called shell shock. While the causes, symptoms, and effects of the condition were familiar to physicians by the time of the two incidents, it was generally less understood in military circles.
An important lesson from the Tunisia Campaign was that neuropsychiatric casualties had to be treated as soon as possible and not evacuated from the combat zone. This was not done in the early stages of the Sicilian Campaign, and large numbers of neuropsychiatric casualties were evacuated to North Africa, with the result that treatment became complicated and only 15 percent of them were returned to duty. As the campaign wore on, the system became better organized and nearly 50 percent were restored to combat duty.
Some time before what would become known as the "slapping incident," Patton spoke with Major General Clarence R. Huebner, the newly appointed commander of the U.S. 1st Infantry Division, in which both men served. Patton had asked Huebner for a status report; Huebner replied: "The front lines seem to be thinning out. There seems to be a very large number of 'malingerers' at the hospitals, feigning illness in order to avoid combat duty." For his part, Patton did not believe the condition was real. In a directive issued to commanders on August 5, he forbade "battle fatigue" in the Seventh Army:
> It has come to my attention that a very small number of soldiers are going to the hospital on the pretext that they are nervously incapable of combat. Such men are cowards and bring discredit on the army and disgrace to their comrades, whom they heartlessly leave to endure the dangers of battle while they, themselves, use the hospital as a means of escape. You will take measures to see that such cases are not sent to the hospital but dealt with in their units. Those who are not willing to fight will be tried by court-martial for cowardice in the face of the enemy.
## Incidents
### August 3
Private Charles H. Kuhl, 27, of L Company, U.S. 26th Infantry Regiment, reported to an aid station of C Company, 1st Medical Battalion, on August 2, 1943. Kuhl, who had been in the U.S. Army for eight months, had been attached to the 1st Infantry Division since June 2, 1943. He was diagnosed with "exhaustion," a diagnosis he had been given three times since the start of the campaign. From the aid station, he was evacuated to a medical company and given sodium amytal. Notes in his medical chart indicated "psychoneurosis anxiety state, moderately severe (soldier has been twice before in hospital within ten days. He can't take it at the front, evidently. He is repeatedly returned.)" Kuhl was transferred from the aid station to the 15th Evacuation Hospital near Nicosia for further evaluation.
Patton arrived at the hospital the same day, accompanied by a number of medical officers, as part of his tour of the U.S. II Corps troops. He spoke to some patients in the hospital, commending the physically wounded. He then approached Kuhl, who did not appear to be physically injured. Kuhl was sitting slouched on a stool midway through a tent ward filled with injured soldiers. When Patton asked Kuhl where he was hurt, Kuhl reportedly shrugged and replied that he was "nervous" rather than wounded, adding, "I guess I can't take it." Patton "immediately flared up," slapped Kuhl across the chin with his gloves, then grabbed him by the collar and dragged him to the tent entrance. He shoved him out of the tent with a kick to his backside. Yelling "Don't admit this son of a bitch," Patton demanded that Kuhl be sent back to the front, adding, "You hear me, you gutless bastard? You're going back to the front."
Corpsmen picked up Kuhl and brought him to a ward tent, where it was discovered he had a temperature of 102.2 °F (39.0 °C); he was later diagnosed with malarial parasites. Speaking later of the incident, Kuhl noted that "at the time it happened, [Patton] was pretty well worn out ... I think he was suffering a little battle fatigue himself." Kuhl wrote to his parents about the incident, but asked them to "just forget about it." That night, Patton recorded the incident in his diary: "[I met] the only errant coward I have ever seen in this Army. Companies should deal with such men, and if they shirk their duty, they should be tried for cowardice and shot."
Patton was accompanied on this visit by Major General John P. Lucas, who saw nothing remarkable about the incident. After the war, Lucas wrote:
> There are always a certain number of such weaklings in any Army, and I suppose the modern doctor is correct in classifying them as ill and treating them as such. However, the man with malaria doesn't pass his condition on to his comrades as rapidly as does the man with cold feet nor does malaria have the lethal effect that the latter has.
War correspondent Noel Monks also heard Patton angrily claim that shell shock was "an invention of the Jews."
### August 10
Private Paul G. Bennett, 21, of C Battery, U.S. 17th Field Artillery Regiment, was a four-year veteran of the U.S. Army, and had served in the division since March 1943. Records show he had no medical history until August 6, 1943, when a friend was wounded in combat. According to a report, he "could not sleep and was nervous." Bennett was brought to the 93rd Evacuation Hospital. In addition to having a fever, he exhibited symptoms of dehydration, including fatigue, confusion, and listlessness. His request to return to his unit was turned down by medical officers. A medical officer described Bennett's condition:
> The shells going over him bothered him. The next day he was worried about his buddy and became more nervous. He was sent down to the rear echelon by a battery aid man and there the medical aid man gave him some tranquilizers that made him sleep, but still he was nervous and disturbed. On the next day the medical officer ordered him to be evacuated, although the boy begged not to be evacuated because he did not want to leave his unit.
On August 10, Patton entered the receiving tent of the hospital, speaking to the injured there. Patton approached Bennett, who was huddled and shivering, and asked what the trouble was. "It's my nerves," Bennett responded. "I can't stand the shelling anymore." Patton reportedly became enraged at him, slapping him across the face. He began yelling: "Your nerves, Hell, you are just a goddamned coward. Shut up that goddamned crying. I won't have these brave men who have been shot at seeing this yellow bastard sitting here crying." Patton then reportedly slapped Bennett again, knocking his helmet liner off, and ordered the receiving officer, Major Charles B. Etter, not to admit him. Patton then threatened Bennett, "You're going back to the front lines, and you may get shot and killed, but you're going to fight. If you don't, I'll stand you up against a wall and have a firing squad kill you on purpose. In fact, I ought to shoot you myself, you goddamned whimpering coward." Upon saying this, Patton pulled out his pistol threateningly, prompting the hospital's commander, Colonel Donald E. Currier, to physically separate the two. Patton left the tent, yelling to medical officers to send Bennett back to the front lines.
As he toured the remainder of the hospital, Patton continued discussing Bennett's condition with Currier. Patton stated, "I can't help it. It makes my blood boil to think of a yellow bastard being babied," and "I won't have those cowardly bastards hanging around our hospitals. We'll probably have to shoot them some time anyway, or we'll raise a breed of morons."
## Aftermath
### Private reprimand and apologies
The August 10 incident, particularly the sight of Patton threatening a subordinate with a pistol, upset many of the medical staff present. The II Corps surgeon, Colonel Richard T. Arnest, submitted a report on the incident to Brigadier General William B. Kean, chief of staff of II Corps, who submitted it to Lieutenant General Omar Bradley, commander of II Corps. Bradley, out of loyalty to Patton, did nothing more than lock the report in his safe. Arnest also sent the report through medical channels to Brigadier General Frederick A. Blesse, General Surgeon of Allied Force Headquarters, who then submitted it to Eisenhower, who received it on August 16. Eisenhower ordered Blesse to proceed immediately to Patton's command to ascertain the truth of the allegations. Eisenhower also assembled a delegation, including Major General John P. Lucas, two colonels from the Inspector General's office, and a theater medical consultant, Lieutenant Colonel Perrin H. Long, to investigate the incident and interview those involved. Long interviewed medical personnel who witnessed each incident, then filed a report entitled "Mistreatment of Patients in Receiving Tents of the 15th and 93rd Evacuation Hospitals" which extensively detailed Patton's actions at both hospitals.
By August 18, Eisenhower had ordered that Patton's Seventh Army be broken up, with a few of its units remaining garrisoned in Sicily. The majority of its combat forces would be transferred to the Fifth United States Army under Lieutenant General Mark W. Clark. This had already been planned by Eisenhower, who had previously told Patton that his Seventh Army would not be part of the upcoming Allied invasion of Italy, scheduled for September. On August 20, Patton received a cable from Eisenhower regarding the arrival of Lucas at Palermo. Eisenhower told Patton it was "highly important" that he personally meet with Lucas as soon as possible, as Lucas would be carrying an important message. Before Lucas reached Palermo, Blesse arrived from Algiers to look into the health of the troops in Sicily. He had also been ordered by Eisenhower to deliver a secret letter to Patton and investigate its allegations. In the letter, Eisenhower told Patton he had been informed of the slapping incidents. He said he would not be opening a formal investigation into the matter, but his criticism of Patton was sharp.
Eisenhower's letter to Patton, dated August 17, 1943:
> I clearly understand that firm and drastic measures are at times necessary in order to secure the desired objectives. But this does not excuse brutality, abuse of the sick, nor exhibition of uncontrollable temper in front of subordinates. ... I feel that the personal services you have rendered the United States and the Allied cause during the past weeks are of incalculable value; but nevertheless if there is a very considerable element of truth in the allegations accompanying this letter, I must so seriously question your good judgment and your self-discipline as to raise serious doubts in my mind as to your future usefulness.
Eisenhower noted that no formal record of the incidents would be retained at Allied Headquarters, save in his own secret files. Still, he strongly suggested Patton apologize to all involved. On August 21, Patton brought Bennett into his office; he apologized and the men shook hands. On August 22, he met with Currier as well as the medical staff who had witnessed the events in each unit and expressed regret for his "impulsive actions." Patton related to the medical staff a story of a friend from World War I who had committed suicide after "skulking"; he stated he sought to prevent any recurrence of such an event. On August 23, he brought Kuhl into his office, apologized, and shook hands with him as well. After the apology, Kuhl said he thought Patton was "a great general," and that "at the time, he didn't know how sick I was." Currier later said Patton's remarks sounded like "no apology at all [but rather like] an attempt to justify what he had done." Patton wrote in his diary that he loathed making the apologies, particularly when he was told by Bennett's brigade commander, Brigadier General John A. Crane, that Bennett had gone absent without leave (AWOL) and arrived at the hospital by "falsely representing his condition." Patton wrote, "It is rather a commentary on justice when an Army commander has to soft-soap a skulker to placate the timidity of those above." As word of the actions had spread informally among troops of the Seventh Army, Patton drove to each division under his command between August 24 and 30 and gave a 15-minute speech in which he praised their behavior and apologized for any instances where he had been too harsh on soldiers, making only vague reference to the two slapping incidents. In his final apology speech to the U.S. 3rd Infantry Division, Patton was overcome with emotion when the soldiers supportively began to chant "No, general, no, no," to prevent him from having to apologize.
In a letter to General George Marshall on August 24, Eisenhower praised Patton's exploits as commander of the Seventh Army and his conduct of the Sicily campaign, particularly his ability to take initiative as a commander. Still, Eisenhower noted Patton continued "to exhibit some of those unfortunate traits of which you and I have always known." He informed Marshall of the two incidents and his requirement that Patton apologize. Eisenhower stated he believed Patton would cease his behavior "because fundamentally, he is so avid for recognition as a great military commander that he will ruthlessly suppress any habit of his that will tend to jeopardize it." When Eisenhower arrived in Sicily to award Montgomery the Legion of Merit on August 29, Patton gave Eisenhower a letter expressing his remorse about the incidents.
### Media attention
Word of the slapping incidents spread informally among soldiers before eventually circulating to war correspondents. One of the nurses who witnessed the August 10 incident apparently told her boyfriend, a captain in the Seventh Army public affairs detachment. Through him, a group of four journalists covering the Sicily operation heard of the incident: Demaree Bess of the Saturday Evening Post, Merrill Mueller of NBC News, Al Newman of Newsweek, and John Charles Daly of CBS News. The four journalists interviewed Etter and other witnesses, but decided to bring the matter to Eisenhower instead of filing the story with their editors. Bess, Mueller, and Quentin Reynolds of Collier's Magazine flew from Sicily to Algiers, and on August 19, Bess gave a summary of the slapping incidents to Eisenhower's chief of staff, Major General Walter Bedell Smith. The reporters asked Eisenhower directly about the incident, and Eisenhower requested that the story be suppressed because the war effort could not afford to lose Patton. Bess and the other journalists initially complied; however, the reporters then demanded that Eisenhower fire Patton in exchange for not reporting the story, a demand which Eisenhower refused.
The story of Kuhl's slapping broke in the U.S. when newspaper columnist Drew Pearson revealed it on his November 21 radio program. Pearson received details of the Kuhl incident and other material on Patton from his friend Ernest Cuneo, an official with the Office of Strategic Services, who obtained the information from War Department files and correspondence. Pearson's version not only conflated details of both slapping incidents but falsely reported that the private in question was visibly "out of his head," telling Patton to "duck down or the shells would hit him" and that in response "Patton struck the soldier, knocking him down." Pearson punctuated his broadcast by twice stating that Patton would never again be used in combat, despite the fact that Pearson had no factual basis for this prediction. In response, Allied Headquarters denied that Patton had received an official reprimand, but confirmed that Patton had slapped at least one soldier.
Patton's wife, Beatrice Patton, spoke to the media to defend him. She appeared in True Confessions, a women's confession magazine, where she characterized Patton as "the toughest, most hard boiled General in the U.S. Army ... but he's quite sweet, really." She was featured in a Washington Post article on November 26. While she did not attempt to justify Patton's action, she characterized him as a "tough perfectionist," stating that he cared deeply about the men under his command and would not ask them to do something he would not do himself:
> He had been known to weep at men's graves—as well as tear their hides off. The deed is done and the mistake made, and I'm sure Georgie is sorrier and has punished himself more than anyone could possibly realize. I've known George Patton for 31 years and I've never known him to be deliberately unfair. He's made mistakes—and he's paid for them. This was a big mistake, and he's paying a big price for it.
### Public response
Demands for Patton to be relieved of duty and sent home were made in Congress and in newspapers across the country. U.S. Representative Jed Johnson of Oklahoma's 6th district described Patton's actions as a "despicable incident" and was "amazed and chagrined" that Patton was still in command. He called for the general's immediate dismissal on the grounds that his actions rendered him no longer useful to the war effort. Representative Charles B. Hoeven of Iowa's 9th district said on the House floor that parents of soldiers need no longer worry about their children being abused by "hard boiled officers." He wondered whether the Army had "too much blood and guts." Eisenhower submitted a report to Secretary of War Henry L. Stimson, who presented it to Senator Robert R. Reynolds, Chairman of the Senate Committee on Military Affairs. The report laid out Eisenhower's response to the incident and gave details of Patton's decades of military service. Eisenhower concluded that Patton was invaluable to the war effort and that he was confident the corrective actions taken would be adequate. Investigators Eisenhower sent to Patton's command found the general remained overwhelmingly popular with his troops.
By mid-December, the government had received around 1,500 letters related to Patton, with many calling for his dismissal and others defending him or calling for his promotion. Kuhl's father, Herman F. Kuhl, wrote to his own congressman, stating that he forgave Patton for the incident and requesting that he not be disciplined. Retired generals also weighed in on the matter. Former Army Chief of Staff Charles P. Summerall wrote to Patton that he was "indignant about the publicity given a trifling incident," adding that "whatever [Patton] did" he was sure it was "justified by the provocation. Such cowards used to be shot, now they are only encouraged." Major General Kenyon A. Joyce, another combat commander and one of Patton's friends, attacked Pearson as a "sensation mongerer," stating that "niceties" should be left for "softer times of peace." In one notable dissension, Patton's friend, former mentor and General of the Armies John J. Pershing publicly condemned his actions, an act that left Patton "deeply hurt" and caused him to never speak to Pershing again.
After consulting with Marshall, Stimson, and Assistant Secretary of War John J. McCloy, Eisenhower retained Patton in the European theater, though his Seventh Army saw no further combat. Patton remained in Sicily for the rest of the year. Marshall and Stimson not only supported Eisenhower's decision, but defended it. In a letter to the U.S. Senate, Stimson stated that Patton must be retained because of the need for his "aggressive, winning leadership in the bitter battles which are to come before final victory." Stimson acknowledged retaining Patton was a poor move for public relations but remained confident it was the right decision militarily.
## Effect on plans for invasion of Europe
Contrary to his statements to Patton, Eisenhower never seriously considered removing the general from duty in the European Theater. Writing of the incident before the media attention, he said, "If this thing ever gets out, they'll be howling for Patton's scalp, and that will be the end of Georgie's service in this war. I simply cannot let that happen. Patton is indispensable to the war effort – one of the guarantors of our victory." Still, following the capture of Messina in August 1943, Patton did not command a force in combat for 11 months.
Patton was passed over to lead the invasion in northern Europe. In September, Bradley—Patton's junior in both rank and experience—was selected to command the First United States Army that was forming in England to prepare for Operation Overlord. According to Eisenhower, this decision had been made months before the slapping incidents became public knowledge, but Patton felt they were the reason he was denied the command. Eisenhower had already decided on Bradley because he felt the invasion of Europe was too important to risk any uncertainty. While Eisenhower and Marshall both considered Patton to be a superb corps-level combat commander, Bradley possessed two of the traits that a theater-level strategic command required, and that Patton conspicuously lacked: a calm, reasoned demeanor, and a meticulously consistent nature. The slapping incidents had only further confirmed to Eisenhower that Patton lacked the ability to exercise discipline and self-control at such a command level. Still, Eisenhower re-emphasized his confidence in Patton's skill as a ground combat commander by recommending him for promotion to four-star general in a private letter to Marshall on September 8, noting his previous combat exploits and admitting that he had a "driving power" that Bradley lacked.
By mid-December, Eisenhower had been appointed Supreme Allied Commander in Europe and moved to England. As media attention surrounding the incident began to subside, McCloy told Patton he would indeed be eventually returning to combat command. Patton was briefly considered to lead the Seventh Army in Operation Dragoon, but Eisenhower felt his experience would be more useful in the Normandy campaign. Eisenhower and Marshall privately agreed that Patton would command a follow-on field army after Bradley's army conducted the initial invasion of Normandy; Bradley would then command the resulting army group. Patton was told on January 1, 1944 only that he would be relieved of command of the Seventh Army and moved to Europe. In his diary, he wrote that he would resign if he was not given command of a field army. On January 26, 1944, formally given command of a newly arrived unit, the Third United States Army, he went to the United Kingdom to prepare the unit's inexperienced soldiers for combat. This duty occupied Patton throughout early 1944.
Exploiting Patton's situation, Eisenhower sent him on several high-profile trips throughout the Mediterranean in late 1943. He traveled to Algiers, Tunis, Corsica, Cairo, Jerusalem, and Malta in an effort to confuse German commanders as to where the Allied forces might next attack. By the next year, the German High Command still had more respect for Patton than for any other Allied commander and considered him central to any plan to invade Europe from the north. Because of this, Patton was made a central figure in Operation Fortitude in early 1944. The Allies fed the German intelligence organizations, through double agents, a steady stream of false intelligence that Patton had been named commander of the First United States Army Group (FUSAG) and was preparing for an invasion of Pas de Calais. The FUSAG command was actually an intricately constructed "phantom" army of decoys, props and radio signals based around southeast England to mislead German aircraft and to make Axis leaders believe a large force was massing there. Patton was ordered to keep a low profile to deceive the Germans into thinking he was in Dover throughout early 1944, when he was actually training the Third Army. As a result of Operation Fortitude, the German 15th Army remained at Pas de Calais to defend against the expected attack. The formation remained there even after the invasion of Normandy on June 6, 1944.
The following month, in July 1944, Patton and the Third Army finally traveled to mainland Europe, entering combat on August 1.
# Paint It Black
"Paint It Black" is a song by the English rock band the Rolling Stones. A product of the songwriting partnership of Mick Jagger and Keith Richards, it is a raga rock song with Indian, Middle Eastern and Eastern European influences and lyrics about grief and loss. London Records released the song as a single on 7 May 1966 in the United States, and Decca Records released it on 13 May in the United Kingdom. Two months later, London Records included it as the opening track on the American version of the band's 1966 studio album Aftermath, though it is not on the original UK release.
Originating from a series of improvisational melodies played by Brian Jones on the sitar, the song features all five members of the band contributing to the final arrangement although only Jagger and Richards were credited as songwriters. In contrast to previous Rolling Stones singles with straightforward rock arrangements, "Paint It Black" has unconventional instrumentation, including a prominent sitar, the Hammond organ and castanets. This instrumental experimentation matches other songs on Aftermath. The song was influential to the burgeoning psychedelic genre as the first chart-topping single to feature the sitar, and widened the instrument's audience. Reviews of the song at the time were mixed, and some music critics believed its use of the sitar was an attempt to copy the Beatles, while others criticised its experimental style and doubted its commercial potential.
"Paint It Black" was a major chart success for the Rolling Stones, remaining 11 weeks (including two at number one) on the US Billboard Hot 100, and 10 weeks (including one atop the chart) on the Record Retailer chart in the UK. Upon a reissue in 2007, it reentered the UK Singles Chart for 11 weeks. It was the band's third number-one single in the US and sixth in the UK. The song also topped charts in Canada and the Netherlands. It received a platinum certification in the UK from the British Phonographic Industry (BPI) and from Italy's Federazione Industria Musicale Italiana (FIMI).
"Paint It Black" was inducted into the Grammy Hall of Fame in 2018, and Rolling Stone magazine ranked the song number 213 on their list of the 500 Greatest Songs of All Time. In 2011, the song was added to the Rock and Roll Hall of Fame's list of "The Songs that Shaped Rock and Roll". Many artists have covered "Paint It Black" since its initial release. It has been included on many of the band's compilation albums and several film soundtracks. It has been played on a number of Rolling Stones tours.
## Background
The Rolling Stones' popularity increased markedly in 1965 with a series of international hit singles written by lead singer Mick Jagger and guitarist Keith Richards. While 1964 had seen the band reach the top of both the albums and singles charts in their native United Kingdom, other British bands, such as the Beatles, dominated the American market. In 1965, the Stones crossed over to the American market with their first number-one single, "(I Can't Get No) Satisfaction", and their first number-one album, Out of Our Heads. That year also saw the band reach the top of the charts for the first time in countries such as Finland, Germany, and South Africa.
This success attracted the attention of Allen Klein, an American businessman who became their US representative in August while Andrew Loog Oldham, the group's manager, continued in the role of promoter and record producer. One of Klein's first actions on the band's behalf was to force Decca Records to grant a $1.2 million royalty advance to the group, bringing the members their first signs of financial wealth and allowing them to purchase country houses and new cars. Their October–December 1965 tour of North America was the group's fourth and largest tour there up to that point. According to the biographer Victor Bockris, through Klein's involvement, the concerts afforded the band "more publicity, more protection and higher fees than ever before."
By this time, the Rolling Stones had begun to respond to the increasingly sophisticated music of the Beatles, in comparison to whom they had long been promoted by Oldham as a rougher alternative. With the success of the Jagger–Richards-penned singles "(I Can't Get No) Satisfaction" (1965), "Get Off of My Cloud" (1965) and "19th Nervous Breakdown" (1966), the band increasingly rivalled the musical and cultural influence of the Beatles, and began to be identified as one of the major pillars of the British Invasion. The Stones' outspoken, surly attitude on songs like "Satisfaction" alienated the Establishment detractors of rock music, which music historian Colin King explains, "only made the group more appealing to those sons and daughters who found themselves estranged from the hypocrisies of the adult world – an element that would solidify into an increasingly militant and disenchanted counterculture as the decade wore on."
## Development
"Paint It Black" came at a pivotal period in the band's recording history. The Jagger–Richards songwriting collaboration had begun producing more original material for the band over the past year, with the early model of Stones albums featuring only a few Jagger–Richards compositions having been replaced by that of albums such as Out of Our Heads and December's Children (and Everybody's), each of which consisted of half original tracks and half cover songs. This trend culminated in the sessions for Aftermath (1966) where, for the first time, the duo penned every track on the album. Brian Jones, originally the band's founder and leader over the first few years of its existence, began feeling overshadowed by the prominence of Jagger and Richards' contributions to the group.
Despite having contributed to early songs by the Stones via the Nanker Phelge pseudonym, Jones had less and less influence over the group's direction as their popularity grew primarily as a result of original Jagger–Richards singles. Jones grew bored attempting to write songs with conventional guitar melodies. To alleviate his boredom, he began exploring Eastern instruments, specifically the Indian sitar, with the goal of bolstering the musical texture and complexity of the band's sound. A multi-instrumentalist, Jones could develop a tune on the sitar in a short time; he had a background with the instrument largely from his studies under Harihar Rao, a disciple of Ravi Shankar.
Over 1965, the sitar had become a more and more prominent instrument in the landscape of British rock. The Yardbirds had attempted to record "Heart Full of Soul" with the sitar as part of the arrangement in April; however, they had run into problems getting the instrument to "cut through" the mix, and the session musician responsible for playing the instrument had trouble staying within the 4/4 time signature of the song. Ultimately, the final version of "Heart Full of Soul" featured a fuzz guitar in place of the sitar, although the song's distinctively Indian timbre remained. Following similar Indian-influenced experimentation by the Kinks on "See My Friends" that nonetheless still used guitar as the primary instrument, the first British band to release a recording featuring the sitar was the Beatles, with "Norwegian Wood (This Bird Has Flown)" released that December on the album Rubber Soul. Following a discussion with the Beatles' lead guitarist George Harrison, who had recently played the sitar on the sessions for "Norwegian Wood" in October 1965, Jones began devoting more time to the sitar, and began arranging basic melodies with the instrument. One of these melodies morphed over time into the tune featured in "Paint It Black".
## Writing and recording
Jagger and Richards wrote the lyrics and much of the chord progression of "Paint It Black" during the first group of sessions for the then untitled Aftermath the previous December, and while on the 1966 Australian tour. Initially, those sessions were to be released as an album by themselves, then titled Could You Walk on the Water? In mid-January 1966, the British press announced that a new Rolling Stones LP carrying that title would be released on 10 March. In Rolling with the Stones, Bill Wyman refers to the announcement as "audacity" on Oldham's part. A Decca spokesman said the company would not issue an album with such a title "at any price"; Oldham's idea upset executives at the company's American distributor, London Records, who feared the allusion to Jesus walking on water would provoke a negative response from Christians.
The title controversy embroiled the Stones in a conflict with Decca, delaying the release of the band's next studio album from March to April 1966. The delay, however, gave the Stones more time to record new material for the upcoming album, which had now been retitled Aftermath. Upon the band's return from Australasia, "Paint It Black" was among the new songs worked on for the revised album. It was recorded as the Stones had begun to take more time recording their material. Referring to the atmosphere of the Stones' sessions at the time, Richards told Beat Instrumental magazine in February 1966: "Our previous sessions have always been rush jobs. This time we were able to relax a little, take our time." Sound engineer Dave Hassinger recorded the song on 6 and 9 March 1966 at RCA Studios in Los Angeles. Andrew Loog Oldham produced the track, as with all of the Stones' recordings until 1967. Both the single's US and UK B-sides were also recorded on these dates, as were a majority of album tracks for Aftermath.
"Paint It Black" follows a simple verse form that lacks a refrain. It starts with five consecutive 16-bar verses before relaxing into a chanted section and finishing in a frantic coda. The song was written originally as a standard pop arrangement in a minor key similar to "The House of the Rising Sun", which Jagger humorously compared to "songs for Jewish weddings". The Stones were dissatisfied with this version and considered scrapping the song altogether. During a session break, Bill Wyman twiddled with a Hammond organ in search of a heavier bass sound; Wyman's playing inspired the uptempo and Eastern melody. The sitar was brought into the mix when Harihar Rao walked into the studio with one in hand. With the sitar, Jones combined his recent melodic improvisations with the chord progression and lyrics provided by Jagger and Richards. Soon after the recording session, Richards felt that the track's conclusion was over-recorded and that it could have been improved.
The song may contain the first recorded example of a fretless bass guitar. Wyman had removed the frets from a bass intending to replace them, but became enamoured with the fretless sound. It can be heard most easily near the end of each vocal line, when Wyman plays high on the bass's neck, using the upper register.
Wyman was later critical of Oldham listing Jagger and Richards as songwriters to the exclusion of the rest of the Stones. He felt that "Paint It Black" should have been credited to the band's pseudonym, Nanker Phelge, rather than Jagger–Richards, since the song's final arrangement originated from a studio improvisation by Jones, Watts and himself, and Jones was responsible for providing the melody line on the sitar. In the view of pop historian Andrew Grant Jackson, "Paint It Black" bears a strong resemblance to the Supremes' 1965 hit "My World Is Empty Without You", which used "a foreboding minor key with harpsichord and organ".
## Music and lyrics
In a 1995 interview, commenting on the musical styles found on Aftermath, Jagger described "Paint It Black" as "this kind of Turkish song". According to music scholar James E. Perone, although the introductory sitar passage is played in an Indian fashion, "the rhythmic and melodic feel of the Eastern-sounding phrases actually call to mind the Middle East more than India". Jagger's droning and slightly nasal singing complement the motif Jones plays on the sitar. Wyman's heavy bass, Charlie Watts' low-pitch drumming and Richards' bolero-driven acoustic guitar outro drive "Paint It Black". Commentators and reviewers have classified "Paint It Black" as raga rock, psychedelia, and psychedelic rock. Perone named "Paint It Black" as one of the Stones' 1966 songs that acts as an explicit attempt to transcend the blues-based rock and roll conventions of the Stones' previous songs, along with other Aftermath songs such as "Stupid Girl", "Lady Jane" and "Under My Thumb".
Using colour-based metaphors, the song's lyrics describe the grief suffered by someone stunned by the sudden and unexpected loss of a partner, leading to what author Tony Visconti terms "a blanket worldview of desperation and desolation, with no hint of hope." The lyrics have also given rise to alternative interpretations scholars consider less likely, ranging from a bad trip on hallucinogens to the Vietnam War. Perone noted in 2012 that the lyrical content – a character "so entrenched in his depression and rage that he has lost all hope" – establishes a rough concept for Aftermath's American edition, the following songs offering insight into "the darkness of his psyche" and possible reasons for its darkness. Perone argues the resulting connections among the songs on Aftermath lend it a conceptual unity which, although not sufficient for it to be considered a concept album, allows for the record to be understood "as a psychodrama around the theme of love, desire and obsession that never quite turns out right". As Perone explains:
> The individual songs seem to ping-pong back and forth between themes of love/desire for women and the desire to control women and out-and-out misogyny. However, the band uses musical connections between songs as well as the sub-theme of travel, the use of feline metaphors for women and other lyrical connections to suggest that the characters whom lead singer Mick Jagger portrays throughout the album are really one and perhaps stem from the deep recesses of his psyche.
The Village Voice music critic Robert Christgau described "Paint It Black" as an example of the Stones' development as artists. According to Christgau, the texture of the Stones' blues-derived hard rock is "permanently enriched" as Jones "daub[s] on occult instrumental [colours]". Christgau praised Mick Jagger specifically for his influence on the Stones' artistic identity on their 1966 material, describing him as a lyricist "whose power, subtlety and wit are unparalleled in contemporary popular music", and additionally suggested that Jagger and Richards rank second as composers of melody in rock, behind only John Lennon and Paul McCartney.
## Release
London Records released "Paint It Black" as a single in the US on 7 May 1966; Decca Records released it on 13 May in the UK. "Paint It Black"'s UK B-side was "Long, Long While", a song that was not released on any of the band's studio albums. Richie Unterberger of AllMusic later described "Long, Long While" as an underappreciated song, with a "considerably different" tone than most of the band's work, and commented that it was better than many of the tracks the Stones selected for their studio albums. Upon its original release, the song was credited to "Jagger-Richard", as Andrew Loog Oldham advised Keith Richards to use the surname Richard professionally on the Stones' releases during the 1960s. Later releases of the song have changed the credit to "Jagger-Richards".
In the US, "Stupid Girl" was chosen as the single's B-side. Both songs were included on the American release of Aftermath, with "Paint It Black" being a new addition compared to the earlier British edition. "Paint It Black" became Aftermath's opening track, replacing "Mother's Little Helper", while "Stupid Girl" remained as the second track on the album. Its delayed North American release allowed pirate radio stations to play the single up to two weeks before the album appeared. The song was originally released as "Paint It, Black", the comma being an error by Decca, which stirred controversy over its racial interpretation. The Stones performed "Paint It Black" live on The Ed Sullivan Show on 11 September.
Due to "Paint It Black" not appearing on the UK edition of Aftermath and being released as a non-album single, its first album release in the UK came on the UK edition of the compilation Big Hits (High Tide and Green Grass) (1966), though the album was not released with the song as part of its track listing in the US. The first release of the song on a compilation album in the US came on Through the Past, Darkly (Big Hits Vol. 2) (1969).
Later compilations by the Rolling Stones featuring "Paint It Black" include Hot Rocks 1964–1971 (1971), Singles Collection: The London Years (1989), Forty Licks (2002), and GRRR! (2012). Live recordings are on the concert albums Flashpoint (1991), Live Licks (2004), Shine a Light (2008), Hyde Park Live (2013), and Havana Moon (2016).
## Critical reception and legacy
Initial reaction to "Paint It Black" was mixed. Some music critics found the addition of the sitar to be simply a case of the band copying the Beatles. In his book Brian Jones: The Making of the Rolling Stones, Paul Trynka comments on the influence of Harrison's sitar playing on the Beatles' song "Norwegian Wood" from the Rubber Soul album and draws parallels with Jones' droning sitar melody on "Paint It Black". Responding to claims that he was imitating the Beatles, Jones replied: "What utter rubbish", comparing the argument to saying that all groups using a guitar copy each other merely by using the instrument. Jonathan Bellman, an American musicologist, agreed with Jones, writing in a 1997 issue of The Journal of Musicology that the events are an example of concurrent musical and instrumental experimentation. Jones' sitar part on the track influenced the development of a whole subgenre of minor-key psychedelic music.
Lindy Shannon of the La Crosse Tribune felt that "Paint It Black", the Byrds' "Eight Miles High" and the Beatles' "Rain" were straying from the "commercial field" and instead "going into a sort of distorted area of unpleasant sounds". Staff at Melody Maker lauded the track, calling it "a glorious Indian raga-riot that will send the Stones back to number one". Writing for Disc and Music Echo, Penny Valentine praised Jagger's singing, writing that it was "better than ever" but was critical of the track's sitar. Guitar Player's Jesse Gress cited "Paint It Black" as originating the 1960s raga rock craze. In a review for New Musical Express (NME), Keith Altham considered "Paint It Black" the band's best single since "(I Can't Get No) Satisfaction" was released the previous year. A reviewer for Billboard predicted that Aftermath would become another hit for the band, citing "Paint It Black" as the focal point of this hard rock album and praising Oldham's production. Record World said "Guys are in depressed mood, but the rhythm is anything but depressed. Irrepressible hit." The Herald News considered the song a "top record ... for teeners", and in The Sunday Press Nancy Brown described it as a "pulsating, blues-soaked romantic tear-jerker". In the San Francisco Examiner, Ralph J. Gleason lauded the song for its "hypnotizing tone" and "same qualities of ambiguity and obscurity as some of the previous Stones hits". In April 1967, while hosting the television documentary Inside Pop: The Rock Revolution, Leonard Bernstein praised the song for its "arab café" sound, and cited it as an example of contemporary pop music's ability to evoke disparate moods through instrumentation.
In a retrospective review, Richie Unterberger of AllMusic called the song an "eerily insistent" classic that features some of "the best use of sitar on a rock record", and in another AllMusic review wrote it is "perhaps the most effective use of the Indian instrument in a rock song". Writing on the song's 50th anniversary in 2016, Dave Swanson of Ultimate Classic Rock considered the song, like its parent album Aftermath, to be a major turning point in artistic evolution for the band, noting: "'Paint It, Black' wasn't just another song by just another rock group; it was an explosion of ideas presented in one neat three-minute package." In 2017, ranking Aftermath as one of the best albums of the 1960s, Judy Berman of Pitchfork described the song as "rock's most nihilistic hit to date". David Palmer, editor of the Cullman Times, wrote that the "attitude" songs on Aftermath – particularly "Paint It Black" – influenced the nihilistic outlook of punk music. Stereogum critic Tom Breihan praised the song as a strong example of the band's brand of "swirling doom-blues", and praised its heavy sound and dark lyrics as ahead of its time when compared to the landscape of popular music in 1966.
"Paint It Black" inspired almost four hundred covers. It has placed on many "best songs" lists including those by Rolling Stone, Vulture magazine, NME, and Pitchfork. The Recording Academy inducted the song into the Grammy Hall of Fame in 2018. It is ranked number 213 on Rolling Stones list of the 500 Greatest Songs of All Time.
## Commercial performance
In the UK, "Paint It Black" peaked at number one on the Record Retailer chart during a 10-week stay, becoming the Rolling Stones' sixth UK number one. Seven days after its UK release, "Paint It Black" had sold 300,000 advance copies; the British Phonographic Industry (BPI) later certified it platinum. In 2007, the song entered the UK Singles Chart at number 70 for an 11-week stint. In Germany, "Paint It Black" peaked at number two on the Musikmarkt Hit-Parade; the Bundesverband Musikindustrie (BVMI) certified the 2018 reissue gold. The single was a top five hit in other European countries, peaking at number two in Austria, Ireland and Norway; number three in Belgium; and number four in Spain. After its 1990 reissue, "Paint It Black" charted at number 61. The single's 2007 reissue charted at number 49 on the Official German Charts and its 2012 reissue charted at number 127 in France.
"Paint It Black" debuted at number 48 on the US Billboard Hot 100 chart for the week of 14 May 1966. The song took three weeks to rise to number one, where it stayed for two consecutive weeks, being replaced by Frank Sinatra's "Strangers in the Night." Its stint at number one made it the band's third in the US and the first song to feature a sitar to peak at number one in the country. By June, it had sold more than a million copies. It rose to number one in a "violent shakeup" of the list where 10 of its 20 songs appeared for the first time. "Paint It Black" remained on the chart for 11 weeks. Further reissues of the single have not peaked on the Billboard Hot 100, but 2008 sales saw "Paint It Black" reach number 73 on the Billboard Hot Canadian Digital Song Sales. According to the pop historian Richard Havers, Aftermath's 1966 chart run in the US, where it reached number 2 on the Billboard chart and number 1 on those published by Cash Box and Record World, was assisted by the success of "Paint It Black". "Paint It Black" also topped singles charts in Canada and the Netherlands, and was ranked within the Top 10 highest performing singles of the year in Austria, despite not reaching number 1 on the weekly charts.
In a KEYS national survey taken in June 1966, "Paint It Black" was number one in the United States. Surveys conducted by the Associated Press and United Press International identified the song as ranking No. 1 in the US the week of 12–19 June 1966. On the 1966 year-end charts, "Paint It Black" ranked number 34 on the US Billboard Hot 100 and number 30 on the Record Retailer chart. The 1990 reissue of "Paint It Black" topped the Netherlands Single Top 100 and peaked at number 11 in Belgium.
## Live performances
The Rolling Stones performed "Paint It Black" during their tours of America and England in 1966, following its release, along with other songs from Aftermath such as "Under My Thumb" and "Lady Jane". One notable rendition came when the song opened the Stones' show at the Royal Albert Hall, a concert remembered for ending prematurely due to a riot, which led to rock bands being banned from performing at the venue. Footage of the riot was later used in the promotional video for the Stones' next single, "Have You Seen Your Mother, Baby, Standing in the Shadow?". Despite its status as a hit single and a staple of these shows, "Paint It Black" was not included on the Stones' live album documenting their tour of England, Got Live If You Want It!.
"Paint It Black" has become a regular fixture of the Stones' concert setlists following its release, and has been performed during the Steel Wheels/Urban Jungle Tour (1991), Licks Tour (2002–2003), A Bigger Bang Tour (2005–2007), 50 & Counting (2012–2013), 14 On Fire (2014), América Latina Olé Tour 2016, No Filter Tour (2017–2021) and Sixty tour (2022).
## Personnel
According to authors Andy Babiuk and Greg Prevost, except where noted:
The Rolling Stones
- Mick Jagger – lead and harmony vocals; writer
- Keith Richards – harmony vocal; lead and acoustic guitars; writer
- Brian Jones – sitar, acoustic guitar
- Bill Wyman – bass, Hammond organ, maracas, cowbell
- Charlie Watts – drums, tambourine, castanets
Additional musicians and production
- Jack Nitzsche – piano
- Dave Hassinger – sound engineer
- Andrew Loog Oldham – producer
In their book The Rolling Stones All the Songs, Philippe Margotin and Jean-Michel Guesdon add a question mark after Jones' guitar contribution and credit "tambourine, bongos, castanets" to "unidentified musicians". In Rolling Stones Gear, Babiuk and Prevost credit an acoustic guitar contribution to Jones, maracas and cowbell to Wyman, and tambourine and castanets to Watts.
Studio locations
- Recorded at RCA Studios (Los Angeles)
## Charts
### Weekly charts
### Year-end charts
## Certifications
## Notable cover versions
- Eric Burdon and War released a cover of the song in 1970, which reached number 31 on the Dutch Top 40 singles chart.
- Bahamian musician Exuma included a cover of the song on his 1973 album Life.
- English all-female post-punk band the Mo-dettes released their version in 1980 which just missed the UK top 40, peaking at No. 42.
- American hard rock/heavy metal band W.A.S.P. recorded a cover of "Paint It Black" as the B-side to their 1984 7" single "School Daze". It was also included on the 1998 CD reissue of the band's self-titled debut album W.A.S.P.
- Canadian speed metal band Anvil recorded a cover for their 1981 debut album Hard 'n' Heavy.
- Irish rock band U2 included a cover of "Paint It Black" as the B-side to their 1992 single "Who's Gonna Ride Your Wild Horses", and did so again with the 20th anniversary rerelease of their album Achtung Baby in 2011.
- The London Symphony Orchestra performed a cover of the song in their 1994 "Symphonic Music of the Rolling Stones" performance.
- American singer Tracy Lawrence covered "Paint It Black" for the compilation album Stone Country: Country Artists Perform the Songs of the Rolling Stones in 1997.
- American singer-songwriter Vanessa Carlton included a cover of the song on her 2002 debut album Be Not Nobody, which was certified platinum by the Recording Industry Association of America.
- Canadian rock band Rush played one minute and ten seconds of the song during their 2003 performance at Molson Canadian Rocks for Toronto.
- American singer-songwriter Ciara recorded a cover version for the 2015 film, The Last Witch Hunter.
- In 1967 Marie Laforêt recorded a French version titled Marie Douceur - Marie Colère, which was later used in the 2023 film John Wick: Chapter 4.
- English band Duran Duran recorded a cover for their 2023 album Danse Macabre.
- U2 have also incorporated the song into their performance of "Until the End of the World" during their Las Vegas residency at the Sphere.
- Canadian rock band Sum 41 recorded a cover for their eighth and final album Heaven :x: Hell, released in 2024.
## Notable usage in media
"Paint It Black" has seen commercial use in film, video games and other entertainment media.
- In the end credits of the film Full Metal Jacket (1987)
- In the end credits of the film The Devil's Advocate (1997)
- It was used as a plot device in the supernatural horror film Stir of Echoes (1999)
- It was featured in Black Adam (2022), during a slow-motion fight sequence.
- In 1987 it was used as the opening theme for the broadcast version of the CBS television series Tour of Duty.
- In a trailer for the video game Call of Duty: Black Ops III (2015)
- In a trailer for the film The Mummy (2017).
- Multiple episodes of the TV series Westworld use an orchestral arrangement of the song by Ramin Djawadi.
- A cello arrangement of the song is featured prominently in the Netflix original series Wednesday (2022).
- The song features on the soundtracks to multiple video games, including:
- Twisted Metal: Black (2001)
- Guitar Hero III: Legends of Rock (2007)
- Guitar Hero Live (2015)
# Battle of Marshall's Elm
The Battle of Marshall's Elm was a skirmish that took place near Street, in the county of Somerset, South West England, on 4 August 1642. The engagement occurred during the build-up to the formal beginning of the First English Civil War on 22 August, while the Royalists and Parliamentarians were recruiting men in the county. The Royalists had established their regional headquarters in Wells, but were threatened by superior Parliamentarian numbers in the vicinity. The Royalist commander sent out a mounted patrol consisting of 60 to 80 cavalry and dragoons, which came across a force of between 500 and 600 Parliamentarian recruits travelling north across the Somerset Levels under the command of Sir John Pyne.
The Royalists set an ambush at Marshall's Elm, where the road rose out of the Levels into the Polden Hills. After a parley between the leaders was unsuccessful, the Parliamentarians were caught in the ambush. Facing musket fire from the hidden dragoons, and being charged at by the Royalist cavalry, they were routed. The Royalists killed around 27, and took 60 prisoners, including two of the Parliamentarian officers. Despite their victory, the Royalists were forced to withdraw from Wells, and later from Somerset altogether, due to their inferior numbers.
## Background
Conflict between the English Parliament and its monarch on religious, fiscal and legislative matters had been ongoing since at least 1603. The tension between Parliament and King Charles escalated sharply during 1642 after the King had attempted to arrest five Members of Parliament, who he accused of treason. In preparation for the likelihood of conflict with Parliament, Charles appointed the Marquess of Hertford as commander of his forces in the West Country, supported by Sir Ralph Hopton, a local Member of Parliament (MP) and an experienced army officer. Both sides were attempting to recruit the existing militia and new men into their armies. Parliament passed the Militia Ordinance in March 1642 without royal assent, granting themselves control of the militia. In response, Charles granted commissions of array to his commanders, a medieval device for levying soldiers which had not been used since 1557. One such commission was issued to Hertford, for the levying of troops in south-west England and south Wales.
Hertford chose Wells in Somerset as the Royalists' headquarters in the West Country, and they arrived in the city on 28 July. The decision was based on the fact that Wells housed the county magazine, had Royalist sympathies, and was geographically central within the area. In his 1973 book, Somerset in the Civil War, the historian David Underdown criticises the decision, citing Wells' vulnerable position in the Mendip Hills, and the strong Parliamentarian views held by the majority of Somerset's rural population. Hopton had previously acted as one of the deputy lieutenants for Somerset, making him responsible for training and leading the county's militia. Hopton's standing helped the Royalists' recruiting, but the general population of the county, many of whom were Calvinist Protestants, or worked in industries depressed by royal policies, was more sympathetic towards Parliament than the King. Broadly speaking, the Royalists were more successful in recruiting cavalry and members of the gentry; Hopton, John Digby and Francis Hawley each brought a troop of horse, but attempts to raise an infantry regiment were unsuccessful. In contrast, the Parliamentarians signed up more men, but many of these were untrained and unarmed countrymen.
On 30 July 1642, the Parliamentarians, led by William Strode, one of Parliament's deputy lieutenants in Somerset, held a meeting to collect arms at Shepton Mallet, around four miles (6 km) east-southeast of Wells. Hertford sent Hopton with his cavalry to Shepton on 1 August to confront the Parliamentarians, but he had orders to avoid conflict. When Hopton arrived in Shepton, Strode refused to listen to him, and the two scuffled. A crowd of over 1,000 had gathered, and Hopton withdrew and rejoined his cavalry outside the town. There, the Royalists and the countrymen sympathetic to the Parliamentarians faced off without fighting for several hours before the Royalists pulled back to Wells.
## Prelude
The success of the Parliamentarians' recruiting left the Royalists in danger of being surrounded in Wells. Sir John Pyne, an MP who had also been appointed as a deputy lieutenant of Somerset by Parliament in March, and Captain John Preston recruited around 400 men from Taunton (around 24 miles (39 km) south-west of Wells), while Captain Sands brought a further 200 from South Petherton. Pyne had orders to bring the men, described as "a few hundred farmers" by Underdown, to Street, where they would rendezvous with Strode. Hertford was wary of his weak position, and on 4 August he sent a mounted patrol out under the command of Sir John Stawell, composed of three troops of cavalry and some dragoons, numbering around 60 to 80 in all. The patrol, which also included several of the Royalist gentry and the experienced soldier Henry Lunsford, rode south through Glastonbury into the Polden Hills. On reaching the village of Marshall's Elm, just over one mile (2 km) south of Street, and around eight miles (13 km) south of Wells, the patrol spotted Pyne's force marching through cornfields about two miles (3 km) away.
## Battle
Having approached from the north, the Royalists had the advantage of higher ground, coming down off the Poldens. Marshall's Elm is located in a depression that acted as a pass between Ivy Thorn Hill and Collard Hill, where the road rose out of the Somerset Levels to climb into the hills. Stawell parleyed with the Parliamentarians, telling them that they could avoid conflict if they aborted their march, but to no effect. While Stawell was engaged in his discussion with the Parliamentarians, Lunsford arranged the Royalist troops; the cavalry were behind the brow of the hill, leaving just their heads and weapons visible, to disguise their numbers; fourteen dragoons dismounted and were hidden in quarry pits lower on the hill by the road. He ordered all the men to hold their fire until he led the attack with the dragoons.
Pyne initially continued the Parliamentarian march, but then changed his mind. His order to stop was met with complaints from his men, who said that the Royalist force "were but a few horse and would run away", and they continued up the hill. Pyne's men halted occasionally to fire, but Lunsford held the Royalists' fire until the enemy were within 120 paces, when the dragoons returned fire with their muskets and killed the leader of the Parliamentarian vanguard. The Parliamentarians hesitated, unsure of where the attack had come from, and Stawell led the cavalry charge down the hill. The Parliamentarians were routed; seven were killed at Marshall's Elm, and the Royalists chased some of the fleeing men for three miles (5 km), as far as Somerton. They captured sixty prisoners, who they left in Somerton. Among those captured were the two officers, Preston and Sands. As well as the seven killed at the battle, roughly another twenty died of their wounds.
## Aftermath
The battle provided both a tactical and strategic victory for the Royalists, leaving Hertford with an escape route from Wells should it be needed. Underdown credits their cavalry strength and leadership for the victory, highlighting that their leaders were "accustomed to command and confident of their ability to defeat larger forces of poorly officered farmers". He was particularly complimentary of Lunsford, and the experience he brought. One of the region's Parliamentarian leaders, John Ashe, said that the battle "very much daunted the honest countryman".
Despite their defeat at Marshall's Elm, the Parliamentarians continued to gather men around Wells. Groups congregated from Bristol, Gloucester, Wiltshire and throughout north-east Somerset; a range of cavalry, musketeers and countrymen wielding makeshift weapons such as pitchforks. The force, which numbered around 12,000, crossed the Mendip Hills and reached a slope overlooking Wells on the evening of 5 August. Pyne held joint command of part of the force with Strode. Hertford sent his cavalry to face them, and both groups agreed to a ceasefire until the next day. Overnight, the Parliamentarians' numbers were swelled by further recruits and reinforcements, and Hertford made a sham of negotiating in the morning to cover his retreat; while the Parliamentarian messengers were riding north out of Wells with his 'offer', his men fled south, covered by a cavalry rearguard led by Hopton. After spending two nights in Somerton, the Royalists withdrew out of Somerset altogether, garrisoning Sherborne Castle in Dorset.
The First English Civil War formally began on 22 August, when Charles I raised his royal standard in Nottingham. The battle at Marshall's Elm was not the only engagement to predate the formal start of the war, but the historian Peter Gaunt suggests that it was the bloodiest, while another, Charles Carlton, said that Marshall's Elm was the "first real confrontation" of the war.
# Toys for Bob
Toys for Bob, Inc. is an American video game developer based in Novato, California. It was founded in 1989 by Paul Reiche III and Fred Ford and is best known for creating Star Control and the Skylanders franchise, as well as for working on the Crash Bandicoot and Spyro franchises.
The studio began as a partnership between Reiche and Ford. The two had separately attended the University of California, Berkeley in the late 1970s before entering the video game industry in the early 1980s. They later met through mutual friends in 1988, when Reiche was seeking a programmer to develop Star Control for Accolade. This led to the creation of their partnership in 1989 and the debut of Star Control in 1990. The release was considered a landmark science fiction game and led to the 1992 sequel Star Control II, which greatly expanded the series' story and scale. Star Control II is celebrated as one of the greatest games of all time and is featured on several "best of" lists for music, writing, world design, and character design. The studio adopted the name Toys for Bob to stimulate curiosity and differentiate themselves from other studios.
With Crystal Dynamics as their publisher, they developed several games, including The Horde, Pandemonium!, and The Unholy War. In the early 2000s, the studio transitioned to working on licensed games before being laid off by Crystal Dynamics. With Terry Falls as a co-owner, Reiche and Ford incorporated the studio in 2002. Activision became their publisher soon after, and eventually acquired the studio in 2005. Toys for Bob created the Skylanders series when Activision merged with Vivendi Games and acquired the Spyro franchise. The developers at Toys for Bob had already been experimenting with using physical toys to interact with video games and believed that this technology would be ideal for Spyro's universe of characters. The studio is credited with inventing the toys-to-life genre, and the 2011 release of Skylanders: Spyro's Adventure was considered a technological and commercial breakthrough. This led to a spinoff series with several successful games, generating a billion dollars in revenue for Activision in the first 15 months and winning several awards. In 2018, Toys for Bob assisted with the development of the remaster compilations Crash Bandicoot N. Sane Trilogy and Spyro Reignited Trilogy, earning a reputation for leading a revival of properties from the original PlayStation.
After the release of Crash Bandicoot 4: It's About Time in 2020, Reiche and Ford left the company to start an independent studio. Toys for Bob took on new leadership under Paul Yan and Avery Lodato while working on the Call of Duty series. After Activision's parent company, Activision Blizzard, faced lawsuits over workplace harassment and discrimination, Microsoft acquired the holding company in October 2023. Following layoffs at the studio, Toys for Bob spun off from Activision in May 2024.
## History
### Partnership and Star Control success
Toys for Bob began as a partnership between Paul Reiche III and Fred Ford. The two founders separately attended the University of California, Berkeley, around the same time, and both entered the video game industry in the early 1980s. Ford started his career creating games for Japanese personal computers before transitioning to more corporate work, but after a few years working at graphics companies in Silicon Valley, Ford realized he missed working in the game industry. Meanwhile, Reiche had started his career working for Dungeons & Dragons publisher TSR before developing PC games for Free Fall Associates. Reiche's producer at Free Fall took a new job at Accolade and helped Reiche secure a three-game agreement with the publisher. At this point, Reiche needed a programmer and Ford was seeking a designer/artist, so their mutual friends set up a board game night to introduce them. Those friends included fantasy artist Erol Otus, as well as game designer Greg Johnson, who hosted the meet-up. Soon after, Reiche and Ford formed their studio in 1989.
Reiche and Ford's first collaboration was Star Control, released for MS-DOS in 1990. Originally called Starcon, the game began as an evolution of the concepts that Reiche first created in Archon: The Light and the Dark. Archon's strategic elements were adapted for Star Control into a space setting, with one-on-one ship combat inspired by the classic 1962 game Spacewar!. During production, Reiche and Ford spent time working on their collaborative process, and this was partly why the game was limited in scope compared to its sequel. Upon its release, Star Control was voted the "Best Science Fiction Game" by Video Games and Computer Entertainment, and decades later, it is remembered as one of the greatest games of all time, with numerous game developers citing it as an influence on their work.
The success of Star Control led to a more ambitious sequel, Star Control II. Reiche and Ford aimed to expand on the first game's combat system with deeper storytelling. Their goal of creating a dynamic space adventure was largely inspired by Starflight, designed by Greg Johnson in 1986. While developing Starflight, Johnson had shared office space with Reiche, who became so fascinated with the project that he helped Johnson build the game's communication system. Years later, this friendship led Reiche to ask Johnson to work on Star Control II, and Johnson became one of the game's most significant contributors. Star Control II's story and characters were vastly expanded from those of the first game. As Reiche and Ford worked on the first version of the game's dialog, they recognized they needed help with the writing and art and decided to enlist close friends. In addition to Johnson, they recruited Otus, who contributed art, music, and text (as well as voice acting, in a later release). Through mutual friends, they acquired the talents of fantasy artist George Barr. The project eventually ran over schedule, and the budget from Accolade ran out. During the final months of development, Ford supported the team financially.
Star Control II received even more acclaim than the first game, earning recognition as one of the best games of all time by numerous publications since its release.[1] It is also ranked among the best games in several specific areas, including writing, world design, character design, and music. Star Control II has also inspired the design of numerous games, including the open-ended gameplay of Fallout, the world design of Mass Effect, and the story events of Stellaris. After finishing a Star Control II port to the 3DO Interactive Multiplayer (with additional voice acting and game improvements), Accolade offered Ford and Reiche the same budget to produce a third game, which they turned down to pursue other projects. As the pair had retained the rights to their characters and stories from the first two games, they licensed their content to Accolade so that the publisher could produce Star Control 3 without their involvement.
### Growth under Crystal Dynamics
The studio pitched their next game to Sega, but their contacts at the company had already left for Crystal Dynamics, which led the studio to pursue a publishing agreement with them instead. Around this time, the studio was operating with Reiche, Ford, and Ford's brother Ken, with additional freelancers hired for key tasks. Whereas their previous games were released as a partnership under their legal names, their subsequent games began to refer to their studio as Toys for Bob. They initially wanted a name that would distinguish them from their competitors. Reiche's wife Laurie suggested the name "Toys for Bob", which was chosen to stimulate curiosity and allude to Reiche and Ford's appreciation for toys.
The studio's first game under Crystal Dynamics was The Horde (1994), a full-motion video action and strategy game. Aiming to take advantage of Crystal Dynamics's Hollywood connections and the increased storage size of CD-ROMs for the video scenes, they hired a cast of professional actors including Martin Short and Kirk Cameron. The game received two awards from Computer Gaming World: "Best Musical Score" for Burke Trieschmann's music and "Best On Screen Performance" for Michael Gregory's role as Kronus Maelor. In 1996, Toys for Bob released Pandemonium!, a 2.5D platform game for consoles. Their team expanded to nearly 30 people to complete the project, with substantial efforts to learn the mechanics of 3D game design. As the company grew, so did the mythology around their name. According to Reiche, since people frequently asked about the truth behind "Toys for Bob", he instructed his team to invent their own "Bob" and swear he was the only one, with the goal of "further confusing people".
As the studio prepared to release The Unholy War in 1998, Crystal Dynamics was acquired by Eidos Interactive. Unholy War was a fighting game with a strategic meta-game, similar to the combination of game modes seen in Reiche's game Archon, and the original Star Control. Reiche and Ford thought the gameplay could be used for an adaptation of a Japanese license such as SD Gundam, and Crystal Dynamics helped them get in touch with Bandai, who promised them an "even bigger license". Bandai ultimately had them produce Majokko Daisakusen: Little Witching Mischiefs, a game based on magical girl characters from Japanese anime created in the 1960s, 1970s, and 1980s. Bandai's choice of license came as a surprise to Toys for Bob, and the development process was fraught with translation challenges. As the project dragged on, the studio continued to receive bug reports in Japanese until they simply unplugged their fax machine, thus ending development. Majokko Daisakusen was released exclusively in Japan, and Toys for Bob never learned how well the game performed.
Their next release was Disney's 102 Dalmatians: Puppies to the Rescue, another major licensed title, one considered to be of higher quality than most licensed games. Soon after the release, Crystal Dynamics decided to fire the entire Toys for Bob team. After operating as a partnership for more than a decade, Reiche, Ford, and Terry Falls incorporated Toys for Bob in 2002, and announced that they were seeking a new publisher after parting ways with Crystal Dynamics and Eidos Interactive.
### Acquisition by Activision
Soon after re-establishing their studio as an independent company, Reiche and Ford released the source code for the 3DO version of Star Control II as open-source software under the GNU General Public License (GPL) and enlisted the fan community to port it to modern operating systems. The result was the 2002 open-source game The Ur-Quan Masters, released under a new title because the Star Control trademark was owned by Atari, which had acquired Accolade. An intern at Toys for Bob began porting the game to various modern operating systems, and the fan community continued the project with further support and modifications. Reiche and Ford retained the original copyrighted content within the first two Star Control games, and granted the fan-operated project a free, perpetual license to the Star Control II content and the "Ur-Quan Masters" trademark.
Toys for Bob secured Activision as their new publisher, thanks to an introduction from former staff who had founded Shaba Games and sold it to Activision. As the industry found a thriving market for licensed game adaptations, Activision asked Toys for Bob to work on Disney's Extreme Skate Adventure, which combined the publisher's game engine from Tony Hawk's Pro Skater 4 with the studio's experience working on Disney properties. The game was released in 2003 and gave the studio more experience creating games for a younger audience.
Working with Activision, Toys for Bob continued to focus on licensed games, such as Madagascar. Their growing relationship with the publisher led them to be acquired in 2005: the studio became a wholly owned subsidiary under Activision, and the management team and employees signed long-term contracts under the new corporate structure. However, the release of Madagascar showed that the market for licensed games was beginning to dry up, in part due to the negative reputation created by a flood of low-quality licensed games. By this time, the company was operating with 27 employees, and needed a game that was successful enough to justify their growing team.
### Skylanders breakthrough
Activision asked the studio to generate a new idea, and the company felt pressure to find the right opportunity. One idea came from Toys for Bob character designer I-Wei Huang, who had been creating toys and robots in his spare time. The company saw the potential to adapt these toys and character designs into a game, with technical engineer Robert Leyland drawing on his hobby of building electronics. Coincidentally, Activision merged with Vivendi Games in 2008, and asked Toys for Bob to create a new game around Vivendi's Spyro franchise. The studio saw the potential for toy–game interaction and suggested to Activision that it would be ideal for Spyro's universe of characters. Their team also saw it as an opportunity to make use of their passion for and experience in creating monsters. Activision CEO Bobby Kotick responded well to the idea and gave them an additional year of development to better refine the gameplay, technology, and manufacturing process. Activision believed that the technology would be ideal for Nintendo's properties: they asked Toys for Bob to present the concept to Nintendo early in its development cycle, but Nintendo decided to limit their role to marketing the title for the Wii.
This culminated in the 2011 release of Skylanders: Spyro's Adventure, which became a breakthrough success. They followed this with Skylanders: Giants in 2012, allowing the series to earn a billion dollars in sales just 15 months after the release of the first game. These successes led Gamasutra to list Toys for Bob among their top developers for 2012, stating, "we're not just impressed that Toys for Bob successfully pulled Skylanders off—it sold massively, after all—we're impressed by how ballsy it was to begin with". Multiple publications have credited Skylanders with creating the toys-to-life genre, attracting competitors such as Nintendo, Disney, and the Lego Group to the multi-billion-dollar market sector.
In the years that followed, Toys for Bob created several successful Skylanders video games, including Skylanders: Trap Team. Their last game in this series was Skylanders: Imaginators in 2016, which won several awards. However, slower sales and increased competition suggested that toys-to-life games might have hit their peak, and Activision decided to discontinue the Skylanders series. Still, the Skylanders series became one of the best-selling video game franchises of all time. In late 2018, Toys for Bob donated hundreds of Skylanders toys to The Strong National Museum of Play, which planned to use them as an exhibit to document "one of the most significant game franchises of the last decade".
Toys for Bob continued to develop games for major licenses under Activision. They worked on Spyro Reignited Trilogy, a remake of the original trilogy with updated sound and visuals, created in consultation with developers from the original Spyro games. Its 2018 release was considered one of the best video game remakes of all time. Having ported the Crash Bandicoot N. Sane Trilogy to the Nintendo Switch, Toys for Bob sought to maintain the momentum of that title's success by developing Crash Bandicoot 4: It's About Time, a direct continuation of the original Crash Bandicoot trilogy. Upon the game's release in 2020, the studio earned a reputation for leading a revival of properties from the original PlayStation, part of a broader industry trend.
### New leadership
Founders Reiche and Ford left Toys for Bob at the end of 2020 to create an independent studio and commence development on a sequel to The Ur-Quan Masters. Paul Yan and Avery Lodato became Toys for Bob's studio heads, and the studio continues to operate with an estimated 180 employees. In April 2021, it was announced that Toys for Bob would be working on Call of Duty: Warzone as a support studio alongside Raven Software, Infinity Ward, and Treyarch. This led to the release of Call of Duty: Vanguard in November 2021, with some of the game's content also incorporated into Warzone. Throughout 2021, allegations of workplace harassment surfaced at Activision's parent company, Activision Blizzard, and Toys for Bob staff were among 500 employees who called for the resignation of Kotick. Reiche agreed with the need for a change in leadership at Activision Blizzard.
On January 18, 2022, Microsoft announced that it intended to acquire Activision Blizzard for $68.7 billion. Microsoft promised to strive towards safer and more inclusive working conditions among Activision's studios, including Toys for Bob. As the CEO of Microsoft's gaming division, Phil Spencer expressed interest in having Toys for Bob revive older game properties now owned by the conglomerate. Kotick also expressed his long-term desire to revive the Skylanders series, believing this was now possible thanks to Microsoft's hardware manufacturing and supply chain. As part of 1,900 job cuts instituted by Microsoft in January 2024, 89 people were laid off from Toys for Bob and the studio's offices in Novato were closed down. The remaining staffers transitioned to work from home. In February 2024, the studio announced plans to spin off from Activision. In the following month, Windows Central reported that the studio had forged a partnership with Microsoft to publish its first independent game. The studio became independent again in May 2024.
## Accolades
- Spyro Reignited Trilogy – Family Title of the Year, Australian Games Awards, 2018
- Star Control – 253rd Best Game of All Time, Polygon, 2017
- Skylanders: Imaginators – Tillywig Toy & Media Awards, 2017
- Skylanders: Imaginators – Best Toys of 2016, Parents, 2016
- Skylanders: Imaginators – 2016 Hot Holiday Toy List, Toys "R" Us, 2016
- Skylanders: Imaginators – Best Video Games of 2016, USA Today, 2016
- Skylanders: Imaginators – Best Family Game of E3, Game Critics Awards, 2016
- Skylanders: Imaginators – Best Family Game, Gamescom, 2016
- Star Control II – Hardcore Gaming 101 Best Video Games of All Time, 2015
- Skylanders: Giants – BAFTA Children's Game, British Academy Games Awards, 2013
- Star Control II – Best Classic PC Game, Kotaku, 2013
- Skylanders: Spyro's Adventure – Outstanding Innovation In Gaming, Academy of Interactive Arts & Sciences, 2012
- Star Control II – 52nd Greatest PC Game, PC Gamer, 2011
- Toys for Bob – Top 50 most successful game studios, MCV/Develop, 2006
- Star Control II – Computer Gaming World Hall of Fame, 2006
- Star Control II – 17th Best Game, IGN, 2005
- Star Control II – 53rd Best Game, IGN, 2003
- Star Control II – GameSpot Greatest Games of All Time, 2003
- Star Control – 45th Most Influential Game of All Time (Developer Survey), PC Gameplay, 2001
- Star Control II – 26th Best Game of All Time, GameSpy, 2001
- Star Control II – GameSpy Hall of Fame, 2000
- Star Control II – 46th Best Game of All Time, Next Generation, 1999
- Star Control II – 29th Best Game of All Time, Computer Gaming World, 1996
- Star Control – 127th Best Game of all Time, Computer Gaming World, 1996
- The Horde – Best Musical Score, Computer Gaming World, 1994
- The Horde – Michael Gregory – Best On-Screen Performance, Computer Gaming World, 1994
- Star Control II – 33rd Best Game of All Time, PC Gamer UK, 1994
- Star Control II – 21st Best Game of All Time, PC Gamer US, 1994
- Star Control II – Class of '93, Game Developers Conference, 1993
- Star Control II – Game of the Year, Pelit, 1993
- Star Control II – Adventure Game of the Year, Computer Gaming World, 1993
- Star Control – Best Computer Science Fiction Game, VideoGames & Computer Entertainment, 1990
## Games developed
# Euryoryzomys emmonsae
Euryoryzomys emmonsae, also known as Emmons' rice rat or Emmons' oryzomys, is a rodent from the Amazon rainforest of Brazil in the genus Euryoryzomys of the family Cricetidae. Initially misidentified as E. macconnelli or E. nitidus, it was formally described in 1998. A rainforest species, it may be scansorial, climbing but also spending time on the ground. It lives only in a limited area south of the Amazon River in the state of Pará, a distribution that is apparently unique among the muroid rodents of the region.
Euryoryzomys emmonsae is a relatively large rice rat, weighing 46 to 78 g (1.6 to 2.8 oz), with a distinctly long tail and relatively long, tawny brown fur. The skull is slender and the incisive foramina (openings in the bone of the palate) are broad. The animal has 80 chromosomes and its karyotype is similar to that of other Euryoryzomys. Its conservation status is assessed as "Data Deficient", but deforestation may pose a threat to this species.
## Taxonomy
In 1998, Guy Musser, Michael Carleton, Eric Brothers, and Alfred Gardner reviewed the taxonomy of species previously lumped under "Oryzomys capito" (now classified in the genera Hylaeamys, Euryoryzomys, and Transandinomys). They described the new species Oryzomys emmonsae on the basis of 17 specimens from three locations in the state of Pará in northern Brazil; these animals had been previously identified as Oryzomys macconnelli (now Euryoryzomys macconnelli) and then as Oryzomys nitidus (now Euryoryzomys nitidus). The specific name honors Louise H. Emmons, who, among other contributions to Neotropical mammalogy, collected three of the known examples of the species in 1986, including the holotype. The new species was placed in what they termed the "Oryzomys nitidus group", which also included O. macconnelli, O. nitidus, and O. russatus.
In 2000, James Patton, Maria da Silva, and Jay Malcolm reported on mammals collected at the Rio Juruá in western Brazil. In this report, they provided further information on the Oryzomys species reviewed by Musser and colleagues, including sequence data from the mitochondrial cytochrome b gene. Their analysis reaffirmed that O. emmonsae was a distinct species and found that it was closest to O. macconnelli and O. russatus, differing from both by about 12% in the cytochrome b sequence; O. nitidus was more distantly related, differing by 14.7%. The average sequence difference between the three O. emmonsae individuals studied was 0.8%.
In 2006, an extensive morphological and molecular phylogenetic analysis by Marcelo Weksler showed that species then placed in the genus Oryzomys did not form a single, cohesive (monophyletic) group; for example, O. macconnelli, O. lamia (placed under O. russatus by Musser and colleagues) and O. russatus clustered together in a single natural group (clade), but were not closely related to the type species of Oryzomys, the marsh rice rat (O. palustris). Later in 2006, Weksler and colleagues described several new genera to accommodate species previously placed in Oryzomys, among which was Euryoryzomys for the "O. nitidus complex", including O. emmonsae.
Thus, the species is now known as Euryoryzomys emmonsae. As a species of Euryoryzomys, it is classified within the tribe Oryzomyini ("rice rats"), which includes over a hundred species, mainly from South and Central America. Oryzomyini in turn is part of the subfamily Sigmodontinae of family Cricetidae, along with hundreds of other species of mainly small rodents.
## Description
Euryoryzomys emmonsae is a fairly large, long-tailed rice rat with long, soft fur. The hairs on the back are 8 to 10 mm (0.31 to 0.39 in) long. It generally resembles E. nitidus in these and other characters, but has a longer tail. E. macconnelli is slightly larger and has longer and duller fur. In E. emmonsae, the upperparts are tawny brown, but a bit darker on the head because many hairs have black tips. The hairs of the underparts are gray at the bases and white at the tips; overall, the fur appears mostly white. In most specimens, there is a patch on the chest where the gray bases are absent. The longest of the vibrissae (whiskers) of the face extend slightly beyond the ears. The eyelids are black. The ears are covered with small, yellowish brown hairs and appear dark brown overall. The feet are covered with white hairs above and brown below. There are six pads on the plantar surface, but the hypothenar is reduced. The ungual tufts, tufts of hair which surround the bases of the claws, are well-developed. The tail is like the body in color above, and mostly white below, but in the 10 mm (0.39 in) nearest the tail tip it is brown below.
Compared to E. nitidus and E. macconnelli, the skull is relatively small and slender. It has broad and short incisive foramina (perforations of the palate between the incisors and the molars) and lacks sphenopalatine vacuities which perforate the mesopterygoid fossa, the gap behind the end of the palate. The animal is similar to other members of the genus in the pattern of the arteries of the head. The alisphenoid strut, an extension of the alisphenoid bone which separates two foramina (openings) in the skull (the masticatory-buccinator foramen and the foramen ovale accessorium) is rarely present; its presence is more frequent in E. nitidus. The capsular process, a raising of the bone of the mandible (lower jaw) behind the third molar, houses the back end of the lower incisor in most Euryoryzomys, but is absent in E. emmonsae and E. macconnelli. Traits of the teeth are similar to those of E. nitidus and other Euryoryzomys.
The karyotype includes 80 chromosomes with a total of 86 major arms (2n = 80; FN = 86). The X chromosome is subtelocentric (with one pair of long arms and one pair of short arms) and the Y chromosome is acrocentric (with only one pair of arms, or with a minute second pair). Among the autosomes (non-sex chromosomes), the four metacentric or submetacentric pairs (with the two pairs of arms equally long, or with one pair not much shorter than the other) are small, and the 35 pairs of acrocentrics range from large to small. Some of those have a minute second pair of arms and could also be classified as subtelocentric, which would raise FN to 90. This karyotype is similar to other known karyotypes of members of Euryoryzomys.
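These figures are internally consistent; the following is a minimal arithmetic sketch, assuming that FN here counts autosomal arms only and that the alternative value of 90 corresponds to two acrocentric pairs being reclassified (neither assumption is stated explicitly in the text):

```latex
% Minimal check of the autosomal arm count (requires amsmath for \text).
% Assumes FN counts autosomal arms only, with acrocentric chromosomes counted as one-armed:
% 2n = 80 gives 78 autosomes = 39 pairs = 4 biarmed pairs + 35 acrocentric pairs.
\[
  \underbrace{4 \times 2 \times 2}_{\text{biarmed pairs}}
  \;+\;
  \underbrace{35 \times 2 \times 1}_{\text{acrocentric pairs}}
  \;=\; 16 + 70 \;=\; 86 \;=\; \mathrm{FN}.
\]
% Reclassifying two acrocentric pairs as subtelocentric (biarmed) would add
% 2 x 2 = 4 arms, giving the alternative value FN = 90 mentioned above.
```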
In thirteen specimens measured by Musser, head and body length ranges from 120 to 142 mm (4.7 to 5.6 in), tail length (12 specimens only) from 130 to 160 mm (5.1 to 6.3 in), hindfoot length from 32 to 35 mm (1.3 to 1.4 in), ear length (three specimens only) from 23 to 24 mm (0.91 to 0.94 in), and body mass from 46 to 78 g (1.6 to 2.8 oz).
## Distribution and ecology
The known distribution of Euryoryzomys emmonsae is limited to a portion of the Amazon rainforest south of the Amazon River in the state of Pará, between the Xingu and Tocantins rivers, but the limits of its range remain inadequately known. No other South American rainforest muroid rodent is known to have a similar distribution. Musser and colleagues reported it from three locations and Patton and others added a fourth; in some of those it occurs together with E. macconnelli or Hylaeamys megacephalus.
Specimens of E. emmonsae for which detailed habitat data are available were caught in "viny forest", a microhabitat that often included much bamboo. All were captured on the ground, some in bamboo thickets and another under a log. Musser and colleagues speculated that E. emmonsae may be scansorial, spending time both on the ground and climbing in vegetation, like the similarly long-tailed rice rat Cerradomys subflavus.
## Conservation status
The IUCN currently lists Euryoryzomys emmonsae as "Data Deficient" because it is so poorly known. It may be threatened by deforestation and logging, but occurs in at least one protected area, the Tapirapé-Aquiri National Forest.
# Battle of Savo Island
The Battle of Savo Island, also known as the First Battle of Savo Island and in Japanese sources as the First Battle of the Solomon Sea (第一次ソロモン海戦, Dai-ichi-ji Soromon Kaisen), and colloquially among Allied Guadalcanal veterans as the Battle of the Five Sitting Ducks, was a naval battle of the Solomon Islands campaign of the Pacific War of World War II between the Imperial Japanese Navy and Allied naval forces. The battle took place on 8–9 August 1942 and was the first major naval engagement of the Guadalcanal campaign and the first of several naval battles in the straits later named Ironbottom Sound, near the island of Guadalcanal.
The Imperial Japanese Navy, in response to Allied amphibious landings in the eastern Solomon Islands, mobilized a task force of seven cruisers and one destroyer under the command of Vice Admiral Gunichi Mikawa. The task force sailed from Japanese bases in New Britain and New Ireland down New Georgia Sound (also known as "The Slot") with the intention of interrupting the Allied landings by attacking the supporting amphibious fleet and its screening force. The Allied screen consisted of eight cruisers and fifteen destroyers under Rear Admiral Victor Crutchley, but only five cruisers and seven destroyers were involved in the battle. In a night action, Mikawa thoroughly surprised and routed the Allied force, sinking one Australian and three American cruisers, while suffering only light damage in return. Rear Admiral Samuel J. Cox, director of the Naval History and Heritage Command, considers this battle and the Battle of Tassafaronga to be two of the worst defeats in U.S. naval history, with only the attack on Pearl Harbor being worse.
After the initial engagement, Mikawa, fearing Allied carrier strikes against his fleet in daylight, decided to withdraw under cover of night rather than attempt to locate and destroy the Allied invasion transports. The Japanese attacks prompted the remaining Allied warships and the amphibious force to withdraw earlier than planned (before unloading all supplies), temporarily ceding control of the seas around Guadalcanal to the Japanese. This early withdrawal of the fleet left the Allied ground forces (primarily United States Marines), which had landed on Guadalcanal and nearby islands only two days before, in a precarious situation with limited supplies, equipment, and food to hold their beachhead.
Mikawa's decision to withdraw under cover of night rather than attempt to destroy the Allied invasion transports was primarily founded on concern over possible Allied carrier strikes against his fleet in daylight. In reality, the Allied carrier fleet, similarly fearing Japanese attack, had already withdrawn beyond operational range. This missed opportunity to cripple (rather than interrupt) the supply of Allied forces on Guadalcanal contributed to Japan's failure to recapture the island. At this critical early stage of the campaign, it allowed the Allied forces to entrench and fortify themselves sufficiently to defend the area around Henderson Field until additional Allied reinforcements arrived later in the year.
The battle was the first of five costly, large-scale sea and air-sea actions fought in support of the ground battles on Guadalcanal, as the Japanese sought to counter the American offensive in the Pacific. These sea battles took place at increasing intervals as each side paused to regroup and refit, culminating in the Battle of Tassafaronga on 30 November 1942, after which the Japanese, unwilling to accept further costly losses, attempted resupply by submarine and barge. The final naval battle, the Battle of Rennell Island, took place months later on 29–30 January 1943, by which time the Japanese were preparing to evacuate their remaining land forces and withdraw.
## Background
### Operations at Guadalcanal
On 7 August 1942 Allied forces (primarily U.S. Marines) landed on Guadalcanal, Tulagi, and Florida Island in the eastern Solomon Islands. The landings were meant to deny the islands to the Japanese as bases, especially the nearly completed airfield being constructed on Guadalcanal, later named Henderson Field. If Japanese air and sea forces were allowed to establish forward operating bases in the eastern Solomons, they would be in a position to threaten the supply shipping routes between the U.S. and Australia. The Allies also wanted to use the islands as launching points for a campaign to recapture the Solomons, isolate or capture the major Japanese base at Rabaul, and support the Allied New Guinea campaign, which was then building strength under General Douglas MacArthur. The landings initiated the six-month-long Guadalcanal campaign.
The overall commander of Allied naval forces in the Guadalcanal and Tulagi operation was U.S. Vice Admiral Frank Jack Fletcher. He also commanded the carrier task groups providing air cover. U.S. Rear Admiral Richmond K. Turner commanded the amphibious fleet that delivered the 16,000 Allied troops to Guadalcanal and Tulagi. Also under Turner was Rear Admiral Victor Crutchley's screening force of eight cruisers, fifteen destroyers, and five minesweepers. This force was to protect Turner's ships and provide gunfire support for the landings. Crutchley commanded his force of mostly American ships from his flagship, the Australian heavy cruiser HMAS Australia.
The Allied landings took the Japanese by surprise. The Allies secured Tulagi, nearby islets Gavutu and Tanambogo, and the airfield under construction on Guadalcanal by nightfall on 8 August. On 7–8 August Japanese aircraft based at Rabaul attacked the Allied amphibious forces several times, setting afire the U.S. transport ship George F. Elliott (which sank later) and heavily damaging the destroyer USS Jarvis. In these air attacks, the Japanese lost 36 aircraft, while the U.S. lost 19 aircraft, including 14 carrier-based fighter aircraft.
Concerned over the losses to his carrier fighter aircraft strength, anxious about the threat to his carriers from further Japanese air attacks, and worried about his ships' fuel levels, Fletcher announced that he would withdraw his carrier task forces on the evening of 8 August. Some historians contend that Fletcher's fuel situation was not at all critical but that Fletcher used it to justify his withdrawal from the battle area. Fletcher's biographer notes that Fletcher concluded that the landing was a success and that no important targets for close air support were at hand. Turner, however, believed that Fletcher understood that he was to provide air cover until all the transports were unloaded on 9 August.
Even though the unloading was going more slowly than planned, Turner decided that without carrier air cover, he would have to withdraw his ships from Guadalcanal. He planned to unload as much as possible during the night and depart the next day.
### Japanese response
Unprepared for the Allied operation at Guadalcanal, the initial Japanese response included airstrikes and an attempted reinforcement. Mikawa, commander of the newly formed Japanese Eighth Fleet headquartered at Rabaul, loaded 519 naval troops on two transports and sent them towards Guadalcanal on 7 August. When the Japanese learned that Allied forces at Guadalcanal were stronger than originally reported, the transports were recalled.
Mikawa also assembled all the available warships in the area to attack the Allied forces at Guadalcanal. At Rabaul were the heavy Takao-class cruiser Chōkai (Mikawa's flagship), the light cruisers Tenryū (of Cruiser Division 18) and Yūbari, and the destroyer Yūnagi. En route from Kavieng were four heavy cruisers of Cruiser Division 6 under Rear Admiral Aritomo Goto: the Aoba-class Aoba and Kinugasa and the Furutaka-class Furutaka and Kako, totaling 34 8-inch main guns.
The Japanese Navy had trained extensively in night-fighting tactics before the war, a fact of which the Allies were unaware. Mikawa hoped to engage the Allied naval forces off Guadalcanal and Tulagi on the night of 8–9 August when he could employ his night-battle expertise while avoiding attacks from Allied aircraft, which could not operate effectively at night. Mikawa's warships rendezvoused at sea near Cape St. George in the evening of 7 August and then headed east-southeast.
## Battle
### Prelude
Mikawa decided to take his fleet north of Buka Island and then down the east coast of Bougainville. The fleet paused east of Kieta for six hours on the morning of 8 August to avoid daytime air attacks during its final approach to Guadalcanal. Mikawa proceeded along the dangerous New Georgia Sound (known as "The Slot"), hoping that no Allied plane would see them in the fading light. The Japanese fleet had in fact been sighted in St George Channel, where the column almost ran into the submarine USS S-38, lying in ambush. She was too close to fire torpedoes, but her captain, Lieutenant Commander Henry G. Munson, reported the sighting. Once at Bougainville, Mikawa spread his ships out over a wide area to mask the composition of his force and launched four floatplanes from his cruisers to scout for Allied ships in the southern Solomons.
At 10:20 and 11:10, his ships were spotted by Royal Australian Air Force (RAAF) Lockheed Hudson reconnaissance aircraft based at Milne Bay in New Guinea. The first Hudson's crew tried to report the sighting to the Allied radio station at Fall River, New Guinea. Receiving no acknowledgment, they returned to Milne Bay at 12:42 to ensure that the report was received as soon as possible. The second Hudson also failed to report its sighting by radio but completed its patrol and landed at Milne Bay at 15:00. For unknown reasons, these reports were not relayed to the Allied fleet off Guadalcanal until 18:45 and 21:30, respectively. U.S. official historian Samuel Morison wrote in his 1949 account that the RAAF Hudson's crew failed to report the sighting until after they had landed and even had tea. This claim made international headlines and was repeated by many subsequent historians. Later research has discredited this version of events, and in 2014 the U.S. Navy's Naval History and Heritage Command acknowledged, in a letter to the Hudson's radio operator, who had lobbied for decades to clear his crewmates' names, that Morison's criticisms were "unwarranted."
Mikawa's floatplanes returned around 12:00 and reported two groups of Allied ships, one off Guadalcanal and the other off Tulagi. By 13:00, he reassembled his warships and headed south through Bougainville Strait at 24 knots (44 km/h). At 13:45, the cruiser force was near Choiseul southeast of Bougainville. At that time, several surviving Japanese aircraft from the noon torpedo raid on the Allied ships off the coast of Guadalcanal flew over the cruisers on the way back to Rabaul and gave them waves of encouragement. Mikawa entered The Slot by 16:00 and began his run towards Guadalcanal. He communicated the following battle plan to his warships: "On the rush-in we will go from S. (south) of Savo Island and torpedo the enemy main force in front of Guadalcanal anchorage; after which we will turn toward the Tulagi forward area to shell and torpedo the enemy. We will then withdraw north of Savo Island."
Mikawa's run down The Slot was not detected by Allied forces. Turner had requested that U.S. Admiral John S. McCain Sr., commander of Allied air forces for the South Pacific Area, conduct extra reconnaissance missions over The Slot in the afternoon of 8 August. For unexplained reasons, McCain neither ordered the missions nor told Turner that they had not been carried out, so Turner mistakenly believed that The Slot was under Allied observation throughout the day. McCain cannot bear all of the blame, however, as his patrol aircraft were few in number and operated over a vast area at the extreme limit of their endurance. The cruiser force's fifteen scouting planes were never used that afternoon; they remained on the decks of their cruisers, filled with gasoline, where they posed an explosive hazard.
To protect the unloading transports during the night, Crutchley divided the Allied warship forces into three groups. A "southern" group, consisting of the Australian cruisers HMAS Australia and HMAS Canberra, the cruiser USS Chicago, and the destroyers USS Patterson and USS Bagley, patrolled between Lunga Point and Savo Island to block the entrance between Savo Island and Cape Esperance on Guadalcanal. A "northern" group, consisting of the cruisers USS Vincennes, USS Astoria and USS Quincy, and the destroyers USS Helm and USS Wilson, conducted a box-shaped patrol between the Tulagi anchorage and Savo Island to defend the passage between Savo and Florida Islands. An "eastern" group, consisting of the cruisers USS San Juan and HMAS Hobart with the destroyers USS Monssen and USS Buchanan, guarded the eastern entrances to the sound between Florida and Guadalcanal Islands. Crutchley placed two radar-equipped U.S. destroyers to the west of Savo Island to provide early warning for any approaching Japanese ships. The destroyer USS Ralph Talbot patrolled the northern passage and the destroyer USS Blue patrolled the southern passage, with a gap of 12–30 kilometers (7.5–18.6 mi) between their uncoordinated patrol patterns.
At this time, the Allies were unaware of all of the limitations of their primitive ship-borne radar, such as the fact that its effectiveness could be greatly degraded by the presence of nearby landmasses. Chicago's Captain Bode ordered his ship's radar to be used only intermittently out of concern that it would reveal his position, a decision that conformed with general Navy radar usage guidelines but which may have been incorrect in this specific circumstance. He allowed a single sweep every half hour with the fire control radar, but the timing of the last pre-engagement sweep was too early to detect the approaching Japanese cruisers. Wary of the potential threat from Japanese submarines to the transport ships, Crutchley placed his remaining seven destroyers as close-in protection around the two transport anchorages.
The crews of the Allied ships were fatigued after two days of constant alert and action in supporting the landings. Also, the weather was extremely hot and humid, inducing further fatigue and, in Morison's words, "inviting weary sailors to slackness." In response, most of Crutchley's warships went to "Condition II" the night of 8 August, which meant that half the crews were on duty while the other half rested, either in their bunks or near their battle stations.
In the evening, Turner called a conference on his command ship off Guadalcanal with Crutchley and Marine commander Major General Alexander A. Vandegrift to discuss the departure of Fletcher's carriers and the resulting withdrawal schedule for the transport ships. At 20:55, Crutchley left the southern group in Australia to attend the conference, leaving Bode in charge of the southern group. Crutchley did not inform the commanders of the other cruiser groups of his absence, contributing further to the dissolution of command arrangements. Bode, awakened from sleep in his cabin, decided not to place his ship in the lead of the southern group of ships, the customary place for the senior ship, and went back to sleep. At the conference, Turner, Crutchley, and Vandegrift discussed the reports of the "seaplane tender" force reported by the Australian Hudson crew earlier that day. They decided that it would not be a threat that night, because seaplane tenders did not normally engage in a surface action. Vandegrift said that he would need to inspect the transport unloading situation at Tulagi before recommending a withdrawal time for the transport ships, and he departed at midnight to conduct the inspection. Crutchley elected not to return with Australia to the southern force but instead stationed his ship just outside the Guadalcanal transport anchorage, without informing the other Allied ship commanders of his intentions or location.
As Mikawa's force neared the Guadalcanal area, the Japanese ships launched three floatplanes for one final reconnaissance of the Allied ships, and to provide illumination by dropping flares during the upcoming battle. Although several of the Allied ships heard and/or observed one or more of these floatplanes, starting at 23:45, none of them interpreted the presence of unknown aircraft in the area as an actionable threat, and no one reported the sightings to Crutchley or Turner. Mikawa's force approached in a single 3-kilometer (1.9 mi) column led by Chōkai, with Aoba, Kako, Kinugasa, Furutaka, Tenryū, Yūbari, and Yūnagi following. Sometime between 00:44 and 00:54 on 9 August, lookouts in Mikawa's ships spotted Blue about 9 kilometers (5.6 mi) ahead of the Japanese column.
### Action south of Savo
To avoid Blue, Mikawa changed course to pass north of Savo Island. He also ordered his ships to slow to 22 knots (41 km/h) to reduce wakes that might make his ships more visible. Mikawa's lookouts spied either Ralph Talbot about 16 kilometers (9.9 mi) away or a small schooner of unknown nationality. The Japanese ships held their course while pointing more than 50 guns at Blue, ready to open fire at the first indication that Blue had sighted them. When Blue was less than 2 kilometers (1.2 mi) away from Mikawa's force, she reversed course, having reached the end of her patrol track, and steamed away, apparently oblivious to the long column of large Japanese ships sailing by her. Seeing that his ships were still undetected, Mikawa turned back to a course south of Savo Island and increased speed, first to 26 knots (48 km/h), and then to 30 knots (56 km/h). At 01:25, Mikawa released his ships to operate independently of his flagship, and at 01:31 he ordered "Every ship attack."
At about this time, Yūnagi detached from the Japanese column and reversed direction, perhaps because she lost sight of the other Japanese ships ahead of her, or perhaps she was ordered to provide a rearguard for Mikawa's force. One minute later, Japanese lookouts sighted a warship to port. This ship was the destroyer USS Jarvis, heavily damaged the day before and departing Guadalcanal independently for repairs in Australia. Whether Jarvis sighted the Japanese ships is unknown, since her radios had been destroyed. Furutaka launched torpedoes at Jarvis, which all missed. The Japanese ships passed as close to Jarvis as 1,100 meters (1,200 yd), close enough for officers on Tenryū to look down onto the destroyer's decks without seeing any of her crew moving about. If Jarvis was aware of the Japanese ships passing by, she did not respond in any noticeable way and was torpedoed and sunk the following day by aircraft from Rabaul. There were no survivors.
After sighting Jarvis, the Japanese lookouts sighted the Allied destroyers and cruisers of the southern force about 12,500 meters (13,700 yd) away, silhouetted by the glow from the burning George F. Elliott. At about 01:38, the Japanese cruisers began launching salvos of torpedoes at the Allied southern force ships. At this same time, lookouts on Chōkai spotted the ships of the Allied northern force at a range of 16 kilometers (9.9 mi). Chōkai turned to face this new threat, and the rest of the Japanese column followed, while still preparing to engage the Allied southern force ships with gunfire.
Patterson's crew was alert because the destroyer's captain had taken seriously the earlier daytime sightings of Japanese warships and evening sightings of unknown aircraft. At 01:43, Patterson spotted a ship, probably Kinugasa, 5,000 meters (5,500 yd) dead ahead and immediately sent a warning by radio and signal lamp: "Warning! Warning! Strange ships entering the harbor!" Patterson increased speed to full and fired star shells towards the Japanese column. Her captain ordered a torpedo attack, but his order was not heard over the noise from the destroyer's guns.
At about the same moment that Patterson sighted the Japanese ships and went into action, Japanese floatplanes dropped aerial flares directly over Canberra and Chicago. Canberra responded with Captain Frank Getting ordering an increase in speed and a reversal of an initial turn to port, which kept Canberra between the Japanese and the Allied transports, and for her guns to train out and fire at any targets that could be sighted. As Canberra's guns took aim at the Japanese, Chōkai and Furutaka opened fire on her, scoring numerous hits. Aoba and Kako joined in with gunfire, and Canberra took up to 24 large-caliber hits. Early hits killed her gunnery officer, mortally wounded Getting, and destroyed both boiler rooms, knocking out power to the entire ship before Canberra could fire any of her guns or communicate a warning to other Allied ships. The cruiser glided to a stop, on fire, with a 5- to 10-degree list to starboard, and unable to fight the fires or pump out flooded compartments because of lack of power.
The crew of Chicago, observing the illumination of their ship by air-dropped flares and the sudden turn by Canberra in front of them, came alert and awakened Captain Bode. Bode ordered his 5 in (127 mm) guns to fire star shells towards the Japanese column, but the shells did not function. At 01:47, a torpedo, probably from Kako, hit Chicago's bow, sending a shock wave throughout the ship that damaged the main battery director. A second torpedo hit but failed to explode, and a shell hit the cruiser's mainmast, killing two crewmen. Chicago steamed west for 40 minutes, leaving behind the transports she was assigned to protect. The cruiser fired her secondary batteries at the trailing ships in the Japanese column and may have hit Tenryū, causing slight damage. Bode did not try to assert control over any of the other Allied ships in the southern force, of which he was still technically in command. More significantly, Bode made no attempt to warn any of the other Allied ships or personnel in the Guadalcanal area as his ship headed away from the battle area.
Patterson engaged in a gun duel with the Japanese column. Patterson received a shell hit aft, causing moderate damage and killing 10 crew members. Patterson continued to pursue and fire at the Japanese ships and may have hit Kinugasa, causing moderate damage. Patterson then lost sight of the Japanese column as it headed northeast along the eastern shore of Savo Island. Bagley, whose crew sighted the Japanese shortly after Patterson and Canberra, circled completely around to port before firing torpedoes in the general direction of the rapidly disappearing Japanese column; one or two of these torpedoes may have hit Canberra. Bagley played no further role in the battle. Yūnagi exchanged non-damaging gunfire with Jarvis before exiting the battle area to the west with the intention of eventually rejoining the Japanese column north and west of Savo Island.
At 01:44, as Mikawa's ships headed towards the Allied northern force, Tenryū and Yūbari split from the rest of the Japanese column and took a more westward course. Furutaka, either because of a steering problem, or to avoid a possible collision with Canberra, followed Yūbari and Tenryū. Thus, the Allied northern force was about to be enveloped and attacked from two sides.
### Action north of Savo
When Mikawa's ships attacked the Allied southern force, the captains of all three U.S. northern force cruisers were asleep, with their ships steaming quietly at 10 knots (19 km/h). Although crewmen on all three ships observed flares or gunfire from the battle south of Savo or else received Patterson's warning of threatening ships entering the area, it took some time for the crews to go from Condition II to full alert. At 01:44, the Japanese cruisers began firing torpedoes at the northern force. At 01:50, they aimed powerful searchlights at the three northern cruisers and opened fire with their guns.
Astoria's bridge crew called general quarters upon sighting the flares south of Savo, around 01:49. At 01:52, shortly after the Japanese searchlights came on and shells began falling around the ship, Astoria's main gun director crews spotted the Japanese cruisers and opened fire. Astoria's captain, awakened to find his ship in action, rushed to the bridge and ordered a ceasefire, fearful that his ship might be firing on friendly forces. As shells continued to cascade around his ship, the captain ordered firing resumed less than a minute later. Chōkai had found the range, and Astoria was quickly hit by numerous shells and set afire. Between 02:00 and 02:15, Aoba, Kinugasa, and Kako joined Chōkai in pounding Astoria, destroying the cruiser's engine room and bringing the flaming ship to a halt. At 02:16, one of Astoria's remaining operational main gun turrets fired at Kinugasa's searchlight but missed and hit one of Chōkai's forward turrets, putting the turret out of action and causing moderate damage to the ship. Astoria sank at 12:16 after all attempts to save her failed.
Quincy had also seen the aircraft flares over the southern ships, received Patterson's warning, and had just sounded general quarters and was coming alert when the searchlights from the Japanese column came on. Quincy's captain gave the order to commence firing, but the gun crews were not ready. Within a few minutes, Quincy was caught in a crossfire between Aoba, Furutaka, and Tenryū, and was hit heavily and set afire. Quincy's captain ordered his cruiser to charge towards the eastern Japanese column, but as she turned to do so Quincy was hit by two torpedoes from Tenryū, causing severe damage. Quincy managed to fire a few main gun salvos, one of which hit Chōkai's chart room 6 meters (20 ft) from Admiral Mikawa and killed or wounded 36 men, although Mikawa was not injured. At 02:10, incoming shells killed or wounded almost all of Quincy's bridge crew, including the captain. At 02:16, the cruiser was hit by a torpedo from Aoba, and the ship's remaining guns were silenced. Quincy's assistant gunnery officer, sent to the bridge to ask for instructions, reported on what he found:
> When I reached the bridge level, I found it a shambles of dead bodies with only three or four people still standing. In the Pilot House itself the only person standing was the signalman at the wheel who was vainly endeavoring to check the ship's swing to starboard to bring her to port. On questioning him I found out that the Captain, who at that time was laying [sic] near the wheel, had instructed him to beach the ship and he was trying to head for Savo Island, distant some four miles (6 km) on the port quarter. I stepped to the port side of the Pilot House, and looked out to find the island and noted that the ship was heeling rapidly to port, sinking by the bow. At that instant the Captain straightened up and fell back, apparently dead, without having uttered any sound other than a moan.
Quincy sank, bow first, at 02:38.
Like Quincy and Astoria, Vincennes also sighted the aerial flares to the south, and furthermore, actually sighted gunfire from the southern engagement. At 01:50, when the U.S. cruisers were illuminated by the Japanese searchlights, Vincennes hesitated to open fire, believing that the searchlight's source might be friendly ships. Kako opened fire on Vincennes, which responded with her own gunfire at 01:53. As Vincennes began to receive damaging shell hits, her commander, Captain Frederick L. Riefkohl, ordered an increase of speed to 25 knots (46 km/h), but at 01:55 two torpedoes from Chōkai hit, causing heavy damage. Kinugasa joined Kako in pounding Vincennes. Vincennes scored one hit on Kinugasa, causing moderate damage to her steering engines. The rest of the Japanese ships also fired and hit Vincennes up to 74 times, and at 02:03 another torpedo hit her, this time from Yūbari. With all boiler rooms destroyed, Vincennes came to a halt, burning "everywhere" and listing to port. At 02:16, Riefkohl ordered the crew to abandon ship, and Vincennes sank at 02:50.
During the engagement, the U.S. destroyers Helm and Wilson struggled to see the Japanese ships. Both destroyers briefly fired at Mikawa's cruisers but caused no damage and received no damage to themselves. At 02:16, the Japanese columns ceased fire on the northern Allied force as they moved out of range around the north side of Savo Island. Ralph Talbot encountered Furutaka, Tenryū, and Yūbari as they cleared Savo Island. The Japanese ships fixed Ralph Talbot with searchlights and hit her several times with gunfire, causing heavy damage, but Ralph Talbot escaped into a nearby rain squall, and the Japanese ships left her behind.
### Mikawa's decision
At 02:16 Mikawa conferred with his staff about whether they should turn to continue the battle with the surviving Allied warships and try to sink the Allied transports in the two anchorages. Several factors influenced his ultimate decision. His ships were scattered and would take some time to regroup. His ships would need to reload their torpedo tubes, a labor-intensive task that would take some time. Mikawa also did not know the number and locations of any remaining Allied warships, and his ships had expended much of their ammunition.
More importantly, Mikawa had no air cover and believed that U.S. aircraft carriers were in the area. Mikawa was probably aware that the Japanese Navy had no more heavy cruisers in production and thus would be unable to replace any he might lose to air attack the next day if he remained near Guadalcanal. He was unaware that the U.S. carriers had withdrawn from the battle area and would not be a threat the next day. Although several of Mikawa's staff urged an attack on the Allied transports, the consensus was to withdraw from the battle area. Therefore, at 02:20, Mikawa ordered his ships to retire.
## Aftermath
### Allied
At 04:00 on 9 August, Patterson came alongside Canberra to assist the cruiser in fighting her fires. By 05:00, it appeared that the fires were almost under control, but Turner, who at this time intended to withdraw all Allied ships by 06:30, ordered the ship to be scuttled if she was not able to accompany the fleet. After the survivors were removed, the destroyers USS Selfridge and USS Ellet sank Canberra, which took some 300 shells and five torpedoes.
Later in the morning, Vandegrift advised Turner that he needed more supplies unloaded from the transports before they withdrew. Therefore, Turner postponed the withdrawal of his ships until mid-afternoon. In the meantime, Astoria's crew tried to save their sinking ship, but the fires eventually burned completely out of control, and the ship sank at 12:16.
On the morning of 9 August, an Australian coastwatcher on Bougainville radioed a warning of a Japanese airstrike on the way from Rabaul. The Allied transport crews ceased unloading for a time but were puzzled when the airstrike did not materialize. Allied forces did not discover until after the war was over that this Japanese airstrike had instead concentrated on Jarvis south of Guadalcanal, sinking her with all hands. The Allied transports and warships all departed the Guadalcanal area by nightfall on 9 August. During the naval surface battle of Savo Island, three U.S. heavy cruisers, Astoria (219 killed), Quincy (370 killed), and Vincennes (322 killed), and one Australian heavy cruiser, Canberra (84 killed), were sunk or scuttled. The commanding officers of Canberra and Quincy were also killed in action. Chicago spent the next six months in drydock, returned to Guadalcanal in late January 1943, and was sunk in the campaign's last naval engagement, the Battle of Rennell Island.
### Japanese
In the late evening of 9 August, Mikawa on Chōkai released the four cruisers of Cruiser Division 6 to return to their home base at Kavieng. At 08:10 on 10 August, Kako was torpedoed and sunk by the submarine USS S-44 110 kilometers (68 mi) from her destination. The other three Japanese cruisers picked up all but 71 of her crew and went on to Kavieng.
Admiral Isoroku Yamamoto signaled a congratulatory note to Mikawa on his victory, stating, "Appreciate the courageous and hard fighting of every man of your organization. I expect you to expand your exploits and you will make every effort to support the land forces of the Imperial army which are now engaged in a desperate struggle." Later, however, when it became apparent that Mikawa had missed an opportunity to destroy the Allied transports, he was intensely criticized by his comrades.
## Tactical result
From the time of the battle until several months later, almost all Allied supplies and reinforcements sent to Guadalcanal came by transports in small convoys, mainly during daylight hours, while Allied aircraft from the New Hebrides and Henderson Field and any available aircraft carriers flew covering missions. During this time, Allied forces on Guadalcanal received barely enough ammunition and provisions to withstand the several Japanese drives to retake the islands.
Despite their defeat in this battle, the Allies eventually won the battle for Guadalcanal, an important step in the defeat of Japan. In hindsight, according to Richard B. Frank, if Mikawa had elected to risk his ships to go after the Allied transports on the morning of 9 August, he could have improved the chances of Japanese victory in the Guadalcanal campaign at its inception, and the course of the war in the southern Pacific could have gone much differently. Although the Allied warships at Guadalcanal that night were completely routed, the transports were unaffected. Many of these same transports were later used many times to bring crucial supplies and reinforcements to Allied forces on Guadalcanal over succeeding months. Mikawa's decision not to destroy the Allied transport ships when he had the opportunity proved to be a crucial strategic mistake for the Japanese.
## U.S. Navy board of inquiry
A formal United States Navy board of inquiry, known as the Hepburn Investigation, prepared a report of the battle. The board interviewed most of the major Allied officers involved over several months, beginning in December 1942. The report recommended official censure for Captain Howard D. Bode of the Chicago for failing to broadcast a warning to the fleet of encroaching enemy ships. The report stopped short of recommending formal action against other Allied officers, including Admirals Fletcher, Turner, McCain, and Crutchley, and Captain Riefkohl. The careers of Turner, Crutchley, and McCain do not appear to have been affected by the defeat or the mistakes they made in contributing to it. Riefkohl never commanded ships again. Bode, upon learning that the report was going to be especially critical of his actions, shot himself in his quarters at Balboa, Panama Canal Zone, on 19 April 1943 and died the next day. Crutchley was later gazetted with the Legion of Merit (Chief Commander).
Admiral Turner assessed why his forces were so soundly defeated in the battle:
> "The Navy was still obsessed with a strong feeling of technical and mental superiority over the enemy. In spite of ample evidence as to enemy capabilities, most of our officers and men despised the enemy and felt themselves sure victors in all encounters under any circumstances. The net result of all this was a fatal lethargy of mind which induced a confidence without readiness, and a routine acceptance of outworn peacetime standards of conduct. I believe that this psychological factor, as a cause of our defeat, was even more important than the element of surprise."
Historian Frank adds that "This lethargy of mind would not be completely shaken off without some more hard blows to (U.S.) Navy pride around Guadalcanal, but after Savo, the United States picked itself up off the deck and prepared for the most savage combat in its history."
The report of the inquiry caused the U.S. Navy to make many operational and structural changes. All the earlier models of U.S. Navy cruisers were retrofitted with emergency diesel-electric generators. The fire mains of the ships were changed to a vertical loop design that could be broken many times and still function. During the battle, many ship fires were attributed to aviation facilities filled with gasoline, oil, and aircraft. Motorboats were filled with gasoline and also caught fire. In some cases, these facilities were dead amidships, presenting a perfect target for enemy ships at night. Ready-service lockers (lockers containing ammunition that is armed and ready for use) added to the destruction, and it was noted that the lockers were never close to being depleted; that is, they held far more live ammunition than they needed. A focus was put on removing or minimizing flammable material amidships. Admiral Ernest J. King, the commander in chief of the United States Fleet, ordered sweeping changes to be made before ships entered surface combat in the future.
## See also
- The Second Battle of Savo Island (a.k.a. the Battle of Cape Esperance)
- The Third Battle of Savo Island (a.k.a. the Naval Battle of Guadalcanal)
- The Fourth Battle of Savo Island (a.k.a. the Battle of Tassafaronga)
# Operation PBHistory
Operation PBHistory was a covert operation carried out in Guatemala by the United States Central Intelligence Agency (CIA). It followed Operation PBSuccess, which led to the overthrow of Guatemalan President Jacobo Árbenz in June 1954 and ended the Guatemalan Revolution. PBHistory attempted to use documents left behind by Árbenz's government and by organizations related to the communist Guatemalan Party of Labor to demonstrate that the Guatemalan government had been under the influence of the Soviet Union, and to use those documents to obtain further intelligence that would be useful to US intelligence agencies. It was an effort to justify the overthrow of the elected Guatemalan government in response to the negative international reactions to PBSuccess. The CIA also hoped to improve its intelligence resources about communist parties in Latin America, a subject on which it had little information.
The first phase of the operation began soon after Árbenz's resignation on June 27, 1954: several agents were dispatched to Guatemala beginning on July 4. These included agents of the CIA and the Office of Intelligence Research (OIR). The first phase involved the collection of 150,000 documents from sources including Árbenz's personal possessions, trade union offices, and police agencies. The ruling military junta led by Carlos Castillo Armas aided these efforts. Following a presentation made to US President Dwight Eisenhower on July 20, a decision was taken to accelerate the operation, and the number of people working in Guatemala was increased. The new team members posed as unaffiliated with the US government to maintain plausible deniability. The operation helped set up the Guatemalan National Committee of Defense Against Communism, which was covertly funded by the CIA: agents of the committee became involved in PBHistory. The team studied over 500,000 documents in total, and finished processing documents on September 28, 1954.
PBHistory documents were used to support the CIA's existing operations Kufire and Kugown, which sought to track Latin American communists and to disseminate information critical of the Árbenz government. Documents were also shared with the Kersten Committee of the US House of Representatives, which publicized PBHistory within the US. The documents uncovered by the operation proved useful to the Guatemalan intelligence agencies, enabling the creation of a register of suspected communists. Operation PBHistory did not succeed in its chief objective of finding evidence that the Guatemalan communists were being controlled by the Soviet government, and was unable to counter the international narrative that the United States had toppled the government of Jacobo Árbenz to serve the interests of the United Fruit Company.
## Background and origins
The October Revolution of 1944 in Guatemala led to the election of Juan José Arévalo as President of Guatemala, who initiated reforms based on liberal capitalism. Arévalo was an anti-communist, and cracked down on the communist Guatemalan Party of Labor (Partido Guatemalteco del Trabajo, PGT). The US government was nonetheless suspicious that he was under the influence of the Soviet Union. Arévalo's defense minister Jacobo Árbenz was elected president in 1950. Influenced partly by McCarthyism, the US government was predisposed to see communist influence in Árbenz's government, particularly after the legalization of the PGT. Árbenz also had personal ties to some PGT members. In 1952, Árbenz began an agrarian reform program that transferred uncultivated land from large landowners to poor laborers in return for compensation. In response, the US-based United Fruit Company, which had large landholdings in Guatemala, intensively lobbied the US government for Árbenz's overthrow.
US President Dwight Eisenhower authorized a Central Intelligence Agency (CIA) operation to overthrow Árbenz, code-named Operation PBSuccess in August 1953. Within the CIA, the operation was headed by Frank Wisner, who had worked in the US intelligence services since World War II. While preparations for Operation PBSuccess were underway, the US government issued a series of statements denouncing the Guatemalan government, alleging that it had been infiltrated by communists. On June 18, 1954 Carlos Castillo Armas, a Guatemalan army Colonel in exile since a failed coup in 1949, led an invasion force of 480 men into Guatemala. The invasion was supported by a campaign of psychological warfare, which presented Castillo Armas's victory as a fait accompli to the Guatemalan people. Worried by the possibility of a US invasion, the Guatemalan army refused to fight, and on June 27 Árbenz resigned.
The actions of the United States resulted in international outrage. Media outlets across the world accused the US of sponsoring a coup to reverse Árbenz's agrarian reform in the interests of the United Fruit Company. This criticism was influenced by the coverage put out by media outlets in communist-controlled countries, but was repeated in the media in countries that were US allies, with Britain's Labour Party and the Swedish Social Democratic Party joining in. Latin American opposition to the US reached a new peak: author Daniel James stated that "No one could recall so intense and universal a wave of anti-US sentiment in the entire history of Latin America." Although people within the US saw the coup as a triumph for US foreign policy, CIA officials felt that in order for Operation PBSuccess to be termed a success, further action was needed. Thus, the CIA was interested in finding documents that would allow it to portray the administration of Árbenz as being controlled by Soviet communists, and thus to justify the coup.
Due to the quick overthrow of the Árbenz government, the CIA believed that the government and the PGT leaders would not have been able to destroy any incriminating documents, and that these could be analyzed to demonstrate Árbenz's supposed ties to the Soviet Union. According to historian Nick Cullather, Wisner hoped to "expose Soviet machinations throughout the hemisphere". The CIA also believed that it could better understand the workings of Latin American communist parties, on which subject the CIA had very little information. Although there had been an active communist movement in Latin America since 1919, it had largely been clandestine, and the CIA knew little about the methods that parties like the PGT used. The CIA hoped that PGT records left behind in haste would enable its international Communism Division to reconstruct the party's leadership and organizational structure, and possibly do the same for other communist parties in the region.
The CIA also hoped to exploit the aftermath of the coup to bolster its own intelligence resources. Wisner, who was serving as Deputy Director for Plans at the time of the coup, hoped to recruit agents both from among communists who wanted to defect, and from other Guatemalans who might become a part of the new government. In Wisner's words, he wished to identify "people who can be controlled and exploited to further US policy". Furthermore, the agency hoped to use the findings of the operation to demonstrate the extent of Soviet influence for propaganda purposes, and also to use the information gathered to eliminate any communist influence in Guatemala.
## Document analysis
### First phase
On June 30, 1954, three days after the resignation of Árbenz, Wisner sent a telegram that later became known as the "shift of gears cable". Two agents from the CIA, and two from the Office of Intelligence Research (OIR), arrived in Guatemala City on July 4. Castillo Armas had arrived in the capital on the previous day to assume the presidency. One of the CIA officers was Lothar Metzl, who was on the counterintelligence staff of the CIA. Metzl was an Austrian who had studied communist movements, including those in Europe, since the 1930s. In Wisner's words, the agents were supposed to perform a "snatch job on documents while the melon was freshly burst open".
The initial targets of the operation were the personal possessions and documents of Árbenz and those of Carlos Enrique Díaz (who had been chief of the armed forces under Árbenz, and briefly his successor as president), the offices of trade unions, known front organizations, police agencies, and the headquarters of the Partido Guatemalteco del Trabajo (PGT). The results of the initial searches were disappointing for the CIA; many of the offices had already been plundered both by the Guatemalan army and by other looters. The CIA were particularly interested in finding documents that mentioned the Árbenz government's purchase of weapons from Czechoslovakia, but they were unsuccessful. They were also unsuccessful in finding any evidence that the Soviet Union had controlled the communist movement in Guatemala.
Despite these difficulties, the agents collected 150,000 documents, in addition to a number of government files, which the agency judged to be useful. The CIA received assistance in collecting these from Castillo Armas's junta, and from the Guatemalan army. The haul was described as the "greatest catch of documents ever left behind by a Communist Party and its auxiliaries". Most of these had nothing but "local significance". Although no documents were discovered demonstrating Soviet influence, the CIA hoped to use the large number of papers to show that the communists in Guatemala had had a large influence over the government, through institutions like labor unions, peasant organizations, student unions, and youth groups.
On July 20, the CIA agents presented the results of their first two weeks of work in Washington. At Wisner's request, Tracy Barnes—principal manager of CIA operations in PBSuccess—created a booklet from these documents to show to US President Dwight Eisenhower. The 23 documents in the booklet included communist literature owned by Árbenz, such as a Chinese study on agrarian reform and some Marxist volumes, as well as diplomatic records implying communist sympathies, and a biography of Joseph Stalin belonging to Árbenz's wife María Cristina Vilanova. After the presentation, Wisner decided that the examination of the seized documents needed to proceed faster, and so expanded the group of agents working in Guatemala.
One of the aims of the new team was to help Castillo Armas establish an intelligence agency that would be able to fight communism in Guatemala. Castillo Armas was pressured to create an anti-communist task force, which he did on July 20—creating the National Committee of Defense Against Communism (Comité de Defensa Nacional contra el Comunismo). The purpose of this group was to create an anti-communist bureaucracy and intelligence service, to organize records, and to facilitate PBHistory. The Comité secretly received funds from the CIA, with the understanding that this fact could prove "very embarrassing" and that a new source would eventually need to be found. Although the Comité was theoretically an intelligence agency, it also had some police powers: it could order the arrest of anybody suspected of being a communist, and had oversight over all army and police authorities. The CIA team was supposed to help it set up by creating a nucleus of information about people associated with the PGT.
The Comité was not, however, granted the power to arrest prominent government officials who had served under Árbenz, or to search their houses, largely because Castillo Armas and other military leaders lacked trust in the Comité. Nevertheless, the Comité was able to conduct personal searches of exiles as they left the country. This proved ineffective, as very few of the documents found in these searches were revealing.
### Second phase
On August 4, a new and larger US contingent was deployed to Guatemala. In order to remain covert, this group identified itself to Castillo Armas as the "Social Research Group", composed of businessmen and experts from universities. It consisted of eight CIA officers, three men from the State Department, and one from the US Information Agency. It was led by an officer working under the pseudonym "Francis T. Mylkes," and included David Atlee Phillips, who was fluent in Spanish and had been part of the PBSuccess team. The group presented itself as unaffiliated with the US government in order to avoid nationalist backlash and to maintain plausible deniability. The new PBHistory group worked directly with the new Guatemalan Comité, training its 25 agents and using them to procure documents; the training covered topics including screening, organizing, and classifying confiscated documents, as well as "the rudiments of mail control, logging, abstracting, and cryptic reference".
Eventually, the 25 personnel of the Comité joined the CIA officers in sorting and processing the seized documents. The CIA officers had a separate side entrance to the building in which the operations took place, in order to maintain the image that the operation was a Guatemalan internal affair. The task of sorting through the papers proved to be daunting: by September, the main index of the material contained 15,800 cards. All hand-written material was preserved, and multiple copies of printed material were kept. Every document had to be reproduced, because the original copies of each one were to remain in Guatemala. Approximately half of the paper that had been gathered was incinerated. The CIA gave the highest priority to the documents seized from the PGT.
By September 1954, the PBHistory agents had found only a few top secret documents. Some documents showed that government officials and communist party leaders had been able to dispose of most of the sensitive material before they left. In the period of uncertain leadership that followed Árbenz's overthrow, a member of one of the series of ruling juntas had prevented the Comité from searching the homes of private citizens, and from arresting them, which potentially reduced the number of sensitive documents the CIA had access to. Additionally, Castillo Armas stated after taking power that the intelligence information of the army had been completely destroyed.
The CIA finished processing documents on September 28, 1954. By this point, the agents had parsed through more than 500,000 unique documents. Photostatic copies were taken of 2,095 important documents, 750 photographs of the material were published for the use of the media, and 50,000 documents were microfilmed. Copies of a handful of important documents were distributed to the various agencies that had been a part of PBHistory, as well as the US Federal Bureau of Investigation.
## Document exploitation
The agencies that participated in Operation PBHistory had different aims in mind for the products of the operation. The CIA was most interested in using the information gathered against communist movements in Latin America and elsewhere. The State Department wanted to use the documents to reconstruct the history of the communist party within Guatemala, while the highest priority of the United States Information Agency was to release information from them that could change international opinion. The agents in charge of the operation were expected to balance these interests; however, as the organization behind the operation, the CIA had veto power over any public use. The work done by the PBHistory team assisted two existing CIA operations, Operation Kufire and Operation Kugown, both of which had been a part of Operation PBSuccess.
### Operation Kufire
Operation Kufire was a wide-ranging effort to track communists from various countries across Latin America who had come to Guatemala during the presidency of Árbenz. The CIA expected that these individuals would return to their home countries, or to other countries that had liberal policies about political asylum, and by tracking them, the CIA hoped to disrupt their activities. During the course of this operation, a CIA analyst asked Phillips whether to open a file on a 25-year-old physician, who did not at the time appear to be particularly threatening. Phillips said yes, thereby opening a file on Ernesto "Che" Guevara, which would quickly become one of the thickest files the CIA had on a single individual. Che's name was added to a secret CIA kill list of individuals targeted for assassination. Few other documents resulted in files that were of enduring value to the CIA.
### Operation Kugown
Operation Kugown was the name given to the psychological warfare operation that had played an important part in the overthrow of Árbenz. During the coup, its primary target had been the Árbenz government. After the conclusion of the coup, Kugown continued, targeting the rest of Guatemala, and the wider international audience. The aim of the operation was to disseminate derogatory information about Árbenz, and to convince Guatemalans—and the rest of the world—that Árbenz's regime had been communist-dominated. The use of documents from PBHistory for Operation Kugown began in August 1954. The standard method employed by the CIA was to select a document that could be portrayed as incriminating and to write an explanatory commentary to accompany it. This would then be released to the press by the Comité, so that the local agency could receive some credit. The Comité also released a short documentary film, titled Después Descubrimos La Verdad ("Later We Discovered the Truth"). Through these avenues, news media in Guatemala and elsewhere in Central America were saturated with stories of how the Árbenz government had been controlled by communists.
While the press releases had a substantial impact within Guatemala, the CIA was unable to staunch the continued criticism of the US role in the coup, which came from virtually all countries except for West Germany and the US itself. Very few news agencies chose to run the press releases from the Comité, even though a number of them were put out. Information was sent to press agencies worldwide describing infiltration by the PGT and links among Communists elsewhere; nonetheless, its impact remained minimal. The lack of attention frustrated the PBHistory agents to the point where they planned to stage a false flag attack on their own headquarters, which would later be described as the work of Guatemala's remaining communists. However, the CIA decided that such an attack would need the cooperation of too many "indigenous" people, and the plan was scrapped as being too risky. Operation Kugown also released a large quantity of communist propaganda material that had entered Guatemala from countries within the Soviet sphere of influence; this convinced American journalists such as Donald Grant, of the St. Louis Post-Dispatch, that there must have been a connection between Árbenz and the Soviet government. Ultimately, these operations were unsuccessful in convincing Latin America that the 1954 coup was justified.
### Kersten Committee
Information from the PBHistory documents was disseminated by officials of both the US and Guatemalan governments. US ambassador Henry Cabot Lodge made use of 21 documents in a speech he made at the United Nations. Information was also funnelled to US ambassadors and Congresspeople. The US Congress in 1954 was one of the few Republican-controlled Congresses in many years, and Republicans sought to use an anti-communist push to generate support for themselves and to erode the voting base of the Democratic Party. Anti-communist members of the US Congress, such as Charles J. Kersten and Patrick J. Hillings of the Kersten Committee, enthusiastically involved themselves with PBHistory. By August 1954, Kersten was receiving PBHistory documents from CIA Director Allen Dulles so he could use them in speeches to Congress about the Soviet Union's influence in Guatemala. In September and October 1954, the Kersten Committee held hearings purportedly investigating the penetration of communist influence in Guatemala. PBHistory documents were used in this process, and Castillo Armas became the first head of state to testify before a US Congressional committee (although he did so through a tape recording of his testimony). Although the hearings did little to unearth information about the communist presence in Guatemala, they provided Operation PBHistory with huge publicity within the US.
At the same time, the involvement of Kersten, Hillings, and their committee caused concern within the CIA. Dulles constantly worried that their investigation would damage CIA operations, particularly when Hillings visited Guatemala shortly before PBSuccess was to begin. Congresspeople had not officially been informed of the CIA's role in the coup, and Dulles wished to keep them uninformed. By supplying them with PBHistory documents, Dulles hoped to forestall them from inadvertently exposing the CIA's other projects. Following the hearings, a subcommittee headed by Hillings produced a final report. In addition to stating without evidence that the Guatemalan government had been acting under orders from the Soviet Union, this report falsely claimed that Soviet weapons had been brought covertly to Guatemala by submarine. This unintentionally drew attention to Operation Washtub, a CIA effort to foist incriminating weapons on the Guatemalan government.
### Other uses
Once the CIA had stopped using the documents for propaganda purposes, the agents in charge of PBHistory decided that the best use of the documents they had uncovered would be to record the growth of the communist movement in Guatemala. This research was undertaken by the US State Department's Office of Intelligence Research. The OIR produced a 50-page report after five months of work; the State Department considered it the "definitive answer" to the question of how communism had arisen in Guatemala. The government of Honduras, which had allowed its territory to be used as a "staging area" for the coup against Árbenz, also made use of the PBHistory papers to justify its position. It argued that it had been facing interference in its internal matters from communists in Guatemala.
## Aftermath and analysis
Max Holland, analyzing PBHistory in 2004, wrote that although very few highly sensitive communist documents were found, the operation provided the CIA with its first detailed look at the development of a powerful communist movement. It also allowed the agency to set up a Guatemalan service that would work against the communists, and for these reasons, the CIA judged the operation to be a success. Historian Kate Doyle stated that the documents uncovered by PBHistory allowed the CIA to create a register of suspected communists. The documents were described by participants as an "intelligence goldmine"; the register that the CIA left with the Guatemalan security forces contained information on thousands of citizens.
PBHistory documents were used for years afterward to discredit Árbenz (living in exile) and to counter Soviet propaganda about American imperialism in Guatemala. When Árbenz moved to Montevideo in 1957, the CIA used the PBHistory documents to produce a biography of Árbenz that described him as a Soviet agent, in an attempt to prevent him from moving to Mexico, where opposition to Castillo Armas' regime was coalescing. Nonetheless, Árbenz remained a symbol of principled resistance to the United States, helped in part by Soviet propaganda to that effect. When some PBHistory documents were published, they received little attention in Latin America. Though PBSuccess was viewed positively within the US soon after it occurred, the violence perpetrated by the Guatemalan governments supported by the US in the 1970s and 1980s changed the perception of the coup among the US public.
Despite the report produced by the Office of Intelligence Research, by 1957 the CIA realized that its version of the history of the Árbenz government and the coup was not gaining traction. Books written by defenders of the Árbenz government, which were strongly critical of the US intervention, were generally better received. Nationalist Latin Americans were inclined to view the Castillo Armas government as a creation of the US. As a result, the US government decided to allow Ronald Schneider, a historian who was in the process of completing his Ph.D, to access the PBHistory archive. Schneider published Communism in Guatemala: 1944 to 1954 in 1959. Later observers have stated that the publication may have been subsidized in some way by the CIA: both the Foreign Policy Research Institute, where Schneider worked, and Frederick A. Praeger, who published Schneider's book, received CIA funds. Schneider stated in his book that the Comité was responsible for collecting the documents he accessed, but did not mention the CIA's role in funding the Comité, nor did he explain how the documents came to the US. Schneider's book did not rely on PBHistory material alone, but also on information that he gathered during a trip to Mexico and Guatemala in 1957. The book was generally well received.
The operation failed in its main purpose, which was to find evidence that the government of Árbenz was under Soviet control. A CIA report published on October 1, 1954, stated that "'very few' 'Communist damaging' documents had been found". The US failed to persuade Latin Americans of its point of view on communism: most people viewed the reforms of the Guatemalan Revolution in a positive light, and even Schneider's account, described by Holland as a balanced portrayal, was unable to persuade the public that the Soviet Union was involved in the rise of Guatemalan communism. Political scientist Jeremy Gunn, who was given access to the material collected by the operation, stated that it "found no traces of Soviet control and substantial evidence that Guatemalan Communists acted alone, without support or guidance from outside the country". Nothing useful was discovered with respect to international communism either. In contrast, the Soviet government's portrayal was of a Guatemalan government that did not threaten the interests of the US, but which was nonetheless overthrown in order to protect the United Fruit Company. Over time, this description of the events became the favored one in Latin America.
The Soviet narrative was further strengthened in 1957, when Castillo Armas was overthrown and replaced with a highly reactionary government which further rolled back the reforms of the 1944 revolution; the Eisenhower administration did not react to the coup in any significant way. When Richard Nixon, then US vice-president, visited Latin America in 1958, he encountered severe abuse wherever he went, even from people who were not communists or their sympathizers. PBHistory was also unable to change the strong resentment against the CIA that the Guatemalan coup had created. Writing in 2008 Gunn compared PBHistory to a similarly unsuccessful attempt by the US to justify the invasion of Iraq after it had occurred. Historian Mark Hove has written that "Operation PBHistory proved ineffective because of 'a new, smoldering resentment' that had emerged in Latin America over US intervention in Guatemala." |
# Terra Nova: Strike Force Centauri
Terra Nova: Strike Force Centauri is a 1996 tactical first-person shooter video game developed and published by Looking Glass Technologies. Set in a science-fictional depiction of the 24th century, the game follows a faction of humans who colonize the Alpha Centauri star system to escape from the Hegemony, a totalitarian Earth government. The player assumes the role of Nikola ap Io, the leader of an Alpha Centauri military unit, and undertakes missions against pirates and the Hegemony.
Terra Nova has been cited as one of the first squad-oriented games with three-dimensional (3D) graphics; the player is often assisted by artificially intelligent teammates who may be given tactical commands. Conceived by Looking Glass after the completion of their first game, Ultima Underworld: The Stygian Abyss, Terra Nova was subject to a long and difficult development process, caused in part by the production of its full-motion video cutscenes. The game's TED engine can render 3D outdoor environments and simulate physics; the latter enables such effects as procedural animation.
Terra Nova's critical reception was highly positive. Reviewers praised its tactical elements, and several compared it to the 1995 video game MechWarrior 2: 31st Century Combat. However, reception of its graphics was mixed, and many noted the game's steep system requirements. Despite critical acclaim and sales in excess of 100,000 units, the game was a commercial failure: it did not recoup its development costs. While it was intended to be the first in a series, its low sales led the company to cancel plans for a sequel.
## Gameplay
As a tactical first-person shooter, Terra Nova focuses on combat and takes place from a character's eye view in a three-dimensional (3D) graphical environment. The protagonist wears powered battle armor (PBA) that features lock-on targeting, jumpjets for limited flight, infrared and zoomed vision, and a rechargeable energy shield that protects against attacks. The player uses a freely movable mouse cursor to aim weapons and manipulate the heads-up display (HUD) interface. As with Looking Glass Technologies' earlier game System Shock, the HUD contains three "Multi-Function Displays" (MFDs). These screens may be configured to display tactical information, such as squad command menus, maps and weapon statistics.
The player is usually accompanied by up to three artificially intelligent squadmates, who may be given tactical orders such as holding a position, taking cover or rushing enemies. Squadmates may be commanded as a group or individually; for example, one half of a squad may be used to distract enemies while the other half attacks an objective. Each squad member specializes in weapons, reconnaissance, repairs, demolitions or electronics. Those in the latter four categories may be given special commands, such as repairing a teammate's armor or setting explosive charges. During missions, squad members radio in enemy sightings and status assessments.
The game takes place in 37 missions. Each begins with a briefing that describes such details as objectives, squad size and enemies. Objectives range from rescues and assaults to reconnaissance photography. Additional missions—whose contents may be selected by the player—are available through the game's "Random Scenario Builder". Before undertaking missions, the player outfits the squad and protagonist with PBA suits and equipment. The three types of PBA—Scout, Standard and Heavy—vary in ability; for example, the Scout armor is fast and light, while the Heavy armor is slow and powerful. Each may be fitted with weapons and an "Auxiliary Suit Function" (ASF); the latter ranges from increased jumpjet power to deployable automatic turrets. Only a small amount of equipment is available at the outset, but more becomes accessible as the game progresses. Between missions, the player may read e-mails, news and military files, and a "library" that details the game's setting.
## Plot
### Setting and characters
Terra Nova is set in a science fictional depiction of the year 2327 and takes place in the Alpha Centauri star system. The setting's early inspirations were the novels Starship Troopers and The Forever War, and PC Gamer UK compared it to that of the 1986 action film Aliens. Over two hundred years before the beginning of the game, Earth is subsumed by a world government called the Hegemony, whose "Publicanism" philosophy PC Zone summarized as "communism without the economic restrictions". The Hegemony annexes colonies throughout the Solar System, but the inhabitants of Jupiter's moons reach an agreement that allows them to relocate to Alpha Centauri, where they settle on the Earth-like planet NewHope and the frozen planet Thatcher. The settlers divide into twelve "Clans"—each with a military "Strike Force" to defend against bandits—and create the Centauri Council to govern the system. Trade is established with the Hegemony. As the game begins, an elite Strike Force called Strike Force Centauri is formed in response to increasing pirate activity.
The protagonist of Terra Nova is Nikola ap Io, the squad leader of Strike Force Centauri. His older brother, Brandt ap Io, is one of his subordinates, and the two share a mutual animosity. Other members of the squad include Sarah Walker, the daughter of a Centauri Council member; Ernest Schuyler, who is known for his sense of humor; and the frank and abrasive Simon Ashford. Each member was given a personality so that the player would form connections with the squad. Commander Arlen MacPherson assumes overall charge of the squad, and he has regular dealings with Hegemony ambassador Creon Pentheus. Live-action full-motion video cutscenes depicting character interaction occasionally play between missions.
### Story
As the game begins, pirates steal a shipment of highly destructive "Petrovsk grenades". A reconnaissance mission by Nikola locates the grenades at a heavily defended pirate base, and they are recovered while en route to a transport ship. With the grenades gone, the base is assaulted by Strike Force Centauri, and Hegemony equipment is found there. When MacPherson confronts Pentheus about the incident, he denies involvement. Proof of the Hegemony's intentions is later found at a Thatcher smuggling base, and Pentheus declares war on the Centauri colonies. Now knowing the pirates are funded by the Hegemony, MacPherson suspects that a previous information leak was in fact the work of a Hegemony spy; Nikola questions Brandt, who responds with indignance. After a series of missions against the Hegemony, Nikola's aircraft is ambushed and shot down, and he is captured by Pentheus. During this time, Pentheus tells him that a traitor within Strike Force Centauri is responsible for the ambush. The squad rescues Nikola, but Schuyler is killed in the assault. At his funeral, Ashford accuses Nikola of being the traitor.
It soon becomes clear that MacPherson is being poisoned. Nikola believes that Brandt is responsible, because of his recent disappearances, but is proven wrong. After MacPherson dies, Sarah Walker takes his place as commander of Strike Force Centauri. Walker sends Nikola, disguised as a pirate, on an espionage mission to discover the traitor's identity. Nikola finds information that incriminates Ashford, who, when confronted, boasts of his actions and leaps to his death from a docking bay. The squad continues the war, and the Hegemony is eventually forced to gather its remaining forces at a base on Thatcher. The squad destroys the facility by detonating a highly explosive fuel tank inside it. Following its defeat, the Hegemony denies involvement in the war, declares Pentheus a rogue agent and appoints a new ambassador to the system. While angered by the announcement, Strike Force Centauri celebrates its victory as the game ends.
## Development
Terra Nova was conceived in 1992, around the time that Looking Glass Technologies' first game, Ultima Underworld: The Stygian Abyss, was completed. Company co-founder Paul Neurath wrote a design document for a tactical, squad-based game with a science fiction setting, and he helped the team initiate its development. Artist Robb Waters created concept art. It was originally titled Freefall, because of the way the soldiers enter combat by dropping from aircraft. Development was initially led by a newly hired programmer who envisioned the game as an exact simulation, in which every element was as realistic as possible. Programmer Dan Schmidt created the game's artificial intelligence, and he attempted to make squadmates intelligently follow orders and provide assistance, instead of merely "staying out of your way". Schmidt hired Eric Brosius and Terri Brosius, then-members of the band Tribe, to compose the game's music, which was called "orchestrally flavored" by the Boston Herald. As with their 1995 video game Flight Unlimited, Looking Glass Technologies self-published Terra Nova.
The game began production alongside the company's second project, Ultima Underworld II: Labyrinth of Worlds, and remained in development after that game's 1993 release. It then continued through the creation of their titles System Shock (released in 1994) and Flight Unlimited. The game was subject to numerous delays, which Schmidt later attributed to its lack of a set deadline. He stated that the team was "trying to go with the same philosophy" as the company's earlier games, in that they would "develop the systems and the game would come out of it". However, the team's development priorities regularly changed, and the programmer who led the project left several years into production. According to Schmidt, his departure meant that "there was no-one left who was psyched about making this really [realistic] simulation". Despite this fact, the team continued using the idea, even though serious difficulties were involved in achieving it. Schmidt said that the game's development status was uncertain after the programmer left, and that he inherited the role of lead programmer around that time merely because the position had to be filled. He later assumed the role of project leader. In January 1995, Looking Glass showed Terra Nova alongside Flight Unlimited at the Winter Consumer Electronics Show, under their "Immersive Reality" marketing label.
In the team's original plan, Terra Nova consisted of missions that were bookended by simplistic cutscenes, akin to those of the 1990 Origin Systems video game Wing Commander. However, in 1994, Origin released Wing Commander III: Heart of the Tiger, which features live-action full-motion video (FMV) cutscenes. This pressured Looking Glass into incorporating FMV into Terra Nova. Schmidt later said, "Lots of A-list games were including more and more FMV, and it was felt by management that if Terra Nova didn't have any, it would look second-rate." The decision to include it came when the game was already overdue, and a large portion of the game's funding was redirected toward cutscene production. A scriptwriter from outside the company was hired to write the cutscenes; because of the interplay between the cutscenes and missions, the script underwent numerous rewrites. The game's delays and large budget resulted in the removal of a planned online multiplayer component, and the FMV cutscenes, which were expensive to produce, increased the number of sales needed to recoup development costs. A patch adding the online multiplayer functionality was planned for release in September 1996, but it never materialized. Schmidt called the cutscenes a "giant distraction" for the team and himself as project leader: he later described them as "cheesier than most" of those from the period and noted that "I wince a lot looking back on [them]". Schmidt believed that they were likely an error from a business standpoint, as they further increased the game's budget and production length, but ultimately did not increase sales.
Roughly a year before its release, the team concluded that Terra Nova's realistic, simulation-style gameplay was not enjoyable. However, Schmidt said that the game's already lengthy development meant that it had to be released; otherwise, he believed that it would be canceled, or that its high cost would bankrupt the company. As a result, the game was completely redesigned to be "much more arcadey" only a few months before release. Schmidt said that, in the new game, "you were going around blowing people up" and "your enemies have brackets on them showing their health and it's very bright and glowy and green". He believed that these elements drastically increased the game's enjoyability. He summarized, "Six months before it shipped the game wasn't fun at all and we actually ended up shipping something that was at least somewhat enjoyable to play". After previously being slated to launch in the second quarter of 1995, the game was released on March 5, 1996; by this time, its graphical technology had been surpassed by other video games, according to Schmidt. Lead programmer Art Min later expressed dissatisfaction with the game: he believed that, while the team coalesced at the end of development, they shipped the game too soon because of "an overexcited VP of Product Development".
### Technology
Unlike Looking Glass' previous first-person games—Ultima Underworld, Ultima Underworld II and System Shock—Terra Nova takes place in outdoor environments. The game's engine, named TED, supports weather conditions, day and night environments, real-time water reflections and moving clouds, among other effects. Most of the work on the outdoor renderer was done by programmer Eric Twietmeyer; however, contemporary computers were not powerful enough to display fully three-dimensional (3D) outdoor environments. The problem was solved by programmer James Fleming: the game's engine renders and applies textures to foreground objects in full 3D graphics, but—according to PC Gamer US—it displays a "bitmapped background in the distance" to provide the "illusion of detail". As with Flight Unlimited and the CD-ROM release of System Shock, Terra Nova was designed to support head-mounted displays. The game features QSound technology. Describing QSound's effect before the game's release, Suzanne Kantra Kirschner of Popular Science wrote that "you'll hear the rustle of leaves from the right speaker a split second before you hear it in the left[,] signaling you that the enemy is approaching from the right".
The game's characters are procedurally animated via simulated physics models and inverse kinematics (IK)—a system designed by programmer Seamus Blackley. Basic physics are used to move character models through the environment, and the models are animated by the IK system in accordance with this movement. Designer Richard Wyckoff later compared the character physics to those of a marble, and Schmidt described the technique as akin to putting each character in a hamster ball. The system's imperfect nature can result in animation glitches. A more realistic simulation of bipedal movement was originally planned, but it was simplified before release because of coding difficulties. Schmidt later said that the original method "almost always worked", but that "every thirty minutes someone would put their foot down slightly wrong ... and then go flying off across the map". A physics model is also used to simulate weapon recoil, the arc of projectiles and the gravity of each planet; for example, projectiles travel farther in low gravity environments.
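The claim that projectiles travel farther in low gravity follows from basic ballistics: in an idealised, drag-free model, the range of a shot over flat ground scales inversely with gravitational acceleration. The minimal Python sketch below illustrates that relationship only; the muzzle speed, launch angle, and gravity values are hypothetical and are not taken from the game's actual physics model.

```python
import math

def ideal_range(speed: float, angle_deg: float, gravity: float) -> float:
    """Drag-free range of a projectile launched over flat ground."""
    angle = math.radians(angle_deg)
    return speed ** 2 * math.sin(2 * angle) / gravity

# Hypothetical values chosen purely for illustration.
for gravity, label in [(9.81, "Earth-like gravity"), (4.9, "low gravity")]:
    print(f"{label}: {ideal_range(300.0, 30.0, gravity):,.0f} m")
```

Halving the gravitational acceleration in this simplified model doubles the range, which is consistent with the behaviour the article attributes to the game's low-gravity environments.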
## Reception
Although Terra Nova sold more than 100,000 units, it was a commercial failure because it did not recoup its development costs. Designer Tim Stellmach later characterized its performance as "a disaster". Despite this, the game was acclaimed by critics, and several publications drew comparisons to the 1995 video game MechWarrior 2: 31st Century Combat.
Edge compared the game favorably to System Shock due to its balanced combination of action and strategy, stating that in just two years Looking Glass "has metamorphosed from one of the industry's secret technology powerhouses to a hugely respected developer in its own right." John Payne of The Washington Post wrote, "Depending on your point of view, Terra Nova is either a stripped-down Mechwarrior or a souped-up Doom." However, he stated that it was enjoyable regardless of which perspective was taken. While he described the game's animation as "fluid", he found its graphics in general to be "fairly blocky, even at a distance". He finished his review by stating that the game "requires more practice than action fans are used to" but provides "a nice payoff". Next Generation Magazine wrote, "Looking Glass has always been known for breaking the barriers of conventional gameplay, and it has done it again with Terra Nova". The magazine considered the game to be "an all around stunning effort".
The Sunday Star-Times' Peter Eley found the game to be extremely complex, and he noted the originality of its "real-time, full motion and 3D combat simulation". He called its sound and music "stunning" but found that its graphics "aren't as crisp as some other games", and he described performance issues. Lee Perkins of The Age also found the game's performance and graphics lacking, but he said, "In spite of its visual shortcomings, Terra Nova has the same level of inherent player appeal as System Shock". He concluded that the game's "tactical demands ... are probably its strongest point", and that it "isn't quite up in the Mechwarrior 2 league, [but] it's making some very loud noises with avid mech-combat fans". Computer Games Strategy Plus' Tim Royal offered similar praise for its strategic elements; however, like the other two, he noted the game's performance issues, and he called its graphics "above average, but not mind-boggling". He finished, "I ... won't say it beats System Shock. It doesn't. ... But Terra Nova offers a wonderful variety of terrain, mission types, and scenarios".
William Wong of Computer Shopper called it "a great game that is backed by good graphics and sound, and will keep you going for hours"; he also praised its cutscenes. He concluded, "If the [upcoming] multiplayer pack is as good as the standalone version, Terra Nova could be a strike force to be reckoned with." PC Gamer UK's James Flynn praised the game's graphics, sense of realism, and free-form missions; about the latter, he wrote, "There's no right or wrong way to complete any of the missions in Terra Nova, and this is one of its strongest assets." He noted that it was "virtually impossible to recommend" the game to those with lower-end computers, but he believed that it was also "impossible to condemn Looking Glass for programming the game this way, because it feels so real, and its authenticity is what makes it so much fun". Daniel Jevons of Maximum approved of the graphics but disliked the focus on long-range combat and the use of the mouse to move a crosshair rather than the entire viewpoint. However, he concluded that the game "has a degree of depth that most robot combat games lack, the plot is strangely involving and despite the initial control difficulties, with perseverance most competent gamers will soon be stomping around the battlefields". Michael E. Ryan of PC Magazine praised the game's artificial intelligence and called its graphical quality "spectacular", but found its movement controls to be "awkward". He concluded, "Terra Nova is an exceptional game that combines frenetic, fast-paced action with real-time squad-level tactics. It doesn't get much better than this".
Terra Nova was named the 15th best computer game ever by PC Gamer UK in 1997. The editors called it "exactly the kind of dynamic, risk-taking, intelligent game we've been asking for". In 2000, Computer Games Strategy Plus named it one of the "10 Best Sci-Fi Simulations" of all time.
### Legacy
The New York Times has cited Terra Nova as one of the first 3D games with squad-oriented gameplay. GameSpy's Bill Hiles said that the game "preceded the 'tactical squad-based, first-person shooter' action genre by a full two years", and that "In 1996, ...Terra Nova didn't feel like any other game out there". Hiles called Tribes 2 "a spiritual descendent of Terra Nova if there ever was one". Project leader Dan Schmidt later said that he had "a bit of a negative experience overall because the thing dragged on forever", but he noted that "there are people who regard it highly so it can't have been that terrible". The 1998 video game Jurassic Park: Trespasser features a procedural animation system very similar to the one used in Terra Nova.
While Schmidt said before the game's release that the team wanted to develop "a whole series of games that take place in the Terra Nova world", the game's poor sales made the creation of a sequel "impractical", according to Paul Neurath. As the game's publisher, Looking Glass took on the full burden of its commercial underperformance, which contributed to the company's bankruptcy and closure in May 2000. Neurath later said, "If we could do Terra Nova over, I would have dumped the cinematics and done online team play instead. Who knows, maybe then the Tribes II and Halo teams would be talking about the influence of Terra Nova on their games". |
# Imperial Trans-Antarctic Expedition
The Imperial Trans-Antarctic Expedition of 1914–1917 is considered to be the last major expedition of the Heroic Age of Antarctic Exploration. Conceived by Sir Ernest Shackleton, the expedition was an attempt to make the first land crossing of the Antarctic continent. After Roald Amundsen's South Pole expedition in 1911, this crossing remained, in Shackleton's words, the "one great main object of Antarctic journeyings". Shackleton's expedition failed to accomplish this objective but became recognized instead as an epic feat of endurance.
Shackleton had served in the Antarctic on the Discovery expedition of 1901–1904 and had led the Nimrod expedition of 1907–1909. In this new venture, he proposed to sail to the Weddell Sea and to land a shore party near Vahsel Bay, in preparation for a transcontinental march via the South Pole to the Ross Sea. A supporting group, the Ross Sea party, would meanwhile establish camp in McMurdo Sound and from there lay a series of supply depots across the Ross Ice Shelf to the foot of the Beardmore Glacier. These depots would be essential for the transcontinental party's survival, as the group would not be able to carry enough provisions for the entire crossing. The expedition required two ships: Endurance under Shackleton for the Weddell Sea party, and Aurora, under Aeneas Mackintosh, for the Ross Sea party.
Endurance became beset—trapped in the ice of the Weddell Sea—before it was able to reach Vahsel Bay. It drifted northward, held in the pack ice, throughout the Antarctic winter of 1915. Eventually the ice crushed the ship, and it sank, stranding its complement of 28 men on the ice. After months spent in makeshift camps as the ice continued its northwards drift, the party used lifeboats that had been salvaged from the ship to reach the inhospitable, uninhabited Elephant Island. Shackleton and five other members of the group then made an 800-mile (1,300 km) open-boat journey in the James Caird, and were able to reach South Georgia. From there, Shackleton was eventually able to arrange a rescue of the men who had remained on Elephant Island and to bring them home without loss of life. The remarkably preserved wreck of Endurance was found on the seafloor in 2022.
On the other side of the continent, the Ross Sea party overcame great hardships to fulfill its mission. Aurora was blown from her moorings during a gale and was unable to return, leaving the shore party stranded without proper supplies or equipment. Although the depots were laid, three men died before the party was eventually rescued.
## Preparations
### Origin
Despite the public acclaim that had greeted Ernest Shackleton's achievements after the Nimrod Expedition in 1907–1909, the explorer was unsettled, becoming—in the words of British skiing pioneer Sir Harry Brittain—"a bit of a floating gent". By 1912, his future Antarctic plans depended on the results of Robert Falcon Scott's Terra Nova Expedition, which had left Cardiff in July 1910, and on the concurrent Norwegian expedition led by Roald Amundsen. The news of Amundsen's conquest of the South Pole reached Shackleton on 11 March 1912, to which he responded: "The discovery of the South Pole will not be the end of Antarctic exploration". The next work, he said, would be "a transcontinental journey from sea to sea, crossing the pole". He was aware that others were in the field pursuing this objective.
On 11 December 1911, a German expedition under Wilhelm Filchner had sailed from South Georgia, intending to penetrate deep into the Weddell Sea and establish a base from which he would cross the continent to the Ross Sea. In late 1912 Filchner returned to South Georgia, having failed to land and set up his base. However, his reports of possible landing sites in Vahsel Bay, at around 78° latitude, were noted by Shackleton, and incorporated into his developing expedition plans.
News of the deaths of Scott and his companions on their return from the South Pole reached London in February 1913. Against this gloomy background Shackleton initiated preparations for his proposed journey. He solicited financial and practical support from, among others, Tryggve Gran of Scott's expedition, and the former prime minister, Lord Rosebery, but received no help from either. Gran was evasive, and Rosebery blunt: "I have never been able to care one farthing about the Poles".
Shackleton got support, however, from William Speirs Bruce, leader of the Scottish National Antarctic Expedition of 1902–1904, who had harboured plans for an Antarctic crossing since 1908 but had abandoned the project for lack of funds. Bruce generously allowed Shackleton to adopt his plans, although the eventual scheme announced by Shackleton owed little to Bruce. On 29 December 1913, having acquired his first promises of financial backing—a £10,000 grant from the British government—Shackleton made his plans public in a letter to The Times.
### Shackleton's plan
Shackleton called his new expedition the Imperial Trans-Antarctic Expedition, because he felt that "not only the people of these islands, but our kinsmen in all the lands under the Union Jack will be willing to assist towards the carrying out of the ... programme of exploration." To arouse the interest of the general public, he issued a detailed programme early in 1914. The expedition was to consist of two parties and two ships. The Weddell Sea party would travel aboard Endurance and continue to the Vahsel Bay area, where 14 men would land, of whom six, under Shackleton, would form the transcontinental party. This group, with 69 dogs, two motor sledges, and equipment "embodying everything that the experience of the leader and his expert advisers can suggest", would undertake the 1,800-mile (2,900 km) journey to the Ross Sea. The remaining eight shore party members would carry out scientific work, three going to Graham Land, three to Enderby Land and two remaining at base camp.
The Ross Sea party would set up its base in McMurdo Sound, on the opposite side of the continent. After landing they would lay depots on the route of the transcontinental party as far as the Beardmore Glacier, hopefully meeting that party there and assisting it home. They would also make geological and other observations.
### Finances
Shackleton estimated that he would need £50,000 to carry out the simplest version of his plan. He did not believe in appeals to the public: "(they) cause endless book-keeping worries". His chosen method of fundraising was to solicit contributions from wealthy backers, and he had begun this process early in 1913 with little initial success. The first significant encouragement came in December 1913, when the British government offered him £10,000, provided he could raise an equivalent amount from private sources. The Royal Geographical Society (RGS), from which he had expected nothing, gave him £1,000—according to Huntford, Shackleton, in a grand gesture, advised them that he would only need to take up half of this sum. Lord Rosebery, who had previously expressed his lack of interest in polar expeditions, gave £50.
In February 1914, The New York Times reported that playwright J. M. Barrie—a close friend of Scott, who had become Shackleton's rival late in his career—had confidentially donated $50,000 (about £10,000). With time running out, contributions were eventually secured during the first half of 1914. Dudley Docker of the Birmingham Small Arms Company gave £10,000, wealthy tobacco heiress Janet Stancomb-Wills gave a "generous" sum (the amount was not revealed), and, in June, Scottish industrialist Sir James Key Caird donated £24,000. Shackleton informed the Morning Post that "this magnificent gift relieves me of all anxiety".
Shackleton now had the money to proceed. He acquired, for £14,000, a 300-ton barquentine called Polaris, which had been built for the Belgian explorer Adrien de Gerlache for an expedition to Spitsbergen. This scheme had collapsed and the ship became available. Shackleton changed her name to Endurance, reflecting his family motto, "By endurance we conquer". For a further £3,200, he acquired Douglas Mawson's expedition ship Aurora, which was lying in Hobart, Tasmania. This would act as the Ross Sea party's vessel.
How much money Shackleton raised to meet the total costs of the expedition (later estimated by the Daily Mail to be around £80,000) is uncertain, since the size of the Stancomb-Wills donation is not known. Money was a constant problem for Shackleton, who as an economy measure halved the funding allocated to the Ross Sea party, a fact which the party's commander Aeneas Mackintosh only discovered when he arrived in Australia to take up his duties. Mackintosh was forced to haggle and plead for money and supplies to make his part of the expedition viable. Shackleton had, however, realised the revenue-earning potential of the expedition. He sold the exclusive newspaper rights to the Daily Chronicle, and formed the Imperial Trans Antarctic Film Syndicate to take advantage of the film rights.
## Personnel
According to legend, Shackleton posted an advertisement in a London paper, stating: "Men wanted for hazardous journey. Low wages, bitter cold, long hours of complete darkness. Safe return doubtful. Honour and recognition in event of success." Searches for the original advertisement have proved unsuccessful, and the story is generally regarded as apocryphal. Shackleton received more than 5,000 applications for places on the expedition, including a letter from "three sporty girls" who suggested that if their feminine garb was inconvenient they would "just love to don masculine attire."
Eventually the crews for the two arms of the expedition were trimmed down to 28 apiece, including William Bakewell, who joined the ship in Buenos Aires; his friend Perce Blackborow, who stowed away when his application was turned down; and several last-minute appointments made to the Ross Sea party in Australia. A temporary crewman was Sir Daniel Gooch, grandson of the renowned railway pioneer Daniel Gooch, who stepped in to help Shackleton as a dog handler at the last moment and signed up for an able seaman's pay. Gooch agreed to sail with Endurance as far as South Georgia.
As his second-in-command, Shackleton chose Frank Wild, who had been with him on both the Discovery and Nimrod expeditions, and was one of the Farthest South party in 1909. Wild had just returned from Mawson's Australasian Antarctic Expedition. To captain Endurance Shackleton had wanted John King Davis, who had commanded Aurora during the Australasian Antarctic Expedition. Davis refused, thinking the enterprise was "foredoomed", so the appointment went to Frank Worsley, who claimed to have applied to the expedition after learning of it in a dream. Tom Crean, who had been awarded the Albert Medal for saving the life of Lieutenant Edward Evans on the Terra Nova Expedition, took leave from the Royal Navy to sign on as Endurance's second officer; another experienced Antarctic hand, Alfred Cheetham, became third officer. Two Nimrod veterans were assigned to the Ross Sea party: Mackintosh, who commanded it, and Ernest Joyce. Shackleton had hoped that Aurora would be staffed by a naval crew, and had asked the Admiralty for officers and men, but was turned down. After pressing his case, Shackleton was given one officer from the Royal Marines, Captain Thomas Orde-Lees, who was superintendent of physical training at the marines' training depot.
The scientific staff of six accompanying Endurance comprised the two surgeons, Alexander Macklin and James McIlroy; a geologist, James Wordie; a biologist, Robert Clark; a physicist, Reginald W. James; and Leonard Hussey, a meteorologist who would eventually edit Shackleton's expedition account South. The visual record of the expedition was the responsibility of its photographer Frank Hurley and its artist George Marston. The final composition of the Ross Sea party was hurried. Some who left Britain for Australia to join Aurora resigned before it departed for the Ross Sea, and a full complement of crew was in doubt until the last minute. Within the party only Mackintosh and Joyce had any previous Antarctic experience; Mackintosh had lost an eye as the result of an accident during the Nimrod expedition and had gone home early.
## Expedition
### Weddell Sea party
#### Voyage through the ice
Endurance, without Shackleton (who was detained in England by expedition business), left Plymouth on 8 August 1914, heading first for Buenos Aires. Here Shackleton, who had travelled on a faster ship, rejoined the expedition. Hurley also came on board, together with Bakewell and the stowaway, Blackborow, while several others left the ship or were discharged. On 26 October, the ship sailed for the South Atlantic, arriving in South Georgia on 5 November. Shackleton's original intention was that the crossing would take place in the first season, 1914–1915. Although he soon recognised the impracticality of this, he neglected to inform Mackintosh and the Ross Sea party of his change of plan. According to the Daily Chronicle's correspondent Ernest Perris, a cable intended for Mackintosh was never sent.
After a month-long halt in the Grytviken whaling station on South Georgia, Endurance departed for the Antarctic on 5 December. Two days later, Shackleton was disconcerted to encounter pack ice as far north as 57° 26′S, forcing the ship to manoeuvre. During the following days there were more tussles with the pack, which, on 14 December, was thick enough to halt the ship for 24 hours. Three days later, the ship was stopped again. Shackleton commented: "I had been prepared for evil conditions in the Weddell Sea, but had hoped that the pack would be loose. What we were encountering was fairly dense pack of a very obstinate character".
Endurance's progress was frustratingly slow, until, on 22 December, leads opened up and the ship was able to continue steadily southward. This continued for the next two weeks, taking the party deep into the Weddell Sea. Further delays then slowed progress after the turn of the year, before a lengthy run south during 7–10 January 1915 brought them close to the 100-foot (30 m) ice walls which guarded the Antarctic coastal region of Coats Land. This territory had been discovered and named by William Speirs Bruce in 1904 during the Scottish National Antarctic Expedition. On 15 January, Endurance came abreast of a great glacier, the edge of which formed a bay which appeared a good landing place. However, Shackleton considered it too far north of Vahsel Bay for a landing, "except under pressure of necessity"—a decision he would later regret. On 17 January, the ship reached a latitude of 76° 27′S, where land was faintly discernible. Shackleton named it Caird Coast, after his principal backer. Bad weather forced the ship to shelter in the lee of a stranded iceberg.
Endurance was now close to Luitpold Land, discovered by Filchner in 1912, at the southern end of which lay their destination, Vahsel Bay. Next day, the ship was forced north-westward for 14 miles (23 km), resuming in a generally southerly direction before being stopped altogether. The position was 76° 34′S, 31° 30′W. After ten days of inactivity, Endurance's fires were banked to save fuel. Strenuous efforts were made to release her; on 14 February, Shackleton ordered men onto the ice with ice-chisels, prickers, saws and picks to try to force a passage, but the labour proved futile. Shackleton did not at this stage abandon all hope of breaking free, but was now contemplating the "possibility of having to spend a winter in the inhospitable arms of the pack".
#### Drift of Endurance
On 22 February 1915, Endurance, still held fast, drifted to her most southerly latitude, 76° 58′S. Thereafter she began moving with the pack in a northerly direction. On 24 February, Shackleton realised that they would be held in the ice throughout the winter and ordered ship's routine abandoned. The dogs were taken off board and housed in ice-kennels or "dogloos", and the ship's interior was converted to suitable winter quarters for the various groups of men—officers, scientists, engineers, and seamen. A wireless apparatus was rigged, but their location was too remote to receive or transmit signals.
Shackleton was aware of the recent example of Filchner's ship, Deutschland, which had become icebound in the same vicinity three years earlier. After Filchner's attempts to establish a land base at Vahsel Bay failed, his ship was trapped on 6 March 1912, about 200 miles (320 km) off the coast of Coats Land. Six months later, at latitude 63° 37', the ship broke free, then sailed to South Georgia apparently none the worse for its ordeal. Shackleton thought that a similar experience might allow Endurance to make a second attempt to reach Vahsel Bay in the following Antarctic spring.
In February and March, the rate of drift was very slow. At the end of March, Shackleton calculated that the ship had travelled a mere 95 miles (153 km) since 19 January. However, as winter set in the speed of the drift increased, and the condition of the surrounding ice changed. On 14 April, Shackleton recorded the nearby pack "piling and rafting against the masses of ice"—if the ship was caught in this disturbance "she would be crushed like an eggshell". In May, as the sun set for the winter months, the ship was at 75° 23′S, 42° 14′W, still drifting northwards. It would be at least four months before spring brought the chance of an opening of the ice, and there was no certainty that Endurance would break free in time to attempt a return to the Vahsel Bay area. Shackleton now considered the possibility of finding an alternative landing ground on the western shores of the Weddell Sea, if that coast could be reached. "In the meantime", he wrote, "we must wait".
In the dark winter months of May, June and July, Shackleton was concerned with maintaining fitness, training and morale. Although the scope for activity was limited, the dogs were exercised (and on occasion raced competitively), men were encouraged to take moonlight walks, and aboard ship there were attempted theatricals. Special occasions such as Empire Day were duly celebrated. The first signs of the ice breaking up occurred on 22 July. On 1 August, in a south-westerly gale with heavy snow, the ice floe began to disintegrate all around the ship, the pressure forcing masses of ice beneath the keel and causing a heavy list to port. The position was perilous; Shackleton wrote: "The effects of the pressure around us was awe-inspiring. Mighty blocks of ice [...] rose slowly till they jumped like cherry-stones gripped between thumb and finger [...] if the ship was once gripped firmly her fate would be sealed". This danger passed, and the succeeding weeks were quiet. During this relative lull the ship drifted into the area where, in 1823, Captain Benjamin Morrell of the sealer Wasp reported seeing a coastline which he identified as "New South Greenland". There was no sign of any such land; Shackleton concluded that Morrell had been deceived by the presence of large icebergs.
On 30 September, the ship sustained what Shackleton described as "the worst squeeze we had experienced". Worsley described the pressure as like being "thrown to and fro like a shuttlecock a dozen times". On 24 October, the starboard side was forced against a large floe, increasing the pressure until the hull began to bend and splinter, so that water from below the ice began to pour into the ship. When the timbers broke they made noises which sailors later described as being similar to the sound of "heavy fireworks and the blasting of guns". The supplies and three lifeboats were transferred to the ice, while the crew attempted to shore up the ship's hull and pump out the incoming sea. However, after a few days, on 27 October 1915, and in freezing temperatures below −15 °F (−26 °C), Shackleton gave the order to abandon ship. The position at abandonment was 69° 05′S, 51° 30′W. The wreckage remained afloat, and over the following weeks the crew salvaged further supplies and materials, including Hurley's photographs and cameras that had initially been left behind. From around 550 plates, Hurley chose the best 120, the maximum that could be carried, and smashed the rest.
#### Camping on the ice
With the loss of Endurance the transcontinental plans were abandoned, and the focus shifted to survival. Shackleton's intention now was to march the crew westward, to one or other of several possible destinations. His first thought was for Paulet Island, where he knew there was a hut containing a substantial food depot, because he had ordered it 12 years earlier while organising relief for Otto Nordenskjöld's stranded Swedish expedition. Other possibilities were Snow Hill Island, which had been Nordenskjöld's winter quarters and which was believed to contain a stock of emergency supplies, or Robertson Island. Shackleton believed that from one of these islands they would be able to reach and cross Graham Land and get to the whaling outposts in Wilhelmina Bay. He calculated that on the day Endurance was abandoned they were 346 miles (557 km) from Paulet Island. Worsley calculated the distance to Snow Hill Island to be 312 miles (500 km), with a further 120 miles (190 km) to Wilhelmina Bay. He believed the march was too risky; they should wait until the ice carried them to open water, and then escape in the boats. Shackleton overruled him.
Before the march could begin, Shackleton ordered the weakest animals to be shot, including the carpenter Harry McNish's cat, Mrs Chippy, and a pup which had become a pet of the surgeon Macklin. The company set out on 30 October 1915, with two of the ship's lifeboats carried on sledges. Problems quickly arose as the condition of the sea ice around them worsened. According to Hurley the surface became "a labyrinth of hummocks and ridges" in which barely a square yard was smooth. In three days, the party managed to travel barely two miles (3.2 km), and on 1 November, Shackleton abandoned the march; they would make camp and await the break-up of the ice. They gave the name "Ocean Camp" to the flat and solid-looking floe on which their aborted march had ended, and settled down to wait. Parties continued to revisit the Endurance wreck, which was still drifting with the ice a short distance from the camp. More of the abandoned supplies were retrieved until, on 21 November, the ship finally slipped beneath the ice. The final resting place of Endurance would remain a mystery for more than 106 years, until the wreckage was discovered on 5 March 2022.
At first the drift of the ice was barely noticeable, but by late November the speed had increased to seven miles (11 km) a day. By 5 December, they had passed 68°S, but the direction was turning slightly east of north. This was taking the party to a position from which it would be difficult to reach Snow Hill Island, although Paulet Island, further north, remained a possibility. Paulet Island was about 250 miles (400 km) away, and Shackleton was anxious to reduce the length of the lifeboat journey that would be necessary to reach it. Therefore, on 21 December he announced a second march, to begin on 23 December.
Conditions, however, had not improved since the earlier attempt. Temperatures had risen and it was uncomfortably warm, with men sinking to their knees in soft snow as they struggled to haul the boats through the pressure ridges. On 27 December, McNish rebelled and refused to work, arguing that Admiralty law had lapsed since Endurance's sinking and that he was no longer under orders. Shackleton's firm remonstrance finally brought the carpenter to heel, but the incident was never forgotten. Two days later, with only seven and a half miles (12.1 km) progress achieved in seven back-breaking days, Shackleton called a halt, observing: "It would take us over three hundred days to reach the land". The crew put up their tents and settled into what Shackleton called "Patience Camp", which would be their home for more than three months.
Supplies were now running low. Hurley and Macklin were sent back to Ocean Camp to recover food that had been left there to lighten the sledging teams' burden. On 2 February 1916, Shackleton sent a larger party back to recover the third lifeboat. Food shortages became acute as the weeks passed, and seal meat, which had added variety to their diet, now became a staple as Shackleton attempted to conserve the remaining packaged rations. In January, all but two teams of the dogs (whose overall numbers had been depleted by mishaps and illness in the preceding months) were shot on Shackleton's orders, because the dogs' requirements for seal meat were excessive. The final two teams were shot on 2 April, by which time their meat was a welcome addition to the rations. Meanwhile, the rate of drift became erratic; after being held at around 67° for several weeks, at the end of January there was a series of rapid north-eastward movements which, by 17 March, brought Patience Camp to the latitude of Paulet Island, but 60 nautical miles (111 km) to its east. "It might have been six hundred for all the chance we had of reaching it across the broken sea-ice", Shackleton recorded.
The party now had land more or less continuously in sight. The peak of Mount Haddington on James Ross Island remained in view as the party drifted slowly by. They were too far north for Snow Hill or Paulet Island to be accessible, and Shackleton's chief hopes were now fixed on two remaining small islands at the northern extremity of Graham Land. These were Clarence Island and Elephant Island, around 100 nautical miles (185 km) due north of their position on 25 March. He then decided Deception Island might be a better target destination. This lay far to the west, toward the South Shetland Islands, but Shackleton thought it might be attainable by island-hopping. Its advantage was that it was sometimes visited by whalers and might contain provisions, whereas Clarence Island and Elephant Island were desolate and unvisited. To reach any of these destinations would require a perilous journey in the lifeboats once the floe upon which they were drifting finally broke up. Earlier, the lifeboats had been named after the expedition's three chief financial sponsors: James Caird, Dudley Docker and Stancomb Wills.
#### Lifeboat journey to Elephant Island
The end of Patience Camp was signalled on the evening of 8 April, when the floe suddenly split. The camp now found itself on a small triangular raft of ice; a break-up of this would mean disaster, so Shackleton readied the lifeboats for the party's enforced departure. He had now decided they would try, if possible, to reach the distant Deception Island because a small wooden church had reportedly been erected there for the benefit of whalers. This could provide a source of timber that might enable them to construct a seaworthy boat.
At 1 p.m. on 9 April, the Dudley Docker was launched, and an hour later all three boats were away. Shackleton himself commanded the James Caird, Worsley the Dudley Docker, and navigating officer Hubert Hudson was nominally in charge of the Stancomb Wills, though because of his precarious mental state the effective commander was Tom Crean.
The boats were surrounded by ice, dependent upon leads of water opening up, and progress was perilous and erratic. Frequently the boats were tied to floes, or dragged up onto them, while the men camped and waited for conditions to improve. Shackleton was wavering again between several potential destinations, and on 12 April rejected the various island options and decided on Hope Bay, at the very tip of Graham Land. However, conditions in the boats, in temperatures sometimes as low as −20 °F (−29 °C), with little food and regular soakings in icy seawater, were wearing the men down, physically and mentally. Shackleton therefore decided that Elephant Island, the nearest of the possible refuges, was now the most practical option.
On 14 April, the boats lay off the south-east coast of Elephant Island, but could not land as the shore consisted of perpendicular cliffs and glaciers. Next day the James Caird rounded the eastern point of the island to reach the northern lee shore, and discovered a narrow shingle beach. Soon afterwards, the three boats, which had been separated during the previous night, were reunited at this landing place. It was apparent from high tide markings that this beach would not serve as a long-term camp, so the next day Wild and a crew set off in the Stancomb Wills to explore the coast for a safer site. They returned with news of a long spit of land, seven miles (11 km) to the west. With minimum delay the men returned to the boats and transferred to this new location, which they later christened Cape Wild.
#### Voyage of the James Caird
Elephant Island was remote, uninhabited, and rarely visited by whalers or any other ships. If the party was to return to civilization it would be necessary to summon help. The only realistic way this could be done was to adapt one of the lifeboats for an 800-mile (1,300 km) voyage across the Southern Ocean, to South Georgia. Shackleton had abandoned thoughts of taking the party on the less dangerous journey to Deception Island, because of the poor physical condition of many of his party. Port Stanley in the Falkland Islands was closer than South Georgia but could not be reached, as this would require sailing against the strong prevailing winds.
Shackleton selected the boat party: himself, Worsley, Crean, McNish, and sailors John Vincent and Timothy McCarthy. On instructions from Shackleton, McNish immediately set about adapting the James Caird, improvising tools and materials. Wild was to be left in charge of the Elephant Island party, with instructions to make for Deception Island the following spring should Shackleton not return. Shackleton took supplies for only four weeks, judging that if land had not been reached within that time the boat would be lost.
The 22.5-foot (6.9 m) James Caird was launched on 24 April 1916. The success of the voyage depended on the pin-point accuracy of Worsley's navigation, using observations that would have to be made in the most unfavourable of conditions. The prevailing wind was helpfully north-west, but the heavy sea conditions quickly soaked everything in icy water. Soon ice settled thickly on the boat, making her ride sluggishly. On 5 May, a north-westerly gale almost caused the boat's destruction as it faced what Shackleton described as the largest waves he had seen in 26 years at sea. On 8 May, South Georgia was sighted, after a 14-day battle with the elements that had driven the boat party to their physical limits. Two days later, after a prolonged struggle with heavy seas and hurricane-force winds to the south of the island, the party struggled ashore at King Haakon Bay.
#### South Georgia crossing
The arrival of the James Caird at King Haakon Bay was followed by a period of rest and recuperation, while Shackleton pondered the next move. The populated whaling stations of South Georgia lay on the northern coast. To reach them would mean either another boat journey around the island, or a land crossing through its unexplored interior. The condition of the James Caird, and the physical state of the party, particularly Vincent and McNish, meant that the crossing was the only realistic option.
After five days, the party took the boat a short distance eastwards, to the head of a deep bay which would be the starting point for the crossing. Shackleton, Worsley and Crean would undertake the land journey, the others remaining at what they christened "Peggotty Camp", to be picked up later after help had been obtained from the whaling stations. A storm on 18 May delayed their start, but by two o'clock the following morning the weather was clear and calm, and an hour later the crossing party set out.
The party's destination was the whaling station at Stromness, which had been Endurance's last port of call on their outbound journey. This was roughly 26 miles (40 km) away, across the edge of the Allardyce Range. Another whaling station was known to be at Prince Olav Harbour, just six miles (10 km) north of Peggotty Camp over easier terrain, but as far as the party was aware, this was only inhabited during the summer months. Shackleton and his men did not know that during their two-year absence in Antarctica, the station's owners had begun year-round operations.
Without a map, the route the party chose was largely conjectural. By dawn they had ascended to 3,000 feet (910 m) and could see the northern coast. They were above Possession Bay, which meant they would need to move eastward to reach Stromness. This meant the first of several backtrackings that would extend the journey and frustrate the men. At the close of that first day, needing to descend to the valley below them before nightfall, they risked everything by sliding down a mountainside on a makeshift rope sledge. They travelled on by moonlight without rest, moving upwards towards a gap in the next mountainous ridge.
Early next morning, 20 May, seeing Husvik Harbour below them, the party knew that they were on the right path. At seven o'clock in the morning, they heard a steam whistle sound from Stromness, "the first sound created by an outside human agency that had come to our ears since we left Stromness Bay in December 1914". After a difficult descent, which involved passage down through a freezing waterfall, they at last reached safety. Shackleton wrote afterwards: "I have no doubt that Providence guided us ... I know that during that long and racking march of 36 hours over the unnamed mountains and glaciers it seemed to me often that we were four, not three". This image of a fourth traveller was echoed in the accounts of Worsley and Crean and later influenced T. S. Eliot in the writing of his poem The Waste Land. This phenomenon has been reported by other adventurers and is known as the third man factor.
#### Rescue
Shackleton's first task, on arriving at the Stromness station, was to arrange for his three companions at Peggotty Camp to be picked up. A whaler was sent round the coast, with Worsley aboard to show the way, and by the evening of 21 May all six of the James Caird party were safe.
It took four attempts before Shackleton was able to return to Elephant Island to rescue the party stranded there. He first left South Georgia a mere three days after he had arrived in Stromness, after securing the use of a large whaler, The Southern Sky, which was laid up in Husvik Harbour. Shackleton assembled a volunteer crew, and the ship was ready to sail by the morning of 22 May. As the vessel approached Elephant Island they saw that an impenetrable barrier of pack ice had formed, some 70 miles (110 km) from their destination. The Southern Sky was not built for ice breaking, and retreated to Port Stanley in the Falkland Islands.
On reaching Port Stanley, Shackleton informed London by cable of his whereabouts and requested that a suitable vessel be sent south for the rescue operation. He was informed by the Admiralty that nothing was available before October, which in his view was too late. Then, with the help of the British Minister in Montevideo, Shackleton obtained from the Uruguayan government the loan of a tough trawler, Instituto de Pesca No. 1, which started south on 10 June. Again the pack thwarted them. In search of another ship, Shackleton, Worsley and Crean travelled to Punta Arenas, where they met Allan MacDonald, the British owner of the schooner Emma. MacDonald equipped this vessel for a further rescue attempt, which left on 12 July, but with the same negative result—the pack defeated them yet again. Shackleton later named a glacier after MacDonald on the Brunt Ice Shelf in the Weddell Sea. After problems arose in identifying this glacier, a nearby ice rise was renamed the McDonald Ice Rumples.
By now it was mid-August, more than three months since Shackleton had left Elephant Island. Shackleton begged the Chilean Navy to lend him Yelcho, a small steam tug that had assisted Emma during the previous attempt. They agreed; on 25 August, Yelcho, under the command of Luis Pardo of the Chilean Navy, set out for Elephant Island. This time, as Shackleton recorded, providence favoured them. The seas were open, and the ship was able to approach close to the island in thick fog. At 11:40 a.m. on 30 August, the fog lifted, the camp was spotted and, within an hour, all the Elephant Island party were safely aboard, bound for Punta Arenas.
#### On Elephant Island
After Shackleton left with the James Caird, Wild took command of the Elephant Island party, some of whom were in a low state, physically or mentally: Lewis Rickinson had suffered a suspected heart attack; Perce Blackborow was unable to walk, due to frostbitten feet; Hubert Hudson was depressed. The priority for the party was a permanent shelter against the rapidly approaching southern winter. On the suggestion of Marston and Lionel Greenstreet, a hut—nicknamed the "Snuggery"—was improvised by upturning the two boats and placing them on low stone walls, to provide around five feet (1.5 m) of headroom. By means of canvas and other materials the structure was made into a crude but effective shelter.
Wild initially estimated that they would have to wait one month for rescue, and refused to allow long-term stockpiling of seal and penguin meat because this, in his view, was defeatist. This policy led to sharp disagreements with Orde-Lees, the storekeeper, who was not a popular man and whose presence apparently did little to improve the morale of his companions, unless it was by way of being the butt of their jokes.
As the weeks extended well beyond his initial optimistic forecast, Wild established and maintained routines and activities to relieve the tedium. A permanent lookout was kept for the arrival of the rescue ship, cooking and housekeeping rotas were established, and there were hunting trips for seal and penguin. Concerts were held on Saturdays and anniversaries were celebrated, but there were growing feelings of despondency as time passed with no sign of rescue. The toes on Blackborow's left foot became gangrenous from frostbite and, on 15 June, had to be amputated by surgeons Macklin and McIlroy in the candle-lit hut. Using the last of the chloroform in their medical supplies, the whole procedure took 55 minutes and was a complete success.
By 23 August, it seemed that Wild's no-stockpiling policy had failed. The surrounding sea was dense with pack ice that would halt any rescue ship, food supplies were running out and no penguins were coming ashore. Orde-Lees wrote: "We shall have to eat the one who dies first [...] there's many a true word said in jest". Wild's thoughts were now seriously turning to the possibility of a boat trip to Deception Island—he planned to set out on 5 October, in the hope of meeting a whaling ship—when, on 30 August 1916, the ordeal ended suddenly with the appearance of Shackleton and Yelcho.
### Ross Sea party
Aurora left Hobart on 24 December 1914, having been delayed in Australia by financial and organizational problems. The arrival in McMurdo Sound on 15 January 1915 was later in the season than planned, but the party's commander, Aeneas Mackintosh, made immediate plans for a depot-laying journey on the Ross Ice Shelf, since he understood that Shackleton hoped to attempt the crossing during that first season. Neither the men nor the dogs were acclimatised, and the party was, as a whole, very inexperienced in ice conditions. The first journey on the ice resulted in the loss of ten of the party's 18 dogs and a frostbitten and generally demoralised shore party; a single, incomplete depot was their only achievement.
On 7 May, Aurora, anchored at the party's Cape Evans headquarters, was wrenched from her moorings during a gale and carried with drifting ice far out to sea. Unable to return to McMurdo Sound, she remained captive in the ice for nine months until on 12 February 1916, having travelled a distance of around 1,600 miles (2,600 km), she reached open water and limped to New Zealand. Aurora carried with her the greater part of the shore party's fuel, food rations, clothing and equipment, although the sledging rations for the depots had been landed ashore. To continue with its mission the stranded shore party had to re-supply and re-equip itself from the leftovers from earlier expeditions, notably Scott's Terra Nova Expedition, which had been based at Cape Evans a few years earlier. They were thus able to begin the second season's depot-laying on schedule, in September 1915.
In the following months, the required depots were laid, at one-degree intervals across the Ross Ice Shelf to the foot of the Beardmore Glacier. On the return journey from the glacier the party contracted scurvy; Arnold Spencer-Smith, the expedition's chaplain and photographer, collapsed and died on the ice. The remainder of the party reached the temporary shelter of Hut Point, a relic of the Discovery Expedition at the southern end of McMurdo Sound, where they slowly recovered. On 8 May 1916, Mackintosh and Victor Hayward decided to walk across the unstable sea ice to Cape Evans, were caught in a blizzard and were not seen again. The survivors eventually reached Cape Evans, but then had to wait for eight further months. Finally, on 10 January 1917, the repaired and refitted Aurora, whose departure from New Zealand had been delayed by lack of money, arrived to transport them back to civilization. Shackleton accompanied the ship as a supernumerary officer, having been denied command by the governments of New Zealand, Australia and Great Britain, who had jointly organised and financed the Ross Sea party's relief.
## Return to civilization, and aftermath
The rescued party, having had its last contact with civilization in 1914, was unaware of the course of the Great War. News of Shackleton's safe arrival in the Falklands briefly eclipsed war news in the British newspapers on 2 June 1916. Yelcho had a "triumphal" welcome in Punta Arenas after its successful mission. The rescued men were then moved to the port of Valparaíso in central Chile, where they again received a warm welcome; from there they were repatriated. The expedition returned home in piecemeal fashion, at a critical stage in the war, without the normal honours and civic receptions. When Shackleton himself finally arrived in England on 29 May 1917, after a short American lecture tour, his return was barely noticed.
Despite McNish's efforts in preparing and sailing on the James Caird voyage, his prior insubordination meant that, on Shackleton's recommendation, he was one of four men denied the Polar Medal; the others whose contributions fell short of Shackleton's expected standards were John Vincent, William Stephenson and Ernest Holness. Most of the members of the expedition returned to take up immediate active military or naval service. Before the war ended, two—Tim McCarthy of the open boat journey and the veteran Antarctic sailor Alfred Cheetham—had been killed in action, and Ernest Wild, Frank's younger brother and member of the Ross Sea party, had died of typhoid while serving in the Mediterranean. Several others were severely wounded, and many received decorations for gallantry. Following a propaganda mission in Buenos Aires, Shackleton was employed during the last weeks of the war on special service in Murmansk, with the army rank of major. This occupied him until March 1919. He thereafter organised one final Antarctic expedition, the Shackleton–Rowett Expedition on Quest, which left London on 17 September 1921. From the Endurance crew, Wild, Worsley, Macklin, McIlroy, Hussey, Alexander Kerr, Thomas McLeod and cook Charles Green all sailed with Quest.
Shackleton died of a heart attack on 5 January 1922, while Quest was anchored at South Georgia. After his death the original programme, which had included an exploration of Enderby Land, was abandoned. Wild led a brief cruise which brought them into sight of Elephant Island. They anchored off Cape Wild and were able to see the old landmarks, but sea conditions made it impossible for them to land.
It would be more than forty years before the first crossing of Antarctica was achieved, by the Commonwealth Trans-Antarctic Expedition, 1955–1958. This expedition set out from Vahsel Bay, the same bay Shackleton was in sight of when the Endurance became trapped in ice. They followed a route which avoided the Beardmore Glacier altogether, and bypassed much of the Ross Ice Shelf, reaching McMurdo Sound via a descent of the Skelton Glacier. The entire journey took 98 days.
For Chile, the rescue marked the beginning of the country's official operations in Antarctica.
# Lewis and Clark Exposition gold dollar
The Lewis and Clark Exposition gold dollar is a commemorative coin that was struck in 1904 and 1905 as part of the United States government's participation in the Lewis and Clark Centennial Exposition, held in the latter year in Portland, Oregon. Designed by United States Bureau of the Mint Chief Engraver Charles E. Barber, the coin did not sell well and less than a tenth of the authorized mintage of 250,000 was issued.
The Lewis and Clark Expedition, the first European-American overland exploring party to reach the Pacific Coast, was led by Meriwether Lewis and William Clark, following the Louisiana Purchase of 1803. Between 1804 and 1806, its members journeyed from St. Louis to the Oregon coast and back, providing information and dispelling myths about the large area acquired by the United States in the Purchase. The Portland fair commemorated the centennial of that trip.
The coins were, for the most part, sold to the public by numismatic promoter Farran Zerbe, who had also vended the Louisiana Purchase Exposition dollar. As he was unable to sell much of the issue, surplus coins were melted by the Mint. The coins have continued to increase in value, and today are worth between hundreds and thousands of dollars, depending on condition. The Lewis and Clark Exposition dollar is the only American coin to be "two-headed", with a portrait of one of the expedition leaders on each side.
## Background
The Louisiana Purchase in 1803 more than doubled the area of the American nation. Seeking to gain knowledge of the new possession, President Thomas Jefferson obtained an appropriation from Congress for an exploratory expedition, and appointed his private secretary, Meriwether Lewis, to lead it. A captain in the United States Army, Lewis selected William Clark, a former Army lieutenant and younger brother of American Revolutionary War hero George Rogers Clark, as co-leader of the expedition. Lewis and William Clark had served together, and chose about thirty men, dubbed the Corps of Discovery, to accompany them. Many of these were frontiersmen from Kentucky who were in the Army, as well as boatmen and others with necessary skills. The expedition set forth from the St. Louis area on May 14, 1804.
Journeying up the Missouri River, Lewis and Clark met Sacagawea, a woman of the Lemhi Shoshone tribe. Sacagawea had been captured by another tribe and sold as a slave to Toussaint Charbonneau, a French-Canadian trapper, who made her one of his wives. Both Charbonneau and Sacagawea served as interpreters for the expedition and the presence of the Native American woman (and her infant son, Jean Baptiste Charbonneau) helped convince hostile tribes that the Lewis and Clark Expedition was not a war party. A great service Sacagawea rendered the expedition was to aid in the purchase of horses, needed so the group could cross the mountains after they had to abandon the Missouri approaching the Continental Divide. One reason for her success was that the Indian chief whose aid they sought proved to be Sacagawea's brother.
The expedition spent the winter of 1804–1805 encamped near the site of Bismarck, North Dakota. They left there on April 7, 1805, and came within view of the Pacific Ocean, near Astoria, Oregon, on November 7. After overwintering and exploring the area, they departed eastward on March 23, 1806, and arrived in St. Louis six months to the day later. Only one of the expedition members died en route, most likely of appendicitis. While they did not find the mammoths or salt mountains reputed to be in the American West, "these were a small loss compared to the things that were gained". In addition to knowledge of the territories purchased by the US, these included the establishment of relations with Native Americans and increased public interest in the West once their diaries were published. Further, the exploration of the Oregon Country later aided American claims to that area. In gratitude for their service to the nation, Congress gave Lewis and Clark land grants and they were appointed to government offices in the West.
## Inception
Beginning in 1895, Oregonians proposed honoring the centennial of the Lewis and Clark Expedition with a fair to be held in Portland, a city located along the party's route. In 1900, a committee of Portland businessmen began to plan for the event, an issue of stock was successful in late 1901, and construction began in 1903. A long drive to gain federal government support succeeded when President Theodore Roosevelt signed an appropriations bill on April 13, 1904. This bill allocated $500,000 to exposition authorities, and also authorized a gold dollar to commemorate the fair, with the design and inscriptions left to the discretion of the Secretary of the Treasury. The organizing committee was the only entity allowed to purchase these from the government, and could do so at face value, up to a mintage limit of 250,000.
Numismatist Farran Zerbe had advocated for the passage of the authorization. Zerbe was not only a coin collector and dealer, but he promoted the hobby through his traveling exhibition, "Money of the World". Zerbe, president of the American Numismatic Association from 1908 to 1910, was involved in the sale of commemorative coins for over 20 years, beginning in 1892. The Portland exposition's authorities placed him in charge of the sale of the gold dollar.
Details of the preparation of the commemorative dollar are lost; the Mint destroyed many records in the 1960s. Mint Chief Engraver Charles E. Barber was responsible for the designs.
## Design
Numismatic historians Don Taxay and Q. David Bowers both suggest that Barber most likely based his designs on portraits of Lewis and of Clark by American painter Charles Willson Peale found in Philadelphia's Independence Hall. Taxay deemed Barber's efforts "commonplace". The piece is the only American coin to be "two-headed", bearing a single portrait on each side.
Art historian Cornelius Vermeule, in his volume on American coinage, pointed out that some people liked the Lewis and Clark Exposition dollar as it depicted historic figures who affected the course of American history, rather than a bust intended to be Liberty, and that Barber's coin presaged the 1909 Lincoln cent and the 1932 Washington quarter. Nevertheless, Vermeule deprecated the piece, as well as the earlier American gold commemorative, the Louisiana Purchase Exposition dollar. "The lack of spark in these coins, as in so many designs by Barber or Assistant Engraver (later Chief Engraver) Morgan, stems from the fact that the faces, hair and drapery are flat and the lettering is small, crowded, and even." According to Vermeule, when the two engravers collaborated on a design, such as the 1916 McKinley Birthplace Memorial dollar, "the results were almost oppressive".
## Production
The Philadelphia Mint produced 25,000 Lewis and Clark Exposition dollars in September 1904, plus 28 more, reserved for inspection and testing at the 1905 meeting of the United States Assay Commission. These bore the date 1904. Zerbe ordered 10,000 more in March 1905, dated 1905. The Mint struck 35,000 coins, plus assay pieces, in March and June in case Zerbe wanted to buy more, doing so in advance because the Philadelphia Mint shut down over the summer; as Zerbe did not order more, the additional 25,000 were melted.
The Lewis and Clark Exposition dollar was the first commemorative gold coin to be struck and dated in multiple years. A total of 60,069 pieces were struck, from both years, of which 40,003 were melted. According to numismatists Jim Hunt and Jim Wells in their 2004 article on the coin, "the poor reception afforded the coin at the time of issue virtually guaranteed their rarity for future generation".
## Aftermath and collecting
The Lewis and Clark Centennial and American Pacific Exposition and Oriental Fair opened in Portland on June 1, 1905. It was not designated as an international exposition, and did not draw much publicity even within the United States. Nevertheless, two and a half million people visited the fair between Opening Day and the close on October 14. Sixteen foreign nations accepted invitations from organizers to mount exhibits at the exposition. There was the usual broad array of concessions and midway attractions to entertain visitors. Among Americans who displayed exhibits at the fair were prominent cartoonist and animal fancier Homer Davenport and long-lived pioneer Ezra Meeker. The exposition was one of the few of its kind to make a profit, and likely contributed to a major increase in Portland's population and economy between 1905 and 1912.
Funds from the sale of the coin were designated for the completion of a statue to Sacagawea in a Portland park. There was little mention of the dollar in the numismatic press. Q. David Bowers speculates that Dr. George F. Heath, editor of The Numismatist, who opposed such commemoratives, declined to run any press releases Zerbe might have sent. Nevertheless, an article appeared in the August 1905 issue, promoting the exhibit and dollar. As it quotes Zerbe and praises his efforts, it was likely written by him. Zerbe concentrated on bulk sales to dealers, as well as casual ones at the fair at a price of $2; he enlisted Portland coin dealer D.M. Averill & Company to make retail sales by mail. There were also some banks and other businesses that sold coins directly to the public. Averill ran advertisements in the numismatic press, and in early 1905, raised prices on the 1904 pieces, claiming that they were near exhaustion. This was a lie: in fact, the 1904-dated coins sold so badly that some 15,000 were melted at the San Francisco Mint. Zerbe had Averill sell the 1905 issue at a discounted price of ten dollars for six pieces. As he had for the Louisiana Purchase dollar, Zerbe made the coins available mounted in spoons or in jewelry. Little else is known regarding the distribution of the gold dollars.
The coins were highly unpopular in the collecting community, which had seen the Louisiana Purchase coin decrease in value since its issuance. Nevertheless, the value of the Lewis and Clark issue did not drop below issue price, but steadily increased. Despite a slightly higher number of coins recorded as distributed, the 1905 issue is rarer and more valuable than the 1904; Bowers speculates that Zerbe may have held some pieces only to cash them in, or surrender them in 1933 when President Franklin Roosevelt called in most gold coins. The 1905 for many years traded for less than the 1904, but by 1960 had matched the earlier version's price and in the 1980s surpassed it. The 2014 edition of A Guide Book of United States Coins (the Red Book) lists the 1904 at between $900 and $10,000, depending on condition, and the 1905 at between $1,200 and $15,000. One 1904, in near pristine MS-68 condition, sold in 2006 at auction for $57,500.
Despite the relative failure of the coin issue, the statue of Sacagawea was duly erected in a Portland park, financed by coin sales. In 2000, Sacagawea joined Lewis and Clark in appearing on a gold-colored dollar coin, with the issuance of a circulating coin depicting her and her son.
# Charlie Macartney
Charles George Macartney (27 June 1886 – 9 September 1958) was an Australian cricketer who played in 35 Test matches between 1907 and 1926. He was known as "The Governor-General" in reference to his authoritative batting style and his flamboyant strokeplay, which drew comparisons with his close friend and role model Victor Trumper, regarded as one of the most elegant batsmen in cricketing history. Sir Donald Bradman—generally regarded as the greatest batsman in history—cited Macartney's dynamic batting as an inspiration in his cricket career.
He started his career as a bowling all-rounder. He made his Test debut in 1907, primarily as a left arm orthodox spinner who was considered to be a useful lower-middle order right-hand batsman. As Macartney was initially selected for his flexibility, his position in the batting order was frequently shuffled and he was largely ineffective. His most noteworthy Test contribution in his early career was a match-winning ten wicket haul at Headingley in 1909, before being dropped in the 1910–11 Australian season. It was around this time that Macartney befriended Trumper and began to transform himself from a bowler who batted in a defensive and technically correct manner, into an audacious attacking batsman. He reclaimed his Test position and made his maiden Test century in the same season, before establishing himself as the leading batsman in the team.
The First World War stopped all first-class cricket and Macartney enlisted in the Australian Imperial Force. Upon the resumption of cricket, Macartney stamped himself as one of the leading batsmen in the world with his performances during the 1921 Ashes tour. Macartney produced an Australian record score in England of 345 against Nottinghamshire. The innings was the fastest triple century in first-class cricket and the highest score made by a batsman in a single day of play. He reached 300 in 205 minutes and the innings took less than four hours. Macartney topped the batting averages and run-scoring aggregates, which saw him named as one of the five Wisden Cricketers of the Year in 1922. Wisden said that he was, "by many degrees the most brilliant and individual Australian batsman of the present day". After missing the 1924–25 series due to mental illness or a recurrence of war injuries, Macartney departed international cricket at the peak of his powers on the 1926 tour of England. He became the second Australian to score a century in the first session of a Test match, and did so on a sticky wicket conducive to bowling. This was part of a sequence of three consecutive Test centuries as he led the batting charts. Macartney was posthumously inducted into the Australian Cricket Hall of Fame in 2007.
## Style
Macartney's flair was compared to that of Victor Trumper, and his determination to that of Don Bradman, who is generally regarded as the finest batsman in cricketing history. His style was quite different from that of Trumper, but he generated fascination with his Trumper-like daring and supreme confidence. Self-taught to a greater extent than anyone else in Australia or England in his era, the 1922 Wisden Almanack described Macartney as "a triumph to individualism... he is not a model to be copied" and "one of the most brilliant and attractive right-handed batsmen in the history of Australian cricket". His success was largely attributed to his eye, hand and foot co-ordination.
Macartney was a short man, standing 160 cm (5 ft 3 in). When batting, he would unconventionally attempt to leg glance yorkers pitched on middle stump down to fine-leg, and often lost his wicket in so doing. He was known for preferring his team-mates to give him candid criticism rather than praise. In later life he condemned modern batsmen; he would explain why he no longer watched cricket by saying "I can't bear watching luscious half-volleys being nudged gently back to bowlers". Sir Neville Cardus wrote that "there was always chivalry in his cricket, a prancing sort of heroism. The dauntlessness of his play, the brave beauty and the original skill bring tears to my eyes yet." In the late 1940s, Macartney received a letter from a compiler of Who's Who in Australia, seeking information on his life. Macartney said that he had "no record of figures, nor am I concerned with them. My only interest is the manner in which the runs are compiled and how wickets are taken, and in the good of the game." "Those sentiments", wrote former Australian Test batsman Jack Fingleton, "summed up the cricket story of C. G. Macartney".
An authoritative, combative stylist, Macartney displayed an élan and devastating strokemaking that led Kent cricketer Kenneth Hutchings to dub him "The Governor-General". Fingleton noted that, early in his innings, Macartney had a strategy of aiming a shot straight at the bowler's head, in order to rattle him and seize a psychological advantage. On one occasion, after reaching a century before lunch on the first day of a match, he immediately called for a bat change. He selected the heaviest bat from the batch that his team-mate brought out and stated "Now I'm going to have a hit". His rate of scoring and boundary-hitting subsequently increased. He possessed powerful hands, strong forearms and broad shoulders. Leg spinning Test team-mate Arthur Mailey recalled that Macartney would often hit him for six in Sydney Grade Cricket matches. Grinning, he would say "Pitch another one there and I'll hit you for a few more". On the occasions when he lost his wicket attempting further long hits, Macartney's grin remained, and he would remark "Wasn't it good fun?" The famed cricket writer RC Robertson-Glasgow said:
> No other Australian batsman, not even Bradman, has approached Macartney for insolence of attack. He made slaves of bowlers. His batting suggested a racket player who hits winners from any position. Length could not curb him, and his defence was lost and included in his attack.
As a bowler, Macartney delivered the ball at a relatively fast pace for a left-arm orthodox spinner, comparable in speed to Derek Underwood. He was known for his consistent length and his well-concealed faster ball which often caught batsmen off guard. On sticky wickets, he was often incisive, and these conditions helped him take five wickets in an innings 17 times in his first-class career. He was known for his miserly attitude, often giving the impression that he would rather bowl ten consecutive maidens than take wickets if it meant conceding runs. This extended to his off-field activities, where he was considered careful with money. On the 1926 tour of England, he and Mailey visited a hat shop that had a tradition of giving souvenir hats to cricketers of touring Australian squads. When asked if he would like a similar style to the one he received in 1921, Macartney referred to the hat on his head and replied "Not on your life. I've been wearing this since you gave it to me in 1921." Macartney's notorious fiscal obsessions irritated his captain Warwick Armstrong on the 1921 tour; during the trip, he would hoard all manner of goods that were given to the team as gifts.
In 1909, Australian team-mate Trumper moved from Paddington, a suburb on Sydney's south shore, to Chatswood on the northern side of the harbour, where Macartney lived. Macartney and Trumper played together for Gordon Cricket Club on the north shore and became close friends. Macartney regularly practised on the Trumper family's backyard turf pitch. Trumper's relocation made more frequent meetings possible, since the Sydney Harbour Bridge was not to open until 1932, and the only way of travelling between the two sides of the harbour was by ferry. Trumper was regarded as the "crown prince of the golden age of cricket", the finest and most stylish batsman of his era, and one of the most elegant strokemakers of all time. Under Trumper's influence, Macartney became more audacious and adventurous; unlike their English counterparts, the Australians were proud of their spontaneous play. Macartney revered Trumper as both a cricketer and a person, and was to be a pall bearer when Trumper died in 1915 at the age of 37. However, unlike Trumper, Macartney was known for his habit of "walking", the act of leaving the ground before or contrary to an umpire's decision if a batsman knows that he is out. On one occasion, Macartney felt so guilty that the umpire had incorrectly ruled him not out despite a clear edge that he attempted to throw his wicket away with a wild airborne shot. However, the ball went for six, and Trumper, his batting partner at the time, admonished him, saying that his good luck would be balanced by occasions when the umpire would give him out incorrectly.
## Early years
Macartney was born on 27 June 1886 in West Maitland, New South Wales. He was taught to play cricket as a child by his maternal grandfather George Moore, a slow roundarm bowler who represented New South Wales in three first-class matches against Victoria. The equipment consisted of a small hand-crafted bat made from cedar, with apples from the family orchard used as balls.
In 1898, Macartney and his family moved from Maitland to Sydney. In his school cricketing career Macartney distinguished himself as an all-rounder at Woollahra Superior and Chatswood public schools, before briefly attending Fort Street High School. Macartney asserted that school cricket was insignificant in his development, believing that he learned more about cricket during informal summer cricket games with his brother at the local park, with their dog acting as a fielder. It was during his school career that Macartney was noticed by incumbent Australian captain Monty Noble, who heaped praise on him in a newspaper article.
After leaving school, Macartney worked for a fruit and vegetable merchant near Sydney's Sussex Street docks, honing his batting skills by practising without pads on a wooden wharf during his lunch break. At this stage in his career, he possessed a copybook technique and defensive style, something he was to discard for an audacious, self-styled and attacking outlook.
In 1902, Macartney joined North Sydney Cricket Club in the first division of Sydney Grade Cricket and then moved to the Gordon club in the outer northern suburbs when it was formed during the 1905–06 season. He played regularly for Gordon until 1933–34 when he was 47, amassing 7648 runs at an average of 54.62. He was known for his dominant status at the Chatswood Oval. In one match, he lofted a ball out of the ground, over the railway line and onto a lawn bowling green, forcing the players to take evasive action.
## First-class debut
Macartney's exploits were noticed by the state selectors, and he made his first-class debut for New South Wales against Queensland at the start of the 1905–06 season. He made 56 in New South Wales' first innings of 691 and, having not bowled in Queensland's first innings, took 3/80 and his first catch in an innings victory. He then scored 70 not out in an innings triumph over South Australia. He failed to pass 25 in his remaining four matches for the season, but took at least one wicket in each game. In one match for his state against an Australian XI, Macartney took a total of 5/123, including the wickets of Trumper and the Australian Test captains Noble and Joe Darling; he was also run out in both innings. Aside from this match, New South Wales were victorious in their remaining five fixtures. He scored 185 runs at 26.43 and took 15 wickets at 28.20 in six matches.
Macartney continued his rise with a more productive and consistent second season with both bat and ball. In his second match of 1906–07, Macartney broke through for his first century, scoring 122 before taking match figures of 4/92 in an innings win over Queensland. In the next match, he took his first five-wicket innings haul, recording figures of 5/18 and 2/17 in an innings win over South Australia, dismissing leading Test batsman Clem Hill twice. Macartney took wickets in every match; he ended the season with 405 runs at 40.50, including two further fifties, and 30 wickets at 18.20 in nine matches.
The following season, 1907–08, saw the arrival of England for a Test series. Macartney had a chance to stake his claim for national selection in a match for his state against the tourists. He made 9 and 13, unbeaten in both innings, as his partners were dismissed cheaply and left him stranded; New South Wales made 101 and 96 and lost by 408 runs, with Macartney taking a total of 1/64. He was then selected for an Australian XI to play the tourists in what was effectively a dress rehearsal for the Tests, making 42 and taking 4/36 in a drawn match. As a result, Macartney was selected to make his debut against England in the First Test at the Sydney Cricket Ground. He was viewed as a utility player, chosen for the flexibility in his batting position and his left-arm orthodox spin.
## Test debut
Macartney had a moderately successful debut. He bowled three wicketless overs in the first innings, before scoring 35 in Australia's reply while batting at No. 7. He took one wicket, that of leading English batsman Wilfred Rhodes. With Australia needing 274 runs to win in the second innings, Noble decided that Macartney's first-innings effort warranted promotion to act as Trumper's opening partner. He scored only nine, but Australia scraped home to seal a two-wicket victory. Macartney's domestic form after his Test debut was sufficient for him to retain his position for the Second Test in Melbourne. Noble persevered with Macartney as Trumper's opening partner and he scored 37 of an 84-run opening partnership in the first innings. He returned to the middle order in the second innings to score 54 and took a total of 1/55 as England squared the series with a narrow one-wicket victory. His most productive batting of the series came in the Third Test in Adelaide, when he scored 75 batting at No. 3 and took two wickets for 66 runs (2/66) in an Australian victory. His batting was largely ineffectual in the last two Tests; he failed to score more than 30 in any innings as he was moved to No. 8 and then back to the opening position in the Fifth Test. Despite his confused role as a batsman, he contributed with the ball in the Fifth Test victory on a pitch amenable to spin, taking match figures of 5/66. His first international series had yielded 273 runs at an average of 27.30 and ten wickets at an average of 26.60, and Australia took the series 4–1. His highest score of the season was 96 against England in a later match for New South Wales, in which the hosts were 12 runs short of victory with one wicket in hand when time ran out. Macartney scored 524 runs at 27.58 and took 25 wickets at 28.76 in 12 matches. In spite of his unsettled role in the batting line-up, Macartney performed well enough as an all-rounder in the following domestic season, 1908–09, to be selected for the 1909 tour of England, his first overseas tour. Macartney took a total of 6/60 in an innings victory over South Australia in the first match of the season. He then scored 100 in the return match, and ended the six matches of the summer with 319 runs at 53.17 and nine wickets at 29.89.
## First tour of England
Macartney started his tour of England by taking match figures of 5/86 in a nine-wicket win over Northamptonshire. He took at least two wickets in each of the five matches leading up to the Tests, with 5/24 against Oxford University. He totalled 17 wickets at 15.18 and scored 141 runs at 28.20.
Macartney took 3/21 in the first innings of the First Test, removing captain Archie MacLaren, CB Fry and leading batsman Jack Hobbs, but managed only 0/35 in the second innings as England scored 0/105 to win by ten wickets. He then scored five and was wicketless as Australia levelled the series with a nine-wicket win in the Second Test. He then scored 124, his only century of the tour, in a non-first-class match against Western Union.
His bowling confounded the English team in the Third Test at Headingley in Leeds, where he took 7/58 in the first innings and 4/27 in the second, his best innings and match bowling figures in Tests; the performance helped Australia win the match and eventually retain the Ashes. Australia had struggled to post 188 in their first innings on a pitch conducive to spin bowling, with Macartney scoring only four. Australia responded with a dual spin attack, Noble bowling off spin in tandem with Macartney's left-arm orthodox. Noble (0/22 from 13 overs) tied down the batsmen at one end, allowing Macartney to attack at the other. He bowled with a high trajectory, tempting the batsmen to attack him, and varied his pace to surprise them. He had Jack Sharp stumped after luring him from the crease and bowled Jack Hobbs with a faster ball. Other victims included English captain MacLaren, JT Tyldesley, George Hirst and Sydney Barnes. England were bowled out for 182 and Australia replied with 207; Macartney scored 18. Australia went on to win by 126 runs after Macartney took four more wickets in the second innings, removing MacLaren, Tyldesley, Rhodes and Barnes to help dismiss the hosts for 87. Macartney then made a half-century and took a wicket in each of the last two Tests, both of which were drawn to hand Australia a 2–1 series win.
Macartney's batting in the Test series was largely unsuccessful. He made two fifties, but otherwise failed to pass 20 and ended with 153 runs at 19.13. In his era, the expectation was that batsmen would be able to bat in a variety of positions and Macartney was gradually moved from seventh down to tenth in the batting order by the end of the tour. Largely due to his efforts at Headingley, his bowling figures were more impressive; he ended the Tests with 16 wickets at 16.13. At this stage of his career, Macartney was regarded as a bowling all-rounder. He was only eighth on the batting averages for the tour, with 503 first-class runs at 16.77, but took 71 wickets at an average of 17.46 in eight matches.
Upon returning to the southern hemisphere, Macartney headed to New Zealand for a stint with Otago instead of playing in Australia in 1909–10, due to the attractiveness of the foreign outfit's remuneration. The period was unsuccessful for Otago—all three matches were lost—but Macartney was prolific as an individual. He took match figures of 7/68 against Canterbury and 7/81 in the first innings of a game against Australia, removing Test teammates Warwick Armstrong and Warren Bardsley. He ended with 17 wickets at 17.53 and scored 132 runs at 22.00.
Macartney started poorly in the 1910–11 home Test series against the touring South Africans. In the first three Tests, he accumulated 15 runs in five innings and took a solitary wicket, and as a result he was dropped for the Fourth Test. Up to this point he had not passed 45 in 13 innings for the season and had taken only seven wickets in seven matches. He then made 119 and 126 for New South Wales against South Africa and took match figures of 4/155 as the tourists fell to a 44-run defeat. This prompted the selectors to restore him to the Test team, and Macartney bounced back with his first Test century, making 137 in the first innings and a rapid 56 in just 40 minutes in the second, as Australia completed a seven-wicket win. It was his third century in as many first-class innings. The late-season hat-trick of centuries pushed Macartney's season total to 609 runs at 33.83 and 10 wickets at 54.90 in ten matches.
The 1911–12 season started strongly for Macartney. He scored 122 and took 5/81 as New South Wales defeated Queensland by an innings. However, the season went downhill from there; Macartney failed to pass 30 in the next ten innings and took only three wickets in the next six matches. As a result, he was left out of the playing XI and made twelfth man for the first three Tests against the touring England team.
## Omission and recall in 1912
Macartney's omission was part of one of the most infamous disputes in Australian cricket history, one which led to a fracas. Australian captain and selector Clem Hill wanted to include Macartney for the Third Test, but another member of the panel, former player Peter McAlister, objected and said that Hill should omit himself if he wanted Macartney to play. Tensions between the two selectors were high, and came to a head in a selection meeting ahead of the Fourth Test. McAlister criticised Hill's tactics and policies towards his bowlers, provoking an exchange of insults about each other's leadership ability. Hill then bloodied McAlister with a powerful blow to the nose, and the ensuing brawl lasted between 10 and 20 minutes. Furniture was knocked across the room, artwork was shattered, and Hill had to be restrained from throwing McAlister out of the third-floor window before resigning as a selector.
Eventually, Macartney was recalled for the Fifth Test against England and scored 26 and 27 and took a total of 1/54 in a defeat. Macartney scored 300 runs at 27.27 and took nine wickets at 32.78 in eight first-class matches for the season. Macartney wrote later that "persistent ill-feeling seriously affected the morale of the side".
Macartney then toured England for the 1912 Triangular Test Tournament, which also included South Africa. He was not in the original touring party, but six senior players, including Hill, vice-captain Warwick Armstrong and leading batsman Victor Trumper, withdrew from the tour because of a dispute with the board, and Macartney was given a late call-up.
Macartney scored 84 but the tourists started on a bad note, losing to Nottinghamshire. He then scored 127 against Northamptonshire, 208 against Essex, 123 and 25 not out against Surrey and 74 against the Marylebone Cricket Club in four consecutive matches. Australia won the first two by an innings and the latter two by seven and five wickets respectively. Up to this point, Macartney had only claimed a solitary wicket. He then took match figures of 6/60 in a ten-wicket win over Oxford University.
Australia then defeated South Africa in their first Test of the tournament. Macartney made 21 in an innings victory and did not bowl. Macartney's batting waned in the next seven tour matches, passing 50 only three times in ten innings. However, he did take 13 wickets, including 6/54 against Yorkshire. Macartney then scored 99 in a drawn Test against England at Lord's that did not reach the second innings. Wisden regarded the innings as his best for the season. Macartney added half-centuries in consecutive county matches and after three further matches without passing 21, the Tests resumed.
Macartney scored nine and took 3/29 in a ten-wicket win over South Africa, and then scored 142 and 121 in the next match against Sussex. The next Test against England was then washed out in the first innings; Macartney neither batted nor bowled. The following match against South Africa did not reach the second innings and Australia then lost to England by 244 runs in the final, with Macartney taking a total of 2/67 and scoring four and 30. It was a barren August for Macartney, who did not pass 35 and took only six wickets in six first-class matches. However, he finished the tour strongly, scoring 176 against the South of England and 71 against CB Fry's XI in the last two matches.
Macartney scored 2,207 runs during the tour at an average of 45.04. During the English season, he reached the peak of his performance as an all-rounder, taking 38 wickets. He made six centuries, including two in one match against Sussex. Apart from his 99 at Lord's, Macartney did not pass 34 in the other Tests and ended with 197 runs at 32.83. He did not bowl heavily during the series, taking six wickets at 23.66. It was not a happy tour for the Australians; without the senior players, there were frequent reports of drunken brawls and verbal abuse towards the locals. Macartney was one of only four players to accept the guaranteed tour fee of 400 pounds; the others signed up to a percentage share of the profits and the commercial failure of the tour left them with less than half of the flat fee.
There were no further Test matches before the First World War. The 1912–13 Australian season was a short one for Macartney, but he was in rare form, scoring 125, 96, 94, 76 not out, 91, 10 and 154 in four matches, to total 646 runs at 107.66. He also took four wickets at 30.50.
During an unofficial tour of the Australian team to the United States and Canada during the off season in 1913, which consisted of more than 50 matches, the overwhelming majority of which were not first-class, Macartney scored 2,390 runs at 45.92 and took 189 wickets at 3.81, topping both the batting and bowling averages. He also made the most centuries (seven) and the highest individual score of 186 against a combined Canada and United States team. Macartney played in only five first-class matches and scored two centuries in these fixtures. In two non-first-class matches, he took 11/23 and 10/29 in an innings.
The 1913–14 domestic season was to be the last season of cricket before the outbreak of World War I. Macartney captained New South Wales for the first time against Tasmania. He had another prolific season with the bat; in six matches he scored 892 runs at 111.50 in nine innings. He scored 201 in an innings victory over Victoria, four other centuries including a 195, and two fifties. With Macartney in such form, five of the matches were won by an innings, and another by nine wickets. The record was blotted only by a loss to South Australia by 19 runs. Macartney took two wickets at 32.50. Macartney was selected for the five-Test tour of South Africa in 1914–15, but the campaign was called off due to the war.
Despite his success on the field, Macartney, like most cricketers of the era, held a regular job outside cricket. In 1914, he left his job on the Sydney wharves and joined the staff of New South Wales Railways & Tramways in the Chief Mechanical Engineer's Office at Redfern. The following season, he scored 191 runs at 38.20, including a century, in three matches. He did not take a wicket.
## Post-war Test career
World War I interrupted Macartney's career as competitive cricket was cancelled. In January 1916, he enlisted in the Australian Imperial Force (AIF). In July 1917 he was posted to France as a temporary Warrant Officer in the 3rd Division Artillery. In 1918, he was awarded the Meritorious Service Medal for gallantry and reached the rank of corporal. The death of his father later in the year led to his repatriation from Britain and prevented his appearance with the AIF cricket team.
The war years divided Macartney's career in two. Before the war, he was primarily known as a bowling all-rounder: in 21 Tests, he had taken 34 wickets at 26 and scored 879 runs at 27, with one century. After the war, Macartney transformed himself into one of the greatest batsmen of his era, scoring 1,252 runs at nearly 70, with six centuries, in his 14 post-war Tests. His bowling became more sporadic; he took just 11 more wickets, at an average of 32.
Macartney resumed Test cricket when Australia hosted England in 1920–21, and was one of only four players remaining from before the war. However, he only played in two of the Tests due to illness and injury. His early season form was ominous for the tourists. Macartney scored 161 in guiding New South Wales to a successful run-chase of 4/335 over the Englishmen. He then scored 96 and 30 for an Australian XI against the tourists in a dress rehearsal for the Tests.
In the First Test, playing as an opening batsman, he struck 19 in the first innings. Australia's new post-war skipper Warwick Armstrong felt that Macartney would be more effective at number three, and in the second innings, he made a free-flowing 69 in a 111-run second-wicket stand with Herbie Collins as Australia went on to inflict a 377-run defeat. Macartney's return to form was interrupted by an illness, which caused him to miss the following three Test matches. After a two-month layoff, Macartney struck 130 in a match for his state against England.
He returned for the Fifth and final Test, where he recorded his highest Test innings of 170 on his home ground, the Sydney Cricket Ground. Among the spectators was a 12-year-old Don Bradman, who had been taken to watch Macartney by his father. Eight decades later, Bradman recalled the innings "as if it were yesterday", describing it as full of "delicate leg-glances, powerful pulls, cuts and glorious drives" and concluding that it was one of the best innings he had seen in his lifetime. Bradman cited the innings as an inspiration for his career. Macartney headed the Australian Test averages with 260 runs at 86.66 as Australia won the Ashes 5–0, the only such whitewash until 2006–07. Macartney amassed 821 runs at 68.42 for the season, taking only three wickets at 56.33.
## Wisden Cricketer of the Year
On the 1921 Ashes tour, Macartney, who needed a special medical clearance before being selected, had a chance to rectify the poor batting performances of his pre-war tours of England. In his first match, against Leicestershire, he started strongly with 177; his fast scoring helped Australia complete an innings victory in just over half the allotted playing time. He scored 87 against Surrey, 51 against Combined Services and 77 against Oxford University in the next seven matches leading up to the start of the Tests, giving him a total of 539 runs at 53.90.
Macartney made 20 in the first innings and was unbeaten on 22, playing as an opener, as Australia completed a ten-wicket win in the First Test. It was Australia's sixth consecutive Test win over England. He failed to pass 20 in the next two county matches, but did take 2/19 against Middlesex, his first wickets on tour. This came in his 11th match of the tour and reflected his role as a specialist batsman in the post-war years. The next game, against Gloucestershire, heralded the start of a rich vein of run-scoring during the remainder of June: Macartney scored 149 in an Australian innings noted for elegant strokeplay and big hitting and, after managing only 31 and eight in the eight-wicket win in the Second Test, hit three consecutive centuries.
Macartney hit 105 as Australia amassed 7/708 declared against Hampshire and then made 193 as Australia compiled 621 and defeated Northamptonshire by an innings and 484 runs. The two matches were separated by a fixture against Surrey, which Macartney missed through injury. In the Northamptonshire match, Macartney came in at first drop after the hosts took a wicket with the first ball of the match, and he scored 193 of the 318 runs added while he was at the crease. He took only 135 minutes and hit 31 fours as Australia added more than 300 in just over two hours of batting. Such was the dominance of Macartney and the rest of Armstrong's men that they disposed of Northamptonshire in less than two days. However, his most famed innings was yet to come.
In the next match, Macartney scored 345 against Nottinghamshire at Trent Bridge in 232 minutes, with 47 fours and four sixes. Macartney had an inauspicious start to the day, coming to the crease after the dismissal of Warren Bardsley with only one run scored. He attacked immediately and was dropped in the slips when on nine. The missed chance further emboldened Macartney, who had a philosophy that being dropped was a signal that it was his day to shine, and he proceeded to exhibit his full repertoire of strokes. After reaching his double century in only 150 minutes, Macartney signalled to the pavilion. When Nottinghamshire captain Arthur Carr asked him if he was seeking a drink, Macartney said that he wanted a heavier bat and indicated that he was going to attack. Macartney kept his promise, adding his next 100 runs in only 48 minutes to reach 300 in 198 minutes. At the time, it was the fastest triple century in first-class cricket in terms of minutes. It still stands as the highest innings by an Australian in England, and at the time was the most runs scored by any batsman in one day. During the innings, Macartney shared a partnership of 291 with Nip Pellew. Australia went on to score 675 and won by an innings and 517 runs in only two days, the largest winning margin achieved by Australia in a first-class match. The cricket writer Sumner Reid described Macartney's innings as:
> the most destructive innings I ever saw in England or Australia. Not Trumper at his brilliant best, nor even Bradman in his calculated genius, ever performed with more unadulterated, murderous power and masterful technique.
In the space of four days, Macartney had scored 538 runs, and for the month of June, he had totalled 913 runs at 91.30. He carried this form into the next Test.
In the Third Test at Headingley, he made his first Test century on foreign soil, striking 115 in the first innings. It was a relatively sedate innings for his standards, but helped Australia to victory by 191 runs and an unassailable 3–0 series lead. It gave Warwick Armstrong's men an eighth consecutive Test win, which remained a world record for more than five decades until surpassed by the West Indies cricket team of the 1980s. The cricket writer Gideon Haigh said that "It was like watching the armies of succeeding generations in combat, artillery, and tank against sword and horse".
Macartney had a quiet time over the next month, passing fifty only once in the next eight innings in seven matches. He also ended his wicket-taking drought, claiming six in three matches after almost two months without success. He returned to form with 72 against Warwickshire and 155 in the next match against Kent.
Macartney finished with 61 in the drawn Fifth Test at The Oval, to head the run-scoring with 300 runs at 42.85 as Australia took the series 3–0. He did not take a wicket in the Tests. Macartney then scored 121 against Gloucestershire in an innings victory immediately after the Tests, but did not pass 45 in the remaining four matches of the tour. Macartney topped the batting aggregates and averages with 2,317 runs at 59.41 in the first-class matches. He took only eight wickets at 32.63 for the entire tour.
Macartney's efforts during the 1921 English summer led to his being named as one of the 1922 Wisden Cricketers of the Year. Wisden stated that Macartney was "by many degrees the most brilliant and individual Australian batsman of the present day".
On the journey back to the southern hemisphere, Australia stopped for its first ever Test tour of South Africa. Macartney warmed up with 135 in a victory over Natal. The cricket writer Jack Pollard described Macartney's hitting as "powerful, almost arrogant". Macartney then scored 59 and 116 in an aggressive display in the First Test in Durban, which was drawn, with the hosts hanging on with only three wickets in hand. After missing the Second Test due to fitness reasons, Macartney returned against Western Province. He took 5/40 in the first innings, his first five-wicket innings since June 1912, nine and a half years earlier.
In the Third Test in Cape Town, Macartney scored 44, before taking 5/44 in the second innings to ensure that Australia would only have to chase a solitary run. He bowled three of his victims and removed Billy Zulch twice. The hosts struggled against the dual spin of Macartney and Mailey, and the Australians went on to secure a ten-wicket victory. Macartney finished the Test series with seven wickets at 14.86. He totalled 492 runs at 70.28 and 14 wickets at 17.14 for the tour, again topping the batting averages.
Macartney started the 1922–23 season strongly, scoring 63 and 84 and taking 2/8 in a five-wicket win over the touring MCC in the first match of the summer. He only passed fifty once more in the season and took 5/8 in an innings against Victoria. Macartney totalled 350 runs at 29.16 and 12 wickets at 12.16 in eight matches for the season. The next Australian season was a shortened one for New South Wales. Macartney scored 174 runs at 21.75 and took seven wickets at 21.14 in four matches before his state embarked on a tour of New Zealand.
Macartney struck form immediately, scoring 80 and 120 in the opening match against Wellington. He followed this with 100 (in a non-first-class match), 120 against Otago and 221 in the next match against Canterbury, all in consecutive innings. He added match figures of 4/38 as New South Wales defeated Canterbury by an innings. Macartney then scored 36 and 55 not out and took match figures of 4/55 in an eight-wicket win over New Zealand. He made only two and seven in the remaining first-class matches, and ended with 13 wickets at 20.92.
Macartney missed the 1924–25 Test series when England toured Australia. He played in only two first-class matches in the early stages of the season, scoring 11 runs at 3.66 and taking five wickets at 23.40. The withdrawal of Macartney from competition was attributed to a flare-up of an injury he had suffered during World War I, but sceptics believed that he had suffered a nervous breakdown.
Following his year off, Macartney returned to full-time cricket in 1925–26. He re-established himself in his first match, scoring 114 and taking a total of 4/49 as New South Wales crushed Western Australia by an innings and 235 runs. Macartney then scored 84 and 28 to help the Rest of Australia defeat the national team by 156 runs. He then scored two centuries as New South Wales won all four of their Sheffield Shield matches, three by an innings. Up to this point, Macartney had scored 582 runs at 72.75 and taken 20 wickets at 20.30, which was enough for him to be selected for the 1926 tour of England. His most notable performance with the ball was his 7/85 and 2/16 in an innings victory over arch-rivals Victoria; his wickets included batsmen Bill Woodfull (twice), Bill Ponsford and Jack Ryder, and all-rounder Hunter Hendry, who played alongside him in the 1926 Tests. Following his selection for the England tour, Macartney warmed up by scoring 66 and 163 not out and taking a total of 4/48 in consecutive innings victories for the Australian touring party over Tasmania.
## International farewell
Macartney's international farewell on the 1926 tour of England saw him at the peak of his batting powers. Unlike the previous tour in 1921, Macartney was also prominent with the ball.
During the opening first-class fixture against Leicestershire, Macartney scored only two but took 5/9 in a rain-affected draw. In the next match against Essex, another rain-affected draw, he starred with the bat, scoring 148. In the third match, another draw against Surrey, Macartney combined both of his skills and scored 53 and took 6/63 in the first innings. His victims included English Test batsmen Jack Hobbs and Percy Fender. He then took 3/21 and 4/57 as Australia beat Hampshire to record their first win of the season. In nine matches before the First Test, Macartney scored 379 runs at 42.11 and took 30 wickets at 13.20.
The First Test at Trent Bridge was washed out, with England scoring 0/32 in the only innings of the match. Macartney then scored 54 as Australia made 6/148 in the only innings of another wet match against Yorkshire. He then hit form ahead of the Second Test, scoring 160 and taking a total of 5/34 in an innings win over Lancashire.
After scoring 39 in the first innings of the Second Test at Lord's, Macartney took 1/90, removing the centurion Jack Hobbs as England took a 92-run first-innings lead. He then scored 133 not out in the second innings to help stave off defeat. Australia were 5/194 when the match ended and, were it not for Macartney's effort, could have been bowled out.
Between Tests, Macartney scored 42 and 81 against Northamptonshire and Nottinghamshire respectively before taking 5/38 against Worcestershire. Australia won all three matches.
In the Third Test at Headingley, Macartney became only the second batsman to score a century before lunch on the opening day of a Test. The match started poorly for Australia. English captain Arthur Carr won the toss and sent Australia in to bat after a thunderstorm on the previous day had turned the surface into a sticky wicket; Bardsley was then dismissed by the first ball of the match for a golden duck. Macartney strode to the crease, surveyed the fielding positions and called down the wicket to the bowler Maurice Tate, "Let's have it!" He nearly regretted his comment when he edged the fifth ball of the day to Carr at third slip. It was a difficult chance, but the English skipper failed to hold the ball; Macartney was on two at the time. Within a few minutes, he had regained the initiative for the Australians.
Utilising both conventional technique and audacious shots, Macartney pierced the field with a variety of cuts, hooks, pulls, drives and glances. He teased the fielders with deliberate deflections through the slips; his late cuts were described by Raymond Robertson-Glasgow as being "so late they are almost posthumous". Macartney's attack helped his partner Bill Woodfull to settle in the difficult conditions. Macartney saved his severest hitting for George Macaulay, a medium pace swing bowler and off spinner whom he regarded as England's most potent bowler. Macartney had asked for and received permission from captain Herbie Collins to target Macaulay's bowling. By the end of the Australian innings, Macaulay had figures of 1/123 and was never to play against Australia again. Macartney's confidence was such that he charged down the pitch to meet the medium pace bowlers, a dangerous tactic on a surface with erratic bounce.
He reached 40 in as many minutes as Australia's total reached 50. Australia reached 100 in only 79 minutes with Macartney contributing 83 of those runs. Macartney reached his century in 103 minutes with the tourists on 131. By lunch, he had scored 112 in 116 minutes and he continued until the score reached 1/235, when he hit Macaulay to Patsy Hendren and was dismissed, having amassed 151 in 170 minutes. Former English captain Sir Pelham Warner said "I say without hesitation that I have never seen a greater innings... not even the immortal Victor Trumper could have played more finely". Macartney's innings allowed Australia to accumulate a healthy first innings total of 494. He then took 2/51, removing Carr and Fender as England made 294 and were forced to follow on; however, the Australians could not dismiss the hosts for a second time and the match ended in a draw.
Macartney then made 106 in a non-first-class match against the West of Scotland, before hitting 109 in the Fourth Test at Old Trafford in a rain-affected draw; the match failed to reach the second innings. Macartney had scored three centuries in as many innings.
Macartney's form tailed off thereafter; in the following six weeks, he made only one score beyond 40 in 11 innings and took only three wickets in nine matches. This included the Fifth Test, in which he scored 25 and 16 and failed to take a wicket as England won by 289 runs and with it the Ashes. Macartney topped the batting averages with 473 runs at 94.60 and took four wickets at 53.75. He returned to form in the final first-class fixture of the season with an unbeaten 100 against an England XI.
Macartney decided to retire from Tests after the tour. He had taken part in twelve Test century partnerships, the highest being 235 with Woodfull in the Leeds Test.
## Career end
After his return to Australia, Macartney continued to play club cricket and turned out for a final first-class summer. At the start of the 1926–27 season, he captained a combined Sydney City team against a New South Wales country team, which included the then 18-year-old Bradman. Macartney scored 126 and Bradman 98 in a match viewed as a generational transition in Australian batting. He scored 114 in his opening first-class match of the season, and took wickets in each of his four matches. Macartney totalled 243 runs at 40.50 and took 11 wickets at 17.82.
In mid-1927 he toured Singapore and Malaya with Bert Oldfield's team and played in a series of non-first-class matches against local teams. In October 1929, he played for a New South Wales Cricket Association team against a series of local teams in the state's rural west.
In 1935–36, Macartney was vice-captain to Jack Ryder, on the tour of India organised by Frank Tarrant; he also wrote forthright columns for The Hindu, covering the trip. At the time, India had only received its first official tour, by England, and Australia was not keen on sending a Test team there. Thus, while the Test team were in South Africa, Tarrant's party consisted mainly of retired Test cricketers in their mid-40s and beyond.
In his return to first-class cricket after nine years, Macartney took 5/17 and 3/42 in the first international match against India, which the Australians won by eight wickets. He went wicketless as the series was squared in the second match, before taking 3/52 and 6/41 in the final match. Despite his nine wickets, Australia lost by 34 runs. Other notable performances included an 85 against Bengal and 3/45 and 3/47 against Madras. In the latter match, Macartney added 39 as the Australians scraped home by one wicket.
## Outside cricket
Macartney married Anna Bruce, a schoolteacher, at Chatswood Presbyterian Church in December 1921. At the time, the NSW Railway & Tramway Magazine noted that he was a "strict teetotaller and non-gambler" who loved his pipe, tennis and music. After his marriage, Macartney described himself as a civil servant when he was not engaged in cricketing activities. Like most Australian cricketers of his era, Macartney was a Protestant and a freemason.
Macartney wrote for several Sydney newspapers, and between 1936 and 1942 regularly produced pieces for the Sydney Morning Herald. In 1930 he published the autobiographical My Cricketing Days. During the Second World War, he was a lieutenant in the amenities service of the Australian Defence Force, and afterwards was a personnel officer at Prince Henry Hospital.
Childless, Macartney was predeceased by his wife. He died of coronary occlusion (heart attack) while at work in Little Bay, New South Wales, aged 72. In February 2007, Macartney was inducted into the Australian Cricket Hall of Fame along with Richie Benaud, making them the 26th and 27th inductees.
## Test match performance
# U.S. Route 141
US Highway 141 (US 141) is a north–south United States Numbered Highway in the states of Wisconsin and Michigan. The highway runs north-northwesterly from an interchange with Interstate 43 (I-43) in Bellevue, Wisconsin, near Green Bay, to a junction with US 41/M-28 near Covington, Michigan. In between, it follows city streets in Green Bay and has a concurrent section with US 41 in Wisconsin. North of Green Bay, US 141 is either a freeway or an expressway into rural northern Wisconsin before downgrading to an undivided highway. In Michigan, US 141 is an undivided highway that runs through rural woodlands. The highway has two segments in each state; after running through Wisconsin for about 103 miles (166 km), it crosses into Michigan for approximately another eight miles (13 km). After that, it crosses back into Wisconsin for about 14½ miles (23 km) before crossing the state line one last time. The northernmost Michigan section is about 43½ miles (70 km), making the overall length about 169 miles (272 km).
When the US Highway System was formed on November 11, 1926, US 141 ran from Milwaukee to Green Bay, and one segment of the modern highway in Michigan was originally designated US Highway 102 (US 102). This other designation was decommissioned in 1928 when US 141 was extended north from Green Bay into Michigan. Michigan has rebuilt the highway in stages over the years to smooth out sharp curves in the routing. Since the 1960s, the section south of Green Bay has been converted into a freeway in segments. US 141 has ended southeast of Green Bay in Bellevue since the 1980s—the southern freeway segment was redesignated as I-43. The section north of Abrams, Wisconsin, was converted to a freeway in the opening years of the 21st century, with an additional divided-highway section opening a few years later.
## Route description
As a bi-state highway, US 141 is a state trunk highway in Wisconsin and a state trunkline highway in Michigan. In Wisconsin, the segment through the Green Bay area is not on the National Highway System (NHS), except for about four blocks along Broadway Avenue that are part of an intermodal connector with the Port of Green Bay. The NHS is a network of roads important to the country's economy, defense, and mobility. From the Green Bay suburb of Howard northward, including the entire length through Michigan, US 141 is a part of the NHS. From the I-43 interchange in Howard north to the split at Abrams, US 141 is also a part of the Lake Michigan Circle Tour (LMCT), a tourist route that surrounds Lake Michigan.
### Green Bay to Niagara
US 141 starts at an interchange with I-43 southeast of Green Bay in the suburb of Bellevue. From the terminus at exit 178, US 141 runs north to Main Street, and then northwesterly along Main Street through town. Wisconsin Highway 29 (WIS 29) merges with US 141 at an intersection on the northwest side of Bellevue, and the two highways run concurrently through residential subdivisions. Main Street passes over I-43 and continues to the north and into the city of Green Bay. US 141/WIS 29 crosses Baird Creek and runs along the banks of the East River. At the intersection with Monroe Avenue, WIS 29 turns south, joining WIS 54/WIS 57 while US 141 continues westward on Main Street to cross the Fox River on the Ray Nitschke Memorial Bridge. On the west side of the river, the highway follows Dousman Street for a block before turning north along Broadway Avenue for four blocks. From there, the highway follows Mather Street west to Velp Avenue. US 141 follows that street northwesterly and parallel to I-43 on the north side of Green Bay. This area is mostly residential with some businesses immediately on either side. In the suburb of Howard, US 141 merges onto the I-41/US 41 freeway via the interchange at exit 170. I-41/US 41/US 141 has an interchange for I-43 just south of the Duck Creek crossing, where I-41 terminates.
From Howard northward, the freeway runs through suburban Brown County to Suamico, parallel to a line of the Escanaba and Lake Superior Railroad (ELS), through a mixture of farm fields and residential subdivisions. There are frontage roads on both sides of the freeway to provide access to the properties immediately adjacent to US 41/US 141. There are a number of interchanges with county-maintained roads between Suamico and Abrams in Oconto County. At Abrams, US 141 splits from US 41 and heads northward as an expressway while the latter freeway turns northeasterly. The landscape north of the split transitions to forest, and the freeway crosses the Oconto River in Stiles south of the interchange with WIS 22. The expressway bypasses Lena to the east. It crosses a railroad track at the surface and continues north through mixed farm fields and forest to the county line. North of the line, US 141 continues to the Marinette County communities of Coleman and Pound as an expressway. There is another grade railroad crossing at Pound. Through Coleman and Pound there is also a Business US 141. Past the latter town, US 141 transitions from expressway to a two-lane undivided highway.
South of Crivitz, US 141 crosses the Peshtigo River. The highway crosses a branch line of the ELS on the east side of Crivitz and continues north through woodland to the community of Middle Inlet. North of town, the roadway turns northeasterly to the community of Wausaukee where it intersects WIS 180. From there, the highway passes through the communities of Amberg and Beecher before coming into Pembine, where US 8 merges in from the west. The two highways run concurrently north and northeasterly to an intersection southeast of Niagara. US 8 separates to the east, and US 141 turns northwesterly along River Street into Niagara. The highway then turns north along Roosevelt Road and over the Menominee River to exit the state of Wisconsin.
### Quinnesec northward
Once in Michigan, one mile (1.6 km) west of Quinnesec, US 141 meets and joins US 2. The two highways run concurrently westward into Iron Mountain along Stephenson Avenue, passing through a retail business corridor and into downtown. M-95 joins the two highways, and all three pass Lake Antoine. M-95 turns off north of town and US 2/US 141 crosses the Menominee River back into Wisconsin.
US 2/US 141 makes a 14.5-mile (23.3 km) run through Florence County, passing the Spread Eagle Chain of Lakes. The highway serves the communities of Spread Eagle and Florence. The only junction with another state trunk highway in Wisconsin on the northern section is with the concurrent highways WIS 70/WIS 101 in Florence. The highway crosses back into Michigan on a bridge over the Brule River south of Crystal Falls.
Across the state line, the trunkline runs through forest near several smaller bodies of water such as Stager, Kennedy, and Railroad lakes. The highway enters Crystal Falls on 5th Street. US 2/US 141 runs along the top of the hill in town and intersects the western terminus of M-69 next to the Iron County Courthouse. US 141 continues westward on Crystal Avenue and separates from the US 2 concurrency on the western edge of town. Running north and northwesterly, US 141 passes to the east of the Ottawa National Forest through rural Iron County. The highway crosses the Paint River and continues through forest to the community of Amasa. The trunkline crosses the Hemlock River on the west side of town. From there, US 141 runs northward into the southwest corner of Baraga County and also enters the Eastern Time Zone. West of Worm Lake, US 141 meets M-28 in the community of Covington. The two highways merge and run easterly for about four miles (6.4 km) before US 141 terminates at US 41; M-28 continues eastward, merging with US 41.
## History
### Initial state highways
In 1918, when Wisconsin initially numbered its highway system, the route of what later became US 141 followed two separate state highways: WIS 17 from downtown Milwaukee to Manitowoc and WIS 16 from Manitowoc north to Green Bay. Segments that later became US 141 in Wisconsin were numbered WIS 15 between Green Bay and Abrams, and WIS 38 between Abrams and Wausaukee. North of Wausaukee, the future US Highway was an unnumbered secondary highway. In 1919, Michigan signed its highway system, but the state did not have a highway running south from Quinnesec to the state line. The highway from Quinnesec into Iron Mountain was part of M-12. The segment through Florence County, Wisconsin, was WIS 69, and from the Crystal Falls area north to Covington, the M-69 moniker was used. In 1919, the WIS 38 designation was extended northward to Niagara and the state line. The highway was straightened to eliminate a series of sharp curves between Crivitz and Beaver in 1921. The same year, WIS 17 was realigned between Sheboygan and Cedar Grove to run via Oostburg. WIS 17 was also realigned in 1922 to follow a separate routing south of Port Washington; previously it was routed concurrently with WIS 57 in the area. By 1924, maps showed an unnumbered roadway running south from Quinnesec to connect with WIS 57 at the state line.
### Conversion to a US Highway
As originally proposed in 1925, several US Highways in Wisconsin and Michigan's Upper Peninsula were to be designated. However, the routings for two highways were different in Michigan in 1925 than on the final 1926 map. In the original plan, US 102 was supposed to replace M-15 from US 2 at Rapid River, continue via Marquette to Humboldt, and the highway between Crystal Falls and Covington was not included in the system. However, when the final plan was approved and implemented on November 11, 1926, US 41 took the eastern routing through Rapid River and Marquette, and US 102 was routed between Crystal Falls and Covington. In both plans, US 141 was only routed between Milwaukee and Green Bay, replacing WIS 17 and WIS 16. At the time the two US Highways were created, WIS 57 was left untouched between Abrams and Niagara. The next year, the M-57 designation was assigned to connect WIS 57 to Quinnesec, and US 8 was extended to follow US 141 to US 2 near Iron Mountain.
On November 12, 1928, the extension of US 141 northward from Green Bay along WIS 57 to the Michigan state line was approved; the signage had been readied for installation the previous month. The US 102 designation was decommissioned when US 141 was also extended to replace M-57 from the state line, along US 2 to Crystal Falls and north to Covington. US 8's eastern end was rerouted along a separate bridge over the Menominee River to a new terminus at an intersection with US 2 in Norway in 1929. US 141 was fully paved in Wisconsin in the early 1930s; the last segment to be completed was between Pound and Abrams.
The next major changes were made at the beginning of the 1930s in Michigan. A realignment in the Iron Mountain area shifted US 2/US 141 to a new bridge over the Menominee River between 1932 and 1934. In 1940, a new routing from the state line north to Crystal Falls was opened; the previous routing was returned to local control. The northern end was relocated near Covington in late 1948 or early 1949 when US 41 was realigned in the area. This terminus was shifted again when US 141/M-28 was realigned in the area in late 1955 or early 1956.
### Freeway era
At about the same time as the realignments in Michigan, two-lane bypasses of Manitowoc and Port Washington in Wisconsin were opened in 1957. The state built a divided-highway segment that opened the following year running from the Milwaukee area northward to the Ozaukee–Milwaukee county line. The highway was rerouted to run further inland, bypassing Haven, Wisconsin, in 1959. In late 1961, the highway in Michigan was rebuilt in northern Iron and southern Baraga counties between Amasa and Covington as the state smoothed out sharp corners in the routing and finished paving US 141; a similar project was completed in 1972 south of Amasa to Crystal Falls.
In the 1950s, Wisconsin proposed connecting Green Bay, the state's third-largest city, to the Interstate Highway System. Variations on this proposal included using either the US 41 or US 141 corridors, or a new corridor in between. The request was rejected in the 1950s but approved in the 1960s. After approval, the state started the process of converting US 141 between Milwaukee and Abrams into a freeway. The first segments of freeway were opened in the Milwaukee area, starting in 1963 between Locust Street and Good Hope Road. The following year, an extension of the freeway opened southward from Locust to North Avenue. By 1965, the bypass of Sheboygan was opened; the Milwaukee area freeway was extended northward to Brown Deer Road the following year. Another freeway segment in the Milwaukee area opened in 1967, extending northward to Grafton in Ozaukee County. The last section of US 141 in the city of Milwaukee to open as a freeway was completed in 1968 when I-94 was finished through downtown; at the same time, US 141 was extended southward from North Avenue to meet I-94.
Another freeway section from north of Green Bay to Suamico was opened in 1971. In 1972, the divided-highway segment between Suamico and Abrams opened, and the state started the construction of additional freeways between Green Bay and Milwaukee. The bypasses of Sheboygan and Cedar Grove were converted to full freeways in 1973. Another segment of freeway opened in 1975 that bypassed Port Washington and connected the freeway sections that ended near Grafton and Cedar Grove. I-43 was first designated on the 1978 official state highway map along US 141 from Milwaukee to Sheboygan; missing segments of I-43 between Green Bay and Milwaukee are shown as either under construction or proposed. In November of that year, the nine-mile (14 km) section bypassing Maribel opened. In October 1980, the 33-mile (53 km) segment of freeway between Sheboygan and Denmark opened. At the same time, the northern bypass of Green Bay was under construction and I-43/US 141 was open from Maribel to Branch northwest of Manitowoc; US 141 was truncated to end at the northern end of the Sheboygan bypass. I-43 was initially completed in 1981, and the southern terminus of US 141 was moved again, truncating the highway to end in Bellevue by 1983.
In 1986, the states in the Great Lakes region created the LMCT as part of a larger program of tourist routes in the region; US 141 carries the LMCT between the northern I-43 junction in the Green Bay area north to the split with US 41 at Abrams. In the first years of the 21st century, US 141 was expanded to a four-lane expressway northward from Abrams to Oconto Falls. A further upgrade in 2006 expanded the highway to four-lanes northward to Beaver. On April 7, 2015, the segment of US 141 that runs concurrently with US 41 on the west side of Green Bay was designated a part of I-41 by the Federal Highway Administration.
## Major intersections
## Business routes
Three business routes of US 141 have existed. Two of these routes have been decommissioned.
### Coleman–Pound
Business U.S. Highway 141 (Bus. US 141) is a business loop of US 141 that runs through the communities of Coleman and Pound. The loop follows County Trunk Highway B (CTH-B) northeasterly from the US 141 expressway into downtown Coleman and then turns northward near Coleman High School. Bus. US 141 continues northward into Pound, crossing the Peshtigo River in between the two communities. North of Pound, the loop crosses over US 141 on 21st Road and continues to an intersection with WIS 64. The business loop follows WIS 64 back to an interchange on US 141 northwest of Pound where the loop terminates.
In 2006, the US 141 expressway was extended northward near Beaver, and the former route of US 141, plus a connector roadway southwest of downtown Coleman, was designated as a business loop. The route does not appear on official Wisconsin Department of Transportation maps; it is a locally designated business loop under local maintenance.
### Manitowoc
Business U.S. Highway 141 (Bus. US 141) was a business loop of US 141 that was signed along US 151 and US 10 through downtown Manitowoc. The route was created in 1975, and decommissioned in 1980.
### Sheboygan
Business U.S. Highway 141 (Bus. US 141) was a business loop of US 141 that served downtown Sheboygan. The route was created when an expressway bypass of Sheboygan was finished in the early 1970s, and replaced with a business loop of WIS 42 when I-43 was finished and US 141 was truncated in 1980.
# Richard Barre
Richard Barre (c. 1130 – c. 1202) was a medieval English justice, clergyman and scholar. He was educated at the law school of Bologna and entered royal service under King Henry II of England, later working for Henry's son and successor Richard I. He was also briefly in the household of Henry's son Henry the Young King. Barre served the elder Henry as a diplomat and was involved in a minor way with the king's quarrel with Thomas Becket, which earned Barre a condemnation from Becket. After King Henry's death, Barre became a royal justice during Richard's reign and was one of the main judges in the period from 1194 to 1199. Having earlier incurred the hostility of Richard's brother John, Barre was discharged from his judgeship when John became king. Barre was also Archdeacon of Ely and the author of a work of biblical extracts dedicated to one of his patrons, William Longchamp, the Bishop of Ely and Chancellor of England.
## Early life
Whether Barre was a native of England or of Normandy is unknown, but his surname appears to derive from the Norman village of La Barre, near Bernay, in the present-day department of Eure. He was likely born around 1130 and was related to Normandy's Sifrewast family, knights in Berkshire. Barre had a relative, Hugh Barre, who was Archdeacon of Leicester in the 1150s. Barre studied law at Bologna in Italy before 1150 and was a student there with Stephen of Tournai, who became Bishop of Tournai in 1192. Another fellow student wrote a short verse addressed to Barre: "Pontificum causas regumque negocia tractes, Qui tibi divicias deliciasque parant", which translates to "May you manage the causes of bishops and the affairs of kings, Who provide riches and delights for you." After finishing his schooling, Barre seems to have worked for either Robert de Chesney, the Bishop of Lincoln, or Nicholas, Archdeacon of Huntingdon; the main evidence for this is that Barre witnessed charters for both men from 1160 to 1164. By 1165, Barre had joined the household of King Henry II of England.
## Service to King Henry
Barre served King Henry during the king's quarrel with Thomas Becket, the Archbishop of Canterbury, who had gone into exile in 1164 over the dispute about the limits of royal authority over the English Church. Because of Barre's close ties to King Henry, Becket considered him one of the king's "evil counselors", and Barre was the subject of denunciations by the archbishop. In late August 1169, Barre was in Normandy with Henry, where Barre was part of a group of ecclesiastics advising the king on how to resolve the Becket dispute. In September 1169, Barre was sent along with two other clerks to Rome to complain about the behaviour of papal envoys during negotiations with Becket held at the beginning of September. The papal negotiators at first agreed to a compromise, but the next day claimed that the proposal was unacceptable. With the failure of the negotiations, Becket restored the sentences of excommunication on a number of royal officials, but Barre was not included among those specifically named even though many of his colleagues were. The historian Frank Barlow argues that Barre was not specifically named in the restoration of excommunications, as Becket considered him already excommunicated because of his association with those under the church's ban.
During January and February 1170 the king sent Barre on a diplomatic mission to the pope in Rome, on a matter related to the king's dispute with Becket. The mission attempted to secure the rescinding of the excommunication of those whom Becket had placed under clerical ban, but it was unsuccessful; rumours circulated that the mission sought and secured papal permission for the coronation of King Henry's eldest living son by someone other than Becket. When Becket protested to Pope Alexander III over this usurpation of the right of the archbishop to crown English kings, Alexander not only stated that no such permission had been granted but threatened to suspend or depose any bishop who crowned Henry's heir. Barlow thinks it possible that Barre received a verbal agreement from the pope in January to allow the coronation, but there is no written evidence that Alexander agreed to allow the coronation in 1170.
After Becket's murder in December 1170 King Henry sent Barre to Rome, accompanied by the Archbishop of Rouen, the bishops of Évreux and Worcester, and other royal clerks, to plead the royal case with the papacy. The mission's objective was to make it clear to Alexander that Henry had had nothing to do with Becket's murder and that the king was horrified that it had taken place. Barre was at first refused a meeting with Alexander, but eventually the envoys were allowed to meet with the pope. Although the mission was not a complete success, the royal commission did manage to persuade the papacy not to impose an interdict, or ban on clerical rites, on England or to excommunicate the king. Shortly afterwards Barre was granted the office of Archdeacon of Lisieux, probably as a reward for his efforts in Rome in 1171. In September he was named a royal justice. He was named chancellor to King Henry's eldest living son Henry for a brief period in 1172 and 1173, but when the younger Henry rebelled against his father and sought refuge at the French royal court, Barre refused to join him in exile and returned to the king's service. Barre took with him the younger Henry's seal.
In addition to the Lisieux archdeaconry, Barre held the prebend of Hurstborne and Burbage in the Diocese of Salisbury from 1177 and the prebend of Moreton and Whaddon in the Diocese of Hereford from 1180 through 1184. He continued to hold the archdeaconry at Lisieux until 1188, and was at Lisieux for most of the late 1170s and 1180s. In 1179 he was at Rouen for the display of the body of Saint Romanus and was one of the witnesses to the event. While holding his Norman archdeaconry, he gave land to the abbey of St-Pierre-sur-Dives along with Ralph, Bishop of Lisieux. In February or March 1188, King Henry sent Barre on a diplomatic mission to the continent with letters to Frederick Barbarossa, the German Emperor; Béla III, the King of Hungary; and Isaac II Angelos, the Emperor at Constantinople, seeking assistance for his projected crusade. The letters requested passage through their lands and the right to procure supplies. Nothing came of this mission, as Henry died in 1189 before the crusade could set off.
## Later years and death
After the death of King Henry, Barre joined the service of William Longchamp, the Bishop of Ely, who was justiciar and Lord Chancellor. Longchamp named Barre as Archdeacon of Ely, with the appointment occurring before 4 July 1190. Longchamp sent Barre as a royal justice to the counties near Ely in 1190. However, Longchamp was driven into exile in late 1191 owing to the hostility of the English nobility and of King Richard I's brother, Prince John, during Richard's absence on the Third Crusade. Longchamp's exile meant that Barre did not serve as a royal justice again until Richard returned to England in 1194. Although Longchamp eventually returned to England, he did not return to his diocese, and much of the administration of Ely would have devolved on Barre during Longchamp's absence.
Barre was one of the main royal justices between 1194 and 1199. He also served as a lawyer for the new Bishop of Ely, Eustace, who was elected in August 1197. But Barre had incurred the hostility of the king's younger brother Prince John, and when John succeeded Richard as king in 1199, Barre ceased to be employed as a royal justice, instead returning to Ely and business in his clerical office. His last sure mention in the historical record is on 9 August 1202, when he was serving as a judge-delegate for Pope Innocent III, but he may have been alive as late as 1213, as he was part of a papal panel deciding a case that can only be securely dated to between 1198 and 1213. Barre maintained his friendship with Stephen of Tournai, who corresponded with him later in their lives.
## Literary work
Barre wrote a work on the Bible entitled Compendium de veteri et novo testamento, which he dedicated to Longchamp. The work arranged passages from the Bible under topics, and then annotated the passages with marginal notations of the kind used in glosses on Roman law. It is still extant in two manuscript (MS) copies: British Library MS Harley 3255 and Lambeth Palace MS 105. The Harley manuscript is shorter than the Lambeth manuscript. Richard Sharpe, a modern historian who studied both manuscripts, stated that the Harley manuscript "provides [a] well structured and systematic (though not complete) coverage of the whole Bible." Because of the dedication to William Longchamp as "bishop, legate, and chancellor", it is likely that the work was composed between January 1190 and October 1191, as Longchamp only held those three offices together during that period. The prologue describes the work as something to be used privately, and thus Sharpe feels that it was not intended for public circulation; instead Barre may have intended it for Longchamp's private use in preparing sermons.
A third copy of Barre's Compendium may have existed at Leicester Abbey, where a late-15th-century library catalogue records a work by Barre on the Bible that the catalogue titles "Compendium Ricardi Barre super utroque testamento". The title and contents make this manuscript likely to be a copy of the Compendium. The same catalogue also records five books once owned by Barre—copies of Gratian's Decretum, Justinian's Codex, glossed copies of the Psalter and some of the Epistles of Paul, as well as Peter Lombard's Sentences. Also, another Leicester Abbey manuscript records some satirical verses that were said to have been written by Barre.
# Banksia speciosa
Banksia speciosa, commonly known as the showy banksia, is a species of large shrub or small tree in the family Proteaceae. It is found on the south coast of Western Australia between Hopetoun (34° S 120° E) and Point Culver (33° S 124° E), growing on white or grey sand in shrubland. Reaching up to 8 m (26 ft) in height, it is a single-stemmed plant with thin leaves, 20–45 cm (7.9–17.7 in) long and 2–4 cm (0.8–1.6 in) wide, that bear prominent triangular 'teeth' along each margin. The prominent cream-yellow flower spikes known as inflorescences appear throughout the year. As they age they develop up to 20 follicles each that store seeds until opened by fire. Though widely occurring, the species is highly sensitive to dieback, and large populations of plants have succumbed to the disease.
Collected and described by Robert Brown in the early 19th century, B. speciosa is classified in the series Banksia within the genus. Its closest relative is B. baxteri. B. speciosa plants are killed by bushfire, and regenerate from seed. The flowers attract nectar- and insect-feeding birds, particularly honeyeaters, and a variety of insects. In cultivation, B. speciosa grows well in a sunny location on well-drained soil in areas with dry summers. It cannot be grown in areas with humid summers, though it has been grafted onto Banksia serrata or B. integrifolia.
## Description
B. speciosa grows as a shrub or small tree anywhere from 1 to 6 or rarely 8 m (4–26 ft) high. It has an open many-branched habit, arising from a single stem or trunk with smooth grey bark. Unlike many banksias, it does not have a lignotuber. The plant puts on new growth, which is covered in rusty-coloured fur, in summer. The long thin leaves are linear, 20–45 cm (8–17.5 in) long and 2–4 cm (0.8–1.6 in) wide. They are bordered with 20 to 42 prominent triangular lobes that have a zigzag pattern. The lobes are 1–2 cm (0.4–0.8 in) long and 1–2.5 cm (0.4–1.0 in) wide, while the V-shaped sinuses between them intrude almost to the midrib of the leaf. The leaf margins are slightly recurved. On the underside of each lobe, there are 3–10 nerves converging on the lobe apex. The midrib is raised on the leaf undersurface; it is covered with white hair when new but brownish hair when mature.
The cream to yellow flower spikes, known as inflorescences, can appear at any time of year. They arise on the ends of one- or two-year-old stems and are roughly cylindrical in shape with a domed apex, measuring 4–12 cm (1.6–4.7 in) high and 9–10 cm (3.5–3.9 in) wide at anthesis. Each is a compound flowering structure, with a large number of individual flowers arising out of a central woody axis. A field study on the southern sandplains revealed an average of 1369 ± 79 flowers on each spike. The perianth is grey-cream in bud, maturing to a more yellow or cream shade. The style is cream and the tip of the pollen-presenter maroon. Ageing spikes are grey, with old flowers remaining on them, and develop up to 20 large red follicles each. Roughly oval and jutting out prominently from the spike, each follicle is 3.5–5 cm (1.4–2.0 in) long by 2–3 cm (0.8–1 in) wide and 2–3 cm (0.8–1 in) high and is covered in dense fur, red-brown initially before aging to grey. It remains closed until opened by bushfire, and contains one or two viable seeds.
The seed is 3.7–4.5 cm (1.5–1.8 in) long and fairly flattened, and is composed of the seed body proper, measuring 1–1.4 cm (0.4–0.6 in) long and 0.9–1.2 cm (0.4–0.5 in) wide, and a papery wing. One side, termed the outer surface, is grey and the other is dark brown; on this side the seed body protrudes and is covered with tiny filaments. The seeds are separated by a dark brown seed separator that is roughly the same shape as the seeds with a depression where the seed body sits adjacent to it in the follicle. It measures 3.7–4.5 cm (1.5–1.8 in) long and 2–2.5 cm (0.8–1.0 in) wide. The dull green cotyledons of seedlings are wider than they are long, measuring 1.4–1.5 cm (0.55–0.59 in) across and 1.2–1.3 cm (0.47–0.51 in) long, described by Alex George as "broadly obovate". Each cotyledon has a 2 mm (0.08 in) auricle at its base and has three faint nerve-like markings on its lower half. The hypocotyl is smooth and red. The seedling leaves emerge in an opposite arrangement and are deeply serrated into three triangular lobes on each side. The seedling stem is covered in white hair.
A variant from the Gibson area has an upright habit and leaves. Otherwise, B. speciosa shows little variation across its range. Combined with its vigour and prominence in its habitat, this has led George to speculate that it is a recent development among its relatives.
Banksia baxteri resembles B. speciosa and co-occurs with it at the western edge of B. speciosa's range, but has shorter, wider leaves with larger lobes and shorter flower spikes, and is a smaller, more open shrub.
## Taxonomy
The first botanical collector of this species may well have been Claude Riche, naturalist to Bruni d'Entrecasteaux's 1791 expedition in search of the lost ships of Jean-François de Galaup, comte de La Pérouse. During a visit to Esperance Bay, Riche explored an area in which B. speciosa is extremely common. However, he got lost and was forced to abandon his collections. The species was eventually collected by Robert Brown in 1802, and published by him in 1810. Alex George selected an 1802 specimen collected at Lucky Bay to be the lectotype in 1981. An early common name was handsome banksia. Common names include showy banksia and ricrac banksia, from the zigzag shape of its long thin leaves.
Robert Brown recorded 31 species of Banksia in his 1810 work Prodromus Florae Novae Hollandiae et Insulae Van Diemen, and in his taxonomic arrangement, placed the taxon in the subgenus Banksia verae, the "True Banksias", because the inflorescence is a typical Banksia flower spike. By the time Carl Meissner published his 1856 arrangement of the genus, there were 58 described Banksia species. Meissner divided Brown's Banksia verae, which had been renamed Eubanksia by Stephan Endlicher in 1847, into four series based on leaf properties. He placed B. speciosa in the series Dryandroideae.
George Bentham published a thorough revision of Banksia in his landmark publication Flora Australiensis in 1870. In Bentham's arrangement, the number of recognised Banksia species was reduced from 60 to 46. Bentham defined four sections based on leaf, style and pollen-presenter characters. B. speciosa was placed in section Orthostylis.
In 1891, German botanist Otto Kuntze challenged the generic name Banksia L.f., on the grounds that the name Banksia had previously been published in 1775 as Banksia J.R.Forst & G.Forst, referring to the genus now known as Pimelea. Kuntze proposed Sirmuellera as an alternative, republishing B. speciosa as Sirmuellera speciosa. The challenge failed, and Banksia L.f. was formally conserved.
### Current placement
Alex George published a new taxonomic arrangement of Banksia in his classic 1981 monograph The genus Banksia L.f. (Proteaceae). Endlicher's Eubanksia became B. subg. Banksia, and was divided into three sections. B. speciosa was placed in B. sect. Banksia, and this was further divided into nine series, with B. speciosa placed in B. ser. Banksia. He thought its closest relative was clearly Banksia baxteri based on their similar appearance, noting the two overlapped in their distribution.
Kevin Thiele and Pauline Ladiges published a new arrangement for the genus in 1996; their morphological cladistic analysis yielded a cladogram significantly different from George's arrangement. Thiele and Ladiges' arrangement retained B. speciosa in series Banksia, placing it in B. subser. Cratistylis along with B. baxteri as its sister taxon and seven other Western Australian species. This arrangement stood until 1999, when George effectively reverted to his 1981 arrangement in his monograph for the Flora of Australia series. B. speciosa's placement within Banksia according to Flora of Australia is as follows:
- Genus Banksia
: Subgenus Banksia
: : Section Banksia
: : : Series Banksia
: : : : B. serrata
: : : : B. aemula
: : : : B. ornata
: : : : B. baxteri
: : : : B. speciosa
: : : : B. menziesii
: : : : B. candolleana
: : : : B. sceptrum
In 2002, a molecular study by Austin Mast again showed B. speciosa and B. baxteri to be each other's closest relatives, but they were only distantly related to other members of the series Banksia. Instead, their next closest relative turned out to be the distinctive Banksia coccinea.
Mast, Eric Jones and Shawn Havery published the results of their cladistic analyses of DNA sequence data for Banksia in 2005. They inferred a phylogeny greatly different from the accepted taxonomic arrangement, including finding Banksia to be paraphyletic with respect to Dryandra. A new taxonomic arrangement was not published at the time, but early in 2007 Mast and Thiele initiated a rearrangement by transferring Dryandra to Banksia, and publishing B. subg. Spathulatae for the species having spoon-shaped cotyledons; in this way they also redefined the autonym B. subg. Banksia. They foreshadowed publishing a full arrangement once DNA sampling of Dryandra was complete. In the meantime, if Mast and Thiele's nomenclatural changes are taken as an interim arrangement, then B. speciosa is placed in B. subg. Banksia.
## Distribution and habitat
B. speciosa occurs on coastal dunes and sandplains in the Esperance Plains and Mallee biogeographic regions on the south coast of Western Australia, from East Mount Barren in the Fitzgerald River National Park and the vicinity of Hopetoun eastwards to Israelite Bay, generally within 50 km (30 mi) of the coast. The range extends inland to Mount Ragged and 25 km (16 mi) southwest of Grass Patch. There is an outlying population to the east at Point Culver on the Great Australian Bight.
B. speciosa grows on flat or gently sloping ground on deep white or grey sand. It is often the dominant shrub in shrubland, commonly found with such species as Lambertia inermis, Banksia pulchella, and B. petiolaris.
## Ecology
The prominent flower spikes are visited by many birds and insects. Honeyeaters are common visitors, particularly the New Holland honeyeater, as well as the fuscous honeyeater, western wattlebird and western spinebill. Other birds recorded foraging include the grey butcherbird and species of thornbill. Insects recorded include ants, bees, wasps, butterflies, moths, flies and beetles. The short-billed black cockatoo breaks off old cones with follicles to eat the seed, often doing so before the seed is ripe.
B. speciosa is serotinous, that is, it has an aerial seed bank in its canopy in the form of the follicles of the old flower spikes. These are opened by fire and release seed in large numbers, which germinate and grow after rain. Seed can last for many years; spikes 11 to 12 years old have been found to hold 50% viable seed. Flower spikes appear to have similar numbers of follicles regardless of the age of the parent plant. Young plants begin flowering three years after regenerating from bushfire and store progressively larger numbers of old flowerheads (and hence seed) in the canopy. In one study, decade-old plants averaged around 3.5 old cones, whereas 21-year-old plants had 105, and were calculated as having over 900 viable seeds per plant. Plants appear to have a life span of at least 40 years, as healthy and vigorous individuals of this age are known. An experimental burn and monitoring of resultant seedling germination and growth showed that B. speciosa seeds, though numerous, had poor rates of establishment, but that seedlings were able to access water more easily and had higher rates of survival after two years than co-occurring Banksia species. Though this suggested B. speciosa might outcompete these co-occurring species, the authors of the study noted that there could be other factors not accounted for in its natural environment.
B. speciosa is extremely sensitive to dieback caused by Phytophthora cinnamomi and numbers in Cape Le Grand and Cape Arid National Parks have been drastically reduced as whole populations of plants have perished after exposure. It is an indicator species for the presence of the disease. Nursery plants in Italy perished from root and basal stem rot from the pathogen Phytophthora taxon niederhauserii.
The tiny sac fungus Phyllachora banksiae subspecies westraliensis has been described from the leaves of B. speciosa, its sole host. This fungus manifests as round flat cream-coloured spots around 1–3 mm in diameter on the upper leaf surface. The surrounding leaf tissue is sometimes discoloured orange. One or two shiny black fruit bodies measuring around 0.25–0.75 by 0.25–1 mm appear in the centre of the spots.
## Cultivation
A fast-growing and attractive plant, B. speciosa grows readily in a sunny location in dry climates on well-drained soil, but does poorly in areas of humid summer climate, such as Australia's east coast. It has been grafted successfully onto Banksia serrata and B. integrifolia to enable cultivation in these areas. Seeds do not require any treatment, and take 27 to 41 days to germinate. A specimen flowered in a greenhouse in the Royal Botanic Garden Edinburgh in 1830. B. speciosa is an important cut flower crop. It was one of several species considered for commercial cropping in Tenerife, and trials showed that seedlings were moderately tolerant to salinity.
# Fort Vancouver Centennial half dollar
The Fort Vancouver Centennial half dollar, sometimes called the Fort Vancouver half dollar, is a commemorative fifty-cent piece struck by the United States Bureau of the Mint in 1925. The coin was designed by Laura Gardin Fraser. Its obverse depicts John McLoughlin, who was in charge of Fort Vancouver (present-day Vancouver, Washington) from its construction in 1825 until 1846. From there, he effectively ruled the Oregon Country on behalf of the Hudson's Bay Company. The reverse shows an armed frontiersman standing in front of the fort.
Washington Representative Albert Johnson wanted a coin for Fort Vancouver's centennial celebrations, but was persuaded to accept a medal instead. When another congressman succeeded in amending a coinage bill to add a commemorative, however, Johnson tacked on language authorizing a coin for Fort Vancouver. The Senate agreed to the changes, and President Calvin Coolidge signed the authorizing act on February 24, 1925.
Fraser was engaged to design the coin on the recommendation of the United States Commission of Fine Arts. The coins were flown from the San Francisco Mint, where they were struck, to Washington state by airplane as a publicity stunt. They sold badly; much of the issue was returned for redemption and melting, and the failure may have been a factor in one official's suicide. Due to the low number of surviving pieces, the coins are valuable today.
## Background
Fort Vancouver, on the north bank of the Columbia River in what is today Vancouver, Washington, lay across the river from what would become Portland, Oregon. It was founded in 1825 by the Hudson's Bay Company chief factor for the area, Dr. John McLoughlin. The company sought furs and other trade goods, and was in competition with John Jacob Astor's Pacific Fur Company, which had an outpost at what is now Astoria, Oregon. Fort Vancouver was named for the British sea captain George Vancouver, who also gave his name to Vancouver in Canada.
Until the Oregon Treaty of 1846 settled the disputed claims of the United States and Britain, McLoughlin was what government there was in the Oregon Country. McLoughlin's word was obeyed by white man and Native American alike, and there were no significant wars there in that time. Fort Vancouver became the trading center for a large area, and the largest settlement west of the Great Plains. With the coming of American rule in 1846, McLoughlin resigned from the Hudson's Bay Company, going to live at Oregon City, which he had founded, and became its mayor in 1851, two years after becoming a U.S. citizen. He died in 1857; a century later, the Oregon Legislature named him the "Founder of Oregon", and Fort Vancouver is now a national historic site.
## Legislation
The Fort Vancouver Centennial Corporation hoped to sell commemorative half dollars at the planned celebration, and persuaded Representative Albert Johnson of Washington state to introduce legislation in the House of Representatives. In May 1924, he and Senator Wesley Jones, also of Washington state, introduced legislation in their houses of Congress for a half dollar commemorating the centennial of Fort Vancouver. The bills were not given any hearings. Indiana Representative Albert Vestal, the chairman of the House Committee on Coinage, Weights, and Measures, met with Johnson and persuaded him to introduce a bill for a medal instead. Vestal reasoned that the Treasury Department was opposing more commemorative coin issues, as these were finding their way into circulation and confusing the public. On February 3, 1925, Jones introduced a bill for a medal, and on the 12th, Johnson did the same.
Legislation for a Vermont Sesquicentennial half dollar had been introduced by that state's senior senator, Frank Greene, and had passed the Senate. When that bill came to the floor of the House of Representatives on February 16, California Representative John E. Raker moved to amend it to provide for a California Diamond Jubilee half dollar. Vestal asked to be heard in opposition to the amendment, stating that his committee, after recommending the Vermont bill, had decided to promote no further coin bills. He added that because of this, Johnson had agreed to withdraw his bill. The Minority Leader, Democratic Congressman Finis J. Garrett of Tennessee, asked why the committee had not set the rule before considering the Vermont bill, and Vestal admitted it was hard to answer. The House voted, and the amendment was added. Johnson—to applause from his colleagues—moved a further amendment, to add "and Vancouver, Wash." The amendment passed, as did the bill.
Johnson realized that such a simple amendment might not result in a coin being issued. He therefore returned to the House floor soon thereafter, asking that the bill be reconsidered, so he could couch his amendment in the same phrasing as for the other two coins. Once the bill was again being considered, Johnson added his amendment, but Vestal moved that the bill be returned to his committee. Vestal's motion failed, 24 ayes to 67 noes. Lengthy procedural wrangling followed over whether that vote could be objected to because there was no quorum present. Once that was resolved, the House passed the bill again. The bill was returned to the Senate the following day. Kansas's Charles Curtis moved on behalf of Greene that the Senate agree to the House amendments, and though Treasury Secretary Andrew W. Mellon urged President Calvin Coolidge to veto it, the bill, authorizing all three coins, was enacted by the President's signature on February 24, 1925.
## Preparation
Once the coin had been approved by Congress, the Centennial Corporation submitted plaster models by an unknown artist, whose initials (SB) appeared on the obverse. They were sent to the Commission of Fine Arts, charged by a 1921 executive order by President Warren G. Harding with rendering advisory opinions regarding public artworks, including coins. The models showed McLoughlin on the obverse and the fort stockade with Mount Hood in the background for the reverse. These designs were likely dictated by the Centennial Corporation. On May 22, the Commission rejected the models, describing them as "interesting" but stating that an experienced medalist would be needed. It recommended Chester Beach, but when the corporation tried to hire him, it turned out he was traveling. The corporation instead hired the commission's second choice, Laura Gardin Fraser, an experienced designer of commemorative coins.
Since the Centennial Corporation had decided what design elements it wanted to see on the half dollar, Fraser had to do her own interpretation of the designs SB had essayed. Hired on June 15, she completed her models by July 1, when Louis Ayres, a member of the commission, came to view them. He was enthusiastic, and sent a letter to commission chairman Charles Moore to that effect, writing "the whole coin looks very interesting to me, and I think is mighty good." The models were approved by the commission, and then by Mellon. Dies were prepared at the Philadelphia Mint, then shipped to San Francisco, where the coins were to be struck.
## Design
The obverse features a portrait of McLoughlin, facing left. The name of his adopted country overarches him, and his name and 'HALF DOLLAR' are below him, with the centennial dates and 'IN GOD WE TRUST' flanking his bust. Fraser had no likenesses of McLoughlin to work with, and what she based her portrait of him on is unclear. It shows him as an older man than the 41 years he was at the time of Fort Vancouver's founding. The reverse shows an armed frontiersman, dressed in buckskins, with the stockade of Fort Vancouver behind him, and Mount Hood in the distance. The inscription is somewhat broken up, but is intended to be read as 'FORT VANCOUVER CENTENNIAL VANCOUVER WASHINGTON FOUNDED 1825 BY HUDSON'S BAY COMPANY'. Numismatists have debated whether the absence of a mint mark was intentional; it is the only commemorative coin issue struck at Denver or San Francisco that lacks one. The artist's initials, 'LGF', are at lower right on the reverse, on the other side of the circle from the date '1825'.
Anthony Swiatek and Walter Breen, in their 1988 book on commemorative coins, describe Fraser's design as "better than anything [Chester] Beach could have come up with". Cornelius Vermeule, in his volume on the artistry of U.S. coins and medals, deemed Fraser's half dollar "a most acceptable coin". He wrote, "the obverse tries Pisanello's spacing of the lettering and circumscribed roughness of the bust, while the reverse has too much scenery in the background, surrounded by too much lettering. This and the Hawaiian Sesquicentennial coin of 1928 prove that background scenery or geography ought to be omitted from commemorative half dollars".
## Production, distribution, and collecting
Only 50,000 of the authorized mintage of 300,000 were coined, plus 28 pieces intended to be sent to Philadelphia to be available for inspection and testing at the 1926 meeting of the annual Assay Commission. The minting was done not later than August 1 at San Francisco. As a publicity stunt, the entire mintage (less the 28 assay coins) was flown by air to Vancouver, Washington, by United States Army Air Corps Lieutenant Oakley G. Kelly on August 1; the shipment, including packaging, weighed 1,462 pounds (663 kg). On arrival, the coins were received by Herbert Campbell, head of the centennial commission.
The half dollars were intended to help pay for the centennial festivities in Vancouver. These were held from August 17 to 23, with a highlight being a pageant, "The Coming of the White Man", which was "based on historical fact". The coins were sold at $1 each; several hundred were gilded, diminishing their future value as numismatic specimens; others were kept as pocket pieces, or were spent.
The poor sales caused financial problems and may have contributed to a suicide: on August 22, Charles A. Watts, secretary of the Centennial Corporation and described by Campbell as the real force behind the coin, killed himself. The day before he died, he told a meeting of the corporation that there were funds enough to pay all debts and that Fraser was not owed any money. Neither proved to be the case: unpaid bills totaled $6,000, with no money to pay them. Fraser's fee of $1,200 was among the debts, and though she was willing to be paid even in half dollars, she did not receive her money until a year later, when she was paid by check. The half dollars were not owned by the corporation, as the Vancouver National Bank had advanced money for them. Sales came to a virtual halt by the end of October. Texas coin dealer B. Max Mehl offered to buy the remainder of the issue at face value, but this was rejected as many people had paid $1 for their coins. A total of 35,034 pieces were sent back to the mint for redemption and melting, leaving 14,966 pieces outstanding. According to Swiatek and Breen, "given the remoteness and exclusively local nature of the celebration, it is surprising that as many as fourteen thousand coins were sold."
A sale of 1,000 coins was made to an executive of the Hudson's Bay Company, and they were placed in the Archives of Manitoba in Winnipeg, Canada. They were stolen in 1982 by a caretaker, who spent them and redeemed some for Canadian currency at a bank. Many wound up in the hands of a coin dealer, who sold them widely. At the time, the coins were worth about US$800 each. Once the theft was realized, the Province of Manitoba filed suit to recover the remaining coins, but a settlement allowed the dealer to retain them.
The coins quickly commanded a premium after their 1925 issue due to their scarcity; in uncirculated condition they rose to $10 by 1928 before falling back to $7 by 1930. They reached about $9 during the commemorative coin boom of 1936. They had subsided back to the $6 level by 1940, but thereafter increased steadily in value, rising to $1,600 during the second commemorative coin boom in 1980. The edition of R. S. Yeoman's A Guide Book of United States Coins published in 2017 lists the coin at between $300 and $975, depending on condition. A near-pristine specimen sold at auction in 2014 for $8,225.
# Andrew Johnson
Andrew Johnson (December 29, 1808 – July 31, 1875) was the 17th president of the United States, serving from 1865 to 1869. He assumed the presidency following the assassination of Abraham Lincoln, under whom he was serving as vice president. Johnson was a Democrat who ran with Lincoln on the National Union Party ticket, coming to office as the Civil War concluded. He favored quick restoration of the seceded states to the Union without protection for the newly freed people, and he pardoned many ex-Confederates. This led to conflict with the Republican-dominated Congress, culminating in his impeachment by the House of Representatives in 1868. He was acquitted in the Senate by one vote.
Johnson was born into poverty and never attended school. He was apprenticed as a tailor and worked in several frontier towns before settling in Greeneville, Tennessee, where he served as an alderman and mayor before being elected to the Tennessee House of Representatives in 1835. After briefly serving in the Tennessee Senate, Johnson was elected to the House of Representatives in 1843, where he served five two-year terms. He became governor of Tennessee for four years, and was elected by the legislature to the Senate in 1857. During his congressional service, he sought passage of the Homestead Bill, which was enacted soon after he left his Senate seat in 1862. Southern slave states, including Tennessee, seceded to form the Confederate States of America, but Johnson remained firmly with the Union; he was the only sitting senator from a Confederate state who did not resign his seat upon learning of his state's secession. In 1862, Lincoln appointed him as Military Governor of Tennessee after most of the state had been retaken. In 1864, Johnson was a logical choice as running mate for Lincoln, who wished to send a message of national unity in his re-election campaign; their ticket won, and Johnson became vice president in March 1865.
Johnson implemented his own form of Presidential Reconstruction, a series of proclamations directing the seceded states to hold conventions and elections to reform their civil governments. Southern states returned many of their old leaders and passed Black Codes to deprive the freedmen of many civil liberties, but Congressional Republicans refused to seat legislators from those states and advanced legislation to overrule the Southern actions. Johnson vetoed their bills, and Congressional Republicans overrode him, setting a pattern for the remainder of his presidency. Johnson opposed the Fourteenth Amendment which gave citizenship to former slaves. In 1866, he went on an unprecedented national tour promoting his executive policies, seeking to break Republican opposition. As the conflict grew between the branches of government, Congress passed the Tenure of Office Act restricting Johnson's ability to fire Cabinet officials. He persisted in trying to dismiss Secretary of War Edwin Stanton, but ended up being impeached by the House of Representatives and narrowly avoided conviction in the Senate. He did not win the 1868 Democratic presidential nomination and left office the following year.
Johnson returned to Tennessee after his presidency and gained some vindication when he was elected to the Senate in 1875, making him the only president to afterwards serve in the Senate. He died five months into his term. Johnson's strong opposition to federally guaranteed rights for black Americans is widely criticized. Historians have consistently ranked him one of the worst presidents in American history.
## Early life and career
### Childhood
Andrew Johnson was born in Raleigh, North Carolina, on December 29, 1808, to Jacob Johnson (1778–1812) and Mary ("Polly") McDonough (1783–1856), a laundress. He was of English, Scots-Irish, and Scottish ancestry. He had a brother William, four years his senior, and an older sister Elizabeth, who died in childhood. Johnson's birth in a two-room shack was a political asset in the mid-19th century, and he frequently reminded voters of his humble origins. Jacob Johnson was a poor man, as had been his father, William Johnson, but he became town constable of Raleigh before marrying and starting a family. Jacob Johnson had been a porter for the State Bank of North Carolina, appointed by William Polk, a relative of President James K. Polk. Both Jacob and Mary were illiterate, and had worked as tavern servants, while Johnson never attended school and grew up in poverty. Jacob died of an apparent heart attack while ringing the town bell, shortly after rescuing three drowning men, when his son Andrew was three. Polly Johnson worked as a washerwoman and became the sole support of her family. Her occupation was then looked down on, as it often took her into other homes unaccompanied. Since Andrew did not resemble either of his siblings, there are rumors that he may have been fathered by another man. Polly Johnson eventually remarried to a man named Turner Doughtry, who was as poor as she was.
Johnson's mother apprenticed her son William to a tailor, James Selby. Andrew also became an apprentice in Selby's shop at age ten and was legally bound to serve until his 21st birthday. Johnson lived with his mother for part of his service, and one of Selby's employees taught him rudimentary literacy skills. His education was augmented by citizens who would come to Selby's shop to read to the tailors as they worked. Even before he became an apprentice, Johnson came to listen. The readings instilled in him a lifelong love of learning, and one of his biographers, Annette Gordon-Reed, suggests that Johnson, later a gifted public speaker, learned the art as he threaded needles and cut cloth.
Johnson was not happy at James Selby's, and after about five years, both he and his brother ran away. Selby responded by placing a reward for their return: "Ten Dollars Reward. Ran away from the subscriber, two apprentice boys, legally bound, named William and Andrew Johnson ... [payment] to any person who will deliver said apprentices to me in Raleigh, or I will give the above reward for Andrew Johnson alone." The brothers went to Carthage, North Carolina, where Andrew Johnson worked as a tailor for several months. Fearing he would be arrested and returned to Raleigh, Johnson moved to Laurens, South Carolina. He found work quickly, met his first love, Mary Wood, and made her a quilt as a gift. However, she rejected his marriage proposal. He returned to Raleigh, hoping to buy out his apprenticeship, but could not come to terms with Selby. Unable to stay in Raleigh, where he risked being apprehended for abandoning Selby, he decided to move west.
### Move to Tennessee
Johnson left North Carolina for Tennessee, traveling mostly on foot. After a brief period in Knoxville, he moved to Mooresville, Alabama. He then worked as a tailor in Columbia, Tennessee, but was called back to Raleigh by his mother and stepfather, who saw limited opportunities there and who wished to emigrate west. Johnson and his party traveled through the Blue Ridge Mountains to Greeneville, Tennessee. Andrew Johnson fell in love with the town at first sight, and when he became prosperous purchased the land where he had first camped and planted a tree in commemoration.
In Greeneville, Johnson established a successful tailoring business in the front of his home. In 1827, at the age of 18, he married 16-year-old Eliza McCardle, the daughter of a local shoemaker. The pair were married by Justice of the Peace Mordecai Lincoln, first cousin of Thomas Lincoln, whose son would become president. The Johnsons were married for almost 50 years and had five children: Martha (1828), Charles (1830), Mary (1832), Robert (1834), and Andrew Jr. (1852). Though she had tuberculosis, Eliza supported her husband's endeavors. She taught him mathematics skills and tutored him to improve his writing. Shy and retiring by nature, Eliza Johnson usually remained in Greeneville during Johnson's political rise. She was not often seen during her husband's presidency; their daughter Martha usually served as official hostess.
Johnson's tailoring business prospered during the early years of the marriage, enabling him to hire help and giving him the funds to invest profitably in real estate. He later boasted of his talents as a tailor, "my work never ripped or gave way". He was a voracious reader. Books about famous orators aroused his interest in political dialogue, and he had private debates on the issues of the day with customers who held opposing views. He also took part in debates at Greeneville College.
### Johnson's slaves
In 1843, Johnson purchased his first slave, Dolly, who was 14 years old at the time. Dolly had three children—Liz, Florence and William. Soon after his purchase of Dolly, he purchased Dolly's half-brother Sam. Sam Johnson and his wife Margaret had nine children. Sam became a commissioner of the Freedmen's Bureau and was known for being a proud man who negotiated the nature of his work with the Johnson family. Notably, he received some monetary compensation for his labors and negotiated with Andrew Johnson to receive a tract of land which Andrew Johnson gave him for free in 1867.
In 1857, Andrew Johnson purchased Henry, who was 13 at the time and would later accompany the Johnson family to the White House. Ultimately, Johnson owned at least ten slaves.
Andrew Johnson freed his slaves on August 8, 1863; they remained with him as paid servants. A year later, Johnson, as military governor of Tennessee, proclaimed the freedom of Tennessee's slaves. Sam and Margaret, Johnson's former slaves, lived in his tailor shop while he was president, without rent. As a sign of appreciation for proclaiming freedom, Andrew Johnson was given a watch by newly emancipated people in Tennessee inscribed with "for his Untiring Energy in the Cause of Freedom".
## Political rise
### Tennessee politician
Johnson helped organize a Mechanics' (Working Men's) ticket in the 1829 Greeneville municipal election. He was elected town alderman, along with his friends Blackston McDannel and Mordecai Lincoln. Following Nat Turner's Rebellion in 1831, a state convention was called to pass a new constitution, including provisions to disenfranchise free people of color. The convention also wanted to reform real estate tax rates, and provide ways of funding improvements to Tennessee's infrastructure. The constitution was submitted for a public vote, and Johnson spoke widely for its adoption; the successful campaign provided him with statewide exposure. On January 4, 1834, his fellow aldermen elected him mayor of Greeneville.
In 1835, Johnson made a bid for election to the "floater" (open) seat which Greene County shared with neighboring Washington County in the Tennessee House of Representatives. According to his biographer, Hans L. Trefousse, Johnson "demolished" the opposition in debate and won the election by an almost two-to-one margin. During his Greeneville days, Johnson joined the Tennessee Militia as a member of the 90th Regiment. He attained the rank of colonel, though while an enrolled member he was fined for an unknown offense. Afterwards, he was often addressed or referred to by his rank.
In his first term in the legislature, which met in the state capital of Nashville, Johnson did not consistently vote with either the Democratic or the newly formed Whig Party, though he revered President Andrew Jackson, a Democrat and fellow Tennessean. The major parties were still determining their core values and policy proposals, with the party system in a state of flux. The Whig Party had organized in opposition to Jackson, fearing the concentration of power in the Executive Branch of the government; Johnson differed from the Whigs as he opposed more than minimal government spending and spoke against aid for the railroads, while his constituents hoped for improvements in transportation. After Brookins Campbell and the Whigs defeated Johnson for reelection in 1837, Johnson would not lose another race for thirty years. In 1839, he sought to regain his seat, initially as a Whig, but when another candidate sought the Whig nomination, he ran as a Democrat and was elected. From that time he supported the Democratic party and built a powerful political machine in Greene County. Johnson became a strong advocate of the Democratic Party, noted for his oratory, and in an era when public speaking both informed the public and entertained it, people flocked to hear him.
In 1840, Johnson was selected as a presidential elector for Tennessee, giving him more statewide publicity. Although Democratic President Martin Van Buren was defeated by former Ohio senator William Henry Harrison, Johnson was instrumental in keeping Greene County in the Democratic column. He was elected to the Tennessee Senate in 1841, where he served a two-year term. He had achieved financial success in his tailoring business, but sold it to concentrate on politics. He had also acquired additional real estate, including a larger home and a farm (where his mother and stepfather took residence), and among his assets numbered eight or nine slaves.
### United States Representative (1843–1853)
Having served in both houses of the state legislature, Johnson saw election to Congress as the next step in his political career. He engaged in a number of political maneuvers to gain Democratic support, including the displacement of the Whig postmaster in Greeneville, and defeated Jonesborough lawyer John A. Aiken by 5,495 votes to 4,892. In Washington, he joined a new Democratic majority in the House of Representatives. Johnson advocated for the interests of the poor, maintained an anti-abolitionist stance, argued for only limited spending by the government and opposed protective tariffs. With Eliza remaining in Greeneville, Congressman Johnson shunned social functions in favor of study in the Library of Congress. Although a fellow Tennessee Democrat, James K. Polk, was elected president in 1844, and Johnson had campaigned for him, the two men had difficult relations, and President Polk refused some of his patronage suggestions.
Johnson believed, as did many Southern Democrats, that the Constitution protected private property, including slaves, and thus prohibited the federal and state governments from abolishing slavery. He won a second term in 1845 against William G. Brownlow, presenting himself as the defender of the poor against the aristocracy. In his second term, Johnson supported the Polk administration's decision to fight the Mexican War, seen by some Northerners as an attempt to gain territory to expand slavery westward, and opposed the Wilmot Proviso, a proposal to ban slavery in any territory gained from Mexico. He introduced for the first time his Homestead Bill, to grant 160 acres (65 ha) to people willing to settle the land and gain title to it. This issue was especially important to Johnson because of his own humble beginnings.
In the presidential election of 1848, the Democrats split over the slavery issue, and abolitionists formed the Free Soil Party, with former president Van Buren as their nominee. Johnson supported the Democratic candidate, former Michigan senator Lewis Cass. With the party split, Whig nominee General Zachary Taylor was easily victorious, and carried Tennessee. Johnson's relations with Polk remained poor; the President recorded of his final New Year's reception in 1849 that
> Among the visitors I observed in the crowd today was Hon. Andrew Johnson of the Ho. Repts. [House of Representatives] Though he represents a Democratic District in Tennessee (my own State) this is the first time I have seen him during the present session of Congress. Professing to be a Democrat, he has been politically, if not personally hostile to me during my whole term. He is very vindictive and perverse in his temper and conduct. If he had the manliness and independence to declare his opposition openly, he knows he could not be elected by his constituents. I am not aware that I have ever given him cause for offense.
Johnson, due to national interest in new railroad construction and in response to the need for better transportation in his own district, also supported government assistance for the East Tennessee and Virginia Railroad.
During his campaign for a fourth term, Johnson concentrated on three issues: slavery, homesteads and judicial elections. He defeated his opponent, Nathaniel G. Taylor, in August 1849, with a greater margin of victory than in previous campaigns. When the House convened in December, the party division caused by the Free Soil Party precluded the formation of the majority needed to elect a Speaker. Johnson proposed adoption of a rule allowing election of a Speaker by a plurality; some weeks later others took up a similar proposal, and Democrat Howell Cobb was elected.
Once the Speaker election had concluded and Congress was ready to conduct legislative business, the issue of slavery took center stage. Northerners sought to admit California, a free state, to the Union. Kentucky's Henry Clay introduced in the Senate a series of resolutions, the Compromise of 1850, to admit California and pass legislation sought by each side. Johnson voted for all the provisions except for the abolition of the slave trade in the nation's capital. He pressed resolutions for constitutional amendments to provide for popular election of senators (then elected by state legislatures) and of the president (chosen by the Electoral College), and to limit the tenure of federal judges to 12 years. These were all defeated.
A group of Democrats nominated Landon Carter Haynes to oppose Johnson as he sought a fifth term; the Whigs were so pleased with the internecine battle among the Democrats in the general election that they did not nominate a candidate of their own. The campaign included fierce debates: Johnson's main issue was the passage of the Homestead Bill; Haynes contended it would facilitate abolition. Johnson won the election by more than 1,600 votes. Though he was not enamored of the party's presidential nominee in 1852, former New Hampshire senator Franklin Pierce, Johnson campaigned for him. Pierce was elected, but he failed to carry Tennessee. In 1852, Johnson managed to get the House to pass his Homestead Bill, but it failed in the Senate. The Whigs had gained control of the Tennessee legislature, and, under the leadership of Gustavus Henry, redrew the boundaries of Johnson's First District to make it a safe seat for their party. The Nashville Union termed this "Henry-mandering"; lamented Johnson, "I have no political future."
### Governor of Tennessee (1853–1857)
If Johnson considered retiring from politics upon deciding not to seek reelection, he soon changed his mind. His political friends began to maneuver to get him the nomination for governor. The Democratic convention unanimously named him, though some party members were not happy at his selection. The Whigs had won the past two gubernatorial elections, and still controlled the legislature. That party nominated Henry, making the "Henry-mandering" of the First District an immediate issue. The two men debated in county seats the length of Tennessee until, two weeks before the August 1853 election, the meetings were called off due to illness in Henry's family. Johnson won the election by 63,413 votes to 61,163; some votes for him were cast in return for his promise to support Whig Nathaniel Taylor for his old seat in Congress.
Tennessee's governor had little power: Johnson could propose legislation but not veto it, and most appointments were made by the Whig-controlled legislature. Nevertheless, the office was a "bully pulpit" that allowed him to publicize himself and his political views. He succeeded in getting the appointments he wanted in return for his endorsement of John Bell, a Whig, for one of the state's U.S. Senate seats. In his first biennial speech, Johnson urged simplification of the state judicial system, abolition of the Bank of Tennessee, and establishment of an agency to provide uniformity in weights and measures; the last was passed. Johnson was critical of the Tennessee common school system and suggested funding be increased via taxes, either statewide or county by county—a mixture of the two was passed. Reforms carried out during Johnson's time as governor included the foundation of the State's public library (making books available to all) and its first public school system, and the initiation of regular state fairs to benefit craftsmen and farmers.
Although the Whig Party was on its final decline nationally, it remained strong in Tennessee, and the outlook for Democrats there in 1855 was poor. Feeling that reelection as governor was necessary to give him a chance at the higher offices he sought, Johnson agreed to make the run. Meredith P. Gentry received the Whig nomination. A series of more than a dozen vitriolic debates ensued. The issues in the campaign were slavery, the prohibition of alcohol, and the nativist positions of the Know Nothing Party. Johnson favored the first, but opposed the others. Gentry was more equivocal on the alcohol question, and had gained the support of the Know Nothings, a group Johnson portrayed as a secret society. Johnson was unexpectedly victorious, albeit with a narrower margin than in 1853.
When the presidential election of 1856 approached, Johnson hoped to be nominated; some Tennessee county conventions designated him a "favorite son". His position that the best interests of the Union were served by slavery in some areas made him a practical compromise candidate for president. He was never a major contender; the nomination fell to former Pennsylvania senator James Buchanan. Though he was not impressed by either, Johnson campaigned for Buchanan and his running mate, John C. Breckinridge, who were elected.
Johnson decided not to seek a third term as governor, with an eye towards election to the U.S. Senate. In 1857, while he was returning from Washington, his train derailed, causing serious injury to his right arm. This injury would trouble him in the years to come.
### United States Senator
#### Homestead Bill advocate
The victors in the 1857 state legislative campaign would, once they convened in October, elect a United States Senator. Former Whig governor William B. Campbell wrote to his uncle, "The great anxiety of the Whigs is to elect a majority in the legislature so as to defeat Andrew Johnson for senator. Should the Democrats have the majority, he will certainly be their choice, and there is no man living to whom the Americans and Whigs have as much antipathy as Johnson." The governor spoke widely in the campaign, and his party won the gubernatorial race and control of the legislature. Johnson's final address as governor gave him the chance to influence his electors, and he made proposals popular among Democrats. Two days later the legislature elected him to the Senate. The opposition was appalled, with the Richmond Whig newspaper referring to him as "the vilest radical and most unscrupulous demagogue in the Union".
Johnson gained high office due to his proven record as a man popular among the small farmers and self-employed tradesmen who made up much of Tennessee's electorate. He called them the "plebeians"; he was less popular among the planters and lawyers who led the state Democratic Party, but none could match him as a vote-getter. After his death, one Tennessee voter wrote of him, "Johnson was always the same to everyone ... the honors heaped upon him did not make him forget to be kind to the humblest citizen." Always seen in impeccably tailored clothing, he cut an impressive figure, and had the stamina to endure lengthy campaigns with daily travel over bad roads leading to another speech or debate. Mostly denied the party's machinery, he relied on a network of friends, advisers, and contacts. One friend, Hugh Douglas, stated in a letter to him, "you have been in the way of our would be great men for a long time. At heart many of us never wanted you to be Governor only none of the rest of us Could have been elected at the time and we only wanted to use you. Then we did not want you to go to the Senate but the people would send you."
The new senator took his seat when Congress convened in December 1857 (the term of his predecessor, James C. Jones, had expired in March). He came to Washington as usual without his wife and family; Eliza would visit Washington only once during Johnson's first time as senator, in 1860. Johnson immediately set about introducing the Homestead Bill in the Senate, but as most senators who supported it were Northern (many associated with the newly founded Republican Party), the matter became caught up in suspicions over the slavery issue. Southern senators felt that those who took advantage of the provisions of the Homestead Bill were more likely to be Northern non-slaveholders. The issue of slavery had been complicated by the Supreme Court's ruling earlier in the year in Dred Scott v. Sandford that slavery could not be prohibited in the territories. Johnson, a slaveholding senator from a Southern state, made a major speech in the Senate the following May in an attempt to convince his colleagues that the Homestead Bill and slavery were not incompatible. Nevertheless, Southern opposition was key to defeating the legislation, 30–22. In 1859, it failed on a procedural vote when Vice President Breckinridge broke a tie against the bill, and in 1860, a watered-down version passed both houses, only to be vetoed by Buchanan at the urging of Southerners. Johnson continued his opposition to spending, chairing a committee to control it.
He argued against funding to build infrastructure in Washington, D.C., stating that it was unfair to expect state citizens to pay for the city's streets, even if it was the seat of government. He opposed spending money for troops to put down the revolt by the Mormons in Utah Territory, arguing for temporary volunteers as the United States should not have a standing army.
#### Secession crisis
In October 1859, abolitionist John Brown and sympathizers raided the federal arsenal at Harpers Ferry, Virginia (today West Virginia). Tensions in Washington between pro- and anti-slavery forces increased greatly. Johnson gave a major speech in the Senate in December, decrying Northerners who would endanger the Union by seeking to outlaw slavery. The Tennessee senator argued that the phrase "all men are created equal" in the Declaration of Independence did not apply to African Americans, reasoning that the Constitution of Illinois contained the same phrase yet barred African Americans from voting. Johnson, by this time, was a wealthy man who owned 14 slaves.
Johnson hoped that he would be a compromise candidate for the presidential nomination as the Democratic Party tore itself apart over the slavery question. Busy with the Homestead Bill during the 1860 Democratic National Convention in Charleston, South Carolina, he sent two of his sons and his chief political adviser to represent his interests in the backroom deal-making. The convention deadlocked, with no candidate able to gain the required two-thirds vote, but the sides were too far apart to consider Johnson as a compromise. The party split, with Northerners backing Illinois Senator Stephen Douglas while Southerners, including Johnson, supported Vice President Breckinridge for president. With former Tennessee senator John Bell running a fourth-party candidacy and further dividing the vote, the Republican Party elected its first president, former Illinois representative Abraham Lincoln. The election of Lincoln, known to be against the spread of slavery, was unacceptable to many in the South. Although secession from the Union had not been an issue in the campaign, talk of it began in the Southern states.
Johnson took to the Senate floor after the election, giving a speech well received in the North, "I will not give up this government ... No; I intend to stand by it ... and I invite every man who is a patriot to ... rally around the altar of our common country ... and swear by our God, and all that is sacred and holy, that the Constitution shall be saved, and the Union preserved." As Southern senators announced they would resign if their states seceded, he reminded Mississippi Senator Jefferson Davis that if Southerners would only hold to their seats, the Democrats would control the Senate, and could defend the South's interests against any infringement by Lincoln. Gordon-Reed points out that while Johnson's belief in an indissoluble Union was sincere, he had alienated Southern leaders, including Davis, who would soon be the president of the Confederate States of America, formed by the seceding states. If the Tennessean had backed the Confederacy, he would have had small influence in its government.
Johnson returned home when his state took up the issue of secession. His successor as governor, Isham G. Harris, and the legislature organized a referendum on whether to have a constitutional convention to authorize secession; when that failed, they put the question of leaving the Union to a popular vote. Despite threats on Johnson's life, and actual assaults, he campaigned against both questions, sometimes speaking with a gun on the lectern before him. Although Johnson's eastern region of Tennessee was largely against secession, the second referendum passed, and in June 1861, Tennessee joined the Confederacy. Believing he would be killed if he stayed, Johnson fled through the Cumberland Gap, where his party was in fact shot at. He left his wife and family in Greeneville.
As the only member from a seceded state to remain in the Senate and the most prominent Southern Unionist, Johnson had Lincoln's ear in the early months of the war. With most of Tennessee in Confederate hands, Johnson spent congressional recesses in Kentucky and Ohio, trying in vain to convince any Union commander who would listen to conduct an operation into East Tennessee.
### Military Governor of Tennessee
Johnson's first tenure in the Senate came to a conclusion in March 1862 when Lincoln appointed him military governor of Tennessee. Much of the central and western portions of that seceded state had been recovered. Although some argued that civil government should simply resume once the Confederates were defeated in an area, Lincoln chose to use his power as commander in chief to appoint military governors over Union-controlled Southern regions. The Senate quickly confirmed Johnson's nomination along with the rank of brigadier general. In response, the Confederates confiscated his land and his slaves, and turned his home into a military hospital. Later in 1862, after his departure from the Senate and in the absence of most Southern legislators, the Homestead Bill was finally enacted. Along with legislation for land-grant colleges and for the transcontinental railroad, the Homestead Bill has been credited with opening the Western United States to settlement.
As military governor, Johnson sought to eliminate rebel influence in the state. He demanded loyalty oaths from public officials, and shut down all newspapers owned by Confederate sympathizers. Much of eastern Tennessee remained in Confederate hands, and the ebb and flow of war during 1862 sometimes brought Confederate control again close to Nashville. However, the Confederates allowed his wife and family to pass through the lines to join him. Johnson undertook the defense of Nashville as well as he could, though the city was continually harassed by cavalry raids led by General Nathan Bedford Forrest. Relief from Union regulars did not come until General William S. Rosecrans defeated the Confederates at Murfreesboro in early 1863. Much of eastern Tennessee was captured later that year.
When Lincoln issued the Emancipation Proclamation in January 1863, declaring freedom for all slaves in Confederate-held areas, he exempted Tennessee at Johnson's request. The proclamation increased the debate over what should become of the slaves after the war, as not all Unionists supported abolition. Johnson finally decided that slavery had to end. He wrote, "If the institution of slavery ... seeks to overthrow it [the Government], then the Government has a clear right to destroy it". He reluctantly supported efforts to enlist former slaves into the Union Army, feeling that African-Americans should perform menial tasks to release white Americans to do the fighting. Nevertheless, he succeeded in recruiting 20,000 black soldiers to serve the Union.
## Vice presidency (1865)
In 1860, Lincoln's running mate had been Senator Hannibal Hamlin of Maine. Although Hamlin had served competently, was in good health, and was willing to run again, Johnson emerged as running mate for Lincoln's reelection bid in 1864.
Lincoln considered several War Democrats for the ticket in 1864, and sent an agent to sound out General Benjamin Butler as a possible running mate. In May 1864, the president dispatched General Daniel Sickles to Nashville on a fact-finding mission. Although Sickles denied that he was there either to investigate or interview the military governor, Johnson biographer Hans L. Trefousse believes that Sickles's trip was connected to Johnson's subsequent nomination for vice president. According to historian Albert Castel in his account of Johnson's presidency, Lincoln was impressed by Johnson's administration of Tennessee. Gordon-Reed points out that while the Lincoln-Hamlin ticket might have been considered geographically balanced in 1860, "having Johnson, the southern War Democrat, on the ticket sent the right message about the folly of secession and the continuing capacity for union within the country." Another factor was the desire of Secretary of State William Seward to frustrate the vice-presidential candidacy of fellow New Yorker and former senator Daniel S. Dickinson, a War Democrat, as Seward would probably have had to yield his place if another New Yorker became vice president. Johnson, once he was told by reporters the likely purpose of Sickles' visit, was active on his own behalf, delivering speeches and having his political friends work behind the scenes to boost his candidacy.
To sound a theme of unity in 1864, Lincoln ran under the banner of the National Union Party, rather than that of the Republicans. At the party's convention in Baltimore in June, Lincoln was easily nominated, although there had been some talk of replacing him with a cabinet officer or one of the more successful generals. After the convention backed Lincoln, former Secretary of War Simon Cameron offered a resolution to nominate Hamlin, but it was defeated. Johnson was nominated for vice president by C.M. Allen of Indiana with an Iowa delegate seconding it. On the first ballot, Johnson led with 200 votes to 150 for Hamlin and 108 for Dickinson. On the second ballot, Kentucky switched its vote for Johnson, beginning a stampede. Johnson was named on the second ballot with 491 votes to Hamlin's 17 and eight for Dickinson; the nomination was made unanimous. Lincoln expressed pleasure at the result, "Andy Johnson, I think, is a good man." When word reached Nashville, a crowd assembled and the military governor obliged with a speech contending his selection as a Southerner meant that the rebel states had not actually left the Union.
Although it was unusual at the time for a national candidate to actively campaign, Johnson gave a number of speeches in Tennessee, Kentucky, Ohio, and Indiana. He also sought to boost his chances in Tennessee while reestablishing civil government by making the loyalty oath even more restrictive, in that voters would now have to swear that they opposed making a settlement with the Confederacy. The Democratic candidate for president, George McClellan, hoped to avoid additional bloodshed by negotiation, and so the stricter loyalty oath effectively disenfranchised his supporters. Lincoln declined to override Johnson, and their ticket took the state by 25,000 votes. Congress refused to count Tennessee's electoral votes, but Lincoln and Johnson did not need them, having won in most states that had voted, and easily secured the election.
Now Vice President-elect, Johnson was eager to complete the work of reestablishing civilian government in Tennessee, although the timetable for the election of a new governor did not allow it to take place until after Inauguration Day, March 4. He hoped to remain in Nashville to complete his task, but was told by Lincoln's advisers that he could not stay, but would be sworn in with Lincoln. In these months, Union troops finished the retaking of eastern Tennessee, including Greeneville. Just before his departure, the voters of Tennessee ratified a new constitution, which abolished slavery, on February 22, 1865. One of Johnson's final acts as military governor was to certify the results.
Johnson traveled to Washington to be sworn into office, although according to Gordon-Reed, "in light of what happened on March 4, 1865, it might have been better if Johnson had stayed in Nashville." Johnson may have been ill; Castel cited typhoid fever, though Gordon-Reed notes that there is no independent evidence for that diagnosis. On the evening of March 3, Johnson attended a party in his honor at which he drank heavily. Hung over the following morning at the Capitol, he asked Vice President Hamlin for some whiskey. Hamlin produced a bottle, and Johnson took two stiff drinks, stating "I need all the strength for the occasion I can have." In the Senate Chamber, Johnson delivered a rambling address as Lincoln, the Congress, and dignitaries looked on. Almost incoherent at times, he finally meandered to a halt, whereupon Hamlin hastily swore him in as vice president. Lincoln, who had watched sadly during the debacle, then went to his own swearing-in outside the Capitol, and delivered his acclaimed Second Inaugural Address.
In the weeks after the inauguration, Johnson only presided over the Senate briefly, and hid from public ridicule at the Maryland home of a friend, Francis Preston Blair. When he did return to Washington, it was with the intent of leaving for Tennessee to reestablish his family in Greeneville. Instead, he remained after word came that General Ulysses S. Grant had captured the Confederate capital of Richmond, Virginia, presaging the end of the war. Lincoln stated, in response to criticism of Johnson's behavior, that "I have known Andy Johnson for many years; he made a bad slip the other day, but you need not be scared; Andy ain't a drunkard."
## Presidency (1865–1869)
### Accession
On the afternoon of April 14, 1865, Lincoln and Johnson met for the first time since the inauguration. Trefousse states that Johnson wanted to "induce Lincoln not to be too lenient with traitors"; Gordon-Reed agrees.
That night, President Lincoln was shot and mortally wounded at Ford's Theatre by John Wilkes Booth, a Confederate sympathizer. The shooting of the President was part of a conspiracy to assassinate Lincoln, Johnson, and Seward the same night. Seward barely survived his wounds, while Johnson escaped attack as his would-be assassin, George Atzerodt, got drunk instead of killing the vice president. Leonard J. Farwell, a fellow boarder at the Kirkwood House, awoke Johnson with news of Lincoln's shooting. Johnson rushed to the President's deathbed, where he remained a short time, on his return promising, "They shall suffer for this. They shall suffer for this." Lincoln died at 7:22 am the next morning; Johnson's swearing-in occurred between 10 and 11 am with Chief Justice Salmon P. Chase presiding in the presence of most of the Cabinet. Johnson's demeanor was described by the newspapers as "solemn and dignified". Some Cabinet members had last seen Johnson, apparently drunk, at the inauguration. At noon, Johnson conducted his first Cabinet meeting in the Treasury Secretary's office, and asked all members to remain in their positions.
The events of the assassination resulted in speculation, then and subsequently, concerning Johnson and what the conspirators might have intended for him. In the vain hope of having his life spared after his capture, Atzerodt spoke much about the conspiracy, but did not say anything to indicate that the plotted assassination of Johnson was merely a ruse. Conspiracy theorists point to the fact that on the day of the assassination, Booth came to the Kirkwood House and left one of his cards with Johnson's private secretary, William A. Browning. The message on it was: "Don't wish to disturb you. Are you at home? J. Wilkes Booth."
Johnson presided with dignity over Lincoln's funeral ceremonies in Washington, before his predecessor's body was sent home to Springfield, Illinois, for interment. Shortly after Lincoln's death, Union General William T. Sherman reported he had, without consulting Washington, reached an armistice agreement with Confederate General Joseph E. Johnston for the surrender of Confederate forces in North Carolina in exchange for the existing state government remaining in power, with private property rights (slaves) to be respected. This did not even grant freedom to those in slavery. This was not acceptable to Johnson or the Cabinet, who sent word for Sherman to secure the surrender without making political deals, which he did. Further, Johnson placed a $100,000 bounty on Confederate President Davis, then a fugitive, which gave Johnson the reputation of a man who would be tough on the South. More controversially, he permitted the execution of Mary Surratt for her part in Lincoln's assassination. Surratt was executed with three others, including Atzerodt, on July 7, 1865.
### Reconstruction
#### Background
Upon taking office, Johnson faced the question of what to do with the former Confederacy. President Lincoln had authorized loyalist governments in Virginia, Arkansas, Louisiana, and Tennessee as the Union came to control large parts of those states and advocated a ten percent plan that would allow elections after ten percent of the voters in any state took an oath of future loyalty to the Union. Congress considered this too lenient; its own plan, requiring a majority of voters to take the loyalty oath, passed both houses in 1864, but Lincoln pocket vetoed it.
Johnson had three goals in Reconstruction. He sought a speedy restoration of the states, on the grounds that they had never truly left the Union, and thus should again be recognized once loyal citizens formed a government. To Johnson, African-American suffrage was a delay and a distraction; it had always been a state responsibility to decide who should vote. Second, political power in the Southern states should pass from the planter class to his beloved "plebeians". Johnson feared that the freedmen, many of whom were still economically bound to their former masters, might vote at their direction. Johnson's third priority was election in his own right in 1868, a feat no one who had succeeded a deceased president had managed to accomplish; he hoped to achieve it by securing a Democratic coalition in the South opposed to Congressional Reconstruction.
The Republicans had formed a number of factions. The Radical Republicans sought voting and other civil rights for African Americans. They believed that the freedmen could be induced to vote Republican in gratitude for emancipation, and that black votes could keep the Republicans in power and Southern Democrats, including former rebels, out of influence. They believed that top Confederates should be punished. The Moderate Republicans sought to keep the Democrats out of power at a national level, and prevent former rebels from resuming power. They were not as enthusiastic about the idea of African-American suffrage as their Radical colleagues, either because of their own local political concerns, or because they believed that the freedman would be likely to cast his vote badly. Northern Democrats favored the unconditional restoration of the Southern states. They did not support African-American suffrage, which might threaten Democratic control in the South.
#### Presidential Reconstruction
Johnson was initially left to devise a Reconstruction policy without legislative intervention, as Congress was not due to meet again until December 1865. Radical Republicans told the President that the Southern states were economically in a state of chaos and urged him to use his leverage to insist on rights for freedmen as a condition of restoration to the Union. But Johnson, with the support of other officials including Seward, insisted that the franchise was a state, not a federal matter. The Cabinet was divided on the issue.
Johnson's first Reconstruction actions were two proclamations, with the unanimous backing of his Cabinet, on May 29. One recognized the Virginia government led by provisional Governor Francis Pierpont. The second provided amnesty for all ex-rebels except those holding property valued at $20,000 or more; it also appointed a temporary governor for North Carolina and authorized elections. Neither of these proclamations included provisions regarding black suffrage or freedmen's rights. The President ordered constitutional conventions in other former rebel states.
As Southern states began the process of forming governments, Johnson's policies received considerable public support in the North, which he took as unconditional backing for quick reinstatement of the South. While he received such support from the white South, he underestimated the determination of Northerners to ensure that the war had not been fought for nothing. It was important, in Northern public opinion, that the South acknowledge its defeat, that slavery be ended, and that the lot of African Americans be improved. Voting rights were less important at the time—only a handful of Northern states (mostly in New England) gave African-American men the right to vote on the same basis as whites, and in late 1865, Connecticut, Wisconsin, and Minnesota voted down African-American suffrage proposals by large margins. Northern public opinion tolerated Johnson's inaction on black suffrage as an experiment, to be allowed if it quickened Southern acceptance of defeat. Instead, white Southerners felt emboldened. A number of Southern states passed Black Codes, binding African-American laborers to farms on annual contracts they could not quit, and allowing law enforcement at whim to arrest them for vagrancy and rent out their labor. Most Southerners elected to Congress were former Confederates, with the most prominent being Georgia Senator-designate and former Confederate vice president Alexander Stephens. Congress assembled in early December 1865; Johnson's conciliatory annual message to them was well received. Nevertheless, Congress refused to seat the Southern legislators and established a committee to recommend appropriate Reconstruction legislation.
Northerners were outraged at the idea of unrepentant Confederate leaders, such as Stephens, rejoining the federal government at a time when emotional wounds from the war remained raw. They saw the Black Codes placing African Americans in a position barely above slavery. Republicans also feared that restoration of the Southern states would return the Democrats to power. In addition, according to David O. Stewart in his book on Johnson's impeachment, "the violence and poverty that oppressed the South would galvanize the opposition to Johnson".
#### Break with the Republicans: 1866
Congress was reluctant to confront the President, and initially only sought to fine-tune Johnson's policies towards the South. According to Trefousse, "If there was a time when Johnson could have come to an agreement with the moderates of the Republican Party, it was the period following the return of Congress." The President was unhappy about the provocative actions of the Southern states, and about the continued control by the antebellum elite there, but made no statement publicly, believing that Southerners had a right to act as they did, even if it was unwise to do so. By late January 1866, he was convinced that winning a showdown with the Radical Republicans was necessary to his political plans – both for the success of Reconstruction and for reelection in 1868. He would have preferred that the conflict arise over the legislative efforts to enfranchise African Americans in the District of Columbia, a proposal that had been defeated overwhelmingly in an all-white referendum. A bill to accomplish this passed the House of Representatives, but to Johnson's disappointment, stalled in the Senate before he could veto it.
Illinois Senator Lyman Trumbull, leader of the Moderate Republicans and Chairman of the Judiciary Committee, was anxious to reach an understanding with the President. He ushered through Congress a bill extending the Freedmen's Bureau beyond its scheduled abolition in 1867, and the first Civil Rights Bill, to grant citizenship to the freedmen. Trumbull met several times with Johnson and was convinced the President would sign the measures (Johnson rarely contradicted visitors, often fooling those who met with him into thinking he was in accord). In fact, the President opposed both bills as infringements on state sovereignty. Additionally, both of Trumbull's bills were unpopular among white Southerners, whom Johnson hoped to include in his new party. Johnson vetoed the Freedmen's Bureau bill on February 18, 1866, to the delight of white Southerners and the puzzled anger of Republican legislators. He considered himself vindicated when a move to override his veto failed in the Senate the following day. Johnson believed that the Radicals would now be isolated and defeated and that the moderate Republicans would form behind him; he did not understand that Moderates also wanted to see African Americans treated fairly.
On February 22, 1866, Washington's Birthday, Johnson gave an impromptu speech to supporters who had marched to the White House and called for an address in honor of the first president. In his hour-long speech, he instead referred to himself over 200 times. More damagingly, he also spoke of "men ... still opposed to the Union" to whom he could not extend the hand of friendship he gave to the South. When called upon by the crowd to say who they were, Johnson named Pennsylvania Congressman Thaddeus Stevens, Massachusetts Senator Charles Sumner, and abolitionist Wendell Phillips, and accused them of plotting his assassination. Republicans viewed the address as a declaration of war, while one Democratic ally estimated Johnson's speech cost the party 200,000 votes in the 1866 congressional midterm elections.
Although strongly urged by moderates to sign the Civil Rights Act of 1866, Johnson broke decisively with them by vetoing it on March 27. In his veto message, he objected to the measure because it conferred citizenship on the freedmen at a time when 11 out of 36 states were unrepresented in the Congress, and that it "discriminated" in favor of African Americans and against whites. Within three weeks, Congress had overridden his veto, the first time that had been done on a major bill in American history. The veto, often seen as a key mistake of Johnson's presidency, convinced moderates there was no hope of working with him. Historian Eric Foner, in his volume on Reconstruction, views it as "the most disastrous miscalculation of his political career". According to Stewart, the veto was "for many his defining blunder, setting a tone of perpetual confrontation with Congress that prevailed for the rest of his presidency".
Congress also proposed the Fourteenth Amendment to the states. Written by Trumbull and others, it was sent for ratification by state legislatures in a process in which the president plays no part, though Johnson opposed it. The amendment was designed to put the key provisions of the Civil Rights Act into the Constitution, but also went further. The amendment extended citizenship to every person born in the United States (except Indians on reservations), penalized states that did not give the vote to freedmen, and most importantly, created new federal civil rights that could be protected by federal courts. It also guaranteed that the federal debt would be paid and forbade repayment of Confederate war debts. Further, it disqualified many former Confederates from office, although the disability could be removed — by Congress, not the president. Both houses passed the Freedmen's Bureau Act a second time, and again the President vetoed it; this time, the veto was overridden. By the summer of 1866, when Congress finally adjourned, Johnson's method of restoring states to the Union by executive fiat, without safeguards for the freedmen, was in deep trouble. His home state of Tennessee ratified the Fourteenth Amendment despite the President's opposition. When Tennessee did so, Congress immediately seated its proposed delegation, embarrassing Johnson.
Efforts to compromise failed, and a political war ensued between the united Republicans on one side, and on the other, Johnson and his Northern and Southern allies in the Democratic Party. He called a convention of the National Union Party. Republicans had returned to using their previous identifier; Johnson intended to use the discarded name to unite his supporters and gain election to a full term, in 1868. The battleground was the election of 1866; Southern states were not allowed to vote. Johnson campaigned vigorously, undertaking a public speaking tour, known as the "Swing Around the Circle". The trip, including speeches in Chicago, St. Louis, Indianapolis, and Columbus, proved politically disastrous, with the President making controversial comparisons between himself and Jesus, and engaging in arguments with hecklers. These exchanges were attacked as beneath the dignity of the presidency. The Republicans won by a landslide, increasing their two-thirds majority in Congress, and made plans to control Reconstruction. Johnson blamed the Democrats for giving only lukewarm support to the National Union movement.
#### Radical Reconstruction
Even with the Republican victory in November 1866, Johnson considered himself in a strong position. The Fourteenth Amendment had not yet been ratified by enough states to go into force, with Tennessee alone among the Southern or border states in voting for it. As the amendment required ratification by three-quarters of the states to become part of the Constitution, he believed the deadlock would be broken in his favor, leading to his election in 1868. Once it reconvened in December 1866, an energized Congress began passing legislation, often over a presidential veto; this included the District of Columbia voting bill. Congress admitted Nebraska to the Union over a veto, and the Republicans gained two senators and a state that promptly ratified the amendment. Johnson's veto of a bill for statehood for Colorado Territory was sustained; enough senators agreed that a district with a population of 30,000 was not yet worthy of statehood to win the day.
In January 1867, Congressman Stevens introduced legislation to dissolve the Southern state governments and reconstitute them into five military districts, under martial law. The states would begin again by holding constitutional conventions. African Americans could vote for or become delegates; former Confederates could not. In the legislative process, Congress added to the bill that restoration to the Union would follow the state's ratification of the Fourteenth Amendment, and completion of the process of adding it to the Constitution. Johnson and the Southerners attempted a compromise, whereby the South would agree to a modified version of the amendment without the disqualification of former Confederates, and for limited black suffrage. The Republicans insisted on the full language of the amendment, and the deal fell through. Although Johnson could have pocket vetoed the First Reconstruction Act as it was presented to him less than ten days before the end of the Thirty-Ninth Congress, he chose to veto it directly on March 2, 1867; Congress overruled him the same day. Also on March 2, Congress passed the Tenure of Office Act over the President's veto, in response to statements during the Swing Around the Circle that he planned to fire Cabinet secretaries who did not agree with him. This bill, requiring Senate approval for the firing of Cabinet members during the tenure of the president who appointed them and for one month afterwards, was immediately controversial, with some senators doubting that it was constitutional or that its terms applied to Johnson, whose key Cabinet officers were Lincoln holdovers.
### Impeachment
Secretary of War Edwin Stanton was an able and hard-working man, but difficult to deal with. Johnson both admired and was exasperated by his War Secretary, who, in combination with General of the Army Grant, worked to undermine the president's Southern policy from within his own administration. Johnson considered firing Stanton, but respected him for his wartime service as secretary. Stanton, for his part, feared allowing Johnson to appoint his successor and refused to resign, despite his public disagreements with his president.
The new Congress met for a few weeks in March 1867, then adjourned, leaving the House Committee on the Judiciary behind, tasked in the first impeachment inquiry against Johnson with reporting back to the full House whether there were grounds for Johnson to be impeached. This committee duly met, examined the President's bank accounts, and summoned members of the Cabinet to testify. When a federal court released former Confederate president Davis on bail on May 13 (he had been captured shortly after the war), the committee investigated whether the President had impeded the prosecution. It learned that Johnson was eager to have Davis tried. A bipartisan majority of the committee voted down impeachment charges; the committee adjourned on June 3.
Later in June, Johnson and Stanton battled over the question of whether the military officers placed in command of the South could override the civil authorities. The President had Attorney General Henry Stanbery issue an opinion backing his position that they could not. Johnson sought to pin down Stanton either as for, and thus endorsing Johnson's position, or against, showing himself to be opposed to his president and the rest of the Cabinet. Stanton evaded the point in meetings and written communications. When Congress reconvened in July, it passed a Reconstruction Act against Johnson's position, waited for his veto, overrode it, and went home. In addition to clarifying the powers of the generals, the legislation also deprived the President of control over the Army in the South. With Congress in recess until November, Johnson decided to fire Stanton and relieve one of the military commanders, General Philip Sheridan, who had dismissed the governor of Texas and installed a replacement with little popular support. Johnson was initially deterred by a strong objection from Grant, but on August 5, the President demanded Stanton's resignation; the secretary refused to quit with Congress out of session. Johnson then suspended him pending the next meeting of Congress as permitted under the Tenure of Office Act; Grant agreed to serve as temporary replacement while continuing to lead the Army.
Grant, under protest, followed Johnson's order transferring Sheridan and another of the district commanders, Daniel Sickles, who had angered Johnson by firmly following Congress's plan. The President also issued a proclamation pardoning most Confederates, exempting those who held office under the Confederacy, or who had served in federal office before the war but had breached their oaths. Although Republicans expressed anger with his actions, the 1867 elections generally went Democratic. No seats in Congress were directly elected in the polling, but the Democrats took control of the Ohio General Assembly, allowing them to defeat for reelection one of Johnson's strongest opponents, Senator Benjamin Wade. Voters in Ohio, Connecticut, and Minnesota turned down propositions to grant African Americans the vote.
The adverse results momentarily put a stop to Republican calls to impeach Johnson, who was elated by the elections. Nevertheless, once Congress met in November, the Judiciary Committee reversed itself and passed a resolution of impeachment against Johnson. After much debate about whether anything the President had done was a high crime or misdemeanor, the standard under the Constitution, the resolution was defeated by the House of Representatives on December 7, 1867, by a vote of 57 in favor to 108 opposed.
Johnson notified Congress of Stanton's suspension and Grant's interim appointment. In January 1868, the Senate disapproved of his action, and reinstated Stanton, contending the President had violated the Tenure of Office Act. Grant stepped aside over Johnson's objection, causing a complete break between them. Johnson then dismissed Stanton and appointed Lorenzo Thomas to replace him. Stanton refused to leave his office, and on February 24, 1868, the House impeached the President for intentionally violating the Tenure of Office Act, by a vote of 128 to 47. The House subsequently adopted eleven articles of impeachment, for the most part alleging that he had violated the Tenure of Office Act, and had questioned the legitimacy of Congress.
On March 5, 1868, the impeachment trial began in the Senate and lasted almost three months; Congressmen George S. Boutwell, Benjamin Butler and Thaddeus Stevens acted as managers for the House, or prosecutors, and William M. Evarts, Benjamin R. Curtis and former Attorney General Stanbery were Johnson's counsel; Chief Justice Chase served as presiding judge.
The defense relied on the provision of the Tenure of Office Act that made it applicable only to appointees of the current administration. Since Lincoln had appointed Stanton, the defense maintained Johnson had not violated the act, and also argued that the President had the right to test the constitutionality of an act of Congress. Johnson's counsel insisted that he make no appearance at the trial, nor publicly comment about the proceedings, and except for a pair of interviews in April, he complied.
Johnson maneuvered to gain an acquittal; for example, he pledged to Iowa Senator James W. Grimes that he would not interfere with Congress's Reconstruction efforts. Grimes reported to a group of Moderates, many of whom voted for acquittal, that he believed the President would keep his word. Johnson also promised to install the respected John Schofield as War Secretary. Kansas Senator Edmund G. Ross received assurances that the new, Radical-influenced constitutions ratified in South Carolina and Arkansas would be transmitted to the Congress without delay, an action which would give him and other senators political cover to vote for acquittal.
One reason senators were reluctant to remove the President was that his successor would have been Ohio Senator Wade, the president pro tempore of the Senate. Wade, a lame duck who left office in early 1869, was a Radical who supported such measures as women's suffrage, placing him beyond the pale politically in much of the nation. Additionally, a President Wade was seen as an obstacle to Grant's ambitions.
With the dealmaking, Johnson was confident of the result in advance of the verdict, and in the days leading up to the ballot, newspapers reported that Stevens and his Radicals had given up. On May 16, the Senate voted on the 11th article of impeachment, accusing Johnson of firing Stanton in violation of the Tenure of Office Act once the Senate had overturned his suspension. Thirty-five senators voted "guilty" and 19 "not guilty", thus falling short by a single vote of the two-thirds majority required for conviction under the Constitution. Ten Republicans—Senators Grimes, Ross, Trumbull, James Dixon, James Rood Doolittle, Daniel Sheldon Norton, William Pitt Fessenden, Joseph S. Fowler, John B. Henderson, and Peter G. Van Winkle—voted to acquit the President. With Stevens bitterly disappointed at the result, the Senate then adjourned for the Republican National Convention; Grant was nominated for president. The Senate returned on May 26 and voted on the second and third articles, with identical 35–19 results. Faced with those results, Johnson's opponents gave up and dismissed the proceedings. Stanton "relinquished" his office on May 26, and the Senate subsequently confirmed Schofield. When Johnson renominated Stanbery to return to his position as attorney general after his service as Johnson's defense counsel, the Senate refused to confirm him.
Allegations were made at the time and again later that bribery dictated the outcome of the trial. Even when it was in progress, Representative Butler began an investigation, held contentious hearings, and issued a report, unendorsed by any other congressman. Butler focused on a New York–based "Astor House Group", supposedly led by political boss and editor Thurlow Weed. This organization was said to have raised large sums of money from whiskey interests through Cincinnati lawyer Charles Woolley to bribe senators to acquit Johnson. Butler went so far as to imprison Woolley in the Capitol building when he refused to answer questions, but failed to prove bribery.
### Foreign policy
Soon after taking office as president, Johnson reached an accord with Secretary of State William H. Seward that there would be no change in foreign policy. In practice, this meant that Seward would continue to run things as he had under Lincoln. Seward and Lincoln had been rivals for the nomination in 1860; the victor hoped that Seward would succeed him as president in 1869. At the time of Johnson's accession, the French had intervened in Mexico, sending troops there. While many politicians had indulged in saber rattling over the Mexican matter, Seward preferred quiet diplomacy, warning the French through diplomatic channels that their presence in Mexico was unacceptable. Although the President preferred a more aggressive approach, Seward persuaded him to follow his lead. In April 1866, the French government informed Seward that its troops would be brought home in stages, to conclude by November 1867. On August 14, 1866, Johnson and his cabinet gave a reception for Queen Emma of Hawaii who was returning to Hawaii after her trip to Britain and Europe.
Seward was an expansionist, and sought opportunities to gain territory for the United States. After Russia's defeat in the Crimean War in the 1850s, the Russian government saw its North American colony (today Alaska) as a financial liability, and feared losing it to Britain, whose troops in neighboring Canada could easily seize the territory in any future conflict. Negotiations between Russia and the U.S. over the sale of Alaska were halted due to the outbreak of the Civil War, but after the U.S. victory in the war, talks resumed. Russia instructed its minister in Washington, Baron Eduard de Stoeckl, to negotiate a sale. De Stoeckl did so deftly, getting Seward to raise his offer from $5 million (coincidentally, the minimum that Russia had instructed de Stoeckl to accept) to $7 million, and then getting $200,000 added by raising various objections. On March 30, 1867, de Stoeckl and Seward signed the treaty, working quickly as the Senate was about to adjourn. Johnson and Seward took the signed document to the President's Room in the Capitol, only to be told there was no time to deal with the matter before adjournment. The President summoned the Senate into session to meet on April 1; that body approved the treaty, 37–2. Emboldened by his success in Alaska, Seward sought acquisitions elsewhere. His only success was staking an American claim to uninhabited Wake Island in the Pacific, which would be officially claimed by the U.S. in 1898. He came close with the Danish West Indies as Denmark agreed to sell and the local population approved the transfer in a plebiscite, but the Senate never voted on the treaty and it expired.
Another treaty that fared badly was the Johnson-Clarendon convention, negotiated in settlement of the Alabama Claims, for damages to American shipping from British-built Confederate raiders. Negotiated by the United States Minister to Britain, former Maryland senator Reverdy Johnson, in late 1868, it was ignored by the Senate during the remainder of the President's term. The treaty was rejected after he left office, and the Grant administration later negotiated considerably better terms from Britain.
### Administration and Cabinet
#### Judicial appointments
Johnson appointed nine Article III federal judges during his presidency, all to United States district courts; he did not appoint a justice to serve on the Supreme Court. In April 1866, he nominated Henry Stanbery to fill the vacancy left by the death of John Catron, but Congress eliminated the seat to prevent the appointment; to ensure that Johnson made no Supreme Court appointments at all, it also eliminated the next vacancy, providing that the court would shrink by one justice when the next sitting justice left office. Johnson appointed his Greeneville crony, Samuel Milligan, to the United States Court of Claims, where he served from 1868 until his death in 1874.
### Reforms initiated
In June 1866, Johnson signed the Southern Homestead Act into law, believing that the legislation would assist poor whites. Around 28,000 land claims were successfully patented, although few former slaves benefitted from the law, fraud was rampant, and much of the best land was off-limits, reserved for grants to veterans or railroads. In June 1868, Johnson signed an eight-hour law passed by Congress that established an eight-hour workday for laborers and mechanics employed by the Federal Government. Although Johnson told members of a Workingmen's party delegation in Baltimore that he could not directly commit himself to an eight-hour day, he nevertheless told the same delegation that he greatly favoured the "shortest number of hours consistent with the interests of all". According to Richard F. Selcer, however, the good intentions behind the law were "immediately frustrated" as wages were cut by 20%.
### Completion of term
Johnson sought nomination by the 1868 Democratic National Convention in New York in July 1868. He remained very popular among Southern whites, and boosted that popularity by issuing, just before the convention, a pardon ending the possibility of criminal proceedings against any Confederate not already indicted, meaning that only Davis and a few others still might face trial. On the first ballot, Johnson was second to former Ohio representative George H. Pendleton, who had been his Democratic opponent for vice president in 1864. Johnson's support was mostly from the South, and fell away as the ballots passed. On the 22nd ballot, former New York governor Horatio Seymour was nominated, and the President received only four votes, all from Tennessee.
The conflict with Congress continued. Johnson sent Congress proposals for amendments to limit the president to a single six-year term and make the president and the Senate directly elected, and for term limits for judges. Congress took no action on them. When the President was slow to officially report ratifications of the Fourteenth Amendment by the new Southern legislatures, Congress passed a bill, again over his veto, requiring him to do so within ten days of receipt. He still delayed as much as he could, but was required, in July 1868, to report the ratifications making the amendment part of the Constitution.
Seymour's operatives sought Johnson's support, but he long remained silent on the presidential campaign. It was not until October, with the vote already having taken place in some states, that he mentioned Seymour at all, and he never endorsed him. Nevertheless, Johnson regretted Grant's victory, in part because of their animus from the Stanton affair. In his annual message to Congress in December, Johnson urged the repeal of the Tenure of Office Act and told legislators that had they admitted their Southern colleagues in 1865, all would have been well. He celebrated his 60th birthday in late December with a party for several hundred children, though not including those of President-elect Grant, who did not allow his to go.
On Christmas Day 1868, Johnson issued a final amnesty, this one covering everyone, including Davis. He also issued, in his final months in office, pardons for crimes, including one for Dr. Samuel Mudd, controversially convicted of involvement in the Lincoln assassination (he had set Booth's broken leg) and imprisoned in Fort Jefferson on Florida's Dry Tortugas.
On March 3, the President hosted a large public reception at the White House on his final full day in office. Grant had made it known that he was unwilling to ride in the same carriage as Johnson, as was customary, and Johnson refused to go to the inauguration at all. Despite an effort by Seward to prompt a change of mind, he spent the morning of March 4 finishing last-minute business, and then shortly after noon rode from the White House to the home of a friend.
## Post-presidency (1869–1875)
After leaving the presidency, Johnson remained for some weeks in Washington, then returned to Greeneville for the first time in eight years. He was honored with large public celebrations along the way, especially in Tennessee, where cities hostile to him during the war hung out welcome banners. He had arranged to purchase a large farm near Greeneville to live on after his presidency.
Some expected Johnson to run for Governor of Tennessee or for the Senate again, while others thought that he would become a railroad executive. Johnson found Greeneville boring, and his private life was embittered by the suicide of his son Robert in 1869. Seeking vindication for himself, and revenge against his political enemies, he launched a Senate bid soon after returning home. Tennessee had gone Republican, but court rulings restoring the vote to some whites and suppression of the African-American vote by the Ku Klux Klan led to a Democratic victory in the legislative elections in August 1869. Johnson was seen as a likely victor in the Senate election, although hated by Radical Republicans, and by some Democrats because of his wartime activities. Although he was at one point within a single vote of victory in the legislature's balloting, the Republicans eventually elected Henry Cooper over Johnson, 54–51. In 1872, there was a special election for an at-large congressional seat for Tennessee; Johnson initially sought the Democratic nomination, but when he saw that it would go to former Confederate general Benjamin F. Cheatham, decided to run as an independent. The former president was defeated, finishing third, but the split in the Democratic Party defeated Cheatham in favor of an old Johnson Unionist ally, Horace Maynard.
In 1873, Johnson contracted cholera during an epidemic but recovered; that year he lost about $73,000 when the First National Bank of Washington went under, though he was eventually repaid much of the sum.
### Return to the Senate
He began looking towards the next Senate election to take place in the legislature in early 1875. Johnson began to woo the farmers' Grange movement; with his Jeffersonian leanings, he easily gained their support. He spoke throughout the state in his final campaign tour. Few African Americans outside the large towns were now able to vote as Reconstruction faded in Tennessee, setting a pattern that would be repeated in the other Southern states; the white domination would last almost a century. In the Tennessee legislative elections in August, the Democrats elected 92 legislators to the Republicans' eight, and Johnson went to Nashville for the legislative session. When the balloting for the Senate seat began on January 20, 1875, he led with 30 votes, but did not have the required majority as three former Confederate generals, one former colonel, and a former Democratic congressman split the vote with him. Johnson's opponents tried to agree on a single candidate who might gain majority support and defeat him, but failed, and he was elected on January 26 on the 54th ballot, with a margin of a single vote. Nashville erupted in rejoicing; remarked Johnson, "Thank God for the vindication."
Johnson's comeback garnered national attention, with the St. Louis Republican calling it "the most magnificent personal triumph which the history of American politics can show". At his swearing-in in the Senate on March 5, 1875, he was greeted with flowers, and sworn in alongside Hamlin (his predecessor as vice president) by incumbent Vice President Henry Wilson (who as senator had voted for Johnson's ouster). Many Republicans ignored Senator Johnson, though some, such as Ohio's John Sherman (who had voted for conviction), shook his hand. Johnson remains the only former president to serve in the Senate. He spoke only once in the short session, on March 22, lambasting President Grant for his use of federal troops in support of Louisiana's Reconstruction government. The former president asked, "How far off is military despotism?" and concluded his speech, "may God bless this people and God save the Constitution".
### Death
Johnson returned home after the special session concluded. In late July 1875, convinced some of his opponents were defaming him in the Ohio gubernatorial race, he decided to travel there to give speeches. He began the trip on July 28, and broke the journey at his daughter Mary's farm near Elizabethton, where his daughter Martha was also staying. That evening he had a stroke, but refused medical treatment until the next day, when he did not improve and two doctors were sent for from Elizabethton. He seemed to respond to their ministrations, but had another stroke on the evening of July 30, and died early the following morning at the age of 66. President Grant had the "painful duty" of announcing the death of the only surviving past president. Northern newspapers, in their obituaries, tended to focus on Johnson's loyalty during the war, while Southern ones paid tribute to his actions as president. Johnson's funeral was held on August 3 in Greeneville. He was buried with his body wrapped in an American flag and a copy of the U.S. Constitution placed under his head, according to his wishes. The burial ground was dedicated as the Andrew Johnson National Cemetery in 1906, and with his home and tailor's shop, is part of the Andrew Johnson National Historic Site.
## Historical reputation and legacy
According to Castel, "historians [of Johnson's presidency] have tended to concentrate to the exclusion of practically everything else upon his role in that titanic event [Reconstruction]." Through the remainder of the 19th century, there were few historical evaluations of Johnson and his presidency. Memoirs from Northerners who had dealt with him, such as former vice president Henry Wilson and Maine Senator James G. Blaine, depicted him as an obstinate boor who tried to favor the South in Reconstruction but was frustrated by Congress. According to historian Howard K. Beale in his journal article about the historiography of Reconstruction, "Men of the postwar decades were more concerned with justifying their own position than they were with painstaking search for truth. Thus [Alabama representative and historian] Hilary Herbert and his corroborators presented a Southern indictment of Northern policies, and Henry Wilson's history was a brief for the North."
The turn of the 20th century saw the first significant historical evaluations of Johnson. Leading the wave was Pulitzer Prize-winning historian James Ford Rhodes, who wrote of the former president:
> Johnson acted in accordance with his nature. He had intellectual force but it worked in a groove. Obstinate rather than firm it undoubtedly seemed to him that following counsel and making concessions were a display of weakness. At all events from his December message to the veto of the Civil Rights Bill he yielded not a jot to Congress. The moderate senators and representatives (who constituted a majority of the Union party) asked him for only a slight compromise; their action was really an entreaty that he would unite with them to preserve Congress and the country from the policy of the radicals ... His quarrel with Congress prevented the readmission into the Union on generous terms of the members of the late Confederacy ... His pride of opinion, his desire to beat, blinded him to the real welfare of the South and of the whole country.
Rhodes ascribed Johnson's faults to his personal weaknesses, and blamed him for the problems of the postbellum South. Other early 20th-century historians, such as John Burgess, future president Woodrow Wilson, and William Dunning, concurred with Rhodes, believing Johnson flawed and politically inept but concluding that he had tried to carry out Lincoln's plans for the South in good faith. Author and journalist Jay Tolson suggests that Wilson "depict[ed Reconstruction] as a vindictive program that hurt even repentant southerners while benefiting northern opportunists, the so-called Carpetbaggers, and cynical white southerners, or Scalawags, who exploited alliances with blacks for political gain."
Even as Rhodes and his school wrote, another group of historians (Dunning School) was setting out on the full rehabilitation of Johnson, using for the first time primary sources such as his papers, provided by his daughter Martha before her death in 1901, and the diaries of Johnson's Navy Secretary, Gideon Welles, first published in 1911. The resulting volumes, such as David Miller DeWitt's The Impeachment and Trial of President Andrew Johnson (1903), presented him far more favorably than they did those who had sought to oust him. In James Schouler's 1913 History of the Reconstruction Period, the author accused Rhodes of being "quite unfair to Johnson", though agreeing that the former president had created many of his own problems through inept political moves. These works had an effect; although historians continued to view Johnson as having deep flaws which sabotaged his presidency, they saw his Reconstruction policies as fundamentally correct.
Castel writes:
> at the end of the 1920s, an historiographical revolution took place. In the span of three years five widely read books appeared, all highly pro-Johnson. ...They differed in general approach and specific interpretations, but they all glorified Johnson and condemned his enemies. According to these writers, Johnson was a humane, enlightened, and liberal statesman who waged a courageous battle for the Constitution and democracy against scheming and unscrupulous Radicals, who were motivated by a vindictive hatred of the South, partisanship, and a desire to establish the supremacy of Northern "big business". In short, rather than a boor, Johnson was a martyr; instead of a villain, a hero.
Beale wondered in 1940, "is it not time that we studied the history of Reconstruction without first assuming, at least subconsciously, that carpetbaggers and Southern white Republicans were wicked, that Negroes were illiterate incompetents, and that the whole white South owes a debt of gratitude to the restorers of 'white supremacy'?" Despite these doubts, the favorable view of Johnson survived for a time. In 1942, Van Heflin portrayed the former president as a fighter for democracy in the Hollywood film Tennessee Johnson. In 1948, a poll of his colleagues by historian Arthur M. Schlesinger deemed Johnson among the average presidents; in 1956, one by Clinton L. Rossiter named him as one of the near-great chief executives. Foner notes that at the time of these surveys, "the Reconstruction era that followed the Civil War was regarded as a time of corruption and misgovernment caused by granting black men the right to vote."
Earlier historians, including Beale, believed that money drove events, and had seen Reconstruction as an economic struggle. They also accepted, for the most part, that reconciliation between North and South should have been the top priority of Reconstruction. In the 1950s, historians began to focus on the African-American experience as central to Reconstruction. They rejected completely any claim of black inferiority, which had marked many earlier historical works, and saw the developing civil rights movement as a second Reconstruction; some neoabolitionist writers stated they hoped their work on the postbellum era would advance the cause of civil rights. These authors sympathized with the Radical Republicans for their desire to help the African American, and saw Johnson as callous towards the freedman. In a number of works from 1956 onwards by such historians as Fawn Brodie, the former president was depicted as a successful saboteur of efforts to better the freedman's lot. These volumes included major biographies of Stevens and Stanton. Reconstruction was increasingly seen as a noble effort to integrate the freed slaves into society.
In the early 21st century, Johnson is among those commonly mentioned as the worst presidents in U.S. history. According to historian Glenn W. Lafantasie, who believes James Buchanan the worst president, "Johnson is a particular favorite for the bottom of the pile because of his impeachment ... his complete mishandling of Reconstruction policy ... his bristling personality, and his enormous sense of self-importance." Tolson suggests that "Johnson is now scorned for having resisted Radical Republican policies aimed at securing the rights and well-being of the newly emancipated African-Americans." Gordon-Reed notes that Johnson, along with his contemporaries Pierce and Buchanan, is generally listed among the five worst presidents, but states "there have never been more difficult times in the life of this nation. The problems these men had to confront were enormous. It would have taken a succession of Lincolns to do them justice."
Trefousse considers Johnson's legacy to be "the maintenance of white supremacy. His boost to Southern conservatives by undermining Reconstruction was his legacy to the nation, one that would trouble the country for generations to come." Gordon-Reed states of Johnson:
> We know the results of Johnson's failures—that his preternatural stubbornness, his mean and crude racism, his primitive and instrumental understanding of the Constitution stunted his capacity for enlightened and forward-thinking leadership when those qualities were so desperately needed. At the same time, Johnson's story has a miraculous quality to it: the poor boy who systematically rose to the heights, fell from grace, and then fought his way back to a position of honor in the country. For good or ill, "only in America", as they say, could Johnson's story unfold in the way that it did.
## See also
- Amphitheatrum Johnsonianum, 1867 illustration
- Andrew Johnson alcoholism debate
- Efforts to impeach Andrew Johnson
- Emily Harold, extramarital affair
- List of presidents of the United States
- List of presidents of the United States by previous experience
- List of presidents of the United States who owned slaves
- Tennessee Johnson, 1942 film
# Megalodon
Otodus megalodon (/ˈmɛɡələdɒn/ MEG-əl-ə-don; meaning "big tooth"), commonly known as megalodon, is an extinct species of giant mackerel shark that lived approximately 23 to 3.6 million years ago (Mya), from the Early Miocene to the Early Pliocene epochs. O. megalodon was formerly thought to be a member of the family Lamnidae and a close relative of the great white shark (Carcharodon carcharias), but has been reclassified into the extinct family Otodontidae, which diverged from the great white shark during the Early Cretaceous.
While regarded as one of the largest and most powerful predators to have ever lived, megalodon is only known from fragmentary remains, and its appearance and maximum size are uncertain. Scientists differ on whether it would have more closely resembled a stockier version of the great white shark (Carcharodon carcharias), the basking shark (Cetorhinus maximus) or the sand tiger shark (Carcharias taurus). The most recent estimate with the smallest error range suggests a maximum length of up to 20.3 meters (67 ft), although the modal length is estimated at 10.5 meters (34 ft). Their teeth were thick and robust, built for grabbing prey and breaking bone, and their large jaws could exert a bite force of up to 108,500 to 182,200 newtons (24,390 to 40,960 lbf).
Megalodon probably had a major impact on the structure of marine communities. The fossil record indicates that it had a cosmopolitan distribution. It probably targeted large prey, such as whales, seals and sea turtles. Juveniles inhabited warm coastal waters and fed on fish and small whales. Unlike the great white, which attacks prey from the soft underside, megalodon probably used its strong jaws to break through the chest cavity and puncture the heart and lungs of its prey.
The animal faced competition from whale-eating cetaceans, such as Livyatan and other macroraptorial sperm whales and possibly smaller ancestral killer whales (Orcinus). As the shark preferred warmer waters, it is thought that oceanic cooling associated with the onset of the ice ages, coupled with the lowering of sea levels and resulting loss of suitable nursery areas, may have also contributed to its decline. A reduction in the diversity of baleen whales and a shift in their distribution toward polar regions may have reduced megalodon's primary food source. The shark's extinction coincides with a gigantism trend in baleen whales.
## Classification
### Prescientific and early research history
Megalodon teeth have been excavated and used since ancient times. They were a valued artifact amongst pre-Columbian cultures in the Americas for their large sizes and serrated blades, from which they were modified into projectile points, knives, jewelry, and funeral accessories. At least some, such as the Panamanian Sitio Conte societies, seemed to have used them primarily for ceremonial purposes. Mining of megalodon teeth by the Algonquin peoples in the Chesapeake Bay and their selective trade with the Adena culture in Ohio occurred as early as 430 BC. The earliest written account of megalodon teeth was by Pliny the Elder in an AD 73 volume of Historia Naturalis, who described them as resembling petrified human tongues that Roman folklorists believed to have fallen from the sky during lunar eclipses and called them glossopetrae ("tongue stones"). The purported tongues were later thought in a 12th-century Maltese tradition to have belonged to serpents that Paul the Apostle turned to stone while shipwrecked there, and were given antivenom powers by the saint. Glossopetrae reappeared throughout Europe in late 13th to 16th century literature, ascribed with more supernatural properties that cured a wider variety of poisons. Use of megalodon teeth for this purpose became widespread among medieval and Renaissance nobility, who fashioned them into protective amulets and tableware to purportedly detoxify poisoned liquids or bodies that touched the stones. By the 16th century, teeth were directly consumed as ingredients of European-made Goa stones.
The true nature of the glossopetrae as shark's teeth had been held by some since at least 1554, when cosmographer André Thevet described the idea as hearsay, although he did not believe it. The earliest scientific argument for this view was made by Italian naturalist Fabio Colonna, who in 1616 published an illustration of a Maltese megalodon tooth alongside a great white shark's and noted their striking similarities. He argued that the former and its likenesses were not petrified serpent's tongues but actually the teeth of similar sharks that washed up on shore. Colonna supported this thesis through an experiment of burning glossopetrae samples, from which he observed carbon residue he interpreted as proving an organic origin. However, interpretation of the stones as shark's teeth remained widely unaccepted, in part due to the inability to explain how some of them were found far from the sea. The shark tooth argument was raised again academically during the late 17th century by English scientists Robert Hooke, John Ray, and Danish naturalist Niels Steensen (Latinized Nicholas Steno). Steensen's argument in particular is the most recognized, drawn from his dissection of the head of a great white caught in 1666. His 1667 report depicted engravings of a shark's head and megalodon teeth that became especially iconic. However, the illustrated head was not actually the head that Steensen dissected, nor were the fossil teeth illustrated by him. Both engravings were originally commissioned in the 1590s by Papal physician Michele Mercati, who also had in his possession the head of a great white, for his book Metallotheca. The work remained unpublished in Steensen's time due to Mercati's premature death, and Steensen reused the two illustrations at the suggestion of Carlo Roberto Dati, who thought a depiction of the actual dissected shark was unsuitable for readers. Steensen also stood out in pioneering a stratigraphic explanation for how similar stones appeared further inland. He observed that rock layers bearing megalodon teeth contained marine sediments and hypothesized that these layers corresponded to a period of flooding that was later covered by terrestrial layers and uplifted by geologic activity.
Swiss naturalist Louis Agassiz gave megalodon its scientific name in his seminal 1833-1843 work Recherches sur les poissons fossiles (Research on fossil fish). He named it Carcharias megalodon in an 1835 illustration of the holotype and additional teeth, congeneric with the modern sand tiger shark. The specific name combines the Ancient Greek words μεγάλος (megálos, meaning "big") and ὀδών (odṓn, meaning "tooth"), together meaning "big tooth". Agassiz had used the name as early as 1832, but because no specimens were referenced on those occasions, they are not taxonomically recognized uses. Formal description of the species was published in an 1843 volume, where Agassiz revised the name to Carcharodon megalodon, as its teeth were far too large for the former genus and more like those of the great white shark. He also erroneously identified several megalodon teeth as belonging to additional species eventually named Carcharodon rectidens, Carcharodon subauriculatus, Carcharodon productus, and Carcharodon polygurus. Because Carcharodon megalodon appeared first in the 1835 illustration, the remaining names are considered junior synonyms under the principle of priority.
### Evolution
While the earliest megalodon remains have been reported from the Late Oligocene, around 28 million years ago (Mya), there is disagreement as to when it appeared, with dates ranging to as young as 16 mya. It has been thought that megalodon became extinct around the end of the Pliocene, about 2.6 Mya; claims of Pleistocene megalodon teeth, younger than 2.6 million years old, are considered unreliable. A 2019 assessment moves the extinction date back to earlier in the Pliocene, 3.6 Mya.
Megalodon is considered to be a member of the family Otodontidae, genus Otodus, as opposed to its previous classification into Lamnidae, genus Carcharodon. Megalodon's classification into Carcharodon was due to dental similarity with the great white shark, but most authors believe that this is due to convergent evolution. In this model, the great white shark is more closely related to the extinct broad-toothed mako (Cosmopolitodus hastalis) than to megalodon, as evidenced by more similar dentition in those two sharks; megalodon teeth have much finer serrations than great white shark teeth. The great white shark is more closely related to the mako sharks (Isurus spp.), with a common ancestor around 4 Mya. Proponents of the former model, wherein megalodon and the great white shark are more closely related, argue that the differences between their dentition are minute and obscure.
The genus Carcharocles contains four species: C. auriculatus, C. angustidens, C. chubutensis, and C. megalodon. The evolution of this lineage is characterized by the increase of serrations, the widening of the crown, the development of a more triangular shape, and the disappearance of the lateral cusps. The evolution in tooth morphology reflects a shift in predation tactics from a tearing-grasping bite to a cutting bite, likely reflecting a shift in prey choice from fish to cetaceans. Lateral cusplets were finally lost in a gradual process that took roughly 12 million years during the transition between C. chubutensis and C. megalodon. The genus was proposed by D. S. Jordan and H. Hannibal in 1923 to contain C. auriculatus. In the 1980s, megalodon was assigned to Carcharocles. Before this, in 1960, the genus Procarcharodon was erected by French ichthyologist Edgard Casier, which included those four sharks and was considered separate from the great white shark. It is since considered a junior synonym of Carcharocles. The genus Palaeocarcharodon was erected alongside Procarcharodon to represent the beginning of the lineage, and, in the model wherein megalodon and the great white shark are closely related, their last common ancestor. It is believed to be an evolutionary dead-end and unrelated to the Carcharocles sharks by authors who reject that model.
Another model of the evolution of this genus, also proposed by Casier in 1960, is that the direct ancestor of the Carcharocles is the shark Otodus obliquus, which lived from the Paleocene through the Miocene epochs, 60 to 13 Mya. The genus Otodus is ultimately derived from Cretolamna, a shark from the Cretaceous period. In this model, O. obliquus evolved into O. aksuaticus, which evolved into C. auriculatus, and then into C. angustidens, and then into C. chubutensis, and then finally into C. megalodon.
Another model of the evolution of Carcharocles, proposed in 2001 by paleontologist Michael Benton, is that the three other species are actually a single species of shark that gradually changed over time between the Paleocene and the Pliocene, making it a chronospecies. Some authors suggest that C. auriculatus, C. angustidens, and C. chubutensis should be classified as a single species in the genus Otodus, leaving C. megalodon the sole member of Carcharocles.
The genus Carcharocles may be invalid, and the shark may actually belong in the genus Otodus, making it Otodus megalodon. A 1974 study on Paleogene sharks by Henri Cappetta erected the subgenus Megaselachus, classifying the shark as Otodus (Megaselachus) megalodon, along with O. (M.) chubutensis. A 2006 review of Chondrichthyes elevated Megaselachus to genus, and classified the sharks as Megaselachus megalodon and M. chubutensis. The discovery of fossils assigned to the genus Megalolamna in 2016 led to a re-evaluation of Otodus, which concluded that it is paraphyletic, that is, it consists of a last common ancestor but it does not include all of its descendants. The inclusion of the Carcharocles sharks in Otodus would make it monophyletic, with the sister clade being Megalolamna.
The cladogram below represents the hypothetical relationships between megalodon and other sharks, including the great white shark. Modified from Shimada et al. (2016), Ehret et al. (2009), and the findings of Siversson et al. (2015).
## Biology
### Appearance
One interpretation on how megalodon appeared was that it was a robust-looking shark, and may have had a similar build to the great white shark. The jaws may have been blunter and wider than the great white, and the fins would have also been similar in shape, though thicker due to its size. It may have had a pig-eyed appearance, in that it had small, deep-set eyes.
Another interpretation is that megalodon bore a similarity to the whale shark (Rhincodon typus) or the basking shark (Cetorhinus maximus). The tail fin would have been crescent-shaped, the anal fin and second dorsal fin would have been small, and there would have been a caudal keel present on either side of the tail fin (on the caudal peduncle). This build is common in other large aquatic animals, such as whales, tuna, and other sharks, in order to reduce drag while swimming. The head shape can vary between species as most of the drag-reducing adaptations are toward the tail-end of the animal.
One associated set of megalodon remains was found with placoid scales, which are 0.3 to 0.8 millimetres (0.012 to 0.031 in) in maximum width, and have broadly spaced keels.
### Size
Due to fragmentary remains, there have been many contradictory size estimates for megalodon, as they can only be drawn from fossil teeth and vertebrae. The great white shark has been the basis of reconstruction and size estimation, as it is regarded as the best analogue to megalodon. Several total length estimation methods have been produced from comparing megalodon teeth and vertebrae to those of the great white.
Megalodon size estimates vary depending on the method used, with maximum total length estimates ranging from 14.2–20.3 meters (47–67 ft). A 2015 study estimated the modal total body length at 10.5 meters (34 ft), calculated from 544 megalodon teeth, found throughout geological time and geography, including juveniles and adults ranging from 2.2 to 17.9 metres (7.2 to 58.7 ft) in total length. In comparison, large great white sharks are generally around 6 meters (20 ft) in length, with a few contentious reports suggesting larger sizes. The whale shark is the largest living fish, with one large female reported with a precaudal length of 15 meters (49 ft) and an estimated total length of 18.8 meters (62 ft). It is possible that different populations of megalodon around the globe had different body sizes and behaviors due to different ecological pressures. Megalodon is thought to have been the largest macropredatory shark that ever lived.
In his 2015 book, The Story of Life in 25 Fossils: Tales of Intrepid Fossil Hunters and the Wonders of Evolution, Donald Prothero proposed body mass estimates for individuals of different lengths by extrapolating from vertebral centra based on the dimensions of the great white, a methodology also used in the 2008 study that supports the maximum mass estimate:

> A C. megalodon about 16 meters long would have weighed about 48 metric tons (53 tons). A 17-meter (56-foot) C. megalodon would have weighed about 59 metric tons (65 tons), and a 20.3-meter (67 foot) monster would have topped off at 103 metric tons (114 tons).
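The tonnages quoted above scale approximately with the cube of body length, as expected when body proportions are held constant. The following sketch is only an illustration of that cube-law extrapolation, assuming the 16-meter, 48-tonne figure as a reference point; the function name is illustrative, and the published values, which were derived from great white proportions, differ slightly from the cube-law results.

```python
def isometric_mass_t(length_m, ref_length_m=16.0, ref_mass_t=48.0):
    """Scale body mass with the cube of length, assuming unchanged proportions.

    Illustration only: the published estimates were extrapolated from great
    white shark dimensions and deviate slightly from a pure cube law.
    """
    return ref_mass_t * (length_m / ref_length_m) ** 3

for length in (16.0, 17.0, 20.3):
    print(f"{length:4.1f} m -> ~{isometric_mass_t(length):5.1f} t")
# ~48 t, ~58 t and ~98 t, in the same range as the 48, 59 and 103 metric tons quoted above.
```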
In 2020, Cooper and his colleagues reconstructed a 2D model of megalodon based on the dimensions of all the extant lamnid sharks and suggested that a 16 meters (52 ft) long megalodon would have had a 4.65 m (15.3 ft) long head, 1.41 m (4 ft 8 in) tall gill slits, a 1.62 m (5 ft 4 in) tall dorsal fin, 3.08 m (10 ft 1 in) long pectoral fins, and a 3.85 m (12 ft 8 in) tall tail fin. In 2022, Cooper and his colleagues also reconstructed a 3D model with the same basis as the 2020 study, resulting in a body mass estimate of 61.56 t (67.86 short tons; 60.59 long tons) for a 16 meters (52 ft) long megalodon (higher than the previous estimates); a vertebral column specimen named IRSNB P 9893 (formerly IRSNB 3121), belonging to a 46 year old individual from Belgium, was used for extrapolation. An individual of this size would have required 98,175 kcal per day, 20 times more than what the adult great white requires.
Mature male megalodon may have had a body mass of 12.6 to 33.9 t (13.9 to 37.4 short tons; 12.4 to 33.4 long tons), and mature females may have been 27.4 to 59.4 t (30.2 to 65.5 short tons; 27.0 to 58.5 long tons), assuming that males could range in length from 10.5 to 14.3 meters (34 to 47 ft) and females 13.3 to 17 meters (44 to 56 ft).
A 2015 study linking shark size and typical swimming speed estimated that megalodon would have typically swum at 18 kilometers per hour (11 mph)–assuming that its body mass was typically 48 t (53 short tons; 47 long tons)–which is consistent with other aquatic creatures of its size, such as the fin whale (Balaenoptera physalus) which typically cruises at speeds of 14.5 to 21.5 km/h (9.0 to 13.4 mph). In 2022, Cooper and his colleagues converted this calculation into relative cruising speed (body lengths per second), resulting in a mean absolute cruising speed of 5 kilometers per hour (3.1 mph) and a mean relative cruising speed of 0.09 body lengths per second for a 16 meters (52 ft) long megalodon; the authors found their mean absolute cruising speed to be faster than any extant lamnid sharks and their mean relative cruising speed to be slower, consistent with previous estimates.
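The relative cruising speed reported by Cooper and colleagues is a straightforward unit conversion: absolute speed in meters per second divided by body length. A minimal sketch of that conversion, assuming the 16-meter body length quoted above (the helper name is illustrative, and the 18 km/h figure is converted only for comparison):

```python
def relative_cruising_speed(speed_kmh, body_length_m):
    """Convert an absolute cruising speed in km/h to body lengths per second."""
    speed_ms = speed_kmh * 1000.0 / 3600.0  # km/h -> m/s
    return speed_ms / body_length_m

print(round(relative_cruising_speed(5.0, 16.0), 2))   # ~0.09 body lengths per second
print(round(relative_cruising_speed(18.0, 16.0), 2))  # the earlier 18 km/h figure, ~0.31 BL/s
```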
Its large size may have been due to climatic factors and the abundance of large prey items, and it may have also been influenced by the evolution of regional endothermy (mesothermy) which would have increased its metabolic rate and swimming speed. The otodontid sharks have been considered to have been ectotherms, so on that basis megalodon would have been ectothermic. However, the largest contemporary ectothermic sharks, such as the whale shark, are filter feeders, while lamnids are regional endotherms, implying some metabolic correlations with a predatory lifestyle. These considerations, as well as tooth oxygen isotopic data and the need for higher burst swimming speeds in macropredators of endothermic prey than ectothermy would allow, imply that otodontids, including megalodon, were probably regional endotherms.
In 2020, Shimada and colleagues suggested large size was instead due to intrauterine cannibalism, where the larger fetus eats the smaller fetus, resulting in progressively larger and larger fetuses, requiring the mother to attain even greater size as well as caloric requirements which would have promoted endothermy. Males would have needed to keep up with female size in order to still effectively copulate (which probably involved latching onto the female with claspers, like modern cartilaginous fish).
#### Maximum estimates
The first attempt to reconstruct the jaw of megalodon was made by Bashford Dean in 1909, displayed at the American Museum of Natural History. From the dimensions of this jaw reconstruction, it was hypothesized that megalodon could have approached 30 meters (98 ft) in length. Dean had overestimated the size of the cartilage on both jaws, causing it to be too tall.
In 1973, John E. Randall, an ichthyologist, used the enamel height (the vertical distance of the blade from the base of the enamel portion of the tooth to its tip) to measure the length of the shark, yielding a maximum length of about 13 meters (43 ft). However, tooth enamel height does not necessarily increase in proportion to the animal's total length.
In 1994, marine biologists Patrick J. Schembri and Stephen Papson opined that O. megalodon may have approached a maximum of around 24 to 25 meters (79 to 82 ft) in total length.
In 1996, shark researchers Michael D. Gottfried, Leonard Compagno, and S. Curtis Bowman proposed a linear relationship between the great white shark's total length and the height of the largest upper anterior tooth. The proposed relationship is: total length in meters = (0.096 × UA maximum height in mm) − 0.22. Using this tooth height regression equation, the authors estimated a total length of 15.9 meters (52 ft) based on a tooth 16.8 centimeters (6.6 in) tall, which the authors considered a conservative maximum estimate. They also compared the ratio between the tooth height and total length of large female great whites to the largest megalodon tooth. A 6-meter (20 ft) long female great white, which the authors considered the largest 'reasonably trustworthy' total length, produced an estimate of 16.8 meters (55 ft). However, based on the largest female great white reported, at 7.1 meters (23 ft), they arrived at a maximum estimate of 20.2 meters (66 ft).
In 2002, shark researcher Clifford Jeremiah proposed that total length was proportional to the root width of an upper anterior tooth. He claimed that for every 1 centimeter (0.39 in) of root width, there are approximately 1.4 meters (4.6 ft) of shark length. Jeremiah pointed out that the jaw perimeter of a shark is directly proportional to its total length, with the width of the roots of the largest teeth being a tool for estimating jaw perimeter. The largest tooth in Jeremiah's possession had a root width of about 12 centimeters (4.7 in), which yielded 16.5 meters (54 ft) in total length.
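Both of the relationships described in the two preceding paragraphs are simple linear rules, so the quoted figures can be checked directly. A minimal sketch, assuming the corrected form of the Gottfried regression and Jeremiah's rule of thumb as stated above (the function names are illustrative):

```python
def length_from_tooth_height_m(ua_height_mm):
    """Gottfried et al. (1996): total length (m) from upper anterior tooth height (mm)."""
    return 0.096 * ua_height_mm - 0.22

def length_from_root_width_m(root_width_cm):
    """Jeremiah (2002): roughly 1.4 m of total length per 1 cm of tooth root width."""
    return 1.4 * root_width_cm

print(round(length_from_tooth_height_m(168), 1))  # 16.8 cm tall tooth -> ~15.9 m
print(round(length_from_root_width_m(12), 1))     # "about 12 cm" root -> ~16.8 m; the quoted
                                                  # 16.5 m implies a root just under 12 cm wide
```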
In 2002, paleontologist Kenshu Shimada of DePaul University proposed a linear relationship between tooth crown height and total length after conducting anatomical analysis of several specimens, allowing any sized tooth to be used. Shimada stated that the previously proposed methods were based on a less-reliable evaluation of the dental homology between megalodon and the great white shark, and that the growth rate between the crown and root is not isometric, which he considered in his model. Using this model, the upper anterior tooth possessed by Gottfried and colleagues corresponded to a total length of 15 meters (49 ft). Among several specimens found in the Gatún Formation of Panama, one upper lateral tooth was used by other researchers to obtain a total length estimate of 17.9 meters (59 ft) using this method.
In 2019, Shimada revisited the size of megalodon and discouraged using non-anterior teeth for estimations, noting that the exact position of isolated non-anterior teeth is difficult to identify. Shimada provided maximum total length estimates using the largest anterior teeth available in museums. The tooth with the tallest crown height known to Shimada, NSM PV-19896, produced a total length estimate of 14.2 meters (47 ft). The tooth with the tallest total height, FMNH PF 11306, was reported at 16.8 centimeters (6.6 in). However, Shimada remeasured the tooth and found it actually to measure 16.2 centimeters (6.4 in). Using the total height tooth regression equation proposed by Gottfried and colleagues produced an estimate of 15.3 meters (50 ft).
In 2021, Victor J. Perez, Ronny M. Leder, and Teddy Badaut proposed a method of estimating total length of megalodon from the sum of the tooth crown widths. Using more complete megalodon dentitions, they reconstructed the dental formula and then made comparisons to living sharks. The researchers noted that the 2002 Shimada crown height equations produce wildly varying results for different teeth belonging to the same shark (range of error of ± 9 metres (30 ft)), casting doubt on some of the conclusions of previous studies using that method. Using the largest tooth available to the authors, GHC 6, with a crown width of 13.3 centimeters (5.2 in), they estimated a maximum body length of approximately 20 meters (66 ft), with a range of error of approximately ± 3.5 metres (11 ft). This maximum length estimate was also supported by Cooper and his colleagues in 2022.
There are anecdotal reports of teeth larger than those found in museum collections. Gordon Hubbell from Gainesville, Florida, possesses an upper anterior megalodon tooth whose maximum height is 18.4 centimeters (7.25 in), one of the largest known tooth specimens from the shark. In addition, a 2.7-by-3.4-meter (9 by 11 ft) megalodon jaw reconstruction developed by fossil hunter Vito Bertucci contains a tooth whose maximum height is reportedly over 18 centimeters (7 in).
### Teeth and bite force
The most common fossils of megalodon are its teeth. Diagnostic characteristics include a triangular shape, robust structure, large size, fine serrations, a lack of lateral denticles, and a visible V-shaped neck (where the root meets the crown). The tooth met the jaw at a steep angle, similar to the great white shark. The tooth was anchored by connective tissue fibers, and the roughness of the base may have added to mechanical strength. The lingual side of the tooth, the part facing the tongue, was convex; and the labial side, the other side of the tooth, was slightly convex or flat. The anterior teeth were almost perpendicular to the jaw and symmetrical, whereas the posterior teeth were slanted and asymmetrical.
Megalodon teeth can measure over 180 millimeters (7.1 in) in slant height (diagonal length) and are the largest of any known shark species, implying it was the largest of all macropredatory sharks. In 1989, a nearly complete set of megalodon teeth was discovered in Saitama, Japan. Another nearly complete associated megalodon dentition was excavated from the Yorktown Formations in the United States, and served as the basis of a jaw reconstruction of megalodon at the National Museum of Natural History (USNM). Based on these discoveries, an artificial dental formula was put together for megalodon in 1996.
As evident from its reconstructed dental formula, megalodon had four kinds of teeth in its jaws: anterior, intermediate, lateral, and posterior. Megalodon's intermediate tooth technically appears to be an upper anterior and is termed "A3" because it is fairly symmetrical and does not point mesially (toward the midline of the jaws where the left and right jaws meet). Megalodon had a very robust dentition, with over 250 teeth in its jaws spanning 5 rows. It is possible that large megalodon individuals had jaws spanning roughly 2 meters (6.6 ft) across. The teeth were also serrated, which would have improved efficiency in cutting through flesh or bone. The shark may have been able to open its mouth to a 75° angle, though a reconstruction at the USNM approximates a 100° angle.
In 2008, a team of scientists led by S. Wroe conducted an experiment to determine the bite force of the great white shark, using a 2.5-meter (8.2 ft) long specimen, and then isometrically scaled the results for its maximum size and the conservative minimum and maximum body mass of megalodon. They placed the bite force of the latter between 108,514 to 182,201 newtons (24,395 to 40,960 lbf) in a posterior bite, compared to the 18,216 newtons (4,095 lbf) bite force for the largest confirmed great white shark, and 7,495 newtons (1,685 lbf) for the placoderm fish Dunkleosteus. In addition, Wroe and colleagues pointed out that sharks shake sideways while feeding, amplifying the force generated, which would probably have caused the total force experienced by prey to be higher than the estimate.
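Isometric scaling of a bite force amounts to scaling with muscle cross-sectional area, which grows with the square of linear body size (equivalently, mass to the two-thirds power). The sketch below is a rough illustration of that principle only; the reference length of 6.4 m for the largest confirmed great white is an assumption chosen for the example, not a value taken from Wroe and colleagues.

```python
def scale_force_isometric(ref_force_n, ref_length_m, target_length_m):
    """Scale a bite force with the square of body length (isometric scaling).

    Assumes force tracks muscle cross-sectional area, which under isometry
    grows with length squared (equivalently mass ** (2/3)).
    """
    return ref_force_n * (target_length_m / ref_length_m) ** 2

# Illustrative reference: ~18,216 N for a large great white, assumed here to be ~6.4 m long.
for target_length in (16.0, 20.3):
    force = scale_force_isometric(18216, 6.4, target_length)
    print(f"{target_length:4.1f} m -> ~{force:,.0f} N")
# Roughly 114,000 N and 183,000 N, the same order as the 108,514 to 182,201 N range above.
```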
In 2021, Antonio Ballell and Humberto Ferrón used finite element analysis modeling to examine the stress distribution in three types of megalodon teeth and in closely related mega-toothed species when exposed to anterior and lateral forces, the latter of which would be generated when a shark shakes its head to tear through flesh. The resulting simulations identified higher levels of stress in megalodon teeth under lateral force loads compared to those of its precursor species, such as O. obliquus and O. angustidens, when tooth size was removed as a factor. This suggests that megalodon teeth had a different functional significance than previously expected, challenging prior interpretations that megalodon's dental morphology was primarily driven by a dietary shift towards marine mammals. Instead, the authors proposed that it was a byproduct of an increase in body size caused by heterochronic selection.
### Internal anatomy
Megalodon is represented in the fossil record by teeth, vertebral centra, and coprolites. As with all sharks, the skeleton of megalodon was formed of cartilage rather than bone; consequently most fossil specimens are poorly preserved. To support its large dentition, the jaws of megalodon would have been more massive, stouter, and more strongly developed than those of the great white, which possesses a comparatively gracile dentition. Its chondrocranium, the cartilaginous skull, would have had a blockier and more robust appearance than that of the great white. Its fins were proportional to its larger size.
Some fossil vertebrae have been found. The most notable example is a partially preserved vertebral column of a single specimen, excavated in the Antwerp Basin, Belgium, in 1926. It comprises 150 vertebral centra, with the centra ranging from 55 millimeters (2.2 in) to 155 millimeters (6 in) in diameter. The shark's vertebrae may have gotten much bigger, and scrutiny of the specimen revealed that it had a higher vertebral count than specimens of any known shark, possibly over 200 centra; only the great white approached it. Another partially preserved vertebral column of a megalodon was excavated from the Gram Formation in Denmark in 1983, which comprises 20 vertebral centra, with the centra ranging from 100 millimeters (4 in) to 230 millimeters (9 in) in diameter.
The coprolite remains of megalodon are spiral-shaped, indicating that the shark may have had a spiral valve, a corkscrew-shaped portion of the lower intestines, similar to extant lamniform sharks. Miocene coprolite remains were discovered in Beaufort County, South Carolina, with one measuring 14 cm (5.5 in).
Gottfried and colleagues reconstructed the entire skeleton of megalodon, which was later put on display at the Calvert Marine Museum in the United States and the Iziko South African Museum. This reconstruction is 11.3 meters (37 ft) long and represents a mature male, based on the ontogenetic changes a great white shark experiences over the course of its life.
## Paleobiology
### Prey relationships
Though sharks are generally opportunistic feeders, megalodon's great size, high-speed swimming capability, and powerful jaws, coupled with an impressive feeding apparatus, made it an apex predator capable of consuming a broad spectrum of animals. Otodus megalodon was probably one of the most powerful predators to have existed. A study focusing on calcium isotopes of extinct and extant elasmobranch sharks and rays revealed that megalodon fed at a higher trophic level than the contemporaneous great white shark ("higher up" in the food chain).
Fossil evidence indicates that megalodon preyed upon many cetacean species, such as dolphins, small whales, cetotheres, squalodontids (shark toothed dolphins), sperm whales, bowhead whales, and rorquals. In addition to this, they also targeted seals, sirenians, and sea turtles. The shark was an opportunist and piscivorous, and it would have also gone after smaller fish and other sharks. Many whale bones have been found with deep gashes most likely made by their teeth. Various excavations have revealed megalodon teeth lying close to the chewed remains of whales, and sometimes in direct association with them.
The feeding ecology of megalodon appears to have varied with age and between sites, like the modern great white shark. It is plausible that the adult megalodon population off the coast of Peru targeted primarily cetothere whales 2.5 to 7 meters (8.2 to 23 ft) in length and other prey smaller than itself, rather than large whales in the same size class as themselves. Meanwhile, juveniles likely had a diet that consisted more of fish.
### Feeding strategies
Sharks often employ complex hunting strategies to engage large prey animals. Great white shark hunting strategies may be similar to how megalodon hunted its large prey. Megalodon bite marks on whale fossils suggest that it employed different hunting strategies against large prey than the great white shark.
One particular specimen–the remains of a 9-meter (30 ft) long undescribed Miocene baleen whale–provided the first opportunity to quantitatively analyze its attack behavior. Unlike great whites which target the underbelly of their prey, megalodon probably targeted the heart and lungs, with their thick teeth adapted for biting through tough bone, as indicated by bite marks inflicted to the rib cage and other tough bony areas on whale remains. Furthermore, attack patterns could differ for prey of different sizes. Fossil remains of some small cetaceans, for example cetotheres, suggest that they were rammed with great force from below before being killed and eaten, based on compression fractures.
There is also evidence that a separate hunting strategy may have existed for attacking raptorial sperm whales: a tooth belonging to an undetermined 4 m (13 ft) physeteroid closely resembling those of Acrophyseter, discovered in the Nutrien Aurora Phosphate Mine in North Carolina, suggests that a megalodon or O. chubutensis may have aimed for the head of the sperm whale in order to inflict a fatal bite, the resulting attack leaving distinctive bite marks on the tooth. While scavenging behavior cannot be ruled out as a possibility, the placement of the bite marks is more consistent with a predatory attack than with scavenging, as the jaw is not a particularly nutritious area for a shark to feed on or focus on. The fact that the bite marks were found on the tooth's roots further suggests that the shark broke the whale's jaw during the bite, indicating that the bite was extremely powerful. The fossil is also notable as the first known instance of an antagonistic interaction between a sperm whale and an otodontid shark recorded in the fossil record.
During the Pliocene, larger cetaceans appeared. Megalodon apparently further refined its hunting strategies to cope with these large whales. Numerous fossilized flipper bones and tail vertebrae of large whales from the Pliocene have been found with megalodon bite marks, which suggests that megalodon would immobilize a large whale before killing and feeding on it.
### Growth and reproduction
In 2010, Ehret estimated that megalodon had a fast growth rate, nearly two times that of the extant great white shark. He also estimated that the slowing or cessation of somatic growth in megalodon occurred around 25 years of age, suggesting that this species had an extremely delayed sexual maturity. In 2021, Shimada and colleagues calculated the growth rate of an approximately 9.2 m (30 ft) individual based on the Belgian vertebral column specimen, which presumably contains annual growth rings on three of its vertebrae. They estimated that the individual died at 46 years of age, with a growth rate of 16 cm (6.3 in) per year and a length of 2 m (6 ft 7 in) at birth. For a 15 m (49 ft) individual – which they considered to have been the maximum size attainable – this would equate to a lifespan of 88 to 100 years. However, in 2022 Cooper and his colleagues estimated the length of this 46-year-old individual at nearly 16 m (52 ft) based on their 3D reconstruction, in which the complete vertebral column measured 11.1 m (36 ft) long; the researchers attributed the difference to Shimada and his colleagues having extrapolated its size from the vertebral centra alone.
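Treating the reported 16 cm per year as a constant average rate gives a rough check on these figures; the study itself worked from vertebral growth bands rather than a strictly linear model, so the sketch below (with an illustrative function name) only approximates the published numbers.

```python
def length_at_age_m(age_years, birth_length_m=2.0, growth_m_per_year=0.16):
    """Rough check assuming the reported average growth rate stays constant with age."""
    return birth_length_m + growth_m_per_year * age_years

print(round(length_at_age_m(46), 1))  # ~9.4 m, close to the ~9.2 m reported for the
                                      # 46-year-old Belgian specimen
```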
Megalodon, like contemporaneous sharks, made use of nursery areas to birth their young in, specifically warm-water coastal environments with large amounts of food and protection from predators. Nursery sites were identified in the Gatún Formation of Panama, the Calvert Formation of Maryland, Banco de Concepción in the Canary Islands, and the Bone Valley Formation of Florida. Given that all extant lamniform sharks give birth to live young, this is believed to have been true of megalodon also. Infant megalodons were around 3.5 meters (11 ft) at their smallest, and the pups were vulnerable to predation by other shark species, such as the great hammerhead shark (Sphyrna mokarran) and the snaggletooth shark (Hemipristis serra). Their dietary preferences display an ontogenetic shift: Young megalodon commonly preyed on fish, sea turtles, dugongs, and small cetaceans; mature megalodon moved to off-shore areas and consumed large cetaceans.
An exceptional case in the fossil record suggests that juvenile megalodon may have occasionally attacked much larger balaenopterid whales. Three tooth marks apparently from a 4-to-7-meter (13 to 23 ft) long Pliocene shark were found on a rib from an ancestral blue or humpback whale that showed evidence of subsequent healing, which is suspected to have been inflicted by a juvenile megalodon.
## Paleoecology
### Range and habitat
Megalodon had a cosmopolitan distribution; its fossils have been excavated from many parts of the world, including Europe, Africa, the Americas, and Australia. It most commonly occurred in subtropical to temperate latitudes. It has been found at latitudes up to 55° N; its inferred tolerated temperature range was 1–24 °C (34–75 °F). It arguably had the capacity to endure such low temperatures due to mesothermy, the physiological capability of large sharks to maintain a higher body temperature than the surrounding water by conserving metabolic heat.
Megalodon inhabited a wide range of marine environments (i.e., shallow coastal waters, areas of coastal upwelling, swampy coastal lagoons, sandy littorals, and offshore deep water environments), and exhibited a transient lifestyle. Adult megalodon were not abundant in shallow water environments, and mostly inhabited offshore areas. Megalodon may have moved between coastal and oceanic waters, particularly in different stages of its life cycle.
Fossil remains show a trend for specimens to be larger on average in the Southern Hemisphere than in the Northern, with mean lengths of 11.6 and 9.6 meters (38 and 31 ft), respectively; and also larger in the Pacific than the Atlantic, with mean lengths of 10.9 and 9.5 meters (36 and 31 ft) respectively. They do not suggest any trend of changing body size with absolute latitude, or of change in size over time (although the Carcharocles lineage in general is thought to display a trend of increasing size over time). The overall modal length has been estimated at 10.5 meters (34 ft), with the length distribution skewed towards larger individuals, suggesting an ecological or competitive advantage for larger body size.
#### Locations of fossils
Megalodon had a global distribution and fossils of the shark have been found in many places around the world, bordering all oceans of the Neogene.
### Competition
Megalodon faced a highly competitive environment. Its position at the top of the food chain probably had a significant impact on the structuring of marine communities. Fossil evidence indicates a correlation between megalodon and the emergence and diversification of cetaceans and other marine mammals. Juvenile megalodon preferred habitats where small cetaceans were abundant, and adult megalodon preferred habitats where large cetaceans were abundant. Such preferences may have developed shortly after they appeared in the Oligocene.
Megalodon was contemporaneous with whale-eating toothed whales (particularly macroraptorial sperm whales and squalodontids), which were also probably among the era's apex predators, and provided competition. Some attained gigantic sizes, such as Livyatan, estimated at 13.5 to 17.5 meters (44 to 57 ft). Fossilized teeth of an undetermined species of such physeteroids from Lee Creek Mine, North Carolina, indicate it had a maximum body length of 8 to 10 m (26 to 33 ft) and a maximum lifespan of about 25 years. This is very different from similarly sized modern killer whales that live to 65 years, suggesting that unlike the latter, which are apex predators, these physeteroids were subject to predation from larger species such as megalodon or Livyatan. By the Late Miocene, around 11 Mya, macroraptorials experienced a significant decline in abundance and diversity. Other species may have filled this niche in the Pliocene, such as the fossil killer whale Orcinus citoniensis, which may have been a pack predator that targeted prey larger than itself, but this inference is disputed, and it was probably a generalist predator rather than a marine mammal specialist.
Megalodon may have subjected contemporaneous white sharks to competitive exclusion, as the fossil records indicate that other shark species avoided regions it inhabited by mainly keeping to the colder waters of the time. In areas where their ranges seemed to have overlapped, such as in Pliocene Baja California, it is possible that megalodon and the great white shark occupied the area at different times of the year while following different migratory prey. Megalodon probably also had a tendency for cannibalism, much like contemporary sharks.
## Extinction
### Climate change
The Earth experienced a number of changes during the time period megalodon existed which affected marine life. A cooling trend starting in the Oligocene 35 Mya ultimately led to glaciation at the poles. Geological events changed currents and precipitation; among these were the closure of the Central American Seaway and changes in the Tethys Ocean, contributing to the cooling of the oceans. The stalling of the Gulf Stream prevented nutrient-rich water from reaching major marine ecosystems, which may have negatively affected its food sources. The largest fluctuation of sea levels in the Cenozoic era occurred in the Plio-Pleistocene, between around 5 million to 12 thousand years ago, due to the expansion of glaciers at the poles, which negatively impacted coastal environments, and may have contributed to its extinction along with those of several other marine megafaunal species. These oceanographic changes, in particular the sea level drops, may have restricted many of the suitable shallow warm-water nursery sites for megalodon, hindering reproduction. Nursery areas are pivotal for the survival of many shark species, in part because they protect juveniles from predation.
As its range did not apparently extend into colder waters, megalodon may not have been able to retain a significant amount of metabolic heat, so its range was restricted to shrinking warmer waters. Fossil evidence confirms the absence of megalodon in regions around the world where water temperatures had significantly declined during the Pliocene. However, an analysis of the distribution of megalodon over time suggests that temperature change did not play a direct role in its extinction. Its distribution during the Miocene and Pliocene did not correlate with warming and cooling trends; while abundance and distribution declined during the Pliocene, megalodon did show a capacity to inhabit colder latitudes. It was found in locations with a mean temperature ranging from 12 to 27 °C (54 to 81 °F), with a total range of 1 to 33 °C (34 to 91 °F), indicating that the global extent of suitable habitat should not have been greatly affected by the temperature changes that occurred. This is consistent with evidence that it was a mesotherm.
### Changing ecosystem
Marine mammals attained their greatest diversity during the Miocene; baleen whales, for example, comprised over 20 recognized Miocene genera compared with only six extant genera. Such diversity presented an ideal setting to support a super-predator such as megalodon. By the end of the Miocene, many species of mysticetes had gone extinct; surviving species may have been faster swimmers and thus more elusive prey. Furthermore, after the closure of the Central American Seaway, tropical whales decreased in diversity and abundance. The extinction of megalodon correlates with the decline of many small mysticete lineages, and it is possible that it was quite dependent on them as a food source. Additionally, a marine megafauna extinction during the Pliocene was discovered to have eliminated 36% of all large marine species including 55% of marine mammals, 35% of seabirds, 9% of sharks, and 43% of sea turtles. The extinction was selective for endotherms and mesotherms relative to poikilotherms, implying causation by a decreased food supply and thus consistent with megalodon being mesothermic. Megalodon may have been too large to sustain itself on the declining marine food resources. The cooling of the oceans during the Pliocene might have restricted the access of megalodon to the polar regions, depriving it of the large whales which had migrated there.
Competition from large odontocetes, such as macropredatory sperm whales which appeared in the Miocene, and a member of the genus Orcinus (i.e., Orcinus citoniensis) in the Pliocene, is assumed to have contributed to the decline and extinction of megalodon. But this assumption is disputed: the Orcininae emerged in the Mid-Pliocene, with O. citoniensis reported from the Pliocene of Italy and similar forms reported from the Pliocene of England and South Africa, indicating the capacity of these dolphins to cope with the increasingly prevalent cold water temperatures of high latitudes. Some studies assumed these dolphins to have been macrophagous, but closer inspection indicates they fed on small fishes instead. On the other hand, gigantic macropredatory sperm whales such as Livyatan-like forms are last reported from Australia and South Africa circa 5 million years ago. Others, such as Hoplocetus and Scaldicetus, also occupied a niche similar to that of modern killer whales, but the last of these forms disappeared during the Pliocene. Members of the genus Orcinus became large and macrophagous in the Pleistocene.
Paleontologist Robert Boessenecker and his colleagues rechecked the fossil record of megalodon for dating errors and concluded that it disappeared circa 3.5 million years ago. They further suggest that megalodon suffered range fragmentation due to climatic shifts, and that competition with white sharks might have contributed to its decline and extinction. Competition with white sharks is assumed to be a factor in other studies as well, but this hypothesis warrants further testing. Multiple compounding environmental and ecological factors, including climate change and thermal limitations, the collapse of prey populations, and resource competition with white sharks, are believed to have contributed to the decline and extinction of megalodon.
The extinction of megalodon set the stage for further changes in marine communities. The average body size of baleen whales increased significantly after its disappearance, although possibly due to other, climate-related, causes. Conversely the increase in baleen whale size may have contributed to the extinction of megalodon, as they may have preferred to go after smaller whales; bite marks on large whale species may have come from scavenging sharks. Megalodon may have simply become coextinct with smaller whale species, such as Piscobalaena nana. The extinction of megalodon had a positive impact on other apex predators of the time, such as the great white shark, in some cases spreading to regions where megalodon became absent.
## In popular culture
Megalodon has been portrayed in many works of fiction, including films and novels, and continues to be a popular subject for fiction involving sea monsters. Reports of supposedly fresh megalodon teeth, such as those found by HMS Challenger in 1873 which were dated in 1959 by the zoologist Wladimir Tschernezky to be around 11,000 to 24,000 years old, helped popularise claims of recent megalodon survival amongst cryptozoologists. These claims have been discredited, and are probably teeth that were well-preserved by a thick mineral-crust precipitate of manganese dioxide, and so had a lower decomposition rate and retained a white color during fossilization. Fossil megalodon teeth can vary in color from off-white to dark browns, greys, and blues, and some fossil teeth may have been redeposited into a younger stratum. The claims that megalodon could remain elusive in the depths, similar to the megamouth shark which was discovered in 1976, are unlikely as the shark lived in warm coastal waters and probably could not survive in the cold and nutrient-poor deep sea environment.
Contemporary fiction about megalodon surviving into modern times was pioneered by the 1997 novel Meg: A Novel of Deep Terror by Steve Alten and its subsequent sequels. Megalodon subsequently began to feature in films, such as the 2002 direct to video Shark Attack 3: Megalodon, and later The Meg, a 2018 film based on the 1997 book which grossed over $500 million at the box office.
Animal Planet's pseudo-documentary Mermaids: The Body Found included an encounter 1.6 mya between a pod of mermaids and a megalodon. Later, in August 2013, the Discovery Channel opened its annual Shark Week series with another film for television, Megalodon: The Monster Shark Lives, a controversial docufiction about the creature that presented alleged evidence in order to suggest that megalodons still lived. This program received criticism for being completely fictional and for inadequately disclosing its fictional nature; for example, all of the supposed scientists depicted were paid actors, and there was no disclosure in the documentary itself that it was fictional. In a poll by Discovery, 73% of the viewers of the documentary thought that megalodon was not extinct. In 2014, Discovery re-aired The Monster Shark Lives, along with a new one-hour program, Megalodon: The New Evidence, and an additional fictionalized program entitled Shark of Darkness: Wrath of Submarine, resulting in further backlash from media sources and the scientific community. Despite the criticism from scientists, Megalodon: The Monster Shark Lives was a huge ratings success, gaining 4.8 million viewers, the most for any Shark Week episode up to that point.
Megalodon teeth are the state fossil of North Carolina.
## See also
- List of prehistoric cartilaginous fish
- Prehistoric fish
- Largest prehistoric organisms
# Thomas Beecham
Sir Thomas Beecham, 2nd Baronet, (29 April 1879 – 8 March 1961) was an English conductor and impresario best known for his association with the London Philharmonic and the Royal Philharmonic orchestras. He was also closely associated with the Liverpool Philharmonic and Hallé orchestras. From the early 20th century until his death, Beecham was a major influence on the musical life of Britain and, according to the BBC, was Britain's first international conductor.
Born to a rich industrial family, Beecham began his career as a conductor in 1899. He used his access to the family fortune to finance opera from the 1910s until the start of the Second World War, staging seasons at Covent Garden, Drury Lane and His Majesty's Theatre with international stars, his own orchestra and a wide repertoire. Among the works he introduced to England were Richard Strauss's Elektra, Salome and Der Rosenkavalier and three operas by Frederick Delius.
Together with his younger colleague Malcolm Sargent, Beecham founded the London Philharmonic, and he conducted its first performance at the Queen's Hall in 1932. In the 1940s he worked for three years in the United States, where he was music director of the Seattle Symphony and conducted at the Metropolitan Opera. After his return to Britain, he founded the Royal Philharmonic in 1946 and conducted it until his death in 1961.
Beecham's repertoire was eclectic, sometimes favouring lesser-known composers over famous ones. His specialities included composers whose works were neglected in Britain before he became their advocate, such as Delius and Berlioz. Other composers with whose music he was frequently associated were Haydn, Schubert, Sibelius and the composer he revered above all others, Mozart.
## Biography
### Early years
Beecham was born in St Helens, Lancashire (now Merseyside), in a house adjoining the Beecham's Pills laxative factory founded by his grandfather, Thomas Beecham. His parents were Joseph Beecham, the elder son of Thomas, and Josephine, née Burnett. In 1885, with the family firm flourishing financially, Joseph Beecham moved his family to a large house in Ewanville, Huyton, near Liverpool. Their former home was demolished to make room for an extension to the pill factory.
Beecham was educated at Rossall School between 1892 and 1897, after which he hoped to attend a music conservatoire in Germany, but his father forbade it, and instead Beecham went to Wadham College, Oxford to read Classics. He did not find university life to his taste and successfully sought his father's permission to leave Oxford in 1898. He studied as a pianist but, despite his excellent natural talent and fine technique, he had difficulty because of his small hands, and any career as a soloist was ruled out by a wrist injury in 1904. He studied composition with Frederic Austin in Liverpool, Charles Wood in London, and Moritz Moszkowski in Paris. As a conductor, he was self-taught.
### First orchestras
Beecham first conducted in public in St. Helens in October 1899, with an ad hoc ensemble comprising local musicians and players from the Liverpool Philharmonic Orchestra and the Hallé in Manchester. A month later, he stood in at short notice for the celebrated conductor Hans Richter at a concert by the Hallé to mark Joseph Beecham's inauguration as mayor of St Helens. Soon afterwards, Joseph Beecham secretly committed his wife to a mental hospital. Thomas and his elder sister Emily helped to secure their mother's release and to force their father to pay annual alimony of £4,500. For this, Joseph disinherited them. Beecham was estranged from his father for ten years.
Beecham's professional début as a conductor was in 1902 at the Shakespeare Theatre, Clapham, with Balfe's The Bohemian Girl, for the Imperial Grand Opera Company. He was engaged as assistant conductor for a tour and was allotted four other operas, including Carmen and Pagliacci. A Beecham biographer calls the company "grandly named but decidedly ramshackle", though Beecham's Carmen was Zélie de Lussan, a leading exponent of the title role. Beecham was also composing music in these early years, but he was not satisfied with his own efforts and instead concentrated on conducting.
In 1906 Beecham was invited to conduct the New Symphony Orchestra, a recently formed ensemble of 46 players, in a series of concerts at the Bechstein Hall in London. Throughout his career, Beecham frequently chose to programme works to suit his own tastes rather than those of the paying public. In his early discussions with his new orchestra, he proposed works by a long list of barely known composers such as Étienne Méhul, Nicolas Dalayrac and Ferdinando Paer. During this period, Beecham first encountered the music of Frederick Delius, which he at once loved deeply and with which he became closely associated for the rest of his life.
Beecham quickly concluded that to compete with the two existing London orchestras, the Queen's Hall Orchestra and the recently founded London Symphony Orchestra (LSO), his forces must be expanded to full symphonic strength and play in larger halls. For two years starting in October 1907, Beecham and the enlarged New Symphony Orchestra gave concerts at the Queen's Hall. He paid little attention to the box office: his programmes were described by a biographer as "even more certain to deter the public then than it would be in our own day". The principal pieces of his first concert with the orchestra were d'Indy's symphonic ballad La forêt enchantée, Smetana's symphonic poem Šárka, and Lalo's little-known Symphony in G minor. Beecham retained an affection for the last work: it was among the works he conducted at his final recording sessions more than fifty years later.
In 1908 Beecham and the New Symphony Orchestra parted company, disagreeing about artistic control and, in particular, the deputy system. Under this system, orchestral players, if offered a better-paid engagement elsewhere, could send a substitute to a rehearsal or a concert. The treasurer of the Royal Philharmonic Society described it thus: "A, whom you want, signs to play at your concert. He sends B (whom you don't mind) to the first rehearsal. B, without your knowledge or consent, sends C to the second rehearsal. Not being able to play at the concert, C sends D, whom you would have paid five shillings to stay away." Henry Wood had already banned the deputy system in the Queen's Hall Orchestra (provoking rebel players to found the London Symphony Orchestra), and Beecham followed suit. The New Symphony Orchestra survived without him and subsequently became the Royal Albert Hall Orchestra.
In 1909, Beecham founded the Beecham Symphony Orchestra. He did not poach from established symphony orchestras, but instead he recruited from theatre bandrooms, local symphony societies, the palm courts of hotels, and music colleges. The result was a youthful team – the average age of his players was 25. They included names that would become celebrated in their fields, such as Albert Sammons, Lionel Tertis, Eric Coates and Eugene Cruft.
Because he persistently programmed works that did not attract the public, Beecham's musical activities at this time consistently lost money. As a result of his estrangement from his father between 1899 and 1909, his access to the Beecham family fortune was strictly limited. From 1907 he had an annuity of £700 left to him in his grandfather's will, and his mother subsidised some of his loss-making concerts, but it was not until father and son were reconciled in 1909 that Beecham was able to draw on the family fortune to promote opera.
### 1910–1920
From 1910, subsidised by his father, Beecham realised his ambition to mount opera seasons at Covent Garden and other houses. In the Edwardian opera house, the star singers were regarded as all-important, and conductors were seen as ancillary. Between 1910 and 1939 Beecham did much to change the balance of power.
In 1910, Beecham either conducted or was responsible as impresario for 190 performances at Covent Garden and His Majesty's Theatre. His assistant conductors were Bruno Walter and Percy Pitt. During the year, he mounted 34 different operas, most of them either new to London or almost unknown there. Beecham later acknowledged that in his early years the operas he chose to present were too obscure to attract the public. During his 1910 season at His Majesty's, the rival Grand Opera Syndicate put on a concurrent season of its own at Covent Garden; London's opera performances for the year totalled 273, far more than the box-office demand could support. Of the 34 operas that Beecham staged in 1910, only four made money: Richard Strauss's new operas Elektra and Salome, receiving their first, and highly publicised, performances in Britain, and The Tales of Hoffmann and Die Fledermaus.
In 1911 and 1912, the Beecham Symphony Orchestra played for Sergei Diaghilev's Ballets Russes, both at Covent Garden and at the Krolloper in Berlin, under the batons of Beecham and Pierre Monteux, Diaghilev's chief conductor. Beecham was much admired for conducting the complicated new score of Stravinsky's Petrushka, at two days' notice and without rehearsal, when Monteux became unavailable. While in Berlin, Beecham and his orchestra, in Beecham's words, caused a "mild stir", scoring a triumph: the orchestra was agreed by the Berlin press to be an elite body, one of the best in the world. The principal Berlin musical weekly, Die Signale, asked, "Where does London find such magnificent young instrumentalists?" The violins were credited with rich, noble tone, the woodwinds with lustre, the brass, "which has not quite the dignity and amplitude of our best German brass", with uncommon delicacy of execution.
Beecham's 1913 seasons included the British premiere of Strauss's Der Rosenkavalier at Covent Garden, and a "Grand Season of Russian Opera and Ballet" at Drury Lane. At the latter there were three operas, all starring Feodor Chaliapin, and all new to Britain: Mussorgsky's Boris Godunov and Khovanshchina, and Rimsky-Korsakov's Ivan the Terrible. There were also 15 ballets, with leading dancers including Vaslav Nijinsky and Tamara Karsavina. The ballets included Debussy's Jeux and his controversially erotic L'après-midi d'un faune, and the British premiere of Stravinsky's The Rite of Spring, six weeks after its first performance in Paris. Beecham shared Monteux's private dislike of the piece, much preferring Petrushka. Beecham did not conduct during this season; Monteux and others conducted the Beecham Symphony Orchestra. The following year, Beecham and his father presented Rimsky-Korsakov's The Maid of Pskov and Borodin's Prince Igor, with Chaliapin, and Stravinsky's The Nightingale.
During the First World War, Beecham strove, often without a fee, to keep music alive in London, Liverpool, Manchester and other British cities. He conducted for, and gave financial support to, three institutions with which he was connected at various times: the Hallé Orchestra, the LSO and the Royal Philharmonic Society. In 1915 he formed the Beecham Opera Company, with mainly British singers, performing in London and throughout the country. In 1916, he received a knighthood in the New Year Honours and succeeded to the baronetcy on his father's death later that year.
After the war, there were joint Covent Garden seasons with the Grand Opera Syndicate in 1919 and 1920, but these were, according to a biographer, pale confused echoes of the years before 1914. These seasons included forty productions, of which Beecham conducted only nine. After the 1920 season, Beecham temporarily withdrew from conducting to deal with a financial problem that he described as "the most trying and unpleasant experience of my life".
### Covent Garden estate
Influenced by an ambitious financier, James White, Sir Joseph Beecham had agreed, in July 1914, to buy the Covent Garden estate from the Duke of Bedford and float a limited company to manage the estate commercially. The deal was described by The Times as "one of the largest ever carried out in real estate in London". Sir Joseph paid an initial deposit of £200,000 and covenanted to pay the balance of the £2 million purchase price on 11 November. Within a month, however, the First World War broke out, and new official restrictions on the use of capital prevented the completion of the contract. The estate and market continued to be managed by the Duke's staff, and in October 1916, Joseph Beecham died suddenly, with the transaction still uncompleted. The matter was brought before the civil courts with the aim of disentangling Sir Joseph's affairs; the court and all parties agreed that a private company should be formed, with his two sons as directors, to complete the Covent Garden contract. In July 1918, the Duke and his trustees conveyed the estate to the new company, subject to a mortgage of the balance of the purchase price still outstanding: £1.25 million.
Beecham and his brother Henry had to sell enough of their father's estate to discharge this mortgage. For more than three years, Beecham was absent from the musical scene, working to sell property worth over £1 million. By 1923 enough money had been raised. The mortgage was discharged, and Beecham's personal liabilities, amounting to £41,558, were paid in full. In 1924 the Covent Garden property and the pill-making business at St Helens were united in one company, Beecham Estates and Pills. The nominal capital was £1,850,000, of which Beecham had a substantial share.
### London Philharmonic
After his absence, Beecham first reappeared on the rostrum conducting the Hallé in Manchester in March 1923, in a programme including works by Berlioz, Bizet, Delius and Mozart. He returned to London the following month, conducting the combined Royal Albert Hall Orchestra (the renamed New Symphony Orchestra) and London Symphony Orchestra in April 1923. The main work on the programme was Richard Strauss's Ein Heldenleben. No longer with an orchestra of his own, Beecham established a relationship with the London Symphony Orchestra that lasted for the rest of the 1920s. Towards the end of the decade, he negotiated inconclusively with the BBC over the possibility of establishing a permanent radio orchestra.
In 1931, Beecham was approached by the rising young conductor Malcolm Sargent with a proposal to set up a permanent, salaried orchestra with a subsidy guaranteed by Sargent's patrons, the Courtauld family. Originally Sargent and Beecham envisaged a reshuffled version of the London Symphony Orchestra, but the LSO, a self-governing co-operative, balked at weeding out and replacing underperforming players. In 1932 Beecham lost patience and agreed with Sargent to set up a new orchestra from scratch. The London Philharmonic Orchestra (LPO), as it was named, consisted of 106 players including a few young musicians straight from music college, many established players from provincial orchestras, and 17 of the LSO's leading members. The principals included Paul Beard, George Stratton, Anthony Pini, Gerald Jackson, Léon Goossens, Reginald Kell, James Bradshaw and Marie Goossens.
The orchestra made its debut at the Queen's Hall on 7 October 1932, conducted by Beecham. After the first item, Berlioz's Roman Carnival Overture, the audience went wild, some of them standing on their seats to clap and shout. During the next eight years, the LPO appeared nearly a hundred times at the Queen's Hall for the Royal Philharmonic Society alone, played for Beecham's opera seasons at Covent Garden, and made more than 300 gramophone records. Berta Geissmar, his secretary from 1936, wrote, "The relations between the Orchestra and Sir Thomas were always easy and cordial. He always treated a rehearsal as a joint undertaking with the Orchestra. ... The musicians were entirely unselfconscious with him. Instinctively they accorded him the artistic authority which he did not expressly claim. Thus he obtained the best from them and they gave it without reserve."
By the early 1930s, Beecham had secured substantial control of the Covent Garden opera seasons. Wishing to concentrate on music-making rather than management, he assumed the role of artistic director, and Geoffrey Toye was recruited as managing director. In 1933, Tristan und Isolde with Frida Leider and Lauritz Melchior was a success, and the season continued with the Ring cycle and nine other operas. The 1934 season featured Conchita Supervía in La Cenerentola, and Lotte Lehmann and Alexander Kipnis in the Ring. Clemens Krauss conducted the British première of Strauss's Arabella. During 1933 and 1934, Beecham repelled attempts by John Christie to form a link between Christie's new Glyndebourne Festival and the Royal Opera House. Beecham and Toye fell out over the latter's insistence on bringing in a popular film star, Grace Moore, to sing Mimi in La bohème. The production was a box-office success, but an artistic failure. Beecham manoeuvred Toye out of the managing directorship in what their fellow conductor Sir Adrian Boult described as an "absolutely beastly" manner.
From 1935 to 1939, Beecham, now in sole control, presented international seasons with eminent guest singers and conductors. Beecham conducted between a third and half of the performances each season. He intended the 1940 season to include the first complete performances of Berlioz's Les Troyens, but the outbreak of the Second World War caused the season to be abandoned. Beecham did not conduct again at Covent Garden until 1951, and by then it was no longer under his control.
Beecham took the London Philharmonic on a controversial tour of Germany in 1936. There were complaints that he was being used by Nazi propagandists, and Beecham complied with a Nazi request not to play the Scottish Symphony of Mendelssohn, who was a Christian by faith but a Jew by birth. In Berlin, Beecham's concert was attended by Adolf Hitler, whose lack of punctuality caused Beecham to remark very audibly, "The old bugger's late." After this tour, Beecham refused renewed invitations to give concerts in Germany, although he honoured contractual commitments to conduct at the Berlin State Opera, in 1937 and 1938, and recorded The Magic Flute for EMI in the Beethovensaal in Berlin in the same years.
As his sixtieth birthday approached, Beecham was advised by his doctors to take a year's complete break from music, and he planned to go abroad to rest in a warm climate. The Australian Broadcasting Commission had been seeking for several years to get him to conduct in Australia. The outbreak of war on 3 September 1939 obliged him to postpone his plans for several months, striving instead to secure the future of the London Philharmonic, whose financial guarantees had been withdrawn by its backers when war was declared. Before leaving, Beecham raised large sums of money for the orchestra and helped its members to form themselves into a self-governing company.
### 1940s
Beecham left Britain in the spring of 1940, going first to Australia and then to North America. He became music director of the Seattle Symphony in 1941. In 1942 he joined the Metropolitan Opera as joint senior conductor with his former assistant Bruno Walter. He began with his own adaptation of Bach's comic cantata, Phoebus and Pan, followed by Le Coq d'Or. His main repertoire was French: Carmen, Louise (with Grace Moore), Manon, Faust, Mignon and The Tales of Hoffmann. In addition to his Seattle and New York posts, Beecham was guest conductor with 18 American orchestras.
In 1944, Beecham returned to Britain. Musically his reunion with the London Philharmonic was triumphant, but the orchestra, now, after his help in 1939, a self-governing co-operative, attempted to hire him on its own terms as its salaried artistic director. "I emphatically refuse", concluded Beecham, "to be wagged by any orchestra ... I am going to found one more great orchestra to round off my career." When Walter Legge founded the Philharmonia Orchestra in 1945, Beecham conducted its first concert. But he was not disposed to accept a salaried position from Legge, his former assistant, any more than from his former players in the LPO.
In 1946, Beecham founded the Royal Philharmonic Orchestra (RPO), securing an agreement with the Royal Philharmonic Society that the new orchestra should replace the LPO at all the Society's concerts. Beecham later agreed with the Glyndebourne Festival that the RPO should be the resident orchestra at Glyndebourne each summer. He secured backing, including that of record companies in the US as well as Britain, with whom lucrative recording contracts were negotiated. As in 1909 and in 1932, Beecham's assistants recruited in the freelance pool and elsewhere. Original members of the RPO included James Bradshaw, Dennis Brain, Leonard Brain, Archie Camden, Gerald Jackson and Reginald Kell. The orchestra later became celebrated for its regular team of woodwind principals, often referred to as "The Royal Family", consisting of Jack Brymer (clarinet), Gwydion Brooke (bassoon), Terence MacDonagh (oboe) and Gerald Jackson (flute).
Beecham's long association with the Hallé Orchestra as a guest conductor ceased after John Barbirolli became the orchestra's chief conductor in 1944. Beecham was, to his great indignation, ousted from the honorary presidency of the Hallé Concerts Society, and Barbirolli refused to "let that man near my orchestra". Beecham's relationship with the Liverpool Philharmonic, which he had first conducted in 1911, was resumed harmoniously after the war. A manager of the orchestra recalled, "It was an unwritten law in Liverpool that first choice of dates offered to guest conductors was given to Beecham. ... In Liverpool there was one over-riding factor – he was adored."
### 1950s and later years
Beecham, whom the BBC called "Britain's first international conductor", took the RPO on a strenuous tour through the United States, Canada and South Africa in 1950. During the North American tour, Beecham conducted 49 concerts in almost daily succession. In 1951, he was invited to conduct at Covent Garden after a 12-year absence. State-funded for the first time, the opera company operated quite differently from his pre-war regime. Instead of short, star-studded seasons, with a major symphony orchestra, the new director David Webster was attempting to build up a permanent ensemble of home-grown talent performing all the year round, in English translations. Extreme economy in productions and great attention to the box-office were essential, and Beecham, though he had been hurt and furious at his exclusion, was not suited to participate in such an undertaking. When offered a chorus of eighty singers for his return, conducting Die Meistersinger, he insisted on augmenting their number to 200. He also, contrary to Webster's policy, insisted on performing the piece in German. In 1953 at Oxford, Beecham presented the world premiere of Delius's first opera, Irmelin, and his last operatic performances in Britain were in 1955 at Bath, with Grétry's Zémire et Azor.
Between 1951 and 1960, Beecham conducted 92 concerts at the Royal Festival Hall. Characteristic Beecham programmes of the RPO years included symphonies by Bizet, Franck, Haydn, Schubert and Tchaikovsky; Richard Strauss's Ein Heldenleben; concertos by Mozart and Saint-Saëns; a Delius and Sibelius programme; and many of his favoured shorter pieces. He did not stick uncompromisingly to his familiar repertoire. After the sudden death of the German conductor Wilhelm Furtwängler in 1954, Beecham in tribute conducted the two programmes his colleague had been due to present at the Festival Hall; these included Bach's Third Brandenburg Concerto, Ravel's Rapsodie espagnole, Brahms's Symphony No. 1, and Barber's Second Essay for Orchestra.
In the summer of 1958, Beecham conducted a season at the Teatro Colón, Buenos Aires, Argentina, consisting of Verdi's Otello, Bizet's Carmen, Beethoven's Fidelio, Saint-Saëns's Samson and Delilah and Mozart's The Magic Flute. These were his last operatic performances. It was during this season that Betty Humby died suddenly. She was cremated in Buenos Aires and her ashes returned to England. Beecham's own last illness prevented his operatic debut at Glyndebourne in a planned Magic Flute and a final appearance at Covent Garden conducting Berlioz's The Trojans.
Sixty-six years after his first visit to America, Beecham made his last, beginning in late 1959, conducting in Pittsburgh, San Francisco, Seattle, Chicago and Washington. During this tour, he also conducted in Canada. He flew back to London on 12 April 1960 and did not leave England again. His final concert was at Portsmouth Guildhall on 7 May 1960. The programme, all characteristic choices, comprised the Magic Flute Overture, Haydn's Symphony No. 100 (the Military), Beecham's own Handel arrangement, Love in Bath, Schubert's Symphony No. 5, On the River by Delius, and the Bacchanale from Samson and Delilah.
Beecham died of a coronary thrombosis at his London residence, aged 81, on 8 March 1961. He was buried two days later in Brookwood Cemetery, Surrey. Owing to changes at Brookwood, his remains were exhumed in 1991 and reburied in St Peter's Churchyard at Limpsfield, Surrey, close to the joint grave of Delius and his wife Jelka Rosen.
### Personal life
Beecham was married three times. In 1903 he married Utica Celestina Welles, daughter of Dr Charles S. Welles, of New York, and his wife Ella Celeste, née Miles. Beecham and his wife had two sons: Adrian, born in 1904, who became a composer and achieved some celebrity in the 1920s and 1930s, and Thomas, born in 1909. After the birth of his second child, Beecham began to drift away from the marriage. By 1911, no longer living with his wife and family, he was involved as co-respondent in a much-publicised divorce case. Utica ignored advice that she should divorce him and secure substantial alimony; she did not believe in divorce. She never remarried after Beecham divorced her (in 1943), and she outlived her former husband by sixteen years, dying in 1977.
In 1909 or early 1910, Beecham began an affair with Maud Alice (known as Emerald), Lady Cunard. Although they never lived together, it continued, despite other relationships on his part, until his remarriage in 1943. She was a tireless fund-raiser for his musical enterprises. Beecham's biographers are agreed that she was in love with him, but that his feelings for her were less strong. During the 1920s and 1930s, Beecham also had an affair with Dora Labbette, a soprano sometimes known as Lisa Perli, with whom he had a son, Paul Strang, born in March 1933. Strang, a lawyer who served on the boards of several musical institutions, died in April 2024.
In 1943 Lady Cunard was devastated to learn (not from Beecham) that he intended to divorce Utica to marry Betty Humby, a concert pianist 29 years his junior. Beecham married Betty in 1943, and they were a devoted couple until her death in 1958. On 10 August 1959, less than two years before his death, he married his former secretary, Shirley Hudson, in Zurich; she had worked in the Royal Philharmonic Orchestra's administration since 1950. She was 27, he 80.
## Repertoire
### Handel, Haydn, and Mozart
The earliest composer whose music Beecham regularly performed was Handel, whom he called, "the great international master of all time. ... He wrote Italian music better than any Italian; French music better than any Frenchman; English music better than any Englishman; and, with the exception of Bach, outrivalled all other Germans." In his performances of Handel, Beecham ignored what he called the "professors, pedants, pedagogues". He followed Mendelssohn and Mozart in editing and reorchestrating Handel's scores to suit contemporary tastes. At a time when Handel's operas were scarcely known, Beecham knew them so well that he was able to arrange three ballets, two other suites and a piano concerto from them. He gave Handel's oratorio Solomon its first performance since the 18th century, with a text edited by the conductor.
With Haydn, too, Beecham was far from an authenticist, using unscholarly 19th-century versions of the scores, avoiding the use of the harpsichord, and phrasing the music romantically. He recorded the twelve "London" symphonies, and regularly programmed some of them in his concerts. Earlier Haydn works were unfamiliar in the first half of the 20th century, but Beecham conducted several of them, including the Symphony No. 40 and an early piano concerto. He programmed The Seasons regularly throughout his career, recording it for EMI in 1956, and in 1944 added The Creation to his repertoire.
For Beecham, Mozart was "the central point of European music," and he treated the composer's scores with more deference than he gave most others. He edited the incomplete Requiem, made English translations of at least two of the great operas, and introduced Covent Garden audiences who had rarely if ever heard them to Così fan tutte, Der Schauspieldirektor and Die Entführung aus dem Serail; he also regularly programmed The Magic Flute, Don Giovanni and The Marriage of Figaro. He considered the best of Mozart's piano concertos to be "the most beautiful compositions of their kind in the world", and he played them many times with Betty Humby-Beecham and others.
### German music
Beecham's attitude towards 19th-century German repertoire was equivocal. He frequently disparaged Beethoven, Wagner and others, but regularly conducted their works, often with great success. He observed, "Wagner, though a tremendous genius, gorged music like a German who overeats. And Bruckner was a hobbledehoy who had no style at all ... Even Beethoven thumped the tub; the Ninth symphony was composed by a kind of Mr. Gladstone of music." Despite his criticisms, Beecham conducted all the Beethoven symphonies during his career, and he made studio recordings of Nos. 2, 3, 4, 6, 7 and 8, and live recordings of No. 9 and Missa Solemnis. He conducted the Fourth Piano Concerto with pleasure (recording it with Arthur Rubinstein and the LPO) but avoided the Emperor Concerto when possible.
Beecham was not known for his Bach but nonetheless chose Bach (arranged by Beecham) for his debut at the Metropolitan Opera. He later gave the Third Brandenburg Concerto in one of his memorial concerts for Wilhelm Furtwängler (a performance described by The Times as "a travesty, albeit an invigorating one"). In Brahms's music, Beecham was selective. He made a speciality of the Second Symphony but conducted the Third only occasionally, the First rarely, and the Fourth never. In his memoirs he made no mention of any Brahms performance after the year 1909.
Beecham was a great Wagnerian, despite his frequent expostulation about the composer's length and repetitiousness: "We've been rehearsing for two hours – and we're still playing the same bloody tune!" Beecham conducted all the works in the regular Wagner canon with the exception of Parsifal, which he presented at Covent Garden but never with himself in the pit. The chief music critic of The Times observed: "Beecham's Lohengrin was almost Italian in its lyricism; his Ring was less heroic than Bruno Walter's or Furtwängler's, but it sang from beginning to end".
Richard Strauss had a lifelong champion in Beecham, who introduced Elektra, Salome, Der Rosenkavalier and other operas to England. Beecham programmed Ein Heldenleben from 1910 until his last year; his final recording of it was released shortly after his death. Don Quixote, Till Eulenspiegel, the Bourgeois Gentilhomme music and Don Juan also featured in his repertory, but not Also sprach Zarathustra or Tod und Verklärung. Strauss had the first and last pages of the manuscript of Elektra framed and presented them to "my highly honoured friend ... and distinguished conductor of my work."
### French and Italian music
In the opinion of the jury of the Académie du Disque Français, "Sir Thomas Beecham has done more for French music abroad than any French conductor". Berlioz featured prominently in Beecham's repertoire throughout his career, and in an age when the composer's works received little exposure, Beecham presented most of them and recorded many. Along with Sir Colin Davis, Beecham has been described as one of the two "foremost modern interpreters" of Berlioz's music. Both in concert and the recording studio, Beecham's choices of French music were characteristically eclectic. He avoided Ravel but regularly programmed Debussy. Fauré did not feature often, although his orchestral Pavane was an exception; Beecham's final recording sessions in 1959 included the Pavane and the Dolly Suite. Bizet was among Beecham's favourites, and other French composers favoured by him included Gustave Charpentier, Delibes, Duparc, Grétry, Lalo, Lully, Offenbach, Saint-Saëns and Ambroise Thomas. Many of Beecham's later recordings of French music were made in Paris with the Orchestre National de la Radiodiffusion Française. "C'est un dieu", their concertmaster said of Beecham in 1957.
Of the more than two dozen operas in the Verdi canon, Beecham conducted eight during his long career: Il trovatore, La traviata, Aida, Don Carlos, Rigoletto, Un ballo in maschera, Otello and Falstaff. As early as 1904, Beecham met Puccini through the librettist Luigi Illica, who had written the libretto for Beecham's youthful attempt at composing an Italian opera. At the time of their meeting, Puccini and Illica were revising Madama Butterfly after its disastrous première. Beecham rarely conducted that work, but he conducted Tosca, Turandot and La bohème. His 1956 recording of La bohème, with Victoria de los Ángeles and Jussi Björling, has seldom been out of the catalogues since its release and received more votes than any other operatic set in a 1967 symposium of prominent critics.
### Delius, Sibelius and "Lollipops"
Except for Delius, Beecham was generally antipathetic to, or at best lukewarm about, the music of his native land and its leading composers. Beecham's championship of Delius, however, promoted the composer from relative obscurity. Delius's amanuensis, Eric Fenby, referred to Beecham as "excelling all others in the music of Delius ... Groves and Sargent may have matched him in the great choruses of A Mass of Life, but in all else Beecham was matchless, especially with the orchestra." In an all-Delius concert in June 1911 Beecham conducted the premiere of Songs of Sunset. He put on Delius Festivals in 1929 and 1946 and presented his concert works throughout his career. He conducted the British premieres of the operas A Village Romeo and Juliet in 1910 and Koanga in 1935, and the world premiere of Irmelin in 1953. However, he was not an uncritical Delian: he never conducted the Requiem, and he detailed his criticisms of it in his book on Delius.
Another major 20th-century composer who engaged Beecham's sympathies was Sibelius, who recognised him as a fine conductor of his music (although Sibelius tended to be lavish with praise of anybody who conducted his music). In a live recording of a December 1954 concert performance of Sibelius's Second Symphony with the BBC Symphony Orchestra in the Festival Hall, Beecham can be heard uttering encouraging shouts at the orchestra at climactic moments.
Beecham was dismissive of some of the established classics, saying for example, "I would give the whole of Bach's Brandenburg Concertos for Massenet's Manon, and would think I had vastly profited by the exchange". He was, by contrast, famous for presenting slight pieces as encores, which he called "lollipops". Some of the best-known were Berlioz's Danse des sylphes, Chabrier's Joyeuse Marche and Gounod's Le Sommeil de Juliette.
## Recordings
The composer Richard Arnell wrote that Beecham preferred making records to giving concerts: "He told me that audiences got in the way of music-making – he was apt to catch someone's eye in the front row." The conductor and critic Trevor Harvey wrote in The Gramophone, however, that studio recordings could never recapture the thrill of Beecham performing live in the concert hall.
Beecham began making recordings in 1910, when the acoustical process obliged orchestras to use only principal instruments, placed as close to the recording horn as possible. His first recordings, for HMV, were of excerpts from Offenbach's The Tales of Hoffmann and Johann Strauss's Die Fledermaus. In 1915, Beecham began recording for the Columbia Graphophone Company. Electrical recording technology (introduced in 1925–26) made it possible to record a full orchestra with much greater frequency range, and Beecham quickly embraced the new medium. Longer scores had to be broken into four-minute segments to fit on 12-inch 78-rpm discs, but Beecham was not averse to recording piecemeal – his well-known 1932 disc of Chabrier's España was recorded in two sessions three weeks apart. Beecham recorded many of his favourite works several times, taking advantage of improved technology over the decades.
From 1926 to 1932, Beecham made more than 70 discs, including an English version of Gounod's Faust and the first of three recordings of Handel's Messiah. He began recording with the London Philharmonic Orchestra in 1933, making more than 150 discs for Columbia, including music by Mozart, Rossini, Berlioz, Wagner, Handel, Beethoven, Brahms, Debussy and Delius. Among the most prominent of his pre-war recordings was the first complete recording of Mozart's The Magic Flute with the Berlin Philharmonic Orchestra, made for HMV and supervised by Walter Legge in Berlin in 1937–38, a set described by Alan Blyth in Gramophone magazine in 2006 as having "a legendary status". In 1936, during his German tour with the LPO, Beecham conducted the world's first orchestral recording on magnetic tape, made at Ludwigshafen, the home of BASF, the company that developed the process.
During his stay in the US and afterwards, Beecham recorded for American Columbia Records and RCA Victor. His RCA recordings include major works that he did not subsequently re-record for the gramophone, including Beethoven's Fourth, Sibelius's Sixth and Mendelssohn's Reformation Symphonies. Some of his RCA recordings were issued only in the US, including Mozart's Symphony No. 27, K199, the overtures to Smetana's The Bartered Bride and Mozart's La clemenza di Tito, the Sinfonia from Bach's Christmas Oratorio, a 1947–48 complete recording of Gounod's Faust, and an RPO studio version of Sibelius's Second Symphony. Beecham's RCA records that were released on both sides of the Atlantic were his celebrated 1956 complete recording of Puccini's La bohème and an extravagantly rescored set of Handel's Messiah. The former remains a top recommendation among reviewers, and the latter was described by Gramophone as "an irresistible outrage ... huge fun".
For the Columbia label, Beecham recorded his last, or only, versions of many works by Delius, including A Mass of Life, Appalachia, North Country Sketches, An Arabesque, Paris and Eventyr. Other Columbia recordings from the early 1950s include Beethoven's Eroica, Pastoral and Eighth symphonies, Mendelssohn's Italian symphony, and the Brahms Violin Concerto with Isaac Stern.
From his return to England at the end of the Second World War until his final recordings in 1959, Beecham continued his early association with HMV and British Columbia, which had merged to form EMI. From 1955, his EMI recordings made in London were recorded in stereo. He also recorded in Paris, with his own RPO and with the Orchestre National de la Radiodiffusion Française, though the Paris recordings were in mono until 1958. For EMI, Beecham recorded two complete operas in stereo, Die Entführung aus dem Serail and Carmen. His last recordings were made in Paris in December 1959. Beecham's EMI recordings have been continually reissued on LP and CD. In 2011, to mark the 50th anniversary of Beecham's death, EMI released 34 CDs of his recordings of music from the 18th, 19th and 20th centuries, including works by Haydn, Mozart, Beethoven, Brahms, Wagner, Richard Strauss and Delius, and many of the French "lollipops" with which he was associated.
## Relations with others
Beecham's relations with fellow British conductors were not always cordial. Sir Henry Wood regarded him as an upstart and was envious of his success; the scrupulous Sir Adrian Boult found him "repulsive" as a man and a musician; and Sir John Barbirolli mistrusted him. Sir Malcolm Sargent worked with him in founding the London Philharmonic and was a friend and ally, but he was the subject of unkind, though witty, digs from Beecham who, for example, described the image-conscious Herbert von Karajan as "a kind of musical Malcolm Sargent". Beecham's relations with foreign conductors were often excellent. He did not get on well with Arturo Toscanini, but he liked and encouraged Wilhelm Furtwängler, admired Pierre Monteux, fostered Rudolf Kempe as his successor with the RPO, and was admired by Fritz Reiner, Otto Klemperer and Karajan.
Despite his lordly drawl, Beecham remained a Lancastrian at heart. "In my county, where I come from, we're all a bit vulgar, you know, but there is a certain heartiness – a sort of bonhomie about our vulgarity – which tides you over a lot of rough spots in the path. But in Yorkshire, in a spot of bother, they're so damn-set-in-their-ways that there's no doing anything with them!"
Beecham has been much quoted. In 1929, the editor of a music journal wrote, "The stories gathered around Sir Thomas Beecham are innumerable. Wherever musicians come together, he is likely to be one of the topics of conversation. Everyone telling a Beecham story tries to imitate his manner and his tone of voice." A book, Beecham Stories, was published in 1978 consisting entirely of his bons mots and anecdotes about him. Some are variously attributed to Beecham or one or more other people, including Arnold Bax and Winston Churchill; Neville Cardus admitted to inventing some himself. Among the Beecham lines that are reliably attributed are, "A musicologist is a man who can read music but can't hear it"; his maxim, "There are only two things requisite so far as the public is concerned for a good performance: that is for the orchestra to begin together and end together; in between it doesn't matter much"; and his remark at his 70th birthday celebrations after telegrams were read out from Strauss, Stravinsky and Sibelius: "Nothing from Mozart?"
He was completely indifferent to mundane tasks such as correspondence, and was less than responsible with the property of others. On one occasion, during bankruptcy proceedings, two thousand unopened letters were discovered among his papers. Havergal Brian sent him three scores with a view to having them performed. One of them, the Second English Suite, was never returned and is now considered lost.
## Honours and commemorations
Beecham was knighted in 1916 and succeeded to the baronetcy on the death of his father later that year. In 1938 the President of France, Albert Lebrun, invested him with the Légion d'honneur. In 1955, Beecham was presented with the Order of the White Rose of Finland. He was a Commendatore of the Order of the Crown of Italy and was made a Member of the Order of the Companions of Honour in the 1957 Queen's Birthday Honours. He was an honorary Doctor of Music of the universities of Oxford, London, Manchester and Montreal.
Beecham, by Caryl Brahms and Ned Sherrin, is a play celebrating the conductor and drawing on a large number of Beecham stories for its material. Its first production, in 1979, starred Timothy West in the title role. It was later adapted for television, starring West, with members of the Hallé Orchestra taking part in the action and playing pieces associated with Beecham.
In 1980 the Royal Mail put Beecham's image on the 13½p postage stamp in a series portraying British conductors; the other three in the series depicted Wood, Sargent and Barbirolli. The Sir Thomas Beecham Society preserves Beecham's legacy through its website and release of historic recordings.
In 2012, Beecham was voted into the inaugural Gramophone magazine "Hall of Fame".
## Books by Beecham
Beecham's published books were:
- A Mingled Chime: Leaves from an Autobiography (1944)
- John Fletcher (1956)
- Frederick Delius (1959)
The last of these was reissued in 1975 by Severn House, London, with an introduction by Felix Aprahamian and a discography by Malcolm Walker.
## See also
- Thomas Beecham selected discography
# Laguna del Maule (volcano)
Laguna del Maule is a volcanic field in the Andes mountain range of Chile, close to, and partly overlapping, the Argentina–Chile border. The bulk of the volcanic field is in the Talca Province of Chile's Maule Region. It is a segment of the Southern Volcanic Zone, part of the Andean Volcanic Belt. The volcanic field covers an area of 500 km<sup>2</sup> (190 sq mi) and features at least 130 volcanic vents. Volcanic activity has generated cones, lava domes, lava coulees and lava flows, which surround the Laguna del Maule lake. The field gets its name from the lake, which is also the source of the Maule River.
The field's volcanic activity began 1.5 million years ago during the Pleistocene epoch; such activity has continued into the postglacial and Holocene epoch after glaciers retreated from the area. Postglacial volcanic activity has included eruptions with simultaneous explosive and effusive components, as well as eruptions with only one component. In the postglacial era, volcanic activity has increased at Laguna del Maule, with the volcanic field rapidly inflating during the Holocene. Three major caldera-forming eruptions took place in the volcanic field prior to the last glacial period. The most recent eruptions in the volcanic field took place 2,500 ± 700, 1,400 ± 600 and 800 ± 600 years ago and generated lava flows; today geothermal phenomena occur at Laguna del Maule. Volcanic rocks in the field include basalt, andesite, dacite and rhyolite; the latter along with rhyodacite makes up most of the Holocene rocks. In pre-Columbian times, the field was a regionally important source of obsidian.
Between 2004 and 2007, ground inflation began in the volcanic field, indicating the intrusion of a sill beneath it. The rate of inflation is faster than those measured on other inflating volcanoes such as Uturunku in Bolivia and Yellowstone Caldera in the United States and has been accompanied by anomalies in soil gas emission and seismic activity. This pattern has created concern about the potential for impending large-scale eruptive activity.
## Geography and structure
The Laguna del Maule volcanic field straddles the Chilean–Argentine frontier; most of the complex lies on the Chilean side. The locality belongs to Talca Province in Chile's Maule Region, in the Andes mountain range; it is close to the confluence of the Maule and Campanario rivers in the Maule valley. The city of Talca lies about 140–150 km (87–93 mi) west. The Argentine section of the field is in the Mendoza and Neuquén provinces, and the city of Malargüe is located about 140 km (87 mi) east of the volcanic field. The seasonal Highway 115 passes through the northern part of the volcanic field. The Paso Pehuenche mountain pass, a few kilometres northeast of the lake, connects Argentina and Chile; the Chilean customs post is at the outlet of the lake. Tourists and fishermen come to the lake during summer, and the crew of the Laguna del Maule dam remains there year-round. Otherwise, the region is sparsely inhabited and economic activity is limited to oil prospecting, pastures and tourism; the closest towns are La Mina and Los Cipreses, over 20 km (12 mi) northwest of Laguna del Maule.
The Laguna del Maule volcanic field covers a surface area of 500 km<sup>2</sup> (190 sq mi) and contains at least 130 volcanic vents including cones, lava domes, lava flows, and shield volcanoes; 36 silicic coulees and lava domes surround the lake. Over 100 km<sup>2</sup> (39 sq mi) of the field is covered by these volcanic rocks. The volcanic field lies at an average height of 2,400 m (7,900 ft), and some summits around Laguna del Maule reach altitudes of 3,900 m (12,800 ft). Volcanic ash and pumice produced by the eruptions have been found over 20 km (12 mi) away in Argentina. A number of Quaternary volcanic systems of various ages surround Laguna del Maule lake, including about 14 shield volcanoes and stratovolcanoes that have been degraded by glaciation. The topography in the area is often steep.
Among the structures in the volcanic field, the Domo del Maule lava dome is of rhyolitic composition and generated a lava flow to the north that dammed the Laguna del Maule. This lava flow is joined by other lava flows from the Crater Negro, a small cone in the southwest sector of the volcanic field; the lavas of this cone are andesitic and basaltic. Loma de Los Espejos is a large lava flow of acidic rocks that is 4 km (2.5 mi) long in the northern sector of the volcanic field, close to the outlet of Laguna del Maule. It consists of two lobes with a volume of about 0.82 km<sup>3</sup> (0.20 cu mi) and contains obsidian and vitrophyre. Crystals within the flow reflect the sunlight. The well-preserved Colada de las Nieblas lava flow is in the extreme southwest sector of the volcanic field and originates at a tuff cone. This lava flow is 300 m (980 ft) thick, varying from 5 km (3.1 mi) to 6 km (3.7 mi) in length, and is about 3 km (1.9 mi) wide. The Barrancas centre has a volume of 5.5 km<sup>3</sup> (1.3 cu mi) and reaches an elevation of 3,092 m (10,144 ft).
Past glaciation of this part of the Andes left traces in adjacent valleys, such as their U-shaped or trench-shaped outline. The older volcanics of Laguna del Maule have been disproportionately eroded by glacial action. Slopes around Laguna del Maule lake are covered by colluvium including talus.
The Laguna del Maule lake lies on the crest of the Andes, within a depression with a diameter of 20 km (12 mi). The lake has a depth of 50 m (160 ft) and covers an area of 54 km<sup>2</sup> (21 sq mi); its surface is at an altitude of 2,160 m (7,090 ft). The name of the volcanic field comes from the lake, which contains several islets. On the lakefloor are slump scars, pits that may be pockmarks, and a basin in the northern lake sector that may be the crater of an early Holocene Plinian eruption. Terraces around the lake indicate that water levels have fluctuated in the past; an eruption dated between 19,000 ± 700 and 23,300 ± 400 years ago dammed the lake 200 m (660 ft) higher than its present level. When the dam broke 9,400 years ago, a lake outburst flood occurred that released 12 km<sup>3</sup> (2.9 cu mi) of water and left traces, such as scour, in the down-valley gorge. Benches and beach bars developed along the lake, leaving a shoreline around Laguna del Maule. The lake is regulated by a dam at the outlet, begun in 1950 and completed in 1957, which caused a slight expansion of the lake's area. Laguna del Maule is Chile's fourth-largest reservoir with a capacity of 0.850 cubic kilometres (0.204 cu mi). Additionally, tephra fallout, such as that from the 1932 Quizapu eruption, has reached the lake throughout the Holocene and affected life in its waters.
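As a rough consistency check (an illustrative approximation, not a figure from the sources cited here), the reported flood volume can be compared with the lake's dimensions, assuming the present-day surface area and a full 200 m (660 ft) drop in level when the lava dam failed:

```latex
% Order-of-magnitude check, assuming the lake's present-day area (~54 km^2)
% and a 200 m (0.2 km) drawdown when the dam broke.
V \approx A \,\Delta h \approx 54~\mathrm{km^2} \times 0.2~\mathrm{km} \approx 11~\mathrm{km^3}
% This is comparable to the ~12 km^3 reported for the outburst flood; the
% dammed lake was somewhat larger than today's, which would account for the
% slightly higher reported volume.
```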
Besides Laguna del Maule, other lakes in the field are Laguna El Piojo in the southwest and Laguna Cari Launa in the northeast, both on the Chilean side, and, on the Argentine side, Laguna Fea in the south at 2,492 m (8,176 ft) elevation and Laguna Negra. Laguna Fea is held back by a pumice dam and currently lacks an outlet. Laguna Sin Salida ("lake without exit", so named because no river runs out of it) is in the northeastern sector of the volcanic field and formed within a glacial cirque. The Andean drainage divide runs across the volcanic field; most of the field lies west of the divide and drains into the Maule River, partly through its tributary the Melado. The Maule River originates in the field, and the Pehuenche and Barrancas Rivers also have their headwaters in the volcanic field.
## Geology
Subduction of the eastern part of the Nazca Plate beneath the western margin of the South American Plate occurs at a rate of about 74 ± 2 mm/a (2.913 ± 0.079 in/year). This subduction process is responsible for the growth of the Chilean Andes, for large earthquakes such as the 1960 Valdivia and 2010 Chile earthquakes, and for volcanic and geothermal activity, including that of Laguna del Maule, which formed 25 km (16 mi) behind the volcanic arc.
A phase of strong volcanic activity began in the Andes 25 million years ago, probably due to increased convergence rates of the Nazca and South America plates over the past 28 million years. It is likely that this phase has persisted without interruption until today.
The subduction of the Nazca Plate beneath the South American Plate has formed a volcanic arc about 4,000 km (2,500 mi) long, which is subdivided into several segments distinguished by varying angles of subduction. The part of the volcanic belt named the Southern Volcanic Zone contains at least 60 volcanoes with historical activity and three major caldera systems. Major volcanoes of the Southern Volcanic Zone include, from north to south, Maipo, Cerro Azul, Calabozos, Tatara-San Pedro, Laguna del Maule, Antuco, Villarrica, Puyehue-Cordón Caulle, Osorno, and Chaitén. Laguna del Maule is located within a segment known as the Transitional Southern Volcanic Zone, 330 km (210 mi) east of the Peru–Chile Trench and 25 km (16 mi) behind the main arc. Volcanoes in this segment are typically located on basement blocks that have been uplifted between extensional basins.
In the area of Laguna del Maule, the subducting Nazca plate reaches a depth of 130 km (81 mi) and is 37 million years old. During the Late Miocene, the convergence rate was higher than today and the Malargüe fold belt formed east of the main chain in response. The Moho is found at depths of 40–50 km (25–31 mi) beneath the volcanic field.
### Local
The Campanario Formation is 15.3 to 7 million years old and forms much of the basement in the Laguna del Maule area; this geological formation contains andesitic-dacitic ignimbrites and tuffs with later dacitic dykes that were emplaced 3.6–2.0 million years ago. An older unit, of Jurassic–Cretaceous age, crops out northwest of the volcanic field. Other units include an Oligocene–Miocene group of lacustrine and fluvial formations named Cura-Mallín, and another intermediary formation named Trapa-Trapa, which is of volcanic origin and between 19 and 10 million years old. Remnants of Quaternary ignimbrites and Pliocene, early Quaternary volcanic centres, are also found around the field; they form the Cola del Zorro Formation, which is partly covered by the eruption products of Laguna del Maule. Glacial tills occur at the volcanic field.
There are several faults in the volcanic field, such as the Laguna Fea and Troncoso Faults in the southwest sector and Los Condores in the northwestern part. The Laguna Fea fault is a west-northwest trending normal fault that was identified during seismic surveys. The inactive Troncoso is alternatively described as a strike-slip or normal fault; it runs along the Cajón Troncoso valley and separates distinct regimes of tectonic and volcanic activity within the Laguna del Maule volcanic field. Faults have been imaged in lake sediments. Other north–south cutting faults are found within the Campanario Formation and the tectonic Las Loicas Trough is associated with Laguna del Maule and passes southeast of it. Some faults at Laguna del Maule may be linked to the northern termination of the Liquiñe-Ofqui Fault Zone, while others may relate to large-scale lineaments that cross the Andes.
Northeast of Laguna del Maule are several mountains that reach elevations exceeding 3 kilometres (1.9 mi); many of these are eroded volcanoes. Cerro Campanario is a 3,943 m (12,936 ft) high mafic stratovolcano that was active 160,000–150,000 years ago. South of Laguna del Maule is the Varvarco volcanic field, while the Puelche volcanic field and the Pichi Trolon calderas are north and northeast of it, respectively; all were active in the Pleistocene. The volcanoes Nevado de Longaví, Tatara-San Pedro and the caldera Rio Colorado lie west of Laguna del Maule; the latter two may be part of a volcano alignment with Laguna del Maule. The local volcanoes are in a segment of the crust where the Wadati–Benioff zone is 90 km (56 mi) deep. More distant are the Calabozos caldera and a late Pleistocene system with domes and flows south of Cerro San Francisquito, which are both silicic volcanic systems. The activity of Tatara-San Pedro and Laguna del Maule with the presence of rhyolite may be influenced by the subduction of the Mocha Fracture Zone, which projects in the direction of these volcanic centres. Nearby are the Risco Bayo and Huemul plutons, which are about 6.2 million years old and may have formed through volcanism similar to that of Laguna del Maule.
### Composition of erupted rocks
Laguna del Maule has erupted andesite, basaltic andesite, basalt, dacite, rhyodacite and rhyolite; the andesites and basaltic andesites define a rock suite with medium potassium contents. In the Loma de Los Espejos rocks a SiO<sub>2</sub> content of 75.6–76.7% by weight has been noted. Zircon composition data indicate that the magmatic system has evolved over time: after deglaciation, the composition of Laguna del Maule volcanic rocks has grown more silicic; since 19,000 years ago, andesite eruptions have been restricted to the edges of the volcanic field, consistent with the maturation of a silicic magmatic system. The postglacial phase of activity has generated about 6.4 km<sup>3</sup> (1.5 cu mi) of rhyolite and 1.0 km<sup>3</sup> (0.2 cu mi) of rhyodacite. Of the more than 350 km<sup>3</sup> (84 cu mi) of volcanic rock in the Laguna del Maule field, about 40 km<sup>3</sup> (9.6 cu mi) were emplaced postglacially. Laguna del Maule magmas contain large amounts of water and carbon dioxide; postglacial magmas on average consist of 5–6% water by weight, with some variation between individual eruptions. Flushing of the magma with carbon dioxide may be important for starting eruptions.
Several stratigraphic units have been distinguished at the volcanic field, including the Valley unit exposed in the Maule valley and the Lake unit found around the lake. The Valley unit's rocks are basaltic andesite. Plagioclase and, in lesser measure, clinopyroxene and olivine form its phenocrysts. The Lake unit is mostly postglacial and includes glassy rhyolite, which is poor in crystals. Phenocrysts in the postglacial rocks are biotite, plagioclase and quartz. Granitic xenoliths and mafic rocks occur as discrete rock fragments in the rhyolitic units erupted by the rdm eruption. Microlites in the Lake unit rocks include biotite, plagioclase and spinel. Variable vesicular texture has been noted on rocks erupted during different eruptions. Temperatures of the postglacial magmas have been estimated at 820–950 °C (1,510–1,740 °F). The Holocene rhyolites are glassy and contain few crystals. Hydrothermal alteration has been reported at various sites such as La Zorra, generating alunite, calcite, halite, illite, jarosite, kaolinite, montmorillonite, opal, quartz, pyrite, smectite, sulfur, travertine and zeolite. At La Zorra there are occurrences of actinolite, apatite, augite, calcite, chlorite, hypersthene, ilmenite, magnetite, phlogopite, pyrite, pyroxene, quartz, thorite, titanite and zircon.
The postglacial rocks have broadly similar elemental compositions. High aluminium (Al) and low titanium (Ti) contents are present in the basaltic andesite and basalt, a typical pattern for basic rocks in zones where plates converge. The rocks overall belong to the calc-alkaline series, although some iron-rich rocks have been attributed to the tholeiitic series. Strontium (Sr) isotope ratios have been compared to those of Tronador volcano; additional compositional similarity is found with other volcanoes close to Laguna del Maule, such as Cerro Azul and Calabozos. Laguna del Maule stands out for the frequency of rhyolitic rocks compared with volcanoes farther south in the chain. There are compositional trends along the volcanic arc between 33° S and 42° S: more northerly volcanoes are more andesitic in composition, while to the south basalts are more frequent.
### Magma genesis
The postglacial activity appears to originate from a shallow silicic magma chamber beneath the caldera. Research published in 2017 by Anderson et al. indicates that this system is somewhat heterogeneous, with distinct compositions of magmas erupted in the northwesterly and southeasterly parts of the volcanic field. The early post-glacial rhyodacites contain mafic inclusions, implying that mafic lavas exist but do not reach the surface. From Sr isotope ratios it has been inferred that the magma is of deep origin, and the rare-earth element composition shows no evidence of crustal contamination. Neodymium (Nd) and Sr isotope ratios indicate all rocks are derived from the same parent source, with the rhyolites forming by fractional crystallization of the basic magma, similar to the postulated origins of rocks from the Central Volcanic Zone. Partial melting may also be the source of the rhyolites. Overall, the environment where the rocks formed appears to be an oxidized system at 760–850 °C (1,400–1,560 °F) that developed over 100,000 to 200,000 years and was influenced by the injection of basaltic magma. The rhyolitic melts may originate in a crystal-rich mush beneath the volcanic field and probably in at least two magma chambers. The magma remains in the chamber for days or weeks before erupting. A minimum long-term magma supply rate of 0.0005 km<sup>3</sup>/a (0.00012 cu mi/a) has been estimated, with a rate of 0.0023 km<sup>3</sup>/a (0.00055 cu mi/a) during the past 20,000 years.
## Obsidian and iron oxide-apatite
In pre-Columbian times, Laguna del Maule was an important source of obsidian for the region, on both sides of the Andes. Finds have been made from the Pacific Ocean to Mendoza, 400 km (250 mi) away, as well as at archaeological sites of Neuquén Province. Obsidian forms sharp edges and was used by ancient societies for the production of projectiles as well as cutting instruments. In South America, obsidian was traded over large distances. Obsidian has been found in the Arroyo El Pehuenche, Laguna Negra and Laguna del Maule localities. These sites yield obsidians with varying properties, from large blocks at Laguna del Maule to smaller pebbles probably carried by water at Arroyo El Pehuenche. Another scheme has a Laguna del Maule 1 source at Laguna Fea and Laguna Negra and a Laguna del Maule 2 source on the Barrancas river.
An occurrence of iron oxide (magnetite)-apatite ores (IOA) has been found at La Zorra volcano and is named Vetas de Maule ("Veins of Maule"). It features massive magnetite blocks, magnetite grains, veins and breccias; the dimensions of the magnetite occurrences range from tens of metres to a few centimetres. IOA-type deposits are important iron resources and form in volcanoes, either through magmatic or hydrothermal processes. The IOA deposit at Laguna del Maule is one of the youngest in the world, being less than one million years old. It presumably formed through hydrothermal processes, about 120,000 years after the volcano was emplaced.
## Climate and vegetation
Laguna del Maule lies at the interface between a semi-arid, temperate climate and a colder montane climate. It has a tundra climate, with maximum temperatures of 14.1 °C (57.4 °F) in January and minimum of −4.6 °C (23.7 °F) in July. Annual precipitation reaches about 1,700 mm/a (67 in/year); precipitation related to cold fronts falls during autumn and winter, although occasional summer storms also contribute to rainfall. Laguna del Maule is subject to the rain shadow effect of mountains farther west, which is why the numerous summits more than 3,000 m (9,800 ft) high around the lake are not glaciated. Most of the lake water comes from snowmelt; for much of the year the landscape around the lake is covered with snow and storms and snowfall frequently impede traffic at the lake. Winds frequently blow sand and pumice.
The area of Laguna del Maule was glaciated during the last glacial period. A glacial maximum occurred between 25,600 ± 1,200 and 23,300 ± 600 years ago, during which an 80 km-wide (50 mi) ice cap covered the volcano and the surrounding valleys. There are uncertainties about when the glaciers retreated, but radiocarbon dating suggests that deglaciation took place 17,000 years ago, synchronously with the rest of the Americas. The glaciation has left moraines and terraces in the area, with undulating hills lying close to the outlet of the lake. Poorly developed moraines form small hills downstream of Laguna del Maule and around the lake, rising about 10–20 m (33–66 ft) above the lake level. Other Holocene climate changes are recorded in sediments of Laguna del Maule, including a humid period in the 15th to 19th centuries corresponding to the Little Ice Age, and drought during the early and middle Holocene. Since the 2000s–2010s, a long drought has caused a decline in the level and surface area of Laguna del Maule; the lake shrank by almost 10 per cent between 1984 and 2020.
The landscape around Laguna del Maule is mostly desert-like and without trees. Vegetation around Laguna del Maule is principally formed by bunchgrass, cushion plants and sub-shrubs; at higher altitudes vegetation is more scattered. Richer vegetation is found on valley floors and was historically used for grazing. The rocks around Laguna del Maule host a plant named Leucheria graui, which has not been found elsewhere. The unfavourable terrain and climate are responsible for the desert-like landscape.
## Eruptive history
Laguna del Maule has been active since 1.5 million years ago, with field-wide activity established by about 900,000 years ago. Its average volcanic output rate has been estimated at 200,000 m<sup>3</sup>/a (7,100,000 cu ft/a)—comparable to other volcanic arc systems. Eruptions occur about every 1,000 years, and individual eruptions are inferred to have lasted between 100 and more than 3,000 days. Eruptions include both caldera-forming events and eruptions that did not leave a caldera. Most Pleistocene centres are found west of the lake.
Three caldera-forming events have occurred during the system's lifespan. The first took place 1.5 million years ago and produced the dacitic Laguna Sin Puerto ignimbrite, which is exposed northwest of Laguna del Maule lake. The second and largest occurred between 990,000 and 950,000 years ago and produced the Bobadilla caldera and a rhyodacitic ignimbrite, also known as the Cajones de Bobadilla ignimbrite; about nine stratovolcanoes formed at Laguna del Maule between these two eruptions. This ignimbrite reaches a thickness of 500 m (1,600 ft) and borders Laguna del Maule lake in the north, extending about 13 km (8.1 mi) away from it. The Bobadilla caldera is centred beneath the northern shore of Laguna del Maule, and has dimensions of 12 km × 8 km (7 mi × 5 mi). The third event took place 336,000 years ago and produced the welded Cordon Constanza ignimbrite.
The 36 rhyodacitic lava domes and flows which surround the lake were erupted from about 24 individual vents. The eruptions began 25,000 years ago, after the onset of deglaciation, and continued until the last such eruption approximately 2,000 years ago. Two pulses of volcanism occurred at Laguna del Maule after deglaciation, the first 22,500–19,000 years ago and the second in the middle-late Holocene. An early, large Plinian eruption (unit rdm) produced the 20 km<sup>3</sup> (4.8 cu mi) rhyolite of Laguna del Maule from a vent presumably located below the northern part of the lake.
The Cerro Barrancas centre became active circa 14,500 ± 1,500 years before present and was the main site of volcanic activity between 14,500 and about 8,000 years ago. After that point activity shifted and the volume output increased; the subsequent units have a volume of 4.8 km<sup>3</sup> (1.2 cu mi). These two phases of volcanic activity occurred within 9,000 years of each other and the magmas involved may have been sourced from different magma reservoirs.
Undated postglacial units include the andesitic Crater Negro scoria cone and lava flow just west of Laguna del Maule, the andesitic Playa Oriental on the southeastern shore of Laguna del Maule, the rhyolitic Arroyo de Sepulveda at Laguna del Maule and the rhyodacitic Colada Dendriforme (unit rcd) in the western part of the field. The postglacial rhyolitic flare-up is unprecedented in the history of the volcanic field and is the largest such event in the southern Andes; on a global scale, only the Mono-Inyo Craters and Taupō rival it. It took place in two stages, the first shortly after deglaciation and the second during the Holocene, with the two stages featuring magmas of distinct composition. Post-glacial activity gave rise to more than 39 vents, which, compared to the pre-glacial volcanism, are concentrated around Laguna del Maule.
Three mafic volcanic vents named Arroyo Cabeceras de Troncoso, Crater 2657 and Hoyo Colorado are also considered postglacial. The first two are andesitic, while Hoyo Colorado is a pyroclastic cone. Mafic volcanism appears to have decreased after glacial times at Laguna del Maule, and the post-glacial volcanism has a mainly silicic composition. The magma chamber acts as a trap for mafic magma, preventing it from rising to the surface and thus explaining the scarcity of postglacial mafic volcanism. Only andesites and rhyodacites can bypass the rhyolites, and only in the western half of the field, away from the rhyolitic vents.
### Explosive eruptions and far-field effects
Explosive activity including ash and pumice has accompanied a number of the postglacial eruptions; the largest is associated with Los Espejos and has been dated to 23,000 years ago. The deposit of this Plinian eruption reaches 4 m (13 ft) of thickness at a distance of 40 km (25 mi). White ash and pumice form layered deposits east of the Loma de Los Espejos; another explosive eruption is linked to the Barrancas centre which emplaced block and ash flows 13 kilometres (8.1 mi) long. Other such explosive events have been dated at 7,000, 4,000 and 3,200 years ago by radiocarbon dating. About three Plinian eruptions and three smaller explosive eruptions have been identified at Laguna del Maule; most of them took place between 7,000 and 3,000 years ago. In total, there are about 30 known fall units. It has been estimated that the ash and pumice deposits have a volume comparable with that of the lava flows.
A tephra layer in the Argentine Caverna de las Brujas cave dated 7,780 ± 600 years ago has been tentatively linked to Laguna del Maule, and another with a thickness of 80 cm (31 in) that is 65 km (40 mi) away from Laguna del Maule is dated 765 ± 200 years ago and appears to coincide with a time with no archaeological findings in the high cordillera. Other tephras that possibly were erupted at Laguna del Maule have been found in Argentinian archaeological sites, one 7,195 ± 200 years ago at El Manzano and another 2,580 ± 250 to 3,060 ± 300 years old at Cañada de Cachi. The El Manzano tephra reaches a thickness of 3 m (9.8 ft) about 60 km (37 mi) away from Laguna del Maule and would have had a severe impact on Holocene human communities south of Mendoza. However, there is no evidence for long-term depopulation of affected regions after eruptions.
### Most recent activity and geothermal system
The most recent eruptions, all rhyolitic lava flows, have been dated to 2,500 ± 700, 1,400 ± 600 and 800 ± 600 years ago, with the last forming the Las Nieblas flow. No eruptions have occurred during historical time, but petroglyphs in Valle Hermoso may depict volcanic activity at Laguna del Maule.
Laguna del Maule is geothermally active, featuring bubbling pools, fumaroles and hot springs. Temperatures in these systems range from 93 to 120 °C (199 to 248 °F). There is no degassing at the surface, but emission of gas bubbles has been observed in Laguna del Maule lake and in a creek southwest of the lake. In the Troncoso valley, CO<sub>2</sub> emissions have killed small animals. Hot springs occur mainly north and northeast of Laguna del Maule. The Baños del Maule hot springs are now submerged below the lake. The Baños Campanario hydrothermal springs lie northwest of Laguna del Maule; their waters, together with those from the Termas del Medano springs, appear to form through a mixing of magmatic and precipitation water. The field has been evaluated as a potential source of geothermal energy. It and the neighbouring Tatara-San Pedro volcano form the so-called Mariposa geothermal system, discovered in 2009, whose temperature has been estimated on the basis of gas chemistry to be 200–290 °C (392–554 °F) and which features fumaroles. One estimate puts the potential productivity of Laguna del Maule as an energy source at 50–200 MW (67,000–268,000 hp).
## Possible future eruptions
The Laguna del Maule volcanic system is undergoing strong deformation; uplift between 2004 and 2007 attracted the attention of the public and the global scientific community after it was detected by radar interferometry. Between January 2006 and January 2007 uplift of 18 cm/year (7.1 in/year) was measured, and uplift during 2012 was about 28 cm (11 in). Between 2007 and 2011 the uplift reached close to 1 m (3 ft 3 in). A change in the deformation pattern occurred in 2013, related to an earthquake swarm that January, with deformation slowing through to mid-2014 and then increasing again between 2016 and at least 2020. Measurements in 2016 indicated that the uplift rate was 25 cm/year (9.8 in/year); uplift has continued into 2019 and the total deformation has reached 1.8 m (5 ft 11 in) to 2.5 m (8 ft 2 in). This uplift is one of the largest recorded at any volcano that is not actively erupting; the strongest uplift worldwide was recorded between 1982 and 1984 at Campi Flegrei in Italy, with a total change of 1.8 m (5 ft 11 in). Other actively deforming dormant volcanoes in the world are Lazufre in Chile, Santorini in Greece from 2011 to 2012, and Yellowstone Caldera in the United States, which deforms at a rate one-seventh that of Laguna del Maule. Another South American volcano, Uturunku in Bolivia, has been inflating at a pace one-tenth that of Laguna del Maule's. There is evidence that earlier deformations occurred at Laguna del Maule, with the lake shores having risen by about 67 m (220 ft) during the Holocene, possibly as a consequence of about 20 km<sup>3</sup> (4.8 cu mi) of magma entering the magmatic system and accumulating in the area of the Barrancas vents.
The present-day uplift is centred beneath the western segment of the ring of post-glacial lava domes, more specifically beneath the southwest sector of the lake. The source of the deformation has been traced to the inflation of a sill 5.2 km (3.2 mi) deep beneath the volcanic field, with dimensions of 9.0 km × 5.3 km (5.6 mi × 3.3 mi). This sill has been inflating at an average pace of 31,000,000 ± 1,000,000 m<sup>3</sup>/a (1.095×10<sup>9</sup> ± 35,000,000 cu ft/a) between 2007 and 2010. The rate of volume change increased between 2011 and 2012. As of July 2016, 2,000,000 m<sup>3</sup>/a (71,000,000 cu ft/a) of magma are estimated to enter the magma chamber. The average recharge rate required to explain the inflation is about 0.05 km<sup>3</sup>/a (0.012 cu mi/a). This volume change is approximately 10 to 100 times as large as the field's long-term magma supply rate. Gravimetric analysis has indicated that between April 2013 and January 2014, approximately 0.044 km<sup>3</sup> (0.011 cu mi) of magma intruded beneath the field.

The presence of a sill is also supported by magnetotelluric measurements indicating conductivity anomalies at depths of 4–5 km (2.5–3.1 mi) beneath the western side of the volcanic field and at 8–9 km (5.0–5.6 mi) depth beneath its northern part. They show the existence of rhyolitic melt, but they do not show a magmatic system associated with the southeastern vents, leaving their magma supply route uncertain. The existence of a Bouguer gravity anomaly also indicates the presence of a low-density body 2–5 km (1.2–3.1 mi) beneath the volcano, and several low-density bodies below the lake, the eastern vents and the Barrancas centre. The latter may be a trace of magma left behind by the Holocene eruptions there. Seismic tomography has found a 450 km<sup>3</sup> (110 cu mi) magma reservoir centred beneath the northwestern part of the lake, at 2–8 km (1.2–5.0 mi) depth. It may contain about 5% melt and has a heterogeneous structure with varying melt fractions in various parts of the reservoir. A reservoir of crystal-rich mush estimated to have a volume of 115 cubic kilometres (28 cu mi), with about 30 cubic kilometres (7.2 cu mi) of magma embedded within the mush, may have moved away from the old vents towards its present-day position. It is being resupplied by deeper, more crystal-poor magmas. In the deep crust, further magma systems may connect Laguna del Maule with Tatara-San Pedro volcano.
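The "10 to 100 times" comparison can be illustrated with the rates quoted above and in the magma genesis section; the following arithmetic is an illustrative cross-check of the published figures, not a calculation taken from the cited studies.

```latex
% Illustrative cross-check of the "10 to 100 times" comparison,
% using the supply and recharge rates quoted in this article
\frac{0.05\ \mathrm{km^{3}/a}}{0.0005\ \mathrm{km^{3}/a}} = 100,
\qquad
\frac{0.031\ \mathrm{km^{3}/a}}{0.0023\ \mathrm{km^{3}/a}} \approx 13
```

Here 0.05 km<sup>3</sup>/a is the recharge rate required to explain the inflation, 0.0005 km<sup>3</sup>/a the minimum long-term supply rate, 0.031 km<sup>3</sup>/a the measured sill inflation rate and 0.0023 km<sup>3</sup>/a the average supply rate over the past 20,000 years.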
### Seismicity
Strong seismic activity has accompanied the deformation at Laguna del Maule. Seismic swarms have been recorded above the depth of the deforming sill south of the ring of lava domes, particularly around Colada Las Nieblas. A magnitude 5.5 earthquake occurred south of the volcanic field in June 2012. A major volcano-tectonic earthquake swarm occurred in January 2013, possibly due to faults and underground liquids being pressurized by the intrusion of magma. Between 2011 and 2014, swarms of earthquakes occurred every two or three months and lasted from half an hour to three hours. Afterwards, activity decreased until 2017 and then increased again, with the most intense seismic episode taking place in June 2020. Most earthquake activity appears to be of volcano-tectonic origin, while fluid flow is less important; two intersecting lineaments on the southwest corner of the lake appear to be involved. According to some measurements, the 2010 Maule earthquake, 230 km (140 mi) west of Laguna del Maule, did not affect the volcanic field and the rate of uplift remained unchanged, although other measurements indicate a change in the uplift rate at that point. Although some shallow earthquakes have been interpreted as reflecting diking and faulting at the magma chamber, the pressure within the chamber appears to be insufficient to trigger a rupture all the way between the surface and the chamber, and thus no eruption has occurred yet.
### Potential mechanisms for the uplift
Several mechanisms have been proposed for the inflation, including the movement of magma underground, the injection of new magma, or the action of volcanic gases and volatiles which are released by the magma. Another proposal is that the inflation may be situated in a hydrothermal system; unless the Baños Campanario springs 15 km (9.3 mi) away are part of such a system, there is little evidence that one exists at Laguna del Maule. Carbon dioxide (CO<sub>2</sub>) anomalies, concentrated on the northern lakeshore, have been found around Laguna del Maule, in 2020 together with dead animals and discoloured soil; the anomalies are possibly triggered by the stress of the inflation activating old faults. These anomalies may indicate that the inflating magma is of mafic composition, as rhyolite only poorly dissolves CO<sub>2</sub>. Gravity change measurements also show an interaction between the magma source, faults and the hydrothermal system.
### Hazards and management
This uplift has been a cause of concern in light of the volcanic field's history of explosive activity, with 50 eruptions in the last 20,000 years; the current uplift may be the prelude to a large rhyolitic eruption. In particular, the scarce fumarolic activity implies that a large amount of gas is trapped within the magma reservoir, increasing the hazard of an explosive eruption. It is not clear whether such an eruption would fit the pattern set by Holocene eruptions or would be a larger event. The prospect of renewed volcanic activity at Laguna del Maule has caused concern among the authorities and inhabitants of the region. A major eruption would have a serious impact on Argentina and Chile, including the formation of lava domes, lava flows and pyroclastic flows near the lake, ash fall at larger distances and lahars. The international road across Paso Pehuenche and air traffic in the region could be endangered by renewed eruptions. A break-out flood from Laguna Fea may endanger communities downstream.
Laguna del Maule is considered to be one of the most dangerous volcanoes of the Southern Andean volcanic belt, and is Argentina's third most dangerous volcano. In March 2013, the Southern Andean Volcano Observatory declared a "yellow alert" for the volcano in light of the deformation and earthquake activity, withdrew it in 2021 and reinstated it in 2023; the alert was supplemented afterwards with an "early" warning (withdrawn in January 2017). The Argentine Servicio Geológico Minero and the Chilean National Geology and Mining Service monitor the volcano with a network of stations, and a bi-national volcanic hazard map has been published. |
# Burnley F.C. in European football
Burnley Football Club is an English professional association football club founded in 1882. Burnley first played against foreign opposition—Scottish club Cowlairs—in 1885, and embarked on their first overseas tour in 1914, playing sides from the German Empire and Austria-Hungary. Further trips to foreign countries followed in the next decades. In 1955, UEFA launched the first officially sanctioned European club competition, the European Cup. Burnley won their second First Division title in 1959–60, qualifying for the 1960–61 European Cup. They eliminated French champions Stade de Reims in the first round before being sent out of the contest by West German champions Hamburger SV in the quarter-final. Burnley's next campaign in a European club competition came six years later, in the 1966–67 Inter-Cities Fairs Cup, where they were again eliminated by a West German side (Eintracht Frankfurt) in the quarter-final. In 2018, Burnley qualified for the 2018–19 UEFA Europa League, reaching the play-off round.
The side also competed in minor international football tournaments in the 1970s and early 1980s. Burnley participated in two editions of the Texaco Cup, a competition involving sides from England, Scotland, Northern Ireland and the Republic of Ireland that had not qualified for UEFA-sanctioned European competitions or the Inter-Cities Fairs Cup. They reached the 1974 final but lost against Newcastle United after extra time. Burnley later competed in the Anglo-Scottish Cup—the Texaco Cup's successor—on five occasions and won the tournament in 1978–79, after they defeated Oldham Athletic 4–2 on aggregate in the final.
## History
### Foreign opponents and overseas tours
Burnley were founded in May 1882, and initially played their matches against local clubs. In January 1885, Burnley's committee invited Scottish clubs Cowlairs, Kilmarnock, and Glasgow Northern to play friendlies at Burnley's home ground, Turf Moor. Cowlairs were Burnley's first foreign opponents; the match ended in a 2–2 draw. Burnley subsequently lost 3–2 to Kilmarnock, but defeated Northern 4–0. Several weeks after winning the 1914 FA Cup, the club embarked on its first tour to continental Europe, playing sides from the German Empire and Austria-Hungary. Burnley won their first match against foreign opposition on foreign soil, beating Viktoria Berlin 2–1. Scottish Cup winners Celtic also made a trip to the continent; Hungarian club Ferencváros put up a vase—the Budapest Cup—for a charity match between Burnley and Celtic in Budapest. The game ended in a draw, with a replay held at Turf Moor several months later, which Celtic won 2–1. Burnley embarked on a tour to Italy during the off-season in 1922—which included a 1–0 victory against Football League champions Liverpool in Milan—and to Germany and the Netherlands in 1927, where they won five of six matches and scored thirty goals.
During the late 1940s and the 1950s, the club embarked on several overseas tours. During their trip to Spain in 1949, Burnley defeated Barcelona 1–0 at Barça's Camp de Les Corts. Burnley remained unbeaten during their stay in Turkey in 1951, defeating Fenerbahçe 3–2 and drawing with Beşiktaş and Galatasaray. In 1954, Burnley travelled to the African island nations of Madagascar and Mauritius. They won all seven matches—including a 14–1 victory against Madagascan side Tananarive—scoring 48 goals. Fifty years later, Mauritian newspaper L'Express described Burnley's 1954 tour as "innovative", as the Mauritian footballers were introduced to new footballing techniques.
In 1955, UEFA launched the first officially sanctioned European club competition, the European Cup—a tournament contested between several national champions and other European sides. Burnley won their second First Division title in 1959–60 under the management of Harry Potts. The club's squad consisted mostly of players who came through the Burnley youth academy; a transfer fee was paid for only two players—for Jimmy McIlroy in 1950 and for Alex Elder in 1959. After the 1959–60 season ended, the team travelled to the United States to represent England in the International Soccer League, the first modern international American soccer tournament. Burnley defeated Bayern Munich (West Germany), Glenavon (Northern Ireland) and Nice (France) but finished runners-up in the group stage behind Kilmarnock.
### 1960–61 European Cup
As a result of their First Division title, Burnley played the following season in European competition for the first time, in the 1960–61 European Cup. They were the third English club in the European Cup, preceded by Manchester United and Wolverhampton Wanderers. Burnley received a bye in the preliminary round and were drawn against French club Stade de Reims in the first round. Reims were the 1959–60 French Division 1 champions, and were European Cup runners-up in 1956 and 1959. The first leg was played at Turf Moor, with Burnley winning 2–0: Jimmy Robson scored in the first minute and McIlroy netted a second in the 22nd minute. The return leg, played two weeks later at Parc des Princes in Paris, ended in a 3–2 loss, although Robson had put Burnley 1–0 ahead. During the game, Potts, exasperated by Reims' repeated attempts to steal a few yards at a free-kick, ran onto the pitch to put the ball back in its correct place, after which he was taken off the field by the local police. Despite the loss and crowd disturbances, Burnley won 4–3 on aggregate and progressed to the quarter-final, in which the club faced West German champions Hamburger SV. At Turf Moor, in front of around 46,000 spectators, Brian Pilkington scored twice to put Burnley 2–0 up, with Robson adding a third, before Hamburg pulled one back in the last minutes of the game. The second leg was played two months later at the Volksparkstadion and was broadcast live on the BBC. Uwe Seeler scored twice in a 4–1 win for Hamburg; McIlroy hit the post in the last minute and Burnley were eliminated from the competition.
### 1966–67 Inter-Cities Fairs Cup
The maximum wage in the Football League was abolished in 1961, which damaged Burnley's fortunes, as clubs from small towns like Burnley could no longer compete financially with sides from bigger towns and cities. The side ventured back into international football competition, however, with qualification for the 1966–67 Inter-Cities Fairs Cup due to a third-place finish in the 1965–66 First Division. The Inter-Cities Fairs Cup was another European competition which started in 1955. It was organised by the Fairs Cup committee, which was backed by several FIFA executive committee members; because the Inter-Cities Fairs Cup was not under the auspices of UEFA, UEFA does not consider teams' records in the Fairs Cup to be part of their European record, although FIFA does view the competition as a major honour.
The first round draw paired Burnley with another West German team: VfB Stuttgart. The first leg was played at Stuttgart's Neckarstadion and ended in a 1–1 draw; Burnley's Willie Irvine scored the first goal, but the team ended the match with 10 players after Brian O'Neil was sent off at the end of the game. Burnley won the return leg 2–0 and progressed to the second round to face Swiss side Lausanne Sports. They defeated Lausanne 8–1 on aggregate; Burnley won 3–1 away and 5–0 at home, with Andy Lochhead scoring a hat-trick in the latter match. The club was paired with Italian club Napoli in the following round. The first leg, at Turf Moor, ended in a 3–0 Burnley victory with goals from Ralph Coates, Les Latcham, and Lochhead, who scored his sixth goal in the competition. Napoli ended the game with 10 men after defender Dino Panzanato was sent off for kicking Lochhead in the head. The Italian press previewed the return leg in a belligerent manner: "From Lancashire where studs are made out of rose petals ... to Naples where visiting players are put through a mincing machine at the end of the game and their remains are roasted on a spit". A crowd of 60,000 saw Burnley goalkeeper Harry Thomson make 13 saves, including a penalty kick from José Altafini, as the match ended in a goalless draw. The team coach was escorted to the local airport by a protective convoy to escape the Napoli fans. The Daily Express later hailed Thomson as a "God in a green jersey", while the Burnley Star highlighted the "barbaric conduct shown by the defeated Naples team and their lunatic spectators". The quarter-final draw paired Burnley with Eintracht Frankfurt; the first leg, held in Frankfurt, ended in a 1–1 draw, with Brian Miller netting for Burnley. In the return match, Eintracht took a 2–0 lead; Miller halved the deficit, but the team could not find more goals and were again eliminated by a West German side.
### 2018–19 UEFA Europa League
Burnley had to wait more than 50 years for their third appearance in a major European football competition. During that period, the club played in all four professional divisions and only avoided relegation to the non-League fifth-tier Football Conference on the last matchday in 1986–87. The team finished in seventh position in the 2017–18 Premier League, which ensured qualification for the 2018–19 UEFA Europa League second qualifying round.
Burnley were drawn against Scottish side Aberdeen, setting up an all-British tie. The first leg at Aberdeen's Pittodrie Stadium ended in a 1–1 draw, after Sam Vokes scored the equaliser for Burnley. The second leg also finished 1–1 after 90 minutes; the game went into extra time, with goals from Jack Cork and Ashley Barnes ensuring a 3–1 win for Burnley. They were paired with Turkish club İstanbul Başakşehir in the third qualifying round. Both games ended in goalless draws after 90 minutes; Cork scored the only goal in extra time in the second leg, setting up a tie with Greek club Olympiacos in the play-off round, the last phase before the group stage. Five Burnley supporters were injured in incidents of violence before the first leg started in Piraeus. Burnley lost 3–1, and ended the match with 10 men after defender Ben Gibson was sent off. Olympiacos owner Evangelos Marinakis had reportedly entered the referee's room at half-time to vent his frustration at the arbiter's performance; Burnley manager Sean Dyche later accused Olympiacos' staff of intimidating the officials. In the return leg, Burnley missed multiple chances to score; the game finished 1–1 with Matěj Vydra scoring on his Burnley debut. The team lost 4–2 on aggregate and went out of the competition.
## Record
### By season
### By competition
### By location
## Texaco Cup and Anglo-Scottish Cup
The Texaco Cup was a competition launched in 1970, involving sides from England, Scotland, Northern Ireland and the Republic of Ireland that had not qualified for UEFA-sanctioned European competitions or the Inter-Cities Fairs Cup. Burnley participated in the inaugural 1970–71 season where they were eliminated in the first round by Scottish side Heart of Midlothian; Burnley won the first leg 3–1 but lost 4–1 in the return match. The club's only other participation in the tournament was in the 1973–74 edition. In the first round, the club was paired with Scottish team East Fife. Burnley won the first match 7–0—a record victory in the competition—and the return leg 3–2 after having been 2–0 behind. The team defeated Heart of Midlothian 8–0 on aggregate in the following round to set up a semi-final with Norwich City. After recording a 2–0 victory in the first leg, Burnley went 2–0 behind in the second match, only to score three times in the last six minutes of the game to progress to the final. They faced Newcastle United, with the final played as a single match at Newcastle's St James' Park. Paul Fletcher scored halfway through the first half to put Burnley in front; Newcastle soon equalised, and the game went to extra time, where the hosts scored again to win 2–1.
In 1975, the Texaco Cup was replaced with the Anglo-Scottish Cup; only English and Scottish clubs participated in the tournament. Burnley competed in the Anglo-Scottish Cup on five occasions between 1976 and 1981. They were eliminated four times in the group stage and progressed to the knockout stage only once, in 1978–79. In that season, the team defeated Preston North End (3–2) and Blackpool (3–1), and drew with Blackburn Rovers (1–1), who also beat Preston and Blackpool; as Burnley twice scored three goals in a match, they received two bonus points while Blackburn received none. Burnley topped the group and progressed to the quarter-final where they faced Celtic. The Scots had started their season with eight consecutive victories, including a 3–1 win in the Old Firm match, before travelling to Turf Moor for the first leg. Steve Kindon scored the game's only goal to give Burnley the victory in front of around 30,000 spectators. The match was marred by crowd violence; Celtic fans hurled bottles, stones and iron railings at police and Burnley fans, who fled onto the pitch, causing 60 injuries. Burnley also defeated Celtic in the away game—a 2–1 victory, the scorers being Ian Brennan and Kindon—to win 3–1 on aggregate and progress to the semi-final to play Mansfield Town. After Burnley won 2–1 at Mansfield, described by the Burnley Express as "one of the greatest acts of soccer robbery", they lost 1–0 at home after extra time. As it finished 2–2 on aggregate, a penalty shoot-out—a first at Turf Moor—was required to determine the winner, which Burnley won 8–7. They faced Oldham Athletic in the final, with the first leg taking place at Oldham's Boundary Park on 5 December 1978. On an icy pitch, Kindon scored in the first minute, with Peter Noble adding a second goal two minutes later. Halfway through the second half, Jim Thomson and Kindon both scored to put Burnley 4–0 up. Oldham netted a consolation goal in the last minutes of the game, and Burnley won 4–1. The return leg, a week later in Burnley, ended in a 1–0 victory for Oldham. Burnley won 4–2 on aggregate to lift the trophy for the first and only time.
In 1981, the Scottish clubs withdrew from the competition as the attendances were low and the English teams were increasingly drawn from the lower leagues. The tournament continued with English entrants only as the Football League Group Cup, which was replaced by the Associate Members' Cup in 1983.
### By season
### By competition
### By location
# Quarter sovereign
The quarter sovereign is a British gold bullion and collector's coin, issued by the Royal Mint since 2009. The smallest in the sovereign range, it has a face value of 25 pence.
In 1853, the Royal Mint produced two patterns for a quarter sovereign for circulation, with one denominated as five shillings. These coins never went into production, due to concerns about their small size and the likely wear in circulation. Gold passed from circulation in the aftermath of the First World War.
Beginning in 1979, the Royal Mint began to sell sovereigns to those wishing to own gold coins, by the following year selling four different denominations, ranging from the half sovereign to the five-pound gold coin. In 2009, a quarter sovereign was introduced as an extension of this range. The quarter sovereign shares the same design as the larger coins, depicting on the obverse the reigning monarch, Elizabeth II or, since 2022, Charles III. Although there are some one-year designs, the one most often used on the reverse of these issues is Benedetto Pistrucci's depiction of Saint George and the Dragon, which was first used on the sovereign in 1817.
## Victorian pattern coin
Pattern gold coins valued at a quarter sovereign (five shillings) were struck in 1853, as the Royal Mint reconsidered which denominations were to be struck in gold and which in silver. At this time, the mint could not process both gold and silver simultaneously, and such a coin was seen as an alternative to the larger silver coins, such as the florin and half crown. In 1853, there was heavy demand for coins of both metals, but the Royal Mint gave priority to the more valuable gold coinage, such as the sovereign (the gold coin valued at one pound sterling). On 7 March 1853, the Chancellor of the Exchequer, William Gladstone, explained to the House of Commons that the demand for gold was so heavy that there was no opportunity for the Royal Mint to coin silver.
During the 18 April 1853 sitting of the House of Commons, either Edward Divett (per Hansard) or the former chancellor, Benjamin Disraeli (per a press account), asked the Financial Secretary to the Treasury, James Wilson, if consideration had been given to striking a quarter sovereign. Wilson replied that the government had directed that a die be prepared as an experiment. He noted that one difficulty with such a coin was that it would take four times as long to coin the same gold as with a sovereign. Wilson stated that such a coin would be of very small size, about the size of an American one-dollar gold coin, which he held in his hand as he made his statement.
By the end of the month, the Master of the Mint, Sir John Herschel, was able to present to the Treasury two quarter-sovereign pattern coins. Both of the patterns depicted, on the obverse, Queen Victoria as shown on the early issues of sovereigns of her reign, with her name and abbreviated titles. The reverse of one of the patterns was inscribed FIVE SHILLINGS 1853 with a crown above the inscription and a rose, thistle and shamrock below it. The other had the inscription QUARTER SOVEREIGN 1853, with, between the 8 and the 5, a crowned square shield of arms.
The report that Herschel presented with the pattern coins demonstrated that it would be very expensive to strike them for commerce, as they would have to meet exacting standards, and would wear quickly in circulation. Herschel explained that such a coin would circulate faster than larger gold coins, leading to increased wear, and that the very small size would make the coin easy to lose. The light weight would make it difficult to detect counterfeits. He estimated that gold would be lost to the public (either through misplacing the coins or through abrasion) at fifteen times the rate for the same value of sovereigns.
At the time, a parliamentary select committee was considering decimal coinage, and both Herschel and Thomson Hankey, former governor of the Bank of England, gave evidence before it. William Miller of the Bank of England also testified. All opposed the quarter sovereigns due to the expense of striking and maintaining them, and the committee did not recommend the quarter sovereign. No further action was taken on the quarter sovereign proposal; numismatic writer G. P. Dyer suggested that Herschel would not have spoken so negatively about the quarter sovereign to the select committee unless he knew the proposal was doomed. A quarter sovereign was proposed again by the new Master of the Mint, Thomas Graham, in 1859, but was turned down by Gladstone.
Pieces purporting to be quarter sovereigns dated 1911 or 1922 are not genuine, but are modern inventions.
## 21st-century bullion and collector's coin
### Background and authorisation
Striking of sovereigns for circulation had come to an end by 1932, with most issues after the start in 1914 of the First World War coined at the branches of the Royal Mint in Australia and South Africa, where economic conditions were different than in Britain. In 1979, the Royal Mint struck sovereigns for sale to collectors. The following year, it coined, also for collector sale, the sovereign as well as the half sovereign, double sovereign, and five-pound piece. These four denominations continued to be issued in most years, sold at a premium to their gold value. In 1987, the Royal Mint started to issue the Britannia gold issues, bullion pieces issued in competition with those of other nations. Despite the issue of the Britannia pieces, the sovereign range continued to be struck, as the Royal Mint found that the sovereign's history meant that some preferred it to the Britannia; others admired the sovereign's design.
In 2009, the Royal Mint added quarter sovereigns to the range. Since 21st-century gold coins do not go into circulation, the objections of the 1850s did not apply. The quarter sovereign's specifications and design were stated in a proclamation by the monarch, Elizabeth II (r. 1952–2022), dated 10 December 2008 and effective the following day.
### Designs
The quarter sovereign has not been given its own design, but uses a smaller version of those given to the other coins in the sovereign range, with the obverse depicting the reigning monarch. Initially, an obverse designed by Ian Rank-Broadley was used. Beginning with some 2015 issues, an obverse portrait of Elizabeth by Jody Clark was used, though in 2016, some coins bore a different portrait of the queen by James Butler. In most years, the sovereign-range coins have featured, on the reverse, Benedetto Pistrucci's depiction of Saint George and the Dragon that first appeared on the sovereign in 1817. Other reverse designs used include another interpretation of the George and Dragon, by Paul Day for Elizabeth II's Diamond Jubilee in 2012. In 2017, coins with the original, 1817 sovereign reverse design were struck, for its bicentennial.
In 2022, the Royal Mint struck quarter sovereigns with a reverse design by Noad showing an interpretation of the Royal Arms, marking the Platinum Jubilee of Elizabeth II. Later in the year, following the death of Elizabeth II, the Royal Mint issued memorial coins in the sovereign range, including the quarter sovereign, featuring on the obverse, the first coinage portrait of Elizabeth's successor, Charles III (r. 2022), by Martin Jennings. The reverse displayed an interpretation of the Royal Arms by Clark. In 2023, a quarter sovereign commemorating the coronation of Charles III was struck, with the obverse a crowned portrait of the king by Jennings and the reverse the Pistrucci George and Dragon. For 2024, Jennings' uncrowned portrait of Charles was paired with Pistrucci's reverse on each of the five sovereign denominations struck in proof condition, from the quarter sovereign to the five-pound piece. For 2025, Pistrucci's reverse was used on some coins, with others featuring Jean Baptiste Merlen's Royal Arms reverse, first used on the sovereign in 1825, for its 200th anniversary.
### Issuance
From 2009 to 2012 the quarter sovereign was sold as a bullion piece, with authorised mintages of between 50,000 and 250,000, though the actual numbers sold are unreported. Such bullion issues are sold based on the price of gold, carrying a smaller premium than proof coins, with a sufficient premium charged to allow for the costs of manufacture and sale. Both varieties of the 2022 quarter sovereign were sold by the Royal Mint as bullion pieces, as well as in proof condition; the same is true of the 2023 coronation issue, as well as that for 2024.
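The bullion pricing described here can be illustrated with a rough calculation; the 1.997 g gross weight and 22-carat fineness used below are the standard specifications of the sovereign range rather than figures stated in this section, and the gold price and 10% premium are hypothetical values chosen only for illustration.

```latex
% Hypothetical bullion-pricing sketch for a quarter sovereign.
% Assumptions: 1.997 g gross weight, 22-carat (11/12) fineness, a gold price
% of 1,500 GBP per troy ounce and a 10% bullion premium (both price figures invented).
\begin{align*}
\text{fine gold} &= 1.997\ \mathrm{g}\times\tfrac{11}{12} \approx 1.83\ \mathrm{g} \approx 0.0589\ \text{troy oz}\\
\text{intrinsic value} &\approx 0.0589 \times \pounds 1500 \approx \pounds 88\\
\text{sale price} &\approx \pounds 88 \times 1.10 \approx \pounds 97
\end{align*}
```

The premium in such a calculation covers the costs of manufacture and sale, and is smaller than the premium charged for proof coins.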
The quarter sovereign has been sold as a collector's coin, usually in proof condition, each year since 2009, though only in 2009 (13,495 pieces struck) did the reported mintage reach 10,000. In addition to being sold individually, these collector's coins have been available for purchase as part of a proof set of between three and five of the sovereign denominations. In 2023, the Royal Mint sold the Coronation quarter sovereign in proof condition for £235.00, and in uncirculated condition for £132.39 or £134.97, depending on what packaging is used.
## See also
- Twenty-five pence coin
- Crown |
# AdS/CFT correspondence
In theoretical physics, the anti-de Sitter/conformal field theory correspondence (frequently abbreviated as AdS/CFT) is a conjectured relationship between two kinds of physical theories. On one side are anti-de Sitter spaces (AdS) that are used in theories of quantum gravity, formulated in terms of string theory or M-theory. On the other side of the correspondence are conformal field theories (CFT) that are quantum field theories, including theories similar to the Yang–Mills theories that describe elementary particles.
The duality represents a major advance in the understanding of string theory and quantum gravity. This is because it provides a non-perturbative formulation of string theory with certain boundary conditions and because it is the most successful realization of the holographic principle, an idea in quantum gravity originally proposed by Gerard 't Hooft and promoted by Leonard Susskind.
It also provides a powerful toolkit for studying strongly coupled quantum field theories. Much of the usefulness of the duality results from the fact that it is a strong–weak duality: when the fields of the quantum field theory are strongly interacting, the ones in the gravitational theory are weakly interacting and thus more mathematically tractable. This fact has been used to study many aspects of nuclear and condensed matter physics by translating problems in those subjects into more mathematically tractable problems in string theory.
The AdS/CFT correspondence was first proposed by Juan Maldacena in late 1997. Important aspects of the correspondence were soon elaborated on in two articles, one by Steven Gubser, Igor Klebanov and Alexander Polyakov, and another by Edward Witten. By 2015, Maldacena's article had over 10,000 citations, becoming the most highly cited article in the field of high energy physics.
One of the most prominent examples of the AdS/CFT correspondence has been the AdS<sub>5</sub>/CFT<sub>4</sub> correspondence: a relation between N = 4 supersymmetric Yang–Mills theory in 3+1 dimensions and type IIB superstring theory on AdS<sub>5</sub> × S<sup>5</sup>.
## Background
### Quantum gravity and strings
Current understanding of gravity is based on Albert Einstein's general theory of relativity. Formulated in 1915, general relativity explains gravity in terms of the geometry of space and time, or spacetime. It is formulated in the language of classical physics that was developed by physicists such as Isaac Newton and James Clerk Maxwell. The other nongravitational forces are explained in the framework of quantum mechanics. Developed in the first half of the twentieth century by a number of different physicists, quantum mechanics provides a radically different way of describing physical phenomena based on probability.
Quantum gravity is the branch of physics that seeks to describe gravity using the principles of quantum mechanics. Currently, a popular approach to quantum gravity is string theory, which models elementary particles not as zero-dimensional points but as one-dimensional objects called strings. In the AdS/CFT correspondence, one typically considers theories of quantum gravity derived from string theory or its modern extension, M-theory.
In everyday life, there are three familiar dimensions of space (up/down, left/right, and forward/backward), and there is one dimension of time. Thus, in the language of modern physics, one says that spacetime is four-dimensional. One peculiar feature of string theory and M-theory is that these theories require extra dimensions of spacetime for their mathematical consistency: in string theory spacetime is ten-dimensional, while in M-theory it is eleven-dimensional. The quantum gravity theories appearing in the AdS/CFT correspondence are typically obtained from string and M-theory by a process known as compactification. This produces a theory in which spacetime has effectively a lower number of dimensions and the extra dimensions are "curled up" into circles.
A standard analogy for compactification is to consider a multidimensional object such as a garden hose. If the hose is viewed from a sufficient distance, it appears to have only one dimension, its length, but as one approaches the hose, one discovers that it contains a second dimension, its circumference. Thus, an ant crawling inside it would move in two dimensions.
### Quantum field theory
The application of quantum mechanics to physical objects such as the electromagnetic field, which are extended in space and time, is known as quantum field theory. In particle physics, quantum field theories form the basis for our understanding of elementary particles, which are modeled as excitations in the fundamental fields. Quantum field theories are also used throughout condensed matter physics to model particle-like objects called quasiparticles.
In the AdS/CFT correspondence, one considers, in addition to a theory of quantum gravity, a certain kind of quantum field theory called a conformal field theory. This is a particularly symmetric and mathematically well behaved type of quantum field theory. Such theories are often studied in the context of string theory, where they are associated with the surface swept out by a string propagating through spacetime, and in statistical mechanics, where they model systems at a thermodynamic critical point.
## Overview of the correspondence
### Geometry of anti-de Sitter space
In the AdS/CFT correspondence, one considers string theory or M-theory on an anti-de Sitter background. This means that the geometry of spacetime is described in terms of a certain vacuum solution of Einstein's equation called anti-de Sitter space.
In very elementary terms, anti-de Sitter space is a mathematical model of spacetime in which the notion of distance between points (the metric) is different from the notion of distance in ordinary Euclidean geometry. It is closely related to hyperbolic space, which can be viewed as a disk, often illustrated by a tessellation of the disk by triangles and squares. One can define the distance between points of this disk in such a way that all the triangles and squares are the same size and the circular outer boundary is infinitely far from any point in the interior.
Now imagine a stack of hyperbolic disks where each disk represents the state of the universe at a given time. The resulting geometric object is three-dimensional anti-de Sitter space. It looks like a solid cylinder in which any cross section is a copy of the hyperbolic disk. Time runs along the vertical direction in this picture. The surface of this cylinder plays an important role in the AdS/CFT correspondence. As with the hyperbolic plane, anti-de Sitter space is curved in such a way that any point in the interior is actually infinitely far from this boundary surface.
This construction describes a hypothetical universe with only two space and one time dimension, but it can be generalized to any number of dimensions. Indeed, hyperbolic space can have more than two dimensions and one can "stack up" copies of hyperbolic space to get higher-dimensional models of anti-de Sitter space.
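For readers who want the geometry in a formula, three-dimensional anti-de Sitter space is commonly written in so-called global coordinates; the curvature radius L and the coordinate names used below are conventional choices for illustration rather than anything fixed by the description above.

```latex
% Metric of three-dimensional anti-de Sitter space in global coordinates:
% t is time, phi is the angle around the cylinder, rho >= 0 is a radial coordinate,
% and the boundary cylinder sits at rho -> infinity
ds^{2} = L^{2}\left(-\cosh^{2}\!\rho\,dt^{2} + d\rho^{2} + \sinh^{2}\!\rho\,d\phi^{2}\right)
```

Because the proper distance in the radial direction grows without bound as ρ increases, every interior point is infinitely far from the boundary surface, which is the precise form of the informal statement above.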
### Idea of AdS/CFT
An important feature of anti-de Sitter space is its boundary (which looks like a cylinder in the case of three-dimensional anti-de Sitter space). One property of this boundary is that, locally around any point, it looks just like Minkowski space, the model of spacetime used in nongravitational physics.
One can therefore consider an auxiliary theory in which "spacetime" is given by the boundary of anti-de Sitter space. This observation is the starting point for the AdS/CFT correspondence, which states that the boundary of anti-de Sitter space can be regarded as the "spacetime" for a conformal field theory. The claim is that this conformal field theory is equivalent to the gravitational theory on the bulk anti-de Sitter space in the sense that there is a "dictionary" for translating calculations in one theory into calculations in the other. Every entity in one theory has a counterpart in the other theory. For example, a single particle in the gravitational theory might correspond to some collection of particles in the boundary theory. In addition, the predictions in the two theories are quantitatively identical so that if two particles have a 40 percent chance of colliding in the gravitational theory, then the corresponding collections in the boundary theory would also have a 40 percent chance of colliding.
Notice that the boundary of anti-de Sitter space has fewer dimensions than anti-de Sitter space itself. For instance, in the three-dimensional example illustrated above, the boundary is a two-dimensional surface. The AdS/CFT correspondence is often described as a "holographic duality" because this relationship between the two theories is similar to the relationship between a three-dimensional object and its image as a hologram. Although a hologram is two-dimensional, it encodes information about all three dimensions of the object it represents. In the same way, theories that are related by the AdS/CFT correspondence are conjectured to be exactly equivalent, despite living in different numbers of dimensions. The conformal field theory is like a hologram that captures information about the higher-dimensional quantum gravity theory.
### Examples of the correspondence
Following Maldacena's insight in 1997, theorists have discovered many different realizations of the AdS/CFT correspondence. These relate various conformal field theories to compactifications of string theory and M-theory in various numbers of dimensions. The theories involved are generally not viable models of the real world, but they have certain features, such as their particle content or high degree of symmetry, which make them useful for solving problems in quantum field theory and quantum gravity.
The most famous example of the AdS/CFT correspondence states that type IIB string theory on the product space AdS<sub>5</sub> × S<sup>5</sup> is equivalent to N = 4 supersymmetric Yang–Mills theory on the four-dimensional boundary. In this example, the spacetime on which the gravitational theory lives is effectively five-dimensional (hence the notation AdS<sub>5</sub>), and there are five additional compact dimensions (encoded by the S<sup>5</sup> factor). In the real world, spacetime is four-dimensional, at least macroscopically, so this version of the correspondence does not provide a realistic model of gravity. Likewise, the dual theory is not a viable model of any real-world system as it assumes a large amount of supersymmetry. Nevertheless, as explained below, this boundary theory shares some features in common with quantum chromodynamics, the fundamental theory of the strong force. It describes particles similar to the gluons of quantum chromodynamics together with certain fermions. As a result, it has found applications in nuclear physics, particularly in the study of the quark–gluon plasma.
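The strong–weak character of this example can be made quantitative through the standard parameter identifications (numerical conventions vary between references); here g<sub>YM</sub> is the Yang–Mills coupling, N the number of colours, g<sub>s</sub> the string coupling, √α′ the string length scale and L the common radius of AdS<sub>5</sub> and S<sup>5</sup>.

```latex
% Commonly quoted parameter map for the AdS5 x S5 example
% (numerical factors depend on conventions)
g_{\mathrm{YM}}^{2} = 4\pi g_{s},
\qquad
\frac{L^{4}}{\alpha'^{2}} = g_{\mathrm{YM}}^{2} N \equiv \lambda
```

A large 't Hooft coupling λ therefore corresponds to a radius L that is large in string units, the regime in which the classical gravity description is reliable; this is the quantitative sense of the strong–weak duality described earlier.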
Another realization of the correspondence states that M-theory on AdS<sub>7</sub> × S<sup>4</sup> is equivalent to the so-called (2,0)-theory in six dimensions. In this example, the spacetime of the gravitational theory is effectively seven-dimensional. The existence of the (2,0)-theory that appears on one side of the duality is predicted by the classification of superconformal field theories. It is still poorly understood because it is a quantum mechanical theory without a classical limit. Despite the inherent difficulty in studying this theory, it is considered to be an interesting object for a variety of reasons, both physical and mathematical.
Yet another realization of the correspondence states that M-theory on AdS<sub>4</sub> × S<sup>7</sup> is equivalent to the ABJM superconformal field theory in three dimensions. Here the gravitational theory has four noncompact dimensions, so this version of the correspondence provides a somewhat more realistic description of gravity.
## Applications to quantum gravity
### A non-perturbative formulation of string theory
In quantum field theory, one typically computes the probabilities of various physical events using the techniques of perturbation theory. Developed by Richard Feynman and others in the first half of the twentieth century, perturbative quantum field theory uses special diagrams called Feynman diagrams to organize computations. One imagines that these diagrams depict the paths of point-like particles and their interactions. Although this formalism is extremely useful for making predictions, these predictions are only possible when the strength of the interactions, the coupling constant, is small enough to reliably describe the theory as being close to a theory without interactions.
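Schematically, and purely as a generic illustration rather than a formula tied to any particular theory, a perturbative prediction for an observable A is organised as a power series in the coupling constant g:

```latex
% Generic perturbative expansion in a coupling constant g;
% the coefficients c_n are computed from Feynman diagrams of increasing complexity
A(g) = c_{0} + c_{1}\,g + c_{2}\,g^{2} + c_{3}\,g^{3} + \cdots
```

Truncating such a series is only justified when g is small; when the interactions are strong the expansion breaks down, which is one reason a non-perturbative formulation such as the one provided by the AdS/CFT correspondence is sought.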
The starting point for string theory is the idea that the point-like particles of quantum field theory can also be modeled as one-dimensional objects called strings. The interaction of strings is most straightforwardly defined by generalizing the perturbation theory used in ordinary quantum field theory. At the level of Feynman diagrams, this means replacing the one-dimensional diagram representing the path of a point particle by a two-dimensional surface representing the motion of a string. Unlike in quantum field theory, string theory does not yet have a full non-perturbative definition, so many of the theoretical questions that physicists would like to answer remain out of reach.
The problem of developing a non-perturbative formulation of string theory was one of the original motivations for studying the AdS/CFT correspondence. As explained above, the correspondence provides several examples of quantum field theories that are equivalent to string theory on anti-de Sitter space. One can alternatively view this correspondence as providing a definition of string theory in the special case where the gravitational field is asymptotically anti-de Sitter (that is, when the gravitational field resembles that of anti-de Sitter space at spatial infinity). Physically interesting quantities in string theory are defined in terms of quantities in the dual quantum field theory.
### Black hole information paradox
In 1975, Stephen Hawking published a calculation that suggested that black holes are not completely black but emit a dim radiation due to quantum effects near the event horizon. At first, Hawking's result posed a problem for theorists because it suggested that black holes destroy information. More precisely, Hawking's calculation seemed to conflict with one of the basic postulates of quantum mechanics, which states that physical systems evolve in time according to the Schrödinger equation. This property is usually referred to as unitarity of time evolution. The apparent contradiction between Hawking's calculation and the unitarity postulate of quantum mechanics came to be known as the black hole information paradox.
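Stated in standard textbook notation (and assuming, for simplicity, a time-independent Hamiltonian \(H\)), the postulate in question reads:

\(i\hbar\,\frac{d}{dt}|\psi(t)\rangle = H\,|\psi(t)\rangle \quad\Rightarrow\quad |\psi(t)\rangle = U(t)\,|\psi(0)\rangle, \qquad U(t) = e^{-iHt/\hbar}, \qquad U^{\dagger}U = 1,\)

so that distinct initial states remain distinguishable at later times and no information about the initial state is lost.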
The AdS/CFT correspondence resolves the black hole information paradox, at least to some extent, because it shows how a black hole can evolve in a manner consistent with quantum mechanics in some contexts. Indeed, one can consider black holes in the context of the AdS/CFT correspondence, and any such black hole corresponds to a configuration of particles on the boundary of anti-de Sitter space. These particles obey the usual rules of quantum mechanics and in particular evolve in a unitary fashion, so the black hole must also evolve in a unitary fashion, respecting the principles of quantum mechanics. In 2005, Hawking announced that the paradox had been settled in favor of information conservation by the AdS/CFT correspondence, and he suggested a concrete mechanism by which black holes might preserve information.
## Applications to quantum field theory
### Nuclear physics
One physical system that has been studied using the AdS/CFT correspondence is the quark–gluon plasma, an exotic state of matter produced in particle accelerators. This state of matter arises for brief instants when heavy ions such as gold or lead nuclei are collided at high energies. Such collisions cause the quarks that make up atomic nuclei to deconfine at temperatures of approximately two trillion kelvins, conditions similar to those present at around 10<sup>−11</sup> seconds after the Big Bang.
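As a rough cross-check on the quoted temperature, it can be converted into the energy units more familiar in particle physics using \(E = k_B T\). The short sketch below is a minimal illustration using standard CODATA constants, not a calculation taken from the sources of this article:

```python
# Rough conversion of the quoted temperature (~2 trillion kelvins) into
# particle-physics energy units via E = k_B * T. The constants are standard
# CODATA values; the temperature is the figure quoted in the text.
k_B = 1.380649e-23      # Boltzmann constant, J/K
eV = 1.602176634e-19    # joules per electronvolt

T = 2e12                              # kelvins
E_MeV = k_B * T / eV / 1e6            # J -> eV -> MeV
print(f"k_B * T is roughly {E_MeV:.0f} MeV")   # ~170 MeV
```

The result, roughly 170 MeV, matches the energy scale commonly quoted for the QCD deconfinement transition.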
The physics of the quark–gluon plasma is governed by quantum chromodynamics, but this theory is mathematically intractable in problems involving the quark–gluon plasma. In an article appearing in 2005, Đàm Thanh Sơn and his collaborators showed that the AdS/CFT correspondence could be used to understand some aspects of the quark–gluon plasma by describing it in the language of string theory. By applying the AdS/CFT correspondence, Sơn and his collaborators were able to describe the quark–gluon plasma in terms of black holes in five-dimensional spacetime. The calculation showed that the ratio of two quantities associated with the quark–gluon plasma, the shear viscosity η and volume density of entropy s, should be approximately equal to a certain universal constant:
- \(\frac{\eta}{s}\approx\frac{\hbar}{4\pi k}\)
where ħ denotes the reduced Planck constant and k is the Boltzmann constant. In addition, the authors conjectured that this universal constant provides a lower bound for η/s in a large class of systems. Measurements at the Relativistic Heavy Ion Collider at Brookhaven National Laboratory yielded values of this ratio close to the universal constant under one model of the data, but not under another.
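The bound can be evaluated numerically in SI units. The following is a minimal sketch, assuming only the formula displayed above and standard CODATA values for ħ and k; it is illustrative rather than a calculation from the literature:

```python
# Numerical evaluation of the conjectured lower bound eta/s >= hbar / (4 * pi * k),
# using standard CODATA values in SI units.
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
k_B = 1.380649e-23       # Boltzmann constant, J/K

bound = hbar / (4 * math.pi * k_B)
print(f"eta/s lower bound is roughly {bound:.2e} K*s")   # about 6.1e-13 K*s
# In natural units (hbar = k = 1) the bound is simply 1 / (4 * pi).
```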
Another important property of the quark–gluon plasma is that very high energy quarks moving through the plasma are stopped or "quenched" after traveling only a few femtometres. This phenomenon is characterized by a number called the jet quenching parameter, which relates the energy loss of such a quark to the squared distance traveled through the plasma. Calculations based on the AdS/CFT correspondence give an estimated value of the jet quenching parameter of approximately 4 GeV<sup>2</sup>/fm, while the experimental value lies in the range 5–15 GeV<sup>2</sup>/fm.
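The quadratic dependence on path length can be expressed as a short sketch. The function below, with its hypothetical name and free prefactor, encodes only the scaling described above (energy loss proportional to the jet quenching parameter times the squared distance travelled); a full calculation would have to supply the normalization:

```python
def energy_loss_scale(q_hat, path_length, prefactor=1.0):
    """Characteristic energy-loss scale for a hard quark crossing the plasma.

    q_hat       -- jet quenching parameter in GeV^2/fm (the AdS/CFT estimate
                   quoted above is about 4; experiment suggests 5-15)
    path_length -- distance travelled through the plasma, in fm
    prefactor   -- dimensionless normalization left as a free placeholder
                   (an assumption of this sketch, not a value from the text)

    Returns prefactor * q_hat * L**2, i.e. only the quadratic path-length
    scaling described above, with units of GeV^2 * fm.
    """
    return prefactor * q_hat * path_length ** 2

# Doubling the path length quadruples the characteristic energy loss.
print(energy_loss_scale(4.0, 1.0), energy_loss_scale(4.0, 2.0))   # 4.0 16.0
```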
### Condensed matter physics
Over the decades, experimental condensed matter physicists have discovered a number of exotic states of matter, including superconductors and superfluids. These states are described using the formalism of quantum field theory, but some phenomena are difficult to explain using standard field theoretic techniques. Some condensed matter theorists including Subir Sachdev hope that the AdS/CFT correspondence will make it possible to describe these systems in the language of string theory and learn more about their behavior.
So far some success has been achieved in using string theory methods to describe the transition of a superfluid to an insulator. A superfluid is a system of electrically neutral atoms that flows without any friction. Such systems are often produced in the laboratory using liquid helium, but recently experimentalists have developed new ways of producing artificial superfluids by pouring trillions of cold atoms into a lattice of criss-crossing lasers. These atoms initially behave as a superfluid, but as experimentalists increase the intensity of the lasers, they become less mobile and then suddenly transition to an insulating state. During the transition, the atoms behave in an unusual way. For example, the atoms slow to a halt at a rate that depends on the temperature and on the Planck constant, the fundamental parameter of quantum mechanics, which does not enter into the description of the other phases. This behavior has recently been understood by considering a dual description where properties of the fluid are described in terms of a higher dimensional black hole.
### Criticism
With many physicists turning towards string-based methods to solve problems in nuclear and condensed matter physics, some theorists working in these areas have expressed doubts about whether the AdS/CFT correspondence can provide the tools needed to realistically model real-world systems. In a talk at the Quark Matter conference in 2006, the American physicist Larry McLerran pointed out that the N = 4 super Yang–Mills theory that appears in the AdS/CFT correspondence differs significantly from quantum chromodynamics, making it difficult to apply these methods to nuclear physics. According to McLerran,
> N = 4 supersymmetric Yang–Mills is not QCD ... It has no mass scale and is conformally invariant. It has no confinement and no running coupling constant. It is supersymmetric. It has no chiral symmetry breaking or mass generation. It has six scalar and fermions in the adjoint representation ... It may be possible to correct some or all of the above problems, or, for various physical problems, some of the objections may not be relevant. As yet there is not consensus nor compelling arguments for the conjectured fixes or phenomena which would insure that the N = 4 supersymmetric Yang Mills results would reliably reflect QCD.
In a letter to Physics Today, Nobel laureate Philip W. Anderson voiced similar concerns about applications of AdS/CFT to condensed matter physics, stating
> As a very general problem with the AdS/CFT approach in condensed-matter theory, we can point to those telltale initials "CFT"—conformal field theory. Condensed-matter problems are, in general, neither relativistic nor conformal. Near a quantum critical point, both time and space may be scaling, but even there we still have a preferred coordinate system and, usually, a lattice. There is some evidence of other linear-T phases to the left of the strange metal about which they are welcome to speculate, but again in this case the condensed-matter problem is overdetermined by experimental facts.
## History and development
### String theory and nuclear physics
The discovery of the AdS/CFT correspondence in late 1997 was the culmination of a long history of efforts to relate string theory to nuclear physics. In fact, string theory was originally developed during the late 1960s and early 1970s as a theory of hadrons, the subatomic particles like the proton and neutron that are held together by the strong nuclear force. The idea was that each of these particles could be viewed as a different oscillation mode of a string. In the late 1960s, experimentalists had found that hadrons fall into families called Regge trajectories with squared energy proportional to angular momentum, and theorists showed that this relationship emerges naturally from the physics of a rotating relativistic string.
On the other hand, attempts to model hadrons as strings faced serious problems. One problem was that string theory includes a massless spin-2 particle whereas no such particle appears in the physics of hadrons. Such a particle would mediate a force with the properties of gravity. In 1974, Joël Scherk and John Schwarz suggested that string theory was therefore not a theory of nuclear physics as many theorists had thought but instead a theory of quantum gravity. At the same time, it was realized that hadrons are actually made of quarks, and the string theory approach was abandoned in favor of quantum chromodynamics.
In quantum chromodynamics, quarks have a kind of charge that comes in three varieties called colors. In a paper from 1974, Gerard 't Hooft studied the relationship between string theory and nuclear physics from another point of view by considering theories similar to quantum chromodynamics, where the number of colors is some arbitrary number N, rather than three. In this article, 't Hooft considered a certain limit where N tends to infinity and argued that in this limit certain calculations in quantum field theory resemble calculations in string theory.
### Black holes and holography
In 1975, Stephen Hawking published a calculation that suggested that black holes are not completely black but emit a dim radiation due to quantum effects near the event horizon. This work extended previous results of Jacob Bekenstein who had suggested that black holes have a well-defined entropy. At first, Hawking's result appeared to contradict one of the main postulates of quantum mechanics, namely the unitarity of time evolution. Intuitively, the unitarity postulate says that quantum mechanical systems do not destroy information as they evolve from one state to another. For this reason, the apparent contradiction came to be known as the black hole information paradox.
Later, in 1993, Gerard 't Hooft wrote a speculative paper on quantum gravity in which he revisited Hawking's work on black hole thermodynamics, concluding that the total number of degrees of freedom in a region of spacetime surrounding a black hole is proportional to the surface area of the horizon. This idea was promoted by Leonard Susskind and is now known as the holographic principle. The holographic principle and its realization in string theory through the AdS/CFT correspondence have helped elucidate the mysteries of black holes suggested by Hawking's work and are believed to provide a resolution of the black hole information paradox. In 2004, Hawking conceded that black holes do not violate quantum mechanics, and he suggested a concrete mechanism by which they might preserve information.
### Maldacena's paper
In late 1997, Juan Maldacena published a landmark paper that initiated the study of AdS/CFT. According to Alexander Markovich Polyakov, "[Maldacena's] work opened the flood gates." The conjecture immediately excited great interest in the string theory community and was considered in a paper by Steven Gubser, Igor Klebanov and Polyakov, and another paper by Edward Witten. These papers made Maldacena's conjecture more precise and showed that the conformal field theory appearing in the correspondence lives on the boundary of anti-de Sitter space.
One special case of Maldacena's proposal says that N = 4 super Yang–Mills theory, a gauge theory similar in some ways to quantum chromodynamics, is equivalent to string theory in five-dimensional anti-de Sitter space. This result helped clarify the earlier work of 't Hooft on the relationship between string theory and quantum chromodynamics, taking string theory back to its roots as a theory of nuclear physics. Maldacena's results also provided a concrete realization of the holographic principle with important implications for quantum gravity and black hole physics. By the year 2015, Maldacena's paper had become the most highly cited paper in high energy physics with over 10,000 citations. Subsequent work has provided considerable evidence that the correspondence is correct, although so far it has not been rigorously proved.
## Generalizations
### Three-dimensional gravity
In order to better understand the quantum aspects of gravity in our four-dimensional universe, some physicists have considered a lower-dimensional mathematical model in which spacetime has only two spatial dimensions and one time dimension. In this setting, the mathematics describing the gravitational field simplifies drastically, and one can study quantum gravity using familiar methods from quantum field theory, eliminating the need for string theory or other more radical approaches to quantum gravity in four dimensions.
Beginning with the work of J. David Brown and Marc Henneaux in 1986, physicists have noticed that quantum gravity in a three-dimensional spacetime is closely related to two-dimensional conformal field theory. In 1995, Henneaux and his coworkers explored this relationship in more detail, suggesting that three-dimensional gravity in anti-de Sitter space is equivalent to the conformal field theory known as Liouville field theory. Another conjecture formulated by Edward Witten states that three-dimensional gravity in anti-de Sitter space is equivalent to a conformal field theory with monster group symmetry. These conjectures provide examples of the AdS/CFT correspondence that do not require the full apparatus of string or M-theory.
### dS/CFT correspondence
Unlike our universe, which is now known to be expanding at an accelerating rate, anti-de Sitter space is neither expanding nor contracting. Instead it looks the same at all times. In more technical language, one says that anti-de Sitter space corresponds to a universe with a negative cosmological constant, whereas the real universe has a small positive cosmological constant.
Although the properties of gravity at short distances should be somewhat independent of the value of the cosmological constant, it is desirable to have a version of the AdS/CFT correspondence for positive cosmological constant. In 2001, Andrew Strominger introduced a version of the duality called the dS/CFT correspondence. This duality involves a model of spacetime called de Sitter space with a positive cosmological constant. Such a duality is interesting from the point of view of cosmology since many cosmologists believe that the very early universe was close to being de Sitter space.
### Kerr/CFT correspondence
Although the AdS/CFT correspondence is often useful for studying the properties of black holes, most of the black holes considered in the context of AdS/CFT are physically unrealistic. Indeed, as explained above, most versions of the AdS/CFT correspondence involve higher-dimensional models of spacetime with unphysical supersymmetry.
In 2009, Monica Guica, Thomas Hartman, Wei Song, and Andrew Strominger showed that the ideas of AdS/CFT could nevertheless be used to understand certain astrophysical black holes. More precisely, their results apply to black holes that are approximated by extremal Kerr black holes, which have the largest possible angular momentum compatible with a given mass. They showed that such black holes have an equivalent description in terms of conformal field theory. The Kerr/CFT correspondence was later extended to black holes with lower angular momentum.
### Higher spin gauge theories
The AdS/CFT correspondence is closely related to another duality conjectured by Igor Klebanov and Alexander Markovich Polyakov in 2002. This duality states that certain "higher spin gauge theories" on anti-de Sitter space are equivalent to conformal field theories with O(N) symmetry. Here the theory in the bulk is a type of gauge theory describing particles of arbitrarily high spin. It is similar to string theory, where the excited modes of vibrating strings correspond to particles with higher spin, and it may help to better understand the string theoretic versions of AdS/CFT and possibly even prove the correspondence. In 2010, Simone Giombi and Xi Yin obtained further evidence for this duality by computing quantities called three-point functions.
## See also
- Algebraic holography
- Ambient construction
- Randall–Sundrum model
# Francis Marrash
Francis bin Fathallah bin Nasrallah Marrash (Arabic: فرنسيس بن فتح الله بن نصر الله مرّاش; 1835, 1836, or 1837 – 1873 or 1874), also known as Francis al-Marrash or Francis Marrash al-Halabi, was a Syrian scholar, publicist, writer and poet of the Nahda or the Arab Renaissance, and a physician. Most of his works revolve around science, history and religion, analysed under an epistemological light. He traveled throughout West Asia and France in his youth, and after some medical training and a year of practice in his native Aleppo, during which he wrote several works, he enrolled in a medical school in Paris; yet, declining health and growing blindness forced him to return to Aleppo, where he produced more literary works until his early death.
Historian Matti Moosa considered Marrash to be the first truly cosmopolitan Arab intellectual and writer of modern times. Marrash adhered to the principles of the French Revolution and defended them in his own works, implicitly criticizing Ottoman rule in West Asia and North Africa. He was also influential in introducing French romanticism in the Arab world, especially through his use of poetic prose and prose poetry, of which his writings were the first examples in modern Arabic literature, according to Salma Khadra Jayyusi and Shmuel Moreh. His modes of thinking and feeling, and ways of expressing them, have had a lasting influence on contemporary Arab thought and on the Mahjari poets.
## Life
### Background and education
Francis Marrash was born in Aleppo, a city of Ottoman Syria (present-day Syria), to an old Melkite family of merchants known for their literary interests. Having earned wealth and standing in the 18th century, the family was well established in Aleppo, although they had gone through troubles: a relative of Francis, Butrus Marrash, was killed by the wali's troops in the midst of a Catholic–Orthodox clash in April 1818. Other Melkite Catholics were exiled from Aleppo during the persecutions, among them the priest Jibrail Marrash. Francis' father, Fathallah, tried to defuse the sectarian conflict by writing a treatise in 1849, in which he rejected the Filioque. He had built up a large private library to give his three children Francis, Abdallah and Maryana a thorough education, particularly in the field of Arabic language and literature.
Aleppo was then a major intellectual center of the Ottoman Empire, featuring many thinkers and writers concerned with the future of the Arabs. It was in the French missionary schools that the Marrash family learnt Arabic along with French and other foreign languages (Italian and English). But Francis at first studied the Arabic language and its literature privately. At the age of four years, Marrash had contracted measles, and had ever since suffered from eye problems that had kept worsening over time. Hoping to find a treatment, his father had therefore taken him to Paris in 1850; Francis stayed there for about a year, after which he was sent back to Aleppo while his father remained in Paris. In 1853, Francis accompanied his father once again, on a business trip of several months to Beirut, where there was a noticeable presence and cultural influence of Europeans. Francis experienced similar cultural contact later on, when he received private tutoring in medicine for four years under a British physician in Aleppo—he had by then developed a keen interest in science, and in medicine in particular. At the same time, he wrote and published several works. Marrash practiced medicine for about a year; however, deeming it safer for his trade to become a state-licensed physician, he went to Paris in 1866 so as to continue his medical education at a school. But his fragile health and his growing blindness forced him to interrupt his studies within a year after his arrival. He returned to Aleppo completely blind, but still managed to dictate his works.
### Literary career
#### Ghabat al-haqq
Around 1865, Marrash published Ghabat al-haqq ("The Forest of Truth" or "The Forest of Justice"), an allegory about the conditions required to establish and maintain civilization and freedom. This allegory relates the apocalyptic vision of a war between a Kingdom of Liberty and a Kingdom of Slavery, resolved by the capture of the latter's king and a subsequent trial before the King of Liberty, the Queen of Wisdom, the Vizier of Peace and Fraternal Love, the Commander of the Army of Civilization, with the Philosopher from the City of Light—who represents the author—as counsel. In this work, Marrash expressed ideas of political and social reforms, highlighting the need of the Arabs for two things above all: modern schools and patriotism "free from religious considerations". In 1870, when distinguishing the notion of fatherland from that of nation and applying the latter to the region of Syria, Marrash would point to the role played by language, among other factors, in counterbalancing religious and sectarian differences, and thus, in defining national identity.
Although Marrash's poetical expression lacked the legal meticulousness found in works from Enlightened Europe, orientalist Shmuel Moreh has stated that Marrash became, with Ghabat al-haqq, "the first Arab writer to reflect the optimism and humanistic view of 18th-century Europe. This view stemmed from the hope that education, science and technology would resolve such problems of humanity as slavery, religious discrimination, illiteracy, disease, poverty, war, and other scourges of mankind, and it gave utterance to his hope for brotherhood and equality among peoples." Yet, his views on freedom differed from those of the French revolutionists and of his Middle Eastern contemporaries; indeed, he considered pleading for freedom on the basis of natural analogy to be superficial, for even nature responds to its own set of rules, according to Marrash. As a consequence, nothing in the universe may yearn for liberty without satisfying essential rules and needs that guarantee its existence. Being one of these, the need for progress may therefore justify the abolition of any restriction that does not serve as a regulator for a good system. In light of this reasoning, and in reference to the ongoing American Civil War, he thus in Ghabat al-haqq supported the abolition of slavery.
But the significance of this work also lay in Marrash's attempt to blend European thought with his own reading of the Christian belief in universal love. Indeed, he had tried to reconcile his philosophical understanding of the concept of liberty with his belief in the benevolence of the Catholic Church's authority. As stated by Nazik Saba Yared:
> He argued that only the spiritual kingdom [i.e. the kingdom centered on religion] could curb evil [...] and consequently guarantee the freedom of man. Love is one of the pillars of Christianity, and Marrash, like some Sufis and Romantics, considered it to be the basis of civilization, indeed of the entire universe [...]. Since love, for Marrash, was the general law, and freedom meant participation in that law, it followed that freedom would be inseparable from love and religion.
#### Later writings
In 1867, Marrash published Rihlat Baris, an account of his second journey to Paris. The book begins with a description of his progress from Aleppo to İskenderun, Latakia, Tripoli, Beirut, Jaffa, Alexandria, Cairo, and then back to Alexandria, from which he boarded a ship to Marseille, where he arrived in October 1866. The Arab cities had inspired in him revulsion and indifference, except Alexandria and Cairo, where Ismail Pasha had already begun modernization projects. He had then travelled through France, with a stopover in Lyon before ending up in Paris. Marrash was fascinated by France, and by Paris most of all; everything he described in his account, from the Paris Exhibition of 1867 to gas lighting in the streets, served to praise the accomplishments of Western civilization. In Mashhad al-ahwal ("The Witnessing of the Stages of Human Life"), published in 1870, Marrash would again compare the East and the West, writing that "while the East sank deeper into darkness, the West embraced light". The optimism he had formerly expressed about the first reform currents under the reign of Sultan Abdülaziz in the Ottoman Empire gave way to pessimism in Mashhad al-ahwal, as he realized these reforms were superficial and that those he had hoped for would not soon come into being. Yet, in Durr al-sadaf fi ghara'ib al-sudaf (Pearl Shells in Relating Strange Coincidences), which he published two years later, he depicted the Lebanese social life of his day and criticised the blind imitation of Western customs and the use of the French language in everyday life.
Throughout his life, Marrash composed many essays about science (especially mathematics), and about education, a subject of great importance to him; indeed, he wrote in Ghabat al-haqq that "without the education of the mind, man is a mindless beast". He also wrote many articles in the popular press; in those published in Butrus al-Bustani's journal Al-Jinan, he showed himself favourable to women's education, which he restricted, however, to reading, writing, and a little arithmetic, geography and grammar. In an 1872 issue of Al-Jinan, he wrote that it is not necessary for a woman "to act like a man, neglect her domestic and family duties, or that she should consider herself superior to the man"; he nonetheless closely followed his sister's studies. Marrash also condemned Arab men's severe treatment of their wives and daughters. In his later works, he tried to demonstrate the existence of God and of the divine law; the Sharia, as he conceived it, was not restricted to the sphere of the Islamic law alone.
## Works
### List
- Dalīl al-ḥurrīyah al-insānīyah (Guide to Human Liberty), 1861.
- Al-Mirāh al-ṣafīyah fī al-mabādi al-ṭabīīyah (The Clear Mirror of Natural Principles), 1861.
- Tazīyat al-makrūb wa-rāḥat al-matūb (Consolation of the Anxious and Repose of the Weary One), 1864—pessimistic discourse on nations of the past.
- Ghābat al-ḥaqq fī tafṣīl al-akhlāq al-fāḍilah (The Forest of Truth in Detailing Cultured Manners), c. 1865.
- Riḥlat Bārīs (Journey to Paris), 1867.
- Kitāb dalīl al-ṭabīah (Guide to Nature), c. 1867.
- Al-Kunūz al-fannīyah fī al-rumūz al-Maymūnīyah (Artistic Treasures Concerning the Symbolic Visions of Maymun), 1870—poem of almost 500 verses.
- Mashhad al-aḥwāl (The Witnessing of the Stages of Human Life), 1870—collection of poems and short works in rhymed prose.
- Durr al-ṣadaf fī gharāib al-ṣudaf (Pearl Shells in Relating Strange Coincidences), 1872—a romance with songs for which he supplied the tunes.
- Mirāt al-ḥasnā (The Mirror of the Beautiful One), 1872.
- Shahādat al-ṭabīah fī wujūd Allāh wa-al-sharīah (Nature's Proofs for the Existence of God and the Divine Law), 1892 (posthumous).
Writings published in periodicals:
### Style
Marrash often included poems in his works, written in muwashshah and zajal forms according to the occasion. Shmuel Moreh has stated that Marrash tried to introduce "a revolution in diction, themes, metaphor and imagery in modern Arabic poetry", sometimes even mocking conventional poetic themes. In the introduction to his poetry book Mir'at al-hasna' (The Mirror of the Beautiful One), which was first published in 1872, Marrash rejected even the traditional genres of Arabic poetry, particularly panegyrics and lampoons. His use of conventional diction for new ideas marked the rise of a new stage in Arabic poetry which was carried on by the Mahjaris. Shmuel Moreh has also considered some passages from Ghabat al-haqq and Rihlat Baris to be prose poetry, while Salma Khadra Jayyusi has described his prosaic writing as "often Romantic in tone, rising sometimes to poetic heights, declamatory, vivid, colourful and musical", calling it the first example of poetic prose in modern Arabic literature.
## Legacy
Kahlil Gibran was a great admirer of Marrash, whose works he had read at al-Hikma School in Beirut. According to Shmuel Moreh, Gibran's own works echo Marrash's style and "many of [his] ideas on enslavement, education, women's liberation, truth, the natural goodness of man, and the corrupted morals of society". Khalil Hawi has referred to Marrash's aforementioned philosophy of universal love as having left a deep impression on Gibran. Moreover, Khalil Hawi has stated that many of Marrash's recurring expressions became stock images for Arab writers of the 20th century: he has mentioned, for example, "the valleys of mental contemplation", "the wings of thoughts", "solicitudes and dreams", "the veils of history", "the Kingdom of the Spirit", "the nymphs of the forest, the spring and the dawn", "golden diadems", "the jewels of light", "the storms of days and nights", and "the smoke of revenge and the mist of anger".
# D. Djajakusuma
Djadoeg Djajakusuma (1 August 1918 – 28 October 1987) was an Indonesian film director and promoter of traditional art forms. Born to a nobleman and his wife in Temanggung, Central Java, Djajakusuma became interested in the arts at a young age, choosing to pursue a career in theatre. During the Japanese occupation from 1943 to 1945 he was a translator and actor, and in the four-year national revolution which followed he worked for the military's educational division, several news agencies, and in drama.
In 1951, Djajakusuma joined the National Film Corporation (Perfini) at the invitation of Usmar Ismail. After making his directorial debut with Embun, Djajakusuma released a further eleven films with the company before leaving in 1964. He then returned to traditional Indonesian theatre, including wayang. Although he continued to direct movies independently of Perfini, most of his energies were dedicated to promoting traditional art forms and teaching cinematography. After over a decade of poor health and high blood pressure, Djajakusuma collapsed during a ceremony and died. He was buried in Karet Bivak Cemetery.
The dedicated but easily angered Djajakusuma was influenced by Usmar Ismail's realist views, although he focused more on traditional aspects of life. His theatrical performances attempted to modernize traditional forms so that they could be better received in a modern world. He is credited with revitalising the Betawi theatre form lenong and received numerous awards for his filmmaking, including a lifetime achievement award at the Indonesian Film Festival.
## Biography
### Early life
Djajakusuma was born on 1 August 1918 in Parakan, Temanggung, Central Java, Dutch East Indies, to a priyayi father, Raden Mas Aryo Djojokoesomo, and his wife Kasimah. Djajakusuma was the fifth child of six born to the couple, who lived comfortably off Djojokoesomo's salary as a government official. While young he enjoyed watching stage performances, such as wayang puppetry and the traditional dance form tayuban; at times he would furtively leave his home after bedtime to watch the productions. With his friends, he would act out the bedtime stories his mother told him. When imported Hollywood films began to be screened, he was an avid viewer, watching Westerns and works starring Charlie Chaplin.
Owing to his position as the son of a nobleman, Djajakusuma was able to receive an education. He completed his studies in Semarang, Central Java, graduating from the natural sciences programme at a senior high school there in 1941. Although his family hoped that he would become a government employee like his father, Djajakusuma decided to go into the performing arts. He returned to his hometown for a short time before realising that he would have little opportunity in Parakan. Accordingly, in early 1943 – almost a year after the Indies were occupied by the Empire of Japan – Djajakusuma moved to the colony's political centre, Jakarta, to find work.
Djajakusuma became employed at the Cultural Centre as a translator and actor under Armijn Pane. Among the works he translated were several pieces by the Swedish playwright August Strindberg and Norwegian playwright Henrik Ibsen, as well as a history of Japan and several kabuki stage plays. While with the centre, Djajakusuma wrote several of his own stage plays. In his free time, Djajakusuma helped establish the amateur theatre company Maya, together with artists such as HB Jassin, Rosihan Anwar, and Usmar Ismail. The troupe, formed in response to a desire for greater artistic freedom, performed translations of European works and original works by Ismail and El Hakim. To promote a sense of Indonesian nationalism while still conforming with the Japanese censorship bureau's rules, several of Maya's plays did not explicitly promote Japan, but rather the Greater East Asia Co-Prosperity Sphere. Themes supporting the Indonesian nationalist movement, meanwhile, remained implicit in the works. With Maya, Djajakusuma travelled from village to village, putting on performances.
### Indonesian National Revolution
President Sukarno proclaimed Indonesia's independence on 17 August 1945, days after the bombings of Hiroshima and Nagasaki. Expecting the Dutch colonial government to return, Djajakusuma and Ismail helped establish the Independent Artists (Seniman Merdeka) as a form of resistance. The group travelled throughout the city, spreading news of Indonesia's proclaimed independence while performing from an open-air truck. After the arrival of the Netherlands Indies Civil Administration, the group sometimes attempted to spy on the Europeans or hide information which would be considered useful to the returning Dutch forces. Owing to this dangerous work, Djajakusuma began carrying a pistol, and went to Banten to ask a kyai to make him impervious to bullets.
In early 1946, with the Dutch colonial forces in control of Jakarta, Djajakusuma fled to the new national capital at Yogyakarta. There, he spent a time with the national news agency Antara before joining the military's educational division, rising to the rank of captain. For the military Djajakusuma edited the weekly Tentara; he also contributed articles to Ismail's cultural magazine Arena. Despite his involvement in the press, he did not abandon the theatre; with Surjo Sumanto, he established a troupe which performed for soldiers and raised morale, sometimes travelling to the frontlines.
Djajakusuma was hired by the Ministry of Information in 1947 to teach at a school for the performance arts, the Mataram Entertainment Foundation (Stichting Hiburan Mataram). Through Mataram, he and Ismail were introduced to filmmakers Andjar Asmara, Huyung, and Sutarto; the two studied under these more established individuals. Meanwhile, Djajakusuma was put in charge of censoring radio broadcasts in Republican-held areas, a duty he held until the Dutch captured Yogyakarta on 19 December 1948. Djajakusuma fled the city, then met up with Republican forces. Using an old radio and a bicycle-powered generator, Djajakusuma listened to international news broadcasts and wrote them down; the information from these broadcasts was then printed in underground newspapers.
After the Indonesian National Revolution ended with Dutch recognition of Indonesia's independence in 1949, Djajakusuma continued to work as a journalist for Patriot (a rebranding of Tentara) and the magazine Kebudajaan Nusantara; Mataram was reopened, and Djajakusuma began teaching there again while managing the Soboharsono cinema and writing several stage plays. Ismail, meanwhile, went back to Jakarta and established the National Film Corporation (Perusahaan Film Nasional, or Perfini); its first production, Darah dan Doa (The Long March), which gave a fictionalised version of the Siliwangi Division's trek from Yogyakarta to West Java in 1948, was directed by Ismail and released in 1950.
### Career with Perfini
In preparation for his second film, Enam Djam di Jogja (Six Hours in Yogyakarta), Ismail recalled Djajakusuma to Jakarta. For the film, Djajakusuma helped Ismail adapt the General Assault of 1 March 1949 for the screen. Production was completed on a low budget; Djajakusuma later recalled that their camera had to be powered by a car battery. Despite this and other difficulties, Djajakusuma stayed on after the film's completion, completing another work for Perfini, Dosa Tak Berampun (Unforgivable Sin), later that year. Ismail served as director for this film, about a man who leaves his family after he is transfixed by the smile of a waitress.
While Ismail, who remained head of Perfini, went abroad to study cinematography at the School of Theater, Film and Television at the University of California, Los Angeles, Djajakusuma began taking a larger role in Perfini. He made his directorial debut in 1952 with Embun (Dewdrop), which showed the psychological troubles faced by soldiers upon returning to their village after the revolution. The film was shot in Wonosari, at the time in the middle of a drought, to provide a visual metaphor for the barren souls of the warriors. Because of its depiction of traditional superstitions, the film had trouble with both the censorship bureau and critics; superstition was considered incompatible with the new republic's need for modernisation. The release of Embun made Djajakusuma one of four directors to work for Perfini; the others were Ismail, Nya Abas Akup, and Wahyu Sihombing.
Djajakusuma's next production, Terimalah Laguku (Take My Song; 1952), was a musical about an old, impoverished musician who sells his saxophone to help his former student's career. Though the film's technical quality was poor, when he returned to Indonesia in 1953 Ismail was pleased with the work, stating that the editing had been done well. Over the next year Ismail conveyed information he learned at UCLA to the Perfini staff; Djajakusuma followed these lessons closely. This was followed by Harimau Tjampa (Tiger from Tjampa) in 1953, a film about a man who attempts to avenge his father's death. Set amidst Minang culture, the film featured some of the first nudity in a domestic production and was a considerable critical success.
In 1954 Djajakusuma directed two comedies, Putri dari Medan (Daughter of Medan) and Mertua Sinting (Insane Parents-in-Law). The first dealt with three young men who resolve to never marry, only for their strength to waver after meeting some women from Medan, while the second followed a man who rejects his son's choice of spouse owing to her lack of noble descent, then unknowingly chooses the same woman to be his son's wife. The following year Djajakusuma helped establish the Indonesian Screen Actors Guild (Persatuan Artis Film Indonesia; PARFI). His only film that year, the drama Arni, told of a man who married another woman while his sick wife went to Padang, Sumatra, for treatment.
Djajakusuma studied cinematography in the United States, first at the University of Washington in Seattle, then at the University of Southern California's School of Cinematic Arts, from 1956 to 1957. When he returned to Indonesia, he worked with Ismail and fellow Perfini employee Asrul Sani to establish the National Theatre Academy of Indonesia (Akademi Teater Nasional Indonesia), which promoted realism; the Indonesian dramatist Putu Wijaya described the realism promoted by the academy as more Indonesian than Western, while Djajakusuma considered it inspired by the Italian neorealist movement. Djajakusuma remained a lecturer with the academy until 1970, and his students considered him humorous and easy to approach.
Upon his return to Indonesia, Djajakusuma began work on Tjambuk Api (Whips of Fire; 1958), a critique of the widespread corruption in Indonesia; this theme led to the film being held by the censorship bureau for almost a year. The director followed this with the drama Pak Prawiro (Mr. Prawiro), which was sponsored by the Post Savings Bank (Bank Tabungan Pos) and meant to convey the importance of having savings. During this period he studied the traditional theatre of India, travelling to Calcutta, Madras, and New Delhi; he hoped that this first-hand experience would inspire him in the filming of traditional Indonesian stories.
In 1960 Djajakusuma released his first film based on traditional wayang stories, Lahirnja Gatotkatja; the traditional puppetry had fascinated him as a child, and he greatly enjoyed the character Gatotkaca. Shot in Yogyakarta, the film featured a cast of stars from Jakarta and local talent in backing roles. It was, however, controversial: dhalang and others versed in wayang argued that the director had ignored too many traditional aspects of the puppetry. That year Djajakusuma also served as production manager for Ismail's Pedjuang (Warriors for Freedom) and directed Mak Tjomblang (Mrs. Tjomblang), a comedy adapted from Nikolai Gogol's 1842 drama Marriage.
Djajakusuma released another comedy, Masa Topan dan Badai (Time of Cyclones and Storms), in 1963; the film centres around the family dynamics of a conservative father, liberal mother, and their two teenaged daughters who are in the throes of adolescence. The following year Djajakusuma directed his last film with Perfini, Rimba Bergema (Echoing Jungles), which was meant to promote the nation's rubber industry. That year he helped establish the Film and TV Employee's Union (Persatuan Karyawan Film dan TV), a response to the Lekra-sponsored Indonesian Film League. As with Ismail and most Perfini employees, Djajakusuma was staunchly against the communist-affiliated Lekra; the cultural group was likewise hostile towards those affiliated with Perfini.
### Later career
Towards the end of his time with Perfini, Djajakusuma again became active in traditional arts. He devoted considerable time towards the promotion of wayang. In 1967 he organised the National Wayang Festival, which collapsed shortly afterwards owing to a lack of funds. In 1967 he directed the wayang-inspired film Bimo Kroda for Pantja Murti Film, which used the destruction of the Pandawa – brothers in the Hindu epic Mahābhārata – to represent the kidnappings and subsequent murders of five army generals during the 30 September Movement in 1965. Djajakusuma's involvement with wayang continued into the early 1970s; he organised two Wayang Weeks, in 1970 and 1974, as well as a national wayang festival in 1977. Furthermore, he established two wayang orang troupes, Jaya Budaya (1971) and Bharata (1973), hoping to save the ailing medium by modernising it.
Meanwhile, Djajakusuma helped promote art forms such as the Betawi lenong and Javanese ludruk over a period of several years. He is particularly recognised for revitalising lenong. Beginning in 1968, Djajakusuma appeared on television as an advocate of lenong, which was then limited to rural villages and on the verge of dying out. He increased popular knowledge of the form while arguing for proper remuneration for performers. Through the 1970s lenong was performed at Ismail Marzuki Hall, drawing considerable audiences, and several lenong performers found mainstream acclaim in the film industry.
Djajakusuma also promoted non-traditional cultural activities, both modern and foreign. In 1968 he became the head of the Jakarta Art Council, a position he held until 1977, and in 1970 he held a kroncong music festival. Beginning with the school's establishment in 1970, he became a lecturer at the Jakarta Institute for Arts Education (Lembaga Pendidikan Kesenian Jakarta, later the Jakarta Art Institute [Institut Kesenian Jakarta, or IKJ]), teaching cinematography. To better understand the world's theatre, in 1977 he went to Japan and China to study their traditions. He later led the students in various stage performances, including adaptations of Japanese noh and Chinese opera; several of these performances were held at Ismail Marzuki Hall. In the 1970s Djajakusuma held a variety of positions in film organisations, including as a member of the Film Council (1974–76), a member of the board of trustees for Radio and TV Broadcasts (1976), and a member of the Bureau for the Development of National Film (1977–78).
Djajakusuma's productivity in the film industry, however, declined. In 1971 he directed his final films, Api di Bukit Menoreh (Fire on Mount Menoreh) and Malin Kundang (Anak Durhaka) (Malin Kundang [Faithless Child]). The first, released for Penas Film Studio and based on a novel by Singgih Hadi Mintardja, followed soldiers from the Kingdom of Pajang in their efforts to subdue soldiers from the rival kingdom of Jipang. The second film was an adaptation of the Malay folktale of the same name. Starring Rano Karno and Putu Wijaya as the title character, the film follows a young boy who forgets his roots after spending much of his childhood at sea. His last role as a filmmaker was in 1977, when he helped produce Fritz G. Schadt's comedy Bang Kojak (Brother Kojak; 1977).
### Final years and death
In 1977 Djajakusuma served on the jury of the Indonesian Film Festival (Festival Film Indonesia, or FFI). While reading the decision, he collapsed and was rushed to the hospital, while Rosihan Anwar completed the reading. Djajakusuma's neighbour and frequent collaborator Taufiq Ismail told reporters that it was not the first time Djajakusuma had collapsed. Djajakusuma continued to suffer from bouts of sudden weakness for the rest of his life, caused by high blood pressure.
Despite his rapidly failing health, Djajakusuma remained active in the arts. In 1980 he made his last film appearance, and his only role on the big screen, acting in Ismail Soebardjo's Perempuan dalam Pasungan (Woman in Stocks). He and Sofia WD portrayed parents who regularly put their daughter in stocks to punish her for being disobedient; in an interview with Suara Karya, Soebardjo recalled that, from the time he had written it, he had only considered Djajakusuma for the role. Perempuan dalam Pasungan won the Citra Award for Best Film at the 1981 Indonesian Film Festival, and Djajakusuma expressed an interest in making several further films; this was, however, never realised. In 1983 Djajakusuma served as dean of the Faculty of Arts at IKJ, and in 1984 he went to the Three Continents Festival in Nantes, France, where two of his films were shown to critical acclaim.
In early 1987 Djajakusuma's doctor diagnosed him with heart disease, which led Djajakusuma to begin dieting and stop smoking. He continued to be highly respected in Indonesian film circles, but was displeased with the condition of the country's film industry, which he considered to be on the verge of collapse. This he blamed on American cultural imperialism, which meant that most cinemas preferred screening foreign films, especially those from Hollywood, and that Indonesian youth were no longer creating a uniquely Indonesian identity.
Djajakusuma collapsed on 28 October 1987 while giving a speech in commemoration of the Youth Pledge at the IKJ, striking his head on a stone step. After being rushed to Cikini General Hospital, he was declared dead at 10:05 a.m. local time (UTC+7). He was buried at Karet Bivak Cemetery that evening, after ceremonies at the IKJ led by the author Sutan Takdir Alisjahbana and prayers at the Amir Hamzah Mosque in Ismail Marzuki Hall led by the poet Taufiq Ismail. Among the mourners were the former Minister of Information Boediardjo, the Minister of Education and Culture Fuad Hassan, and the Deputy Governor of Jakarta Anwar Umar.
Djajakusuma had never married, but left behind several nieces and nephews whom he had raised as his own children. After his death, newspapers throughout Jakarta carried obituaries by such cultural and film figures as Alisjahbana, the producer Misbach Yusa Biran, and the Perfini cameraman Soemardjono. These obituaries emphasised Djajakusuma's role in the development of the Indonesian film industry and the preservation of traditional culture. In a ceremony commemorating the fifth anniversary of Djajakusuma's death, all his documents and books were donated to the IKJ library.
## Style
Like Usmar Ismail, Djajakusuma was influenced by realism. However, while Ismail preferred to focus on national-level themes, Djajakusuma was more drawn to simple, locally relevant storylines with educational messages. This realism carried over into Djajakusuma's work in wayang. The settings, traditionally drawn, were instead created as three-dimensional sets, including representations of trees, rocks, and water. According to Soemardjono, who often edited Djajakusuma's films, the director enjoyed experimenting with new techniques to better convey his intentions.
Djajakusuma often included traditional arts in his films, and two of them (Lahirnja Gatotkatja and Bimo Kroda) were based on traditional wayang stories and used wayang-inspired costumes and tempos. This focus on aspects of traditional culture fell out of the mainstream after 1965, having been replaced by films about city life. Djajakusuma's theatrical productions experimented with new storytelling techniques, adapting the traditional styles for the modern world. As a lecturer teaching screenwriting and the history of theatre, Djajakusuma focused on Indonesian arts. He argued that Indonesians should rely on local culture, not continuously look towards the West. In other areas he was mostly apolitical.
The Indonesian sociologist Umar Kayam, who had served on the Jakarta Art Council with Djajakusuma, described the director as highly disciplined. Biran described him as having a fiery temper which could be triggered suddenly, yet quick to calm when the trigger was removed; this sentiment was echoed by several people who had worked with Djajakusuma. Coverage in the film magazine Djaja described him as hardworking and highly dedicated to his craft, to the point of forsaking romantic relationships.
## Reception
Djajakusuma's film Harimau Tjampa garnered him the Best Screenplay Award at the 1954 Asian Film Festival. His later film Bimo Kroda was recognised by the Indonesian Department of Information for promoting traditional culture. In 1970 he received an Art Award from the Indonesian government for "his service to the State as the Main Promoter of the Development of Modern Drama". At the 1987 Indonesian Film Festival, he received a special award for his contributions to the film industry, and in November 2003 he was posthumously granted a Budaya Parama Dharma Award by President Megawati Sukarnoputri for his contributions to the development of Indonesian culture.
Critical reception has been positive. The award-winning director Teguh Karya cited the works of Djajakusuma, Usmar Ismail, and Asrul Sani as "legendary" and among his greatest influences. Choreographer Bagong Kussudiardjo reportedly so respected Djajakusuma that he had named his son Djadoeg after the director. According to a memorial in the newspaper Kompas, Djajakusuma was also dubbed a "living legend" during his visit to Nantes. A later Kompas article records that Djajakusuma's best-remembered works are Harimau Tjampa and Tjambuk Api. Those two works are those most often shown, as ready-to-use copies are stored at Sinematek Indonesia; his other surviving films are kept as negatives.
## Filmography
### Cast
- Perempuan dalam Pasungan (Girl in Stocks; 1980) – as Mr. Prawiro
### Crew
- Enam Djam di Jogja (Six Hours in Yogya; 1951) – as screenwriter
- Embun (Dewdrop; 1951) – as director and screenwriter
- Dosa Tak Berampun (Unforgivable Sin; 1951) – as screenwriter
- Terimalah Laguku (Take My Song; 1952) – as director
- Harimau Tjampa (Tiger from Tjampa; 1953) – as director and screenwriter
- Putri dari Medan (Girl From Medan; 1954) – as director
- Mertua Sinting (Insane Parents-in-Law; 1954) – as director
- Arni (1955) – as director
- Tjambuk Api (Whips of Fire; 1958) – as director
- Pak Prawiro (Mr. Prawiro; 1958) – as director and screenwriter
- Pedjuang (Warriors for Freedom; 1960) – as production manager
- Mak Tjomblang (Mrs. Tjomblang; 1960) – as director and screenwriter
- Lahirnja Gatotkatja (Birth of Gatotkatja; 1960) – as director and screenwriter
- Masa Topan dan Badai (Time of Cyclones and Storms; 1963) – as director
- Rimba Bergema (Echoing Jungles; 1964) – as director
- Bimo Kroda (1967) – as director
- Malin Kundang (Anak Durhaka) (Malin Kundang [Faithless Child]; 1971) – as director
- Api Dibukit Menoreh (Gugurnya Tohpati) (Fire on Mount Menoreh [Death of Tohpati]; 1971) – as director
- Bang Kojak (Brother Kojak; 1977) – as producer
## Explanatory notes
# Mathew Charles Lamb
Mathew Charles "Matt" Lamb (5 January 1948 – 7 November 1976) was a Canadian spree killer who, in 1967, avoided Canada's then-mandatory death penalty for capital murder by being found not guilty by reason of insanity. Abandoned by his teenage mother soon after his birth in Windsor, Ontario, Lamb had an abusive upbringing at the hands of his step-grandfather, leading him to become emotionally detached from his relatives and peers. He developed violent tendencies that manifested themselves in his physical assault of a police officer at the age of 16 in February 1964, and his engaging in a brief shoot-out with law enforcement ten months later. After this latter incident he spent 14 months, starting in April 1965, at Kingston Penitentiary, a maximum security prison in eastern Ontario.
Seventeen days after his release from jail in June 1966, Lamb took a shotgun from his uncle's house and went on a shooting spree around his East Windsor neighborhood, killing two strangers and wounding two others. He was charged with capital murder, which under the era's Criminal Code called for a mandatory death penalty, but he avoided this fate when the court found, in January 1967, that he had not been sane at the time of the incident. He was committed for an indefinite time in a psychiatric unit. Over the course of six years in care at Penetanguishene Mental Health Centre's Oak Ridge facility he displayed a profound recovery, prompting an independent five-man committee to recommend to the Executive Council of Ontario that he be released, saying that he was no longer a danger to society. The Council approved Lamb's release in early 1973 on the condition that he spend a year living and working under the supervision of one of Oak Ridge's top psychiatrists, Elliot Barker.
Lamb continued to show improvement, becoming a productive laborer on Barker's farm and earning the trust of the doctor's family. With Barker's encouragement, Lamb joined the Rhodesian Army in late 1973 and fought for the unrecognised government of Rhodesia (modern-day Zimbabwe) for the rest of his life. He started his service in the Rhodesian Light Infantry, and won a place in the crack Special Air Service unit in 1975, but was granted a transfer back to his former regiment a year later. Soon after he was promoted to lance-corporal, Lamb was killed in action on 7 November 1976 by friendly fire from one of his allies. He received what Newsweek called "a hero's funeral" in the Rhodesian capital, Salisbury, before his ashes were returned to Windsor and buried by his relatives.
## Early life
Mathew Charles Lamb was born in Windsor, Ontario on 5 January 1948, the only child of a 15-year-old mother who abandoned him soon after birth. Raised by an assortment of grandparents, aunts and uncles, he rarely saw his mother while growing up and never knew his father, who died in the United States while Lamb was young. Lamb spent most of his childhood with his maternal grandmother and her new husband Christopher Collins at their home on York Street in the South Central neighbourhood of Windsor, where his presence was resented by the step-grandfather Collins. According to interviews with relatives, friends and neighbours conducted by Lamb's legal counsel Saul Nosanchuk in the mid-1960s, Collins subjected the boy to sustained emotional and physical abuse, beating him and frequently calling him a "little bastard". The direction of this violence was not limited to Lamb himself, however; he often witnessed his step-grandfather and grandmother fighting while he was still a small boy.
Lamb started exhibiting violent traits of his own from an early age. Nosanchuk writes that the young boy lured his cousins into his bedroom, locked them in a closet and threatened them. On one occasion he followed through with these threats and beat one of his cousins so badly that medical attention was required at a local hospital. "I remember once," said Greg Sweet, a childhood friend, "when he was about seven years old, he held a knife to a smaller kid and made him eat dog faeces". Lamb first attended Colbourne School in Riverside where Collins later said he appeared to be normal. School staff agreed, later telling the Windsor Star that he rarely got into trouble, and was capable, but unable to concentrate for extended periods.
Starting with Grade 8, when he was 13, Lamb went to St Jude's School in Windsor, where the other pupils found him to be distant and quiet. According to a fellow pupil, he spurned attempts by the other children to include him in their social circles. For example, Lamb once refused an invitation to a party, saying that he "didn't like to dance." Developing a keen interest in weapons, Lamb began to carry a knife to school, which he had little hesitation in showing off. He also became fascinated with firearms; according to Sweet, he and Lamb "always had guns, from the time [they] were about 12 years old". Sweet later told the Windsor Star that police were not informed when, as a teenager, Lamb strolled down a residential street "firing a shotgun at the houses of people he didn't like". Sweet also said that around this time Lamb assembled a collection of bullets and wrote the names of various local policemen on them. Lamb's hobby even extended to crude bombs, which he taught himself to produce using parts of various weapons. When he accidentally detonated one of these concoctions during preparation his leg was sprayed with shrapnel.
On 10 February 1964, barely a month after he turned 16, Lamb confronted a physically imposing police sergeant outside the Windsor Arena and, in front of a large crowd of people, leaped upon the far larger man and repeatedly punched him in the face. According to journalist Bob Sutton's account in the Windsor Star (published three years later), Lamb did this "for no apparent reason". Lamb was convicted of assault under the Juvenile Delinquents Act and served six months at the House of Concord, a young offenders' unit near London, Ontario, run by the Salvation Army. Upon his release, Lamb was sent by his step-grandfather to live in East Windsor with his uncle, Earl Hesketh. With Hesketh's support, Lamb briefly attended Assumption College School, where apart from a dislike for learning Latin, he performed creditably. However, with no real motivation to study, the boy soon dropped out to look for work. He was unable to hold down a permanent job and drifted through a series of short-term engagements, none lasting more than three months.
## Kingston Penitentiary
On the evening of 24 December 1964, Lamb smashed the front window of Lakeview Marine and Equipment, a sporting goods store in Tecumseh, and stole three revolvers and a double-barrelled shotgun. Using one of the revolvers, he fired twice on a police constable and the shop's co-owner, missing both times. The officer returned several shots, leading Lamb to come forward with his hands raised. "Don't shoot. I give up," he said. He then showed the constable where he had hidden the other two handguns and the shotgun. Lamb, who turned 17 during the trial, was tried and convicted as an adult for "breaking, entering and theft ... [and] possession of a .22 caliber revolver, dangerous to public peace". Motivated by a presentence investigation report characterizing Lamb as exceptionally violent, Magistrate J. Arthur Hanrahan sentenced him to two years at Kingston Penitentiary, a maximum security prison. According to Nosanchuk's account, the severity of the sentence was unusual for a first-time adult offender who had not caused anybody physical harm. Hanrahan, Nosanchuk writes, must have deemed Lamb beyond rehabilitation. The boy arrived at the penitentiary in April 1965.
Psychiatric examinations and psychological tests conducted on Lamb at Kingston revealed an extremely immature young man who was strongly drawn to weapons. The prison doctors noted that the boy was very aggressive, did not tolerate discipline and had very little control over his behaviour. Soon after he arrived, Lamb assaulted another prisoner and had to be put into solitary confinement. The prison's director of psychiatry, George Scott, said that the boy had shown signs of "an obvious mental breakdown". Not long after this, Lamb knelt beside his bed and pushed a broom handle into his rectum. When he was discovered in this state by a guard, Scott examined the boy immediately, having to sedate him to do so. "I think this young man is developing a mental illness of hypomanic nature," he wrote in his report. In further interviews conducted by Scott, Lamb related what the doctor described as "elaborate fantasies involving robberies, fights, and shootings that demonstrated enormous hostility".
Lamb attempted suicide several times and for years afterward bore scars where he had tried to slit his wrists. According to Nosanchuk, by early March 1966 the prisoner's behaviour "bordered on psychotic". During this month he threw food at an officer and was once again found with a broomstick in his rectum, this time dragging it around the floor of his cell and laughing. When Scott sedated Lamb and questioned him on the latter incident, the boy said he had just been trying to annoy the guard on duty. Scott once again noted his concern that Lamb was developing a hypomanic condition, a mania of low intensity, and on 18 March committed him to Kingston Psychiatric Hospital for a month. Scott wrote in his report to the hospital that he was not sure whether Lamb's condition was genuine or whether he was just putting on an act.
Lamb returned to the penitentiary on 18 April 1966 with a report saying that, if released, he would probably relapse into recidivism. Scott grew nervous as Lamb's release date approached: he believed that allowing Lamb to return home could be dangerous for the community, but at the same time he was not certain about the young man's psychiatric state, describing it as "borderline" or "marginal". Lamb had, Scott noted, shown some minor improvement since his time in the hospital. Even discounting this, the symptoms observed in Lamb were not consistent and the doctor did not think he had evidence conclusive enough to certify Lamb as mentally unsound. He even still considered that the boy could just be playing immature games with the penal system. In this uncertainty, Scott resolved that he could not bar Lamb's release. The 18-year-old boy left Kingston on 8 June 1966, ten months ahead of schedule, and returned home to Windsor. He was taken in by another uncle, Stanley Hesketh, who lived at 1912 Ford Boulevard. Lamb secured a job as a woodworker on his release and after starting work showed no signs of irregular conduct.
## Shooting spree
### Incident
Seventeen days after his release from Kingston Penitentiary, on the evening of 25 June 1966, Lamb discovered a shotgun in his uncle's house. He took the weapon and left the house shortly before 22:00 Eastern Time, then walked a single block north along Ford Boulevard and hid behind a tree outside number 1872. Six young people—Edith Chaykoski, 20, her 22-year-old brother Kenneth, his wife and three friends, 21-year-old Andrew Woloch, Vincent Franco and Don Mulesa—were heading south from 1635 Ford Boulevard on their way to a bus stop on Tecumseh Road when they approached the tree behind which Lamb was hiding at about 22:15. Lamb suddenly stepped out in front of the strangers, pointed the shotgun at them and said "Stop. Put up your hands!" When Edith Chaykoski stepped forward, towards Lamb, he shot her in the abdomen. Woloch then moved and was hit in the stomach by a second shot, which also wounded Kenneth Chaykoski. Lamb then ran across the street to 1867 Ford Boulevard and fired on a girl whose silhouette he had spotted in a side doorway of the house; his target, 19-year-old Grace Dunlop, was injured. As law enforcement and medical assistance were summoned, Lamb strolled away and walked two blocks before knocking on a door, which he had seemingly chosen at random. Pointing the shotgun at the elderly lady who lived there, Ann Heaton, he threatened to kill her. When Heaton cried out to her husband Forrest to phone the police, Lamb fled, throwing the shotgun over the old couple's fence into a field. He returned to the Hesketh house and went to sleep.
Edith Chaykoski died from her wounds at Windsor Metropolitan Hospital at about 05:30 on 26 June. Police searched the neighborhood during the morning and found the shotgun where Lamb had thrown it. They identified it as Hesketh's and concluded that the 18-year-old must have taken it and gone on a shooting spree the previous day. Lamb was arrested at 15:30 on 26 June and charged with the capital murder of Edith Chaykoski. Under the terms of Canadian law at that time, he faced a mandatory death penalty if convicted. When Woloch's injuries also proved fatal at 14:45 on 11 July 1966, his murder was added to Lamb's charge.
### Psychiatric examination
On the morning of 27 June 1966, Lamb appeared without legal counsel at Essex County Magistrate's Court in Windsor, where he was remanded for psychiatric examination. As he was being escorted from the courthouse at around noon, the boy attempted to escape custody and, when restrained, begged the officers to shoot him. A private psychiatrist from Windsor, Walter Yaworsky, gauged the teenager's mental state in an interview starting at 12:30. Yaworsky said that Lamb was "hyperactive and agitated"; he was unable to sit still and periodically rose from his seat and paced around the room. He was silent for a few minutes, apparently irritated, then began laughing as if in a state of euphoria. When questioned by Yaworsky directly, Lamb did not appear concerned about the interview: he treated his murder charge lightly and when asked about his spell in Kingston Penitentiary began laughing.
After dismissing a few more questions as "unimportant", the 18-year-old giggled childishly and said he "needed a lawyer". Lamb's conversation with the doctor continued incoherently, with Lamb "leaping from topic to topic", in Yaworsky's words. The young man continued to rise and pace around the room as the interview went on; he spoke in a casual, off-hand manner, giving non-specific answers to the doctor's questions and describing people especially vaguely. When asked about his parents, he simply said, "I don't remember." Yaworsky then inquired where his mother was, leading the boy to laugh as he replied, "I don't remember. Somewhere." When the doctor finally asked directly about the night of the shootings, Lamb said that he could not recall shooting anybody and that all he remembered was going home in a taxi, then being awoken by his uncle shaking him.
When the interview ended at 13:35, the doctor noted that he found Lamb's hour-long maintenance of this seemingly hypomanic behavior "remarkable". Simulation on Lamb's part was unlikely, Yaworsky believed, and lack of memory credible. The doctor wrote in his report that Lamb had been "suffering from a disease of the mind" at the time of the shootings, which had caused him to be in a kind of dream world, outside of reality. Standing before the magistrate that same afternoon, Yaworsky testified that Lamb was mentally unsound and not fit to stand trial. The magistrate once again remanded Lamb, this time to custody at Penetanguishene Mental Health Centre for a minimum of 30 days. Lamb was again examined on 29 June 1966, this time by James Dolan, the Clinical Director of Psychiatry at St Thomas Elgin General Hospital.
Lamb described the incident for Dolan in far more detail. He said that on 25 June he had finished his woodworking job at 15:00, refused to work overtime (having put in 62 hours that week), then drunk six beers at home during the afternoon and gone to sleep at 21:00. He woke up soon after, he said, and loaded his uncle's shotgun, intending to kill himself. "Next thing I knew," he continued, "I was on the street." He told Dolan that he had seen people "as if they were on television". He heard the sound of a gun being fired as if it were coming from far away, and remembered a voice that was not his own saying "Put up your hands". He then remembered hearing the dim, far-off gun again, a vision of a terrified man standing before him, then yet another shot. He told Dolan he remembered crossing the street, seeing a girl's outline in a doorway and "somehow" shooting at her. Lamb then said he had fired at a passing car, and that "everything seemed unreal." He said that the next thing he had memory of was confronting an elderly lady at a house nearby, suddenly thinking "what the hell am I in here for" and leaving. He concluded his account by saying that he had hailed a cab from the corner of Pillette and Tecumseh Road and gone back to his uncle's house.
After 30 days, on 27 July, the staff at Penetanguishene still deemed him unfit to stand trial, but by 27 August 1966 his state had improved enough for the doctors to allow his return to custody at Windsor. The doctors reported that he could now face the court—they said that the young man was now able to understand what the proceedings against him meant, and capable of working alongside a legal advisor. Because Lamb could not afford legal counsel, Justice Saul Nosanchuk was assigned by a local legal aid plan to advise him in the upcoming trial. Nosanchuk says that Lamb had "no hesitation" in signing a paper authorising the justice to represent him.
### Mental disorder defence
Nosanchuk quickly resolved that the only way to win the case and prevent the teenager from being hanged was to explore the background and circumstances both of the incident and of Lamb himself with a view to a mental disorder defence. At their first interview together, Nosanchuk says that Lamb was a "slightly built, almost frail, quiet and detached 18-year-old ... very boyish in appearance. He had an exceedingly polite demeanour. He was most appreciative and grateful that I had undertaken his defence." When Nosanchuk asked the boy about the events of 25 June, Lamb became hesitant, disjointed and confused in his conversation; "he seemed to view these events as if he was not really involved in them," Nosanchuk recalled. Lamb said that although he did have a faint recollection of what had happened, he could not remember taking the shotgun, loading it or shooting anybody. He said that earlier during the afternoon on 25 June he had watched a film on television in which somebody had shot and killed several people, but did not make clear whether he had been re-enacting this. The lawyer then asked Lamb to discuss his personal background, childhood and family. The 18-year-old was very reluctant to do so and when he did, Nosanchuk says that the boy seemed emotionally detached from the relatives and events he described.
Lamb appeared to the justice to be a profoundly troubled young man. There was no doubt that he had killed Chaykoski and Woloch, that he had not acted in self-defence and that there was no evidence of provocation; the chances of reducing the murder charge to manslaughter were therefore slim. Nosanchuk felt obliged, in the light of Lamb's psychiatric and personal history, to explore an insanity plea. The lawyer interviewed Yaworsky and Dolan, secured them as witnesses for the trial, and also reviewed a report from four doctors at Penetanguishene who had previously interviewed his client. According to this report, Lamb had been amiable, fluent in his conversation and apparently plausible; he at first claimed amnesia for the events of 25 June, but on continued questioning related what had happened "in detail". He admitted his responsibility but still did not appear to be concerned about what had happened. He did not seem to be able to emotionally appreciate the consequences of his actions, even though he understood on an intellectual level that he had shot at some people and that to do that was wrong. When he told the psychiatrists that he regretted what he had done, they wrote that he "clearly did not have any real underlying feeling of remorse". He also did not appear to have considered that he might have any illness, mental or otherwise. The Penetanguishene report concluded that Lamb "suffered from a disease of the mind as a pathological anti-social or psychopathic personality", which was a recognised psychiatric disorder under the Criminal Code and therefore grounds for an insanity defence in court.
However, Nosanchuk was still not certain that he would attempt to defend Lamb on mental grounds. In Ontario at that time, a defendant found not guilty under these terms remained imprisoned indefinitely unless an order for his release came from the province's Executive Council, acting on the advice of a Review Board including a Supreme Court judge. Yaworsky warned Nosanchuk that even if he used an insanity defence and won, Lamb would probably be committed to an institution for the criminally insane for life. It was still possible for the lawyer to approach the prosecution and propose a plea bargain, offering to plead guilty to non-capital murder, which would result in life imprisonment for Lamb but allow a parole hearing after 10 years. In any case, Nosanchuk could not use the case he had prepared unless he first received unequivocal written instruction from his client to plead insanity. Having been given a week by his counsel to consider the matter, Lamb wrote to Nosanchuk that he wished to attempt a mental disorder defence. The boy made clear his understanding that if he were found not guilty under these terms, the state still had the right to detain him for the rest of his life.
Nosanchuk then considered whether or not to call Lamb to the stand on his own behalf at the trial. He already had a strong defence, with five psychiatrists firmly behind him as well as George Scott of Kingston Penitentiary, who had agreed to testify on his behalf. The lawyer considered the boy's probable reaction to a cross-examination by the prosecution in court. Lamb appeared calm and collected on the surface, and had so far answered questions about right and wrong in a somewhat rational way. Although the doctors had been able to see Lamb's answers for what they really were, Nosanchuk feared that Lamb's appearance could actually lead the jury to believe that he was sane, which might lead to a guilty verdict and the gallows. He therefore did not call Lamb to the stand.
## Trial
After a brief preliminary hearing starting on 8 October 1966, during which Lamb reportedly showed no signs of emotion, the young man's trial for capital murder began on 16 January 1967 at Essex County Courthouse in Windsor. Because of the charge's severity, the case was heard by a judge and jury under the auspices of Ontario's Supreme Court, which chose Justice Alexander Stark to preside. The trial started with Lamb pleading not guilty to the capital murder of Edith Chaykoski and Andrew Woloch; Nosanchuk then opened his mental disorder defence under Section 562 of the Canadian Criminal Code. Stark gave an order allowing all relevant psychiatric doctors to remain, then allowed the Crown to open its case against Lamb.
### Testimonies and cross-examinations
#### Prosecution
The prosecutor, Eugene Duchesne QC, began his case by calling Mathew Lamb's uncle, Stanley Hesketh, to the stand. Hesketh testified that three hours after the shootings his nephew had told him that he "must have done it". He said that since his nephew had come to live with him following his release from prison, he had always been exceptionally polite and helpful to his family. However, during the morning of 26 June Lamb had for the first time been hesitant, not giving full answers to questions and generally acting in a less open manner. Duchesne then called witnesses of the shootings, who agreed unanimously that Lamb had been unhurried, cool and collected. When cross-examined by Nosanchuk about this unusual calmness, they said that Lamb had appeared very distant and did not seem to have any awareness of what was happening around him: one of the witnesses said that Lamb had not appeared to even notice a very loud party in progress across the street. Heaton said that the boy had looked frightened and had fled without reason when she called to her husband. The police officers who arrested Lamb also agreed that the defendant had been unusually cool and silent. Hesketh told the court that when he had come home his nephew had been fast asleep in bed.
#### Defence
The defence began to present its psychiatric evidence on the third day of the trial, 18 January 1967, when Yaworsky was called to testify on Lamb's behalf. Yaworsky recounted in detail his examination of the defendant two days after the shootings; he put weight on the fact that Lamb had laughed while incoherently describing the events of 25 June, and had at one point giggled and exclaimed "poor broad", referring to Edith Chaykoski. So far Lamb had sat through the trial in muted silence, showing no emotion whatsoever, but when Yaworsky mentioned the boy's having "giggled" at this point of the 27 June interview, Lamb did so again in a similar manner. Yaworsky said that he had interviewed Lamb four more times between December 1966 and the trial, and that in these discussions the young man had been able to remember more about the incident; Lamb told Yaworsky that he remembered confronting the people in the street, but that everything had then "felt fuzzy" when he fired the gun. Yaworsky quoted Lamb: "It was as if I was invisible. ... The next clear memory I have is standing minutes later in the Heaton living room. All of a sudden, I was standing there with a gun in my hand—that is when I ran out." The doctor hypothesised that this had been when Lamb returned to the real world following a psychopathic episode during which he had been divorced from reality. Dolan then testified along similar lines, describing his interview with Lamb two days after Yaworsky's and telling the court that he also believed that Lamb had had a psychotic break that had made him unable to appreciate "the nature and quality of the act of killing another human being".
In his cross-examination of the two doctors, Duchesne cited the psychological reports filed on Lamb at Penetanguishene in 1966, which had determined Lamb to have an IQ of 125, far above most of his 18-year-old peers. The prosecutor proposed that it was not beyond Lamb, with his psychopathic personality and high level of intelligence, to invent a story of amnesia and confusion to avoid responsibility for satisfying his dangerous impulses by consciously killing people. Both Yaworsky and Dolan said that although this was possible, they were each sticking to their original conclusions made in the days following the incident.
John Robinson, the governor of Essex County Jail, was then called by the defence. Robinson testified that during Lamb's time in the county jail, his conduct had been impeccable except for an incident five weeks before the trial, on 10 December 1966, when he had, for no apparent reason, gone on what Robinson called a "rampage". In an episode lasting three hours, the defendant had smashed over 100 windows, set fire to blankets and broken plumbing, causing cells to flood. "I was amazed by what I saw," Robinson said. "The pupils were dilated like someone who comes in heavy on narcotics—except their eyes appear sunk in and his were bulged out." George Scott of Kingston Penitentiary then told the court that Lamb lived in a fantasy dream world, which had existed in his mind since early childhood, and had been in a pre-psychotic state when released from jail on 8 June 1966. This, he said, had boiled over into an "acute schizoid episode" on the night of the shootings. In cross-examination, Scott was pressed as to why he had allowed Lamb's release from Kingston if this was the case; he replied that although the prison officers had been concerned about Lamb's mental state on his release, there had not been conclusive grounds to certify him insane at that time.
Three more psychiatrists from Penetanguishene were then called by Nosanchuk. George Darby told the court that in his conversations with Lamb the defendant had changed his story three times. He considered Lamb to be anti-social, aggressive and psychopathic—unable to appreciate the incident's consequences with any depth of feeling. Elliot Barker then testified that Lamb had told him in an interview that he treated all people "like bugs" except for his uncle and grandmother; killing a human being, Barker told the court, meant nothing more to Lamb than swatting a fly. Barry Boyd then confirmed what Barker had said, and quoted something Lamb had told him in an interview: "I hate everybody on the street. I probably will kill someone else before I die—it doesn't bother me—it's like killing a bug." Elizabeth Willet, a psychologist from the Penetanguishene unit, then testified that in her tests Lamb had indeed been revealed to have a high IQ, but had also been shown to have the emotional maturity of a small child, aged between 3 and 6. He had few defence mechanisms, she said, and when confronted by impulses acted them out almost without exception and without appreciating their consequences.
#### Prosecution
Duchesne now called on Basil Orchard and Wilfred Boothroyd to counter the defence's evidence. Orchard, another doctor from Penetanguishene, testified that Lamb had suddenly abandoned any show of amnesia during an interview in August 1966. He saw no evidence that Lamb was insane and said that he was simply a young man with strong impulses who sometimes could not control them. Boothroyd, of Sunnybrook Hospital in Toronto, then spoke, arguing that Lamb had been acting out strong feelings of anger and bitterness and fully intended to kill the people he confronted, knowing and understanding what that meant. Lamb, he said, was capable of all kinds of emotion and was perfectly able to understand the nature of what he had done. Justice Stark intervened during Boothroyd's testimony, asking how he could give a reliable opinion on Lamb when he had never examined him. Stark also remarked that Boothroyd's opinion was contrary to that of every doctor who had testified so far.
### Closing statements
#### Defence
Nosanchuk gave his final statement to the court first. Speaking for the defence, he reminded the jury that every one of the doctors speaking on Lamb's behalf had originally been engaged by the state and that Yaworsky was the only one not employed in the public sector. He conceded that the defendant's actions were senseless and violent, but stressed that if found not guilty by reason of insanity, Lamb would stay in custody and could be kept in psychiatric care for the rest of his life if necessary. He argued that Lamb's actions on the evening of 25 June 1966 clearly made no sense and asked the jury to carefully consider them: the victims were not known to Lamb; Lamb made no attempt to disguise himself; Lamb acted alone; Lamb had nothing to gain from the act; Lamb inexplicably diverted his fire from the original group to a shadow in a doorway; Lamb then chose another house at random and threatened to kill the occupant, then left without doing anything; Lamb made no attempt to hide the weapon, leaving it in a nearby field where it would surely be found; then, finally, the boy simply went home to bed as if nothing had happened. Was this, Nosanchuk asked, the behaviour of a premeditated killer, or of a profoundly disturbed young man who did not appreciate what he was doing?
#### Prosecution
In response, Duchesne gave what Nosanchuk writes was a "well-reasoned argument", to which the jury paid rapt attention. Duchesne declared that the incident had been the cold, calculated murder of a defenceless group of innocent young people by an ex-convict, released only three weeks earlier and already known as an anti-social psychopath. The defendant had given wildly conflicting stories to different doctors, Duchesne said, yet had convinced each one. Duchesne said that this was possible as Lamb was highly intelligent and able to project a false front if it suited him. The prosecutor then moved off the topic of Lamb's mental state and focussed on the matter of criminal responsibility in general. Five out of the eight psychiatrists in the court, the prosecutor said, had testified that Lamb had understood on an intellectual level that shooting Chaykoski and Woloch would kill them, which he asserted should be enough to incur criminal responsibility, even taking the psychiatric evidence into account. Finding Lamb not guilty by reason of insanity, he reasoned, would set an inexpedient precedent in Canadian law and cause a spate of similar insanity defence attempts.
### Verdict
Justice Stark then reviewed the evidence and advised the jury that in his opinion the weight of psychiatric evidence favoured the defence; however, he reminded them, it was up to them to decide. The jury retired at 16:30 on 20 January 1967 to make their decision, and returned to the courtroom shortly before 19:00 to give their verdict. They found Lamb to be not guilty by reason of insanity. Lamb showed no reaction when the verdict was read.
## Psychiatric care
### Treatment at Oak Ridge
As had been made clear several times before and during the trial, Lamb's court victory did not make him a free man. He was escorted by police back to Penetanguishene and placed in the hospital's maximum security unit at Oak Ridge, where he was to remain indefinitely pending an order from the Ontario Executive Council.
Elliot Barker, the head of Oak Ridge's therapeutic division, had already interviewed Lamb in 1966 and spoken on his behalf at his trial. The doctor had arrived at Penetanguishene in 1959, and in 1965 stepped up his efforts to reform the unit's programmes, which on his arrival were still based around the traditional methods of neuroleptic tranquillisation and electroconvulsive therapy, supplemented by long periods of isolation for each inmate. Barker innovated a programme whereby the patients could spend more of their time in each other's company, in a more natural environment; he believed that the key to overcoming these illnesses was communication. "My original vision," he writes, "was that I wasn't really dealing with patients. I thought we could evolve a social structure where people could resolve the internal conflicts in community." Barker's "Social Therapy Unit" (STU), initially made up exclusively of young male psychopaths and schizophrenics of normal intelligence, began in September 1965, with a programme of 80 hours of treatment a week, focussing on cures brought about by mutual cooperation and interaction. Joan Hollobon, the medical editor of the Toronto Globe and Mail, volunteered in 1967 to spend two days at Oak Ridge as if she were a patient, and afterwards heaped praise on the inmates, saying that they were "pioneering a brave and exciting experiment in self-government and self-therapy ... [displaying] individual responsibility, co-operation with colleagues and authority, and acceptance of rules reached by consensus."
In August 1968 the unit created a "Total Encounter Capsule", which was a windowless, soundproofed room, 8 feet (2.4 m) wide and 10 feet (3.0 m) long, with green-painted walls, a green wall-to-wall mat on the floor and a ceiling containing a one-way mirror. It was empty apart from a sink and lavatory. In one of the earliest uses of videotape in therapy, television cameras were trained through the mirrored ceiling and through holes in the walls. Liquid nourishment was provided through drinking straws that were built into the door. The Capsule's purpose, Barker writes, was to provide "a place of undisturbed security where a small group of patients could focus on issues they felt important enough to warrant the exclusion of the usual physical and psychological distractions." Though participation in the STU programme was required, entering the Capsule was voluntary, and each patient could choose how many days he spent inside. Groups numbered between two and seven and stayed in the room for as little as 24 hours or for sustained periods as long as 11 days. Because Barker believed that they were more inclined to reveal their inner selves if unclothed, the inmates entered the Capsule naked. To further encourage communication, they were administered with LSD-25. The room was lit at all times, making day indistinguishable from night. While members of the programme were inside the Capsule, other patients operated the room and watched over those inside, running the cameras, keeping records and maintaining an appropriate room temperature.
Following his arrival in January 1967, Lamb enthusiastically took part and thrived in Barker's new programmes, becoming, the Montreal Gazette writes, "a model inmate". He became widely respected by his fellow patients and was successfully nominated as the ward's "patient therapist". "He was helpful to the other patients," Barker told the Globe and Mail, "and they looked up to him." Barker elaborated on this subject in an interview with the Windsor Star, telling them that during 1972 Lamb had been "one of the most respected therapists in the hospital". Lamb started a newspaper at Oak Ridge, for which he wrote articles while also encouraging others to contribute. Barker and his colleagues were so impressed by the young man's progress that they began to take him to lectures at Ontario Police College in Aylmer, where they introduced him as evidence of rehabilitation's potential. After about five years at Oak Ridge, the matter of Lamb's liberty was taken up by a five-man Advisory Review Board made up of Ontario Supreme Court Justice Edson Haines, two independent psychiatrists unrelated to Lamb's case, a lawyer and a lay person. The advisory board's recommendation that Lamb be released was approved by the Ontario Executive Council in early 1973; the board gave him a clean bill of health and said he was no longer dangerous.
### Release and further improvement
The conditions of Lamb's release were that he must spend a year living with the Barker family on their 200-acre (0.81 km<sup>2</sup>; 0.31 sq mi) farm, under the doctor's observation. The former inmate proved to be an industrious labourer, helping to fence the property and becoming one of the farm's best workers. Barker and his wife came to trust Lamb so closely that they allowed him to babysit their three-year-old daughter, who became very attached to the young man. During his time living and working on the farm, Lamb read a number of books on psychiatry, including The Mask of Sanity by Hervey M. Cleckley, which affected him particularly. He told the doctor that he had come to terms with his condition as a psychopath and that he wished to go overseas and do something purposeful with his life. At the same time, he considered a career in the military, which Barker supported. "He wanted that kind of life," Barker later told the Globe and Mail. "He really seemed to need the esprit de corps of an army organisation." When Egypt and Syria attacked Israel on 6 October 1973, starting the Yom Kippur War, Lamb thought he had found his calling—using money he had saved from his labourer's salary and gifts from his grandmother, he bought State of Israel Bonds and, with Barker's encouragement, travelled to Israel to volunteer for the Israel Defense Forces. However, after hitch-hiking to the Israeli lines, Lamb became disillusioned by conversations he had with the soldiers there, many of whom were loath to fight and wanted to go home. He applied anyway but was turned down because of his psychiatric history. He resolved to instead tour the world, and to that end left Israel days after arriving, intending to travel to Australia.
## Military career in Rhodesia
On his way to Australia in October 1973, Lamb stopped off in South Africa and Rhodesia (today Zimbabwe), where he cut his travels short to enlist in the Rhodesian Army. According to Barker, Lamb travelled to Africa with this intention all along. Rhodesia's unrecognised and predominantly white government was at that time fighting a war against communist-backed black nationalist guerrillas who were attempting to introduce majority rule. Like most of the foreign volunteers in the Rhodesian forces, Lamb was assigned to the Rhodesian Light Infantry (RLI), an all-white heliborne commando battalion engaged largely in counter-insurgency operations. He and the other foreign soldiers received the same pay and conditions of service as the Rhodesians they served alongside. Keeping his past a secret, he became a highly regarded and popular member of 3 Commando, RLI, noted for his professionalism and physical fitness. "The Rhodesians thought he was a first-class soldier," Barker later told the Globe and Mail.
Lamb visited his aunt and uncle in Windsor on leave in May 1975, "proudly wearing his uniform", journalist Tony Wanless writes. Turned out in the RLI's tartan green ceremonial dress and green beret, he was conspicuous walking along Ouellette Avenue, one of the city's main thoroughfares. Coincidentally, a funeral procession was being held for Edith Chaykoski's grandmother along that very street at the same time, leading Edith's younger brother Richard to spot Lamb on the pavement. The soldier remained oblivious, but his presence horrified the Chaykoski family. "He had the uniform and looked a little different," Richard told the Windsor Star a year later, "but I never forgot his face." Chaykoski's mother was so upset by the incident that for some time afterwards she refused to leave the house alone. While staying with the Hesketh family, Lamb went to see Barker and told him that serving in the Rhodesian security forces had enriched him personally and made him respect himself for the first time. Because of this he wished to forget about his previous life in Canada; in particular he said he "didn't want it associated with his adopted country". He expressed his concern that if he were killed or captured, the Canadian press might reveal his prior history and embarrass the Rhodesian Army, the Canadian government and the Penetanguishene mental hospital. However, he said, he felt great loyalty towards Rhodesia and would still go back to continue his service.
Lamb was deeply saddened by the bias he perceived the Western media to have against the Rhodesian government and army, but was reportedly conspicuous for leaping to the defence of any black Rhodesians he thought were receiving bad treatment. "He sympathised with the blacks," Barker told the Windsor Star, "but believed that chaos would result if they took over immediately. He used to scrap with other soldiers who treated blacks badly. He was very bright and knew the blacks would eventually take over the country."
Soon after his furlough to Canada, Lamb was transferred from the RLI to the elite Special Air Service (SAS) unit in September 1975. There he trained as a paratrooper and, after passing selection, found himself in a vastly different role to the one he had become used to during his time in the RLI. Rather than taking part in the RLI's fast and furious Fireforce counter-strike procedures, he found himself taking part in covert reconnaissance actions, "acting as eyes and ears," as Barbara Cole writes. Wishing to see more action, Lamb requested a posting back to the RLI, which was granted; he rejoined 3 Commando. In late 1976, at the age of 28, he was promoted to lance corporal and took command of a "stick" of four men from 12 Troop, 3 Commando on Fireforce duty on Operation Thrasher, which covered Rhodesia's eastern highlands against guerrilla activity.
In the late afternoon of 7 November 1976, three insurgents from a group of seven were spotted by an Army observation post in the Mutema Tribal Trust Lands, just south-west of Birchenough Bridge in Manicaland province. Fireforce was called up and the Rhodesians readied themselves to fly out by helicopter and engage the guerrillas. There were eight four-man "stops" involved in a Fireforce, and on this occasion Lamb headed Stop 2. Just before they left, Lamb ran over to Lance-Corporal Phil Kaye, the leader of Stop 3, and shouted over the noise of the aircraft. "They are going to get me this time," he yelled, sardonically; "Just you watch, Phil Kaye!" Kaye and his MAG gunner, Trooper Pat Grogan, waved away this comment and told Lamb to get moving. "Go nail gooks!" Kaye called after Lamb as he and the rest of Stop 2 took off aboard their Alouette III gunship. Lamb's men were an 18-year-old national service MAG gunner of Portuguese-Rhodesian extraction named Trooper Soares; Trooper Cornelius Olivier, a 20-year-old Rhodesian regular who carried an FN FAL battle rifle; and Trooper Tony Rok, an Australian Vietnam veteran, aged 28 and also equipped with an FN. Lamb carried the stick's radio on his back with his FN FAL ready in his hands.
Stop 2 landed, formed up into a sweep line and marched carefully to the north alongside a dry riverbed. As darkness fell, just as they came to a widening in the riverbed, they were suddenly fired upon by an unseen foe. All four men dropped to the ground to avoid being hit. The Canadian lance-corporal called for covering fire from Soares, which he provided as Lamb and Rok rose to their feet and cautiously moved forward. A dark figure suddenly ran across the soldiers' line of sight, between Lamb and the riverbed, and from a distance of about 16 paces Olivier reflexively swung his rifle around and let off a frenzied, imprecise burst of fire. Mortally wounded by two errant shots through the chest, Lamb stumbled, slumped to the ground and lay face-down in a heap. One of the bullets exited through the back of his body, smashing the radio he had been carrying. He died almost instantly. Meanwhile, the cadres ahead ran headlong into Stop 1, led by Sergeant Derrick Taylor, and were all killed in the ensuing fire fight. Taylor's stick sustained no casualties. When the battle was over, Stops 1 and 3 joined Olivier, Rok and Soares and waited beside Lamb's lifeless body all night until it could be evacuated by helicopter to the local hospital at Chipinga. The death was officially recorded as "killed in action", with no reference to friendly fire.
## Reactions to death; military funeral and burial
As Lamb had predicted, his death provoked stories in the Canadian local and national media that strongly stressed his history of violence and insanity. It even caused a heated debate in the Canadian House of Commons over why, considering his personal history in Ontario and subsequent service in the armed forces of a country Canada did not recognise, Lamb had been issued a valid Canadian passport and allowed to renew it on 26 April 1976. The Chaykoski family received the news with some relief, Wanless reported in the Windsor Star, having "lived in terror" since Lamb's release from Penetanguishene three years earlier.
In Rhodesia, by contrast, Lamb was posthumously held in very high regard by the men who had served alongside him. His photograph in full dress uniform was placed on 3 Commando's Wall of Honour and remained there until the RLI was disbanded in 1980. When the story of his earlier life in Canada was run by the Rhodesia Herald, the paper received numerous strongly worded letters from soldiers who refused to believe it. They demanded a printed retraction and apology, which the Herald gave soon after to preempt any further scandal. The lance-corporal was given what the Windsor Star and Newsweek both described as a "hero's funeral" in the Rhodesian capital, Salisbury, on 15 November 1976. No members of his family were present. His coffin, draped in the Rhodesian flag and topped with a large bouquet of flowers, was carried on a gun carriage to Warren Hills Cemetery, on the western outskirts of the city, where soldiers of the RLI fired three volleys of shots and senior officers saluted as the coffin was carried to the crematorium on the shoulders of eight RLI men. Lamb's ashes were afterwards returned to his relatives in Windsor, Ontario, where they were buried alongside the remains of his grandmother.
# Bharattherium
Bharattherium is a mammal that lived in India during the Maastrichtian (latest Cretaceous) and possibly the Paleocene. The genus has a single species, Bharattherium bonapartei. It is part of the gondwanathere family Sudamericidae, which is also found in Madagascar and South America during the latest Cretaceous. The first fossil of Bharattherium was discovered in 1989 and published in 1997, but the animal was not named until 2007, when two teams independently named the animal Bharattherium bonapartei and Dakshina jederi. The latter name is now a synonym. Bharattherium is known from a total of eight isolated fossil teeth, including one incisor and seven molariforms (molar-like teeth, either premolars or true molars).
Bharattherium molariforms are high, curved teeth, with a height of 6 to 8.5 millimetres (0.24 to 0.33 in). In a number of teeth tentatively identified as fourth lower molariforms (mf4), there is a large furrow on one side and a deep cavity (infundibulum) in the middle of the tooth. Another tooth, perhaps a third lower molariform, has two furrows on one side and three infundibula on the other. The tooth enamel has traits that have been interpreted as protecting against cracks in the teeth. The hypsodont (high-crowned) teeth of sudamericids like Bharattherium are reminiscent of later grazing mammals, and the discovery of grass in Indian fossil sites contemporaneous with those yielding Bharattherium suggests that sudamericids were indeed grazers.
## Taxonomy
A gondwanathere tooth, catalogued as VPL/JU/NKIM/25, was first discovered in the Maastrichtian (latest Cretaceous, about 70–66 million years ago) Intertrappean Beds of Naskal, India, in 1989, but it was not identified as such until another gondwanathere, Lavanify, was found on Madagascar in the middle 1990s. The discoveries of Lavanify and VPL/JU/NKIM/25 were announced in Nature in 1997. Gondwanatheres were previously known only from Argentina; these discoveries extended the range of the gondwanathere family Sudamericidae across the continents of the ancient supercontinent of Gondwana.
In 2007, two teams of scientists independently named the Indian gondwanathere on the basis of new material; both teams included VPL/JU/NKIM/25 in their newly named species. Guntupalli Prasad and colleagues named the animal Bharattherium bonapartei on the basis of an additional tooth, VPL/JU/IM/33, from another Intertrappean locality, Kisalpuri. The generic name, Bharattherium, combines Bharat, Sanskrit for "India", with the Ancient Greek therion, meaning "beast", and the specific name, bonapartei, honors Argentine paleontologist José Bonaparte, who was the first to describe a gondwanathere fossil. G.P. Wilson and colleagues named Dakshina jederi on the basis of six teeth (in addition to VPL/JU/NKIM/25), and identified some additional material as indeterminate gondwanatheres. Of these teeth, three (GSI/SR/PAL-G059, G070, and G074) are from a third Intertrappean site at Gokak and three (GSI/SR/PAL-N071, N210, and N212) are from Naskal. Dakshina, the generic name, derives from Sanskrit daakshinaatya "of the south", and refers both to the animal's occurrence in southern India and to the distribution of gondwanatheres in the southern continents. The specific name, jederi, honors University of Michigan paleontologist Jeffrey A. Wilson, nicknamed "Jeder", who played an important role in the project that led to the discovery of Dakshina. Wilson and colleagues also described three other gondwanathere teeth from Gokak (GSI/SR/PAL-G111, G112, and G211), which they tentatively identified as a different species of gondwanathere on the basis of their small size. In 2008, Prasad commented that Bharattherium bonapartei and Dakshina jederi represented the same species and that Bharattherium, which was published first, was the correct name.
## Description
Bharattherium bonapartei is known from a total of eight isolated teeth. Among the seven teeth in their sample, Wilson and colleagues tentatively identified five as fourth lower molariforms (mf4)—because gondwanathere premolars and molars cannot be distinguished, they are collectively known as "molariforms"—one as a third lower molariform (mf3) and one as a lower incisor (i1). These determinations were made on the basis of comparisons with a sample of the South American gondwanathere Sudamerica ameghinoi, in which all eight molariform positions are known. However, the large number of mf4s led Wilson and colleagues to suspect that the criteria used for distinguishing Sudamerica tooth positions may not apply directly to Bharattherium. Prasad and colleagues did not assign their two Bharattherium teeth to any tooth position, but suggested that they may represent different tooth positions and that one may come from the upper and the other from the lower side of the jaw. As is characteristic of sudamericids, Bharattherium molariforms are hypsodont (high-crowned) and have a flat occlusal (chewing) surface atop a high tooth, with furrows that extend down the height of the tooth. Bharattherium molariforms are the smallest of any sudamericid; those of Lavanify, for example, are about 35% larger. Unlike Sudamerica molariforms, those of Bharattherium taper towards the top.
### Molariforms
GSI/SR/PAL-G074, a well-preserved right mf4 that Wilson and colleagues selected as the holotype of Dakshina jederi, is 7.57 mm high and has a crown of 3.66 × 2.99 mm. It is curved, with the base more distal (towards the back) than the top. The occlusal surface is rectangular. On the lingual side (towards the tongue), there is a deep furrow (filled in part with cementum) that extends from the top to near the base of the tooth. There is also a much smaller indentation on the buccal side (towards the cheeks). The occlusal surface is mostly covered with enamel surrounding a dentine lake, but there is a V-shaped islet in the middle, with the tip of the V towards the lingual side, that forms the remnant of an infundibulum—a deep cavity in the tooth. Perikymata—wave-like bands and grooves—are visible in the enamel.
The right mf4 GSI/SR/PAL-G070, which is damaged on the buccal, distal, and lingual sides, is 8.40 mm high, but has an occlusal surface of only 2.49 × 1.75 mm. Unlike in GSI/SR/PAL-G074, the dentine on the occlusal surface is not exposed, and the occlusal surface is oval in shape. Furthermore, the V-shaped islet is larger and the lingual furrow is less prominent at the occlusal surface, because it tapers near the tip of the tooth. In the heavily damaged left mf4 GSI/SR/PAL-N071 (height 7.16 mm), only the distal side is well preserved. The infundibulum is exposed internally; it extends 4.01 mm down the crown. The occlusal surface is poorly preserved, but its dimensions are at least 2.14 × 2.42 mm. GSI/SR/PAL-N212, a right mf4, is damaged on the mesial side and has a height of 5.86 mm and an occlusal surface of at least 2.66 × 2.04 mm. Cementum fills the V-shaped islet.
VPL/JU/NKIM/25 was the first Indian gondwanathere fossil to be described; it is damaged on one side. Wilson and colleagues identified it as a left mf4 (implying that the damaged side is buccal) with strong similarities to GSI/SR/PAL-G070, including a curved crown and a V-shaped enamel islet atop a deep infundibulum. The occlusal surface is oval. The tooth is 6 mm high and Wilson and colleagues estimate that the occlusal surface is 2.5 × 1.8 mm, close to the dimensions of GSI/SR/PAL-G070. They suggest the tooth probably had enamel on all sides of the crown, but Prasad and colleagues point to a possible enamel-dentine junction on the damaged side as evidence that enamel may be absent there.
GSI/SR/PAL-G059, identified as a left mf3, has a height of 5.97 mm at the mesial side, but only 2.02 mm at the distal side because of curvature. On the lingual side, two long furrows are visible, and on the buccal side breakage exposes three long infundibula, of which the most mesial one is the longest and the most distal one the shortest. In the occlusal surface, these three infundibula merge into a single islet. In addition, three dentine lakes are visible in the occlusal surface, which has dimensions of 4.58 × at least 2.52 mm. In Sudamerica, mf2, mf3, and the upper molariforms MF3 and MF4 all have three lophs, like GSI/SR/PAL-G059, but the tooth's curvature matches the mf3 of Sudamerica best.
VPL/JU/IM/33, the holotype of Bharattherium bonapartei, is 7.33 mm high, 2.66 mm long, and 2.0 mm wide. The occlusal surface is roughly rectangular and is mostly covered by a V-shaped dentine lake, which encloses a small heart-shaped enamel islet at the top of a cementum-filled infundibulum. A vertical furrow is also present. Near the top of the tooth, enamel covers the entire crown, but further down there is no enamel on the concave face of the tooth.
### Incisor
The left i1 GSI/SR/PAL-N210 is flat on the medial side (towards the middle of the head) but convex on the lateral side (towards the side of the head) and bears a shallow groove on the lateral side. At the base, the tooth is broadest on the lower end. The tooth is slightly curved upward towards the tip. Measured on the lower side, the tooth is 11.76 mm long, but breakage means the true length is probably larger. The depth of the tooth is about 3.39 mm. Wilson and colleagues identified this incisor as Dakshina on the basis of its size; the upper and lower incisor that they assigned to an indeterminate gondwanathere are smaller.
### Enamel microstructure
The microstructure of the enamel of VPL/JU/NKIM/25 has been studied. Unlike other gondwanatheres, it has enamel consisting of three layers—radial enamel, tangential enamel, and PLEX. The rows of small, round enamel prisms are separated by interprismatic matrix that forms crystals oriented at right angles relative to the prisms. Prisms arise at the enamel-dentine junction, run through the enamel, and meet the outer enamel at a high angle. These features of the enamel are apparently adaptations that protect the tooth from cracks.
## Relationships
Bharattherium is identifiable as a sudamericid because it has hypsodont molariforms with cementum-filled furrows. Among the four known sudamericid genera—Gondwanatherium and Sudamerica from Argentina; Lavanify from Madagascar; and Bharattherium—it shares with Sudamerica and Lavanify the presence of furrows that extend down to the base of the tooth. In addition, it shares several features with Lavanify, suggesting the two are closely related. Wilson and colleagues list three features shared by the two: the presence of an infundibulum (seen in only one of two specimens of Lavanify), interprismatic matrix, and perikymata. Prasad and colleagues also interpreted the interprismatic matrix as a shared character, but added the absence of enamel on one side of the tooth crown. Wilson and colleagues identified the presence of a V-shaped enamel lake on mf4 and of three layers in the enamel as autapomorphies (uniquely derived characters) of the Indian sudamericid.
## Range and ecology
Remains of Bharattherium have been found at three widely separated Late Cretaceous sites in peninsular India—Naskal, Andhra Pradesh; Gokak, Karnataka; and Kisalpuri, Madhya Pradesh. All sites are in the Intertrappean Beds (part of the Deccan Traps) and are Maastrichtian (latest Cretaceous) in age. The Intertrappean Beds have yielded a variety of fossil animals, including eutherian mammals such as Deccanolestes, Sahnitherium, and Kharmerungulatum. In the perhaps slightly older Infratrappean Beds, a possible member of the ancient and enigmatic mammalian group Haramiyida has been found, Avashishta. Members of the family Sudamericidae, in which Bharattherium is classified, are also known from the Cretaceous of Argentina, Madagascar, and possibly Tanzania and from the Paleogene of Argentina and Antarctica, and the second gondwanathere family, Ferugliotheriidae, is known with certainty only from the Cretaceous of Argentina. Thus, Bharattherium is an example of a Gondwanan faunal element in India and indicates biogeographic affinities with other Gondwanan landmasses such as Madagascar and South America.
In modern mammals, hypsodont teeth are often associated with diets that include abrasive vegetation such as grasses. Hypsodonty in sudamericids had been interpreted as indicating semiaquatic or terrestrial habits and a diet including items such as roots or bark, because grasses were thought not yet to have appeared when sudamericids lived. However, grass remains have been found at Intertrappean sites contemporary with those that yielded Bharattherium, suggesting that sudamericids like Bharattherium may in fact have been among the earliest grazing mammals.
It is one of only two Indian mammal taxa inferred to have survived the Cretaceous–Tertiary (K–T) extinction event in India, the other being Deccanolestes.
# Vauxhall Bridge
Vauxhall Bridge is a Grade II\* listed steel and granite deck arch bridge in central London. It crosses the River Thames in a southeast–northwest direction between Vauxhall on the south bank and Pimlico on the north bank. Opened in 1906, it replaced an earlier bridge, originally known as Regent Bridge but later renamed Vauxhall Bridge, built between 1809 and 1816 as part of a scheme for redeveloping the south bank of the Thames. The bridge was built at a location in the river previously served by a ferry.
The building of both iterations of the bridge was problematic, each requiring several redesigns by a succession of designers. The original bridge, the first iron bridge over the Thames, was built by a private company and operated as a toll bridge before being taken into public ownership in 1879. The second bridge, which took eight years to build, was the first in London to carry trams and later one of the first two roads in London to have a bus lane.
In 1963 it was proposed to replace the bridge with a modern development containing seven floors of shops, office space, hotel rooms and leisure facilities supported above the river, but the plans were abandoned because of costs. With the exception of alterations to the road layout and the balustrade, the design and appearance of the current bridge have remained almost unchanged since 1907. The bridge today is an important part of London's road system and carries the A202 road and Cycle Superhighway 5 (CS5) across the Thames.
## Background
In the early 13th century, Anglo-Norman mercenary Falkes de Breauté built a manor house in the then empty marshlands of South Lambeth, across the River Thames from Westminster. In 1223–24, de Breauté and others revolted against Henry III; following a failed attempt to seize the Tower of London, de Breauté's lands in England were forfeited and he was forced into exile in France and later Rome. The lands surrounding his Lambeth manor house continued to be known as Falkes' Hall, later Vauxhall.
With the exception of housing around the New Spring Gardens (later Vauxhall Gardens) pleasure park, opened in around 1661, the land at Vauxhall remained sparsely populated into the 19th century, with the nearest fixed river crossings being the bridges at Westminster, 1 mile (1.6 km) downstream, and Battersea, 2 miles (3.2 km) upstream. In 1806 a scheme was proposed by Ralph Dodd to open the south bank of the Thames for development, by building a new major road from Hyde Park Corner to Kennington and Greenwich, crossing the river upstream of the existing Westminster Bridge. The proprietors of Battersea Bridge, concerned about a potential loss of customers, petitioned Parliament against the scheme, stating that "[Dodd] is a well known adventurer and Speculist, and the projector of numerous undertakings upon a large scale most if not all of which have failed", and the bill was abandoned.
In 1809 a new bill was presented to Parliament, and the proprietors of Battersea Bridge agreed to allow it to pass and to accept compensation. The Vauxhall Bridge Act 1809 (49 Geo. 3. c. cxlii) incorporated the Vauxhall Bridge Company, allowing it to raise up to £300,000 by means of mortgages or the sale of shares, and to keep all profits from any tolls raised. From these profits, the Vauxhall Bridge Company was obliged to compensate the proprietors of Battersea Bridge for any drop in revenue caused by the new bridge.
## Old Vauxhall Bridge
Dodd submitted a scheme for a bridge at Vauxhall of 13 arches. However, soon after the Vauxhall Bridge Act 1809 was passed, he was dismissed by the Vauxhall Bridge Company and his design was abandoned. John Rennie was commissioned to design and build the new bridge, and a stone bridge of seven arches was approved. On 9 May 1811, Lord Dundas laid the foundation stone of the bridge on the northern bank.
The Vauxhall Bridge Company ran into financial difficulties and was unable to raise more than the £300,000 stipulated in the 1809 act, and a further act (52 Geo. 3. c. cxlvii) was passed permitting the company to build a cheaper iron bridge. Rennie submitted a new design for an iron bridge of eleven spans, costing far less than the original stone design. Rennie's design was rejected, and instead construction began on a nine-arch iron bridge designed by Samuel Bentham. Concerns were raised about the construction of the piers, and engineer James Walker was appointed to inspect the work. Walker's report led to the design being abandoned for the second time, and Walker himself was appointed to design and build a bridge of nine 78-foot (24 m) cast-iron arches with stone piers, the first iron bridge to be built across the Thames.
On 4 June 1816, over five years after construction began, the bridge opened, initially named Regent Bridge after George, Prince Regent, but shortly afterwards renamed Vauxhall Bridge. The developers failed to pay the agreed compensation to the owners of Battersea Bridge and were taken to court; after a legal dispute lasting five years a judgement was made in favour of Battersea Bridge, with Vauxhall Bridge being obliged to pay £8,234 compensation. As well as the compensation awarded by the courts to Battersea Bridge in 1821, the Vauxhall Bridge Act 1809 also obliged the Vauxhall Bridge Company to pay compensation to the operators of Huntley Ferry, the Sunday ferry service to Vauxhall Gardens, with the level to be decided by "a jury of 24 honest, sufficient and indifferent men". The bridge cost £175,000 to build; with the costs of approach roads and compensation payments, the total cost came to £297,000.
### Usage
In anticipation of the areas surrounding the bridge becoming prosperous suburbs, tolls were set at relatively high rates on a sliding scale, ranging from a penny for pedestrians to 2s 6d for vehicles drawn by six horses. Exemptions were granted for mail coaches, soldiers on duty and parliamentary candidates during election campaigns. However, the area around the bridge failed to develop as expected. In 1815 John Doulton built the Doulton & Watts (later Royal Doulton) stoneware factory at Vauxhall, and as a result, instead of the wealthy residents anticipated by the company, the area began to fill with narrow streets of working-class tenements to house the factory's workers. Meanwhile, the large Millbank Penitentiary was built near the northern end of the bridge, discouraging housing development. Consequently, toll revenues were initially lower than expected, and the dividends paid to investors were low.
Usage rose considerably in 1838 when the terminus of the London and South Western Railway was built at nearby Nine Elms. Nine Elms station proved inconvenient and unpopular with travellers, and in 1848 a new railway terminus was built 1½ miles (2.4 km) closer to central London, at Waterloo Bridge station (renamed "Waterloo Station" in 1886), and the terminus at Nine Elms was abandoned.
With the closure of the rail terminus, Vauxhall Bridge's main source of revenue was visitors to the Vauxhall Gardens pleasure park. In addition to people visiting the Gardens themselves, Vauxhall Gardens were used as a launch point for hot air balloon flights, and large crowds would gather on the bridge and surrounding streets to watch the flights. A large crowd also assembled on the bridge in September 1844 to watch Mister Barry, a clown from Astley's Amphitheatre, sail from Vauxhall Bridge to Westminster Bridge in a washtub towed by geese.
### Public ownership
Despite early setbacks and the construction nearby in the 19th century of three competing bridges (Lambeth Bridge, Chelsea Bridge and Albert Bridge), the rapid urban growth of London made Vauxhall Bridge very profitable. The annual income from tolls rose from £4,977 in its first full year of operation to £62,392 in 1877. The Metropolis Toll Bridges Act 1877 (40 & 41 Vict. c. xcix) was passed, allowing the Metropolitan Board of Works (MBW) to buy all London bridges between Hammersmith Bridge and Waterloo Bridge and free them from tolls.
In 1879 the bridge was bought by the MBW for £255,000 and tolls on the bridge were lifted. Inspections of the bridge by the MBW following the purchase found that the two central piers were badly eroded, exposing the timber cradles on which the piers rested. Large quantities of cement in bags were laid around the wooden cradles as an emergency measure; however, the cement bags themselves soon washed away. The piers were removed, replaced by a single large central arch. By this time the bridge was in very poor condition, and in 1895 the London County Council (LCC), which had taken over from the MBW in 1889, sought and gained parliamentary approval to replace the bridge under an Act of Parliament (58 & 59 Vict. c. cxxix). Permission was granted by Parliament to raise the projected replacement costs of £484,000 from rates across the whole of London rather than only from local residents, as a new bridge was considered to be of benefit to the whole of London.
In August 1898 a temporary wooden bridge was moved into place alongside the existing bridge, and the demolition of the old bridge began.
## New Vauxhall Bridge
Sir Alexander Binnie, chief engineer to the LCC, submitted a design for a steel bridge, which proved unpopular. At the request of the LCC, Binnie submitted a new design for a bridge of five spans, to be built in concrete and faced with granite.
Work on Binnie's design began, but was beset by problems. Leading architects condemned the design, with Arthur Beresford Pite describing it as "a would-be Gothic architectural form of great vulgarity and stupid want of meaning", and T. G. Jackson describing the bridge designs as a sign of "the utter apparent indifference of those in authority to the matter of art". Plans to build large stone abutments had to be suspended when it was found that the southern abutment would block the River Effra, which by this time had been diverted underground to serve as a storm relief sewer and which flowed into the Thames at this point. The Effra had to be rerouted to join the Thames to the north of the bridge. After the construction of the foundations and piers, it was discovered that the clay of the riverbed at this point would not be able to support the weight of a concrete bridge. With the granite piers already in place, it was decided to build a steel superstructure onto the existing piers, and a superstructure 809 feet (247 m) long and 80 feet (24 m) wide was designed by Binnie and Maurice Fitzmaurice and built by LCC engineers at a cost of £437,000.
The new bridge was eventually opened on 26 May 1906, five years behind schedule, in a ceremony presided over by the Prince of Wales and Evan Spicer, Chairman of the LCC. Charles Wall, who had won the contract to build the superstructure of the new bridge, paid the LCC £50 for the temporary wooden bridge, comprising 40,000 cubic feet (1,100 m<sup>3</sup>) of timber and 580 tons of scrap metal.
### Sculpture
The new bridge was built to a starkly functional design, and many influential architects complained that the engineers responsible had consulted no architects during the design process. In 1903, during the construction of the bridge, the LCC consulted with architect William Edward Riley regarding possible decorative elements that could be added to the bridge. Riley proposed erecting two 60-foot (18 m) pylons topped with statues at one end of the bridge, and adding decorative sculpture to the bridge piers. The pylons were rejected on grounds of cost, but following further consultation with leading architect Richard Norman Shaw it was decided to erect monumental bronze statues above the piers, and Alfred Drury, George Frampton and Frederick Pomeroy were appointed to design appropriate statues.
Frampton resigned from the project through pressure of work, and Drury and Pomeroy carried out the commission, each contributing four monumental statues, which were installed in late 1907. On the upstream piers are Pomeroy's Agriculture, Architecture, Engineering and Pottery, whilst on the downstream piers are Drury's Science, Fine Arts, Local Government and Education. Each statue weighs approximately two tons. Despite their size, the statues are little-noticed by users of the bridge as they are not visible from the bridge itself, but only from the river banks or from passing shipping.
### Usage
The new bridge soon became a major transport artery and today carries the A202 and Cycle Superhighway 5 across the Thames. Originally built with tram tracks, New Vauxhall Bridge was the first in central London to carry trams. Initially it carried horse-drawn trams, but shortly after the bridge's opening it was converted to carry the electric trams of London County Council Tramways; it continued to carry trams until the termination of tram services in 1951. In 1968 Vauxhall Bridge and Park Lane became the first roads in London to have bus lanes; during weekday evening rush hours, the central lane of the bridge was reserved for southbound buses only.
## Millbank Bridge
During the Second World War the government was concerned that Axis bombers would target the bridge, and a temporary bridge known as Millbank Bridge was built parallel to Vauxhall Bridge, 200 yards (180 m) downstream. Millbank Bridge was built of steel girders supported by wooden stakes; however, despite its flimsy appearance it was a sturdy structure, capable of supporting tanks and other heavy military equipment. In the event, Vauxhall Bridge survived the war undamaged, and in 1948 Millbank Bridge was dismantled. Its girders were shipped to Northern Rhodesia and used to span a tributary of the Zambezi.
## The Crystal Span
In 1963 the Glass Age Development Committee commissioned a design for a replacement bridge at Vauxhall, inspired by the design of the Crystal Palace and to be called the Crystal Span. The Crystal Span was to have been a seven-storey building supported by two piers in the river, overhanging the river banks at either end. The structure itself would have been enclosed in an air-conditioned glass shell. The lowest floor would have contained two three-lane carriageways for vehicles, with a layer of shops and a skating rink in the centre of the upper floors. The southern end of the upper floors was to house a luxury hotel, whilst the northern end was to house the modern art collection of the nearby Tate Gallery, which at this time was suffering from a severe shortage of display space. The roof was to have housed a series of roof gardens, observation platforms and courtyards, surrounding a large open-air theatre. The entire structure would have been 970 feet (300 m) long and 127 feet (39 m) wide. Despite much public interest in the proposals, the London County Council was reluctant to pay the estimated £7 million construction costs, and the scheme was abandoned.
## Recent history
In 1993, a remnant of the earliest known bridge-like structure in London was discovered alongside Vauxhall Bridge, when shifting currents washed away a layer of silt which had covered it. Dating to between 1550 BC and 300 BC, it consists of two rows of wooden posts, which are believed to have originally carried a deck of some kind. It is believed that it did not cross the whole river, but instead connected the south bank to an island, possibly used for burial of the dead. As no mention of this or any similar structure in the area is made in Julius Caesar's account of crossing the Thames, or by any other Roman author, it is presumed that the structure had been dismantled or destroyed before Caesar's expedition to Britain in 55 BC. The posts are still visible at extreme low tides.
Following the closure of a number of the area's industries, in the 1970s and 1980s the land at the southern end of Vauxhall Bridge remained empty, following the failures of multiple redevelopment schemes. The most notable came in 1979 when Keith Wickenden MP, owner of the land at the immediate southern end of the bridge, proposed a large-scale redevelopment of the site. The development was to contain 300,000 square feet (28,000 m<sup>2</sup>) of office space, 100 luxury flats and a gallery to house the Tate Gallery's modern art collection. The offices were to be housed in a 500-foot (150 m) tower of green glass, which was nicknamed the "Green Giant" and met with much opposition. The then Secretary of State for the Environment, Michael Heseltine, refused permission for the development and the site remained empty.
In 1988 Regalian Properties purchased the site, and appointed Terry Farrell as architect. Farrell designed a self-contained community of shops, housing, offices and public spaces for the site. Regalian disliked the proposals and requested Farrell design a single large office block. Despite containing 50% more office space than the rejected Green Giant proposal, the design was accepted. The government then bought the site and design as a future headquarters for the Secret Intelligence Service, and the design was accordingly modified to increase security. In 1995 the SIS Building was opened on the site, and today dominates other buildings in the vicinity of the bridge.
In 2004 the Vauxhall Cross area at the southern end of the bridge was redeveloped as a major transport interchange, combining a large bus station with the existing National Rail and London Underground stations at Vauxhall. Immediately to the east of the southern end of the bridge, a slipway provides access for amphibious buses between the road and river.
The only significant alteration to the structure of the bridge itself since the addition of the sculptures in 1907 came in 1973, when the Greater London Council (GLC) decided to add an extra traffic lane by reducing the width of the pavements. To counter the increased load of extra traffic, the council announced the replacement of the cast-iron balustrades with low box-girder structures, pressing ahead despite formal objections from both Lambeth and Westminster councils. In 2015, the extra lane of motor traffic was removed in favour of a kerb-protected two-way cycle track on the north-east side of the bridge, which forms part of Cycle Superhighway 5.
The bridge was declared a Grade II\* listed structure in 2008, providing protection to preserve its character from alteration.
## See also
- List of crossings of the River Thames
- List of bridges in London
# Gadsden Purchase half dollar
The Gadsden Purchase half dollar was a proposed commemorative coin to be issued by the United States Bureau of the Mint. Legislation for the half dollar passed both houses of Congress in 1930 but was vetoed by President Herbert Hoover. The House of Representatives sustained his action, 96 votes in favor of overriding it to 243 opposed, well short of the necessary two-thirds majority. This was the first veto of Hoover's presidency and the first ever for a commemorative coin bill.
The proposal to commemorate the 1854 congressional ratification of the Gadsden Purchase was the brainchild of El Paso coin dealer Lyman W. Hoffecker, who wanted a commemorative coin he could control and distribute. He gained the support of several members of Congress from Texas and the Southwest, and a bill was introduced in Congress in April 1929, receiving a hearing 11 months later. Treasury Secretary Andrew W. Mellon sent a letter and two officials in opposition to the bill, but it passed both houses of Congress without dissent. On April 21, 1930, Hoover vetoed the bill, deeming commemorative coins abusive. Although only one congressman spoke in favor of Hoover's action during the override debate in the House, the veto was easily sustained.
No commemorative coins were struck during the remainder of the Hoover Administration. They resumed after Franklin D. Roosevelt was inaugurated, but by 1935 Roosevelt was citing Hoover's veto in urging Congress to avoid passing commemorative coin bills. He vetoed one in 1938. In 1946, Harry S. Truman adopted similar arguments in warning he would oppose further coin bills, and he vetoed one in 1947. Dwight D. Eisenhower vetoed three more in 1954. No non-circulating commemorative coins were struck from 1955 until after the Treasury Department changed its position in 1981.
## Hearing
The Gadsden Purchase came as the result of negotiation between the U.S. minister to Mexico James Gadsden and Mexican president Antonio López de Santa Anna. Following the Mexican–American War, there were border disputes along the Mexican Cession left unresolved by the Treaty of Guadalupe Hidalgo. Specifically, Southerners in the United States sought land over which a southern route of a transcontinental railroad could run. Accordingly, U.S. President Franklin Pierce sent Gadsden to resolve these issues. The resulting treaty was signed on December 30, 1853. Initially, 45,000 square miles (120,000 km<sup>2</sup>) were to be conveyed in exchange for $15 million. But when the original treaty failed to pass the U.S. Senate, both the land and the payment were reduced by about a third. The Gadsden Purchase, west of El Paso, forms part of the states of Arizona and New Mexico.
During the late 1920s, El Paso coin dealer Lyman W. Hoffecker tried hard to gain congressional approval for a half dollar commemorating the 75th anniversary of the Gadsden Purchase. In 1929, he founded the Gadsden Purchase Commission (essentially just Hoffecker himself). At the time, commemorative coins were not sold by the government—Congress, in authorizing legislation, usually designated an organization which had the exclusive right to purchase them at face value and vend them to the public at a premium. Hoffecker wanted to be able to control the distribution of the Gadsden Purchase half dollars.
The April 1929 issue of The Numismatist printed a letter from Hoffecker, who was said to have designed the proposed coin and would be responsible for its distribution. It stated that Texas Congressman Claude Benton Hudspeth of El Paso had agreed to introduce legislation for the Gadsden piece. Hoffecker related that the coin would have the portrait of Gadsden on its obverse and on its reverse a map of New Mexico and Arizona depicting the purchased lands and El Paso. Ten thousand coins were to be sought, to be issued at $1.50 each.
Hudspeth introduced a bill for a Gadsden Purchase half dollar into the House of Representatives on April 25, 1929; it was referred to the Committee on Coinage, Weights, and Measures. On January 29, 1930, committee chairman Randolph Perkins of New Jersey sent a letter to Treasury Secretary Andrew W. Mellon, inquiring as to the Treasury's views. Mellon replied on the 31st, opposing the bill. He felt that Congress had wisely decided in 1890 that coin designs should not be changed more often than once in 25 years, and that the 15 commemorative coin bills passed since 1920 were wasteful and a burden on the Mint. He noted that in 1927, at the time of the Vermont Sesquicentennial half dollar, the Coinage Committee had gone on record in opposition to commemorative coin issues, many of which were of only local rather than national significance. Several issues had failed to sell out, resulting in coins being returned to the Mint to be melted, and he suggested that a medal be issued instead of a coin. On March 8, Hoffecker sent a telegram to the committee offering to pay for the entire issue of 10,000 whenever the department wanted, and arguing that, as the Mint had produced over 30,000,000 coins for other nations in 1929, any burden posed by commemorative half dollars was slight.
Hearings were held by the committee on the bill on March 10 (briefly) and 17, 1930, with Perkins presiding. On the 17th, Congressman Guinn Williams appeared on behalf of Hudspeth, who was ill. Williams, a Texan, stated that the coin issue was important to the entire Southwest, that proponents would not allow the government to incur any expense, and stated that they were ready to pay for the coins. He also presented a joint resolution of the houses of the Texas Legislature, asking the state's representatives to introduce and support a bill for a Gadsden Purchase half dollar. Next to speak was Albert Gallatin Simms of New Mexico, who assured the committee of his support for the bill, and that of his state's two senators. Hudspeth sent a letter, and his secretary Kate George told the committee that the senators from Texas, New Mexico and Arizona were unanimously in favor of the bill. Hudspeth's letter stated he had been told by Hoffecker's committee that the money from the coins would be used to set up a small monument where the U.S. flag had first been raised in the Gadsden Purchase.
Mellon had sent two Treasury officials to the hearing, Assistant Secretary Walter E. Hope and Assistant Director of the Mint Mary M. O'Reilly. In his testimony, Hope warned that the issuance of commemorative half dollars was leading to confusion, and there was a risk of counterfeiting. Sales for some issues had not met expectations, resulting in many commemoratives being returned to the Mint for redemption and melting. O'Reilly told of a man who was about to be put off a Philadelphia streetcar, having only a commemorative half dollar to pay the fare, something not recognized by the conductor. He was allowed to remain after the superintendent of the Philadelphia Mint, also a passenger, paid his fare. Luther A. Johnson of Texas appeared in support of the bill, and presented a letter from Hoffecker dated March 5, stating that the coins could be easily sold, and that when his committee had met with President Herbert Hoover, the president had told them the event should be fittingly celebrated. After the testimony concluded, the Coinage Committee went into executive session and endorsed the bill.
## Passage by Congress
Perkins's committee issued a report dated March 17, 1930, stating that the anniversary of the Gadsden Purchase was of international importance, and there was no risk to the Treasury given the willingness of Hoffecker to pay for the coins. On March 19, Perkins called up the bill before the House, and it passed without debate or dissent. Immediately thereafter, Perkins had another commemorative coin bill passed by the House, for the 300th anniversary of Massachusetts Bay Colony.
In the Senate, the Gadsden bill was referred to the Committee on Banking and Currency. On April 2, the committee issued a report through Tom Connally of Texas, similar to the House report (of which a copy was attached), recommending passage. On April 7, the bill passed the Senate without debate or opposition. The bill was engrossed and signed on April 9 by the Speaker of the House pro tempore and by Charles Curtis, Vice President of the United States, and was presented to President Hoover on April 10.
## Veto and override attempt
On April 21, 1930, Hoover vetoed the bill, returning it unsigned with a list of his objections to the House of Representatives where it originated. This veto was the first of Hoover's presidency, and the first time a commemorative coin bill had been vetoed. The president vetoed the bill on the advice of the Treasury Department. Hoover noted the issuance of 15 commemorative coins over the past decade, something he said allowed the opportunity for counterfeiting. He stated, "the matter is not perhaps one of large importance in itself, were it not for the fact of the great number of other similar proposals"—he noted the five other commemorative coin bills then before Congress, and if the Gadsden Purchase bill were enacted, it would be harder to turn down the other proposals. Deeming the issuance of commemorative coins a misuse of the coinage system, he offered the government's assistance in the production of medals, which could provide a souvenir without impacting the coinage. Hoover's attitude was said to be informed by fundraising committees returning large amounts of several commemorative coin issues to the Mint for redemption and melting, something he considered wasteful.
The day after the veto, the House Coinage Committee sought a vote to override it. Many of the bill's proponents came to the House floor to back the override, with the support of members from other parts of the country who sought commemorative coins. Those who would override dominated the debate, with only one speaker, House Majority Leader John Q. Tilson of Connecticut, supporting the president. The override attempt fell well short of the two-thirds needed, as 96 representatives voted in favor of the override, with 243 against it. Five Republicans voted to override Hoover's veto: New Mexico Congressman Simms and four from the East and Midwest.
The New York Sun applauded Hoover's "sound common sense" while Hoffecker's hometown paper, the El Paso Herald, stated that "President Hoover administered a figurative slap to Arizona, New Mexico and El Paso". On April 26, The Washington Post published an editorial in favor of Hoover's action:
> It is to be hoped that the practice of minting special coins for occasions of this kind is definitely at an end ... The issuance of commemorative coins has become a nuisance ... Yet if Congress thus favors one celebration, how can it refuse another? The only sound policy is uniform coinage for use only as a means of exchange. The clamor for souvenir coins arises from the fact that they may be sold at a profit. But Congress has authorized the minting of so many souvenir coins recently that profits have fallen off. ... President Hoover's judgment ... meets with general approval.
## Aftermath
According to David Bullowa in his 1938 volume on commemoratives, "with the vetoing of the Gadsden Purchase half-dollar proposal ... a statement was issued that commemorative coins were superfluous and that their purpose might be as well accomplished with officially authorized medals. These pieces, if struck, would adequately serve collectors, it was thought, and such pieces would not tend to 'confuse the coinage'."
No commemorative coins were authorized or struck during the remainder of Hoover's presidency. Following the inauguration of Hoover's successor, Franklin D. Roosevelt, striking and authorization of commemoratives resumed. By 1935, Roosevelt had warned Congress against issuing large numbers of commemoratives and repeated this in 1937, both times citing Hoover's veto of the Gadsden Purchase bill, and urging the issuance of medals instead. He vetoed a coin bill in 1938 for the 400th anniversary of Coronado's expedition. In 1946, President Harry Truman signed the first authorizations for commemorative coins since 1937, but cited the Treasury's position and stated he would not look with favor on further issues. Citing Hoover's veto, he vetoed a bill for a commemorative half dollar for the centennial of Wisconsin statehood on July 31, 1947. Similar arguments were made by the Treasury under the presidency of Dwight D. Eisenhower, who vetoed three commemorative coin bills in 1954. No commemoratives were issued thereafter until the department changed its position in 1981, as the Washington 250th Anniversary half dollar was being considered; it was issued in 1982. The government sold the new commemoratives to collectors and dealers, rather than having sales conducted by a designated group.
Although Hoffecker was unsuccessful with the Gadsden Purchase piece, he tried again in 1935. He was the designer and distributor of the Old Spanish Trail half dollar and was also the distributor of the Elgin, Illinois, Centennial half dollar (1936). In 1936, Hoffecker testified before Congress on the abuses committed by the distributors of commemorative coins. From 1939 to 1941, he served as president of the American Numismatic Association. He died in 1955.
## See also
- Louisiana Purchase Sesquicentennial half dollar, vetoed by Eisenhower in 1954.
# Northampton War Memorial
Northampton War Memorial, officially the Town and County War Memorial, is a First World War memorial on Wood Hill in the centre of Northampton, the county town of Northamptonshire, in central England. Designed by architect Sir Edwin Lutyens, it is a Stone of Remembrance flanked by twin obelisks draped with painted stone flags standing in a small garden in what was once part of the churchyard of All Saints' Church.
Discussion of a war memorial for Northampton began shortly after the armistice in 1918, and from July 1919 a temporary wooden cenotaph stood on Abington Street in the town centre. The Northamptonshire War Memorial Committee commissioned Lutyens to design a permanent memorial. The monument's design was completed and approved quickly, but its installation was delayed by six years until the site could be purchased from the Church of England, which required a faculty from the local diocese. The memorial was finally unveiled on 11 November 1926 after a service and a parade including local schoolchildren and civic leaders.
Northampton's memorial is one of the more elaborate town memorials in England. It uses three features characteristic of Lutyens's war memorials: a pair of obelisks, the Stone of Remembrance (which Lutyens designed for the Imperial War Graves Commission), and painted stone flags on the obelisks, which were rejected for his Cenotaph in London but feature on several of his other memorials. Today it is a Grade I listed building; it was upgraded from Grade II in 2015 when Lutyens's war memorials were declared a "national collection" and all were granted listed building status or had their listing renewed.
## Background
The First World War produced casualties on an unprecedented scale. Men from every town and village in Northamptonshire died in the war, with the exception of two thankful villages (East Carlton in the north of the county and Woodend in the south). In the war's aftermath, thousands of memorials were built across Britain.
Among the most prominent designers of war memorials was architect Sir Edwin Lutyens, who was described by Historic England as "the leading English architect of his generation". Prior to the First World War, Lutyens established his reputation designing country houses for wealthy patrons; in the war's aftermath, he devoted much of his time to memorialising the casualties. He served as one of the three principal architects to the Imperial War Graves Commission (IWGC; later the Commonwealth War Graves Commission, CWGC) and designed numerous war memorials for towns and villages across Britain, as well as several elsewhere in the Commonwealth. He was responsible for The Cenotaph on Whitehall in London, which became the focal point of the national Remembrance Sunday commemorations; the Thiepval Memorial to the Missing, the largest British war memorial anywhere in the world; and the Stone of Remembrance (also known as the Great War Stone), which appears in all large Commonwealth War Graves Commission cemeteries and forms part of several of his civic memorials, including Northampton's.
## Commissioning
Northampton's first war memorial was a temporary cenotaph built from wood and plaster which stood in Abington Street from July 1919 as a placeholder until a more permanent memorial could be erected; the temporary cenotaph was the focal point for remembrance services until the installation of the permanent memorial. As in several towns and cities, there were discussions within the town as to whether its war memorial should serve a purely monumental purpose or some sort of community function. Suggestions included renovating civic buildings, a new 2,000-seat concert hall, and a classical-style arch on Guildhall road. The Northamptonshire War Memorial Committee, chaired by local landowner Lord Lilford, eventually commissioned Lutyens to design a purely commemorative monument, and selected a site in part of the churchyard of All Saints' Church. The memorial was funded by public donations, including a donation of £50 from Lord Lilford.
Lutyens's designs were complete by 1920 and approved in November of that year, but as the chosen site was part of the churchyard, and several graves would have to be relocated to accommodate the memorial, the war memorial committee had to seek a faculty from the Diocese of Peterborough (the diocese in whose jurisdiction Northampton falls), which delayed the installation. The Reverend Geoffrey Warden, vicar of All Saints' Church, submitted the application in 1922, supported by two church wardens and two parishioners. Construction work commenced only in 1926, six years after the completion of the designs. By July 1926, the Northampton Independent reported that the obelisks had been carved and were waiting for the flags to be painted before they could be erected.
## Design
Northampton's is a comparatively elaborate war memorial, especially for a town rather than a city. It consists of a Stone of Remembrance flanked by tall twin obelisks, each adorned with a pair of painted stone flags. Its use of obelisks, a Stone of Remembrance, and painted flags—all features characteristic of Lutyens's war memorials—makes it particularly significant among his works.
Each obelisk sits on a tall, four-tiered rectangular column which itself stands on a wider, undercut square plinth. The obelisks and their supporting columns are ornately decorated. A narrow cross is set into the obelisks while the town's coat of arms is moulded onto the columns; the columns contain deep decorative niches, forming an arch shape beneath the obelisks. Obelisks feature in several of Lutyens's war memorials, though only Northampton's and Manchester's use a pair of flanking obelisks (in Manchester's case, the obelisks flank a cenotaph, rather than a stone); both are particularly fine designs in which Lutyens uses the obelisks with "dignity and simple dramatic effect", according to historian Richard Barnes. The obelisks are inscribed with the dates of the First and Second World Wars in Roman numerals (the inscriptions relating to the Second World War were added at a later date).
Two stone flags—painted in the form of the Union Flag and the flags of the Royal Navy (the White Ensign), Merchant Navy (the Red Ensign), and Royal Air Force (the RAF Ensign)—appear to hang from each obelisk, draping around the cornices; each flag is surmounted by a gold wreath. Lutyens first proposed stone flags for use on the Cenotaph on Whitehall, but the proposal was rejected in favour of fabric flags (though they were used on several other memorials, including Rochdale Cenotaph and the Arch of Remembrance in Leicester). The Stone of Remembrance is a monolith (carved from a single piece of rock), 12 feet (3.7 metres) long, devoid of any decoration beyond its inscriptions, and so subtly curved that the curvature is barely visible to the naked eye. Unusually, the stone is inscribed on both faces. The east face bears the inscription Lutyens chose for all his Stones of Remembrance: "THEIR NAME LIVETH / FOR EVERMORE", as suggested by Rudyard Kipling, truncated from a verse in the Book of Ecclesiasticus; the west face is inscribed "THE SOULS OF THE RIGHTEOUS / ARE IN THE HANDS OF GOD", from The Wisdom of Solomon.
The whole memorial is raised on a stone platform that forms a narrow path between the stone and the obelisks. The Stone of Remembrance is further raised on three stone steps. The memorial stands in a small garden now just outside the All Saints' churchyard, defined by a low stone wall to the front and a yew hedge to the rear with ornamental gateways to either side. The gates are of cast iron and supported by large stone piers with urn finials. The wall is inscribed: "TO THE MEMORY OF ALL THOSE OF THIS TOWN AND COUNTY WHO SERVED AND DIED IN THE GREAT WAR".
## History
The memorial was eventually unveiled on 11 November (Armistice Day) 1926, as part of a large ecumenical service which included 5,000 local schoolchildren. Attendance was so great that the service could not be accommodated in the church and was instead held in the market square. At the conclusion of the service, the crowd proceeded to the new memorial; the parade was led by veterans of the Battle of Mons and included other military representatives, nurses from Northampton General Hospital, and the town's civic leaders. Once the procession reached the memorial, the unveiling was performed by General Henry Horne, 1st Baron Horne, and the memorial was dedicated by the Right Reverend Norman Lang, Suffragan Bishop of Leicester. Horne committed the memorial to the care of the town's mayor and Northamptonshire County Council, and in his speech referred to Northampton's role as the county town; he observed that communities across Northamptonshire would be erecting their own memorials, but felt that it was "right and fitting that there should stand in the county town some visible monument, some tangible memorial appealing to the heart through the eye, of the bravery, devotion to duty, and self-sacrifice of the men of Northamptonshire". The Prince of Wales laid a wreath at the memorial during a ceremony on 7 July 1927, the year after the unveiling.
The Town and County War Memorial does not contain a list of casualties. The local branch of the Royal British Legion launched a campaign for a memorial dedicated to the town and containing a list of names. A garden of remembrance was built in Abington Square, the location of the original temporary cenotaph, and unveiled by Major General Sir John Brown in 1937; the names of the fallen were inscribed on the garden walls. The memorial to Edgar Mobbs—a professional rugby player from Northampton who was killed in the First World War in 1917—was moved into the garden.
The memorial was designated a Grade II\* listed building on 22 January 1976. In November 2015, as part of commemorations for the centenary of the First World War, Lutyens's war memorials were recognised as a "national collection". All 44 of his free-standing memorials in England were listed or had their listing status reviewed and their National Heritage List for England list entries updated and expanded. As part of this process, Northampton War Memorial was upgraded to Grade I.
## See also
- Lancashire Fusiliers War Memorial, a Lutyens memorial featuring a similar obelisk in Bury
- Grade I listed buildings in Northamptonshire
- Grade I listed war memorials in England
# Cyril Bassett
Cyril Royston Guyton Bassett, VC (3 January 1892 – 9 January 1983) was a New Zealand recipient of the Victoria Cross (VC), the highest award for gallantry "in the face of the enemy" that could be awarded to British and Empire forces at the time. He was the only soldier serving with the New Zealand Expeditionary Force (NZEF) to be awarded the VC in the Gallipoli Campaign of the First World War.
Born in Auckland, Bassett was a bank worker when the First World War began. A member of New Zealand's Territorial Force, he volunteered for service abroad with NZEF and was posted to the New Zealand Divisional Signal Company as a sapper. He saw action on the opening day of the Gallipoli Campaign, and during the Battle of Chunuk Bair he performed the actions that led to his award of the VC. Medically evacuated due to sickness shortly after the battle, he later served on the Western Front and finished the war as a second lieutenant. Bassett returned to the banking profession but was recalled to active duty during the Second World War. He served on the Home Front and by the time he was taken off active duty in December 1943, he had been promoted to the rank of lieutenant colonel and was commander of signals in the Northern Military District. When he retired from his banking career he became a justice of the peace in Devonport. He died in 1983 at the age of 91.
## Early life
Cyril Royston Guyton Bassett was born on 3 January 1892 in the Auckland suburb of Mount Eden, to a printer, Frederick Bassett, and his wife Harriet, née Powley. Bassett attended Auckland Grammar School and then Auckland Technical College. After completing his formal education in 1908, he worked as a clerk for the National Bank of New Zealand. In 1909, he joined what later became the Territorial Force, the part-time military reserve, and was posted to the Auckland College Rifles. Two years later he transferred to the Auckland Divisional Signal Company.
## First World War
When the First World War broke out in the summer of 1914, it was Bassett's intention to join the Royal Navy, but his mother, whose family had a history of service in the British Army, convinced him to enlist in the New Zealand Military Forces. Bassett was not particularly tall and was initially rejected on the grounds of height. He persisted with his attempt to enlist, and joined the New Zealand Expeditionary Force (NZEF) as a sapper in the Corps of New Zealand Engineers, assigned to the New Zealand Divisional Signal Company.
### Gallipoli
Bassett embarked with the main body of the NZEF for the Middle East in October 1914. Initially based in Egypt, after a period of training, he landed at ANZAC Cove on 25 April 1915, the opening day of the Gallipoli campaign. Along with the other signallers of his unit, he was immediately set to work laying communication lines to the headquarters of the New Zealand and Australian Division. In early May, he, along with two other signallers, was noted in consideration for a gallantry award for their efforts in laying telephone wires while under heavy fire.
Later in the campaign, Bassett was promoted to corporal. In August 1915, a series of offensives against Turkish positions along the Gallipoli front was planned to break the stalemate that had developed since the initial landing. On 7 August, the New Zealand Infantry Brigade attacked Chunuk Bair, a prominent hill overlooking the battlefield. The battle lasted for three days. Chunuk Bair was captured by the brigade's Wellington Infantry Battalion on the second day, during which Bassett, in command of a section of five other signallers of his unit, laid down and maintained telephone lines between brigade headquarters and the front lines. Working on the exposed slopes leading up to Chunuk Bair, he braved continuous gunfire during this time armed with only a revolver and a bayonet. A bullet struck his boot and two more passed through the fabric of his tunic during the fighting, but he was not wounded.
After the battle, Bassett's name, along with those of the other five signallers of his section, was collected by Major Arthur Temperley of brigade headquarters, who nominated Bassett to receive the Victoria Cross (VC). A few days later, Bassett was evacuated from Gallipoli due to poor health. Suffering from dysentery, he spent several months recuperating at a hospital in Leicester, and it was there that he was advised of his VC award. Instituted in 1856, the VC is the highest gallantry award that can be bestowed on a soldier of the British Empire. The citation read:
> No. 4/515 Corporal Cyril Royston Guyton Bassett, New Zealand Divisional Signal Company. For most conspicuous bravery and devotion to duty on the Chunuk Bair ridge in the Gallipoli Peninsula on 7th August, 1915. After the New Zealand Infantry Brigade had attacked and established itself on the ridge, Corporal Bassett, in full daylight and under a continuous and heavy fire, succeeded in laying a telephone line from the old position to the new one on Chunuk Bair. He has subsequently been brought to notice for further excellent and most gallant work connected with the repair of telephone lines by day and night under heavy fire.
The citation incorrectly refers to Bassett's actions on 7 August; it was not until the following day that the Wellington Infantry Battalion captured Chunuk Bair. His VC was the first to be awarded to a soldier of the NZEF and he was the only one to receive it for actions during the Gallipoli Campaign. King George V presented him the VC at an investiture held at Buckingham Palace on 3 February 1916. Bassett later remarked of the VC action, "I reckon there must be some guardian angel looking after me, especially as one man was shot dead in front of me and another wounded just behind."
### Western Front
In June 1916, Bassett rejoined his unit, by then serving on the Western Front in France as part of the New Zealand Division. Later that year, he participated in the Battle of the Somme, and in 1917 he was commissioned as a second lieutenant. He was wounded twice while on the Western Front; the first occasion was in October 1917, and the second was during the German spring offensive in March 1918, when an artillery barrage destroyed the headquarters of the New Zealand Rifle Brigade, where he was the signals officer. The same barrage killed the brigade's commander, Brigadier-General Harry Fulton. On extracting himself from the rubble of the headquarters, Bassett immediately set about re-establishing communications, an action for which he was recommended for the Military Cross, although the award was not granted. After the war ended, he returned to New Zealand in late 1918 as the New Zealand Division began demobilising, and was formally discharged from the NZEF in 1919.
## Interwar period and Second World War
Bassett returned to his banking career after the war, managing branches of the National Bank in Auckland and later in Paeroa. He retained a link to the military, rejoining the Territorial Force shortly after his discharge from the NZEF but was placed on the retired list of officers in 1929. Three years previously, he had married Ruth Louise Grant in St David's Church, Auckland; the couple had two children. By 1939 he was the manager of the Auckland Town Hall branch of the National Bank.
Called up for the National Military Reserve as a result of the outbreak of the Second World War, Bassett was placed on active duty in 1941 as a captain in the Royal New Zealand Corps of Signals (RNZSigs). He was not required to serve overseas and instead he worked in signals on the Home Front in New Zealand. Promoted to major in February 1942, his active war service ended in December 1943. By then he had achieved the rank of lieutenant colonel, and was commander of signals in the Northern Military District.
## Later life and legacy
Bassett returned to the National Military Reserve from which he retired in 1948. As a civilian, he resumed his banking profession. He retired in 1952 but remained active in the community of Devonport, on Auckland's North Shore, as a justice of the peace. In 1953, he was awarded the Queen Elizabeth II Coronation Medal. He died at Stanley Bay, in Auckland, on 9 January 1983, shortly after his 91st birthday; his ashes were buried at North Shore Memorial Park. He was survived by his wife Ruth and their two daughters. His VC, gifted to the RNZSigs upon his death, is displayed at the Auckland War Memorial Museum. Several years earlier, Bassett had planted a pine tree, reportedly cultivated from a seedling taken from the area of the Battle of Lone Pine at Gallipoli, in front of the museum as part of an Anzac Day service.
According to his daughter, Bassett rarely spoke about his achievements, and she did not learn of her father's award until she studied Gallipoli at primary school. He was modest and expressed embarrassment at being the only New Zealand VC recipient of the Gallipoli Campaign. He commented that "when I got my medal I was disappointed to find I was the only New Zealander to get one at Gallipoli, because hundreds of Victoria Crosses should have been awarded there."
Bassett remains the only New Zealand signaller to have been awarded the VC and was a lifetime member of the Corps of Signals Association. In recognition of Bassett's rank at the time of his award, the Bassett Memorial Trophy is awarded annually to the most outstanding corporal in the RNZSigs. The trophy is a statue of Bassett on Chunuk Bair. An annual speech competition, run by the Royal New Zealand Returned and Services' Association and sponsored by the ANZ Bank, formerly the National Bank, for secondary school students is named for him. The winner travels to Gallipoli to attend the ANZAC Day commemorations. |