iPod defeats Blue Screen of Death
Chris Mellor (Techworld.com) on 09 March, 2005 10:00
If your laptop crashes, what's the best way to recover the data on it? Well, one IBM engineer uses his iPod.

At Big Blue's USA PartnerWorld conference, Steve Welch used software on the iPod to save a crashed ThinkPad. The software is One-touch Rescue & Recovery On Linux and is not yet an announced IBM product. Welch said the iPod-based software could rebuild a hard drive in an hour or so and provide instant access to data such as e-mail and Lotus Notes, saying: "This is music to my ears."

Actually, the facility is old hat; what's new is the iPod and Linux. IBM PCs and notebooks have come with software from Xpoint Technologies since 2003, and the software can be licensed for other PCs. IBM calls the technology IBM Rescue and Recovery. It is part of its ThinkVantage program and provides users with do-it-yourself access to the notebook or PC even if the system can't boot.
In that circumstance, the user accesses the facility by pressing the blue Access key button on supported IBM systems or the F11 key on other PCs running the software. This operates at the BIOS level and enables the Xpoint software to take over the system. It can recover files, folders or even an entire system image, if one has been previously saved. The software includes diagnostic tools and a view of basic system information such as the BIOS set-up. It does not work if the hard drive itself has crashed, though.

The Xpoint Technologies software backs up data on the PC's hard drive to a hidden partition, not accessible to the operating system, or to an external device. The first backup is a full one, with subsequent backups covering only changed data; these are called differential backups. This data can be accessed by the Rescue and Recovery software, which is independent of Windows; the initial image is taken in DOS during the PC's setup.

Interestingly, Frank Wang, Xpoint's president and chief executive officer, was a member of the original technology team that developed the first IBM PC.
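The full-plus-differential scheme described above is straightforward in outline. The Python sketch below is a generic illustration of the idea, not Xpoint's implementation: the function names and the JSON manifest format are invented for this example. It hashes files against a manifest recorded at the last full backup and copies only those that have changed.

```python
import hashlib, json, shutil
from pathlib import Path

def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def differential_backup(source: Path, dest: Path, manifest_file: Path) -> None:
    """Copy only files whose contents differ from the full-backup manifest."""
    manifest = json.loads(manifest_file.read_text()) if manifest_file.exists() else {}
    for path in source.rglob("*"):
        if path.is_file():
            rel = path.relative_to(source)
            if manifest.get(str(rel)) != file_hash(path):  # new or changed since the full backup
                target = dest / rel
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(path, target)
```

A real recovery product would also track deletions and store the differential sets per date, but the core saving is the same: unchanged files are never copied twice.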
The QWERTY keyboard comes to prepaid, but lack of 3G dampens the occasion.
Pros: Design, QWERTY keyboard, display, value for money
Cons: Proprietary headphone/charging jack, lack of 3G connectivity
Verdict: Targeted at teenagers, the KS360 is a commendable effort, but its lack of 3G connectivity is a black mark considering its features list. The QWERTY keyboard is excellent for SMS use, though, and the price is definitely appealing.
One of the first QWERTY keyboard phones to be available on prepaid, along with the Centro, LG's KS360 is a commendable slider that is only really let down by its lack of 3G connectivity.

The KS360 feels surprisingly solid for a phone that commands a price of just $199. The spring-operated slider is sturdy and firm, while the rounded edges make this handset comfortable to hold in both landscape and portrait orientations. The backlit QWERTY keyboard is obviously a highlight: it's a feature usually reserved for much more expensive smartphones. The KS360's keypad is comfortable, provides excellent tactility and is well designed. The circular buttons are a theme throughout the whole device, extending to the five-way navigational pad and selection buttons.

When slid open, the KS360 automatically rotates the screen, but there is no accelerometer. Annoyingly, when in landscape mode, the two selection buttons don’t match up to their intended on-screen menu text, so you have to reach away from the keyboard to press them. The display itself is excellent considering the asking price of this handset, though sunlight glare is an issue.

The KS360 is exclusively offered through Boost Mobile, which is owned and operated by Optus. Unfortunately, despite this handset possessing a number of social-networking features, including a preinstalled MySpace Mobile application, there is no 3G connectivity. Boost is quick to highlight the KS360's features, which include a Web browser and e-mail capabilities, but at GPRS speeds you'll quickly become frustrated with long loading times.

The QWERTY keyboard is still handy, particularly for text messaging. Also bundled are a number of Boost games as well as multimedia features, including an MP3 player, a 2-megapixel camera that doubles as a video recorder, an FM radio and a voice recorder. Where LG has yet again failed is with its insistence on a proprietary headphone and charging port; this means you are stuck using the included headphones, which are below average in quality. The lack of a 3.5mm headphone jack has been a gripe with almost all LG phones and shows no sign of being rectified.
For extra storage, the KS360 includes a microSD card slot, located on the left side of the handset. There is no card included in the sales package, but microSD card prices have fallen dramatically in recent months, so buying one shouldn't add too much to your overall purchase.
February 16, 2017: NBAA Speaks Up About Santa Monica Settlement. Business aviation has a surprise in store for FAA, Santa Monica. After the FAA's surprise settlement with the City of Santa Monica last month, which seemed to many in aviation like a betrayal by the FAA, those hoping to keep the historic Santa Monica Municipal Airport open beyond 2028 finally got some good news. On February 13, the National Business Aviation Association (NBAA) announced that they…
February 16, 2017: Drones To Carry Passengers This July. Would you be brave enough to try? Think traveling by drone is a thing of the future? Well, the future isn't all that far away. Dubai's Roads and Transport Authority has announced that the Ehang 184 Autonomous Aerial Vehicle (AAV) will be used to ferry passengers around the city as early as July of this year. The single-seat Ehang 184 flies without…
February 16, 2017: How Do Non-Pilots Feel About ATC Privatization? A new survey suggests that it's not just GA pilots who aren't interested in privatizing ATC. As we've written about extensively on P&P, air traffic control privatization, which has been championed by Pennsylvania Representative Bill Shuster, is a deal that gives the airlines everything they want while effectively raising costs to general aviation and cutting us out of the decision-making process. In short, it's a disastrous plan. But how do non-pilots…
February 8, 2017: Crowdsourced Weather On The Horizon. New system will gather and distribute onboard weather radar data from aircraft in flight. Installed airborne weather radar has its strengths and weaknesses. On one hand, you get more direct and immediate information about what you're flying into, which is great for tactical maneuvering. On the other hand, the distance at which the radar can detect cells is limited and what's behind a big cell in front of you…
February 7, 2017: Will The 2017 Collier Trophy Be Another Space Case? There are some intriguing nominees, including a group of real live airplane heroes. For the past two years the Collier Trophy, the biggest prize in all of aviation, has gone to space-based endeavors. Will 2017 see an aviation accomplishment win the hardware? Here are the nominees. First is Blue Origin for New Shepard. New Shepard is this year's only nomination for a space-bound achievement. It is a suborbital,…
February 7, 2017: Up In Arms Over The Price Of 100LL. Are we seeing the beginning of a movement to do something about unreasonably high fuel prices? Long gone are the days when 100LL was a buck a gallon pretty much everywhere you went, give or take a few cents. Not only are we paying a lot more in general for 100LL, but there are FBOs scattered around the country that are charging double the average price for the fuel. According to…
February 2, 2017: New Administration Freezes Pending Regs. How this affects the BasicMed and Part 23 reforms. After all the work to make it happen on time, there has been concern over the past few days that the implementation of the third class medical reform rule might be pushed back a few months. This comes from an order issued by the new presidential administration to freeze new and pending regulations. As part…
February 2, 2017: Kids' First Flight Video Goes Viral. Home movie reminds us all what it was like that very first flight. The video we shared with you all last week that shows the dramatic reaction of Jace and Zoriana, the kids in the back seat of the plane flown by Jeff Archuletta, has taken off, becoming planeandpilotmag.com's top ranking video and our most viewed and shared post. If you haven't seen it yet, you've got to check…
February 1, 2017: More From Maule. A few new additions to the M-9 line. Maule is once again expanding their list of available options. In addition to the new M-4 models we wrote about last week, they have just received FAA approval for two new versions of their M-9 four- (or five minus baggage) seater. The original M-9-235 Fuel Injected (Maule's name for the model) was approved in 2012. The…
January 25, 2017: Maule M-4 180V. This capable Part 23 plane does a lot for a little. Maule has a couple of new airplanes for the new year – well, two slightly different models on the same airframe, anyway. The M-4 180V S2 and S4 are typical Maule: STOL performance, metal-winged tube-and-fabric construction, and an emphasis on cargo space. The big difference between the two is the seating. The S2 model is a…
January 25, 2017: What's NEXT? ADS-B is coming. Are we ALL ready? We've written a lot about what's going on with ADS-B requirements for GA aircraft – who needs them, where to get them, and whether or not it will be worth all of the hassle. We haven't talked much about what's happening with getting the surveillance network itself up and running. Right now, it's looking…
January 25, 2017: Tecnam Set To Show Off New Aircraft. The P2012 Traveller will be meeting the public this April. Tecnam's new piston twin prototype, the P2012 Traveller, is slated to make its debut at this year's AERO Friedrichshafen Global General Aviation Expo. The event will be held April 5-8 in Friedrichshafen, Germany. The P2012 is an 11-seat, fixed-gear passenger/cargo carrier with a max takeoff weight of 7,937 pounds. It is powered by two 375hp…
January 19, 2017: Future Pilots Take To The Skies! Two young adventurers go flying. You're never too young to start flying and never too old to remember what it was like when it was all brand-new. Future pilots Jace and Zoriana took a trip out of Gillespie Field last Tuesday and were kind enough to share their adventure with us! Check out a video of their flight here. To get more aviation news…
January 18, 2017: What's Going On At Eclipse Aerospace? ONE Aviation pins their hopes on Project Canada. It looks like Eclipse Aerospace might still be struggling. For the second time in three years, workers at the Albuquerque, NM manufacturing plant have been laid off. Eclipse hasn't said how many people they let go or when those people might expect to be brought back. In the meantime, the plant is still operating, producing…
January 18, 2017: American Jet Pioneer Goes West. Joseph Sorota wasn't well known, but his contributions to the jet age were great. Joseph Sorota, GE engineer and member of a top secret World War II engineering team called the "Hush-Hush Boys," passed away on January 7th at the age of 96. As WWII was getting underway in Europe, Sorota studied engineering at Northeastern University and began working in a GE factory in Lynn, Massachusetts. In 1941, Sorota…
January 18, 2017: Guess Where Piper Sold 50 Archers. Okay, it might not be that hard to guess. China's population of GA pilots is growing. Quickly. We've written several times about how Chinese airlines are having to bring in foreign pilots to get crews with enough flight experience. With expanding flight opportunities and the Civil Aviation Authority of China working to open up General Aviation, that may not be the case for much…
January 18, 2017: Scariest-Sounding Airline Flight…Ever. Would you take Flight 666 to HEL? How about if it was on Friday the 13th? Last Friday, January 13th, Finnair flight AY666 flew from CPH (Copenhagen) to HEL (Helsinki). The flight landed safely. No incidents, accidents, or unusual happenings were reported… So why is it news? Superstition is a funny thing. It changes with each country and culture. Even so, the date, flight number, and destination of Finnair's trip hit…
January 11, 2017: Plane Sharing Takes Its Case To The Supreme Court. Internet startup Flytenow got shut down by the FAA. Did they fare any better before the highest court in the land? We all know that private pilots can't fly for compensation or hire. There are a few exceptions – some charity flights, search and rescue, and sharing flight costs with passengers while paying at least a pro rata share. The caveat with the last is that according to the rulings of the administrative law judges of…
Report: Apple Falsified Jobs’ Stock Options Authorization
Date: Thursday, December 28th, 2006, 13:09
Apple Computer’s stock fell $1.64 per share, or two percent, following a report that indicated that Apple CEO Steve Jobs, who returned to the company in 1997, was awarded 7.5 million stock options in 2001 without proper authorization from the company’s board of directors. The revelation was made public via a recently released Financial Times article.
The article reports that records logging a full board meeting, which is required to approve stock option grants, were later falsified. These records are now under review by the Securities and Exchange Commission as it decides which investigative avenues to pursue (individual or company-wide), and Apple is expected to discuss the situation, along with details of an additional investigation into stock option irregularities and backdating within the company, at the annual meeting on Friday.
Additional questions and pressures have arisen given that Jobs has sought legal counsel outside the company, despite having been above the fray of the investigation since it began this fall. To date, Jobs has claimed he was “aware” of the backdating “in a few instances”, but said he never benefited from these actions or was aware of the accounting implications.
In October of 2001, Jobs’ stock options were offered at an exercise price of $18.30 per share, while the alleged board authorization occurred near the end of 2001. If so, the options weren’t properly authorized and had been backdated to maximize their value.
Jobs would later surrender the stock options before exercising them, thereby not showing any actual gain on the transaction. The Apple CEO was later given a grant of restricted stock by the company in lieu of the options; the value of that grant may have been calculated on the basis of the backdated stock options, and it is a target of the investigation.
Apple has reportedly refused to comment on the issue, but a spokesman has said the company handed the findings of its own internal investigation to the SEC. The company has stated that its investigation found “no misconduct by any member of Apple’s current management team”, although two executives, Nancy Heinen, former senior vice president, and Fred Anderson, former chief financial officer, both resigned this year.
Anderson may be under investigation for a different set of actions, as he was not a member of the board of directors at the time of the October 2001 decision that is pertinent to Jobs’ reported role in the case.
If you have any ideas or comments, let us know.
2017-09/1579/en_head.json.gz/21909 | EmailA to ZContactsSite MapNewsMultimediaSearch Topics and PeopleShortcuts Other News Emergency Info Media Central Event Streaming Public Events Calendar Faculty News Student Publications The Daily Princetonian Campus Media Local News World News About PrincetonAcademicsAdmission & AidArtsInternationalLibraryResearch Administration & ServicesCampus LifeVisiting CampusStudentsFaculty & StaffAlumniParents & FamiliesUndergraduate ApplicantsGraduate School ApplicantsMobile Princeton Web AppMobile Princeton App for AndroidMobile Princeton App for iOSConnect & SubscribeHome » News » Archive » Expectation of extraterrestrial life built more on optimism than evidence, study findsNews at PrincetonSaturday, Feb. 18, 2017News StoriesFAQsEvents & CalendarsMultimediaFor News MediaShare Your NewsCurrent StoriesFeaturesScience & TechPeopleEmergency AlertsUniversity BulletinArchive Web StoriesTo News Archive|« Previous by Date|Next by Date »Expectation of extraterrestrial life built more on optimism than evidence, study finds
Posted April 26, 2012, 9:00 a.m., by Morgan Kelly
Recent discoveries of planets similar to Earth in size and proximity to the planets' respective suns have sparked scientific and public excitement about the possibility of also finding Earth-like life on those worlds.
But Princeton University researchers have found that the expectation that life — from bacteria to sentient beings — has or will develop on other planets as on Earth might be based more on optimism than scientific evidence.
Princeton astrophysical sciences professor Edwin Turner and lead author David Spiegel, a former Princeton postdoctoral researcher, analyzed what is known about the likelihood of life on other planets in an effort to separate the facts from the mere expectation that life exists outside of Earth. The researchers used a Bayesian analysis — which weighs how much of a scientific conclusion stems from actual data and how much comes from the prior assumptions of the scientist — to determine the probability of extraterrestrial life once the influence of these presumptions is minimized.
Turner and Spiegel, who is now at the Institute for Advanced Study, reported in the Proceedings of the National Academy of Sciences that the idea that life has or could arise in an Earth-like environment has only a small amount of supporting evidence, most of it extrapolated from what is known about abiogenesis, or the emergence of life, on early Earth. Instead, their analysis showed that the expectations of life cropping up on exoplanets — those found outside Earth's solar system — are largely based on the assumption that it would or will happen under the same conditions that allowed life to flourish on this planet.
In fact, the researchers conclude, the current knowledge about life on other planets suggests that it's very possible that Earth is a cosmic aberration where life took shape unusually fast. If so, then the chances of the average terrestrial planet hosting life would be low.
"Fossil evidence suggests that life began very early in Earth's history and that has led people to determine that life might be quite common in the universe because it happened so quickly here, but the knowledge about life on Earth simply doesn't reveal much about the actual probability of life on other planets," Turner said.
"Information about that probability comes largely from the assumptions scientists have going in, and some of the most optimistic conclusions have been based almost entirely on those assumptions," he said.
Turner and Spiegel used Bayes' theorem to assign a sliding mathematical weight to the prior assumption that life exists on other planets. The "value" of that assumption was used to determine the probability of abiogenesis, in this case defined as the average number of times that life arises every billion years on an Earth-like planet. Turner and Spiegel found that as the influence of the assumption increased, the perceived likelihood of life existing also rose, even as the basic scientific data remained the same.
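The effect Turner describes is easy to reproduce with a toy calculation. The Python sketch below is not the authors' actual model; it is a minimal two-hypothesis application of Bayes' theorem, with made-up likelihood values, showing how the same single datum (life arose early on Earth) yields very different posteriors depending on the prior one starts with.

```python
def posterior_common(prior_common: float,
                     p_early_if_common: float = 0.9,
                     p_early_if_rare: float = 0.2) -> float:
    """Posterior probability that abiogenesis is common, given the
    observation that life appeared early on Earth (Bayes' theorem,
    two hypotheses). The likelihood values are illustrative placeholders."""
    prior_rare = 1.0 - prior_common
    evidence = p_early_if_common * prior_common + p_early_if_rare * prior_rare
    return p_early_if_common * prior_common / evidence

for prior in (0.1, 0.5, 0.9):  # same data, different prior assumptions
    print(f"prior {prior:.1f} -> posterior {posterior_common(prior):.2f}")
```

Running this gives posteriors of roughly 0.33, 0.82 and 0.98 for priors of 0.1, 0.5 and 0.9: the single data point shifts belief only modestly, so the conclusion is dominated by the assumption going in, which is precisely the paper's point.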
"If scientists start out assuming that the chances of life existing on another planet as it does on Earth are large, then their results will be presented in a way that supports that likelihood," Turner said. "Our work is not a judgment, but an analysis of existing data that suggests the debate about the existence of life on other planets is framed largely by the prior assumptions of the participants."
Joshua Winn, an associate professor of physics at the Massachusetts Institute of Technology, said that Turner and Spiegel cast convincing doubt on a prominent basis for expecting extraterrestrial life. Winn, who focuses his research on the properties of exoplanets, is familiar with the research but had no role in it.
"There is a commonly heard argument that life must be common or else it would not have arisen so quickly after the surface of the Earth cooled," Winn said. "This argument seems persuasive on its face, but Spiegel and Turner have shown it doesn't stand up to a rigorous statistical examination — with a sample of only one life-bearing planet, one cannot even get a ballpark estimate of the abundance of life in the universe.
"I also have thought that the relatively early emergence of life on Earth gave reasons to be optimistic about the search for life elsewhere," Winn said. "Now I'm not so sure, though I think scientists should still search for life on other planets to the extent we can."
Promising planetary finds

Deep-space satellites and telescope projects have recently identified various planets that resemble Earth in their size and composition, and are within their star's habitable zone, the optimal distance for having liquid water.
Of particular excitement have been the discoveries of NASA's Kepler Space Telescope, a satellite built to find Earth-like planets around other stars. In December 2011, NASA announced the first observation of Kepler-22b, a planet 600 light years from Earth and the first found within the habitable zone of a Sun-like star. Weeks later, NASA reported Kepler-20e and -20f, the first Earth-sized planets found orbiting a Sun-like star. In April 2012, NASA astronomers predicted that the success of Kepler could mean that an "alien Earth" could be found by 2014 — and on it could dwell similar life.
While these observations tend to stoke the expectation of finding Earth-like life, they do not actually provide evidence that it does or does not exist, Spiegel explained. Instead, these planets have our knowledge of life on Earth projected onto them, he said.
Yet, when what is known about life on Earth is taken away, there is no accurate sense of how probable abiogenesis is on any given planet, Spiegel said. It was this "prior ignorance," or lack of expectations, that he and Turner wanted to account for in their analysis, he said.
"When we use a mathematical prior that truly represents prior ignorance, the data of early life on Earth becomes ambiguous," Spiegel said.
"Our analysis suggests that abiogenesis could be a rather rapid and probable process for other worlds, but it also cannot rule out at high confidence that abiogenesis is a rare, improbable event," Spiegel said. "We really have no idea, even to within orders of magnitude, how probable abiogenesis is, and we show that no evidence exists to substantially change that."
Considering the source

Spiegel and Turner also propose that once this planet's history is considered, the emergence of life on Earth might be so distinct that it is a poor barometer of how it occurred elsewhere, regardless of the likelihood that such life exists.
In a philosophical turn, they suggest that because humans are the ones wondering about the emergence of life, it is possible that we must be on a planet where life began early in order to reach a point so soon after the planet's formation 4.5 billion years ago where we could wonder about it.
Thus, Spiegel and Turner explored how the probability of exoplanetary abiogenesis would change if it turns out that evolution requires, as it did on Earth, roughly 3.5 billion years for life to develop from its most basic form to complex organisms capable of pondering existence. If that were the case, then the 4.5 billion-year-old Earth clearly had a head start. A planet of similar age where life did not begin until several billion years after the planet formed would have only basic life forms at this point.
"Dinosaurs and horseshoe crabs, which were around 200 million years ago, presumably did not consider the probability of abiogenesis. So, we would have to find ourselves on a planet with early abiogenesis to reach this point, irrespective of how probable this process actually is," Spiegel said. "This evolutionary timescale limits our ability to make strong inferences about how probable abiogenesis is."
Turner added, "It could easily be that life came about on Earth one way, but came about on other planets in other ways, if it came about at all. The best way to find out, of course, is to look. But I don't think we'll know by debating the process of how life came about on Earth."
Again, said Winn of MIT, Spiegel and Turner offer a unique consideration for scientists exploring the possibility of life outside of Earth.
"I had never thought about the subtlety that we as a species could never have 'found' ourselves on a planet with a late emergence of life if evolution takes a long time to produce sentience, as it probably does," Winn said.
"With that in mind," he said, "it seems reasonable to say that scientists cannot draw any strong conclusion about life on other planets based on the early emergence of life on Earth."
This research was published Jan. 10 in the Proceedings of the National Academy of Sciences and was supported by grants from NASA, the National Science Foundation and the Keck Fellowship, as well as a World Premier International Research Center Initiative grant from the Japanese Ministry of Education, Culture, Sports, Science and Technology to the University of Tokyo.
‘Birdbrains’ Helped Birds Survive Mass Extinction
According to researchers, brainpower may have enabled birds to survive in the midst of mass extinction.
Dinosaurs were wiped out during the Cretaceous-Tertiary mass extinction 65 million years ago, but birds were able to survive and thrive.
Recent analysis of fossil skulls using computer tomography scans shows that modern birds were able to survive the dire conditions due to their well-developed brains.
The birds' ability to solve problems gave them a crucial edge.
"Birdbrained is a dreadful misnomer," said Dr Angela Milner, of the Natural History Museum in London In an interview with the Times Online. "It's really quite an insult to birds when you think how sophisticated a lot of modern birds are.
"They can learn to talk, they can migrate over long distances, they have all sorts of capabilities and it all has to be crammed into a brain light enough that it doesn't stop them flying. They were in some ways more advanced than dinosaurs."
Milner, along with Dr. Stig Walsh, carried out the skull scans to see the shape and volume of the birds' brains. The results led the researchers to believe that birds had advanced mental agility which allowed them to adapt to difficult situations. The researchers believe the key feature may have been the development of the wulst, a structure in the brain linked to visual awareness.
The wulst was small in birds alive after the mass extinction, but today the brain structure is much larger, especially in species that rely heavily on eyesight, such as owls. The structure was absent from creatures such as pterosaurs, ancient flying creatures, and earlier forms of birds that are now extinct.
Researchers believe today's birds are descendants of the avian species that survived the mass extinction believed to have been caused by a meteor that wiped out 85 percent of animal species.
"Our research suggests that the evolution of an expanded and structurally complex brain in the ancestors of living birds may have provided them with a competitive advantage over the more archaic bird lineages and pterosaurs," Dr. Walsh told Times Online.
"There were other flying animals around, such as pterosaurs and older groups of birds, but we've not really known why the ancestors of the birds we see today survived the extinction event and the others did not. It has been a great puzzle for us "“ until now."
Milner and Walsh compared the brain cavity of the earliest known bird, the Archaeopteryx, which dates back 147 million years, with two marine birds living after the mass extinction.
"Scans revealed the creatures' brain power because the shape of the brain is imprinted in the inside of the skull showing the shape and volume," said Dr. Walsh.
The modern birds, Odontopteryx toliapica and Prophaethon shrubsolei, dated back 55 million years and were related to today's pelicans and albatrosses.
The researchers needed to use species that lived 10 million years after the extinction because intact bird skulls are very rare. The two marine species had highly developed brains, too developed, according to the scientists, to have evolved entirely after the extinction.
The researchers concluded that the modern-style birds living during the time of the extinction had to have brains comparable to birds alive today, both in their ability to control flight and sight, and in their ability to learn.
"In the aftermath of the extinction event, life must have been especially challenging. Birds that were not able to adapt to rapidly changing environments and food availability did not survive, whereas the flexible behavior of the large-brained individuals would have allowed them to think their way around the problem," Dr. Walsh said.
The study appears in the Zoological Journal of the Linnean Society.
Start Of Ancient Global Warming Episode Discovered
Researchers have pinpointed the timing of the start of an ancient global warming episode known as the Paleocene-Eocene thermal maximum (PETM). The early part of the Cenozoic era witnessed a series of transient global warming events called hyperthermals. The most severe of these was the PETM at the Paleocene-Eocene boundary, which took place around 56 million years ago. Over a 20,000-year period, ocean temperatures rose globally by about 9 degrees Fahrenheit (5 degrees Celsius).

One possible cause of this temperature rise, the team said, is that these hyperthermals were driven by cyclic variations in the eccentricity of the Earth's orbit around the sun. Increased temperatures at the cycle peaks could have caused methane hydrate deposits in the deep sea to release large amounts of methane. Alternatively, the researchers said, geological processes could have been the culprit for the warming associated with the PETM. Magmatism in this scenario would have caused the baking of marine organic sediments, leading to the massive release of methane and/or carbon dioxide.

"Determining exactly what triggered the PETM requires very accurate dating of the event itself, to determine whether it occurred during a known maximum in the Earth's orbital eccentricity," said Adam Charles, a University of Southampton PhD student supervised by Dr Ian Harding and first author of the newly published report in the journal Geochemistry Geophysics Geosystems.
The researchers, based at the National Oceanography Centre, Southampton, measured radioisotopes of uranium and lead in the mineral zircon to get a better grip on the numerical age of the Paleocene-Eocene boundary. The team collected these rocks from two locations in Spitsbergen, the largest island of the Svalbard Archipelago in the Arctic. The researchers dated the Paleocene-Eocene boundary at between 55.728 and 55.964 million years ago, which they believe to be the most accurate estimate to date. Their analyses indicated that the onset of the PETM did not occur at the peak of a 400,000-year cycle in the Earth's orbital eccentricity. "Compared to other early Eocene hyperthermals, it appears that the PETM was triggered by a different mechanism, and thus may have involved volcanism. However, a thorough test of this hypothesis will require further detailed dating studies," Adam said in a statement.
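The age itself follows from standard radioactive-decay arithmetic. The sketch below is a simplified illustration, not the study's full isotopic analysis (which combines several decay chains and corrections): assuming no initial lead and a closed system, the age is t = ln(1 + D/P)/λ, where D/P is the measured ratio of daughter Pb-206 to parent U-238 and λ is the U-238 decay constant. The example ratio is chosen to land near the reported boundary age.

```python
import math

LAMBDA_U238 = 1.55125e-10   # decay constant of U-238, per year

def u_pb_age(pb206_u238_ratio: float) -> float:
    """Age in years from the measured Pb-206/U-238 atomic ratio,
    assuming no initial lead and a closed system."""
    return math.log(1.0 + pb206_u238_ratio) / LAMBDA_U238

# A ratio of ~0.00871 corresponds to roughly 55.9 million years:
print(f"{u_pb_age(0.00871) / 1e6:.1f} Myr")
```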
The researchers are Adam Charles, Ian Harding, Heiko Pälike and John Marshall (Ocean and Earth Science, University of Southampton), Daniel Condon (British Geological Survey), and Ying Cui and Lee Kump (Pennsylvania State University). The research was supported by the Natural Environment Research Council (NERC) and Shell UK, as well as a Philip Leverhulme Prize to Heiko Pälike.
Reference: Charles, A. J., Condon, D. J., Harding, I. C., Pälike, H., Marshall, J. E. A., Cui, Y. & Kump, L. Constraints on the numerical age of the Paleocene–Eocene boundary. Geochemistry Geophysics Geosystems 12, Q0AA17, doi:10.1029/2010GC003426
Image Caption: The image shows a location in Longyearbyen, Spisbergen, where the researchers carried out field work. Credit: Ian Harding
Supplies run low at space station
MOSCOW -- A Russian cargo ship blasted off earlier today carrying badly needed food and equipment for the international space station, where supplies for the American and Russian crew have been dwindling rapidly.
The Progress M-51 took off from the remote Baikonur cosmodrome in Kazakhstan at 1:19 a.m. (Moscow time) with about 2.5 tons of food, fuel and research equipment for Russian cosmonaut Salizhan Sharipov and U.S. astronaut Leroy Chiao, ITAR-Tass and Interfax news agencies said. It was scheduled to arrive Sunday morning.
Russian and American space officials were alarmed earlier this month to learn that Sharipov and Chiao, in their second month at the station, had gone through much of their food. There was food to last seven to 14 days beyond Dec. 25 if the supply ship did not arrive.
The crew has been ordered to cut back on meals. A Russian Space Agency spokesman has said the two could be forced to return to Earth if the Progress does not reach the station.
An independent team was looking into how the orbiting station's food inventory ended up being tracked so poorly and how it can be improved in the future.
Sharipov and Chiao's launch to the station in October was delayed twice -- once after the accidental detonation of an explosive bolt used to separate the ship's various components, and then when a tank with hydrogen peroxide burst due to a sudden change in pressure.
Russian rockets and the non-reusable Soyuz spacecraft have been the only way Russia and the United States can get to the space station and back since the U.S. shuttle fleet was grounded after the Columbia burned up on re-entry in February 2003, killing all seven astronauts aboard.
NASA has said it plans to resume its shuttle program in May.
NASA: http://spaceflight.nasa.gov | 科技 |
Golden alga causes fish die-off on Salt River
Thousands of fish are dead on a 20-mile stretch of the Salt River, where it flows into the east side of Roosevelt Lake. The Arizona Game and Fish Department began receiving reports of dead fish from the public on Wednesday, July 4, and department officers confirmed the fish die-off. A response team took water samples and collected dead fish on Thursday, and this morning’s lab tests revealed high concentrations of Golden alga believed to be the cause of the die-off.
Golden alga can produce a toxin that impacts the gills of fish and causes them to suffocate. Golden alga was first reported to cause extensive fish-kills in the 1930s and has been found in California, Nevada, New Mexico, Texas, and at least a dozen other states. Biologists have not yet determined if Golden alga occurs naturally in Arizona, but it has been identified in more than 20 lakes statewide since 2003. Despite extensive research, biologists do not yet know exactly what causes Golden alga to produce the toxin that is fatal to fish, crayfish, mussels, and all gill-breathing creatures. However, experts have noted a connection between extended drought, elevated salinity in waterways, and fish-kills caused by the toxins in Golden alga. “We believe that drought conditions and increased salinity may create an environment where Golden alga can thrive,” said Kirk Young, a fisheries biologist with Game and Fish. “Golden alga is found most often in waters with especially high salinity.”
The Salt River takes its name from the salt springs that are found upstream and are responsible for the water’s high salinity in periods of low flows, Young said. The salinity in the Salt River is more than three times the concentration currently found in Roosevelt Lake. Biologists do not believe that the Golden alga problem will extend into Roosevelt Lake in the near future, primarily because the high salinity and algae concentrations found in the inflow water from the Salt River are diluted when mixed with water already in the lake, dropping below levels toxic to fish.
Game and Fish will continue to investigate the situation and monitor waterways along the Salt River, including Apache, Canyon and Saguaro lakes, where Golden alga is believed to still exist, but currently in low concentrations. Small systems such as urban ponds can be treated to eliminate Golden alga, but there currently exists no way to treat large system reservoirs or rivers.
However, if drought conditions persist, downstream reservoirs could be at risk in the future. To date, no adverse health impacts have been noted for humans or non-gill-breathing wildlife that have come in contact with waters experiencing a Golden alga toxin bloom. The die-off has included species such as catfish, carp, bluegill, red shiner, largemouth bass, buffalo fish and crayfish.
Game and Fish advises the public not to eat any dead or dying fish they find anywhere regardless of the cause. However, people can continue to eat the fish they catch, as long as the fish are properly cleaned and thoroughly cooked.
Buck Springs Volunteer Work Project - July 14-15
Get out of the heat and enjoy the Mogollon Rim area near Clint’s Wells, while restoring and improving wildlife habitat.
PHOENIX – July 14-15 is the weekend for the Arizona Elk Society Buck Springs volunteer work project.
Buck Springs, which is in the Coconino National Forest north of Payson, is a unique place with very high value for wildlife. The area is home to a great elk population as well as turkeys, bears, songbirds, native fish and a host of other important wildlife species. One of the keys to making a good place into a great place for wildlife and wildlife enthusiasts is helping to restore the function of the wet meadows that are so important to wildlife. At one time, these meadows acted like giant sponges and stored moisture, slowly leaking the water into the creeks and maintaining the riparian community. Many of these meadows and the associated riparian areas are now being overrun by invading young pine trees.
Unless removed, in time, the meadows will be entirely overrun with pines and the water storage role for the meadows will be lost. In a cooperative venture, volunteers from the Arizona Elk Society, the Forest Service, and Game and Fish will cut these trees down, lop the limbs off, and stack them into piles so they can be burned when dried. There are thousands of trees to remove and the more help we get, the more that can be done. As a result of removing these trees, the meadows will be more open again and the ecological function of the meadows improved. If you enjoy time in the forest, here is a chance to give something back and make the forest a healthier place for wildlife and for your next visit.
The Arizona Elk Society will provide meals on Friday evening, all day Saturday, and Sunday breakfast and lunch. If fishing and wildlife viewing is your thing, this is the area to do it. If you would like to carpool, please let us know. We need 60-80 volunteers in camp for this weekend work project event. Go to www.arizonaelksociety.org for full project details. Please sign up online at www.arizonaelksociety.org or RSVP to Tom Schorr (tomschorr@arizonaelksociety.org) if you can attend the event. They need an accurate count for food for the weekend.
Arizona’s abundant elk herds roam the mountains from Grand Canyon to the San Francisco Peaks and across the Mogollon Rim to the White Mountains. The Arizona Elk Society is committed to restoring the habitat that elk and a host of other wildlife species depend on to survive and hopefully flourish. The AES is also very active in efforts for youth and adult hunter recruitment and retention, youth conservation education programs, and representing conservationists who value sustained use of wildlife and the hunting heritage. To do this, the AES represents its members and supporters throughout the state on issues ranging from forest health, access to public and private lands, responsible wildlife management, and many other conservation issues important to sportsmen. The AES meets these components of our Mission Statement with active support and involvement of local, regional, and national conservation organizations and agencies. For more information and to join the Arizona Elk Society, visit us at www.arizonaelksociety.org.
New Discovery is Key To Understanding Neutrino Transformations
From: University of California Berkeley Posted: Thursday, March 8, 2012 A new discovery provides a crucial key to understanding how neutrinos -- ghostly particles with multiple personalities -- change identity and may help shed light on why matter exists in the universe.
In an announcement today (Thursday, March 8), members of the large international Daya Bay collaboration reported the last of three measurements that describe how the three types, or flavors, of neutrinos blend with one another, providing an explanation for their spooky morphing from one flavor to another, a phenomenon called neutrino oscillation.
The measurement makes possible new experiments that may help explain why the present universe is filled mostly with matter, and not equal parts of matter and antimatter that would have annihilated each other to leave behind nothing but energy. One theory is that a process shortly after the birth of the universe led to the asymmetry, but a necessary condition for this is the violation of charge-parity (or CP) symmetry. If neutrinos and their antimatter equivalent, antineutrinos, oscillate differently, this could provide the explanation.
"The result is very exciting, because it essentially allows us to compare neutrino and antineutrino oscillations in the future and see how different they are and hopefully have an answer to the question, Why do we exist?" said Kam-Biu Luk, a professor of physics at the University of California, Berkeley, and a faculty scientist at Lawrence Berkeley National Laboratory (LBNL). Luk is co-spokesperson of the experiment and heads the U.S. participation in this collaboration.
Researchers knew that if the observed third kind of oscillation were zero or near zero, it would make further study of matter-antimatter asymmetry difficult.
"This is a new type of neutrino oscillation, and it is surprisingly large," said Yifang Wang of China's Institute of High Energy Physics (IHEP), who is the co-spokesperson and Chinese project manager of the Daya Bay experiment. "Our precise measurement will complete the understanding of the neutrino oscillation and pave the way for the future understanding of matter-antimatter asymmetry in the universe."
"Berkeley has played a key role since day one in the Daya Bay experiment, one of the biggest experimental particle physics collaborations ever between the U.S. and China," said Graham Fleming, UC Berkeley vice chancellor for research. "The large value of the mixing angle theta one-three from this experiment enables a very broad program of fundamental new physics including the proposed Long Baseline Neutrino experiment at the Sanford Underground Research Facility in North Dakota."
The researchers have submitted a paper describing their results to the journal Physical Review Letters.
Using Antineutrinos from Chinese Nuclear Reactors
The results come from the Daya Bay Reactor Anti-neutrino Experiment in Guangdong Province, China, near Hong Kong, which is a joint collaboration between scientists in the United States, China, the Czech Republic, Hong Kong, Russia and Taiwan. The U.S. institutions include UC Berkeley and LBNL, as well as Brookhaven National Laboratory, the University of Wisconsin and Caltech.
Nuclear power reactors at Daya Bay emit one kind or flavor of antineutrino -- electron antineutrinos -- that are identified in the six underground detectors. These detectors contain a liquid scintillator loaded with the element gadolinium. When the electron antineutrinos interact in the liquid, a blue glow is emitted.
Because some of the antineutrinos emitted by the reactors change flavor as they travel, the flux of electron antineutrinos measured in the detectors 1.7 kilometers from the reactor is less than the flux coming directly from the reactor and measured in the nearby detectors that are about 500 meters away. The deficit allowed scientists to determine the value of the so-called mixing angle (theta one-three), the last to be measured of three mixing angles needed to interpret neutrinos' flavor-changing behavior.
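The size of that deficit is governed by the standard two-flavor survival probability, P = 1 - sin²(2θ13)·sin²(1.267·Δm²·L/E). The Python sketch below plugs in Daya Bay's published best-fit value of sin²(2θ13) ≈ 0.092 and a representative mass splitting (an assumed value, not taken from this announcement) to show why the far detectors see a shortfall that the near detectors barely do.

```python
import math

SIN2_2THETA13 = 0.092   # Daya Bay best-fit value (2012)
DM2 = 2.32e-3           # representative |Δm²| in eV² (assumed here)

def survival_probability(L_m: float, E_MeV: float) -> float:
    """Electron-antineutrino survival probability in the two-flavor
    approximation: P = 1 - sin²(2θ13)·sin²(1.267·Δm²·L/E),
    with L in meters, E in MeV and Δm² in eV²."""
    return 1.0 - SIN2_2THETA13 * math.sin(1.267 * DM2 * L_m / E_MeV) ** 2

for L in (500, 1700):   # near and far detector baselines, in meters
    print(f"L = {L} m: P = {survival_probability(L, 4.0):.3f}")  # ~4 MeV antineutrinos
```

For typical 4 MeV reactor antineutrinos this gives a survival probability near 0.99 at 500 meters but only about 0.92 at 1.7 kilometers, a several-percent deficit of roughly the size the collaboration reported.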
"Although we're still two detectors shy of the complete experimental design, we've had extraordinary success in detecting the number of electron antineutrinos that disappear as they travel from the reactors to the detectors nearly two kilometers away," Luk said.
Neutrinos interact so weakly with other types of matter that they can pass through Earth as if it were not there. Once thought to be fairly boring, with zero mass and always traveling at the speed of light, neutrinos have proved to be a major challenge to the Standard Model of particle physics. Experiments over the past two decades showed that neutrinos do have mass and change their identity as they oscillate between three flavors: electron, muon and tau.
Neutrinos' shifting personalities require them to have at least some mass -- probably less than one-millionth that of an electron -- because that is what causes their strange identity problem. Each flavor of neutrino is a mixture of three different masses that fluctuate with time. Just as a white light composed of red, green and blue shifts its tint as the proportions of each color change, so the type of a neutrino changes as the proportions of the masses oscillate.
"Once a neutrino starts to propagate in space, it's very hard to tell what its identity is until we remeasure it," as in the Daya Bay experiment, Luk said.
Three numbers, called mixing angles, are part of the equations that describe these oscillations. The largest two were measured earlier in similar experiments -- including the KamLand collaboration, in which UC Berkeley and LBNL were active participants -- but with detectors set hundreds of kilometers from the neutrino source. The oscillation period associated with the third mixing angle was expected to be so small that a much shorter baseline experiment was needed, hence the Daya Bay collaboration. The six power reactors at Daya Bay and nearby Ling Ao yield millions of quadrillions of electron antineutrinos every second, of which the six detectors recorded tens of thousands between Dec. 24, 2011, and Feb. 17, 2012.
Daya Bay will complete the installation of the remaining two detectors this summer to obtain more data about neutrino oscillations. As a result, Daya Bay will continue to have an interaction rate higher than three competing experiments in France, South Korea and Japan, making it "the leading theta one-three experiment in the world," said William Edwards, a specialist in the physics department at UC Berkeley and LBNL and the U.S. project and operations manager for the Daya Bay experiment.
Robert Sanders
rsanders@berkeley.edu
Science Contacts:
Kam-Biu Luk
k_luk@lbl.gov
+1 (510) 486-7054 on Mar. 8
Yifang Wang
yfwang@ihep.ac.cn
Note: Kam-Biu Luk will give a lecture on the Daya Bay results today at
LBNL at 12:15 p.m. PST (3:15 p.m. EST), streamed live at
http://hosting.epresence.tv/LBL/1.aspx
Daya Bay Reactor Neutrino Experiment:
http://neutrino.physics.berkeley.edu/
LBNL press release:
http://newscenter.lbl.gov/news-releases/2012/03/07/daya-bay-first-results/
News in brief: IT projects 'running 86 years late'
Government computer projects are running a total of 86 years behind schedule and £1.4 billion over budget, it has been reported. The Daily Telegraph reported that the longest delay was at the Department for Work and Pensions, where a new IT system to enforce child maintenance payments was running seven years late.
The Ministry of Defence, where a new satellite communication system is £885 million over budget, accounted for £1.14 billion of the cost overruns.
The figures were obtained by the Liberal Democrats through parliamentary written questions submitted to all Whitehall departments.
Lib Dem Treasury spokesman Jeremy Browne said the figures raised serious questions about the Government's determination to press on with the controversial NHS computerisation programme.
"This Government's record on IT is lamentable," he said.
"Time and time again it wastes millions of pounds of taxpayers' money on IT projects which are delayed, ineffectual and over budget. "It is disturbing that even with a track record as bad as the Government's, Gordon Brown is determined to plough on with the hugely expensive NHS IT scheme.
"How can people be expected to have any faith in a Government which continues to be so inept and inefficient?" More about:
Liberal Democrat Party
Telegraph Group | 科技 |
Algorithm predicts pop chart success
Kate Taylor, 19th December 2011
A team of British scientists reckon they can predict the success of a pop song with about 60 percent accuracy using a machine learning algorithm.
The University of Bristol team looked at the official top 40 singles chart over the past 50 years, and evaluated musical features such as tempo, time signature, song duration and loudness. They also computed more detailed summaries of the songs such as harmonic simplicity, how simple the chord sequence is, and non-harmonicity - how 'noisy' the song is.
They then came up with a 'hit potential' equation that scores a song according to its audio features. They found they could classify a song as either a hit or a non-hit based on this score, predicting with an accuracy rate of 60 per cent whether a song will make it to the top five or remain below position 30.
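The paper doesn't publish code, but the general idea, scoring songs with a weighted combination of audio features learned from past charts, can be sketched with an off-the-shelf classifier. Everything below (the feature set, the random stand-in data, the choice of logistic regression) is hypothetical and stands in for the Bristol team's actual learned model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# Hypothetical training data: one row per song, columns for audio features
# such as tempo, loudness, danceability, harmonic simplicity and duration.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))          # placeholder feature matrix
y = rng.integers(0, 2, size=500)       # 1 = top-5 hit, 0 = below position 30

scaler = StandardScaler().fit(X)
model = LogisticRegression().fit(scaler.transform(X), y)

# The "hit potential" of a new song is its predicted probability of being a hit.
new_song = scaler.transform(rng.normal(size=(1, 5)))
print(f"hit potential: {model.predict_proba(new_song)[0, 1]:.2f}")
```

The learned coefficients play the role of the equation's feature weights; as the researchers note below, those weights have to be re-learned as tastes change.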
"Musical tastes evolve, which means our hit potential equation needs to evolve as well," says Dr Tijl De Bie, senior lecturer in artificial intelligence.
"Indeed, we have found the hit potential of a song depends on the era. This may be due to the varying dominant music style, culture and environment."
Before the nineteen-eighties, for example, the danceability of a song wasn't that relevant to its hit potential. Since then, though, danceable songs have been more likely to become a hit.
The team found its algorithm is more accurate for some eras than others. It's been particularly difficult to predict hits around 1980, they say, with the equation performing best for the first half of the nineties and since the year 2000. This suggests that the late seventies and early eighties were particularly creative and innovative periods of pop music, they say. These days, though - and rather depressingly - it appears that people are looking for loudness above all else, with all songs on the chart becoming louder - especially those that make it to the top.
You've Got To See It to Believe It: Turbines Boost Profile of Renewables
Treehugger Interns
Not everyone loves wind turbines. This is an easy fact to forget, if you surround yourself with environmentally conscious people, but some folks even consider them ugly. One of the most common criticisms of wind energy, at least in the UK, is the argument that turbines industrialize our already dwindling areas of natural beauty. This treehugger personally considers these towering giants to be graceful and elegant symbols of a green future, but understands that not everyone feels the same way.

It seems to make sense, then, to place turbines in already industrialized landscapes, wherever possible. Not only does this avoid inciting the hordes of NIMBYs, it also has the added advantage of increased visibility for, and acceptance of, renewables by the general public - especially as they are not placed on their favourite hillside landscape.

The turbine pictured here sits on junction 11 of the M4 motorway in Reading, UK, and is seen spinning by as many as 60 million people annually as they drive by. Apparently the turbine features state-of-the-art blade technology for maximum efficiency and represents 'the next step into the future for multi-megawatt class turbines in the UK.' It is also claimed that it creates enough electricity to power the equivalent of 1000 homes.

Surely high-profile installations are the best way of showing that many of the solutions we have been waiting for are already here. And we've yet to meet anyone who has claimed that their favorite motorway junction has been spoiled by one of those 'ugly turbines.'

The good folks at Ecotricity, the company behind this installation, are also working on the high-profile turbine project at Manchester City Football Club's stadium, which we reported on here, and have just started work on a three-turbine project in Avonmouth Docks, just outside Bristol. Coincidentally, the Avonmouth site sits right next to the M5 motorway, which connects with the M4, so a drive from Somerset to London will soon take you past at least four 2MW wind turbines in the space of a few hours. And just in case anyone is still worried about aesthetics, it is probably fair to say that Avonmouth (pictured left) is not the prettiest part of the UK. When local news reported on proposals for this project, they asked local residents whether it would spoil the view. "Are you kidding me?" came the reply. No sign of the NIMBYs here.

[Written by: Sami Grover]
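For what it's worth, the '1,000 homes' claim earlier in this piece is easy to sanity-check. The back-of-envelope Python sketch below uses assumed round numbers (a 2 MW turbine, a 25 percent onshore capacity factor, and roughly 4,000 kWh per year for an average UK household), not figures from Ecotricity.

```python
RATED_POWER_KW = 2000          # assumed 2 MW turbine
CAPACITY_FACTOR = 0.25         # typical UK onshore value (assumed)
HOME_KWH_PER_YEAR = 4000       # rough UK household average (assumed)

annual_output_kwh = RATED_POWER_KW * CAPACITY_FACTOR * 24 * 365
homes_powered = annual_output_kwh / HOME_KWH_PER_YEAR
print(f"{annual_output_kwh / 1e6:.1f} GWh/year -> ~{homes_powered:.0f} homes")
```

That works out to about 4.4 GWh a year, or roughly 1,100 homes, so under these assumptions the claim looks entirely plausible.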
2017-09/1579/en_head.json.gz/22180 | > Science News
Authors of arsenic-based life study respond to Web critiques
[Video: NASA astrobiologist Felisa Wolfe-Simon talks about her recent findings.]
By Marc Kaufman
Saturday, December 18, 2010; 11:00 AM
Two weeks after the release of a major study about the possibility of arsenic-based life in California's Mono Lake, a torrent of criticism in the blogosphere has turned a widely reported scientific triumph into a scientific football - with much-discussed implications for how research will be evaluated and presented in the future.
After remaining largely silent to the critiques - which came from respected scientists as well as ill-informed posters - the researchers, their NASA funders and the prestigious journal that published the article responded Thursday with promises to better explain the work and answer formal criticism.
But in the fast-changing world of the Internet, it was also clear that those involved are not really sure how to respond without compromising their scientific methods and values.
Speaking at a panel discussion at a San Francisco science conference, convened specifically to discuss the arsenic research and the online response, study co-author Ronald Oremland of the U.S. Geological Survey defended his silence as an integral part of the tried-and-true scientific research process.
"I was trained to go to the lab and conduct my experiments, to send them to journals if they merited that, and to hope that they made it past peer review," he said. He can respond to critics, he said, when they present scientific arguments and data.
He said that when people launch online attacks on the work done by him and biochemist Felisa Wolfe-Simon, he doesn't really know who is behind them. "I don't want to get involved in what can end up in a Jerry Springer situation, with people throwing chairs," he said.
Yet not only was Oremland on the panel Thursday because of the blogging, but the research team also put out a series of answers to questions frequently asked about their work, and promised to respond by next month to more than 20 letters and e-mails sent to the magazine Science questioning their work. The team announced as well that it would make samples of the microbes available to other scientists for their research.
Science spokeswoman Ginger Pinholster said that the journal hoped to publish the letters and responses in March. She said that while other Science papers have brought out challenges and criticism, the speed and intensity of the blogosphere response to the arsenic research was unusual, if not unique.
Active online discussion of the paper began even before it was released. Based on a NASA announcement about release of an upcoming study that had implications for astrobiology and "extraterrestrial life," some bloggers were predicting news of life on the moon Titan or elsewhere in the solar system.
Instead, the discovery involved microbes from Mono Lake, Calif., which were grown in a way that replaced most of the phosphorus in the organism (long held to be essential for life) with the generally toxic element arsenic. Using some of the most sophisticated instruments available, the team then determined the arsenic had replaced phosphorus in the DNA and other key molecules of the bacteria - creating a form of life long thought to be impossible.
The NASA news conference that presented the study included a skeptic, respected chemist Steven Benner, but that didn't stop bloggers from accusing NASA of both hyping the story and unquestioningly presenting flawed research. The first major blog attacking the work was posted by University of British Columbia zoology professor Rosie Redfield, with other early critiques from online commentators including the American Enterprise Institute's Kenneth Green and Heather Olins on We Beasties.
"Basically, it doesn't present ANY convincing evidence that arsenic has been incorporated into DNA (or any other biological molecule)," she wrote. She accused the Wolfe-Simon team of sloppy lab work and not testing whether their results were correct. | 科技 |
From Top Secret Weapon to Everyday Household Appliance
March 22, 2012, posted by Marc Merlin
by Larry Phillips (Atlanta Science Tavern contributor)
In September of 1940 the British scientist Henry Tizard arrived in Washington DC on an official visit. Tizard had with him a carefully packed trunk containing a device that an American official later described as “the most valuable cargo ever brought to our shores”. The device was a top-secret invention called the “cavity magnetron”, and it was the key to making radar workable. Tizard’s mission was to persuade the US military to fund further development of magnetron-based radar.
Randall and Boot cavity magnetron (credit: Wikimedia Commons user geni)
To understand the cavity magnetron’s role in radar technology, let’s review what an effective radar system needs. First, radar determines the range of a target by timing how long it takes for a transmitted radio pulse to bounce off the target and return to the source. If this faint echo is to be detectable, the outgoing pulse must be quite powerful– ideally, thousands of watts of power.
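As a concrete illustration (ours, not the article's): because the pulse travels out and back, the range is simply the speed of light times the round-trip time, divided by two. A minimal Python sketch:

    # Radar ranging: range = speed of light * round-trip time / 2.
    C = 299_792_458  # metres per second

    def range_from_echo(round_trip_seconds):
        """Target range in metres from a pulse's round-trip time."""
        return C * round_trip_seconds / 2

    # An echo arriving 1 millisecond after transmission means the
    # target is about 150 km away.
    print(range_from_echo(1e-3) / 1000)  # -> ~150 (km)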
Second, the pulse needs to be sent out in a narrow beam. Otherwise, we will not know the direction of the target. A typical radar system scans across the horizon, continuously sending out its narrow beam of pulses. When an echo comes back, we know that the target is in that direction. Because of the way radio waves work, a narrow beam can only be produced if the radio waves are of very high frequency (or equivalently, of short wavelength). High frequency radio also has the advantage of bouncing more readily off small objects, such as airplanes.
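The link between frequency and beam width follows from diffraction: an antenna of diameter D radiating at wavelength λ produces a beam roughly λ/D radians wide, so shorter wavelengths mean narrower beams from the same antenna. A sketch with illustrative numbers (the 1 m antenna is our assumption, not a figure from the period):

    import math

    C = 299_792_458  # metres per second

    def beamwidth_degrees(freq_hz, dish_diameter_m):
        """Approximate beam width in degrees: wavelength / diameter."""
        wavelength = C / freq_hz
        return math.degrees(wavelength / dish_diameter_m)

    # Same 1 m antenna, two frequencies:
    print(beamwidth_degrees(300e6, 1.0))  # ~57 degrees: useless for direction-finding
    print(beamwidth_degrees(3e9, 1.0))    # ~5.7 degrees: a usable narrow beam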
These two requirements, high power and high frequency, presented a technical problem that vacuum tubes and other devices could not solve. The magnetron met both these requirements. Indeed, the American scientists who tested Tizard’s device were astonished to find that it produced 1,000 times the power level of anything they had.
The Americans wisely accepted the Tizard offer, and they began a large-scale effort to exploit the magnetron breakthrough. In time, this effort rivaled the Manhattan Project in size, and it resulted in American and British radar that was much better than what their German and Japanese counterparts had. Airborne radar was so sensitive that it could detect submarine periscopes, and shipping losses to German submarines declined dramatically.
Even so, radar had produced its biggest impact on the war before Tizard arrived in 1940. The British had fended off the German bombing campaign in The Battle of Britain largely because their radar had enabled them to have their fighter planes in the air when the German bombers arrived over the English Channel. Fortunately, the Nazis never understood how good the British radar was; they were simply baffled by how Britain’s small air force was able to down so many of their bombers.
In the 72 years since Tizard’s visit, the cavity magnetron has gone from super-secret war weapon to commonplace: every microwave oven uses one to produce its microwaves.
Larry Phillips is a former electrical engineer and software developer at Lucent Technologies and Bell Laboratories. He now tutors students in mathematics and physics, writes a blog on mathematics at BrightStarTutors, and is author of a forthcoming book titled The Elementary Mathematics of Gravity.
Librarypat says March 26, 2012 at 12:28 am Thank you for a most enlightening post. I am certain those gentlemen would be rather surprised to see what their device has given rise to.
by Bryan M. Wolfe
Update #2; Oct. 24, 12:24 p.m. PDT
We heard back from Disney about that issue. According to a spokesperson, "The Lion King" is just one of the movie titles currently unavailable for streaming. By contrast, "Remember the Titans" will become available tomorrow, Oct. 25.
The spokesperson also noted that they don't pull content that has already been purchased in the iTunes Store -- even if that content is no longer available for new purchases.
Disney was not aware of this issue until seeing the original 9to5Mac report, and our followup. Since that time, they are working with Apple to resolve whatever issues do exist, to ensure that purchased content is once again available for streaming and download purposes.
If we hear anything else on this issue, we will be updating this post again.
Some of our readers have wondered how previously purchased content could be pulled, in the event that you wish to download it again.
In Apple's iTunes Store - Terms and Conditions is the following key paragraph:
As an accommodation to you, subsequent to acquiring iTunes Auto-Delivery Content, purchased (i.e. not rented) movies iTunes Products and TV show iTunes Products (each, “iTunes Eligible Content”), you may download certain of such previously-purchased iTunes Eligible Content onto any Associated Device. Some iTunes Eligible Content that you previously purchased may not be available for subsequent download at any given time, and Apple shall have no liability to you in such event. As you may not be able to subsequently download certain previously-purchased iTunes Eligible Content, once you download an item of iTunes Eligible Content, it is your responsibility not to lose, destroy, or damage it, and you may want to back it up.
This definitely absolves Apple of any responsibility. However, I don't recall ever seeing something like this happen in practice.
Post, as original published:
Some Disney and Pixar movies have been pulled from the iTunes Store, including "The Lion King," "Beauty and the Beast," and "Remember The Titans." Some folks who had previously purchased these titles are also reporting they are no longer available for download in iTunes under "Purchased Items."
Initially reported as a glitch by 9to5Mac, this issue could be anything but. As reader Kevin Shain rightly notes, Amazon has also pulled "The Lion King" from the company's Instant Video service.
A search for the digital version of "The Lion King" via amazon.com brings up this message: "Due to our licensing agreements this video is currently not available for purchase or rental." The same message greets those searching for "Beauty and the Beast" in digital format.
We’re still trying to get to the bottom of this, and will let you know what we find. While there is nothing extraordinary about movies being pulled from sale, especially those from Disney, it is interesting that previously purchased copies are also no longer available.
If you can no longer play or download previously purchased content in iTunes, let us know.
See also: Mickey Video 2.0 Features New Custom Player, iCloud Sync And Other Enhancements, Disney's Story App Gains New Features And Enhancements Through First Major Update, and New Disney Animated App For iPad Offers The Complete Story On All 53 Classic Films.
Cayo Buay
BTL Warning – Phone Bill Cramming
!!! WARNING !!!
Read a note this morning from a friend regarding BTL billing. Check your bill carefully for any extra charges that seem out of the ordinary. Apparently they are adding 17 cents to your bill for "paying with your credit card from your machine at home"…
To one person this may not seem like a lot, but start adding that up over time, multiply by the number of customers BTL has and, well, you get the picture.
$0.17 x 50,000 = $8,500 a month x 12 = $102,000 a year. And I bet they have more than 50,000 customers.
This is called “Cramming” and it was a practice used by phone companies in the USA a while back. Not sure if it’s still done but I haven’t heard about it for a while and I’ve not noticed it on my bills. Continue reading “BTL Warning – Phone Bill Cramming” →
The reason BTL was Nationalized?
I told you all back then and I'll keep saying it… The issue was never about nationalization as in making it for the people. Initially I thought it was just to slip a backdoor into the Constitution to give the politicians in Government all power, but it seems it goes way beyond that. Doing research for my article about the CARICOM Union that we are heading into, I stumbled on this and you have to know about it. To fully understand this you will have to read the CARICOM article.
Caribbean Telecommunications Union (CTU)
The Caribbean Telecommunications Union was established by the Heads of Government of the Caribbean Community in 1989 in Nassau, The Bahamas.
The Organization established its Headquarters in Barbados, by agreement with that Government, in 1990, but relocated to the Republic of Trinidad and Tobago, where it continues to function in accordance with the terms of a Headquarters Agreement, dated April 8, 1993.
The Union enjoys, in its member constituencies, full legal personality and capacity to contract, acquire and dispose of real and personal property and to be party to legal proceedings. It also enjoys immunities and privileges accorded to diplomatic and international organizations of equal status.
The objectives of the Union shall be:-
To facilitate the coordination of the planning, programming and development of intra regional and international communications networks to meet the immediate and future telecommunications needs of the Region.
To assist the development of the national components of regional and international telecommunications networks.
To promote the general awareness of the telecommunications needs of the Caribbean Region and its potential for promoting the socio-economic development of the Region.
To encourage the exchange of information, views and ideas between the telecommunications administrations of Members.
To foster coordination within the Caribbean Region of technical standards and routing plans for intraregional and international traffic.
To seek the adoption of efficient operating methods in national, regional and international telecommunications services.
To harmonize as far as possible the positions of Members in preparation for international and regional telecommunications conferences and other meetings.
To encourage and assist members in the establishment and development of telecommunications industries.
To encourage the transfer of technology in the field of telecommunications among Members.
To establish linkages with the information bases of other telecommunications organisations and, in particular, the Centre for Telecommunications Development at the International Telecommunications Union (ITU) in Geneva.
The Commonwealth of Dominica
The Cooperative Republic of Guyana
13-14 August 2012: DBSF 2012, a forum in collaboration with the Commonwealth Telecommunications Organisation, the Commonwealth Broadcasters Association and the Caribbean Broadcasters Union to involve regional broadcasters in transition planning and foster harmonisation of technical and business approaches to switchover and allocation of relevant spectrum. This forum developed a checklist of relevant issues to be addressed, noted that broadcasters were struggling to make their business cases to justify switchover and identified a need for research papers and business model evaluations to guide decision-making among regional broadcasters.
This fifth meeting on the issue is intended to consolidate the outcomes and outstanding matters of the previous four meetings, discuss and identify common and divergent positions among regional stakeholders and begin drafting the policy framework around which Caribbean approaches might be harmonised.
NOTICE that the common word that keeps coming up in both items is HARMONIZE. In essence, UNIFICATION.
Doesn’t this make you think?
Why take over BTL?
Why force everyone to register their SIM-Cards?
Why all the equipment to spy? (Oh you know it’s true, you’ve seen the truck(s) with the antennae.)
How come accounts are continually being compromised? (You’ve read the news and heard the boasting in the House of Representatives about getting into the email accounts of other politicians.)
American Registry for Internet Numbers (ARIN)
ARIN manages the distribution of Internet number resources (IPv4 and IPv6 address space and Autonomous System Numbers) in Canada, many Caribbean and North Atlantic islands, and the United States. ARIN is one of five Regional Internet Registries (RIRs) in the world. Like the other RIRs, ARIN provides services related to the technical coordination and management of Internet number resources, facilitates policy development, participates in the international Internet community, and is a nonprofit, community-based organization governed by a member-elected executive board.
Canadian International Development Agency
The Canadian International Development Agency (CIDA) is Canada’s lead agency for development assistance. It has a mandate to support sustainable development in developing countries in order to reduce poverty and to contribute to a more secure, equitable, and prosperous world.
Congress WBN – Ethical Initiatives for Global Development
Congress WBN (C-WBN) is a synergy of global initiatives focused on effecting human, social and national transformation through the propagation of values-based development principles, patterns and approaches. It is comprised of Sectors of strategic operations involving networks of professional groups, educational institutions, businesses, churches, individual national leaders and university students. C-WBN operates through every continent and in over 75 nations.
Capacity Caribbean 2013
The most important C-level event for the wholesale Caribbean carrier community
The 7th annual Capacity Caribbean will be moving to the new, dynamic market of Curaçao for the first time in 2013. After extensive post-event analysis, 74% of our customers voted for Capacity Caribbean 2013 to come to Curaçao due to the recent surge in market activity, liberalisation and new revenue opportunities for international operators coming from the region. As ever this seminal conference is committed to exploring and developing those all-important contacts and regional market insights key to growing your wholesale profits in the Caribbean.
The networking opportunities at Capacity Caribbean are unparalleled for the regional wholesale community. Last year the event brought together 220 key decision-makers, providing an optimal opportunity to network with the leading executives in the Caribbean wholesale industry, conduct business meetings with both new and existing clients and gain an insight into the latest industry trends. These opportunities, combined with our prestigious line-up of industry-leading speakers and a ground-breaking agenda, mean that Capacity Caribbean 2013 is an event not to be missed.
International Telecommunication Union
Every time someone, somewhere, picks up a telephone and dials a number, answers a call on a mobile phone, sends a fax or receives an e-mail, takes a plane or a ship, listens to the radio, watches a favourite television programme or helps a small child master the latest radio-controlled toy, they benefit from the work of the International Telecommunication Union.
Inter-American Telecommunication Commission (CITEL)
CITEL, an entity of the Organization of American States, is the main forum in the hemisphere in which the governments and the private sector meet to coordinate regional efforts to develop the Global Information Society according to the mandates of the General Assembly of the Organization and the mandates entrusted to it by Heads of State and Government at the Summits of the Americas.
Internet Address Registry for Latin America and the Caribbean (LACNIC)
LACNIC, the Internet Address Registry for Latin America and the Caribbean, is the organization responsible for allocating and administrating IP Addresses and other related resources (Autonomous System Numbers and Reverse Resolution) for the region of Latin America and the Caribbean. It is one of the five Regional Internet Registries that exist worldwide.
The Commonwealth Telecommunications Organisation
The Commonwealth Telecommunications Organisation (CTO) is an international organisation based in London and established through a Headquarters Agreement with the Government of the United Kingdom of Great Britain and Northern Ireland. It is the oldest and largest Commonwealth organisation engaged in multilateral collaboration in the field of Information and Communication Technologies (ICTs), and uses its experience and expertise to support its members in integrating ICTs to deliver effective development interventions that enrich, empower, and emancipate people within the Commonwealth and beyond.
THE ORIGINAL TREATY
The Treaty of Chaguaramas which established the Caribbean Community including the Caribbean Common Market was signed by Barbados, Guyana, Jamaica and Trinidad and Tobago on 4th July, 1973, in Chaguaramas, Trinidad and Tobago. It came into effect on 1 August 1973.
The Caribbean Community and the Caribbean Common Market replaced the Caribbean Free Trade Association which ceased to exist on 1st May 1974.
The Treaty of Chaguaramas was a juridical hybrid consisting of the Caribbean Community as a separate legal entity from the Common Market, which had its own discrete legal personality.
Indeed, the legal separation of these two institutions was emphasised by the elaboration of two discrete legal instruments: the Treaty establishing the Caribbean Community and the Agreement establishing the Common Market (which was later annexed to the Treaty and designated the Common Market Annex). This institutional arrangement facilitated States joining the Community without being parties to the Common Market regime.
In addition to economic issues, the Community instrument addressed issues of foreign policy coordination and functional cooperation. Issues of economic integration, particularly those related to trade arrangements, were addressed in the Common Market Annex.
Because of this juridically separate identity of the regional common market, it was possible for the Bahamas to become a member of the Community in 1983 without joining the Common Market.
The International Telecommunication Union (ITU), originally founded as the International Telegraph Union, is a specialized agency of the United Nations that is responsible for issues that concern information and communication technologies. The ITU coordinates the shared global use of the radio spectrum, promotes international cooperation in assigning satellite orbits, works to improve telecommunication infrastructure in the developing world, and assists in the development and coordination of worldwide technical standards.
The ITU is active in areas including broadband Internet, latest-generation wireless technologies, aeronautical and maritime navigation, radio astronomy, satellite-based meteorology, convergence in fixed-mobile phone, Internet access, data, voice, TV broadcasting, and next-generation networks.
ITU, based in Geneva, Switzerland, is a member of the United Nations Development Group. Its membership includes 193 Member States and around 700 Sector Members and Associates.
Information Regarding Belize in the ITU
Ministry of Energy, Science & Technology and Public Utilities
1st Floor East Block Building
Tel +501 8223336
Email [hidden as of 31.5.2013]
URL www.belize.gov.bz
H.E. Ms Joy Grant, Minister
Dr Colin Young, Chief Executive Officer
ADMIN / REGULATOR
Public Utilities Commission (PUC)
41 Gabourel Lane
URL www.puc.bz
Mr John Avery, Chairman
(Email [hidden as of 31.5.2013])
Office of the Prime Minister and Ministry of Finance
Sir Edney Cain Building
H.E. Mr Dean Barrow, Prime Minister and Minister of Finance
Mr James Murphy, Secretary to the Cabinet
PERM MISSION
Permanent Mission of Belize to the United Nations in New York
675 Third Avenue, Suite 1911
Tel +1 212 9861240
H.E. Mrs Janine Elizabeth Coye-Felson, Ambassador, Deputy Permanent Representative
Addressability
This term is part of networking. Basically, it is a way that any device is assigned a unique ID#, much like the MAC address on your computer or phone, that allows for tracking and identifying the owner.
Addressability is the capacity for an entity to be targeted and found. To be addressable, an entity must be uniquely identifiable, which means that it must be associated with something — typically an alphanumeric string, although there are other possibilities — that is not associated with anything else that exists within that system.
A URI (Uniform Resource Identifier) is a unique identifier that makes content addressable on the Internet by uniquely targeting items, such as text, video, images and applications. A URL (Uniform Resource Locator) is a particular type of URI that targets Web pages so that when a browser requests them, they can be found and served to users.
Addressability is an increasing trend: more and more things can be assigned unique identifiers, and if something has a unique identifier, it can be tagged, assigned a URI and targeted over a network. That capacity paves the way for the Internet of Things (IoT), a scenario in which everything (people, animals, servers, applications, shampoo bottles, cars, steering wheels, coffee machines, park benches, or just about any other random item that comes to mind) has a unique identifier and the ability to communicate over the Internet or a similar wide-area network (WAN).
unique identifier (UID)
Think of this in the sense of your username on Facebook or your email address.
A unique identifier (UID) is a numeric or alphanumeric string that is associated with a single entity within a given system. UIDs make it possible to address that entity, so that it can be accessed and interacted with.
Here are a few examples of UIDs:
A Uniform Resource Identifier (URI) is a unique identifier that makes content addressable on the Internet by uniquely targeting items, such as text, video, images and applications.
A Uniform Resource Locator (URL) is a particular type of URI that targets Web pages so that when a browser requests them, they can be found and served to users.
A Universal Unique Identifier (UUID) is a 128-bit number used to uniquely identify some object or entity on the Internet (see the short Python sketch after this list).
A global unique identifier (GUID) is a number that Microsoft programming generates to create a unique identity for an entity such as a Word document.
A bank identifier code (BIC) is a unique identifier for a specific financial institution.
A unique device identifier (UDID) is a 40-character string assigned to certain Apple devices including the iPhone, iPad, and iPod Touch.
A service set identifier (SSID) is a sequence of characters that uniquely names a wireless local area network (WLAN).
A national provider identifier (NPI) is a unique ten-digit identification number required by HIPAA for all health care providers in the United States.
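To make a couple of the identifiers above concrete, here is a minimal Python sketch using the standard library; the URL and its path are purely illustrative, not a real page:

    import uuid
    from urllib.parse import urlparse

    # A UUID: a 128-bit number, vanishingly unlikely to collide.
    print(uuid.uuid4())  # e.g. 1b4e28ba-2fa1-11d2-883f-0016d3cca427

    # A URL is a URI that tells a browser where a resource lives.
    parts = urlparse("http://www.belize.gov.bz/some/illustrative/page")
    print(parts.scheme, parts.netloc, parts.path)
    # -> http www.belize.gov.bz /some/illustrative/page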
The Internet of Things (IoT) is a scenario in which objects, animals or people are provided with unique identifiers and the ability to automatically transfer data over a network without requiring human-to-human or human-to-computer interaction. IoT has evolved from the convergence of wireless technologies, micro-electromechanical systems (MEMS) and the Internet.
A thing, in the Internet of Things, can be a person with a heart monitor implant, a farm animal with a biochip transponder, an automobile that has built-in sensors to alert the driver when tire pressure is low — or any other natural or man-made object that can be assigned an IP address and provided with the ability to transfer data over a network. So far, the Internet of Things has been most closely associated with machine-to-machine (M2M) communication in manufacturing and power, oil and gas utilities. Products built with M2M communication capabilities are often referred to as being smart.
IPv6’s huge increase in address space is an important factor in the development of the Internet of Things. According to Steve Leibson, who identifies himself as “occasional docent at the Computer History Museum,” the address space expansion means that we could “assign an IPV6 address to every atom on the surface of the earth, and still have enough addresses left to do another 100+ earths.” In other words, humans could easily assign an IP address to every “thing” on the planet. An increase in the number of smart nodes, as well as the amount of upstream data the nodes generate, is expected to raise new concerns about data privacy, data sovereignty and security.
Although the concept wasn’t named until 1999, the Internet of Things has been in development for decades. The first Internet appliance, for example, was a Coke machine at Carnegie Melon University in the early 1980s. The programmers could connect to the machine over the Internet, check the status of the machine and determine whether or not there would be a cold drink awaiting them, should they decide to make the trip down to the machine.
Kevin Ashton, cofounder and executive director of the Auto-ID Center at MIT, first mentioned the Internet of Things in a presentation he made to Procter & Gamble. Here’s how Ashton explains the potential of the Internet of Things:
“Today computers — and, therefore, the Internet — are almost wholly dependent on human beings for information. Nearly all of the roughly 50 petabytes (a petabyte is 1,024 terabytes) of data available on the Internet were first captured and created by human beings by typing, pressing a record button, taking a digital picture or scanning a bar code.
The problem is, people have limited time, attention and accuracy — all of which means they are not very good at capturing data about things in the real world. If we had computers that knew everything there was to know about things — using data they gathered without any help from us — we would be able to track and count everything and greatly reduce waste, loss and cost. We would know when things needed replacing, repairing or recalling and whether they were fresh or past their best.”
The U.N. Threat to Internet Freedom
Today, however, Russia, China and their allies within the 193 member states of the ITU want to renegotiate the 1988 treaty to expand its reach into previously unregulated areas. Reading even a partial list of proposals that could be codified into international law next December at a conference in Dubai is chilling:
Subject cyber security and data privacy to international control;
Allow foreign phone companies to charge fees for “international” Internet traffic, perhaps even on a “per-click” basis for certain Web destinations, with the goal of generating revenue for state-owned phone companies and government treasuries;
Establish for the first time ITU dominion over important functions of multi-stakeholder Internet governance entities such as the Internet Corporation for Assigned Names and Numbers, the nonprofit entity that coordinates the .com and .org Web addresses of the world;
Police: Phones Will Have To Be Registered
The phone companies say there are as just under three hundred thousand prepaid cellular phones in Belize, and of that, only a very small number are registered. In fact, if you’re the owner of a pre-paid phone you might not even know what registration means – and you also might not know that it is – and has been – the law to register your phone. Now, the government says it is ready to start enforcing section 44 of the Telecommunications Act 2002; a piece of legislation passed almost ten years ago which requires the registration of cellular phones. The registration is mandatory, and cellular users will be given six months, starting October 11th in which to register their cell phones or face disconnection.
Phone clienteles line up for mandatory registration of SIM cards
Long lines could be seen at BTL and Smart offices across the country yesterday. Word was out that the deadline for registration of pre-paid cell numbers had arrived. Julian Cruz visited the respective offices here in Belmopan and filed this report.
Google attacks UN’s internet treaty conference
New World Information and Communication Order
New World Order: is the UN about to take control of the internet?
Right in your backyard
This week started on a sad note. The start of a hostile takeover of our country by the very people we put all our trust in. While I believe that for a country like Belize, utilities should be owned by the people; I understand that the need for growth is important and it is better left in the hands of private investors; carefully selected investors.
Continue reading “Right in your backyard” →
CayoBuay
I was born in Belize City, Belize on Sept. 01, 1975 and raised in the beautiful town of San Ignacio, Cayo; both places in the Central American Country of Belize. Hence the name Cayo Buay, it is Kriol for Boy from Cayo.
Raised in a single mother home with 8 children, I learnt from a tender age that you can't always have what you want and you always have to work hard for what you get. As a young lad I remember going to my Granddad's farm to help as we could. CB has been working since Std 6 (8th Grade).
I worked in the Architecture and Construction field while I lived in Belize and worked on many projects ranging from residential to Commercial with the most notable being the extension to the PGIA.
I know about hard life and how to make a lot out of a little. But that was then, today I am a happily married man and father of a wonderful son who I am doing my utmost best to raise the right way. I've worked in Architecture and Construction, owned my own company, did some remodeling and some call center work both as a rep and as a manager, worked as a Helpdesk Admin, System Administrator and now an ESM (Enterprise System Monitor) Engineer, essentially a walking NSA.
Erasing history? Temporal cloaks adjust light's throttle to hide an event in time
WASHINGTON, Oct. 12 -- Researchers from Cornell University in Ithaca, N.Y., have demonstrated for the first time that it's possible to cloak a singular event in time, creating what has been described as a "history editor." In a feat of Einstein-inspired physics, Moti Fridman and his colleagues sent a beam of light traveling down an optical fiber and through a pair of so-called "time lenses." Between these two lenses, the researchers were able to briefly create a small bubble, or gap, in the flow of light. During that fleetingly brief moment, lasting only the tiniest fraction of a second, the gap functioned like a temporal hole, concealing the fact that a brief burst of light ever occurred. The team will present their findings at the Optical Society's (OSA) Annual Meeting, Frontiers in Optics (FiO) 2011 (http://www.frontiersinoptics.com/), taking place in San Jose, Calif. next week.
Their ingenious system, which is the first physical demonstration of a phenomenon originally described theoretically a year ago by Martin McCall and his colleagues at Imperial College London in the Journal of Optics, relies on the ability to use short intense pulses of light to alter the speed of light as it travels through optical materials, in this case an optical fiber. (In a vacuum, light maintains its predetermined speed limit of about 186,000 miles per second.) As the beam passes through a split-time lens (a silicon device originally designed to speed up data transfer), it accelerates near the center and slows down along the edges, causing it to balloon out toward the edges, leaving a dead zone around which the light waves curve. A similar lens a little farther along the path produces the exact but opposite velocity adjustments, resetting the speeds and reproducing the original shape and appearance of the light rays.

To test the performance of their temporal cloak, the researchers created pulses of light directly between the two lenses. The pulses repeated like clockwork at a rate of 41 kilohertz. When the cloak was off, the researchers were able to detect a steady beat. By switching on the temporal cloak, which was synchronized with the light pulses, all signs that these events ever took place were erased from the data stream.

Unlike spatial optical cloaking, which typically requires the use of metamaterials (specially created materials engineered to have specific optical properties), the temporal cloak designed by the researchers relies more on the fundamental properties of light and how it behaves under highly constrained space and time conditions.

The area affected by the temporal cloak is a mere 6 millimeters long and can last only 20 trillionths of a second. The length of the cloaked area and the length of time it is able to function are tightly constrained--primarily by the extreme velocity of light. Cloaking for a longer duration would create turbulence in the system, essentially pulling back the curtain and hinting that an event had occurred. Also, to achieve any measurable macroscopic effects, an experiment of planetary and even interplanetary scales would be necessary.
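The release's numbers are self-consistent, as a quick check shows: a 6-millimeter gap traversed at light speed lasts about 20 picoseconds, i.e. 20 trillionths of a second. (Treating the in-fiber speed as the vacuum value is our simplification; light in silica travels roughly a third slower.)

    C = 299_792_458    # metres per second, vacuum speed of light
    gap_metres = 6e-3  # the 6 mm cloaked region

    window_seconds = gap_metres / C
    print(window_seconds)         # ~2e-11 s
    print(window_seconds * 1e12)  # ~20 picoseconds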
FiO presentation FMI3, "Demonstration of Temporal Cloaking," by Moti Fridman et al. is at 4:45 p.m. on Monday, Oct. 17.

ABOUT THE MEETING
Frontiers in Optics 2011 is OSA's 95th Annual Meeting and is being held together with Laser Science XXVII, the annual meeting of the American Physical Society (APS) Division of Laser Science (DLS). The two meetings unite the OSA and APS communities for five days of quality, cutting-edge presentations, fascinating invited speakers and a variety of special events spanning a broad range of topics in physics, biology and chemistry. FiO 2011 will also offer a number of Short Courses designed to increase participants' knowledge of a specific subject while offering the experience of insightful teachers. An exhibit floor featuring leading optics companies will further enhance the meeting.

Useful Links:
Meeting home page (http://www.frontiersinoptics.com/)
Conference program (http://www.frontiersinoptics.com/Home/Conference-Program.aspx)
Searchable abstracts (http://fio-ls2011.abstractcentral.com/login)
PRESS REGISTRATION: A Press Room for credentialed press and analysts will be located in the Fairmont San Jose Hotel, Sunday through Thursday, Oct. 16-20. Those interested in obtaining a press badge for FiO should contact OSA's Angela Stark at 202-416-1443 or astark@osa.org.

ABOUT OSA
Uniting more than 130,000 professionals from 175 countries, the Optical Society (OSA) brings together the global optics community through its programs and initiatives. Since 1916 OSA has worked to advance the common interests of the field, providing educational resources to the scientists, engineers and business leaders who work in the field by promoting the science of light and the advanced technologies made possible by optics and photonics. OSA publications, events, technical groups and programs foster optics knowledge and scientific collaboration among all those with an interest in optics and photonics. For more information, visit http://www.osa.org.
Angela Stark
astark@osa.org
@opticalsociety
http://www.osa.org
Air Traffic Control Modernization Hits Turbulence
JOAN LOWY | Associated Press
WASHINGTON (AP) – Ten years after Congress gave the go-ahead to modernize the nation's air traffic control system, one of the government's most ambitious and complex technology programs is in trouble.

The Next Generation Air Transportation System, or NextGen, was promoted as a way to accommodate an anticipated surge in air travel, reduce fuel consumption and improve safety and efficiency. By shifting from radar-based navigation and radio communications – technologies rooted in the first half of the 20th century – to satellite-based navigation and digital communications, it would handle three times as many planes with half as many air traffic controllers by 2025, the Federal Aviation Administration promised.

Planes would fly directly to their destinations using GPS technology instead of following indirect routes to stay within the range of ground stations. They would continually broadcast their exact positions, not only to air traffic controllers, but to other similarly equipped aircraft. For the first time, pilots would be able to see on cockpit displays where they were in relation to other planes. That would enable planes to safely fly closer together, and even shift some of the responsibility for maintaining a safe separation of planes from controllers to pilots.

But almost nothing has happened as FAA officials anticipated. Increasing capacity is no longer as urgent as it once seemed. The 1 billion passengers a year the FAA predicted by 2014 has now been shoved back to 2027. Air traffic operations – takeoffs, landings and other procedures – are down 26 percent from their peak in 2000, although chronic congestion at some large airports can slow flights across the country.

Difficulties have cropped up at almost every turn, from new landing procedures that were impossible for some planes to fly to aircraft-tracking software that misidentified planes. Key initiatives are experiencing delays and are at risk of cost overruns. And the agency still lacks "an executable plan" for bringing NextGen fully online, according to a government watchdog.

"In the early stages, the message seemed to be that NextGen implementation was going to be pretty easy: You're going to flip a switch, you're going to get NextGen, we're going to get capacity gains," said Christopher Oswald, vice president for safety and regulatory affairs at Airports Council International-North America. "It wasn't realistically presented."

Some airline officials, frustrated that they haven't seen promised money-saving benefits, say they want better results before they spend more to equip planes to use NextGen, a step vital to its success.

Lawmakers, too, are frustrated. NextGen has enjoyed broad bipartisan support in Congress, but with the government facing another round of automatic spending cuts, supporters fear the program will be increasingly starved for money.

"It's hard not to be worried about NextGen funding ... because it's a future system," said Marion Blakey, who was the head of the FAA when the program was authorized by Congress in 2003 and now leads a trade association that includes NextGen contractors. "There is a temptation to say the priority is keeping the existing systems humming and we'll just postpone NextGen."

In September, a government-industry advisory committee recommended that, given the likelihood of budget cuts, the FAA should concentrate on just 11 NextGen initiatives that are ready or nearly ready to come online. It said the rest of the 150 initiatives that fall under NextGen can wait.
"You can't have an infrastructure project that is the equivalent of what the (interstate) highway program was back in the '50s and the '60s and take this ad hoc, hodgepodge approach to moving this thing forward," said Air Line Pilots Association First Vice President Sean Cassidy, who helped draft the recommendations. The threat of funding cuts comes just as NextGen is nearing a tipping point where economic and other benefits should start to multiply if only the FAA and industry would persevere, said Alaska Airlines Chairman Bill Ayers, a supporter. Responding to industry complaints, the FAA has zeroed in on an element of NextGen that promises near-term benefits: new procedures that save time and fuel in landings while decreasing greenhouse gas emissions. Planes equipped with highly calibrated GPS navigation are able fly precise, continuous descents on low power all the way to the runway rather than the customary and time-consuming stair-step approaches in which pilots repeatedly decrease power to descend and then increase power to level off. Last spring, Seattle-Tacoma International Airport became the first large airport where airlines can consistently use one of the new procedures. Known as HAWKS, the procedure shortens the approach from the southwest by about 42 miles. Multiplied over many planes every day it adds to up to significant savings, an enticing prospect for airlines, which typically operate on razor-thin profit margins. Alaska, with a major hub in Seattle, estimates new procedures there will eventually cut the airline's fuel consumption by 2.1 million gallons annually and reduce carbon emissions by 24,250 tons, the equivalent of taking 4,100 cars off the road every year. Fuel is the biggest expense for most airlines. In Atlanta, more precise navigation procedures have increased the number of departure paths that planes can fly at the same time, enabling takeoffs to double from one every two minutes to one every minute. That has freed up an additional runway for arrivals, said Dale Wright, the National Air Traffic Controllers Association's safety and technology director. FAA Administrator Michael Huerta says NextGen is on track despite the troubles. "It's a significant transformation that we're making," he told The Associated Press. "I would hope it would be moving faster as well, but we have a very large, a very complex system, and we're making great progress." But even use of the GPS-based procedures has been slowed by unforeseen problems. It takes several years to develop each procedure airport by airport. At large airports, new procedures are used only sporadically. During busy periods, controllers don't have time to switch back and forth between the new procedures, which most airliners can use, and older procedures that regional airliners and smaller planes often must still use. Consequently, older procedures are used because all planes can fly them. At six large airports in Chicago, New York and Washington, only 3 percent of eligible flights have used the new procedures, Calvin Scovel, the Transportation Department's inspector general, told a congressional hearing in July. Many other NextGen initiatives "are still in the early stages of development," he said. Another important NextGen initiative would replace radio communications between controllers and pilots with text messaging and digital downloads. Radio frequencies are often crowded, and information sometimes must be repeated because of mistakes or words not heard. 
Digital communications are expected to be safer and more efficient. But airlines are reluctant to make additional investments in new communications equipment for planes until the FAA shows NextGen can deliver greater benefits like fuel savings from more precise procedures, said Dan Elwell, a senior vice president at Airlines for America, a trade association for major carriers.

Southwest Airlines spent more than $100 million in 2007 to equip its planes to use the new procedures. The airline expected to recoup its investment by 2011, but is still not there, primarily because of the FAA's slow pace, said Rick Dalton, Southwest's director of air space and flow management.

NextGen was originally forecast to cost $40 billion, split between government and industry, and to be completed by 2025. But an internal FAA report estimates it will cost three times that much and take 10 years longer to complete, Scovel said. FAA officials have largely stopped talking about end dates and completion costs as the technologies that make up NextGen continue to evolve. The agency currently spends about $800 million a year on the program.

"When we're talking about NextGen, it's like we're talking about the atmosphere," Cassidy said. "It's tough to pin down exactly what NextGen is in terms of the technologies and the cost of the technologies because, frankly, they're changing all the time."

Hopefully the FAA can make a "mid-course correction" to get NextGen on track, said Rep. Rick Larsen, D-Wash., a supporter. "We shouldn't give up on the effort because I think everybody understands there is a lot of benefit to it." But he's concerned that more delays in the program "could force us to rename it LastGen."

Follow Joan Lowy on Twitter at www.twitter.com/AP_Joan_Lowy

Copyright 2013 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
42% ad revenue decline: AOL's uphill battle
By Tobi Elkin.
Published on July 29, 2002.

The Securities and Exchange Commission has launched a probe of its advertising accounting. Rival MSN is moving in on its customers. The CEO's office is vacant. The broadband Internet strategy is murky. Subscriber growth is sluggish. Ad revenue has plunged.
And America Online's top advertising and marketing executive is charged with the unenviable-some would say Sisyphean-task of revamping the service's sales strategy to stem the decline.
When parent AOL Time Warner reported earnings July 24, the AOL division turned in the worst performance in the company. The service's ad and commerce revenue crashed 42% last quarter to $412 million (or $362 million after revenue from AOL Time Warner siblings is factored out).

secure spot
The April 22 appointment of Bob Sherman, former president of Time Warner Cable Advertising Sales, to president of AOL Interactive Marketing was perceived as a positive for the online giant, bogged down as it is by the steep decline in online ad revenue. Now it is up to Mr. Sherman to boost the unit's sagging fortunes. Although he was installed by ousted AOL Time Warner Chief Operating Officer Robert Pittman and had no existing relationship with his new boss, Don Logan, insiders believe Mr. Sherman's position is secure. That theory will be tested when AOL Time Warner Chairman Richard Parsons and Mr. Logan, newly named chairman of the Media & Communications Group, install a new CEO at AOL.
The company hired a headhunter to explore outside candidates for the CEO spot, but is also considering a tight group of internal candidates, including Time Inc. Exec VP Michael Klingensmith and Jimmy de Castro, AOL president-interactive services and Mr. Sherman's boss.
"Sherman is held in high regard. He has pretty good equity in other parts of the company," said Larry Goodman, president, CNN sales and marketing, who serves with Mr. Sherman on AOL Time Warner's Ad Council. `sensible' to anyone
Mr. Sherman expressed confidence the changes he has initiated will stick regardless of the management shifts. "I believe very strongly that what we've accomplished in the last eight weeks in this reorganization is consonant with the very best practices that I've been able to learn over the last 30 years running advertising and subscription businesses," he said. "That makes me believe that these changes will be sensible to any new management team."
In an earlier interview, Mr. Logan said the AOL CEO search "will take as long as it takes" and that after it is decided, "we can go forward." In AOL Time Warner's earnings call last week, Mr. Parsons said the unit's CEO will be named "shortly."
Mr. Sherman has already retooled AOL's ad sales organization, which focused on vertical industries such as automotive, fast food and retail. Mr. Sherman replaced that organization with a more traditional regional sales structure to enable more frequent client contact and improve accountability on the part of both the sales and account teams. He's also moved to streamline internal processes in order to reduce the number of steps and points of contact marketers and agencies must encounter when dealing with AOL, and has begun tackling the pricing of media and marketing programs.
a `dare' to do business

"We're taking a close look at pricing. We ought to be at the very high end of the bell-shaped curve, if not a little bit beyond in the [media] business," Mr. Sherman said. "We were unfriendly in our offerings to the media community, our prices were probably too high, they were almost disdainful of the marketplace," he continued, adding, "We almost dared people to do business with us."
Mr. Sherman and Lon Otremba, a former Time Warner Cable executive installed by Mr. Sherman to coordinate processes within the unit as exec VP-AOL interactive marketing, said they want AOL to be more customer-focused and client-friendly.
To that end, Mr. Sherman and his team hit the road over the last three weeks meeting with marketers ranging from McDonald's Corp. and Citigroup to Travelocity and Samsung Electronics in a bid to allay concerns and communicate that AOL wants to be a good business partner. AOL, Mr. Sherman said, wants to structure deals that meet the marketer's needs, not just its own. "He's clearly aligned things in such a way that it's better [and] going to benefit McDonald's," said Neil Perry, senior director of national marketing, McDonald's. "There's a real feeling that AOL has immersed itself in our business."
McDonald's has done business with individual units of AOL Time Warner including AOL, which has served as its main point of contact in discussions with AOL Time Warner executives to craft a cross-media arrangement. "We are making great progress on what it might be and what it might look like, but it's not there yet," said Mr. Perry of the year-long process, adding that a deal was at least three or four months away.
Mr. Perry doesn't see the SEC probe as an issue that will stall talks. "It is my understanding the SEC is doing an investigation of AOL's accounting practices, and that they have not charged them with any wrongdoing," he said.
Other marketers said they would closely follow the SEC probe. "In all honesty, I will keep abreast of the process of this investigation. But if I were in a relationship with [AOL Time Warner] at this time, I wouldn't stop the process or the negotiation, though certainly I would be a fool not to keep track of what's going on there," said Kurt Graetzer, CEO, Fluid Milk Processor Promotion Board, a consortium of U.S. milk producers. "They are simply too important in the field. They have an extraordinary base of users and there is loads of opportunity there."

a port in a storm
Marketers with whom Mr. Sherman and his team have met have expressed confidence in his demeanor amid the swirl of uncertainty at AOL. "[Bob] Sherman has been very good about listening [to us] about what our issues have been. He's a quick read," Mr. Graetzer said. MilkPEP had a $9 million marketing program with Time Inc., Time Warner cable properties and AOL last summer. But the deals were struck with individual units and not as a package deal, which suited Mr. Graetzer, who experienced flameouts with two previous multiplatform deals with AOL Time Warner. "They [the individual units] still do a very good job for us, they are still strong partners, but don't even think about bundling up this media unless you thoroughly understand and can provide a service to the client," Mr. Graetzer said. Rishad Tobaccowala, exec VP, Bcom3 Group's Starcom MediaVest Group, agreed, maintaining that smaller deals are more viable and that combining the right properties, for example, the WB's Cartoon Network and CartoonNetwork.com, makes more sense for achieving marketer goals. "In the new world, you can't approach things with just scale as the answer."
Sam Gilliland, president-CEO, Travelocity, sensed a new attitude of partnership on AOL's part. The online travel service is halfway through a five-year, $200 million marketing and media relationship with AOL.
"The good news is that it appears as though they are very much interested in taking the relationship to the next level," Mr. Gilliland said, alluding to the potential to work with AOL on database management programs. Ominously, America Online's backlog of advertising deals has fallen precipitously. In October 2000, AOL reported a backlog of $3 billion in ad deals, representing future revenue. AOL last week said the backlog now is $860 million. Meanwhile, the status of several cross-company deals with marketers hangs in the balance, along with the structure and leader of the Global Marketing Solutions Group, the internal body charged with implementing mega-deals. "To date, an unfortunate set of circumstances have worked against AOL in working big deals and that is that many advertisers still separate digital [media] planning from traditional [media] planning, particularly the consumer package-goods companies," said Maggie Boyer, VP-media at digital agency Avenue A, Seattle. "Today, as an agency, we're a little bit confused. Who do I call, the agency relations or the sales folks? It's not altogether clear to us. Who's able to make change happen for me?" For his part, Mr. Sherman believes that cross-platform work holds promise, however: "We can restructure until the cows come home ... and if we aren't an organization that people trust and can rely upon, we're kidding ourselves." contributing: jon fine | 科技 |
2017-09/1579/en_head.json.gz/22702 | The Ericsson Award for Mobility Content and Applications in Canada - Submit your nominations today!
The Ericsson Award for Mobility Content and Applications to be given at CATAAlliance's Innovation Awards Gala
OTTAWA, December 15, 2014 - CATAAlliance, Canada's largest high technology association, announced that nominations are now open for the Outstanding Product Achievement Award recognizing excellence in Mobility Content and Applications. The Award is named after one of the world's leading telecommunications firms, Ericsson.
The Award will be presented to a Canadian organization for an outstanding technology engineering development which has resulted in the production of a world-class Mobility product or family of products. The product will have proven itself in operation and its design and market success will have enhanced Canada's international reputation for innovation and excellence.
Mobile devices are the most widely adopted communications technology in history. There are now more mobile devices in use than PCs, landline phones, internet users, and daily newspaper readers -- put together.
Finalists and award recipients will be chosen by a selection committee comprising experts drawn from the CATAAlliance National Leadership Council and Ericsson Canada.
According to CATAAlliance President, John Reid, "Ericsson continues to demonstrate its dedication to technology excellence and leadership through its support of recognition programs, research and advocacy. Awards and related programs create role models and also help to reinforce the power of new technologies in stimulating Canada's business growth and productivity. They are an important part of building Canada's Innovation Nation. We're proud that the Mobility Technology Award is named in Ericsson's honour and congratulate them on their corporate citizenship."
"Mobility is growing at a very fast pace. The number of mobile broadband subscriptions exceeded 2.3 billion in 2014, "said Pierre Boucher, Director, Research and Innovation, Ericsson Canada Inc. "The amount of data usage per subscription also continues to grow steadily and over 55 percent of all mobile phones sold in Q3 2013 were smart phones. Mobility is mainstream technology and the race to develop new technologies to support the Networked Society is on and Canadian companies need to be at the forefront. We are honoured to support this award which recognizes Canada's best in Mobility Content and Applications."
The Ericsson Award for Mobility Content and Applications will be presented on May 21st, 2015 in Ottawa at the CATAAlliance's Annual Innovation Awards Gala Dinner, a 30 year tradition of recognizing outstanding technological innovation and corporate leadership in expanding the frontiers of Canada's advanced technology industries.
Former CTV host Paul Brent will be returning as the guest MC and as moderator of an executive panel focusing on Innovation Leadership and Social Give Back.
Please submit your Mobility Content and Applications nomination for the 2015 award today at: http://www.cata.ca/innovation-and-leadership-awards/awards/innovnominate.aspx
President, CATAAlliance | 科技 |
2017-09/1579/en_head.json.gz/22707 | Live From Space
Mike Massimino
NASA Astronaut, Mission Specialist
NASA astronaut Mike Massimino is a veteran of two Space Shuttle missions, both of which set out to service the Hubble Telescope, including the historic final repair mission in 2009.
DOB: August 9, 1962
Birth Place: Oceanside, New York
Hometown: Franklin Square, New York
Residence: Houston, Texas
Family: He is married with two children.
Education: Bachelor of Science in Industrial Engineering; Master of Science in Mechanical Engineering; Master of Science in Technology and Public Policy; Degree of Mechanical Engineer; and Ph.D. in Mechanical Engineering.
Hobbies: Mike enjoys baseball, family activities, camping, and coaching kids’ sports.
Spaceflights: STS-109; STS-125
While at MIT, Mike spent his summers working at NASA in various positions, including as a general engineer and as a research fellow.
After graduating from MIT in 1992, Mike worked at McDonnell Douglas Aerospace in Houston, Texas as a research engineer where he developed laptop computer displays to assist operators of the Space Shuttle missions.
Between 1992 and 1996, he took up positions as visiting assistant professor at Rice University and assistant professor at Georgia Institute of Technology. In 1996, Mike was selected as an astronaut candidate by NASA, reporting to Johnson Space Center in Houston, Texas. Prior to his first space flight assignment, Mike served in the Astronaut Office Robotics Branch and in the Astronaut Office Extravehicular Activity (EVA, or spacewalking) Branch. In 2002, following his first spaceflight, Mike served as a CAPCOM (spacecraft communicator) in Mission Control and as the Astronaut Office Technical Liaison to the Johnson Space Center EVA Program Office. A veteran of two space flights (STS-109 in March 2002 and STS-125 in May 2009), Mike has logged a total of 571 hours and 47 minutes in space, and a cumulative total of 30 hours and 4 minutes of spacewalking in four spacewalks.
In addition to various technical tasks, Mike currently serves as Chief of the Astronaut Appearances Office.
He is also "on loan" from Johnson Space Center to Rice University as their Space Institute Executive Director.
Since 2011, he has starred in the CBS sitcom The Big Bang Theory as a recurring character. He plays a fictionalized version of himself and has been featured in four episodes so far.
Space Flight Experience
STS-109 Columbia
STS-109 was the fourth Hubble Space Telescope servicing mission. The crew of STS-109 successfully upgraded the Hubble Space Telescope leaving it with a new power unit, a new camera (the Advanced Camera for Surveys), and new solar arrays. STS-109 set a record for spacewalk time with 35 hours and 55 minutes during 5 spacewalks. Mike performed 2 spacewalks totaling 14 hours and 46 minutes. STS-109 orbited the Earth 165 times, and covered 4.5 million statute miles in over 262 hours and 10 minutes.
STS-125 Atlantis
STS-125 was the fifth and final Hubble servicing mission. The 19-year-old telescope spent six days in the Shuttle’s cargo bay undergoing an overhaul conducted by four spacewalkers over five daily spacewalks, with the assistance of crewmates inside the Atlantis. The spacewalkers overcame frozen bolts, stripped screws, and stuck handrails. STS-125 set a new record for spacewalking with 36 hours and 56 minutes during five spacewalks. Mike performed 2 spacewalks totaling 15 hours and 58 minutes. The refurbished Hubble Telescope now has four new or rejuvenated scientific instruments, new batteries, new gyroscopes, and a new computer. The STS-125 mission traveled 5,276,000 miles in 197 Earth orbits and was accomplished in 309 hours, 37 minutes and 9 seconds.
Mike's Hubble Telescope Spacewalk
The Hubble
In 1990, the Hubble became the first major optical telescope to be positioned in space. The space observatory Hubble takes pictures of stars, planets & galaxies as it whirls around Earth at 17,500 mph. In its 20 years it has made more than 930,000 observations and photographed over 570,000 images of 30,000 celestial objects.
Hubble weighs 24,500 pounds—as much as two full-grown elephants.
Hubble is 13.3 meters (43.5 feet) long—the length of a large school bus.
It has already made more than 110,000 trips around our planet and traveled about 2.8 billion miles.
Mike's Second Repair Mission
By 2009, the power supply on one of Hubble’s instruments had failed. And there was no real way to replace this unit or to repair the instrument, because it was buttoned up behind an access panel that blocked the failed power supply. This access panel had 117 small screws with washers, and just to play it safe, they had put glue on the screw threads so they would never come apart.
Mike spent five years practicing this spacewalk. The first thing he had to do was remove a handrail from the telescope that was blocking the access panel. There were four screws to remove, but the fourth screw was stuck. Mike realized the screw was stripped and that the handrail wasn’t coming off, which meant he couldn’t get to the access panel with the 117 screws he had been worrying about for five years, which meant he couldn’t get to the failed power supply, which meant he wasn’t going to be able to fix the instrument that day. He felt alone in the darkness. Mike liaised with Mission Control for the next hour, and eventually they suggested he use gaffer tape. They wanted him to tape the bottom of the handrail and then see if he could yank it off the telescope. They said it was going to take about sixty pounds of force to do that.
It worked! The handle came off. Then Mike pulled out his power tool, and now he had that access panel with those 117 little bitty screws with their washers and glue, ready to tackle each one of them. Mike pulled the trigger on the power tool and nothing happened; the battery was dead. After swapping out the battery and recharging his oxygen tank, the rest of the job went well and the Hubble came back to life. After eight hours of spacewalking, Mike returned to the airlock, but with 15 minutes to wait for his colleague, he was sent back outside just to soak up the magnificent view. | 科技
2017-09/1579/en_head.json.gz/22769 | It Can't Possibly Be That Easy
Over the weekend, I read Paul Krugman's big essay on climate economics, Building a Green Economy. In it, he makes the following claim:
Just as there is a rough consensus among climate modelers about the likely trajectory of temperatures if we do not act to cut the emissions of greenhouse gases, there is a rough consensus among economic modelers about the costs of action. That general opinion may be summed up as follows: Restricting emissions would slow economic growth — but not by much. The Congressional Budget Office, relying on a survey of models, has concluded that Waxman-Markey “would reduce the projected average annual rate of growth of gross domestic product between 2010 and 2050 by 0.03 to 0.09 percentage points.” That is, it would trim average annual growth to 2.31 percent, at worst, from 2.4 percent. Over all, the Budget Office concludes, strong climate-change policy would leave the American economy between 1.1 percent and 3.4 percent smaller in 2050 than it would be otherwise.
And what about the world economy? In general, modelers tend to find that climate-change policies would lower global output by a somewhat smaller percentage than the comparable figures for the United States. The main reason is that emerging economies like China currently use energy fairly inefficiently, partly as a result of national policies that have kept the prices of fossil fuels very low, and could thus achieve large energy savings at a modest cost. One recent review of the available estimates put the costs of a very strong climate policy — substantially more aggressive than contemplated in current legislative proposals — at between 1 and 3 percent of gross world product.
Such figures typically come from a model that combines all sorts of engineering and marketplace estimates. These will include, for instance, engineers’ best calculations of how much it costs to generate electricity in various ways, from coal, gas and nuclear and solar power at given resource prices. Then estimates will be made, based on historical experience, of how much consumers would cut back their electricity consumption if its price rises. The same process is followed for other kinds of energy, like motor fuel. And the model assumes that everyone makes the best choice given the economic environment — that power generators choose the least expensive means of producing electricity, while consumers conserve energy as long as the money saved by buying less electricity exceeds the cost of using less power in the form either of other spending or loss of convenience. After all this analysis, it’s possible to predict how producers and consumers of energy will react to policies that put a price on emissions and how much those reactions will end up costing the economy as a whole.
There are, of course, a number of ways this kind of modeling could be wrong. Many of the underlying estimates are necessarily somewhat speculative; nobody really knows, for instance, what solar power will cost once it finally becomes a large-scale proposition. There is also reason to doubt the assumption that people actually make the right choices: many studies have found that consumers fail to take measures to conserve energy, like improving insulation, even when they could save money by doing so.
But while it’s unlikely that these models get everything right, it’s a good bet that they overstate rather than understate the economic costs of climate-change action. That is what the experience from the cap-and-trade program for acid rain suggests: costs came in well below initial predictions. And in general, what the models do not and cannot take into account is creativity; surely, faced with an economy in which there are big monetary payoffs for reducing greenhouse-gas emissions, the private sector will come up with ways to limit emissions that are not yet in any model.Now, it's important to note that the goal of the Waxman Markey bill is to reduce US carbon emissions by 83% by 2050 (from 2005 levels, so even more than that from 2010 levels). So essentially, the CBO is saying, and Krugman is endorsing, that this level of emissions reduction will have so small an effect on economic growth that it's going to be indistinguishable from noise. I don't dispute that environmental economists think this, but I find it to be a completely facially implausible conclusion. I want to lay out two arguments for why these economists cannot possibly be right. The first is a common-sense argument about what actually has to happen at the level of the lives of individual citizens to bring about such a large reduction in carbon emissions. The second argument is based on looking at what was required to cause significant changes in energy efficiency in past episodes.
Let me start by saying, for any new readers that happen to stumble across this, that I believe the general thrust of the scientific consensus on climate change, and I strongly agree there is an excellent case for decisive action. See here and here for some past relevant posts. In general, I like to get my information on climate change from reading Science, Nature, and PNAS, rather than from partisan political sources. However, I also believe in being realistic about what one is proposing to do, and honest about the implications.
Next, let's think briefly about some implications of the quantitative claims above about economic growth and emissions reductions. US trend economic growth in recent decades is about 3% a year. So between now and 2050, in a business-as-usual future that is similar to the recent past, we would expect the economy to grow by 1.03^40 - 1 ≈ 225%. So the economy will be about three times as large as it currently is. Some of this will come from there being more people in the US, but more of it will come from the people being wealthier, which of course they generally like to express by having bigger houses, bigger and faster cars, and more advanced technology to fill them both with.
Now, if the economy is going to be a bit more than three times larger, but we are only going to emit 17% of the current level of carbon emissions, then the carbon intensity of the economy - that is, the ratio of carbon emitted per dollar of goods and services created - is going to have to be only 5% of the current value. Next you have to figure that there are certain things in an industrial society that are very hard to do without liquid fuel - construction and agricultural machinery come to mind, along with aviation. Relying heavily on biofuels is a very dubious prospect in a world that also needs to feed 9 billion (assumed wealthier) people from its limited agricultural land. So you can probably figure that the residual 5% of carbon emission intensity is all going to go on these kinds of specialized uses that are hard to substitute.
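For concreteness, here is a minimal back-of-the-envelope sketch of that arithmetic in Python (the variable names are mine; the only inputs are the 3% trend growth rate and the 17%-of-current emissions target discussed above):

    # Back-of-the-envelope check of the growth and carbon-intensity arithmetic.
    # Assumptions (from the post): 3% trend GDP growth, 40 years, and emissions
    # cut to 17% of current levels (an 83% reduction).
    growth_rate = 0.03
    years = 40
    remaining_emissions = 0.17

    gdp_factor = (1 + growth_rate) ** years               # ~3.26x today's economy
    implied_intensity = remaining_emissions / gdp_factor  # carbon per dollar of GDP

    print(f"GDP growth over the period: {gdp_factor - 1:.0%}")             # ~226%, i.e. roughly the 225% above
    print(f"Implied carbon intensity vs. today: {implied_intensity:.1%}")  # ~5.2%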
Therefore, these goals basically imply that the ordinary living and working of most citizens would be essentially carbon free by 2050. That is in 40 years' time.
Now, I can certainly imagine a middle class lifestyle and workstyle that is carbon free. The technology is almost there. For example, we could live in super-insulated passive solar houses, we could drive electric cars to work at our super insulated zero-emissions offices and factories. The electricity to power our cars, provide for residual heating and cooling needs, and drive our industrial production would all (or almost all) have to come from some combination of renewables and nuclear, rather than the coal and natural gas that form the bulk of it today. I think if everyone did something along those lines, we could get down to 5% of our current carbon intensity.
But it should be clear that this basically requires replacing almost everything in our society. Since today our houses are by and large built with R12 2x4 stud walls, pretty much all of them would need to be replaced to avoid the need for lots of heating/cooling energy. Ditto our commercial and industrial buildings. And of course most of our current electricity generation infrastructure would need to go too. Finally, of course, all the cars will have to be replaced.
Now, the lifetime of cars is much less than 40 years, so they will all be replaced anyway; that's not a problem (though there certainly are questions about the ultimate scalability of that many electric cars). But the median age of a house is 35 years. Here's the age of housing as of 2003 according to the US Census Bureau, American Housing Survey for the United States:
As you can see, there are a lot of houses that are more than 40 years old. Note also that this kind of graph tends to understate the life of houses - most of the young houses are built on greenfield sites on the edge of town, and most of the older houses in town are still there. So we are going to have to do a lot of extra replacement to get to 5% of current carbon intensity. Instead of just building big houses on the outskirts of town, we also need to go and replace everything in town.
And of course other kinds of infrastructure tend to last even longer than houses - for example, the median age of current coal plants is 44 years.
Now, think of it this way: suppose you have a certain amount of money to spend over the next forty years that is your share of industrial society's surplus. You could take that money and either a) tear down your house and replace it with a super-insulated carbon-neutral one of about the same size, or b) add an extra floor and a swimming pool to the house you have and continue to power it with cheap fossil fuels (coal and shale gas, let's say).
I would argue that this is, very roughly, what the choice between business as usual and an 80% reduction in carbon emissions means in personal terms. I think at the personal level, most of us can understand that if we have to completely replace the house, we're not going to end up with the same amount of house as if we just add to the one we have.
My second argument is based on looking at past history. In order to get to 5% of our current carbon intensity in 40 years, we need to improve carbon efficiency by an average of 7.2% per year (0.05^(1/40) = 0.928). That's a very large rate of change. In particular, our main experience with society making serious improvements in energy intensity is as a result of the oil shocks of the 1970s. I have looked extensively in the past at the effect of those shocks. Here for example, is the year-on-year rate of change of deployed vehicle fuel economy in the US fleet (see here for methodological details):
As you can see, in the late 1970s and 1980s, we reached a level of fuel efficiency improvements of around 2-3% a year, sustained for a little over a decade. The peak year was an improvement of 6.5%. But what was required to kick that off? Two massive oil shocks, each of which led to a big recession. The first was the Arab oil embargo in 1973-74, and the second was the effect of the Iranian revolution in 1979 and the immediately following Iran-Iraq war. Google shows the effect on US GDP as follows:
So to get a puny 2-3% a year for a decade, just in the liquid-fuel sector, took two major recessions (or three, if you include the 1982 one).
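To put the numbers side by side, here is a minimal sketch, again assuming only the 5% intensity target over 40 years:

    # Average annual decline in carbon intensity needed to reach 5% of today's
    # value in 40 years, versus the ~2-3%/yr managed after the 1970s oil shocks.
    target_intensity = 0.05
    years = 40
    required_rate = 1 - target_intensity ** (1 / years)
    print(f"Required: {required_rate:.1%} per year, every year, for {years} years")  # ~7.2%

    # Even with zero GDP growth, emissions alone must still fall 83%:
    print(f"Zero-growth case: {1 - 0.17 ** (1 / years):.1%} per year")  # ~4.3%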
So is it really plausible that we can price carbon high enough to improve our fossil fuel intensity across the whole economy by an average of 7% a year for 40 years, with no effect on growth? That appears wildly implausible to me.
carbon emissions, fuel economy, main posts, oil efficiency
Michael Dawson
I think it's even more pie-in-the-sky than you say. You find it plausible that we'll have 4 or 5 billion automobiles in 2050. Personally, I find that to be a rather wild assumption of its own. We're somehow going to wean ourselves completely off oil while also figuring out how to not just maintain but expand current methods of industrial production -- all with electricity and renewables only? And then, on top of that, we're also going to choose to squander a big chunk of the new power-generating arrangement on making and driving cars -- machines that sit idle 95% of their lives and devote a ton of materials to a task that, under equal conditions, could be accomplished with a 30-lb bicycle?
porsena
Krugman's estimate puts him in other illustrious company. In 1996, the UK Government published a review of the economic impact of controlling greenhouse gases. Led by Sir Nicholas Stern, the report put the cost of stabilizing atmospheric CO2e at 550 ppm at around -2% to +5% of GDP annually by 2050, with the most probable value of 1%. Stern had been Chief Economist and VP of the World Bank. Stern's review generated considerable discussion but my sense is a general acceptance of his conclusion that the economic cost would be lower if action was taken to lower emissions sooner rather than later.
It's also worth noting that three quarters of the US economy is propelled by the service sector, a lower carbon effort than primary industries. And, tongue-in-cheek, I note that 8%-and-growing of the US GDP is provided by the financial services sector, which we all know has no real inputs at all :)
And sorry, that's 2006 for the Stern review, not 1996!
Américain à Paris
First of all, your essay shows an American lack of vision. I didn't need a car when I lived in New York, nor during my two years in Paris, nor at my final abode here in Budapest. Second of all, Americans will have no choice but to wean themselves off oil. The majority of the world's giant oil fields are in decline. The world needs to discover and bring online a Saudi Arabia every three years just to offset erosion of the existing production base. There will be no cars running on oil-based fuels in 40 years ....
As a building energy engineer, I think the assumption that houses must be replaced to become carbon-neutral is just wrong. I personally converted my 1940s shack to a low-energy passive solar house without replacing it. Basic requirements are additional external or internal insulation and much improved windows. I agree with the thrust of the article that the conversion to a low-carbon economy will be much more difficult than Krugman envisions, and that we will become poorer with either path, whether through investment and lower-consumption lifestyles or because we fail to act and suffer economic impacts of climate change.
Very hard to understand the economy as it is. Trying to understand how it might be if it was completely changed is even harder. Still, as your clear rebuttal shows, modern economic analysis is unnecessarily hopeless. Anyway energy is the key: we'll either have very cheap electric energy from one of the nuclear research projects, or things will be very bad indeed. Since the doom market is saturated, I'm rather interested in considering how the world will develop if there is very cheap electricity. We can use it to create liquid fuels to keep our liquid-oriented economy going, but maybe we can do something better?
P.S. The joy of spell checking is that all misspellings are actual words. You probably want "farcical".
rks - you could probably argue that my use of facially was a bit too innovative, but I meant it in the legal sense of "obviously wrong on its face". Cf facial challenge.
Tom - I'm interested in more detail on your retrofit.
DaveMart
Here is Bill Gates with his ideas on how to solve the problem: http://www.ted.com/talks/bill_gates.html
He argues that it is no good trying to deploy immature technologies which will slightly reduce the problem, that feed-in tariffs and so on are hugely expensive, and that it would be better to spend some tens of billions on developing really revolutionary technology in wind, solar, and nuclear. He is backing the Traveling Wave Reactor. I am keen on the Liquid Fluoride Thorium Reactor - more here: http://thoriumenergy.blogspot.com/2009/12/wired-thorium-article-available-online.html
Solar is tough to do even as far north as the US, but say between 20 degrees north and south it is a different matter, and most people live there. So you spend more on R & D, and when you have an economic solution, build it fast. In my view nuclear comes into that category now, and any supposed risks surely pale in comparison with the risks from climate change. Therefore I'd suggest a build-out of conventional nuclear now, followed by more advanced nuclear, solar and perhaps high-altitude wind.
crisismaven
Very cogent arguments. Problem is politicians cannot think, the CBO is not an independent think tank, nor is the human mind meant to discuss exponential curves or reductions. And Krugman anyway is one of the greatest "linearist" simplifiers of all. No wonder Nobel didn't dedicate a mathematics prize - he knew the committee wouldn't be up to it, or they wouldn't have awarded the IPCC with its faulty calculators and its projections with error greater than forecast. However that may be, there was another refutation of these mealy-mouthed projections over at the Mises Institute: "Correcting Krugman on Climate" and "The Costs of Carbon Legislation".
Even Colin Campbell is talking peak demand now. I have my doubts; depending on which stream you're talking about, the US hit trough fairly quickly from the late 70s peaks, which I documented in the article I submitted to TOD. We can do a lot with CAFE etc. but the fleet itself is larger now, thus new vehicles have less of an impact than they did then. Also ethanol takes away some of those gains in the first place. The other big conservation measure employed then was the phasing out of residual fuel oil, from 15.88% of US consumption in 1977 down to 8.78% in 1984. Unfortunately China has already played this card, being down to 8.95% in 2008. This is the only major stream in US demand that went down for good; others eventually hit trough and rebounded - see that TOD article if interested: The Oil Drum | Petroleum Demand Lessons from the Late 1970s.
Whether we absolutely positively have to have more and more ethylene every year to grow the economy is open to question, of course. Shopping bags, pfffah! Just put RFID chips in all the products and shoppers can just push their carts past readers and tally up their purchases in a split second - and have their AmeroCards deduct the total from their savings account. Checkers freshly unemployed can go, uh, into flipping real estate or making iPod skins... I want to examine how economies that have gone to hell and back had their petroleum demand affected - Japan, FSU, Argentina, Indonesia. Probably some lessons to be drawn in there.
Datamunger
Great post, Stuart. Though, if the neocons can serve up endless fudge to get the nation to go to war, maybe ol' Kruggers is to be forgiven. This issue has legs. Glad you raised it.
I agree that this carbon reduction will be difficult, and often argue with my climate economist friends about the somewhat rosy assumptions behind their calculations. Effectively, their models are based on a Panglossian, best-of-all-possible-worlds view, in the sense that one focuses on a climate target as the end point and then lets the model calculations find an optimized path to get there. In other words, there is an all-knowing central planner (in a beneficial way) who guides the economy toward the low-carbon goal. Not likely, but at least a glimmer of hope to evaluate in more detail.
As to the issue of housing modifications, some fairly simple calculations based on typical houses for the midwestern US show that 30-50% of energy use can be saved by simple, cost-effective measures such as sealing leaks, additional insulation, behavior changes (temperature setbacks and water management), etc. With those as starting points, and some creative ways to do financing of costs, renewable sources of energy supply become more viable, thus further reducing the carbon footprint.
I'm not much of an optimist, but I think there are still paths available to us, for a short time at least.
Engineer-Poet
(4096 char limit? Whiskey Tango Foxtrot?! Okay, splitting this up.)
Stuart, you know I'm a big fan of your work, but I'll go ahead and play advocatus diaboli: maybe it can be that easy. I won't say it will be for sure, but I'll try to make a case for it.
How it can be that easy isn't a big mystery. We've been conducting our affairs without regard to GHG emissions since the Industrial Revolution, roughly 150 years ago. We've done this because until recently it hasn't cost us in any way we cared to count, or even knew how to. We have counted other things, like SOx and fly-ash emissions, and done a pretty good job of slashing them (where interest groups haven't put a thumb on the scales to e.g. maintain a market for high-sulfur coal despite the results).
We would have seen action a lot sooner if "environmental" interest groups hadn't insisted on placing their own thumb on the scales, such as insisting on a "climate levy" against nuclear power plants (and more recently using the fraudulent Storm & Smith report to claim that mining uranium is GHG-intensive). This resistance imposed non-economic costs on low-GHG power, but it is in the process of evaporating (it should surprise no one that many anti-nuclear groups were financed by coal companies).
Other things are coming along. Wind power is far cheaper than oil, and islands like Aruba are going for it in a very big way despite the financial crisis. It's expanding rapidly in the US, with net generation up 20% in 2009 despite bad financials and apparent bad weather. The time horizon for long-distance transmission lines is much longer than for wind farms, but they're moving too. Using HVDC to squeeze more power over the same corridors is not going to be un-noticed, either.
The price of low-carbon energy gets more competitive all the time. Nuclear is expensive, but it appears to cost about as much as state-of-the-art coal plants; it's only old, fully amortized, scrubber-less polluting coal that's cheap. The cost of wind keeps falling; it's already very competitive with $8/mmBTU natural gas, and only today's unrealistic pricing of shale gas lets anyone think it isn't a winner. Solar is 20 years behind wind, but in 2030 it should be about where wind is today and that's only halfway to the 2050 date.
The impending retirement of those old, polluting, amortized coal plants is significant. Electricity is going to cost more, and with it, inefficiency.
That's the electric supply end. On the electric demand end, heat pumps can replace fossil-fired furnaces. EVs and plug-in hybrids can substitute electricity for liquid fuel, and dynamic charging can make more of that electricity GHG-free. Most existing buildings will have been re-roofed and remodeled at least once in the next 40 years, if they still exist; that's the time to update the insulation to modern standards. If we get some reasonable building codes soon, whatever's put up in the mean time will not waste energy like today's current stock. The outdoor hot tub and massive array of incandescent Christmas lights might become a Japanese soaker in the master bath (fed by a tank off the solar DHW heater) and gaudy-but-thrifty strings of LEDs; similar luxury for a lot less energy.
The inefficient use of electricity followed from historical low prices, but high efficiency is quite feasible with modern technologies. When prices create pressures to adopt them, they will become universal. Just as today's crappy building codes are The Way Things Are Done until suddenly they aren't, the same will be true of the best practices possible with what we've got. What's coming will be even better.
I know I'm hand-waving here, but there are existence proofs for everything I've claimed here. Texas got 19% of its electricity from wind one recent morning, and the total could have been a lot higher if e.g. AC Propulsion's V2G system had been available for stabilizing the grid. Building energy retrofits have done everything Tom claimed (I've seen multiple case studies from around the USA). Nuclear has been working quietly and cleanly for the last 40 years, and only seems to get better. The storage of mass quantities of energy as compressed air is in the pilot stages.
Yes, people will be wastrels if juice costs 5¢/kWh. They're far less likely to go the McMansion route if it's 15¢/kWh and gas is $1.20/therm; they'll install a ground-source heat pump instead of putting in an addition because, really, don't Americans have enough space already? But juice at 15¢/kWh is a lot cheaper than $4.00/gallon gasoline, so the Volts and Leafs and plug-in Priuses and Fusions will flourish as the SUV segment dies a second, final death. All of this will be accompanied by huge reductions in the amount of carbon emitted.
It's not a certain future, but it's quite feasible and a few policy initiatives can make it likely. More than that, I can't say.
Stuart, it won't be easy, but it may not be as hard as you imply. But then again, it may be harder.
First, growth probably makes the job easier, not harder. If the economy didn't grow at all, the required rate of reduction is still 4.3%. Your chart of vehicle improvements shows that level being reached only twice after WWII demob. So in one sense, 4.3% is in the same category as 7.2%: dauntingly difficult. But consider this: when the economy is growing, and people feel positive, they buy new stuff to replace the old. When it's not growing, they make do with the old. So a (relatively) high rate of GDP growth may speed the replacement of vehicles. The outcome depends on policies and incentives.
Second, as you point out, vehicle emissions are not the total. Coal and gas fired power stations emit a comparable amount. And these emissions are much easier to manage and eventually eliminate, simply because there are so few points of emission (relatively speaking). The 44-year median age of coal plants is also a good thing. It means that existing plants are mostly amortized. There would have to be a spate of power plant construction soon, whatever happens.
The picture is more mixed for industrial and residential emissions. Overall, it seems to me that given well-distributed income growth, and good emission-reduction policies and methods, the target looks - just - achievable. But (leaving aside the risks to incomes, some of which you have covered), we're setting ourselves up for failure due to our choice of method.
Currently only two methods are being considered for emissions reduction: cap-and-trade, and a carbon tax. Yes, the EPA has threatened to regulate, but regulation is well out of fashion in political circles, so I can't see EPA regulation lasting forty years. Cap-and-trade or a tax are the long-run choices, and it looks like cap-and-trade has been chosen. Either or both could fail halfway through the job, because they contain perverse incentives.
A carbon tax is a revenue stream for the government. It would be a strong-willed government that does not start using that revenue for its ordinary business. Once a government comes to depend on the revenue from a carbon tax, it has incentives to maximise that revenue, and to maintain its existence for as long as possible. The level of emissions corresponding to maximum revenue is unlikely to be what the science says we need. Similar reasoning applies to cap and trade, which involves the government auctioning emissions permits. The government has an incentive to maximise, or at least maintain, its revenue from the auctions. The US government has already succumbed to the incentive to use carbon revenue. The Wikipedia article on emissions trading has this: "The 2010 United States federal budget proposes to support clean energy development with a 10-year investment of US $15 billion per year, generated from the sale of greenhouse gas (GHG) emissions credits. Under the proposed cap-and-trade program, all GHG emissions credits would be auctioned off, generating an estimated $78.7 billion in additional revenue in FY 2012, steadily increasing to $83 billion by FY 2019." So the US govt. is already counting on $63 billion p.a. going into general revenues.
In a market system such as cap-and-trade, arbitrageurs, too, have incentives to keep the market large, and so incentives to influence the government to slow the sinking of the cap.
The "influencing" is generally carried out discreetly, though, so we won't see it in Wikipedia.
Even if there is initial success in reducing carbon emissions via market methods, the rate of emissions reduction will slow drastically in a decade or so, and it may well stop entirely, well above the target level, purely through the operation of incentives -- these among others.
Where is all the energy going to come from to support this concept, which really constitutes a far higher percentage reduction for already developed nations since it, I guess, presumes the developing world will be allowed to play catch-up? What about all the other resources required as well? Isn't it safe to say, with oil peaking at some point over the decade 2005-2015, that the world economy will absolutely contract too, no matter how much funny money is printed? As usual, more guys just mentally whacking off on supply-side solutions this and techno-fixes that, with no regard for reality. Simple stuff like: how much CO2 would be expended building the infrastructure to lift the developing world to some level of comfort, all 7 or 8 billion of them, if UN population projections are to be believed, and what additional burden would all that business place on the environment?
In re: guzzlers: GM Arlington workers toil overtime to maintain SUV supplies | News for Dallas, Texas | Dallas Morning News | Dallas Business News. Maybe if we see sustained prices this time - in 1979-1981 the crude price peaked and took a long time to come down from that height, unlike the late 2008 crash through the floor and attendant move to higher mileage standards.
In re: Texas wind, ERCOT had that embarrassing incident cutting supplies to interruptible customers due to wind coming up short. Maybe just an isolated incident, or part of the learning curve?
Link to 2007 American Housing Survey here: Census: U.S. Housing Stock Now Numbers 128 Million Units - Demographics, Single Family - Builder Magazine. Hmm, 78.3% are solo driving commuters, a bit higher than other refs I've seen. They conduct the survey every 2 years.
Chris Vernon
I don't understand where the +225% growth over the next 40 years comes from... to hang the post off "in a business-as-usual future that is similar to the recent past" seems strange, to say the least. I'd say an 80% reduction in emissions over that time frame is more likely than a further 225% growth. Remember the Soviet collapse scrubbed off over 30% of their CO2 emissions over just the five-year period from '91. I don't see how Americans can be richer, with bigger houses, more cars etc., 40 years from now. More likely in my opinion they (Europeans too) will be poorer, living in colder/hotter houses, and travelling significantly less. I see no reason to expect the future to be similar to the recent past.
Fmagyar
"So between now and 2050, in a business-as-usual future that is similar to the recent past, we would expect the economy to grow by 1.0340-1 = 225%. So the economy will be about three times as large as it currently is. Some of this will come from there being more people in the US, but more of it will come from the people being wealthier, which of course they generally like to express by having bigger houses, bigger and faster cars, and more advanced technology to fill them both with."ROFLMAO!! I guess the tooth fairy will make all that possible, right?Ever hear of limits to growth? Of course you have, you just refuse to accept it as reality. Go study the exponential function, it describes "Growth" Then wrap your mind around the consequences.Try This. Take a chessboard and put a grain of rice in the square at the upper left hand corner. Next put two grains in the square to its right. keep doubling until you reach the 64th square a the bottom right of the board. Now tell me how much rice you have.We are not going to continue growing the economy because it is physically impossible, we are at the beginning of a global economic contraction. We need a completely new paradigm... BAU is kaput!Good luck to all, cheers!
jewishfarmer
To me the most critical observation raised by your post, which mirrors some things I've written about in the past, is this: will the build-out required in order to accomplish this push us into a high-emissions scenario in and of itself? Maintaining the economy and building out on that scale is likely to be an enormously emissions-intensive project.
Chris: The reason I looked at those assumptions is because that's what Krugman, and the CBO summarizing the work of environmental economists, are claiming: they are saying that Waxman-Markey will have a negligible impact on economic growth, but nonetheless produce an 83% reduction in emissions by 2050, and I just don't see how that can possibly be the case. (So my post effectively has a reductio-ad-absurdum structure to the argument.)
EP: I wasn't trying to claim that the emissions reductions couldn't be achieved - I agree that with a sufficiently determined effort, they probably could be. Rather, I wanted to question the idea that they could be achieved without any impact on growth. When you say "they'll install a ground-source heat pump instead of putting in an addition because, really, don't Americans have enough space already", that's exactly the kind of choice I'm talking about. Replacing the heating system and having a bigger house are *alternatives* for our limited resources - the latter is growth, whereas the former isn't. While you and I might agree that (most) Americans have enough space, most of them don't agree, and would move to a bigger house if they could.
Krugman responds.
kjmclark
I would say to remember that growth in an economic sense is not the same as growth in a social sense. To have economic growth, you just need to have the value of goods and services increase. I don't think anyone even said that it has to be real, non-inflationary growth. A bigger, more expensive house is one form of growth. A smaller, more expensive house is also growth. Economically, it's the "more expensive" part that's growth.Further, there's nothing requiring that the GDP growth trickle down to the majority. It works just as well from an economic perspective for an increasingly wealthy 10% to have high-performance electric cars while the rest of us are riding bikes or taking electric transit powered by overhead catenaries. And there's nothing wrong with GDP growth increasing in some areas while others shrink. If people spend 5% less per year on cars and 10% more each year on electronics, GDP will continue to grow.
Stuart, you wanted details on our retrofit. We added 2 inches of additional rigid insulation outside the frame walls with the Dryvit stucco system (better than the apparent R-14 increase because no studs are penetrating the exterior insulation), plus high-end Anderson windows (triple-glazed, argon), added south glazing and window coverings, all done in 1990; we've been reaping the benefits ever since. EcoFutures in Boulder is a contractor specializing in zero-energy homes and they have good info on retrofit technologies on their website, like the following example (http://www.ecofuturesbuilding.com/2009/10/1247-scrub-oak/).
Tom: I beat you to that link :-). Thanks for the update on what you did.
I wondered if you arrived at that link independently. EcoFutures remodeled the house across the alley from me, which I am looking at right now, to an energy-efficient but not net-zero standard.
Tom: That's a pretty funny coincidence. I was just googling "zero energy retrofit" around 6:30am this morning and came across it.
sofistek
This is more like it. Common sense woven in with some quantitative analysis. Dave Cohen wrote something similar last year. It's long but worth a read. Yes, with the economy imagined to be over 3 times the current economy, just what do those economic modelers think that economy will consist of? People spending most of their time just talking to each other? No, it's going to consist of, roughly, 3 times the amount of resource consumption, including a similar rise in energy consumption. Economists assume an infinite world, with infinite human ability to solve problems.
Interesting article, Stuart, and comments. I myself was thinking along those lines for quite some time, and, well, did not come to a conclusion. Concerning the 'reductio ad absurdum' effort, I agree. What bothers me is that even an intelligent person like Krugman cannot think without 'growth'. The arguments for 'qualitative' growth replacing 'quantitative' growth are in my opinion mainly illusions. Of course it is at first a matter of definition, what we understand by 'growth'. After satisfying basic needs there is the wide-open area of well-being, where we 'invest' our efforts (metaphorically: energy). Just imagine an imaginary Buddhist economy, where after the basic needs are satisfied, everything would be 'invested' into the inner side: the education/perfection of the subject and its relations. Think Bhutan. But this is not how the western (capitalist) system works. Everything has to be monetized. Maybe we can all become artists and sell our works to 'consumers' who buy our works with huge amounts of money (Van Goghs, Picassos etc.) - a nearly zero-energy Nirvana above the basic needs. Actually a lot of stable 'primitive societies' seem to work that way. Krugman wrote an interesting piece about the economy in 2100, I think in 2000. Looking back, he argued similarly: the information work was mostly devalued in his retrospect, and celebrity status was the 'value'. Our current value system is inherently energy-based, i.e. material wealth, including vacationing in far-away places and so on. I am surprised that Krugman never elaborated on this fundamental issue. To make this not too long, I stop here and hopefully write a second comment tomorrow. (One evident contradiction in conventional economic thinking is in the 'discounting the future', which collapses if we change our value structure.)
Second part. My main point is that a long-term contracting economy is incompatible with conventional economics. Actually it is not much different from a 'reductio ad absurdum'. To stay alive, it has to fantasize phantom growth in western terms. I realized that after the Stern report 2006/7, which discounted the future at 1%, whereas American economists use something like 3%. Which means that the Americans are far more optimistic than even the British. Common to both is that the future is, on average, brighter than the past. So the cost of doing something NOW compared to doing the same thing some 10-100 years LATER would be discounted by the respective compounded amounts. This way of arguing is deeply embedded in conventional economic thinking. It relies on the assumption that the future is - GNP-wise - better than the past: i.e. GROWING. So we again land on the question of what 'growth' is. In conventional economics it is everything which ADDS to GNP. If we dematerialize, this concept has difficulty integrating that dematerialization in its measurement. But dematerialize we must! To rescue the growth paradigm in an economy which monetizes everything, we paradoxically have to shift our values! Is a robust iPad-economy thinkable, which ADDS to this GNP, on top of its physical implementation, which is mainly Chinese -- in the US? I doubt that. The idea behind that is that the Chinese are dumb manufacturers, the Americans the clever value-adders. And this forever. But: 'information' is cheap at last. The status quo is still the physical/material as a measure of wealth/GNP. AND THIS HAS TO GIVE! Even the steady-state economists (e.g. Daly) rarely mention this. The main battle will be on value systems. Well, it is already.
BTW, I am surprised that Peter Corning, in my opinion an eminent systems thinker, is never mentioned in this essential debate (here, theoildrum, climateprogress, and and and!) Why is that?
MisterMoose
Let's assume that we can make photovoltaic cells twice as efficient as they are now (with a few billion dollars and a few years of R & D this should be possible). So, instead of building 10,000 square miles of solar panels in the Nevada desert to replace our existing carbon-based electric generating system, we'll only have to build 5,000 square miles... How much do PV cells cost? How many trillions of dollars will be required to build the desert solar farms and the transmission lines to get that electricity to where it's needed? How much will it cost to build the huge banks of rechargeable batteries to store that electricity for when it is needed? Yeah, we could do this, but we'll go broke in the process.
We can cut our electricity usage by using more efficient motors, insulation, etc. But, if we start transitioning to electric vehicles, we'll need to generate even more electricity.
Whenever I read anything by Krugman, I wonder how this guy ever got a Nobel Prize. Was this all just political, like Al Gore and Barack Obama's prizes?
2017-09/1579/en_head.json.gz/22788 | NASA’S CURIOSITY ROVER BEAMS BACK PHOTOS OF POSSIBLE ASTRONAUT FOUND ON MARS
According to Scott C. Waring from ufosightingsdaily, this is not the only case in which an unusual shadow has slipped into an official picture. A few years ago, a man stumbled across a similar shadow of a presumed astronaut equipped with an oxygen tank who appears to be fixing the rover.
Can these two appearances be considered a case of pareidolia – a psychological phenomenon in which the mind perceives familiar patterns where none exist? Or rather a real fact that our minds refuse to acknowledge because of the social standards we’re all accustomed to?
The UFO community finds this catch to be clear evidence that something is wrong with this entire Martian mission, if it’s really Mars we’re talking about. Scott Waring who posted this amazing discovery on his blog has an interesting point of view on this:
“I was checking out the newest Curiosity rover photos on the NASA site and found four of them with the shadow of an astronaut in spacesuit that appears to be fixing the rover. This just goes to show the public that the rover is being maintained by humans on Mars, and that there are other spacecraft kept secret from the public that carry a person to Mars in just a few minutes.”
He added another remark that’s a bit unrealistic, but worth mentioning:
“I got a Taiwanese friend who works at NASA right now. He said a few years ago that he saw a blue/green wheel spacecraft that hovers like a turning wheel and can carry an astronaut at about 1/20 the speed of light. That’s fast, and would get you to the moon in about 30 seconds.”
Such a scenario is not that hard to imagine considering the technological advancement of recent years (not to mention the hidden toys of the military), but I think it’s highly unlikely because, even if we could achieve such speeds, the human body would be unable to withstand the atmospheric pressure.
Whatever the case, the shadowy figure found in close vicinity to Curiosity is sure to raise a lot of question marks. | 科技 |
2017-09/1579/en_head.json.gz/22824 | Wes Sechrest, Ph.D.
CHIEF SCIENTIST & CEO
Dr. Sechrest founded Global Wildlife Conservation and serves as Chief Scientist and CEO. He leads GWC’s efforts to explore, document, and protect the world’s most threatened species and habitats. Dr. Sechrest is a conservation biologist with a background in mammal conservation and priority setting for biodiversity conservation, and a particular focus on international wildlife conservation. His interests in conservation science vary widely, and include work on threatened species, zoology, protected areas, biodiversity patterns and processes, natural history, environmental science, and the link between academic and applied conservation science.
Dr. Sechrest’s research focuses on global conservation issues, as well as field-based biodiversity conservation. Dr. Sechrest is an expert on species conservation, and is extensively involved in advancing conservation efforts, particularly for mammals. He has published numerous scientific papers in leading journals including Science, Nature, BioScience, and the Proceedings of the National Academy of Sciences. His research has been used to set priorities for international conservation to advance protection of the planet’s biodiversity. Additionally, his work has examined the effectiveness of protected areas for species conservation, as well as how to conserve species in areas under high threat from humans and their activities. He has international work experience in more than 25 countries.
Currently, Dr. Sechrest is leading work to explore, research and protect the most biologically important areas on the planet. As GWC’s Chief Scientist and Chief Executive Officer, he is implementing field projects across the planet along with a team of renowned field biologists. He is working on how to identify and conserve threatened species, including how to systematically set conservation priorities. Additionally, he is leading GWC’s efforts to secure long-term preservation of areas of global biological importance. He leads GWC’s collaborative efforts to best conserve the world’s threatened wildlife in partnership with local, national, and international non-governmental and governmental organizations, both in the United States and abroad. Dr. Sechrest also serves on the Board of Directors of Bat Conservation International and on the Global Council of the Amphibian Survival Alliance. He is Adjunct Faculty in the Department of Wildlife and Fisheries Sciences at Texas A&M University.
Ph.D. Biology, University of Virginia.
B.S. Biology, Wake Forest University.
A Q&A with GWC Chief Scientist & CEO Wes Sechrest on new anti-poaching endowment
Google To Acquire YouTube for $1.65 Billion in Stock
Combination Will Create New Opportunities for Users and Content Owners Everywhere

MOUNTAIN VIEW, Calif., October 9, 2006 – Google Inc. (NASDAQ: GOOG) announced today that it has agreed to acquire YouTube, the consumer media company for people to watch and share original videos through a Web experience, for $1.65 billion in a stock-for-stock transaction. Following the acquisition, YouTube will operate independently to preserve its successful brand and passionate community.

The acquisition combines one of the largest and fastest growing online video entertainment communities with Google's expertise in organizing information and creating new models for advertising on the Internet. The combined companies will focus on providing a better, more comprehensive experience for users interested in uploading, watching and sharing videos, and will offer new opportunities for professional content owners to distribute their work to reach a vast new audience.

"The YouTube team has built an exciting and powerful media platform that complements Google's mission to organize the world's information and make it universally accessible and useful," said Eric Schmidt, Chief Executive Officer of Google. "Our companies share similar values; we both always put our users first and are committed to innovating to improve their experience. Together, we are natural partners to offer a compelling media entertainment service to users, content owners and advertisers."

"Our community has played a vital role in changing the way that people consume media, creating a new clip culture. By joining forces with Google, we can benefit from its global reach and technology leadership to deliver a more comprehensive entertainment experience for our users and to create new opportunities for our partners," said Chad Hurley, CEO and Co-Founder of YouTube. "I'm confident that with this partnership we'll have the flexibility and resources needed to pursue our goal of building the next-generation platform for serving media worldwide."

When the acquisition is complete, YouTube will retain its distinct brand identity, strengthening and complementing Google's own fast-growing video business. YouTube will continue to be based in San Bruno, CA, and all YouTube employees will remain with the company. With Google's technology, advertiser relationships and global reach, YouTube will continue to build on its success as one of the world's most popular services for video entertainment.

The number of Google shares to be issued in the transaction will be determined based on the 30-day average closing price two trading days prior to the completion of the acquisition. Both companies have approved the transaction, which is subject to customary closing conditions and is expected to close in the fourth quarter of 2006.

Webcast and Conference Call Information
The company will host a conference call and webcast at 1:30 p.m. Pacific Time (4:30 p.m. Eastern Time) today to discuss the acquisition. To access the conference call, please dial 800-289-0572 domestic and 913-981-5543 internationally. A replay of the call will be available until midnight Monday, October 16 at 888-203-1112 domestically and 719-457-0820 internationally. Confirmation code for the replay is 2260624. A live audio webcast of the conference call will be available at investor.google.com/webcast.html.

About Google Inc.
Google's innovative search technologies connect millions of people around the world with information every day. Founded in 1998 by Stanford Ph.D. students Larry Page and Sergey Brin, Google today is a top web property in all major global markets. Google's targeted advertising program provides businesses of all sizes with measurable results, while enhancing the overall web experience for users. Google is headquartered in Silicon Valley with offices throughout the Americas, Europe and Asia. For more information, visit www.google.com.

About YouTube
Founded in February 2005, YouTube is a consumer media company for people to watch and share original videos worldwide through a Web experience. YouTube allows people to easily upload and share video clips on www.YouTube.com and across the Internet through websites, blogs, and e-mail. YouTube currently delivers more than 100 million video views every day with 65,000 new videos uploaded daily, and it has quickly become the leading destination on the Internet for video entertainment.

Caution Concerning Forward-Looking Statements
This document includes certain forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995, including statements regarding Google's and YouTube's ability to improve their services, create new business models and content-owner opportunities, integration plans, the expected timing for the closing of the acquisition and the plans to operate YouTube independently. These statements are based on the current expectations or beliefs of management of Google Inc., and are subject to uncertainty and changes in circumstances. Actual results may vary materially from those expressed or implied by the statements herein due to (1) changes in economic, business, competitive, technological and/or regulatory factors, (2) failure to receive regulatory approval for the acquisition, (3) failure to retain the levels of traffic on the YouTube site, (4) failure to compete successfully in this highly competitive and rapidly changing marketplace, (5) failure to retain key employees, (6) other factors affecting the operation of the respective businesses of Google and YouTube, and (7) the failure of YouTube and Google to work together effectively. More detailed information about these factors may be found in filings by Google, as applicable, with the Securities and Exchange Commission, including their respective most recent Annual Report on Form 10-K and Quarterly Report on Form 10-Q. Google is under no obligation to, and expressly disclaims any such obligation to, update or alter their respective forward-looking statements, whether as a result of new information, future events, or otherwise.

Press Contacts:
Google Media: Jon Murchinson, 650.253.4437, jonm@google.com
Investors: Maria Shim, 650.253.7663, marias@google.com
YouTube Media: Julie Supan, 650.685.6401, press@youtube.com
Invader Vex's Laboratory
Invader Vex's high tech lab is his home and the birthplace of all his inventions. The lab is located in a space station orbiting Planet Jupiter. This station is mobile and can be moved anywhere in the galaxy. Several teleporters located in Vex's base on Earth and other planets can transport people and equipment back and forth. In the right corner of the lab is Vex's home. It includes a kitchen, living room, PAK charging room, and a small room for tinkering.
Underneath the first level of the base is an atom smasher and dark matter reactor to test some of Vex's more radical ideas. The laboratory, however, takes up only a third of the station. The rest is the bridge, mess halls, crew quarters, entertainment, a library stocked with every work Vex has ever come across, and many more features. The station, being massive, has an array of cloaking panels to hide it from radar and sight.
Over a decade of study, Vex has safely constructed a power core of the best possible quality for the station. This power core floats inside a large room at the back of the station, held in place by a substance with the chemical makeup of C8FX5Q2.
This substance, a carbon based molecule found in black holes, can be induced to not become the center of gravity, but rather an inverse epicenter. In this manner, the power exerted by the core, which contains the substance in its center, will be thrown out of the power core onto the walls of the room, making for maximum energy capture with minimum energy release.
This is ideal, because it stops overload as well as capturing literally all of the fusion based core's energy. In a nutshell, Vex has created a small, containable star in his station.
The station was constructed according to Vex's blueprints during his invasion of The Vortian System and was originally crewed by Vortians, but later by an entirely Irken crew.
JF Ptak Science Books
Roosevelt, Extermination and the Holocaust, 1942: the Only White House Meeting Between FDR and Jewish Leaders During WWII
JF Ptak Science Books LLC Post 240
I’ve reprinted this short, four-page pamphlet on the results of a meeting between President Franklin Roosevelt and a number of eminent leaders of American Jewish organizations that took place at the White House on 8 December 1942. (See the CONTINUED READING section below for the full document.) (These groups included American Jewish Committee, the American Jewish Congress, B’Nai B’Rith, the Jewish Labor Committee, the Synagogue Council of America, and the Union of Orthodox Rabbis of the United States.) The pamphlet itself came to me in a very large acquisition I made from the Library of Congress, and it had come to the Library from the Yiddish Scientific Institute (NYC) in May 1943. There are no indications of who printed the document. Communication with the still-extant Yiddish Scientific Institute has revealed that there are unpublished files concerning this meeting but that there are no published documents regarding it. This is evidently the only time that Roosevelt met with Jewish leaders to discuss the extermination of the Jewish people. The pamphlet is entitled Memorandum Submitted to the President of the United States at the White House, on Tuesday, December 8, 1942, at Noon. By a Delegation of Representatives of Jewish Organizations… The publication gets right to the point, instantly, in the first paragraph, leaving nothing to chance:
“Almost two million Jews of Nazi Europe have been exterminated through mass murder, planned starvation, deportation, slave labor and epidemic in disease-ridden ghettos, penal labor colonies and slave reservations created for their destruction…The five million Jews who may still be alive inside Nazi-occupied territory are threatened with total extermination under the terms of an order from Hitler calling for the complete annihilation of the Jews of Europe by December 31, 1942”
The document discusses slave labor and—in quite some detail—“Hitler’s extermination camps”, “caravans of death”, “extermination centres”, massacres”, “slow death”, and “religious persecution”.
This all reminds us of the pretty deep understanding of what was going on with the Jewish people, even in 1942.
The last sentences of the last paragraph:
“….Germany will go down in utter defeat. But the Jews of Europe, whom Hitler has marked out as the first to suffer utter extinction, have no assurance at present that a United Nations victory will come in time to save them from complete annihilation”.
I cannot find the entire text of this document online. I've reprinted it below in the CONTINUED READING section. The following LONG EXTRACT is very good and taken in toto from the Virtual Jewish Library (I'd include the link but this is no longer allowed on ebay).
Report On Meeting of Jewish Leaders with Roosevelt (December 8, 1942)
After the State Department confirmed reports that Hitler was planning to murder all the Jews in territories under German control, several American Jewish leaders including Rabbi Stephen Wise managed to arrange an audience with President Roosevelt. At this 29-minute meeting, the only one FDR had with Jewish leaders about the Holocaust, the President was presented with a document outlining the Nazi plan to annihilate European Jews. Adolph Held, the president of the American Jewish Labor Committee, wrote this report of the meeting, which indicates the president was acquainted with details of the atrocities being committed by the Nazis. The committee consisted of Rabbi Stephen S. Wise, of the Jewish Congress; Mr. Monsky, of Bnai Brith; Rabbi Rosenberg, of the Agudath, and Adolph Held, of the Jewish Labor Committee.The meeting with the President was arranged for Tuesday, December 8, 1942, at 12 o'clock. We were originally notified that the President would give us 15 minutes, but the conference lasted 29 minutes. The purpose of the conference was to present a prepared memorandum on the German atrocities in Poland consisting of an appeal to the President for immediate action against the German extermination of Jews, and also a 12 page memorandum citing the facts that have been gathered on this subject.We were taken into the President's office in the White House by General Watson, the President's personal military aide, exactly at 12 o'clock. The President was seated at his desk; in front of the desk were lined up five chairs for the delegation.The President sat behind the desk smoking a cigarette in a long cigarette-holder. The desk was full of all sorts of trinkets--ash trays, brass and porcelain figures, etc. There was not an empty spot on his desk. The figures were of all shapes and sizes.
As we filed in, the President greeted Rabbi Wise: "How have you been, Stephen? You are looking well. Glad to see you looking well." Rabbi Wise then introduced each of us separately. The President shook hands with each of us, repeated the name, and then asked: "How do you do, Mr. Monsky?," etc., following which he asked us to sit down.
When we were seated, the President opened the conversation by saying: "I am a sadist, a man of extreme sadistic tendencies. When I appointed Governor Lehman as head of the new Office of Relief and Rehabilitation, I had some very sadistic thoughts in my head. I know that Governor Lehman is a great administrator, and I wanted a great administrator for this post. I had another thought in my mind, however. I had hopes that, when God spares my life and the war is over, to be able to go to Germany, stand behind a curtain and have the sadistic satisfaction of seeing some "Junkers" on their knees, asking Lehman for bread. And, by God, I'll urge him to give it to them."
Rabbi Wise then said: "Mr. President, we have an orthodox Rabbi in our midst. It is customary for an orthodox rabbi to deliver a benediction upon the head of his country, when he comes in his presence. Will you, therefore, permit rabbi Rosenberg to say the prayer of benediction?"
"Certainly" the President answered.Rabbi Rosenberg rose and put on his scull-cap. We all rose. The President remained seated, and, as Rabbi Rosenberg commenced to recite the prayer in Hebrew, the President bowed his head."O, God Lord of Kings, blessed be Thy name that Thou bestowest a share of Thy glory upon the son of men.""Thank you very much"-- the President said.The President seemed to be moved, and so were we all.Rabbi Wise then read the declaration by the committee.
Rabbi Wise did not read the details but simply said: "Mr. President, we also beg to submit details and proofs of the horrible facts. We appeal to you, as head of our government, to do all in your power to bring this to the attention of the world and to do all in your power to make an effort to stop it."
The President replied: "The government of the United States is very well acquainted with most of the facts you are now bringing to our attention. Unfortunately we have received confirmation from many sources. Representatives of the United States government in Switzerland and other neutral countries have given up proof that confirm the horrors discussed by you. We cannot treat these matters in normal ways. We are dealing with an insane man-- Hitler, and the group that surrounds him represent an example of a national psychopathic case. We cannot act toward them by normal means. That is why the problem is very difficult. At the same time it is not in the best interest of the Allied cause to make it appear that the entire German people are murderers or are in agreement with what Hitler is doing. There must be in Germany elements, now thoroughly subdued, but who at the proper time will, I am sure, rise, and protest against the atrocities, against the whole Hitler system. It is too early to make pronouncements such as President Wilson made, may they even be very useful. As to your proposal, I shall certainly be glad to issue another statement, such as you request."The President turned toward the delegation for suggestions. All, except Rabbi Rosenberg, put in suggestions. Mine was about the possibility of getting some of the neutral representatives in Germany to intercede in behalf of the Jews. The President took notice of that but made no direct replies to the suggestions. The entire conversation on the part of the delegation lasted only a minute or two. As a matter of fact, of the 29 minutes spent with the President, he addressed the delegation for 23 minutes.
The President then plunged into a discussion of other matters. "We had a Jewish problem in North Africa" -- he said. "As you know, we issued orders to free all the Jews from concentration camps, and we have also advised our representatives in North Africa to abolish all the special laws against the Jews and to restore the Jews to their rights. On this occasion I would like to mention that it has been called to our attention that prior to the war, Jews and Frenchmen enjoyed greater rights than Moslems in some of the North African states. There are 17 million Moslems in North Africa, and there is no reason why anyone should enjoy greater rights than they. It is not our purpose to fight for greater rights for anyone at the expense of another group. We are for the freedom for all and equal rights for all. We consider the attack on the Jews in Germany, in Poland, as an attack upon our ideas of freedom and justice, and that is why we oppose it so vehemently." "Now you are interested in the Darlan matter. I can only illustrate this by a proverb, I recently heard from a Yugoslav priest--"When a river you reach and the devil you meet, with the devil do not quarrel until the bridge you cross."
Apparently, at the end of this quotation the President must have pushed some secret button, and his adjutant appeared in the room. His eyes and broad shoulders showed determination. We rose from our seats, and, as we stood up, the President said: "Gentlemen, you can prepare the statement. I am sure that you will put the words into it that express my thoughts. I leave it entirely to you. You may quote from my statement to the Mass -Meeting in Madison Square Garden some months ago, but please quote it exactly. We shall do all in our power to be of service to your people in this tragic moment."
The President then shook hands with each of us, and we filed out of the room.
5 Strange Things Banned by Governments
Caisey Robertson | filed under: Lists, weird | Image credit: ThinkStock
Any child can tell you that banning something is almost a surefire way of piquing an interest in it. Some governments, however, decided that banning these unusual things was an easy fix for their problems.
1. Greece: Video games
In 2002, Greece decided to ban any and all electronic computer games—everything from console games to online Solitaire and Minesweeper fell under this ban. The law was apparently intended to help crack down on Internet gambling; CNET reported that “the blanket ban was decided in February after the government admitted it was incapable of distinguishing innocuous video games from illegal gambling machines.” Soon after, though, a local Greek judge declared the law unconstitutional, and though the law still exists, it seems the Greek government hasn’t been doing much about it.
2. China: Time travel
Not the action of time travel itself, but rather the portrayal of it. In early 2011, the State Administration for Radio, Film and Television in China declared that time travel is all but prohibited from TV and movies. Apparently, time travel has been very popular in Chinese TV dramas, and the government discouraged them because they “casually make up myths, have monstrous and weird plots, use absurd tactics, and even promote feudalism, superstition, fatalism and reincarnation.” Naturally, the time-travel film Looper was extremely successful in China.
3. Russia: Being Emo
The emo trend began in the 1980s, characterized by its emotional music similar to punk and rock and the fashion styles that are a mix of punk and goth. To most people, it’s just a phase for teenagers to go through; in Russia, though, it’s a dangerous social group that should be stamped out. In 2008, a piece of legislation (“Government Strategy in the Sphere of Spiritual and Ethical Education”) began to restrict “dangerous teen trends” like the emo culture. The bill describes “emos” as teenagers with studded belts, painted fingernails, facial piercings, and black hair with face-concealing fringes. It also claims that the emo “negative ideology” encourages depression, social withdrawal and suicide—and it would be irresponsible to allow the trend to continue. The self-proclaimed emo culture took to the streets, marching in the UK and Siberia to defend their right to expressing their emotions. If Russia has its way, emo will be all but banned by 2020.
4. China: Reincarnation
If you don’t have permission from China’s government, you can’t seek reincarnation. While probably not a problem for most people, the Buddhist monks in Tibet are facing a complicated issue. The law, which is very specific on the procedures of permitted reincarnation and is stated to be “an important move to institutionalize management of reincarnation,” is an underhanded attempt to diminish the Dalai Lama’s influence and restrict the Buddhist establishment still in existence in Tibet. The current Dalai Lama is 77 years old, and he refuses to be reborn in Tibet as long as it is under Chinese control. In the future, there could be two Dalai Lamas—the one chosen by the Chinese government under their law, and the one chosen by the Buddhist monks.
5. Cuba: Cell phones
During Fidel Castro’s reign in Cuba, few citizens owned cell phones. Not because they were too expensive, but because they were banned—only executives working for foreign companies or high communist party officials were allowed to have them. Fidel Castro defended his ban by claiming the restrictions were “necessary sacrifices” in the “battle of ideas” against the U.S. When Raul Castro, younger brother of Fidel, took over control of Cuba in 2008, one of his early actions was to lift the ban on cell phones. While expensive, the freedom of owning a cell phone had many citizens to rushing to purchase their first.
Mystery Light Over Texas Resembles 1980 Cash-Landrum UFO… Or Does It?
June 9, 2016
In 1980, one of the most perplexing UFO incidents in American history occurred, on a winter’s evening by a narrow country road near Dayton, Texas.
The date was December 29, 1980, and Betty Cash was driving with her friend, Vicky Landrum, and Landrum’s seven-year-old grandson, Colby. The group had been on their way back home after dinner, driving along what is known locally as Farm to Market Road 1485. The crew had been entertaining the idea of a game of Bingo, but on this occasion all the usual spots were closed, since it was just prior to New Year.
The case is, of course, well known, although I do feel that some new details may have emerged over time (and for a great overview on this, I recommend researcher Curt Collins' resource guide on the case, which can be viewed here). In spite of the few general conclusions about the case that might be made, back in October I noted that, "despite all the seemingly substantive evidence this case provides, barely a clue has emerged as to what the origins of the object may have been, let alone the source of its apparent military escorts: a fleet of helicopters which, if they existed, have yet to be accounted for, given their numbers, and the timeframe during which the witnesses described seeing them."
I have my own suspicions about what Betty Cash and the Landrums may have seen, although we can’t be entirely certain, even with the passing of many years, and the hope for new information. And on the subject of “new information”, a new piece of video footage comes to us courtesy of the UK tabloids, as well as the YouTube account SecureTeam10, who has been getting a fair amount of negative attention lately after widespread viewing of certain footage the account has featured was called into question.
The Mirror in the UK reported on the incident that:
“This new clip was recorded by a man, identified only as Steve, who works as a director of aircraft maintenance for a corporate jet fleet in the USA.
He sent the footage to SecureTeam10, a group of UFO experts, who analyse the footage and share their findings online.”
While they refer to SecureTeam10 as a “group of UFO experts” here, it would seem that the account is mostly (if not entirely) operated by Tyler Glockner, who was quoted by the Mirror saying, “It’s amazing, and there is no doubt in my mind, that this UFO looks identical to the one recorded in the Cash-Landrum incident.”
What is truly amazing is that the comparison between the Texas “mystery object” and the Cash-Landrum UFO has been made at all. Below is a screen-grab of how the recent Texas object appeared, after a bit of digital editing had been applied to help emphasize the object’s alleged shape:
While the image depicts something vaguely diamond shaped, it would be a bit of a stretch to presume that this looks anything like what was purportedly witnessed at the time of the 1980 incident:
Sure, there appears to be a superficial resemblance. However, while much of the literature suggests today that the original 1980 object was “diamond shaped”, this may not have actually been the case at all.
During a conversation with Curt Collins about the incident, he explained that the rhomboid shape may have been based on a description that Colby Landrum gave, who we should recall, remained in the back seat of Betty Cash’s Oldsmobile Cutlass throughout the majority of the encounter. Collins further notes that the way Cash and Landrum described the object in their initial recollections had merely described it as “balloon like”. This description had, in part, informed my speculation that the object in question might have actually been something along the lines of a thermal infrared emitter source, which appeared suspended in the air due to it’s possible use as an inflation device for a military balloon. If this sounds like a stretch, one may need to look no further than U.S. patent US 3837281, filed in April 1969, which detailed a thermal infrared emitter source designed precisely for this sort of purpose: “as means for supplying hot gas for the quick inflation of a closed balloon.”
This may not conclusively explain all of the aspects of the case, and in truth, I try to abstain from asserting that, so many decades later, we can suddenly claim we’ve “solved” another UFO cold-case… after all, there are plenty of the more hardened UFO “debunker” types that engage in this sort of speculative “solving” of cases on a frequent basis. Nonetheless, with the questions that remain about the case, is it really fair to try and compare the Cash Landrum case, with all its complexities, to a video of an amorphous light seen from a plane as it passed over Texas?
Whether or not one thinks it’s “fair”, it’s certainly not helpful. No, the object filmed over Texas isn’t “eerily similar” to anything that was seen over Texas back in 1980, and no, making that comparison doesn’t help “prove” anything or reveal anything more to us about what Cash and the Landrums saw, nor what was recently filmed over Texas from the window of a plane. The only way to get to the bottom of UFO a case like the Cash-Landrum incident is to look for facts related to that case, and make our best assessment based upon these details.
So please, tenuous similarities to spurious sources should be left at the door!
Innovation Park looks toward expansion
Laura McCrystal | Monday, January 31, 2011
Rich Carlton, a local entrepreneur, attended the groundbreaking ceremony at Notre Dame’s Innovation Park in 2008 as a member of the Chamber of Commerce for St. Joseph County.
At the time, Carlton did not envision himself participating in the Park, but today he is the president and COO of Data Realty, one of 30 client companies at Innovation Park.
“I saw it as an opportunity to be involved in a true community partnership between the community and the University,” Carlton said.
Innovation Park, located just south of Notre Dame’s campus, opened in October 2009 and provides short-term space and advisory services to entrepreneurs while they start new companies. Its client companies either have previous ties to the University or are looking to make connections to resources at Notre Dame.
Dave Brenner, Innovation Park’s president and CEO, describes the Park as a bridge between the University and the marketplace.
“We act as a commercialization bridge between the University and the marketplace,” Brenner said. “As a bridge, it’s not a final destination. People go from one side to the other.”
At Innovation Park, startup companies rent space for up to a year and are able to connect with Notre Dame students, faculty and research.
Being at Innovation Park allowed Carlton to find student interns from Notre Dame. While he has been in business in South Bend for more than 15 years, he made connections with the University for the first time when Data Realty, his technology-based startup, came to Innovation Park in October.
“The interaction that I get here with the students is not only energizing, I’m just thoroughly impressed,” he said.
Other client companies come to Innovation Park because they already have connections to Notre Dame. Les Ivie, president of F cubed, is developing a molecular detection device based on technology that Hsueh-Chia Chang, a Notre Dame chemical and biomolecular engineering professor, invented.
After Ivie decided to start F cubed based on Chang’s technology of using carbon nanotubes to attract and detect certain types of DNA, he came to Innovation Park because he could be close to research at Notre Dame. While he is based in Chicago, Ivie travels to South Bend three days each week.
“It’s been good for us,” Ivie said. “Not only do we get support from Innovation Park, but we get a lot of support from the local community.”
Ivie said the common laboratory with special equipment at Innovation Park has been especially valuable in developing a molecular detection device that can quickly and easily test liquid samples. Applications of his device include doctors being able to diagnose patients with conditions such as influenza or strep throat in 15 minutes as opposed to several hours. In addition, it could test lake water and quickly determine whether it is safe to swim.
After F cubed is ready to move out of Innovation Park, Ivie said he plans to remain in South Bend.
“My three top scientists that work at Innovation Park moved to South Bend, one of them from Pittsburgh, one of them from Raleigh-Durham and the third one from Austin, Texas,” he said. “So our intention is to graduate and stay in the local area.”
Innovation Park is also “a good community of like-minded companies,” Ivie said.
Brenner said client companies at Innovation Park often communicate and collaborate with one another because they find things in common.
“That’s the secret sauce of what goes on in a park like this,” he said. “It’s very rare that a single individual has all the answers. They need to reach out to other people.”
While all 30 client companies at the Park, which include for-profit companies and non-profit social ventures, are startups, Brenner said he and his staff have learned to be flexible in advising each company.
“Without any definitive data to base it, we really had assumed we would get companies in different stages of development but that there would be a number of them that would have common needs so it would be easy to shape programs,” he said. “What we’ve learned is that we have 30 companies in different stages of development with different challenges.”
Feedback from client companies has been very positive, Brenner said, and in the future Innovation Park could look to expand, although it is not yet at full occupancy.
“We do have space here on this site for four buildings,” he said. “It’s an open, active issue sand we’re very pleased we have the space.”
Latest LHC experiments show early universe behaved like a liquid
Tannith Cattermole | November 29th, 2010
[Image: Simulated lead-lead collisions in ALICE]
Physicists from the ALICE detector team have been colliding lead nuclei together at CERN's Large Hadron Collider (LHC) in an attempt to recreate the conditions in the first few microseconds after the Big Bang. Early results show that the quark-gluon plasma created at these energies does not form a gas as predicted, but instead suggest that the very early universe behaved like a hot liquid.
The Large Hadron Collider enables physicists to smash together sub-atomic particles at incredibly high energies, providing new insights into the conditions present at the beginning of the universe.
ALICE (an acronym for A Large Ion Collider Experiment) researchers have been colliding lead nuclei to generate incredibly dense sub-atomic fireballs – mini Big Bangs at temperatures of over ten million degrees.
Previous research at lower energies had suggested that the hot fireballs produced in nuclei collisions behaved like a liquid, yet many still expected the quark-gluon plasma to behave like a gas at these much higher energies.
Additionally, more sub-atomic particles are produced in the collision than some theoretical models suggested.
"Although it is very early days we are already learning more about the early Universe," said Dr David Evans, from the University of Birmingham's School of Physics and Astronomy, and UK lead investigator at the ALICE experiment. "These first results would seem to suggest that the Universe would have behaved like a super-hot liquid immediately after the Big Bang."
The ALICE experiment aims to study the properties of the state of matter called a quark-gluon plasma. The ALICE Collaboration comprises around 1,000 physicists and engineers from around 100 institutes in 30 countries. During collisions of lead nuclei, ALICE will record data to disk at a rate of 1.2 gigabytes (GB) per second, equivalent to two CDs every second, and will write over two petabytes (two million GB) of data to disk. This is equivalent to more than three million CDs, or a stack of CDs without boxes several miles high!
To process this data, ALICE will need 50,000 top-of-the-range PCs, from all over the world, running 24 hours a day.
The research is funded by the Science and Technology Facilities Council (STFC).
All images courtesy CERN. Via University of Birmingham.
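Those CD comparisons are easy to sanity-check. The sketch below is mine, not the article's or CERN's, and it assumes a roughly 700 MB disc about 1.2 mm thick; neither figure comes from the article itself.

```python
# Back-of-the-envelope check of the article's CD comparisons.
# Assumed (not from the article): a CD holds ~700 MB and is ~1.2 mm thick.
CD_BYTES = 700e6          # ~700 MB per disc
CD_THICKNESS_M = 1.2e-3   # ~1.2 mm per disc, no jewel case

rate = 1.2e9              # 1.2 GB written to disk per second
total = 2e15              # two petabytes over the heavy-ion run

cds_per_second = rate / CD_BYTES                    # ~1.7, rounded up to "two CDs every second"
total_cds = total / CD_BYTES                        # ~2.9 million discs
stack_miles = total_cds * CD_THICKNESS_M / 1609.0   # ~2.1 miles of stacked discs

print(f"{cds_per_second:.1f} CDs/s, {total_cds/1e6:.1f} million CDs, {stack_miles:.1f} mile stack")
```

On those assumptions the article's figures land in the right ballpark: "two CDs a second" is a generous rounding of 1.7, the disc count comes out just under three million (older 650 MB discs would push it over), and the box-free stack is a little over two miles tall.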
I came across an utterly fascinating case study on Twitter the other day (via Mo Costandi; see this video too):
Rare brain condition leaves woman seeing world upside down
Bojana Danilovic has what you might call a unique worldview. Due to a rare condition, she sees everything upside down, all the time.
The 28-year-old Serbian council employee uses an upside down monitor at work and relaxes at home in front of an upside down television stacked on top of the normal one that the rest of her family watches.
"It may look incredible to other people but to me it's completely normal," Danilovic told local newspaper Blic.
"I was born that way. It's just the way I see the world."
Experts from Harvard University and the Massachusetts Institute of Technology have been consulted after local doctors were flummoxed by the extremely unusual condition.
They say she is suffering from a neurological syndrome called "spatial orientation phenomenon," Blic reports.
"They say my eyes see the images the right way up but my brain changes them," Danilovic said. "But they don't really seem to know exactly how it happens, just that it does and where it happens in my brain.
"They told me they've seen the case histories of some people who write the way I see, but never someone quite like me."
A taxonomy of information
Over the past several months I've been thinking about how perception falls within a hierarchy of types of information use. This was spurred by my ideas about an ecological approach to language, in which perceptual information and linguistic information are distinguished on the basis of the relationship between event structure and meaning. As part of this work, I defined perception as the apprehension of structure in an energy array where 1) the structure is specific to an event or property in the world, 2) where the meaning of the structure (for that organism in that task) is about that event or property (i.e., a dog's bark is about the event of a barking dog), and 3) where the meaning of the structure must be learned (or, more correctly, where an organism must learn how to coordinate action with respect to this structure). I arrived at this definition because it seemed to capture the ecological approach to perception and because it makes it obvious how perceptual information and linguistic information differ (also because I am crazy-obsessive about definitions). Read more »
Sabrina Golonka
by Jeff Foust — February 23, 2016
Former NASA astronaut Ron Garan, seen here looking out the window at the International Space Station in 2011, will provide World View customers with their own views of the Earth piloting the company's high-altitude balloons. Credit: NASA WASHINGTON — Former NASA astronaut Ron Garan has joined high-altitude balloon company World View as its chief pilot, the company announced Feb. 23, making him the latest astronaut to seek a post-agency career in the commercial spaceflight field.
Garan, a former U.S. Air Force test pilot who spent nearly six months in space on two missions, will oversee flight operations for World View. The Tucson, Arizona-based company is developing balloons to take payloads, and eventually people, to altitudes of 30 kilometers or more, giving them at least some of the experience of a full-fledged space flight.
In an interview, Garan said he joined World View in large part because both he and the company have a goal of sharing the view from space with the public. “They are really aligned with the reason I left NASA in the first place, sharing this perspective of our planet,” he said. Garan, who left NASA in 2013, later wrote a book on what he calls the “orbital perspective,” which he describes as a change in perceptions about the Earth created by seeing in from space, including a willingness to embrace global collaboration to deal with various problems. Garan is also working on a movie about the orbital perspective.
“I did all of that to kind of figuratively take people to the edge of space,” Garan said of his book and film projects. “Now, because of World View, I can literally take people there.”
In January, officials in Pima County, Arizona, which includes the city of Tucson, approved plans to construct a $15 million headquarters building for World View near the city’s airport. That facility, which will be completed late this year, will also including a pad that World View will use for launching its balloons.
World View currently flies smaller balloons to carry experiments into the stratosphere, but is working on a vehicle that can carry six passengers and two crew members to an altitude of about 30 kilometers for flights lasting a few hours. Those flights could begin as soon as the end of 2017, company officials recently said.
Garan said those flights, while lacking the altitude and duration of an orbital mission, should still give people an opportunity to experience the orbital perspective he felt during half a year in space. “They will see the sky turn from blue to black, and they’ll see the curvature of the Earth,” he said. “They’re going to have time to process the experience.”
Garan said his “all-encompassing” job includes developing plans and procedures for both uncrewed and crew flights of World View’s balloons, including flight manuals, checklists and other systems needed for safe operations. He will also pilot World View missions once the company starts taking people on stratospheric flights, although he said ultimately the company will have a “regular cadre of pilots” to handle them.
While Garan has experience with spacecraft and with F-16 fighter jets he flew prior to becoming an astronaut, he acknowledged ballooning is something new for him. He said he’s currently in a parachute training program to gain experience with parafoils, which World View’s capsule will use to descend back to Earth at the end of its balloon flight.
Garan is the latest former astronaut to join one of several companies working on commercial suborbital and orbital vehicles. While aerospace companies frequently hired astronauts after their NASA careers in the past, Garan and others are taking jobs that give them a chance of flying once again in space, or at least to the edge of space.
“The entire commercial spaceflight industry is poised to take off,” Garan said. “They’re all exciting, but this seemed like the best fit for me because of the philosophy of the company.”
RIM Blackberry trying to save itself with streaming music service after HP, Palm, and Motorola Demise
Patrick Nosker August 19, 2011
RIM, the maker of the Blackberry family of devices, is the latest company pushing beyond the smartphone business, its newest step being a subscription-based music service. While still manufacturing a wide variety of smartphones (three were just announced), the company is looking elsewhere to secure its place in the brutally competitive smartphone market. With the recent shake-ups at Motorola, HP (and with it Palm), and other telecom businesses, RIM must set itself apart to remain competitive and profitable on the heels of Apple's new iPhone 5 release and the continuous development of the Android platform by Google.
If RIM continues its trend of making shoddy tablets and phones that are not fully internet-capable, they will surely be the next to go. A complete revamp of their offerings needs to be made in order to remain firm in the market. With literally hundreds of various Android devices available, the entire smartphone market is becoming squeezed. Even Apple is finding it hard to be competitive now and will likely offer a cheaper variant from their flagship iPhone 5 soon.
So what is RIM doing to try to preserve its market status? It's creating a streaming music service of course. Following Apple, Google, Pandora, Rhapsody, Spotify, and countless others, RIM has decided that streaming music is the best place to go. But instead of unlimited streams such as from Pandora and Spotify, RIM's service will be limited to 50 songs.
Not much more is known, but if this is really RIM's savior product, we don't think they will be around for much longer.
Dean Takahashi March 18, 2008 3:35 PM
The federal government said that the 700-megahertz wireless spectrum auction has come to a close. The government will announce the winners in a matter of days.
Fortune magazine’s Techland blog said that, after eight weeks, there were 261 rounds of bidding. The spectrum is becoming available in 2009 thanks to the conversion from analog TV to digital TV that will take place on February 18, 2009. It is the last major chunk of nationwide spectrum available for wireless services. The bids reached a total of $19.6 billion, higher than the $10 billion the Federal Communications Commission set as its goal. Bidders included Verizon, Google, AT&T, and Qualcomm.
The winners are to be announced after all five blocks of spectrum are sold. The D block, which was set aside for a national public safety network, failed to raise the minimum of $1.3 billion set by the FCC. .
Reed Hundt, the former chairman of the FCC, had organized an effort to bid on the spectrum through his company Frontline Wireless, which focused on the public safety and open network opportunities of the spectrum. The effort had the backing of venture capitalists John Doerr of Kleiner Perkins Caufield & Byers, Ram Shriram of Sherpalo Ventures, and others. But he said that he knew that his company was out of the running, noting that it was out of business in January. Despite the backers, Hundt said his company couldn’t raise enough money. Beyond that, he said didn’t know how everything would turn out. Hundt made the comments during his speech at the Spring Von.x 2008 conference in San Jose.
“I could speculate that Verizon got a sweetheart deal but I would be making that up,” he said. “I can’t say whether it was a failure without knowing who won. If no new entrant comes out of it with a lot of spectrum, that is noteworthy.” | 科技 |
Imagination, Hypothesis-Testing, and “Ah Ha!” Moments Play a Role in Both Art and Science, Artist Says
11 May 2010 | Molly McElroy
Dan Pearce—an engineer, architect, general contractor, and artist—is enjoying a circuitous career. He spent nearly a decade as an aerospace engineer at NASA in Houston and then he moved with his wife as she pursued an academic career at the University of Virginia.
Not finding many aerospace engineering opportunities in his new hometown of Charlottesville, Pearce became involved in other projects, including building a solar-powered house for his family and creating unusual sculptures.
Pearce will show his bronze and wood sculptures in a new exhibit at the AAAS Gallery starting Thursday 13 May, with a public reception from 5 to 7 p.m. The gallery is on the first floor of the AAAS building at 12th and H Streets, N.W., in Washington, D.C.
The exhibit, “ComplexUs: The Nexus of Science and Art,” will run through 10 September. Regular hours for the gallery are Monday through Friday from 9 a.m. to 5 p.m.
Dan Pearce’s sculpture using a 3-d printer. View a larger version of this image.| Photo courtesy of and © Dan Pearce
“ComplexUs” is the name of the group of artists and scientists from Richmond, Virginia, including Pearce, who are displaying their pieces in the exhibit. In one of his works, “Walking Man continuous,” Pearce used a printer capable of making three-dimensional objects with successive layers of material. He created a 3.5-inch-high sculpture of what appears to be an animation of a man walking.
“The fact that you can pick up the object and see it as motion, I think expands your notion of what space-time is,” he said of his 3.5-inch sculpture.
“In this exhibit, scientists and engineers demonstrate how their lives and interests have taken them in different creative directions,” said Virginia Stern, director of the AAAS Art of Science and Technology program.
Stern created the program in 1985 as a way to showcase art about science, art by scientists, or art that employs a new or original technology or technique. “Careers in science and technology often don’t take an obvious or traditional path,” she said.
Pearce, in his artist’s statement, describes his work as an “attempt to traverse between the worlds of technology and art.” This description aligns well with the AAAS Art of Science and Technology program, Stern said, and it captures how the works in the new exhibit represent an intersection between science, technology and art. Shirley Koller, AAAS curator, arranged the exhibit.
Chemist Susan Van der Eb Greene will display her sculptures in the AAAS exhibit. “My mind is very three-dimensional—I love the three-dimensional aspects of molecules, that’s why I went into chemistry,” said Greene, who retired in 2001 from her job as a research chemist in Richmond. “I am fascinated by how the three dimensions of a molecule are involved in its function.”
Sue Greene’s sculpted table. View a larger version of this image. | Photo courtesy of and © Sue Greene.
Greene’s sculptures reflect topology, an area of mathematics that studies shapes. She carved a moebius band—like the infinity symbol—out of laminated wood, and she created a twisty manifold out of mahogany. One of her pieces is a glass-topped table with spiral legs. She used cherry veneers glued together to form the curving legs with “a logarithmic spiral of a constantly widening circle,” she said.
Richmond artist Sara Clark also finds inspiration in topology. “Topology is a part of mathematics which is really a visual language. The shapes are very pliable,” said Clark, who teaches drawing and painting at Virginia Commonwealth University (VCU) and at the Virginia Museum of Fine Arts Studio School, both in Richmond.
Sara Clark's painting of topological shapes. | Photo courtesy of and © Sara Clark
In her oil paintings, she uses shapes as characters and has developed some favorites: rippled or straight ribbons; knots and spheres; and the Boy’s surface, a three-dimensional geometrical shape derived by mathematician Werner Boy.
In her painting “Duet,” a yellowy-green Boy’s surface appears like a three-petaled, three-dimensional flower. Next to it, Clark painted a ruffled, rust-red ribbon and set both shapes against a backdrop with mustard yellow and deep turquoise.
“In the world of art and science, imagination plays such a major role in how things work,” Clark said. Art and science are two ways to look at the world, and their paths can cross, Clark said. In both fields, she explained, “you have a hypothesis, but sometimes you don’t know how things are going to turn out. There are ‘ah ha’ moments.”
And in between the invigorating ‘ah ha’ moments, scientists and artists alike can feel monotony. Artist Chuck Henry attests to this. He spent five years optimizing the brightness, color, and size of spheres using a computer program that simulates three-dimensional environments.
Chuck Henry's computer graphic print. | Photo courtesy of and © Chuck Henry
Henry will show his geometry-inspired, large-scale computer graphic prints at the AAAS exhibit. Peering at two of his stereo prints, measuring 54 inches wide and 42 inches high, gives the impression of a three-dimensional space. “My work gives me and hopefully the observer a glimpse into the world beyond the physical world,” he said.
He took five years to learn the software and experiment, which resulted in computer graphic prints that seem like colored bubbles or kaleidoscope images. “The first step is placing the 40-some spheres in space, and there’s an x-y-z coordinate for each one and a diameter for each one,” said Henry, a retired professor from Virginia Commonwealth University’s sculpture department.
“It’s very, very tedious. It’s just nerve-blasting,” he said. After situating the spheres, he used the computer program to experiment with 90 different light sources illuminating the spheres. “I discovered that you have to place these light sources at the points of contact between the spheres. Otherwise, you don’t see the spheres,” Henry said.
Tarynn M. Witten—a physicist—came up with the name of the group, which has been meeting for about four years at VCU. Witten is a professor in the Center for the Study of Biological Complexity at the university.
Tarynn M. Witten's sculpture of birds in flight. | Photo courtesy of and © Tarynn M. Witten.
“We all have a science background or training or interest. And we all have a unifying interest in art and science,” said Witten, who also will display pieces in the AAAS exhibit. “And it became very clear that we were each complex artists, and that our works have emerging and sometimes unpredictable properties,” she said.
For instance, one of her paintings is inspired by microarrays, a technique used to visualize the activity of genes and proteins. Witten described these molecular and cellular activities as “emergent phenomena,” in that they represent what the whole organism is doing.
In a sculpture, she takes another look at an emergent phenomenon: the flight of birds. In a series of origami paper foldings, birds emerge from flat pieces of paper and then disappear again. “You have this idea of emergence, evolution, adaptation and impermanence,” Witten said.
The AAAS Gallery is on the first floor of the association’s headquarters at 1200 New York Avenue, NW, in Washington, D.C., at the intersection of 12th and H streets. The gallery is open and free to the public from Monday through Friday, 9 a.m. to 5 p.m.
Source URL: https://www.aaas.org/pU7
By Alastair Sharp and Nicola Leske

NEW YORK (Reuters) - BlackBerry Ltd would consider exiting its handset business if it remains unprofitable, its chief executive officer said on Wednesday, as the technology company looks to expand its corporate reach with investments, acquisitions and partnerships.

"If I cannot make money on handsets, I will not be in the handset business," John Chen said in an interview, adding that the time frame for such a decision was short. He would not be more specific, but said it should be possible to make money off shipments of as few as 10 million a year. At its peak, BlackBerry shipped 52.3 million devices in fiscal 2011, while it recorded revenue on less than 2 million last quarter.

Chen, who took the helm of the struggling company in November, said BlackBerry was also looking to invest in or team up with other companies in regulated industries such as healthcare, and financial and legal services, all of which require highly secure communications.

The chief executive said small acquisitions to strengthen BlackBerry's network security offerings were also possible. "We are building an engineering team on the service side that is focused on security. We are building an engineering team on the device side that is focused on security. We will do some partnerships and we will probably, potentially do an M&A on security."

He said security had become more important to businesses and government since the revelations about U.S. surveillance made by former National Security Agency contractor Edward Snowden.

In a wide-ranging interview in New York, Chen acknowledged past management mistakes and said he had a long-term strategy to complement the short-term goals of staying afloat and stemming customer defections. "You have to live short term. Maybe the prior management had the luxury to bet the world would come to it. I don't have the luxury at all. I'm losing money and burning cash."

In March, the embattled smartphone maker reported a quarterly net loss of $423 million and a 64 percent drop in its revenues, underscoring the magnitude of the challenge Chen faces in turning around the company. Chen said BlackBerry remained on track to be cash-flow positive by the end of the current fiscal year, which runs to the end of February 2015, and to return to profit some time in the fiscal year after that.

Chen said his long-term plans for BlackBerry included competing in the burgeoning business of connecting all manner of devices, from kitchen appliances to automotive consoles to smartphones. Chen said he was not sure how long it would take for the "machine-to-machine" or "M2M" world to become a mainstream business, but he said he was sure that was coming. "We are not only interested in managing BlackBerry devices. We are interested in managing all devices that you would like to speak to each other," he said. "To achieve our dream of being a major player in M2M requires more partnerships with others," including telecom companies eager to participate.

FOCUS ON ENTERPRISE BUSINESS

Chen, viewed by tech industry insiders as a turnaround artist, wants BlackBerry to zero in on its core base of corporate and government clients, and on its services arm, which secures mobile devices on the internal networks of big clients. Chen was credited with turning around Sybase Inc in the late 1990s. Sybase, an enterprise software company, was eventually acquired by SAP AG in 2010.
Canada's BlackBerry, which has lost most of the smartphone market to Apple Inc's iPhone and gadgets powered by Google Inc's Android operating system, has laid off about 9,500 employees, or more than half its work force, over the last three years as it has rushed to cut costs in the face of mounting losses.

The company, a one-time pioneer in the smartphone arena, has seen its fortunes fade dramatically within a span of less than five years. BlackBerry, which boasted a global smartphone market share of roughly 20 percent back in 2009, has since seen that share shrink to less than 2 percent as of the end of 2013.

Chen, who joined BlackBerry after it was unable to find a buyer in a sale process last year, said he was not fazed by recent acquisitions of companies offering similar services, such as Facebook Inc's $19 billion purchase of mobile messaging service Whatsapp. "We are not going to go up against Whatsapp. We are going to be more focused on secure communications, secure messaging," he said of BlackBerry's BBM platform.

Chen said he was determined not to lose focus on the corporate and other customers that helped it build its global reach in the first place, and would not be tempted back into the much larger but more fickle consumer smartphone race. "We are not going to spend any more money to maintain the latest version of Angry Birds," Chen said, referring to a wildly popular consumer mobile videogame.

(Additional reporting by Nadia Damouni; Editing by Frank McGurty, Steve Orlofsky and Lisa Shumaker)
- July 2, 2013, 5:10 AM
With Xplore, Arinc Direct offers an Iridium-based Acars and text messaging system.
With the new Xplore system Arinc Direct is jumping into the market for small portable Iridium-powered onboard communications devices that use Apple’s iPad as the control/display unit for cockpit and cabin data services.
Xplore is a small box, two inches thick and no larger than an iPad, that users will carry onto the aircraft, so no installation of an avionics unit is required. Xplore needs to be attached to power and to an external dual Iridium/GPS antenna to enable communication with Iridium satellites.
The system is designed to provide, to both the flight deck and the cabin, Acars-type communications for the pilots and instant messaging/texting, email and one-way air-to-ground voice calling for passengers.
Inside Xplore are two boards, one stuffed with the telephony systems, which handle communications management, and the other featuring an Intel Atom processor, a file server and memory. The telephony board contains two separate Iridium “modems,” one to serve pilots and one for passengers. “The other board is a blank palette on which to develop additional applications,” said Arinc Direct senior director Bob Richard.
On the cockpit side, Xplore allows the iPad to act as a control/display unit (CDU) for Arinc Direct’s Acars services. Instead of making Acars requests via a flight management system CDU then viewing the results on the FMS’s limited screen, pilots see the results on the iPad’s large-screen colorful graphical interface. “Everything you can do with regular Acars you can do with this,” Richard explained, with the exception of uploading a flight plan to the FMS. However, the iPad running Arinc Direct’s app can be used for extensive flight planning on Arinc’s system. The iPad Xplore system can even log “out, off, on, in” (OOOI) events, but not triggered by a sensor on the aircraft such as a weight-on-wheels switch. Using Xplore, OOOI reports, such as the “off” report, happen when the GPS determines that the aircraft has accelerated to more than 60 knots, and the “on” report is triggered when slowing below that speed.
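Arinc has not published the logic behind this feature, but the rule as described reduces to a simple threshold detector on the GPS groundspeed stream. A minimal sketch, assuming speed samples in knots and using only the 60-knot figure from the article (the names and structure are invented for illustration, and it covers just the "off"/"on" pair described here):

```python
THRESHOLD_KTS = 60.0

def oooi_events(speed_samples):
    """Yield ('off' | 'on', sample index) events from a stream of
    GPS groundspeed readings in knots."""
    airborne = False
    for i, kts in enumerate(speed_samples):
        if not airborne and kts > THRESHOLD_KTS:
            airborne = True
            yield ("off", i)   # accelerating through 60 kt: takeoff roll
        elif airborne and kts < THRESHOLD_KTS:
            airborne = False
            yield ("on", i)    # decelerating through 60 kt: landing rollout

print(list(oooi_events([5, 30, 65, 140, 130, 70, 40, 10])))
# -> [('off', 2), ('on', 6)]
```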
All of the other Acars functionality that Arinc offers is available via Xplore on the iPad, including messaging, graphical and textual weather, Notams, flight planning and changes to flight plans, pre-departure clearances, oceanic clearances and so on. Using the iPad for this is much easier than the typical FMS interface, according to Richard. On the FMS, he added, “things aren’t necessarily laid out as logically as they could be. The information is not well formatted, and it’s a lot more intuitive on the iPad.”
Arinc Direct has also improved its iPad app to work better when using Xplore. Before, the app was leg-centric, meaning that it was easy to gather data associated with a particular leg of a flight plan, such as weather, Notams and so on. However, with the new version of the app, pilots can view either leg-centric data or range out anywhere in the world for Metars, Tafs, Notams and more, making it easier to answer a question about a potential diversion or destination change. This also enables pilots to download colorful graphical weather products onto the iPad from anywhere, which can then be overlaid on flight plans. Turning graphical weather functions on for an FMS can cost tens of thousands of dollars, according to Richard. “They may not want to turn it on and pay, and simply use this instead and get better weather overlays than they could over the traditional Acars system.” Arinc Direct also offers a full lightning database, something that isn’t available on an FMS weather display, he said.
Passenger Communications
While Xplore can be used to make voice calls from the ground to the air, Richard said, “95 percent of communications with aircraft are air-initiated.” To keep the product simple, Xplore is focused on air-to-ground communications. For voice, this means voice over IP (VOIP) phone calls. When passengers board an Xplore-equipped aircraft, they can use their smartphones to connect to Xplore (via VOIP), which then directs the call to the ground on an Iridium voice channel. Passengers can also use Xplore to send text messages, and they are assigned a phone number in the Xplore app, which they will use anytime they tap into Xplore. When a passenger sends a text message, it goes to Arinc’s server in Annapolis, Md., then it is delivered to the recipient, who sees that it came from the passenger’s “aircraft” number. If someone on the ground sends a text message to the passenger, and that customer happens not to be flying, Arinc Direct will deliver the message to the customer’s mobile phone. Arinc Direct will help Xplore customers set up master accounts containing all their instant messaging account information, and this will enable Arinc to determine whether that customer is flying or on the ground. That helps the company figure out where to direct incoming messages.
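The routing decision itself is straightforward to picture. Below is a hypothetical sketch of the ground-side logic; Arinc's implementation is not public, and every name here is invented:

```python
def route_inbound_text(account, message, deliver_to_aircraft, deliver_to_mobile):
    """Deliver a ground-originated text to the customer's assigned
    aircraft number while a flight is active, otherwise to the
    customer's own mobile phone."""
    if account["in_flight"]:
        deliver_to_aircraft(account["xplore_number"], message)
    else:
        deliver_to_mobile(account["mobile_number"], message)

account = {"in_flight": False, "xplore_number": "+1-555-0100",
           "mobile_number": "+1-555-0199"}
route_inbound_text(account, "Wheels up yet?",
                   deliver_to_aircraft=lambda n, m: print("air:", n, m),
                   deliver_to_mobile=lambda n, m: print("ground:", n, m))
# -> ground: +1-555-0199 Wheels up yet?
```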
Because the Iridium channels have limited bandwidth, Arinc Direct’s Xplore email is limited to simple text-based emailing. One supported protocol is BlackBerry email, from which Arinc can more easily strip out data-rich graphical data so only the text is sent. The other method is to give the customer a special email address (username@ongo.com). Any email sent from the ground will be stripped of graphical extras, turned into a text message and sent to the aircraft.
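The general approach of keeping only the text parts of a message is easy to demonstrate with Python's standard library. This is not Arinc's code, just a minimal sketch of stripping a MIME message down to its text/plain parts before sending it over a narrowband link:

```python
from email import message_from_string

def to_plain_text(raw_rfc822: str) -> str:
    """Keep only the text/plain parts of a MIME message, dropping
    HTML alternatives, images and attachments."""
    msg = message_from_string(raw_rfc822)
    parts = []
    for part in msg.walk():
        if part.get_content_type() == "text/plain":
            payload = part.get_payload(decode=True)
            if payload:
                charset = part.get_content_charset() or "utf-8"
                parts.append(payload.decode(charset, errors="replace"))
    return "\n".join(parts)
```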
In the cabin, the first Xplore app will be a moving-map display. Many more applications will be possible, and these could be developed by Arinc Direct or other app-makers and might include preflight briefings, Mecca pointers, points-of-interest displays on moving maps, day/night maps, news and weather. “It could be a whole bunch of stuff,” Richard said.
Arinc Direct is not planning to develop apps for the Android market because writing software for Android is more complicated and costly, due to the inconsistencies in the screen sizes, resolution and operating system. “Aviation has pretty much gone with the iPad. The only [negative] thing about the iPad,” he said, is that “people hate the Apple model because it’s so locked down. But from a development perspective, it allows for consistency. I know how to be compatible with an iPad. To develop apps in another platform, it’s 100-percent duplication. You can’t reuse anything you did for the Apple. Is it easier to spend money and accommodate two paths and give customers a choice or focus on making one of them extremely good? We’ve chosen the latter path. If customers want mobile Arinc Direct, they will have to get an iPad.”
Xplore will start shipping at this month's EAA AirVenture show in Oshkosh, where Arinc Direct is exhibiting for the first time. Suggested retail price for the Xplore box is $25,000, and installation of an Iridium/GPS antenna will add about $5,000 because that has to be done under a supplemental type certificate. For aircraft with existing Iridium systems, Xplore could plug into those, thus sparing the operator from paying for another Iridium account and a new antenna. Arinc Direct hasn't published official Xplore usage prices yet, but Richard expects that voice calls will cost approximately $1.50 per minute. Arinc Direct is considering a monthly package price at about $750, which would include Acars services such as flight planning, text messaging, access to flight coordinators, weather and so on.
“Xplore fills a niche and is getting Acars into smaller [aircraft] and eliminating that barrier to entry,” Richard concluded.
Source: http://www.ainonline.com/aviation-news/aviation-international-news/2013-07-02/arinc-directs-xplore-ipad-iridium-acars-system-available-month
Blaine Banks (not verified)
Gentlemen....we have a Hawker 850 and (2) Challenger 604's. We are very interested in your new system, the Xplore. Can you provide more complete details as to the installation in these aircraft? We are located in Indonesia and do our maintenance with Jet Aviation in Singapore. Would Jet be able to do the installation needed for Xplore use?
July 8, 2013 - 11:13pm
Please note, NO installation is required. You can carry one Xplore system and use it on one airplane then carry it onto another airplane. For more information, contact Arinc Direct at http://www.arinc.com/sectors/aviation/aircraft_operations/business_aviat... | 科技 |
Kepler mission data to go through Space Telescope Science Institute (STScI)

The institute's role is to convert the raw science data into files that can be analyzed by Kepler researchers and to store the files every 3 months in an archive.

Provided by STScI, Baltimore, Maryland
Published: Thursday, June 25, 2009

Kepler's targeted star field graphic. NASA

June 25, 2009

The Space Telescope Science Institute (STScI) in Baltimore is partnering on a historic search for Earth-size planets around other stars. STScI is the data archive center for NASA's Kepler mission, a spacecraft that is undertaking a survey for Earth-size planets in our region of the galaxy. The spacecraft sent its first raw science data to STScI June 19.

Visit Astronomy.com's Kepler mission page for ongoing coverage, including news and editors' blogs.

The institute was the logical choice for storing the anticipated flood of data because its scientists have processed enough observations from NASA's Hubble Space Telescope over the past 19 years to fill almost two collections of material in the U.S. Library of Congress.

"We are part of this mission because of our experience with Hubble data processing and archiving," said David Taylor, project manager for the development of Kepler's Data Management Center (DMC) at the institute. "NASA's Ames Research Center — the home of Kepler's science operations — had not done a science mission like this one. Building the data management center from scratch would have been more costly, and it would have taken longer to get up to speed."

Launched March 6 on a Delta II rocket from Cape Canaveral, Florida, the Kepler spacecraft will spend the next 3.5 years searching for habitable planets by staring nonstop at more than 100,000 Sun-like stars out of about 4.5 million cataloged stars in the spacecraft's field-of-view — in the summer constellations Cygnus and Lyra.

The spacecraft simultaneously measures the variations in brightness of the more than 100,000 stars every 30 minutes, searching for periodic dips in a star's brightness that happen when an orbiting planet crosses in front of it and partially blocks the light. These fluctuations are tiny compared with the brightness of the star. For an Earth-size planet transiting a solar-type star, the change in brightness is less than 1/100 of 1 percent. This event is similar to the dimming one might see if a flea were to crawl across a car's headlight viewed from several miles away.

When the mission is completed in several years, the survey should tell astronomers how common Earth-size planets are around stars.

Once a month, the Kepler spacecraft will send its science data, about 50 gigabytes, back to Kepler's Mission Operations Center at the Laboratory for Atmospheric and Space Physics at the University of Colorado. Raw science data will then be relayed to the institute's DMC. DMC Operations will convert the information into Flexible Image Transport System (FITS) files, a digital file format used to store, transmit, and manipulate scientific information.

The FITS files will be sent to the Kepler Scientific Operations Center (SOC) at Ames Research Center in Mountain View, California, where the science data analysis will be carried out.

Kepler mission scientists will turn the data into 30-minute snapshots of light from each of the 100,000 or more stars. From these snapshots, the scientists will construct a light curve for each star, which details any brightness fluctuations.
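The "less than 1/100 of 1 percent" figure follows directly from geometry: the fractional dimming during a transit is roughly the ratio of the planet's disk area to the star's. A quick back-of-the-envelope check with round published radii (the arithmetic below is mine, not from the article):

```python
R_EARTH_KM = 6_371.0    # mean Earth radius
R_SUN_KM = 696_000.0    # approximate solar radius

depth = (R_EARTH_KM / R_SUN_KM) ** 2
print(f"transit depth ~ {depth:.1e}, i.e. {depth * 100:.4f} percent")
# -> about 8.4e-05, or 0.0084 percent -- indeed under 0.01 percent
```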
They will review the light curves to look for any periodic decrease in brightness, an indication of a possible transiting planet. The mission scientists also will use the light curves to study the stars and their interiors. Because of the quality of the Kepler data and the large number of stars the spacecraft will observe, scientists hope to improve their understanding of stellar evolution.

"The mission's main purpose is to find planets that are the same distance from its solar-type star as Earth is from the Sun," said Daryl Swade, who directed the systems engineering development of Kepler's Data Management Center at the institute. "So that means that the planet would cross in front of its star every year. We would need three or four of these transits to confirm the detection, which will take about three or four years."

A planet at an earthlike distance from its star would be in the star's "habitable zone," where temperatures are just right for liquid oceans to exist on the surface without freezing over or evaporating away. On Earth, a liquid ocean was needed to nurture the chemical processes that led to the appearance of life. This is considered an important prerequisite for life as we know it to appear elsewhere in the galaxy.

Kepler's science data also will be archived at the institute. Every three months the SOC at Ames will ship FITS files in a 500-gigabyte computer hard drive to the institute for storage in the Multimission Archive. The archive houses data from about 14 missions, including Hubble, the Far Ultraviolet Spectroscopic Explorer (FUSE), and the Galaxy Evolution Explorer (GALEX).

Based on its strong track record in processing and archiving data, the institute could earn a role in many future missions.

"Partnering with other institutions to share the duties of a mission may be a trend for future missions," Taylor said.
President Yudhoyono invited to give keynote address at major conference on future of Indonesia's Forests and Climate Change
21 Sep 2011 MEDIA ADVISORY
Bogor (INDONESIA), 19 September 2011 – Hundreds of climate change and forestry experts will meet in Jakarta next week to discuss how Indonesia can preserve its rainforests while at the same time ensure the country's economic growth.
Indonesian President Susilo Bambang Yudhoyono has been invited to give the keynote address at the conference, which is entitled Forests Indonesia: Alternative futures to meet demands for food, fibre, fuel and REDD+.
He will be joined by Norwegian Minister of the Environment and International Development Erik Solheim, the World Bank’s Special Envoy on Climate Change Andrew Steer and U.K. Minister for State for Environment, Food and Rural Affairs Jim Paice. Several other leaders of Indonesia’s government, business community and civil society will also attend and participate in the discussions. In all, more than 1,000 people have registered, including 250 businessmen and women.
The conference will be held on Tuesday, September 27 at the Shangri-La Hotel. It is being organized by the Center for International Forestry Research, which has its international headquarters in Bogor.
Indonesia is home to the world’s third-largest tropical forest. Globally, deforestation accounts for up to 20 percent of greenhouse gas emissions. In Indonesia, however, that figure is more than 60 percent _ making the country one of the highest emitters in the world.
Indonesia’s government has a range of opportunities to reduce the pace of deforestation, while at the same time expanding agricultural production to guarantee food security targets and promote economic growth.
Indonesia’s forestry sector contributes US$8 billion in annual export earnings and directly employs 1.3 million people. However, at the same time it loses up to US$4 billion in large-scale illegal logging, with much of the timber being smuggled overseas. Indonesia also has more than 30 million hectares of so-called ‘degraded land’ that could be used for palm oil and timber plantations rather than having these expansions done on carbon-rich peatlands and rainforests.
In 2009, President Yudhoyono pledged to cut Indonesia’s greenhouse gas emission by 26 percent from business-as-usual levels by 2020, and by 41 percent with international assistance. Since then, Norway has committed US$1 billion to help Indonesia meet that target, and in May this year the government issued a two-year moratorium on new forestry concessions.
It is predicted that up to US$30 billion could flow from developed to developing countries each year to help facilitate significant reductions in deforestation, and Indonesia could potentially claim a significant share of these funds through REDD+, a global mechanism for reducing emissions from deforestation and forest degradation, as well as the conservation and sustainable management of forests, and the enhancement of forest carbon stocks.
Indonesia is one of the countries with the most REDD+ demonstration activities in various stages of development, and Indonesia has been an early participant in various bilateral and multilateral initiatives to prepare for REDD+ implementation at the national level.
Journalists are welcome to attend the Forests Indonesia Conference and are encouraged to register online by Tuesday September 20 at www.ForestsIndonesia.org, or contact Budhy Kristanty on email b.kristanty@cgiar.org or call +62-(0) 816637353.
Forests Indonesia Conference
9am, Tuesday, September 27, 2011
Shangri-La Hotel, Jakarta
Please note that due to the President’s possible attendance at the conference, extra security measures will be in place and access to the building will be closed prior to 9am. Journalists and other attendees are strongly encouraged to come early. A light breakfast will be served from 7:30am.
The Center for International Forestry Research (CIFOR) advances human wellbeing, environmental conservation and equity by conducting research to inform policies and practices that affect forests in developing countries. CIFOR helps ensure that decision-making that affects forests is based on solid science and principles of good governance, and reflects the perspectives of developing countries and forest-dependent people. CIFOR is one of 15 centres within the Consultative Group on International Agricultural Research.
www.cifor.org
www.blog.cifor.org
Georgetown paper makes the case that Verizon and AT&T should be allowed to bid in upcoming auctions
Advocates: FCC Should Limit Big Carriers in Spectrum Auctions
FCC Chairman Announces His Resignation
Why Some U.S. Homes and Businesses Still Don't Have Cellular Service
If the U.S. Federal Communications Commission limits the participation of the largest mobile carriers in upcoming spectrum auctions, it could cost the U.S. treasury billions of dollars, according to a study released Tuesday.

The U.S. Department of Justice's Antitrust Division and some digital rights groups have called on the FCC to ensure that small carriers can compete in spectrum auctions scheduled for 2014.

But a policy to restrict the ability of Verizon Wireless and AT&T to bid on the spectrum would drive down the bidding during the auction and leave less money for a nationwide public safety network and the U.S. treasury, said the new paper, from the business-friendly Georgetown Center for Business and Public Policy. The center has received funding in the past from both Verizon and AT&T, although the two large carriers did not commission this study, said center director John Mayo.

The upcoming spectrum auction would sell spectrum that is voluntarily turned over by U.S. television stations, and the FCC's spectrum rules "have the potential either to significantly boost or significantly hinder the ability of the auction to move spectrum to its most highly valued use," Mayo said. The auction could raise up to US$31 billion, according to the paper's authors. Using bidding results from past auctions, the authors estimated that completely barring Verizon and AT&T from the so-called incentive auctions could cost $12 billion.

"Those revenues matter," said Douglas Holtz-Eakin, a co-author of the Georgetown study. "That has implications for public policy." Even a partial restriction of bids by Verizon and AT&T could have a significant impact on auction revenues, he said.

The study considers scenarios that won't happen, countered Matt Wood, policy director at digital rights group Free Press. "No one is talking about completely barring AT&T and Verizon from the incentive auction," Wood said in an email. "Sensible people are talking about making sure that more than two companies have a chance at obtaining spectrum. The fact that these duopolists hired economists to parrot the companies' own talking points isn't really that newsworthy."

Restricting the bids of the two largest carriers could also mean a price hike for mobile service because it would mean that carriers who make less efficient use of the spectrum would control it, said Robert Shapiro, a co-author of the Georgetown study. Shapiro estimated that mobile service prices would rise by 9 percent if Verizon and AT&T were excluded from bidding. This, in turn, would lead to fewer U.S. residents adopting 4G service, costing the U.S. tens of thousands of jobs in the coming years, he said.

The calls to limit the participation of AT&T and Verizon are misplaced, he said. "There is no evidence of any lack of competition in this market," he said.

The study's authors don't believe Verizon and AT&T will be barred from the auction, but the study's "thought experiment" looking at that possibility shows the outer bounds of the economic impact of bidding limits, Shapiro said.

Grant Gross covers technology and telecom policy in the U.S. government for The IDG News Service. Follow Grant on Twitter at GrantGross. Grant's e-mail address is grant_gross@idg.com.
Site developed for tablet users
Tidings for tablets

The Daily Tidings has launched a site tailored specifically for tablet devices such as the iPad.

Tablet users who visit www.dailytidings.com will have the option to view the tablet-friendly site or view the full site. Users also can type in t.dailytidings.com to be directed to the tablet version automatically. The new version of the site includes all the local news, sports, columnists, entertainment and lifestyle coverage available in the Tidings every day, as well as photos and videos.

All the content is displayed in a way that is easy to scroll through on touchscreen tablets. It's the second such version of the site tailored for specific media devices. There's also a site for smartphones at m.dailytidings.com.
Cray Supercomputer Powers German Weather Service
by John Rath on January 16, 2013
The new Cray XC30 supercomputer. (Photo: Cray Inc.)
Cray announced it has been awarded a $23 million contract to provide two Cray XC30 supercomputers and two Cray Sonexion 1600 storage systems to Germany's National Meteorological Service — the Deutscher Wetterdienst (DWD).
The new systems will enable DWD to produce higher resolution and more accurate global and regional weather forecasts. The two Cray Sonexion 1600 storage systems that will be deployed at DWD will have a combined capacity of more than 3 petabytes of storage and 72 gigabytes per second of combined bandwidth.
“DWD is one of the world’s most prestigious numerical weather prediction centers, and we’re honored to provide them with the supercomputing technologies necessary for delivering such an extensive range of important services,” said Dr.Ulla Thiel, Cray vice president, Europe. “We are looking forward to building a strong collaboration and close partnership with DWD. This contract is yet another example of how we continue to expand our presence in the meteorological community in Europe and across the globe.”
Previously code-named “Cascade,” the XC30 was introduced last November and features the Aries system interconnect, a new Dragonfly network topology that frees applications from locality constraints, and a cooling system that utilizes a transverse airflow to lower customers’ total cost of ownership. Consisting of products and services, the multi-year, multi-phase contract is valued at more than $23 million, and the systems are expected to be delivered and put into production in 2013 and 2014.
“At our national meteorological service, we are responsible for providing services for the protection of life and property in the form of weather and climate information,” said Dr. Gerhard Adrian, President of DWD. “This is the core task of the DWD, and thus it is imperative that we equip our researchers and scientists with scalable, productive, and above all, highly reliable supercomputing systems. The Cray XC30 supercomputers will be valuable resources for us, and we are pleased to be working with Cray.”
John Rath is a veteran IT professional and regular contributor at Data Center Knowledge. He has served many roles in the data center, including support, system administration, web development and facility management.
Few tests done at toxic sites after superstorm
KEVIN BEGOS | Published: Dec. 23, 2012, 12:00 a.m.
OLD BRIDGE, N.J. — For more than a month, the U.S. Environmental Protection Agency has said that the recent superstorm didn't cause significant problems at any of the 247 Superfund toxic waste sites it's monitoring in New York and New Jersey.
But in many cases, no actual tests of soil or water are being conducted, just visual inspections.
The EPA conducted a handful of tests right after the storm, but couldn't provide details or locations of any recent testing when asked this week. New Jersey officials point out that federally designated Superfund sites are EPA's responsibility.
The 1980 Superfund law gave EPA the power to order cleanups of abandoned, spilled and illegally dumped hazardous wastes that threaten human health or the environment. The sites can involve long-term or short-term cleanups.
Jeff Tittel, executive director of the Sierra Club in New Jersey, says officials haven't done enough to ensure there is no contamination from Superfund sites. He's worried toxins could leach into groundwater and the ocean.
"It's really serious and I think the EPA and the state of New Jersey have not done due diligence to make sure these sites have not created problems," Tittel said.
The EPA said last month that none of the Superfund sites it monitors in New York or New Jersey sustained significant damage, but that it has done follow-up sampling at the Gowanus Canal site in Brooklyn, the Newtown Creek site on the border of Queens and Brooklyn, and the Raritan Bay Slag site, all of which flooded during the storm.
But this week EPA spokeswoman Stacy Kika didn't respond to questions about whether any soil or water tests have been done at the other 243 Superfund sites. The agency hasn't said exactly how many of the sites flooded.
"Currently, we do not believe that any sites were impacted in ways that would pose a threat to nearby communities," EPA said in a statement.
Politicians have been asking similar questions, too. On Nov. 29, N.J. Sen. Frank Lautenberg wrote to the EPA to ask for "an additional assessment" of Sandy's impact on Superfund sites in the state.
Elevated levels of lead, antimony, arsenic and copper have been found at the Raritan Bay Slag site, a Superfund site since 2009. Blast furnaces dumped lead at the site in the late 1960s and early 1970s, and lead slag was also used there to construct a seawall and jetty.
The EPA found lead levels as high as 142,000 parts per million at Raritan Bay in 2007. Natural soil levels for lead range from 50 to 400 parts per million.
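For scale, a parts-per-million figure converts directly to a mass percentage (the arithmetic below is mine, not the AP's):

$$142{,}000\ \mathrm{ppm} = \frac{142{,}000}{1{,}000{,}000} = 14.2\%\ \text{lead by mass},$$

which is roughly 355 times even the 400 ppm top of the natural range.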
The EPA took four samples from the site after Superstorm Sandy; two from a fenced-off beach area and two from a nearby public playground. One of the beach samples tested above the recreational limit for lead. In early November the EPA said it is taking additional samples "to get a more detailed picture of how the material might have shifted" and will "take appropriate steps to prevent public exposure" at the site, according to a bulletin posted on its website. But six weeks later, the agency couldn't provide more details of what has been found.
The Newtown Creek site, with pesticides, metals, PCBs and volatile organic compounds, and the Gowanus Canal site, heavily contaminated with PCBs, heavy metals, volatile organics and coal tar wastes, were added to the Superfund list in 2010.
Some say the lead at the Raritan Bay site can disperse easily.
Gabriel Filippelli, director of the Center for Urban Health at Indiana University-Purdue University Indianapolis, said lead tends to stay in the soil once it is deposited, but can be moved around by stormwaters or winds. Arsenic, which has been found in the surface water at the site, can leach into the water table, Filippelli said.

"My concern is twofold. One is, a storm like that surely moved some of that material physically to other places, I would think," Filippelli said. "If they don't cap that or seal it or clean it up, arsenic will continue to make its way slowly into groundwater and lead will be distributed around the neighborhood."
The lack of testing has left some residents with lingering worries.
The Raritan Bay Slag site sits on the beach overlooking a placid harbor with a view of Staten Island. On a recent foggy morning workers were hauling out debris, and some nearby residents wondered whether the superstorm increased or spread the amount of pollution at the site.
"I think it brought a lot of crud in from what's out there," said Elise Pelletier, whose small bungalow sits on a hill overlooking the Raritan Bay Slag site. "You don't know what came in from the water." Her street did not flood because it is up high, but she worries about a park below where people go fishing and walk their dogs. She would like to see more testing done.
Thomas Burke, an associate dean at the Johns Hopkins School of Public Health, says both federal and state officials generally have a good handle on the major Superfund sites, which often use caps and walls to contain pollution.
"They are designed to hold up," Burke said of such structures, but added that "you always have to be concerned that an unusual event can spread things around in the environment." Burke noted that the storm brought in a "tremendous amount" of water, raising the possibility that groundwater plumes could have changed.
"There really have to be evaluations" of communities near the Superfund sites, he said. "It's important to take a look."
Officials in both New York and New Jersey note they've also been monitoring less toxic sites known as brownfields, and haven't found major problems. The New York DEC said in a statement that brownfields in that state "were not significantly impacted" and that they don't plan further tests for storm impacts.
Larry Ragonese, a spokesman for the New Jersey Department of Environmental Protection, said the agency has done visual inspections of major brownfield sites and also alerted towns and cities to be on the lookout for problems. Ragonese said they just aren't getting calls voicing such concerns.
Back at the Raritan Bay slag site, some residents want more information. And they want the toxic soil, which has sat here for years, out.
Pat Churchill, who was walking her dog in the park along the water, said she's still worried.
"There are unanswered questions. You can't tell me this is all contained. It has to move around," Churchill said. | 科技 |
IBiquity Approves Surround Sound Technology for HD Radio
IBIQUITY DIGITAL APPROVES SRS LABS' CIRCLE SURROUND 5.1 AS COMPATIBLE SURROUND SOUND FORMAT FOR THE HD RADIO SYSTEM

SRS Labs' Circle Surround Technology Enables Radio Stations To Broadcast Digital Surround Radio To Millions Of Listeners Across The United States

SANTA ANA, Calif., June 29, 2004 – SRS Labs, Inc. (NASDAQ: SRSL), a leading provider of innovative audio, voice and semiconductor technology solutions, and iBiquity Digital Corporation, the sole developer and licensor of HD Radio technology, today announced the completion of the joint testing of SRS Circle Surround technology as a compatible surround sound format for iBiquity's HD Radio broadcast technology. HD Radio technology is the digital broadcasting system for terrestrial radio, and is the only digital technology authorized by the FCC for broadcasting in the AM and FM band in the United States.

Circle Surround technology allows FM radio stations nationwide to deliver digital quality surround sound content to millions of listeners through their home theater systems or car audio entertainment units. More than 300 radio stations in 100+ U.S. markets across 38 states, reaching 67 percent of the US population, have already licensed HD Radio technology. This announcement marks the completion of the joint testing process announced in January 2004 at the International Consumer Electronics Show (CES), where the two companies announced their strategic technology partnership.

iBiquity has developed and patented HD Radio technology that transmits digital audio and data alongside existing AM and FM analog signals, allowing listeners to enjoy CD-quality sound and virtually eliminating the static, hiss, and pops associated with today's analog radio signals. Implementation of the Circle Surround encoding solution for the HD Radio platform allows radio stations to encode any multichannel content into two-channel output for broadcast over the HD Radio system, which can then be decoded into full-bandwidth surround sound with any decoder found in millions of home theater and automotive systems. Up to a full 6.1-channel surround sound experience can be realized with a CS II decoder, which can be found in a wide variety of home theater products from Kenwood, Marantz, Accuphase and Theta Digital. For those listening to HD Radio receivers in their cars, Circle Surround Automotive technology also includes the latest CS decoder along with several specifically designed technologies that overcome the challenges of speaker placement and other obstacles specific to the automotive environment to deliver the best available surround sound in cars.

Mike Lyons, Vice President, Aftermarket Business Development for iBiquity Digital, said, "We are extremely pleased that SRS has proven Circle Surround sound compatibility with iBiquity's HD Radio system. We feel that SRS Labs' surround sound technology will accelerate the growth and adoption of the HD Radio system across the country and we encourage our broadcast and consumer electronics partners to realize how powerful Circle Surround can be for radio."

One of the most powerful features of CS encoding is that it is 100 percent compatible with mono and stereo. For radio stations, this means a competitive advantage in the ability to offer surround sound to their listeners.
“As Circle Surround reaches beyond home theater-based products to automobiles and other HD Radio-enabled receivers, broadcasting in surround sound will quickly become the standard,” said Mike Canevaro, director of sales and business development for SRS Labs. “We are working hard to bring the best available surround sound technologies to HD Radio digital broadcasting and their tens of millions of FM listeners. Broadcasters, automakers, retailers, manufacturers and consumers alike will benefit from the teamwork of Circle Surround and iBiquity, as radio transforms into a full-blown home theater-like experience whether it’s in the home or on the road.” SRS Circle Surround is an advanced and patented, encode/decode multichannel system that is rapidly gaining worldwide support among audio professionals and product manufacturers alike. The Circle Surround encoding system accepts any multichannel source, encodes it to a two-channel output, which can then be transmitted over any existing stereo analog or digital broadcasting infrastructure, such as the HD Radio system. The Circle Surround-encoded two-channel signal can be decoded back into thrilling 5.1 or 6.1 surround sound by any consumer with a multispeaker system that includes a matrix decoder, such as Circle Surround or Dolby Pro Logic. The best results are achieved from the Circle Surround decoder, which currently can be found in a wide variety of home theater products from Kenwood, Marantz, Motorola, Accuphase, Orion Studios and Theta Digital among others. SRS Labs is also working with several leading auto makers, automotive entertainment manufacturers and platform partners to provide a version of Circle Surround that is optimized for an automobile, Circle Surround Automotive. For information regarding SRS technologies, contact Mike Canevaro at (949) 442-1070 Ext. 3216 or mikec@srslabs.com. Additional information is also available from the company’s website at www.srslabs.com. About Circle Surround Encoding & Decoding The patented Circle Surround technology is a highly versatile, multichannel audio encode and decode system capable of supporting a wide range of surround sound creation and playback applications. CS hardware and software encoders can encode up to 6.1 channels of discrete audio for distribution over existing two-channel carriers such as broadcast television, cable, streaming media over the Internet, CDs and VHS tapes. Over the past year, SRS Labs has cultivated relationships with major broadcasters and content providers, such as ESPN Productions, Paramount and ABC Sports, to telecast their sporting events and television programs in Circle Surround. Circle Surround and Circle Surround II decoding is capable of delivering up to 6.1 channels of audio from mono, stereo, matrix-encoded or CS-encoded source material. CS II features patented post-processing techniques to enhance the surround sound experience, including SRS Dialog Clarity, which improves the intelligibility and clarity of center channel information such as dialog, and SRS TruBass, which uses patented psychoacoustic techniques to increase bass performance from any playback system, including a subwoofer. CS has been adopted by many of the leading electronics companies as a surround decoding option, including Kenwood, Marantz, Theta Digital, Accuphase, M-Audio, Jaton, and Smart Devices. It is the most powerful, flexible and feature-rich surround decoding format available today. About iBiquity Digital Corp. 
About iBiquity Digital Corp.

iBiquity Digital is the sole developer and licensor of HD Radio technology in the U.S., which will transform today's analog radio to digital, enabling radically upgraded sound and new wireless data services. The company's investors include 15 of the nation's top radio broadcasters, including ABC, Clear Channel and Viacom; leading financial institutions, such as J.P. Morgan Partners, Pequot Capital and J&W Seligman; and strategic partners Ford Motor Company, Harris, Texas Instruments and Visteon. iBiquity Digital is a privately held company with operations in Columbia, MD, Detroit, MI, Redwood City, CA and Warren, NJ. For more information please visit: http://www.ibiquity.com.

About SRS Labs Inc.

SRS Labs is a recognized leader in the advancement of audio and voice technology. The company works with the world's top manufacturers to provide a richer entertainment experience through patented sound techniques. SRS Labs' technologies can be heard through products ranging from televisions, flat panel displays, DVD players, mobile phones, car audio systems, headphones and notebook and desktop computers. The company also offers hardware and software tools to professionals and consumers for the creation, production and broadcast of content featuring SRS Labs' technologies. SRS Labs' wholly owned subsidiary, ValenceTech, is a Hong Kong-based semiconductor company that designs and sells custom ASICs and standard ICs to leading manufacturers worldwide. Based in Santa Ana, Calif., the company also has licensing representation in Hong Kong, Japan, Europe, and Korea. For more information about SRS Labs, Inc. please visit www.srslabs.com. The information on the aforementioned website is not incorporated by reference into this press release.
Debate over the 'holy grail'
As we noted this past week, finding the "holy grail" is just the first step. Learning how to use it, value it and de-mystify it is another step. A big one. Especially when the so-called "holy grail" is energy storage, which has a variety of applications, each valued differently, with impacts on electricity markets and with alternative technologies serving similar roles. And no well-defined place in the regulatory structure.

We offered two columns this week on a staff proposal at the California Public Utilities Commission on how to approach this thorny subject and we received some feedback, most of it thoughtful. We'll bring those comments to the fore today. Synopses are unwieldy things, so I'd refer you to the original articles, "California's Energy Storage Policies" and "Energy Storage and the Barriers to Adoption."

First, one reader returned to the fundamental premise of fossil fuels vs. renewable resources, declaring that endless quantities of "inexpensive" fossil fuels preclude the need for renewables. This view, of course, simply shrinks in the face of a challenge. It does not include consideration of the subsidies currently dedicated to fossil fuels, nor their massive environmental costs, which if included in the cost of "cheap" coal and natural gas, would fundamentally change their economics. Nor, of course, does this argument address the finite nature of fossil fuels or the increased costs as recovery becomes ever-more difficult. Further, this view doesn't make any allowance for U.S. competitiveness in the global race for clean energy, which clearly will dominate 21st century global competition. Sustainability doesn't appear anywhere in this world view. But this view has a large constituency that depends on ignoring these factors. And the simplicity of an argument made without taking any of these factors into account is very appealing. To wit:

"The technology for storage on a cheap, large-scale basis simply isn't there," our anonymous correspondent wrote. "If it was, utilities would implement it. There would be no need for a government mandate. The same can be said for using renewables. In most cases they aren't the cheapest or the most reliable solution, and bring their own problems.

"The hard truth staring everybody in the face is that today the cheapest and most efficient form of energy storage is fossil fuel," our correspondent continued. "Given that we keep finding more and more of them, especially in forms that are excellent for power generation, it's silly to force expensive alternatives on society. One day we are going to stress energy production to the point that we will find ourselves with a third-world power generation system, unable to provide power with the 99.99 percent reliability that a first-world economy needs."

Just mine, transport and burn more coal and pay no attention to the resulting devastation of communities, the emissions that kill thousands each year or the millions of gallons of untreated toxic waste from hydro fracking that wind up in our water sources, perhaps? And be sure to place responsibility for promoting sustainable, renewable energy—fighting, naturally, either for a level playing field where no one gets subsidies or for its share in a world riddled with subsidies—in the hands of "the government"—the universal bogie man—rather than in the hands of the people who use the government to reflect their values.
But I have to applaud the rhetorical use of the phrase "the hard truth" to bolster one's argument, as if adding that phrase somehow precludes any other point of view. I believe I'll tuck that one into my own rhetorical toolbox. On a factual basis, would anyone complaining about federal subsidies for renewable energy care to review the history of, say, the fossil fuel industry? I'd suggest that would yield its own trove of truths that would undercut those with short memories. Well, I certainly don't have any opinions on the matter.

Then we had comments from the thermal energy storage sector, which certainly deserves a fresh look as we've focused on batteries of late. "While all parties involved are posturing for position in California and storage happens at a snail's pace, proven technologies and regional independent system operators like PJM and ERCOT are moving forward with programs that reward storage customers for participation," wrote P. Valenta. "Many of those storage customers use proven, affordable thermal energy storage (TES).

"According to a March 2012 KEMA energy storage study for the copper industry, thermal energy storage has over 1 GW of installed capacity in the U.S. That is twice as much as pumped hydro and more than all other energy storage technologies combined. The prediction is that that pattern of growth will continue because of the new control technologies available to TES operators. TES is affordable today.

"While TES is a seasonal storage solution in most areas of the country and it may not be the total solution that California is looking for, TES provides the biggest benefit when the California grid has the most difficulty—summertime. A total solution will require all forms of storage to provide the grid reliability needed when moving away from dispatchable fossil fuels to intermittent renewable energy. While I understand the slow pace for a total solution, a separate program for distributed thermal energy storage could get the storage movement started with minimal investment."

Of course, the discussion suffers from disagreements over the facts in the case, as evidenced by another forum posting.

"Pumped storage in California alone amounts to more than 3 GW," wrote contributor Jack Ellis. "Nationwide I think the amount is well into the double digits of GW.

"I'd love to see more thermal storage, but unless it's cost-competitive with peaking plants, it's just too expensive. Even if its capital costs are comparable to a peaking plant, only the most efficient technologies are competitive on an operating cost basis. The bottom line, though, is that thermal storage would be readily adopted if it made economic sense. Since it doesn't make economic sense without sizable out-of-market payments, either it's still too expensive or market conditions don't make sense. As a long-time observer of the electricity industry in California, I suspect both cost and poor market conditions are responsible for the slow uptake of thermal storage."

My "takeaway"? The work begins with that elusive "single version of the truth" where facts are agreed upon. Of course, that's easier when it comes to, say, gravity, versus, say, global warming. The more complex and intangible the facts, the greater the debate.

But the more interesting discussion is around the viability of arguments that focus solely on snapshots of current practices and prices, in contrast to the direction that a given service territory or our nation as a whole ought to move in.
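Before moving on, Ellis's cost-competitiveness point is easy to make concrete. Below is a minimal back-of-envelope sketch of energy-arbitrage value versus carrying cost; every price, efficiency and cost figure in it is an assumption chosen for illustration, not a number from this column or from any filing.

```python
# Back-of-envelope storage arbitrage economics. All numbers are
# illustrative assumptions, not figures from the column.

def annual_arbitrage_value(mwh_capacity, round_trip_eff, off_peak_price,
                           on_peak_price, cycles_per_year):
    """Revenue from buying off-peak energy and reselling it on-peak."""
    buy_cost = mwh_capacity / round_trip_eff * off_peak_price  # extra energy covers losses
    sell_revenue = mwh_capacity * on_peak_price
    return (sell_revenue - buy_cost) * cycles_per_year

# Assumed: 100 MWh device, 75% round-trip efficiency, $30/$80 per MWh
# off-peak/on-peak prices, one full cycle per weekday (~250 per year).
value = annual_arbitrage_value(100, 0.75, 30.0, 80.0, 250)
print(f"Annual arbitrage value: ${value:,.0f}")

# Compare against an assumed carrying cost: $300 per kWh installed,
# amortized straight-line over 15 years.
capital = 100 * 1000 * 300  # 100 MWh in kWh, times $/kWh
print(f"Simple annual capital recovery: ${capital / 15:,.0f}")
```

With these assumed numbers, arbitrage alone recovers only about half the annualized capital, which is exactly the kind of gap that drives the argument over out-of-market payments.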
Add to that contrast between snapshots and direction the self-preservation of entrenched interests and the notion that we can simply remain static and endlessly repeat what we did today into the future, and you have a real debate.

No doubt, preparing for the future costs money. Typically, all new technologies are more expensive at the outset, and you don't achieve savings without scale. And because "the government" doesn't always execute well on the will of the people (distorted, as it is, by vast sums of money that fuel influence peddling), we have a real wrestling match on our hands.

In my view, you can look backwards and say things are good enough, or you can look forward and say things could be better. I believe readers know which side I'm on.

Not that that resolves any issues around energy storage in California.

Phil Carson
Editor-in-chief, Intelligent Utility Daily
pcarson@energycentral.com
303-228-4757
Will Daniel Radcliffe Star in Grand Theft Auto Developer Special?
There are reports surfacing that Daniel Radcliffe is in negotiations to play the lead role in an upcoming drama about Grand Theft Auto developer Rockstar. The new 90-minute drama is about the creation of the legendary and persistently controversial Grand Theft Auto, as well as other Rockstar titles such as Manhunt and Bully, although if I'm honest, I figured the show would actually feature the real-world developers, not the guy who played Harry Potter.
There’s no doubt that GTA is a hot topic and it’s had quite a bit of history in the game development community since the game first launched on the PlayStation 1. It’s been the focus of numerous controversies from public shootings, blamed for children committing violent acts, court battles with celebrities for using their likeness, using their songs out of licence and a whole host of things, so perhaps this could be an interesting insight into what things have been like behind the scenes.
Image courtesy of Jason Merritt/Getty Images.
An arcade cabinet that dynamically changes depending on the game you want to play
Posted: April 29, 2016 at 6:19 am
You may think the art of creating arcade cabinets is long dead, but you'd be, ironically, dead wrong.
The last time we covered such an innovation in that space, it was a Teenage Mutant Ninja Turtles cabinet with 3D printed joysticks (among other things). That beautiful machine was created by Paradox Arcade Systems, and they're back with a new creation: an arcade system that will change how it looks depending on the game you want to play.
It doesn’t employ some kind of shape shifting or cloaking, unfortunately, but it’s still really impressive. It uses the emulation software HyperSpin to play a variety of older titles on a gaming PC. All you need to do is select your game and watch the cabinet change.
Once you’ve chosen a game, RGB LEDs in the buttons change to highlight what you’ll need to mash to play. That’s cool and all, but the real draw is a 32″ LCD screen that sits at the top of the machine and displays the game being played. It’s a special display at the silly resolution of 16:3 that costs $600 (R8 647). See it in action:
That may seem like a lot for a single screen, but it's a relatively small amount when you consider that the full project cost more than $4 500 (R64 892) and took "a few hundred hours" to complete. That's a staggering investment when you consider you can play almost any classic arcade game, for free, in your browser. While you may object to the legal ambiguity of doing that, the software used here is the same.
Putting the price and the legality aside, we still really love this project. It’s great to go back to the classic pieces of electronics and see how they can be made better and personalised with newer tech, like a Game Boy / Raspberry Pi Zero that’s so much better than the original.
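For the curious, the per-game button lighting described above boils down to a lookup table and a handful of commands to an LED controller. The sketch below is a hypothetical illustration only; the serial protocol, port name, button names and colour choices are all invented, not details of Paradox's actual build.

```python
# Hypothetical sketch of per-game button lighting. The serial protocol,
# port name and game/colour mappings are invented for illustration; they
# are not details of the actual Paradox Arcade Systems cabinet.
import serial  # pyserial

# Which buttons a game uses, and the RGB colour to light them with.
GAME_LAYOUTS = {
    "pacman": {"joystick": (255, 255, 0)},
    "sf2":    {"joystick": (255, 255, 255),
               "p1_a": (255, 0, 0), "p1_b": (0, 255, 0), "p1_c": (0, 0, 255)},
}

def light_buttons(port, game):
    layout = GAME_LAYOUTS.get(game, {})
    with serial.Serial(port, 115200, timeout=1) as conn:
        conn.write(b"CLEAR\n")  # turn every button off first
        for button, (r, g, b) in layout.items():
            # One ASCII command per button, e.g. "SET p1_a 255 0 0"
            conn.write(f"SET {button} {r} {g} {b}\n".encode())

if __name__ == "__main__":
    light_buttons("/dev/ttyUSB0", "sf2")
```

A front-end like HyperSpin can typically be configured to run a small script such as this whenever a game is selected, which is all the "dynamic" behaviour really requires.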
Tekken 6 Screenshots
Santa Clara, CA - May 1, 2006 – Leading video games publisher and developer NAMCO BANDAI Games America Inc. announced today its plan to be a key industry force in the next generation market with the unveiling of its next generation line-up at the 2006 Electronic Entertainment Expo. NAMCO BANDAI Games America Inc.'s highly-anticipated content offerings for the next generation of consoles will include a variety of its signature franchises as well as introducing some new and exciting IPs into the HD era.

Triple-A franchises including Tekken and Ridge Racer will both be making their way to the PLAYSTATION 3 computer entertainment system. Ridge Racer 7 will bring racing excitement to a new level when it ships to PLAYSTATION 3, while TEKKEN 6 will pack a punch as an acclaimed fighting franchise.

Taking full advantage of the Xbox 360 video game and entertainment system from Microsoft, MOBILE OPS: The One Year War will take mechanized warfare to the next level as the player enters the intense battlefields of The One Year War as a soldier who fights by foot, in vehicles and in a wide array of mecha. Boasting online play and high definition graphics, Mobile Ops: The One Year War will bring the battlefield into the player's living room in the most immersive mech-action game to date. Also making its way to the Xbox 360 is the newest installment of the popular card battle series, Culdcept Saga. The popular puzzle series' latest installment is being developed by Omiya Soft and will feature exciting online battle via Xbox Live.

"As a recently integrated company, NAMCO BANDAI is excited to demonstrate our new publishing strength within the industry by unveiling our next generation line-up at E3 2006," said Genichi Ito, President and CEO of NAMCO BANDAI Games America Inc. "We are looking forward to supporting the next generation of consoles and being at the forefront of delivering quality content for all gamers to enjoy."

About NAMCO BANDAI Games America Inc.

NAMCO BANDAI Games America Inc. is an interactive entertainment software publisher and developer based in Santa Clara, CA. The company is a subsidiary of the newly formed NAMCO BANDAI Holdings (USA) Inc. based in Cypress, CA. On September 29, 2005, Bandai Co., Ltd. and NAMCO LIMITED implemented a management integration to form NAMCO BANDAI Holdings Inc. This step was taken to compete more effectively on the global stage in the fast-changing entertainment industry and to deliver further growth. The resulting BANDAI NAMCO Group is a global entertainment group involved in business fields ranging from toys, amusement facilities, video game software and visual software, to apparel, sundries and network content. As previous separate entities (Namco Hometek Inc. and Bandai Games Inc.) the companies were each known for creating some of the industry's top video game franchises including: Tamagotchi, Tekken, SOULCALIBUR, Dead to Rights, Pac-Man World, Digimon, Ridge Racer, Time Crisis and ACECOMBAT. For more information about NAMCO BANDAI Games America Inc. and its products log onto www.namcobandaigames.com.

Content on this page comes directly from press releases and fact sheets provided by publishers and developers and was not written by the Game Revolution staff.
Pentagon shows off life-size robot
AFP-JIJI, Apr 23, 2014
WASHINGTON – U.S. Defense Secretary Chuck Hagel got a firsthand look at a life-size robot Tuesday that resembles Hollywood’s Terminator, the latest experiment by the Pentagon’s high-tech researchers.
But unlike the cinematic version, the hulking Atlas robot is designed not as a warrior, but as a humanitarian machine that would rescue victims in the rubble of natural disasters, officials said.
The 187-cm Atlas is one of the entrants in a contest designed to produce a manlike lifesaving machine, the brainchild of the Defense Advanced Research Projects Agency (DARPA).
The competition, which will require the robots to navigate rough terrain and enter buildings, was created in the aftermath of the Great East Japan Earthquake and tsunami.
DARPA, the Pentagon’s research arm known for futuristic projects often evoking science fiction, showed off the Atlas robot to Hagel, but, except for LED lighting, the humanoid was apparently switched off in a static display.
Brad Tousley, head of DARPA’s Tactical Technology Office, told Hagel that Hollywood has created unrealistic expectations of what robots can do.
Building robots that can climb ladders, open doors and carry objects requires daunting feats of engineering and computer science, he said.
Scientists also showed Hagel the latest technology for prosthetics, including a mechanical hand that responds to brain impulses and a prosthetic arm controlled by foot movements.
A wounded veteran who once worked with Hagel in the 1980s demonstrated one of the devices, giving the Pentagon chief a thumbs up with his prosthetic left arm.
“It’s the first time in 45 years, since Vietnam, I’m able to use my left hand,” said Fred Downs, who lost his arm in a land mine explosion during the war.
He controlled the device using two accelerometers strapped to his feet, manipulating the elbow, wrist and fingers.
“This is transformational,” Hagel said. “We’ve never seen anything like this before.”
Dr. Justin Sanchez, a program manager at DARPA who works with prosthetics and brain-related technology, showed Hagel a video of a patient whose brain had been implanted with a sensor, allowing her to control a mechanical arm with her thoughts.
Scientists then displayed a shiny black mechanical hand and arm that responds to brain impulses, and said sensors would be attached to allow the fingers to send sensations back to the brain. The tactile feedback system should be operational within a few months, officials said.
NASA's Kepler Mission Discovers Tiny Planet System
PASADENA, Calif. -- NASA's Kepler mission scientists have discovered a new planetary system that is home to the smallest planet yet found around a star similar to our sun.
The planets are located in a system called Kepler-37, about 210 light-years from Earth in the constellation Lyra. The smallest planet, Kepler-37b, is slightly larger than our moon, measuring about one-third the size of Earth. It is smaller than Mercury, which made its detection a challenge. The moon-size planet and its two companion planets were found by scientists with NASA's Kepler mission, which is designed to find Earth-sized planets in or near the "habitable zone," the region in a planetary system where liquid water might exist on the surface of an orbiting planet. However, while the star in Kepler-37 may be similar to our sun, the system appears quite unlike the solar system in which we live. Astronomers think Kepler-37b does not have an atmosphere and cannot support life as we know it. The tiny planet almost certainly is rocky in composition. Kepler-37c, the closer neighboring planet, is slightly smaller than Venus, measuring almost three-quarters the size of Earth. Kepler-37d, the farther planet, is twice the size of Earth.
The first exoplanets found to orbit a normal star were giants. As technologies have advanced, smaller and smaller planets have been found, and Kepler has shown that even Earth-size exoplanets are common. "Even Kepler can only detect such a tiny world around the brightest stars it observes," said Jack Lissauer, a planetary scientist at NASA's Ames Research Center in Moffett Field, Calif. "The fact we've discovered tiny Kepler-37b suggests such little planets are common, and more planetary wonders await as we continue to gather and analyze additional data." Kepler-37's host star belongs to the same class as our sun, although it is slightly cooler and smaller. All three planets orbit the star at less than the distance Mercury is to the sun, suggesting they are very hot, inhospitable worlds. Kepler-37b orbits every 13 days at less than one-third Mercury's distance from the sun. The estimated surface temperature of this smoldering planet, at more than 800 degrees Fahrenheit (700 degrees Kelvin), would be hot enough to melt the zinc in a penny. Kepler-37c and Kepler-37d, orbit every 21 days and 40 days, respectively. "We uncovered a planet smaller than any in our solar system orbiting one of the few stars that is both bright and quiet, where signal detection was possible," said Thomas Barclay, Kepler scientist at the Bay Area Environmental Research Institute in Sonoma, Calif., and lead author of the new study published in the journal Nature. "This discovery shows close-in planets can be smaller, as well as much larger, than planets orbiting our sun."
The research team used data from NASA's Kepler space telescope, which simultaneously and continuously measures the brightness of more than 150,000 stars every 30 minutes. When a planet candidate transits, or passes, in front of the star from the spacecraft's vantage point, a percentage of light from the star is blocked. This causes a dip in the brightness of the starlight that reveals the transiting planet's size relative to its star.
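To first order, the fractional dip is simply the square of the planet-to-star radius ratio. A quick sketch using the sizes quoted in this release (a planet about one-third of Earth's radius, a star about three-quarters of the Sun's) shows just how small the signal is; the exact values below are rounded assumptions.

```python
# Transit depth to first order: depth = (R_planet / R_star)^2.
R_EARTH_KM = 6371.0
R_SUN_KM = 695_700.0

def transit_depth(r_planet_km, r_star_km):
    return (r_planet_km / r_star_km) ** 2

# Sizes quoted in the release: planet ~1/3 Earth radius, star ~3/4 Sun radius.
depth = transit_depth(R_EARTH_KM / 3.0, 0.75 * R_SUN_KM)
print(f"Fractional dip: {depth:.2e} (~{depth * 1e6:.0f} parts per million)")
# ~1.7e-5, roughly 17 ppm -- far smaller than an Earth-Sun transit (~84 ppm).
```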
The size of the star must be known in order to measure the planet's size accurately. To learn more about the properties of the star Kepler-37, scientists examined sound waves generated by the boiling motion beneath the surface of the star. They probed the interior structure of Kepler-37's star just as geologists use seismic waves generated by earthquakes to probe the interior structure of Earth. The science is called asteroseismology.
The sound waves travel into the star and bring information back up to the surface. The waves cause oscillations that Kepler observes as a rapid flickering of the star's brightness. Like bells in a steeple, small stars ring at high tones while larger stars boom in lower tones. The barely discernible, high-frequency oscillations in the brightness of small stars are the most difficult to measure. This is why most objects previously subjected to asteroseismic analysis are larger than the sun. With the very high precision of the Kepler instrument, astronomers have reached a new milestone. The star Kepler-37, with a radius just three-quarters of the sun, now is the smallest bell in the asteroseismology steeple. The radius of the star is known to three percent accuracy, which translates to exceptional accuracy in the planet's size.
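The release doesn't quote the seismic measurements themselves, but the standard relation behind them is that the spacing between a star's oscillation frequencies scales with the square root of its mean density. The sketch below inverts that relation; the frequency spacing and mass are assumed values chosen only to show the mechanics, not figures from the Kepler team.

```python
# The large frequency separation scales with mean stellar density:
#   dnu / dnu_sun = sqrt( (M / M_sun) / (R / R_sun)^3 )
# so a measured dnu plus a mass estimate yields the radius. The numbers
# below are assumptions for illustration, not values from the release.
DNU_SUN_UHZ = 135.0  # solar large frequency separation, ~135 microhertz

def radius_from_dnu(dnu_uhz, mass_msun):
    return (mass_msun * (DNU_SUN_UHZ / dnu_uhz) ** 2) ** (1.0 / 3.0)

# Assumed: dnu ~ 180 uHz for a star somewhat denser than the Sun,
# and a mass of ~0.8 solar masses.
print(f"Inferred radius: {radius_from_dnu(180.0, 0.8):.2f} solar radii")
# Prints ~0.77, consistent with the "three-quarters of the sun" quoted above.
```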
Ames is responsible for Kepler's ground system development, mission operations, and science data analysis. NASA's Jet Propulsion Laboratory in Pasadena, Calif., managed Kepler mission development.
Ball Aerospace & Technologies Corp. in Boulder, Colo., developed the Kepler flight system and supports mission operations with the Laboratory for Atmospheric and Space Physics at the University of Colorado in Boulder.
The Space Telescope Science Institute in Baltimore archives, hosts and distributes Kepler science data. Kepler is NASA's tenth Discovery Mission and was funded by NASA's Science Mission Directorate at the agency's headquarters in Washington.
For more information about the Kepler mission, visit: http://www.nasa.gov/kepler.

News Media Contact

Whitney Clavin 818-354-4673
Jet Propulsion Laboratory, Pasadena, Calif.
whitney.clavin@jpl.nasa.gov
J.D. Harrington 202-358-5241
Headquarters, Washington
j.d.harrington@nasa.gov
2013-066
Praise for offshore wind initiative from Newport News ally
Gamesa Technology Corp. praised steps announced yesterday by Energy Secretary Steven Chu and Interior Secretary Ken Salazar to speed development of wind energy off the U.S. East Coast.

Gamesa has an alliance with the Newport News Shipbuilding operation of Northrop Grumman Shipbuilding to cooperate on the design, development, site selection, permitting, installation and testing of Gamesa's G11X-5.0 MW offshore prototype wind turbine in the United States.

On Thursday, Gamesa and Northrop Grumman Shipbuilding will stage the official opening of the Gamesa Offshore Wind Technology Center in Chesapeake, Va. The two companies formed the center to develop the next generation of offshore wind systems that will be deployed in the United States and around the world.

Yesterday, Dirk Matthys, Chairman and CEO of Gamesa North America, called the steps announced by Secretary Chu and Secretary Salazar "significant and important actions."

"This is good news for businesses and families on the East Coast and elsewhere in the U.S. who stand to benefit from the development of this major source of sustainable energy," said Mr. Matthys.

With more than 15 years' experience, Gamesa is a world leader in the design, manufacture, installation and maintenance of wind turbines, with more than 19,000 megawatts installed in 26 countries on four continents.
February 8, 2010
'Explosive growth' for BSkyB TV
Broadcaster BSkyB brushed aside competition from rival BT today as it hailed "explosive growth" in its on-demand TV service and a surge of more than 40% in the take-up of products over its Christmas quarter.

The group said its advertising campaign fronted by Absolutely Fabulous star Joanna Lumley helped it notch up five million customers for high definition TV in the quarter to December 31, while it saw a record one million connected Sky+HD boxes added in the three months - nearly 11,000 a day.
Higher sports rights and content investment costs weighed on half-year underlying operating profits, down 8% to £595 million, although this was better than expected, sending shares more than 3% higher.
It added another 110,000 broadband internet customers in the last three months of 2013, despite the challenge from BT, which launched its own sport channels last August offering free Premier League football if customers sign up to a broadband package.

Sky was left reeling after losing out in November on the UK rights to show Champions League and Europa League matches to BT, which paid almost £900 million to show both Uefa competitions for three seasons from 2015/16.

Jeremy Darroch, chief executive of BSkyB, said the group was not prepared to pay over the odds for the rights, saying it believes there are "better ways to invest for our customers". It announced a new five-year deal for the exclusive rights to the entire HBO TV catalogue, which includes top US shows such as Girls and Game Of Thrones, extending its existing agreement to 2020, having also just secured a new drama pay channel with broadcaster ITV, called ITV Encore, that will launch next year.

The group declined to comment on reports suggesting it has held talks with Vodafone about adding a mobile offering to boost its defences against BT, but Mr Darroch said Sky was "open-minded" about tie-ups. He said: "Mobile has been something that from time to time we've looked at - we haven't ruled anything out but don't see it being imperative for the business."

BSkyB posted an 8% rise in overall adjusted half-year revenues to £3.8 billion and said customers took out 873,000 new paid-for subscription products in its second quarter, up 42% year-on-year, with 3.8 million added in the past 12 months in what marks its fastest growth for three years.
It said 36% of customers now take-up so-called triple-play packages across its services, up from 29% two years ago.
Richard Hunter, head of equities at Hargreaves Lansdown Stockbrokers, said it was a "pleasing" set of results.
"The breadth of the offering positions the company well on a number of fronts," he added.
An update from BT tomorrow will be watched closely for signs of how the firm has been benefiting from its push into content.
Aside from its battle with BT, Sky has been in the takeover spotlight again recently as speculation has resurfaced over a potential new move from Rupert Murdoch to take full control of the firm.
Mr Murdoch, who owns 39.1% of BSkyB, scrapped his last takeover effort in 2011 amid opposition from the communications watchdog Ofcom and as details of the phone-hacking scandal at his now defunct News of the World tabloid emerged. | 科技 |
The Master of the Stars
Michael Hammers' gigantic Christmas Stars
The firmament has a new star.
A real Hammers.
By Hildegard Mathies
The newborn star by Michael Hammers softly shines its light into the night sky above Innsbruck. The artist from Aachen, who also created the star for the most famous Christmas tree in the world – in front of the Rockefeller Center in New York – has regaled the Innsbruck Christkindlmarkt with a new celestial body. The star, which is almost four metres in height, crowns the Swarovski crystal tree, which itself is more than 14 metres tall and forms the radiating centre of the Christmas fair in the city's market square.
Despite all its brilliance, the star of Innsbruck is to shine just as modestly as the way in which Jesus Christ was born into the world in a stable in Bethlehem. Clearly visible and an eyeturner too, of course. But not loud, intrusive or overpowering. Hence, there is no excited twinkling, and even the famous Swarovski crystals have been built into the inside of the elegant silver giant.
Almost 2500 tiny crystals of 14 millimetres in diameter have been mounted in front of 54 LED spotlights and disperse the light. A total of 36 light scenarios can be timed and randomly enacted and let the star, which is more than three metres in diameter, shine. "The effect is magical", finds Hammers' client, former Swarovski boss Gernot Langes-Swarovski, known to this day as the "master of crystals".
The night of enlightenment
When Michael Hammers talks, it feels almost like being there on the night when the Innsbruck star shone for the very first time. There they are, sitting in their car, a few hundred metres down the road, looking at their star: Hammers, the creator of the filigreed marvel made of glass-pearl-blasted steel, and Florian Kick who is responsible for the light and electric planning. Immaterial are now all the hours and days of planning, refining and constructing as well as the late hour of the day. Hammers is devoted to his projects with all his heart. Clock hands and other nullities do not come into it.
Those who work with him do not only know this, but share it. Voluntarily and gladly. In spite of all his artistic and know-how-based self-confidence, which Hammers exudes in a down-to-earth and endearing manner after almost three decades of creative work: he is a team player. And the founder and creative head of Michael Hammers Studios does not only demand full commitment from his co-workers and cooperation partners – he himself gives it too. This makes it easy for him to inspire people.
It is after midnight. Via Wi-Fi Kick steers the lighting programme. And then the star finally gleams exactly as it should. “With the light that I integrated I wanted it to belong to the sky”, says Michael Hammers. With the wonder of a child the 49-year-old finds the light effect “utterly fascinating”. Just as the real stars in the firmament twinkle silently and seem to move, his star appears to be alive. It floats over Innsbruck and lets us forget that it weighs 570 kilograms.
The most beautiful star
“It is exactly where I wanted it”, rejoices the artist. Once again he has succeeded to let something materialise from his power of imagination, bringing it from the realm of ideas into reality. “I am happy that I have got the measurements and proportions just right”, he says. “After all, I did not know the Christkindlmarkt before.”
But he did meet the founder of the fair, Christian Mark. He describes him as someone who just does his own thing. Someone who takes care that his Chistkindlmarkt does not only consist of food and drinks stalls or the usual made-in-China range. This was also why it was clear for Hammers from the start: “I will create an Innsbruck mountain Christmas and not an Innsbruck mountain fun fair.” That he succeeded became evident straight away, on the night the star shone for the first time: “Only moments later somebody from the mountain rang Mr. Mark and told him that he had just seen the most beautiful star.”
Stars form a golden trail through Hammers’ work. “I see stars everywhere”, he says. In nature, in architecture, in everyday life – the man blessed with many gifts also spots stellar shapes in places where others would not even suspect them. And he has been developing new star shapes, 20 or more by now. “This comes to me easily”, the artist recounts.
When eternity breaks into time
The fascination with heavenly entities is rooted in his Christian background. The Christmas star, the star of Bethlehem, which led the Three Wise Men to baby Jesus in his manger, for Hammers it is the star of all stars. “For me that is eternity entering time”, he says without pathos. For him this powerful sign is, however, not attached to any faith community or religion: “It is universal.”
For Michael Hammers stars and comets in particular are also symbols for time and life. “Comets emerge and – like everything – only become visible in the light of the sun”, he explains. “They exist in the dark for a long time, become visible, follow their path and perish. That is what we call time.” That the moving celestial bodies only form their tails when they perish, he finds particularly remarkable. He pauses for a moment. “I can understand why people say: ‘that is my star’, when something good or bad happens to them.”
Humans always look upwards. At all times and in all cultures this has been repeating itself. Even though Hammers himself likes to look at the clouds – he is not an astrologer or an esoteric. “It is not so much that I think that the stars are watching over me”, he says. He rather believes that man “stands where he stands, and orientates himself by what surrounds him”. But he is also sure: “I have always followed a star. I have never consciously decided on anything and never knew where I would end up.” He just starts, begins to walk, just does it – trusting that he will find at the end what inspiration had promised him in the beginning. At least that.
And what does this mean for his Stars, whether in Innsbruck or New York? “When children’s eyes look at them and shine and the children point at the star, then I have done everything right”, says Michael Hammers. For adults stars are symbols of peace, harmony, love and special events and times – not only, but particularly at Christmas. And for many people stars represent their loved ones, the living as well as those who preceded them, with whom they also feel connected eternally. Because of this and not only because of the big radiance of the heavenly stars, it would be beautiful if Michael Hammers’ dream would come true: “Every city in the world needs a star.” And ideally one made by the master of the stars.
Translation: Marion Weisskirchen
Apple Launches Subscriptions on the App Store, Finally!
Now we have a big competitor to Microsoft's Zune services: Apple today announced a new subscription service available to all publishers of content-based apps on the App Store, including magazines, newspapers, video, music, etc. This is the same innovative digital subscription billing service that Apple recently launched with News Corp.'s "The Daily" app. Here is more:

Subscriptions purchased from within the App Store will be sold using the same App Store billing system that has been used to buy billions of apps and In-App Purchases. Publishers set the price and length of subscription (weekly, monthly, bi-monthly, quarterly, bi-yearly or yearly). Then with one click, customers pick the length of subscription and are automatically charged based on their chosen length of commitment (weekly, monthly, etc.). Customers can review and manage all of their subscriptions from their personal account page, including canceling the automatic renewal of a subscription. Apple processes all payments, keeping the same 30 percent share that it does today for other In-App Purchases.

Publishers who use Apple's subscription service in their app can also leverage other methods for acquiring digital subscribers outside of the app. For example, publishers can sell digital subscriptions on their web sites, or can choose to provide free access to existing subscribers. Since Apple is not involved in these transactions, there is no revenue sharing or exchange of customer information with Apple. Publishers must provide their own authentication process inside the app for subscribers that have signed up outside of the app.

However, Apple does require that if a publisher chooses to sell a digital subscription separately outside of the app, that same subscription offer must be made available, at the same price or less, to customers who wish to subscribe from within the app. In addition, publishers may no longer provide links in their apps (to a web site, for example) which allow the customer to purchase content or subscriptions outside of the app.
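The billing terms reduce to simple arithmetic. Here is a minimal sketch of what a publisher nets per subscriber; the subscription price below is invented, while the 30 percent share comes from the announcement.

```python
# Publisher proceeds under the App Store subscription terms described
# above. The price is a made-up example; the 30% share is Apple's stated cut.
APPLE_SHARE = 0.30

def publisher_proceeds(price, periods_per_year):
    gross = price * periods_per_year
    return gross * (1.0 - APPLE_SHARE)

# A hypothetical $3.99 monthly magazine subscription:
print(f"Publisher nets ${publisher_proceeds(3.99, 12):.2f} per subscriber-year")
# Off-app sales keep 100% of revenue, but the same offer must also be
# available in-app at the same price or less.
```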
Scientists Use Elephant Seals to Monitor Oceans
June 30, 2007 Elephant seals live at sea for months at a time, diving several thousand feet below the surface and swimming from northern California to Russia and back again. Now, the amazing creatures are being recruited by scientists.
Bald Eagles Viewed Differently in Alaska
June 30, 2007 The bald eagle was taken off the threatened species list this week. In Alaska, bald eagles outnumber people in some places, and are not viewed with the respect they get in the continental United States.
A Week to Recognize the Role of Pollinating Insects
June 30, 2007 This is the first annual National Pollinators Week, designated by the United States Senate and the USDA, to draw attention to the plight of disappearing honeybees and other pollinating insects.
Party Pollution Threatens Ozarks' Rivers
June 29, 2007 The spring-fed, protected rivers that wend through the Missouri Ozarks have begun to draw swarms of drunken party-goers, and the National Park Service's attempts to crack down run up against staff shortages.
Biologist Recounts Path to Bald Eagles' Recovery
June 28, 2007 The Interior Department will take the American bald eagle off the Endangered Species List next month. Biologist Peter Nye has been tracking the bald eagle for more than 30 years. He talks with Melissa Block about how America's bird will fare off the list.
Bottled Water: A Symbol of U.S. Commerce, Culture
June 28, 2007 The bottled-water business in the United States is booming. People increasingly are willing to pay for something they can just as easily have for free. Yet many people around the world lack safe, dependable drinking water.
Condors Being Poisoned by Hunters' Ammunition
June 28, 2007 Two studies suggest that condors are being poisoned by lead that they ingest from carcasses and gut piles left by hunters. Now, wildlife officials in Arizona are trying to persuade hunters to switch to lead-free ammunition in an attempt to save the endangered bird.
Bald Eagle Comes Off Endangered List
June 28, 2007 The U.S. Fish and Wildlife Service has announced that it is removing the American bald eagle from the nation's list of threatened and endangered species.
Bald Eagle Leaves Endangered Species List
June 28, 2007 The Interior Department is removing the American bald eagle from protection under the Endangered Species Act. Once almost wiped out by hunters and DDT poisoning, the eagle has not only survived but is thriving.
The Bald Eagle's Bold Comeback
June 28, 2007 The American bald eagle was taken off the endangered species list Thursday. Dr. Patrick Redig, founder of the Raptor Center at the University of Minnesota, talks about the bald eagle's transformation from widely-hunted raptor to one of the primary symbols of the United States.
Individuals Unite to Trim Personal Carbon Emissions
June 28, 2007 In the U.K., members of Carbon Rationing Action Groups, known as craggers, try to cut their personal carbon emissions by 10 percent each year. They turn off the TV, try not to fly and turn down the heat.
Big Retail Stores Prime Solar Energy Generators
June 28, 2007 A few solar energy companies have discovered an opportunity on the roofs of big retail stores. They're offering to install solar panels for free. They then sell the electricity back to the store, often at lower rates than the local utility charges.
Bald Eagle No Longer Threatened Species
June 28, 2007 Secretary of Interior Dirk Kempthorne is expected to announce that the bald eagle is no longer listed as a threatened species. The population has risen since the early 1960s from 417 nesting pairs to more than 10,000 nesting pairs today. Steve Inskeep, John Nielsen
Cold-Adverse Plants Warm Up to a New Home
June 28, 2007 Only plant nerds used to risk loving the Hebe, native to New Zealand's cliffs and tufted grasslands. But as winters warm up in the U.S., plum pewter and burnt orange hebes are blooming, along with other species formally doomed by cold.
Cats First Tamed in the Middle East
June 28, 2007 Whenever humans met dogs or horses in the wild, they usually tamed them and put them to work. But not cats; it appears most pet cats can trace their heritage back to the wildcats of the Middle East.
John C. Dvorak
Digital Camera Lament
Only a few digital camera makers stand on solid ground.
It was a shame to see the Konica-Minolta folks fold their tent and sell their designs and assets to Sony, a company that will undoubtedly fail to exploit any of what it bought. Minolta and Konica have a combined experience of over 200 years in the industry and apparently weren't able to get out of the SLR film camera business fast enough. This is despite aggressive attempts to make headway in the digital game, which included the development of some breakthroughs.
The company produced a definitive pocket camera, the DiMage X50, the take-everywhere camera of choice for me and others. Its folded optics are remarkable but were never marketed with any vigor. More importantly, Minolta developed an image stabilization system that was never marketed to any extreme. Few people understood the importance of this mechanism or how many good shots are lost because of minor shaking. Instead of promoting this as an exclusive breakthrough technology that could benefit all photographers in this era of hand-held cameras, the company billed it as more of a novelty. A curiosity. What killed Minolta in the digital camera game was its failure to develop top-notch super optics for the mainstream products, and a reputation for making cameras with poor battery life. Both were correctable, but by failing to exploit the strengths of its line (ease of use and image stabilization) the company ran out of time. I guess I'll just continue to use the X50 (which I was hoping would eventually incorporate image stabilization) until it falls apart. I still haven't seen a pocket camera I like as much.
It all boils down to marketing and getting the message out. Konica-Minolta began to lose its edge when it lost one particular PR guy who knew what he was talking about. Marketing and PR are generally weaknesses in camera companies; only a few have developed a consistent foundation. That said, I don't see Konica-Minolta being the last casualty. From what I can tell only Canon, Kodak, Nikon, Olympus, and Sony are completely secure, at least for now. After that, predicting survival gets sketchy because of the huge number of second-tier players that could be in or out of the business on a whim. In this group I include Epson and HP, which seem to be making cameras only to sell photo printers. I also include Panasonic, which seems to be in the business only to prove that it can be. It has a couple of high-end novelty cameras that it likes to showcase, but who really uses or buys them? Panasonic also developed an image stabilization system that was intriguing but incredibly under-promoted. The company looks like it could pull the plug at the drop of a hat or keep making cameras forever, but it seems to have no heart for the venture.
The Duo get their hands on the Sony PlayStation Portable, one of 2005's most eagerly anticipated devices. Does the thrill of discovery keep them from finding plenty to complain about? Does Steve Jobs swear by Windows XP?
Angela notes that not only does the PSP incorporate a lot of power and versatility into a case about the same size as that of the Nintendo DS, Sony's stuffed a ton of ambition in there, too. The unit is designed to play digital music, display photos, and play video and movies in addition to games. (Not your beloved PS2 games, though; they're not compatible with the new hardware.)
Steve agrees that there's a lot going on here, but comments that it's rather hard to evaluate the PSP when so little content is (at the time of taping) available for it. The Duo have fun with the anime-style golf game included when they saw the PSP, despite the game's all-Japanese interface. Steve displays a few pictures from a Memory Stick, but as he points out, he can play those back on his camera just as easily.
Angela, momentarily channeling the Fashion Police, has a beef with the body of the PSP itself. Grumbling that "shiny black is the new white," she roundly criticizes the unit for being extremely smudge-prone. (Indeed, within days of the PSP's launch there was a booming aftermarket of cases and skins designed to protect the devices.)
She also notes that the device has so many buttons that it's hard to pick it up without jostling something. Steve agrees and reminds us that in the end, it's content that'll make or break the most hotly anticipated device to debut in the Year of the Portables.
SAVE/DELETE
Steve: Waffle (that is, neither a SAVE nor a DELETE)
Angela: Waffle
Genetic modification trial slashes mosquito numbers
Emma Woollacott, 13th November 2010
Scientists have announced that a field trial in which millions of genetically modified mosquitoes were released in the wild led to a dramatic fall in wild mosquito numbers.
The trial, carried out by Oxford firm Oxitec and the Mosquito Research and Control Unit of Grand Cayman (MRCU) aimed to test a method of combating dengue fever. It's the world’s fastest growing mosquito-borne viral disease, and one for which there is neither medication nor vaccine. According to the World Health Organization, two fifths of the world’s population, some 2.5 billion people, are at risk.
The genetically sterile strain of Aedes aegypti can be reared in large numbers by feeding the mosquitoes a supplement that turns off the sterility gene temporarily. The team bred three million sterile males - male mosquitoes don't bite or spread diseases - and then released them into the wild. There, they were able to seek out and mate with wild females - without producing offspring.
They found that the number of mosquitoes in the area fell by 80 percent in six months.
"Oxitec considers that this approach could be used in many countries to help control the Aedes aegypti mosquito and hence prevent dengue fever," says Dr Luke Alphey, chief scientific officer and founder of the company. "We have been working on this for many years to ensure the approach is both effective and safe. This trial represents the first demonstration in the open field, and we are delighted with the results."
Konarka's Organic Photovoltaic Modules Pass Aging Tests Performed by TÜV Rheinland
LOWELL, Mass. --(Business Wire)-- Konarka Technologies, Inc., an innovator in development and commercialization of Konarka Power Plastic®, a lightweight, flexible organic solar film that converts light to electricity, today announced that Konarka's organic solar modules have passed through aging tests following the type approval standard for photovoltaic thin-film modules IEC 61646. These recent tests confirm that Konarka's flexible applications can withstand critical IEC stress tests, while in February, the company announced product testing success in a rigid glass system. Testing was performed by TÜV Rheinland, a leading international test provider, in its Solar Energy Assessment Center (SEAC) in Cologne, Germany. These recent advances continue to be based upon Konarka's inverted cell architecture, the company's proprietary intellectual property protected under issued patents.
"In addition to the many rigid, glass-based applications where onarka's Power Plastic has been designed into OEM products, customers using Power Plastic in applications requiring a flexible photovoltaic material can now move forward with product lifetime confidence as a result of this most recent TÜV Rheinland performance testing," commented Howard Berke, chairman, CEO and co-founder of Konarka. "For customers who want to take advantage of Power Plastic's extensive product features, including color options, transparency and custom sizes, Konarka continues to extend tremendous, independent validation for Power Plastic's lifetime performance for applications requiring flexibility, conformance to contours and standalone applications."
The SEAC of TÜV Rheinland in Cologne is the world's leading test lab which optimizes processes and modern testing and simulation facilities for proven quality and reliability according to national and international standards. In February, Konarka announced its organic photovoltaic modules were the first OPV technology to pass through the set of individual critical tests following IEC 61646 performed by TÜV Rheinland.
Thin, light weight, transparent and flexible, Power Plastic solar films are ideally suited for integration into various building materials, including glass, plastics, steel, composites and fabrics, offering creative architects and product designers superior, widespread latitude with several color and transparency options across a wide range of sizes enabled by Konarka's roll-to-roll continuous manufacturing process.
About Konarka Technologies, Inc.
Konarka Technologies develops and manufactures solar plastic films that convert light to electricity - anywhere. As the leading developer of polymer-based, organic photovoltaic (OPV) technology that provides a source of renewable power in a variety of form factors, Konarka has a broad portfolio of patents, technology licenses and an accomplished technical, scientific and manufacturing team. Manufactured at low cost and low energy consumption, the company's Konarka Power Plastic® technology is lightweight, flexible, scalable and adaptable for use in a variety of commercial, industrial, government and consumer applications. Konarka Technologies is headquartered in Lowell, Mass., U.S.A. and has a full scale production manufacturing facility in New Bedford, Mass. U.S.A., with European headquarters in Nürnberg, Germany and a business development office in Japan. For additional information, visit http://www.konarka.com.
All trademarks recognized. | 科技 |
Image courtesy of Corel Photography
Space Missions section
Rocket and Top Secret Satellite Explode!
News story originally written on August 12, 1998 The 18-story tall Titan rocket and its payload exploded today less than a minute after take-off. Debris from this $350 million rocket and $1 billion satellite rained down on the Atlantic Ocean. No injuries have been reported.The Lockheed Martin rocket was launched from Cape Canaveral Air Station in Florida. It was to carry a classified National Reconnaissance Office satellite into orbit. We can only speculate that this satellite would have become a part of an array of satellites used by the U.S. to monitor "nuclear hotspots", nations that are increasing their nuclear weapon capabilities. This is indeed one of the costliest explosions in U.S. space program history! Investigations by the U.S. Air Force will begin immediately.
Asteroid 150 feet wide buzzes Earth
CAPE CANAVERAL, Fla. -- A 150-foot asteroid hurtled through Earth's backyard Friday, coming within an incredible 17,150 miles and making the closest known flyby for a rock of its size. In a chilling coincidence, a meteor exploded above Russia just hours before the asteroid zoomed past the planet.

Scientists the world over, along with NASA, insisted the meteor had nothing to do with the asteroid since they appeared to be traveling in opposite directions. The asteroid is a much more immense object and delighted astronomers in Australia and elsewhere who watched it zip harmlessly through a clear night sky.

"It's on its way out," reported Paul Chodas of NASA's Near-Earth Object program at Jet Propulsion Laboratory in California.

Asteroid 2012 DA14 came closer to Earth than many communication and weather satellites orbiting 22,300 miles up. Scientists insisted these too would be spared, and they were right.
The asteroid was too small to see with the naked eye even at its closest approach around 2:25 p.m., over the Indian Ocean near Sumatra.

The best viewing locations, with binoculars and telescopes, were in Asia, Australia and eastern Europe. Even there, all anyone could see was a pinpoint of light as the asteroid buzzed by at 17,400 mph.

As asteroids go, this one is a shrimp. The one that wiped out the dinosaurs 65 million years ago was 6 miles across. But this rock could still do immense damage if it ever struck given its 143,000-ton heft, releasing the energy equivalent of 2.4 million tons of TNT and wiping out 750 square miles.
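That TNT figure is a plain kinetic-energy conversion and is easy to check. Using the quoted 143,000-ton mass and the 17,400 mph flyby speed gives roughly 1 megaton; the published 2.4-megaton estimate evidently assumes a faster impact velocity, which an object accelerated by Earth's gravity would in fact have. A quick sketch (the 28,000 mph figure is a back-solved assumption, not from the story; short tons are also an assumption):

```python
# Kinetic energy of 2012 DA14 in tons of TNT, from the figures in the story.
TON_KG = 907.18          # US short ton, an assumed interpretation
MPH_TO_MS = 0.44704
TNT_TON_J = 4.184e9      # energy released by one ton of TNT

mass_kg = 143_000 * TON_KG
for label, v_mph in [("flyby speed", 17_400), ("assumed impact speed", 28_000)]:
    v = v_mph * MPH_TO_MS
    energy_j = 0.5 * mass_kg * v ** 2
    print(f"{label:>20}: {energy_j / TNT_TON_J / 1e6:.1f} megatons TNT")
# The quoted 2.4-megaton figure corresponds to an entry speed near
# 28,000 mph rather than the 17,400 mph flyby speed.
```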
By comparison, NASA estimated that the meteor that exploded over Russia was tiny -- about 49 feet wide and 7,000 tons before it hit the atmosphere, or one-third the size of the passing asteroid.

As for the back-to-back events, "this is indeed very rare and it is historic," said Jim Green, NASA's director of planetary science. While the asteroid is about half the length of a football field, the exploding meteor "is probably about on the 15-yard line," he said.

"Now that's pretty big. That's typically a couple times bigger than the normal influx of meteorites that create these fireballs," he said in an interview on NASA TV.

"These fireballs happen about once a day or so, but we just don't see them because many of them fall over the ocean or in remote areas. This one was an exception."
As the countdown for the asteroid's close approach entered the final hours, NASA noted that the path of the meteor appeared to be quite different than that of the asteroid, making the two objects "completely unrelated." The meteor seemed to be traveling from north to south, while the asteroid passed from south to north -- opposite directions.

Most of the solar system's asteroids are situated in a belt between the orbits of Mars and Jupiter, and remain stable there for billions of years. Some occasionally pop out, though, into Earth's neighborhood.

NASA scientists estimate that an object of this size makes a close approach like this every 40 years. The likelihood of a strike is every 1,200 years.

The flyby provides a rare learning opportunity for scientists eager to keep future asteroids at bay -- and a prime-time advertisement for those anxious to step up preventive measures.
Friday's meteor further strengthened the asteroid-alert message.

"We are in a shooting gallery and this is graphic evidence of it," said former Apollo astronaut Rusty Schweickart, chairman emeritus of the B612 Foundation, committed to protecting Earth from dangerous asteroids.

Schweickart noted that 500,000 to 1 million sizable near-Earth objects -- asteroids or comets -- are out there. Yet less than 1 percent -- fewer than 10,000 -- have been inventoried.

Humanity has to do better, he said. The foundation is working to build and launch an infrared space telescope to find and track threatening asteroids.

If a killer asteroid were, indeed, incoming, a spacecraft could, in theory, be launched to nudge the asteroid out of Earth's way, changing its speed and the point of intersection. A second spacecraft would make a slight alteration in the path of the asteroid and ensure it never intersects with the planet again, Schweickart said.

Asteroid DA14 -- discovered by Spanish astronomers only last February -- is "such a close call" that it is a "celestial torpedo across the bow of spaceship Earth," Schweickart said in a phone interview Thursday.
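The physics behind the nudge idea is that a tiny along-track velocity change, applied years in advance, steadily shifts where the asteroid is on its orbit. A common rule of thumb puts the along-track drift at roughly three times the applied velocity change multiplied by the lead time; the sketch below uses that approximation with assumed inputs and should be read as order-of-magnitude only.

```python
# Rough along-track drift from a small velocity change applied years ahead.
# Uses the common rule of thumb: displacement ~ 3 * dv * t. Treat the
# result as an order-of-magnitude estimate, not mission analysis.
EARTH_RADIUS_KM = 6371.0
YEAR_S = 3.156e7

def miss_distance_km(dv_m_per_s, lead_time_years):
    return 3.0 * dv_m_per_s * lead_time_years * YEAR_S / 1000.0

# Assumed: a 1 cm/s nudge applied 10 years before a predicted impact.
d = miss_distance_km(0.01, 10)
print(f"~{d:,.0f} km, about {d / EARTH_RADIUS_KM:.1f} Earth radii")
```

Even a 1 cm/s push, given a decade of lead time, moves the predicted arrival point by more than an Earth radius, which is why early detection matters so much.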
NASA's deep-space antenna in California's Mojave Desert was ready to collect radar images, but not until eight hours after the closest approach given the United States' poor positioning for the big event.
'Buckwild' cast member charged with aggravated DUI
Meteor explodes over Russia; 1,100 injured | 科技 |
2017-09/1579/en_head.json.gz/24309 | Seismic wave
This article is about waves that travel through the Earth. For ocean waves sometimes called "seismic sea waves", see Tsunami.
Foreshock
Blind thrust
Interplate
Intraplate
Megathrust
Remotely triggered
Supershear
Earthquake swarm
Fault movement
Induced seismicity
Hypocenter
Shadow zone
P-wave
S-wave
Seismic scales
Earthquake duration magnitude
Coordinating Committee for
Shear wave splitting
Adams–Williamson equation
Flinn–Engdahl regions
Earthquake engineering
Seismite
Category • Related topics
Body waves and surface waves
p-wave and s-wave from seismograph
Velocity of seismic waves in the Earth versus depth.[1] The negligible S-wave velocity in the outer core occurs because it is liquid, while in the solid inner core the S-wave velocity is non-zero.
Seismic waves are waves of energy that travel through the Earth's layers, and are a result of earthquakes, volcanic eruptions, magma movement, large landslides and large man-made explosions that give out low-frequency acoustic energy. Many other natural and anthropogenic sources create low-amplitude waves commonly referred to as ambient vibrations. Seismic waves are studied by geophysicists called seismologists. Seismic wave fields are recorded by a seismometer, hydrophone (in water), or accelerometer.
The propagation velocity of the waves depends on density and elasticity of the medium. Velocity tends to increase with depth and ranges from approximately 2 to 8 km/s in the Earth's crust, up to 13 km/s in the deep mantle.[2]
Earthquakes create distinct types of waves with different velocities; when reaching seismic observatories, their different travel times help scientists to locate the source of the hypocenter. In geophysics the refraction or reflection of seismic waves is used for research into the structure of the Earth's interior, and man-made vibrations are often generated to investigate shallow, subsurface structures.
1.1 Body waves
1.1.1 Primary waves
1.1.2 Secondary waves
1.2 Surface waves
1.2.1 Rayleigh waves
1.2.2 Love waves
1.2.3 Stoneley waves
1.2.4 Free oscillations of the Earth
1.3 P and S waves in Earth's mantle and core
2 Notation
3 Usefulness of P and S waves in locating an event
Among the many types of seismic waves, one can make a broad distinction between body waves, which travel through the Earth, and surface waves, which travel at the Earth's surface.[2]:48–50[3]:56–57
Other modes of wave propagation exist than those described in this article; though of comparatively minor importance for earth-borne waves, they are important in the case of asteroseismology.
Body waves travel through the interior of the Earth.
Surface waves travel across the surface. Surface waves decay more slowly with distance than body waves, which travel in three dimensions.
Particle motion of surface waves is larger than that of body waves, so surface waves tend to cause more damage.
Body waves[edit]
Body waves travel through the interior of the Earth along paths controlled by the material properties in terms of density and modulus (stiffness). The density and modulus, in turn, vary according to temperature, composition, and material phase. This effect resembles the refraction of light waves. Two types of particle motion result in two types of body waves: Primary and Secondary waves.
Primary waves[edit]
Main article: P-wave
Primary waves (P-waves) are compressional waves that are longitudinal in nature. P waves are pressure waves that travel faster than other waves through the earth to arrive at seismograph stations first, hence the name "Primary". These waves can travel through any type of material, including fluids, and can travel at nearly twice the speed of S waves. In air, they take the form of sound waves, hence they travel at the speed of sound. Typical speeds are 330 m/s in air, 1450 m/s in water and about 5000 m/s in granite.
Secondary waves[edit]
Main article: S-wave
Secondary waves (S-waves) are shear waves that are transverse in nature. Following an earthquake event, S-waves arrive at seismograph stations after the faster-moving P-waves and displace the ground perpendicular to the direction of propagation. Depending on the propagational direction, the wave can take on different surface characteristics; for example, in the case of horizontally polarized S waves, the ground moves alternately to one side and then the other. S-waves can travel only through solids, as fluids (liquids and gases) do not support shear stresses. S-waves are slower than P-waves, and speeds are typically around 60% of that of P-waves in any given material.
Surface waves[edit]
Seismic surface waves travel along the Earth's surface. They can be classified as a form of mechanical surface waves. They are called surface waves, as they diminish as they get further from the surface. They travel more slowly than seismic body waves (P and S). In large earthquakes, surface waves can have an amplitude of several centimeters.[4]
Rayleigh waves[edit]
Main article: Rayleigh wave
Rayleigh waves, also called ground roll, are surface waves that travel as ripples with motions that are similar to those of waves on the surface of water (note, however, that the associated particle motion at shallow depths is retrograde, and that the restoring force in Rayleigh and in other seismic waves is elastic, not gravitational as for water waves). The existence of these waves was predicted by John William Strutt, Lord Rayleigh, in 1885. They are slower than body waves, roughly 90% of the velocity of S waves for typical homogeneous elastic media. In the layered medium (like the crust and upper mantle) the velocity of the Rayleigh waves depends on their frequency and wavelength. See also Lamb waves.
Love waves[edit]
Main article: Love wave
Love waves are horizontally polarized shear waves (SH waves), existing only in the presence of a semi-infinite medium overlain by an upper layer of finite thickness.[5] They are named after A.E.H. Love, a British mathematician who created a mathematical model of the waves in 1911. They usually travel slightly faster than Rayleigh waves, about 90% of the S wave velocity, and have the largest amplitude.
Stoneley waves[edit]
Main article: Stoneley wave
A Stoneley wave is a type of boundary wave (or interface wave) that propagates along a solid-fluid boundary or, under specific conditions, also along a solid-solid boundary. Amplitudes of Stoneley waves have their maximum values at the boundary between the two contacting media and decay exponentially towards the depth of each of them. These waves can be generated along the walls of a fluid-filled borehole, being an important source of coherent noise in VSPs and making up the low frequency component of the source in sonic logging.[6] The equation for Stoneley waves was first given by Dr. Robert Stoneley (1894 - 1976), Emeritus Professor of Seismology, Cambridge.[7]
Free oscillations of the Earth[edit]
The sense of motion for toroidal 0T1 oscillation for two moments of time.
The scheme of motion for spheroidal 0S2 oscillation.Dashed lines give nodal (zero) lines. Arrows give the sense of motion.
Free oscillations of the Earth are standing waves, the result of interference between two surface waves traveling in opposite directions. Interference of Rayleigh waves results in spheroidal oscillation S while interference of Love waves gives toroidal oscillation T. The modes of oscillations are specified by three numbers, e.g., nSlm, where l is the angular order number (or spherical harmonic degree, see Spherical harmonics for more details). The number m is the azimuthal order number. It may take on 2l+1 values from -l to +l. The number n is the radial order number. It means the wave with n zero crossings in radius. For spherically symmetric Earth the period for given n and l does not depend on m.
Some examples of spheroidal oscillations are the "breathing" mode 0S0, which involves an expansion and contraction of the whole Earth, and has a period of about 20 minutes; and the "rugby" mode 0S2, which involves expansions along two alternating directions, and has a period of about 54 minutes. The mode 0S1 does not exist because it would require a change in the center of gravity, which would require an external force.[2]
Of the fundamental toroidal modes, 0T1 represents changes in Earth's rotation rate; although this occurs, it is much too slow to be useful in seismology. The mode 0T2 describes a twisting of the northern and southern hemispheres relative to each other; it has a period of about 44 minutes.[2]
The first observations of free oscillations of the Earth were done during the great 1960 earthquake in Chile. Presently periods of thousands modes are known. These data are used for determining some large scale structures of the Earth interior.
P and S waves in Earth's mantle and core[edit]
When an earthquake occurs, seismographs near the epicenter are able to record both P and S waves, but those at a greater distance no longer detect the high frequencies of the first S wave. Since shear waves cannot pass through liquids, this phenomenon was original evidence for the now well-established observation that the Earth has a liquid outer core, as demonstrated by Richard Dixon Oldham. This kind of observation has also been used to argue, by seismic testing, that the Moon has a solid core, although recent geodetic studies suggest the core is still molten[citation needed].
Notation[edit]
Earthquake wave paths
The path that a wave takes between the focus and the observation point is often drawn as a ray diagram. An example of this is shown in a figure above. When reflections are taken into account there are an infinite number of paths that a wave can take. Each path is denoted by a set of letters that describe the trajectory and phase through the Earth. In general an upper case denotes a transmitted wave and a lower case denotes a reflected wave. The two exceptions to this seem to be "g" and "n".[8]
the wave reflects off the outer core
a wave that has been reflected off a discontinuity at depth d
a wave that only travels through the crust
a wave that reflects off the inner core
a P-wave in the inner core
a reflection off a discontinuity in the inner core
an S wave in the inner core
a P-wave in the outer core
a Love wave sometimes called LT-Wave (Both caps, while an Lt is different)
a wave that travels along the boundary between the crust and mantle
a P wave in the mantle
a P wave ascending to the surface from the focus
a Rayleigh wave
an S wave in the mantle
an S wave ascending to the surface from the focus
the wave reflects off the bottom of the ocean
No letter is used when the wave reflects off of the surfaces
ScP is a wave that begins traveling towards the center of the Earth as an S wave. Upon reaching the outer core the wave reflects as a P wave.
sPKIKP is a wave path that begins traveling towards the surface as an S-wave. At the surface it reflects as a P-wave. The P-wave then travels through the outer core, the inner core, the outer core, and the mantle.
Usefulness of P and S waves in locating an event[edit]
The Hypocenter/Epicenter of an earthquake is calculated by using the seismic data of that earthquake from at least three different locations.
In the case of local or nearby earthquakes, the difference in the arrival times of the P and S waves can be used to determine the distance to the event. In the case of earthquakes that have occurred at global distances, three or more geographically diverse observing stations (using a common clock) recording P-wave arrivals permits the computation of a unique time and location on the planet for the event. Typically, dozens or even hundreds of P-wave arrivals are used to calculate hypocenters. The misfit generated by a hypocenter calculation is known as "the residual". Residuals of 0.5 second or less are typical for distant events, residuals of 0.1-0.2 s typical for local events, meaning most reported P arrivals fit the computed hypocenter that well. Typically a location program will start by assuming the event occurred at a depth of about 33 km; then it minimizes the residual by adjusting depth. Most events occur at depths shallower than about 40 km, but some occur as deep as 700 km.
P- and S-waves sharing with the propagation
A quick way to determine the distance from a location to the origin of a seismic wave less than 200 km away is to take the difference in arrival time of the P wave and the S wave in seconds and multiply by 8 kilometers per second. Modern seismic arrays use more complicated earthquake location techniques.
At teleseismic distances, the first arriving P waves have necessarily travelled deep into the mantle, and perhaps have even refracted into the outer core of the planet, before travelling back up to the Earth's surface where the seismographic stations are located. The waves travel more quickly than if they had traveled in a straight line from the earthquake. This is due to the appreciably increased velocities within the planet, and is termed Huygens' Principle. Density in the planet increases with depth, which would slow the waves, but the modulus of the rock increases much more, so deeper means faster. Therefore, a longer route can take a shorter time.
The travel time must be calculated very accurately in order to compute a precise hypocenter. Since P waves move at many kilometers per second, being off on travel-time calculation by even a half second can mean an error of many kilometers in terms of distance. In practice, P arrivals from many stations are used and the errors cancel out, so the computed epicenter is likely to be quite accurate, on the order of 10–50 km or so around the world. Dense arrays of nearby sensors such as those that exist in California can provide accuracy of roughly a kilometer, and much greater accuracy is possible when timing is measured directly by cross-correlation of seismogram waveforms.
Helioseismology
Reflection seismology
^ G. R. Helffrich & B. J. Wood (2002). "The Earth's mantle" (PDF). Nature. Macmillan Magazines. 412 (2 August): 501; Figure 1. doi:10.1038/35087500. Archived (PDF) from the original on 24 August 2016. ^ a b c d Peter M. Shearer (2009). Introduction to Seismology. Cambridge University Press. ISBN 978-0-521-88210-1. (Also see errata)
^ Seth Stein; Michael Wysession (1 April 2009). An Introduction to Seismology, Earthquakes, and Earth Structure. John Wiley & Sons. ISBN 978-14443-1131-0. ^ Sammis CG, Henyey TL. Geophysics Field Measurements, page 12. Academic Press; 1987
^ Sheriff, R. E., Geldart, L. P. (1995). Exploration Seismology (2nd ed.). Cambridge University Press. p. 52. ISBN 0-521-46826-4. CS1 maint: Multiple names: authors list (link)
^ Schlumberger Oilfield Glossary. Stoneley wave.
^ Robert Stoneley, 1929 – 2008.. Obituary of his son with reference to discovery of Stoneley waves.
^ The notation is taken from Bullen, K.E.; Bolt, Bruce A. (1985). An introduction to the theory of seismology (4th ed.). Cambridge: Cambridge University Press. ISBN 978-0521283892. and Lee, William H.K.; Jennings, Paul; Kisslinger, Carl; et al., eds. (2002). International handbook of earthquake and engineering seismology. Amsterdam: Academic Press. ISBN 9780080489223. External links[edit]
The Wikibook Historical Geology has a page on the topic of: Seismic waves
EDT: A MATLAB Website for seismic wave propagation
Geophysicists
Geophysical fluid dynamics
Geodynamics
Mathematical geophysics
Mineral physics
Near-surface geophysics
Tectonophysics
Physical phenomena
Chandler wobble
Geothermal gradient
Gravity of Earth
Mantle convection
Precession of the equinoxes
Canadian Geophysical Union
Environmental and Engineering Geophysical Society
European Geosciences Union
International Association of Geomagnetism and Aeronomy
International Union of Geodesy and Geophysics
Royal Astronomical Society
Seismological Society of America
Retrieved from "https://en.wikipedia.org/w/index.php?title=Seismic_wave&oldid=766756153" Categories: SeismologyHidden categories: CS1 maint: Multiple names: authors listAll articles with unsourced statementsArticles with unsourced statements from January 2010 Navigation menu
العربيةAzərbaycancaБългарскиCatalàČeštinaDanskDeutschEestiEspañolEuskaraفارسیFrançaisGaeilgeGalego한국어ՀայերենHrvatskiBahasa IndonesiaItalianoעבריתქართულიLietuviųМакедонскиNederlands日本語Norsk bokmålNorsk nynorskOromooOʻzbekcha/ўзбекчаPolskiPortuguêsРусскийSlovenčinaСрпски / srpskiSrpskohrvatski / српскохрватскиSuomiSvenskaไทยTürkçeУкраїнськаاردوTiếng Việtייִדיש中文 Edit links This page was last modified on 21 February 2017, at 23:37. | 科技 |
2017-09/1579/en_head.json.gz/24357 | Continental Scientific Drilling Coordination Office for the Division of Earth Sciences
(CSDCO)
David .
dlambert@nsf.gov
The Division of Earth Sciences (EAR) at the National Science Foundation (NSF) requests proposals from interested groups to support current and future continental scientific drilling activities. We request proposals for the establishment of a Continental Scientific Drilling Coordination Office (CSDCO) that will help coordinate planning for continental scientific drilling projects, in collaboration with the Earth science community, and will have the capability to supply continental scientific drilling support and expertise for NSF-funded research. The CSDCO may work with the commercial drilling community or other drilling organizations to determine the type and availability of drilling services that may be available for NSF-funded scientific projects that require continental drilling. The CSDCO, if requested by the PI of the scientific project, may assist with the development of requests for bids to provide the drilling services. The CSDCO is expected to play a proactive role in the community to encourage innovation in drilling technologies and methods in response to community needs and will help guide the development of new drilling designs as requested by the research community. The successful proponent will be expected to manage drilling activities for the US scientific community, as needed. Requirements for drilling activities will be derived both from long-range science plans developed by the community as well as research proposals funded by NSF. We encourage collaborations with international partners such as the International Continental Scientific Drilling Program (ICDP) and with scientists funded by other sources. The CSDCO will be capable of assisting in the planning and execution of all aspects of the drilling activities that EAR supports. Interested parties must propose to this solicitation with a plan to provide for these services under a single award. The CSDCO award will be administered as a Cooperative Agreement with an anticipated duration of up to five-years beginning on October 1, 2013. A mid-term management review will be required, which will guide a decision to re-compete or renew the Cooperative Agreement for up to a further five-year period. REVISIONS AND UPDATES | 科技 |
2017-09/1579/en_head.json.gz/24373 | Red Hat Shifts Its Cloud Into High Gear
By Jeffrey Schwartz05/05/2011
Red Hat unveiled two new additions to its cloud service offerings at its Red Hat Summit-JBossWorld event, taking place this week in Boston. The company launched its new platform-as-a-service (PaaS) product called OpenShift, as well as its CloudForms infrastructure as a service (IaaS) software at the event. OpenShift could emerge as an open source alternative to Microsoft's Windows Azure PaaS-based cloud service. OpenShift supports Red Hat's JBoss middleware, and is designed for four languages: Java, Ruby, PHP and Python. For storage, it supports SQL and NoSQL data stores, as well as a distributed file system.
"That gives us one of the broadest platform as a service offerings out there," said Matt Hicks, Red Hat's cloud computing expert, speaking during a press briefing at the conference. "Really, when you talk about languages and persistence, it's at a low level. We wanted to incorporate all of the frameworks that go along with that. We want to make sure this stuff looks familiar to developers because we don't want them to have to rewrite their code just to move to the cloud."
OpenShift comes in three versions, according to a company FAQ:
Express, aimed at Ruby, PHP and Python applications delivered in a shared-hosting environment, is free.
Flex, intended for Java Enterprise Edition and PHP apps, can be deployed on JBoss or Tomcat and is intended for developers who want more control than they would get with the Express version.
Power offers developers the most control at the operating system configuration level. "Power gives you complete control down to the root system level, in the cloud, of your application's configuration," said Issac Roth, Red Hat's PaaS master. Express and Flex are in beta now, but Power is not yet available for testing.
"We are offering levels going from fully automated [but] not as much control, to lots of control with less automation, for the different kinds of developers and the different kinds of applications that are appropriate to the cloud," Roth said. The company has not announced pricing or availability for OpenShift. Meanwhile, Red Hat also took the wraps off its CloudForms application lifecycle management software that lets organizations build private and hybrid IaaS clouds using Red Hat's JBoss. CloudForms supports automation and offers application management tools aimed at bringing sophisticated apps to the cloud. It includes infrastructure services aimed at helping organizations build up capabilities. Those capabilities include making data persistent in a cloud or transferring data in and out of a cloud using the middleware's messaging infrastructure, according to Bryan Che, Red Hat's senior director of product management and marketing.
"CloudForms fundamentally revolutionizes infrastructure as a service by introducing so many more capabilities that enterprises need," Che said. "For example, because Red Hat supports the Deltacloud API as part of our compute resource management, we will be able to support resource management across every part of your IT infrastructure, whether it's physical systems, your choice of virtualization technology and your choice of public cloud providers. So you can bring the benefits in abstraction and automation that [the] cloud provides across your entire IT organization with your choice of providers and vendors."
Available in beta now, CloudForms is scheduled for release this fall.
Jeffrey Schwartz is editor of Redmond magazine and also covers cloud computing for Virtualization Review's Cloud Report. In addition, he writes the Channeling the Cloud column for Redmond Channel Partner. Follow him on Twitter @JeffreySchwartz. | 科技 |
2017-09/1579/en_head.json.gz/24418 | Alion wins $10M nanotechnology contract with Air Force
By Mark HooverAug 04, 2014
Alion Science and Technology has won a $10 million contract to provide the Air Force with nanotechnology development and technology transfer.
The contract is known as the AMMTIAC contract, which provides technical, engineering and expertise in nanotechnology applications in materials, manufacturing and testing of interest to the military, the Defense Department said in a release.
Work will be performed in Rome, New York, N.Y., and Crane, Ind., and is expected to be completed December 2016.
Mark Hoover is a senior staff writer with Washington Technology. You can contact him at mhoover@washingtontechnology.com, or connect with him on Twitter at @mhooverWT. | 科技 |
2017-09/1579/en_head.json.gz/24480 | Elon Musk: I'm Afraid of a 'Terminator'-Like Catastrophe
Max Eddy
Evan Dashevsky
Everything You Can Do With Amazon Alexa
Elon Musk has a vested interest in artificial intelligence -- but not for reasons you’d expect.
As an investor in AI ventures DeepMind (prior to its acquisition by Google in January) and Vicarious, Musk explained that “it’s not from the standpoint of actually trying to make any investment return. I like to just keep an eye on what’s going on.”
In an interview yesterday with CNBC, Musk described his paranoia of a Terminator-like future of robots gone awry. “Nobody expects the Spanish Inquisition,” he said. “But you have to be careful.”
Related: Elon Musk: 'Maybe We'll Make a Flying Car, Just For Fun'
The irony here, noted CNBC’s Kelly Evans, is that Musk is the driving force behind both Tesla and Space X -- two enterprises whose very mission is to push the boundaries of technology as we know it.
If robots were to rise up against humanity, asked Evans, what could be done? Given Musk’s expertise in the realm of space travel -- “the first people could be taken to Mars in 10, 12 years,” he said -- she suggested an “escape to Mars if there is no other option.”
“The AI will chase us there pretty quickly,” he responded.
Related: No Sci-Fi Here: Your Own Personal Robot Is Coming | 科技 |
2017-09/1579/en_head.json.gz/24499 | Honors Bio ch7
HrWasp
Cellular Respiration
The process by which cells produce energy from carbohydrates; atmospheric oxygen combines with glucose to form water and carbon dioxide.
Pyruvic Acid
The three-carbon compound that is produced during glycolysis and needed for both the aerobic and anaerobic pathways of cellular respiration that follow glycolysis.
The reduced form of NAD+; an electron-carrying molecule that functions in cellular respiration.
Anaerobic Describes a process that does not require oxygen.
The process in which pyruvic acid is broken down and NADH is used to make a large amount of ATP; the part of respiration that is carried out in the presence of oxygen.
The anaerobic breakdown of glucose into pyruvic acid, which makes a small amount of energy available to cells in the form of ATP.
NAD+
Abbreviation for nicotinamide adenine dinucleotide, a coenzyme involved in redox reactions.
The breakdown of carbohydrates by enzymes, bacteria, yeasts, or mold in the absence of oxygen.
Lactic Acid Fermentation
The chemical breakdown of carbohydrates that produces lactic acid as the main end product.
Alcoholic Fermentation
The anaerobic process by which yeasts and other microorganisms break down sugars to form carbon dioxide and ethanol.
A unit of energy equal to 1,000 cal.
Mitochondrial Matrix
The fluid that is inside the inner membrane of a mitochondrion.
Acetyl-CoA
Acetyl coenzyme A, a compound that is synthesized by cells and that plays a major role in metabolism.
Krebs Cycle
A series of biochemical reactions that convert pyruvic acid into carbon dioxide and water; it is the major pathway of oxidation in animal, bacterial, and plant cells, and it releases energy.
Oxaloacetic Acid
A four-carbon compound of Kreb's cycle that combines with acetyl CoA to form citric acid.
A six-carbon compound formed in the Krebs cycle.
Flavin adenine dinucleotide, a compound that acts as a hydrogen acceptor in dehydrogenation reactions.
Tags: Biology chapter definitions Folders: Description: Honors Bio ch7 vocab | 科技 |
2017-09/1579/en_head.json.gz/24507 | UCI Student Accidently Creates A Rechargeable Battery That Lasts 400 Years by
Share September 13, 2016
There’s an old saying that luck happens when preparation meets opportunity. There’s no better example of that than a recent discovery at the University of California, Irvine by doctoral student Mya Le Thai. After playing around in the lab she made a discovery that could lead to a rechargeable battery that lasts up to 400 years. That means longer-lasting laptops and smartphones and fewer lithium ion batteries piling up in landfills.
A team of researchers at UCI had been experimenting with nanowires for potential use in batteries, but found that over time the thin, fragile wires would break down and crack after too many charging cycles. A charge cycle is when a battery goes from completely full to completely empty and back to full again. But one day, on a whim, Thai coated a set of gold nanowires in manganese dioxide and a Plexiglas-like electrolyte gel. “She started to cycle these gel capacitors, and that’s when we got the surprise,” said Reginald Penner, chair of the university’s chemistry department. “She said, ‘this thing has been cycling 10,000 cycles and it’s still going.’ She came back a few days later and said ‘it’s been cycling for 30,000 cycles.’ That kept going on for a month.”
Thai’s discovery is mind blowing because the average laptop battery lasts 300 to 500 charge cycles. The nanobattery developed at UCI made it though 200,000 cycles in three months. That would extend the life of the average laptop battery by about 400 years. The rest of the device would have probably gone kaput decades before the battery, but the implications for a battery that that lasts hundreds of years are pretty startling. “The big picture is that there may be a very simple way to stabilize nanowires of the type that we studied,” Penner said. “If this turns out to be generally true, it would be a great advance for the community.” Not bad for just fooling around in the laboratory. Recently on GOOD
Share this UCI Student Accidently Creates A Rechargeable Battery That Lasts 400 Years Recent | 科技 |
2017-09/1579/en_head.json.gz/24539 | Inventing Entertainment: The Early Motion Pictures and Sound Recordings of the Edison Companies >
Inventing Entertainment: The Early Motion Pictures and Sound Recordings of the Edison Companies
Edison kinetoscopic record of a sneeze, January 7, 1894
The stenographer's friend, or, What was accomplished by an ...
Uncle Josh at the moving picture show
This site features 341 motion pictures, 81 disc sound recordings, and other related materials, such as photographs and original magazine articles. Cylinder sound recordings will be added to this site in the near future. In addition, histories are given of Edison's involvement with motion pictures and sound recordings, as well as a special page focusing on the life of the great inventor. Prolific inventor Thomas Alva Edison (1847-1931) has had a profound impact on modern life. In his lifetime, the "Wizard of Menlo Park" patented 1,093 inventions, including the phonograph, the kinetograph (a motion picture camera), and the kinetoscope (a motion picture viewer). Edison managed to become not only a renowned inventor, but also a prominent manufacturer and businessman through the merchandising of his inventions. The collections in the Library of Congress's Motion Picture, Broadcasting and Recorded Sound Division contain an extraordinary range of the surviving products of Edison's entertainment inventions and industries. The Paper Print Film Collection at the Library of Congress
Most of the films from the New York, President McKinley, and the Pan-American Exposition, Westinghouse Works, 18 San Francisco, Variety Stage, Spanish-American War, and Edison presentations, are from the Paper Print Collection of the Library of Congress Motion Picture, Broadcasting, and Recorded Sound Division. Because the copyright law did not cover motion pictures until 1912, early film producers who desired protection for their work sent paper contact prints of their motion pictures to the U.S. Copyright Office at the Library of Congress. These paper prints were made using light-sensitive paper the same width and length as the film itself, and developed as though a still photograph. Some motion picture companies, such as the Edison Company and the Biograph Company, submitted entire motion pictures--frame by frame--as paper prints. Other producers submitted only illustrative sequences.
The Paper Print Collection contains more than 3,000 motion pictures. Most are American but many are from England, France, and Denmark. The extreme scarcity of early motion pictures makes these paper prints particularly valuable. In most instances they remain the only record of early films, providing a rare insight into America at the start of the twentieth century and the beginnings of the motion picture industry in America.
Rights and Access | 科技 |
2017-09/1579/en_head.json.gz/24540 | MacArthur Awards $5.5 Million to Support Conservation Policy Research and Analysis
The MacArthur Foundation, which has supported conservation efforts around the world for more than 25 years, today announced three grants totaling $5.5 million for global policy research and analysis projects to help inform and respond to increased pressures from development and climate change. Carnegie Institution for Science (Washington, DC) will receive $2.8 million to map and monitor watersheds using high-resolution remote measurement and modeling methods. The project will provide data in support of international climate change mitigation efforts and biodiversity conservation agreements.
World Wildlife Fund (Washington, DC) will receive $1.5 million to help change the policies and practices of 100 companies that buy and sell 25 percent of the 15 commodities with the most significant impact on high biodiversity landscapes. If these companies commit to sustainability, global markets may shift to protect the planet human consumption has already outgrown.
United Nations Environment Program World Conservation Monitoring Centre (Cambridge, United Kingdom) will receive $1.2 million to analyze current and projected impacts of major commodities (agricultural, timber, minerals, gas, and oil) on ecosystems and biodiversity in the Great Lakes of East and Central Africa, the Greater Mekong Headwaters, and the Watersheds of the Andes.
The Foundation’s conservation policy grantmaking targets biodiversity conservation at the global scale and reinforces the priorities of MacArthur’s regional work, with a focus on four issues: climate change mitigation and adaptation; understanding and influencing China’s consumption patterns and use of natural resources, particularly in Africa, Latin America, Asia, and the Pacific; integrating environmental and social considerations into commodities markets (for example carbon, timber, oil palm, cotton, and soy); and responding to the overexploitation and illegal use of marine fisheries.
“Meeting the resource needs of an ever-increasing human population will place very significant pressure on the planet’s most biodiverse areas,” said Jorgen Thomsen, MacArthur's Director of Conservation and Sustainable Development. “We must consider and address the significant role of water provisioning, agriculture, fisheries, energy development, and climate change on biodiversity so that we are able to provide these vital resources for future generations.”
MacArthur was the first major private foundation to adopt biodiversity conservation as a core component of its international grantmaking. In March 2011, the Foundation announced a $176 million, ten-year commitment to conservation and sustainable development and a new, broader strategy that builds on the Foundation's historic focus on preserving biodiversity to guide its grantmaking over the next decade. More information is at www.macfound.org/conservation.
Grantee Profile:
Carnegie Institution of Washington, Department of Global Ecology, United Nations Environment Programme, World Conservation Monitoring Centre, World Wildlife Fund
Conservation & Sustainable Development, Conservation
MacArthur's conservation grantmaking aims to preserve ecosystems and species and to promote development that respects the environment.
February 13, 2017 - Publication
High-tech Maps of Forest Diversity Identify New Conservation Targets
Remote sensing maps of the forest canopy in Peru are testing the strength of current forest protections and identifying new regions for conservation efforts, according to a report by the Carnegie Airborne Observatory. Read More
January 25, 2017 - From The Field
Economic Tools for Conservation
A two-week course at the University of California Berkeley by the Conservation Strategy Fund will focus on how to use economics to be more strategic and successful in conservation work, and transform how environmental issues are ... Read More
November 30, 2016 - From The Field
Conference to Examine Conservation and Development in African Great Lakes
The African Great Lakes Conference aims to bring together a diverse group of stakeholders to focus on sustainable development, human and environmental health, and biodiversity conservation in the African Great Lakes region. Read More
November 30, 2016 - Article
Video: Biodiversity and Climate Change in the Tropical Andes
Scientists, policymakers, community representatives, and civil society work together under a MacArthur-supported project managed by the Inter-American Institute for Global Change Research to understand the impacts of climate change on biodiversity in the Tropical Andes. Read More | 科技 |
2017-09/1579/en_head.json.gz/24596 | Jeff Bezos builds space ports, not relationships
Texan community in collective huff
Locals in Van Horn, West Texas, have been unsettled by the arrival in Culberson County of Blue Origin, the space tourism company owned by Amazon-founder Jeff Bezos.
According to the Wall Street Journal, the billionaire started buying up land in the area three years ago so he could build the first commercial space port. Construction work at the site began in May this year. Locals say they have seen up to 40 trucks in neighbouring Hudspeth County, setting up power lines to the property.
However, he has not made a particular effort to be a friendly neighbour, a terrible social faux pas in the ever-polite south.
The paper outlines how ranch owners were repeatedly approached by lawyers acting for firms called James Cook LP, Jolliet Holdings, Coronado Ventures, and Cabot Enterprises, all businesses using the same forwarding address, care of Zefram* LLC.
After making offers the ranch owners couldn't refuse, Bezos acquired the land deeds to several adjoining farms with a total land area of almost 300,000 acres. Former ranch owner Ronald Stasny told the paper that the offers increased until they were too rich to turn down, although he demurred when asked exactly what he had been paid for his land, explaining that he had signed a confidentiality agreement.
However, since buying up the land, Bezos has gone silent, rebuffing all attempts at contact from the locals, and requesting confidentiality agreements be signed by those with whom he does business.
One neighbour who had a property-line dispute with Bezos says he was impossible to contact. Another, who wanted to discuss a local disagreement about water rights with Bezos, told the paper: "You've got to talk to three colonels and generals and all that stuff [to get to him]".
The locals are frustrated because they want to use Bezos' presence in the area to drum up interest in their very small (under 3,000 people) town. But Bezos, renowned for being publicity shy, was probably attracted to the area precisely because it is so small that it has no cinema and failed to attract a prison.
The company has already come under fire from environmentalists who argue that the development of the space port will have an unacceptable impact on local wildlife.
Some lizards and birds could be disturbed by the construction project, and an environmental impact assessment warns that "small numbers of less-mobile, burrow-dwelling animals (gophers, chipmunks) inhabiting the construction area could be displaced by construction activity or killed if burrows are filled, crushed, or paved".
*Yes, very geeky, isn't it? But as the editor said to this reporter: "Well, you recognised the name, didn't you?". ® | 科技 |
2017-09/1579/en_head.json.gz/24598 | Red Hat Vice President Of Global Support Services To Deliver Opening Keynote At TSANet Event
Red Hat, Inc. (NYSE: RHT), the world's leading provider of open source solutions, today announced that its vice president of Global Support Services, Marco Bill-Peter, will be delivering the opening keynote at the...
Red Hat, Inc. (NYSE: RHT), the world's leading provider of open source solutions, today announced that its vice president of Global Support Services, Marco Bill-Peter, will be delivering the opening keynote at the TSANet 20 th Anniversary event. Established in 1993, TSANet is a vendor-neutral global support alliance where companies work together to support mutual customers more effectively. TSANet will host an event Sept. 10-12, 2013 in Campbell, Calif. to bring together TSANet and industry-leading members for an event about the future of multi-vendor technical support. The event will include technical training and certification as well as best practices round table discussions. It will also celebrate TSANet’s 20 th anniversary of supporting the people who support technology. Bill-Peter is set to speak about exceeding customer expectations with multi-vendor support. At Red Hat, Bill-Peter has served as the vice president of Red Hat Global Support Services since 2008. Global Support Services, accountable for earning subscriber loyalty, is responsible for support delivery to Red Hat’s clients and partners across all product lines, including Red Hat Enterprise Linux, JBoss Middleware, Red Hat Storage, and Red Hat’s cloud offerings. Under Bill-Peter’s leadership, Global Support Services has led a transition from a focus on transactional support to one on sustained customer relationships - expanding the role of support from reactive interactions to include high-value, proactive, and collaborative customer engagement. Bill-Peter has more than 20 years of experience in the information technology and support delivery fields. Recently, TSANet announced the election of executive officers for the period of July 2013 to July 2014. Ted Williams, support programs manager for Partnerships at Red Hat, was named Chairperson. WHAT: Red Hat’s vice president of Global Support Services, Marco Bill-Peter, to deliver opening keynote WHERE: Pruneyard Plaza, DoubleTree by Hilton Hotel, 1995 S. Bascom Ave., Campbell, Calif. WHEN: Tuesday, Sept. 10, 2013 at 9:15 a.m. ET For more information on TSANet, visit http://www.tsanet.org/20years. Additional Resources Red Hat Support on Twitter Red Hat Support on Facebook Red Hat Support on Google+ Connect with Red Hat Learn more about Red Hat Get more Red Hat news or subscribe to the Red Hat news RSS feed Follow Red Hat on Twitter Join Red Hat on Facebook Watch Red Hat videos on YouTube Join Red Hat on Google+ About Red Hat, Inc. Red Hat is the world's leading provider of open source software solutions, using a community-powered approach to reliable and high-performing cloud, Linux, middleware, storage and virtualization technologies. Red Hat also offers award-winning support, training, and consulting services. As the connective hub in a global network of enterprises, partners, and open source communities, Red Hat helps create relevant, innovative technologies that liberate resources for growth and prepare customers for the future of IT. Learn more at http://www.redhat.com. Forward-Looking Statements Certain statements contained in this press release may constitute "forward-looking statements" within the meaning of the Private Securities Litigation Reform Act of 1995. Forward-looking statements provide current expectations of future events based on certain assumptions and include any statement that does not directly relate to any historical or current fact. 
Actual results may differ materially from those indicated by such forward-looking statements as a result of various important factors, including: risks related to delays or reductions in information technology spending; the effects of industry consolidation; the ability of the Company to compete effectively; the integration of acquisitions and the ability to market successfully acquired technologies and products; uncertainty and adverse results in litigation and related settlements; the inability to adequately protect Company intellectual property and the potential for infringement or breach of license claims of or relating to third party intellectual property; the ability to deliver and stimulate demand for new products and technological innovations on a timely basis; risks related to data and information security vulnerabilities; ineffective management of, and control over, the Company's growth and international operations; fluctuations in exchange rates; and changes in and a dependence on key personnel, as well as other factors contained in our most recent Quarterly Report on Form 10-Q (copies of which may be accessed through the Securities and Exchange Commission's website at http://www.sec.gov), including those found therein under the captions "Risk Factors" and "Management's Discussion and Analysis of Financial Condition and Results of Operations." In addition to these factors, actual future performance, outcomes, and results may differ materially because of more general factors including (without limitation) general industry and market conditions and growth rates, economic and political conditions, governmental and public policy changes and the impact of natural disasters such as earthquakes and floods. The forward-looking statements included in this press release represent the Company's views as of the date of this press release and these views could change. However, while the Company may elect to update these forward-looking statements at some point in the future, the Company specifically disclaims any obligation to do so. These forward-looking statements should not be relied upon as representing the Company's views as of any date subsequent to the date of this press release. Red Hat, the Shadowman logo and JBoss are registered trademarks of Red Hat, Inc. in the U.S. and other countries. Linux is a registered trademark of Linus Torvalds. Copyright Business Wire 2010
Facebook, General Motors, Ford: Jim Cramer's Views
Cramer shares his views on the bull market, and also points out something you may have missed.
Cramer: Skies Brightening for Cloud Group
The cloud players haven't done much for a while, but they appear ready for a big move, which is a big deal.
Cramer: This Bull Market Is Still Charging (Hard)
Consider how many post-election predictions of disaster have not happened.
Mizuho Analysts Share Their Top Stock Picks for 2017
Apple, Energen and McKesson are among the top stock picks of Mizuho's analysts for 2017.
Rhonda Schaffler | 科技 |
2017-09/1579/en_head.json.gz/24622 | Maxtor Names Park CEO
Maxtor Corp. named Chairman C.S. Park chief executive after Paul Tufano resigned as president, chief executive, acting chief financial officer and a director. In addition, the Milpitas, Calif., hard-drive maker named Charles M. Boesenberg lead director. Mr. Boesenberg is chairman and chief executive of Santa Clara, Calif.-based computer-infrastructure-management software provider NetIQ Corp. and has been a member of Maxtor's board since 2003. Maxtor is continuing its search for a finance chief after Chief Financial Officer Michael Bless resigned unexpectedly for personal reasons, and the company named Mr. Tufano... | 科技 |
2017-09/1579/en_head.json.gz/24662 | Making a statement to Save the Bay
On the 25th anniversary of the Chesapeake Bay restoration effort, The Baltimore Sun reported that a group of over a dozen of scientists and activists have released a statement to the Environmental Protection Agency’s Chesapeake Bay Program calling for a more aggressive commitment to cleaning up the bay. Sun reporter, Tim Wheeler, has also blogged about this subject showing a dramatic image illustrating the poor health of the Bay. This plea for better tactics and enforceable measures is not the first, and certainly won’t be the last one presented to officials charged with bay restoration. Officials at the National Aquarium are standing in line with all of the Chesapeake Bay advocates encouraging mandatory, enforceable measures put in place in the areas of agriculture, zoning, development, wetland restoration, the list goes on. The Aquarium’s conservation team and volunteers spend endless hours each year restoring wetlands in and around Maryland and educating visitors on watershed health. And there are countless organizations leading their own charges, doing their part to “Save the Bay”. The message has been made clear. Voluntary efforts to restore the bay have not succeeded. The bay's importance to the 15 million people whose waters drain to it, from Washington, D.C., Virginia, West Virginia, Pennsylvania, Delaware, and as far north as upstate New York, cannot be overstated. We now know that better results over the next 25 years will only be seen through the creation of consistent, mandatory practices.
baltimore-sun,
chesapeake-bay,
conservation-events,
evironmental-protection-agency
A New Vision for Animal Care and Rescue
We’re building a new Animal Care and Rescue Center in Baltimore’s Jonestown neighborhood.
2016 Recap: Babies!
Each year we welcome many babies into our Aquarium family and 2016 was no exception! Take a look back on some of the highlights from the past year.
Get More Facts About Climate Change Published February 17, 2017
Get the Facts About Climate Change | 科技 |
2017-09/1579/en_head.json.gz/24712 | Heckman joins Zazzle as CSO
By John Cook on October 29, 2007 at 1:08 AM
Jim Heckman, the Scout.com and Rivals.com founder who most recently served as chief strategy officer of Fox Interactive Media, has joined Zazzle as chief strategy officer and board adviser. Jim Heckman, left, with Ross Levinsohn
The Bainbridge Island resident will oversee acquisitions, business development, marketing and sales for Zazzle, a heavily funded Redwood City, Calif., startup that allows consumers to create customized postcards, T-shirts, calendars, hats and other items. Heckman was introduced to Zazzle and its board member, venture capitalist John Doerr, last year as he helped seal a mammoth advertising partnership between Fox Interactive and Google. While at Fox, Heckman said he had the chance to review hundreds of Internet companies and none was better positioned than Zazzle. Here’s a profile from last year that I did on Heckman, who has certainly seen some ups (the sale of Scout.com to News Corp/Fox) and downs (football recruiting scandals and the demise of Rivals.com ) over the years.
In that story, Heckman’s friends predicted that he wouldn’t last long at a big corporate empire like Fox, something that may have accelerated once his boss, Ross Levinsohn, stepped down last November.
Nonetheless, Heckman never is too far away from entrepreneurial endeavors. It’s something that just seems to run in his veins. john cook View Comments Blog Search | 科技 |
2017-09/1579/en_head.json.gz/24728 | Home / Sony PlayStation 4
/ Sony PlayStation 4
Sony Corporation today announced that it has sold more than 2.1 million units PlayStation 4 through only two weeks, bringing the total sold to the company's total to more than 4.2 million device. The number of unsold games on the Playstation 4 crossed the 9.7 million game. The beginning is clearly better than the Xbox One.On the other hand, Sony will launch the company will soon have limited Playstation Now service, which will be broadcast through the games on any device from smart devices. Where players will be able to play any game they want through broadcasting services through the clouds and play regardless of the type of device they own. As we have said now this service will be launched in a limited way and in the test phase | 科技 |
2017-09/1579/en_head.json.gz/24765 | Home Supporting Mayotte’s Biodiversity Strategy
Mon, 10 Jun 2013 Since June 2012, IUCN French National Committee has worked to help develop a Biodiversity Strategy for Mayotte. The process has involved the participation of numerous stakeholders and the organisation of several workshops to map out how the island’s rich biodiversity can be better protected. Photo: Isirus
Photo: A.Bocquet
Photo: IUCN French Committee
Like many small island states, Mayotte’s terrestrial and marine environment are vulnerable to numerous pressures from pollution, habitat destruction, invasive alien species and the consequences of climate change. The island also has a high population density which has brought considerable development pressures on biodiversity.
In order to ensure that future development of the island can be decoupled from further loss of biodiversity, IUCN French Committee has worked over the past year to develop a Biodiversity Strategy for Mayotte. After an initial period of review, IUCN picked out some key issues to be considered as part of the strategy, including: the need for all future policy developments to consider the state of the island’s biodiversity; the need to support biodiversity in key sectors such as agriculture, fisheries and tourism; the need for improved management tools such as protected areas; and the need for improving research and sharing of best practice.
Over the past year IUCN Committee has organised seven workshops and involved almost 100 different stakeholders in the development of the strategy, and in April this year a seminar was held to finalise the strategy and involve representatives from the island’s government. The final product was welcomed by the Mayotte government, and IUCN Committee will work to ensure the strategy is adequately implemented and that funding mechanisms are in place to support small scale conservation projects.
By developing its Biodiversity Strategy, Mayotte has become a pilot for French overseas departments both in that it has brought an example for the implementation of the national French biodiversity strategy and in that it has translated France’s international biodiversity commitments into practice.
Mayotte is also a good example for future voluntary schemes on biodiversity for Europe’s overseas. The French Development Agency (AFD), one of the framework partner of IUCN at international level, is committed to supporting the EU BEST initiative and is contributing to financing IUCN actions to that end at European level as well as specifically in Mayotte. AFD has had a long lasting local presence in Mayotte, like in each French overseas territory. The cooperation on the BEST initiative is particularly timely since Mayotte is in the run to become a new European Outermost Region (January 2014).
Such actions as Mayotte's Biodiversity Strategy give a particular political signal to the European Institutions when it comes to supporting environmental actions in the EU overseas. They highlight the fact that, with Europe's overseas, the EU has key allies for preserving its own biodiversity and important components of global biodiversity hotspots.Work area: MarineLocation: EuropeEast and Southern AfricaEuropean UnionOverseasMayotte (France) | 科技 |
2017-09/1579/en_head.json.gz/24812 | Home › (Green) Economy
Business and Biodiversity: A Licence to Operate
Business wants access to resources, capital and markets, and a seat at the global policy development table in order ensure it has a licence to operate. At a time of growing concern about pressure on natural resources and the need for sustainability, business also has to talk about biodiversity and sustainable development as a means to secure its business targets. But its motives, influence and outcomes in terms of biodiversity conservation, sustainable use and equitable benefits need to be assessed. Before COP12, steps should be taken to reduce the direct and indirect influence of business on biodiversity decisions in order to assert the primacy of biodiversity as part of our global commons, to be governed by the CBD, not the corporate sector.
At COP 11, business was omnipresent. There were more than 70 events described as ‘business-related’ around COP11 in Hyderabad. It is worth looking a little more closely at the groupings involved. For example, The Economics of Ecosystems and Biodiversity (TEEB), originally commissioned by the G8 +5 was linked to several of them. The TEEB for Business Coalition has powerful founder members including a UK accountancy institute, large conservation organisations and the World Business Council for Sustainable Development (WBCSD), which was involved in 3 side events. This is a very large group whose emergence dates back to the Earth Summit of 1992. Top business clusters within WBCSD include 23 utilities and power companies, 17 oil and gas, 17 engineering, 17 chemical companies, 13 consumer goods, 13 cement, 12 mining and 11 tyre companies.
Biodiversity should not be expected to earn its living in the market
On May 2nd 2012 a paper appeared in Nature entitled: A global synthesis reveals biodiversity
loss as a major driver of ecosystem change. It analyses existing data to show that biodiversity loss and extinctions are altering processes fundamental to ecosystem functioning and resilience, with major implications for us all. This is not a new message, but one that has constantly been ignored.
Are we continuing to fuel Biodiversity Loss?
Biofuels, Bioenergy, Biochar and the Technologies of the new Bioeconomy
Industrial scale bioenergies, including biofuels are rapidly expanding, creating massive new demand for wood, vegetable oil and agricultural products. Already these demands are inflicting serious and irreversible impacts on forests and other natural ecosystems, soils and water resources. Expansion of industrial monocultures, including tree plantations, to meet this demand occurs at the expense of biodiversity and food production, while also contributing to “land grabs”, undermining the rights of peasant farmers and indigenous peoples, and hampering efforts to achieve food sovereignty and agrarian reform.
The CBD Secretariat's report rightly acknowledges many of these negative impacts. However, in line with COP10 decision X/37, it focuses predominantly on 'tools', i.e. standards and certification, to address the often complex direct and indirect negative impacts, without assessing whether those tools are credible instruments.
Standards and certification schemes per se have not been effective and are no match for countering the drivers of bioenergy expansion: targets, mandates and subsidies, especially in Europe and North America. To effectively address the negative impacts, those incentives need to be eliminated.
Information concerning Innovative Financial Mechanisms (IFMs): Offset Programmes
A submission to the CBD Secretariat concerning decision X/3
With Decision X/3, A, paragraph 8(c) "invites parties, relevant organisations and initiatives [...] to submit information concerning innovative financial mechanisms that have potential to generate new and additional financial resources as well as possible problems that could undermine achievement of the Convention's three objectives [...]".
This submission focuses on experiences with offset programmes, showing examples and concerns that have arisen from them and that are relevant to ideas of developing biodiversity offset systems or similar mechanisms.
Green economy and biofuels: what did the CBD say?
With next year’s Rio+20 Earth Summit due to meet in the ‘biofuel republic of Brazil’ it is little wonder that the fights over agrofuels will be intensifying in the years ahead. UNEP’s flagship ‘Green Economy’ study published last month appears to bless a massive expansion of agrofuel – advocating for over a fifth (21.6%) of all liquid fuels to be bio‐based by 2050. Sourcing all that biological feedstock is a feat that even UNEP admit will gobble up over a third (37%) of global agricultural and forest ‘residues’ – a hefty take from already overstressed ecosystems.
- March 2011
Finance, targets, green economy and innovative financial mechanisms
Discussions on funding, financial targets and innovative financial mechanisms were extremely difficult during COP10 in Nagoya in October 2010 and clearly revealed the divide between North and South. They also reflect a wider struggle going on over the effectiveness and implications of market-oriented approaches to the three Rio Conventions, including biodiversity conservation. This struggle is going to be central for "Rio+20", the 2012 United Nations Conference on Sustainable Development, where 'green economy' is one of the two main topics on the agenda.
Rio+20, the “Green Economy” and the Real Priorities
At the recent preparatory conference for Rio+20 in New York (7-8th March) it became clear that the “green economy” concept is complicating an already difficult process. Definitions of sustainable development have been argued over for years; now we are invited instead to see everything in terms of a “green economy”. UNEP, which produced its massive economics-dominated report shortly before the prepcom/conference, defines the “green economy” as one that results in improved human wellbeing and social equity, while significantly reducing environmental risks and ecological scarcities. Thus the biodiversity and ecosystem resilience on which we all depend are reduced to “risks” and “scarcities”. Even though it is clear that the “green economy” means very different things to different interests, many parties simply parroted the phrase over and over; Bolivia was one of the few that commented critically, noting that there is not a shared vision of what the "green economy" might be.
Carbon markets – A distraction from the real priority: immediate emission reductions
In discussions about climate, market interests are of course focused on finance and how the market can participate. In this context, market interests include not just carbon markets but also the land and commodity markets and the mining, timber and paper industries that hope to profit from offsets. There is a real risk that their increased participation could give market mechanisms, traders and investors more power over development, and also over land, than developing countries and their peoples. Before they will commit, market players want incentives to invest, voluntary standards, enhanced returns, reduced risk and guarantees against failure to deliver. Private investors want to greatly expand the carbon markets, where money can be made in the short term, in order to attract traders. They hope to gain from multiple market devices linked to claimed carbon sequestration or emission reductions. This briefing raises some of the issues that must be considered, especially by developing countries and their peoples.
The carbon market dream: millions of offsets from land-use “sinks”
Carbon traders and high emitting Parties would like all land-use to count as carbon sinks to offset sources, delay reducing emissions and make money for carbon markets. There is more than one route to this goal: REDD++ could be one way, and CDM in LULUCF is another, as we shall see. Parties could also be enabled to use every current and future market-based mechanism to meet their reduction commitments. This briefing provides background to these key issues for Cancun.
Carbon - The New Cash Crop
Following Copenhagen the message is clear: if we do not act swiftly, industrial agriculture could soon claim large rewards from carbon trading by being recognized as a carbon sink. We know that climate change has the potential to irreversibly damage the natural resource base on which agriculture depends. But we also know that industrial agriculture is a major cause of climate change, so how can rewarding it with carbon credits help reduce its climate impacts?
The Land Magazine: http://www.thelandmagazine.org.uk/
Press Releases 2007
eso0756 — Organisation Release
31 December 2007: A Happy New Year!
eso0755 — Photo Release
Anatomy of a Bird
21 December 2007: Using ESO's Very Large Telescope, an international team of astronomers [1] has discovered a stunning, rare case of a triple merger of galaxies. This system, which astronomers have dubbed 'The Bird' - although it also bears a resemblance to a cosmic Tinker Bell - is composed of two massive spiral galaxies and a third irregular galaxy.
2009 to be the International Year of Astronomy
20 December 2007: Yesterday, the 62nd General Assembly of the United Nations proclaimed 2009 the International Year of Astronomy, with the aim of increasing awareness among the public of the importance of astronomical sciences and of promoting widespread access to new knowledge and experiences of astronomical observation.
eso0753 — Science Release
Speedy Mic's Photograph
19 December 2007: Using observations from ESO's VLT, astronomers were able for the first time to reconstruct the site of a flare on a solar-like star located 150 light years away. The study of this young star, nicknamed 'Speedy Mic' because of its fast rotation, will help scientists better understand the youth of our Sun.
Discovering Teenage Galaxies
28 November 2007: Staring for the equivalent of every night for two weeks at the same little patch of sky with ESO's Very Large Telescope, an international team of astronomers has found the extremely faint light from teenage galaxies billions of light years away. These galaxies, which the research team believes are the building blocks of normal galaxies like our Milky Way, had eluded detection for three decades, despite intensive searches.
ESO Helps Antofagasta Region after the Earthquake
23 November 2007: On November 14 at 12:41 local time, a major earthquake with magnitude 7.7 on the Richter scale affected the north of Chile. The epicentre was located 35 km from the city of Tocopilla and 170 km from Antofagasta.
Close to the Sky
22 November 2007: Today, a new ALMA outreach and educational book was publicly presented to city officials of San Pedro de Atacama in Chile, as part of the celebrations of the anniversary of the Andean village.
A Galaxy for Science and Research
9 November 2007: During his visit to ESO's Very Large Telescope at Paranal, the European Commissioner for Science and Research, Janez Potočnik, participated in an observing sequence and took images of a beautiful spiral galaxy.
Commissioner Potočnik at Paranal Observatory
27 October 2007: As part of his first official trip to Brazil and Chile, the European Science and Research Commissioner, Janez Potočnik, visited Europe's flagship for ground-based astronomy, the ESO Paranal Observatory.
Drizzly Mornings on Xanadu
11 October 2007: Noted for its bizarre hydrocarbon lakes and frozen methane clouds, Saturn's largest moon, Titan, also appears to have widespread drizzles of methane, according to a team of astronomers at the University of California, Berkeley. New near-infrared images from ESO's Very Large Telescope (VLT) in Chile and the W. M. Keck Observatory in Hawaii show for the first time a nearly global cloud cover at high elevations and, dreary as it may seem, a widespread and persistent morning drizzle of methane over the western foothills of Titan's major continent, Xanadu.
Catch a Star 2008!
5 October 2007: ESO and the European Association for Astronomy Education have just launched the 2008 edition of 'Catch a Star', their international astronomy competition for school students. Now in its sixth year, the competition offers students the chance to win a once-in-a-lifetime trip to ESO's flagship observatory in Chile, as well as many other prizes.
A Colossus Gets its Name
5 October 2007: Today, the first of the two ALMA antenna transporters was given its name at a ceremony on the compound of the manufacturer, the heavy-vehicle specialist Scheuerle Fahrzeugfabrik GmbH, in Baden-Württemberg. The colossus, 10 metres wide, 20 metres long and 6 metres high, will be shipped to Chile by the end of the month. The second one will follow in a few weeks.
A Grand Vision for European Astronomy
28 September 2007: Today, and for the first time, astronomers share their global Science Vision for European Astronomy in the next two decades. This two-year-long effort by the ASTRONET network of funding agencies, sponsored by the European Commission and coordinated by INSU-CNRS, underscores Europe's ascension to world leadership in astronomy and its will to maintain that position. It will be followed in just over a year by a prioritised roadmap for the observational facilities needed to implement the Vision. Implementation of these plans will ensure that Europe fully contributes to Mankind's ever deeper understanding of the wonders of our Universe.
Into the Chrysalis
27 September 2007: A team of European astronomers has used ESO's Very Large Telescope Interferometer and its razor-sharp eyes to discover a reservoir of dust trapped in a disc that surrounds an elderly star. The discovery provides additional clues about the shaping of planetary nebulae.
The Frugal Cosmic Ant
27 September 2007: Using ESO's Very Large Telescope Interferometer and its unique ability to see small details, astronomers have uncovered a flat, nearly edge-on disc of silicates in the heart of the magnificent Ant Nebula. The disc seems, however, too 'skinny' to explain how the nebula got its intriguing ant-like shape.
A Warm South Pole? Yes, on Neptune!
18 September 2007: An international team of astronomers using ESO's Very Large Telescope has discovered that the south pole of Neptune is much hotter than the rest of the planet. This is consistent with the fact that it is late southern summer and this region has been in sunlight for about 40 years.
Galaxy 'Hunting' Made Easy
14 September 2007: Astronomers using ESO's Very Large Telescope have discovered in a single pass about a dozen otherwise invisible galaxies halfway across the Universe. The discovery, based on a technique that exploits a first-class instrument, represents a major breakthrough in the field of galaxy 'hunting'.
Stellar Firework in a Whirlwind
3 September 2007: Stars do not like to be alone. Indeed, most stars are members of a binary system, in which two stars circle around each other in an apparently never-ending cosmic ballet. But sometimes, things can go wrong. When the dancing stars are too close to each other, one of them can start devouring its partner. If the vampire star is a white dwarf – a burned-out star that was once like our Sun – this greed can lead to a cosmic catastrophe: the white dwarf explodes as a Type Ia supernova.
Professor Tim de Zeeuw Takes Up Duty as New ESO Director General
3 September 2007: On 1 September, Tim de Zeeuw became the new ESO Director General, succeeding Catherine Cesarsky. On his first day in office, he kindly agreed to answer a few questions.
Edge-On!
23 August 2007: As Uranus coasts through a brief window of time when its rings are edge-on to Earth - a view of the planet we get only once every 42 years - astronomers peering at the rings with ESO's Very Large Telescope and other space- or ground-based telescopes are getting an unprecedented view of the fine dust in the system, free from the glare of the bright rocky rings. They may even find a new moon or two.
Initiatives
Our initiatives are our guide. Active collaboration with local, regional and international partners enables us to address some of the world’s greatest species extinction threats through these initiatives.
GWC explores remaining wilderness areas to improve our understanding of biodiversity patterns around the globe and to search for undiscovered species. We confirm areas of high species diversity and richness and identify new sites that are critical to threatened plants and animals. Our biodiversity surveys range from short one-person surveys intended to ground-truth the presence of a single species, to multi-month explorations of areas by international teams representing several taxonomic and ecological specialists. Our biodiversity explorations always aim to determine the threats to an area's wildlife, and the opportunities that exist to alleviate these threats.
Habitat Conservation
The single biggest threat to the survival of species worldwide is the loss and degradation of habitat. GWC prioritizes species and sites most at risk and work with local partners to protect and manage these critical habitats. We are founding partners of the Leapfrog Conservation Fund, a mechanism that enables us to efficiently support the most important projects worldwide.
Capacity building comes in many shapes and sizes. At GWC we use capacity building efforts to develop both individuals and organizations. Our associates program provides researchers, conservationists and academics from around the world with an opportunity to come together and use the GWC institution to build connections, access resources and bolster conservation action.
Preventing Extinctions
GWC implements conservation action to ensure the viability of populations. These actions span a variety of measures, with habitat protection being our first and foremost goal. Threats beyond habitat loss, such as disease or harvest, are also addressed when they pose a significant extinction risk.
Our focus is on endangered species that are hanging on within one or only a few populations and have an inherently high risk of extinction. When research documents drastic declines of more widespread species, we also prioritize conservation actions to stabilize populations.
GWC researches the biology of species to develop the best methods for protecting them in the wild or, if needed, in captivity for future reintroduction into their native habitats. We also support the IUCN Red List of Threatened Species as it underpins our conservation strategy by continuously updating the status of the world’s wildlife. | 科技 |
'That's no moon': Astronomers find smallest planet yet outside of solar system
Alicia Chang, The Associated Press | February 20, 2013 2:40 PM ET
LOS ANGELES — Astronomers searching for planets outside our solar system have discovered the tiniest one yet – one that's about the size of our moon.
But hunters for life in the universe will need to poke elsewhere. The new world orbits too close to its sun-like star and is too sizzling to support life. Its surface temperature is an estimated 700 degrees Fahrenheit. It also lacks an atmosphere and water on its rocky surface.
University of California, Berkeley astronomer Geoff Marcy, one of the founding fathers of the planet-hunting field, called the latest find “absolutely mind-boggling.”
“This new discovery raises the specter that the universe is jampacked, like jelly beans in a jar, with planets even smaller than Earth,” said Marcy, who had no role in the new research.
It’s been nearly two decades since the first planet was found outside our solar system. Since then, there’s been an explosion of discoveries, accelerated by NASA’s Kepler telescope launched in 2009 to search for a twin Earth. So far, 861 planets have been spotted and only recently have scientists been able to detect planets that are similar in size to Earth or smaller.
While scientists have theorized the existence of a celestial body that’s smaller than Mercury – the baby of the solar system since Pluto’s downgrade – they have not spotted one until now. Nearest to the sun, Mercury is about two-fifths the Earth’s diameter; the newly discovered planet and our moon are about a third the size of Earth.
The teeny planet was detected by Kepler, which simultaneously tracks more than 150,000 stars for slight dips in brightness – a sign of a planet passing in front of the star. The planet – known as Kepler-37b – orbits a star 210 light years away in the constellation Lyra. It’s one of three known planets in that solar system.
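The detection challenge is easy to quantify: the fractional dip in starlight scales with the square of the planet-to-star radius ratio. Here is a rough back-of-envelope sketch in Python – the radii are generic solar and lunar values, not figures from the Kepler-37 study itself:

```python
# Transit depth ~ (planet radius / star radius)^2.
# Generic values for a sun-like star and a moon-sized planet;
# illustrative only, not numbers from the discovery paper.
SUN_RADIUS_KM = 696_000
MOON_RADIUS_KM = 1_737

depth = (MOON_RADIUS_KM / SUN_RADIUS_KM) ** 2
print(f"Fractional dip in brightness: {depth:.1e}")  # ~6.2e-06
```

A dip of a few parts per million is tiny compared with the roughly one percent dimming a Jupiter-sized planet produces around a sun-like star, which helps explain why confirming the find took more than a year.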
Discoverer Thomas Barclay of the NASA Ames Research Center in Northern California was so excited when he spied the moon-sized planet that for days, he said he recited the “Star Wars” movie line: “That’s no moon.” It took more than a year and an international team to confirm that it was a bona fide planet.
The discovery is detailed in Thursday’s issue of the journal Nature.
Scientists are looking for an Earth-size planet that’s in the so-called Goldilocks zone – that sweet spot that’s not too hot and not too cold where water, which is essential for life, could exist on the surface.
While the newly discovered planet isn’t it, “that does not detract from the fact that this is yet another mile marker along the way to habitable Earth-like planets,” said Alan Boss of the Carnegie Institution for Science in Washington, who was not part of the discovery team.
Airgun Regulator Basics
by Joe Korick
In order to understand the regulator as it applies to the precharged pneumatic airgun, we must first look at a non-regulated version. Figure 1 is an illustration of the basic non-regulated airgun.
When the gun is cocked, the striker is pulled back a certain distance before the trigger sear engages. This distance is called the Stroke. When the trigger is pulled, the striker is released and travels forward, striking the valve stem of the Firing valve and lifting it off its seat, allowing air to flow through the Transfer port and into the barrel. The length of time the valve remains open is referred to as Valve duration and is measured in thousandths of a second. As the reservoir pressure decreases, the valve duration increases due to less resistance of air pressure on the firing valve. As the valve duration increases, a larger volume of air at a lower pressure is allowed to pass. The result is a rise in pellet velocity. Figure 2 shows the typical velocity-versus-number-of-shots curve of a non-regulated gun. Looking at the curve you can see an area where it is somewhat consistent. This is called the heart of the fill. Most people will find this area and shoot their gun here for the greatest accuracy. To find this area, the gun is filled to full pressure and shot over a chronograph. The shots are counted, velocities recorded and the graph plotted. The pressure at the end of the test is recorded and subtracted from that of the beginning. This number is divided by the number of shots, and the result is the approximate pressure the reservoir loses per shot. The gun can then be charged to the pressure where the graph starts to flatten and shot down to where the graph starts to drop off. In order to extend this plateau into the higher supply pressures and maintain greater shot-to-shot consistency, a regulator is used.
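For readers who want the bookkeeping spelled out, here is a minimal sketch of that procedure. The velocities, pressures and tolerance below are invented for illustration; only the method comes from the text above:

```python
# Average reservoir pressure lost per shot over the test string.
def pressure_drop_per_shot(start_psi, end_psi, shot_count):
    return (start_psi - end_psi) / shot_count

# Pick out the flat "heart of the fill": the stretch of shots whose
# velocity stays within a tolerance of the peak velocity.
def heart_of_fill(velocities, tolerance_fps=15):
    peak = max(velocities)
    flat = [i for i, v in enumerate(velocities) if peak - v <= tolerance_fps]
    return flat[0], flat[-1]

# Example: a 3,000 psi fill shot down to 1,800 psi over ten shots.
velocities = [850, 870, 885, 893, 895, 894, 892, 888, 870, 845]
print(pressure_drop_per_shot(3000, 1800, len(velocities)))  # 120.0 psi/shot
print(heart_of_fill(velocities))  # (2, 7): fill near shot 2, stop near shot 7
```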
What is a regulator?
Regulators come in different shapes and sizes. Designs vary from manufacturer to manufacturer, but the end result is the same. A regulator is simply a valve. Probably one of the simplest regulating devices is the firing valve used in the Titan and Falcon line of guns. The area around the plastic firing valve has a set clearance that limits the air being allowed to pass. In this system the gun remains somewhat consistent over a range of several hundred PSI. The conclusion drawn from this is that propelling the pellet is not so much a function of pressure as of delivering a given volume of air to get the job done. The job of the regulator is to maintain a constant output pressure while the input pressure varies. In the airgun the regulator does not regulate the flow of the air as much as it regulates the pressure of the firing valve chamber. Some regulators have a very slow recovery rate; others are almost instantaneous. As discussed earlier, pressure alone is not enough to do the entire job: there must also be enough volume at this pressure to propel the pellet at the desired velocity. Therefore an efficient regulator system must also include a secondary chamber of sufficient size for the job at hand.
How does a regulator work?
Most of the airgun regulator manufacturers do not want their secrets to get out. The reality of the issue is that their designs are based on SCUBA equipment with small modifications to suit the needs of the airgun. In an attempt not to give away any one design, and to prove a point, the illustration in Figure 3 is taken from a 1975 SCUBA diver's training manual, which predates the precharged pneumatic air rifle of today. This regulator design is called a "Balanced flow-through piston valve".
The air enters the regulator from the reservoir, travels through the piston and into the firing valve chamber. As the pressure increases, so does the force on the large end of the piston. As the force on the piston increases, the spring behind the piston begins to compress. This process continues until the shaft of the piston contacts the Teflon seat and shuts off the flow of air. When the shot is fired, the air pressure in the firing valve chamber drops and the spring lifts the piston off its seat, allowing high-pressure air to flow into the valve chamber once again, and the cycle is repeated.
The pressure in the firing valve chamber is determined by the size of the piston head and the strength of the spring. These dimensions vary between manufacturers; among all the airgun regulators available today, none has the same dimensions as another. There are, however, several regulators available that use this basic design.
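As a rough illustration of that balance, the regulated pressure can be approximated as the spring preload divided by the piston head area. This simplified sketch uses invented numbers and ignores friction, O-ring drag and the geometry of a real balanced valve:

```python
import math

# Setpoint approximated by a simple force balance: the valve closes when
# chamber pressure acting on the piston head equals the spring preload.
def regulator_setpoint_psi(spring_preload_lbf, piston_diameter_in):
    area_sq_in = math.pi * (piston_diameter_in / 2) ** 2
    return spring_preload_lbf / area_sq_in  # psi = lbf per square inch

# A 0.5 in piston head with a 250 lbf preload would regulate near 1,270 psi.
print(round(regulator_setpoint_psi(250, 0.5)))  # 1273
```

Note how sensitive the setpoint is to the piston head diameter – it enters as the square – which is one reason no two makers' regulators share the same dimensions.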
Typically you would want the regulator to be set at a pressure in the middle of the "heart of the fill", so that small variances in gun operation have little or no effect on pellet velocity. This is not always the case. Some people want to shoot at very high velocities with heavy pellets and push the gun to its limits. Because the gun is then operating so close to its limits, alterations are necessary. Lengthening the stroke, enlarging the transfer ports, increasing striker spring tension, and changing the firing valve and/or valve spring are all options that can greatly affect gun operation. The precharged pneumatic airgun is a DYNAMIC SYSTEM, meaning that all these factors are in a state of balance and each has an effect on the others. Unless someone is completely familiar with these principles, adjustments and modifications should be left to the airgunsmith.
Regulator Pros and Cons
Regulators extend the number of shots per fill and tighten shot-to-shot consistency. For the serious competitor – whether in field target, bench rest, silhouette or 10 meter shooting – regulators are part of the basic equipment requirements.
The downside of regulators is that they are mechanical devices and consequently can fail. Regulators depend heavily on rubber O-rings for seals. With time and use the rubber can dry out and crack. The O-rings are also subject to wear: as the valve moves back and forth from shot to shot, they rub against the walls of the regulator body. It's really just a matter of time, or number of shots, before some maintenance is required. This could leave a shooter in a bad situation if it happened at a match. If a shooter shoots a lot, a yearly check-up should be considered. Fortunately, O-rings are not very expensive and represent the majority of required repairs.
Stonebraker: A Database icon
He began exploring databases 40 years ago — and continues to invent thriving startups
Mar 1, 2013, 6:00am EST
Kyle Alspach
The term "serial entrepreneur" refers to people who have founded more than one company. But for entrepreneurs like Michael Stonebraker, the descriptor probably falls short.
Stonebraker, a computer scientist at the Massachusetts Institute of Technology, isn’t a household name outside of tech circles. But in the area of database technology — which underpins much of computing — he is likely the most recurring character in its history.
Stonebraker got into databases in the early 1970s, long before “big data” had become a buzzword. He has gone on to found nine companies, including six in Massachusetts in the past decade.
Now 69, Stonebraker is still showing no signs of slowing down. Along with heading a new big-data research center at MIT, he is actively working for three of the Boston-area companies he founded — VoltDB, Paradigm4 and DataTamer. He launched DataTamer just a few months ago.
“I plan to continue doing what I’m doing as long as I’m competitive and can make a difference,” Stonebraker said. “I can’t quite imagine retiring to Florida and playing golf. That seems boring beyond belief.”
Most of his ventures have been financial successes, including Billerica-based Vertica Systems, bought by Hewlett-Packard for $340 million in 2011. His research projects and companies have also spawned dozens of other experts in the field. Stonebraker is “the greatest living contributor to database technology,” according to Curt Monash, president at Monash Research in Acton, who has followed Stonebraker’s career for three decades.
“He has either invented or helped popularize a significant fraction of the important ideas in database technology,” Monash said.
Yet for all his achievement, Stonebraker could have been a far bigger name in the business world. His first company went head-to-head with Oracle for primacy in the database market in the 1980s — and lost.
As unwilling as Stonebraker is to leave the world of technology and business now, he had different ambitions as a college graduate in the mid-1960s. He recalls going to graduate school to avoid being drafted for service in Vietnam. What he really wanted to do, Stonebraker said, was drive around the country like the characters on a popular show of the time, “Route 66.”
“That’s what I would’ve done. But that would’ve gotten me to Vietnam,” he said. “I sat out the war in graduate school, and that’s the only reason I have a PhD.”
The degree would ultimately land him a job at University of California-Berkeley, where he says he read an academic paper on relational databases “on a lark.” The paper, by Ted Codd of IBM, would inspire Stonebraker and a colleague to launch the Ingres project in 1971.
The project would help to pioneer the use of relational database technology — which today is the standard technology for business storage of data. After years spent developing Ingres at Berkeley, Stonebraker formed Relational Technology (later renamed Ingres Corp.) to commercialize the technology in 1980. The move to form a company, Stonebraker said, was partly out of frustration with Oracle co-founder Larry Ellison, who had claimed that Oracle’s system “was 100 times faster than Ingres — when his system didn’t even work at the time.”
Stonebraker’s company became the chief competitor to Oracle in the early 1980s. In Stonebraker’s view, Ellison succeeded not because of having a strong product but because he was a shrewd marketer. Ingres, meanwhile, failed to market itself effectively.
“Oracle at the time had lots of chinks in their armor which we could’ve exploited a lot better,” Stonebraker said. “I’d like to have that one back and do it a lot better.”
Ingres went public in 1988 and was acquired by ASK Computer in 1990, but by that time Oracle’s annual revenue had neared $1 billion while Ingres had generated around $130 million annually. Oracle maintained its market dominance to the present day, making Ellison one of the world’s richest people in the process.
Stonebraker, however, has never stopped trying to undermine Oracle’s database empire.
“Mike was the first to preach publicly that one size doesn’t fit all, that there would be opportunities for new vendors” other than Oracle, said Jo Tango, a venture capitalist who has backed five Stonebraker companies through his current firm, Kepha Partners, and his prior firm, Highland Capital Partners.
Stonebraker replicated the approach he took with Ingres — spinning out university research into a startup — with two companies in California, Illustra and Cohera.
After those companies were acquired, Stonebraker moved to Massachusetts in 2001 to be closer to his wife Beth’s family. Following the move, he accepted an offer to resume his research at MIT. In 2003, he launched his first company in Massachusetts, algorithmic trading technology firm StreamBase Systems.
The six companies founded by Stonebraker in Massachusetts have raised a combined $100 million in venture capital.
“I’ve been very, very productive in the decade I’ve been here,” Stonebraker said. “My production of good ideas per unit-time has been higher here than in California.”
Financial success has never been a motivation, he said, though he says Ingres, Illustra and Vertica “were all nice liquidity events.”
“I am not hurting for money. That wasn’t the reason for doing any of these companies,” he said. “I’m intent on making a difference.”
Along with spending time with family — he has two daughters, ages 26 and 21 — Stonebraker’s pastimes include playing bluegrass banjo and hiking. A native of the small town of Milton, N.H., Stonebraker said he has climbed all 48 mountains taller than 4,000 feet in the state.
Even after the business success he’s had, Stonebraker sounds like his life philosophy hasn’t changed much from the days when he dreamed of traveling the country, seeing where life took him. He has never been much for advanced planning, he said.
“Most of the good things that have happened to me occurred because there was a fork in the road, and I was willing to take the path less traveled,” he said. “I don’t think I ever said, ... ‘This is where I’m going to be in 10 years.’ ” | 科技 |
Steve Jobs' dream yacht impounded over unpaid design bills
By
Anna Washenko
In 2007, Steve Jobs started working with French designer Philippe Starck on plans for a superyacht. Jobs passed away before he had a chance to use the boat, dubbed Venus. Now, a lawyer representing Starck's company has said the designer has received only 6 million euros of the 9 million euros he was owed in commission. As a result, the yacht was impounded on Wednesday night and will remain at its current location in an Amsterdam port until the rest of the money is received. "The project has been going on since 2007 and there had been a lot of detailed talk between Jobs and Starck," Roelant Klaassen, the lawyer for Starck's firm, told Reuters. "These guys trusted each other, so there wasn't a very detailed contract." The legal counsel for Jobs' estate was not available for comment.
The yacht was finished and unveiled to Jobs' family in the Netherlands in late October. The vessel was designed with a minimalist, streamlined aesthetic, just as you'd imagine an aquatic Apple product to be. The ship is 80 meters long and has several 27-inch iMacs installed on the deck. Jobs also consulted with the chief engineer for Apple stores and asked him to create special glass that could be used for structural support. According to Jobs' biography by Walter Isaacson, the creation of this superyacht was a project near and dear to Jobs. Even when he was diagnosed with pancreatic cancer, Jobs continued working on the boat's design. It's uncertain what will happen to the vessel once the dispute over Starck's commission is resolved.
Barnes & Noble says it has 25 pct of the U.S. ebook market
Fresh on the heels of bookseller Borders filing for bankruptcy, Barnes & Noble has announced its latest financial results, which cover its third fiscal quarter of 2011. Although the company saw a significant dip in profits compared to the same quarter last year, the bookstore is making a bold claim: according to CEO William Lynch, Barnes & Noble now accounts for 25 percent of the U.S. ebook market. That’s a larger share of the company’s share of the U.S. physical book market, and Barnes & Noble says it’s selling twice as many ebooks as physical books in any format from its online store.
“We’re pleased with our financial results this quarter, but just as importantly, the third quarter was another big quarter for the company from the standpoint of key strategic progress that positions us well for the future,” said CEO William Lynch, in a statement.
Barnes & Noble’s claim that it accounts for the quarter of the U.S. ebook market is hard to substantiate: neither Barnes & Noble nor Amazon publish any specific sales figures. However, last year Barnes & Noble claimed it had about 20 percent of the U.S. eBook market while Amazon.com claimed it accounted for 70 to 80 percent, and no other ebook sellers stepped forward to contradict those figures.
The financial quarter wasn't all roses for Barnes & Noble, however: although the company did see its revenue increase 7 percent to $2.3 billion for the quarter and saw a 64 percent increase in sales from the Barnes & Noble Web site, net earnings were down 25 percent to $60.6 million and the company will be canceling its shareholder dividend to invest money back into building out its Nook platform and digital services.
Speaking with analysts, Lynch indicates that the company believes its retail locations are an important factor in selling Nook e-readers, and that the company might consider moving into some locations being closed down by Borders. | 科技 |
2017-09/1579/en_head.json.gz/25521 | Wolverton: Pay-as-you-go plans harder to find… Share this:Click to share on Facebook (Opens in new window)Click to share on Twitter (Opens in new window)Click to email this to a friend (Opens in new window)Click to print (Opens in new window) Trending: San Jose flooding Tahoe avalanche ‘Hitler’ busted Milo Yiannopoulos woes Bob Myers to the Lakers? More trouble for Chris Brown Breaking News
Wolverton: Pay-as-you-go plans harder to find in smartphone era Share this:Click to share on Facebook (Opens in new window)Click to share on Twitter (Opens in new window)Click to email this to a friend (Opens in new window)Click to print (Opens in new window)
By Bay Area News Group | PUBLISHED: July 18, 2014 at 4:55 pm | UPDATED: August 15, 2016 at 5:29 am
In the wireless world, pay-as-you-go seems to be going the way of the dodo bird.
Until recently, consumers who couldn't afford, didn't want or didn't need a pricey cellphone plan had the option of paying small amounts for small increments of service. Typically, they could pay for small buckets of calling minutes that could be used over a period of, say, 90 days or even a year.
But in the smartphone era, those plans are going away. Earlier this year, for example, Sprint canceled its Sprint As You Go service, its own version of the pay-as-you-go plan. While you can still find some pay-as-you-go plans at other carriers, those companies generally don't promote them and the plans usually aren't available for smartphones.
“Those options are diminishing,” said Bill Menezes, an analyst who covers the mobile industry for Gartner, a tech research company.
Pay-as-you go was one of the first versions of prepaid wireless service. With such plans, consumers pay for service in advance of using it and typically have to either provide their own cellphone or buy one outright from the carrier. Instead of having to pay a monthly fee, consumers on pay-as-you-go plans only had to replenish their accounts when they ran out of voice minutes.
The plans were marketed to consumers who couldn’t afford to sign up for monthly service or didn’t have good credit. They were also popular with seniors and consumers on fixed budgets, and those who rarely used their phones.
“Carriers don’t want these people, it’s clear,” said Michael Gikas, who covers the cellphone industry as the senior electronics editor at Consumer Reports. “They’re not making any money on them.”
In place of pay-as-you-go-plans, carriers have been funneling customers, particularly those with smartphones, into prepaid services that look a lot like subscription plans. Consumers pay a monthly fee and get a set amount of voice minutes, text messages and data usage. Whatever they don’t use expires at the end of the month. The only thing that makes them similar to the older prepaid plans is that consumers can cancel their service at any time and they typically have to provide or pay up-front for their phones.
The new prepaid plans are typically less costly than subscription plans, mostly because they don’t include the cost of subsidizing phones. And on a per-minute or per-megabyte basis, they’re generally a better deal than the old pay-as-you-go plans.
But the monthly service cost, which generally starts at around $40, could be a barrier to the lower-income consumers who formerly used pay-as-you-go services. And they aren’t a good value for folks who rarely use their phones or for people visiting the country who might need phone service only for a week or two.
They are, however, a good deal for carriers because they get recurring monthly fees and less turnover of customers, note analysts.
To be sure, consumers who haven’t made the jump to smartphones can still find pay-as-you-go plans. Among the major carriers, T-Mobile still offers such deals. So too does TracFone, a low-cost carrier owned by global mobile giant América Móvil.
Unlike T-Mobile, TracFone offers similar deals for smartphone owners too. In addition to buckets of minutes and messages, smartphone users can buy buckets of bandwidth as well. For about $20, consumers can get 30 minutes of calling and 300 megabytes of data. While the voice minutes expire at the end of a month, the bandwidth won't expire and can be used as long as the phone is active.
Consumers looking for low-cost smartphone service without recurring monthly fees have another option: T-Mobile offers two plans that allow users to pay for service by the day. The cheaper of the two costs $2 a day for unlimited calling, texting and bandwidth usage, but only includes slower 2G data access. For $3 a day, consumers get access to 4G data speeds for their first 200 megabytes of bandwidth.
In both cases, consumers only get charged when they actually use their phone. But users can be charged for a day of service even if they receive a single text message. And with both T-Mobile plans, whatever days consumers have purchased will expire 90 days after activation, unless they buy more.
One other possibility is Ting, which charges users only for the minutes, messages or bandwidth they use each month, plus a small service fee. Consumers who didn’t use their phone at all would pay about $6 a month, not including taxes. A consumer who used 500 voice minutes, sent 1,000 messages and used about 500 megabytes of data would pay about $32 plus taxes that month. For consumers of limited means, those aren’t a lot of options. It’s too bad that as we move into the smartphone era, many of those consumers risk being left behind.
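To see how that kind of metered billing adds up, here is a small sketch. The bucket rates are invented to roughly reproduce the $6 and $32 figures above; they are not Ting's actual rate card:

```python
# Bucket-style metered billing: each category is billed at the price of
# the smallest bucket that covers the month's usage, plus a flat fee.
# All rates below are hypothetical.
BUCKETS = {
    "minutes":   [(0, 0), (100, 3), (500, 9), (1000, 18)],
    "messages":  [(0, 0), (100, 3), (1000, 5), (2000, 8)],
    "megabytes": [(0, 0), (100, 3), (500, 12), (1000, 19)],
}
SERVICE_FEE = 6  # flat per-device fee

def monthly_bill(minutes, messages, megabytes):
    usage = {"minutes": minutes, "messages": messages, "megabytes": megabytes}
    total = SERVICE_FEE
    for category, used in usage.items():
        total += next(price for cap, price in BUCKETS[category] if used <= cap)
    return total

print(monthly_bill(0, 0, 0))         # 6  -- an unused phone
print(monthly_bill(500, 1000, 500))  # 32 -- the usage example in the article
```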
Contact Troy Wolverton at 408-840-4285 or twolverton@mercurynews.com. Follow him at www.mercurynews.com/troy-wolverton or Twitter.com/troywolv.
Pay-as-you-go plans are tough to find for smartphones, but consumers who can’t afford or don’t want to pay pricey monthly service fees do have some options. Among them:
TracFone: Offers a true pay-as-you-go plan for smartphones. Bandwidth is extra, but is good for as long as the phone is active. Voice minutes expire after a month.
T-Mobile: Offers pay-by-the-day service. Consumers can buy as little as one or two days worth of service. But unused days expire within 90 days after activation, unless users add more days to their account. Cheaper of two plans; only offers slow 2G service.
Ting: Charges consumers only for the amount of minutes, messages or bandwidth they use each month. But charges a monthly service fee even when phone isn’t used.
Mercury News research | 科技 |
Colleges Getting Proactive on Disaster Communication
Case Study: Two colleges are now well-prepared to communicate via text messaging in case of a disaster.
Discover why the franchise players aren't made by who you think
If only Miyamoto and Iwata really did work day and night in some giant factory, rainbows springing from the smokestacks, dreams from the delivery doors and Italian-accented laughter from the plumbing. Sadly, however, there are just too many games (and laws of physics and other such nonsense) for that suggestion to be true. Sometimes someone else has to step in and be gaming's amazing chocolatier, taking possession of Mario, The Mushroom Kingdom, or another of Nintendo's exclusive brands.

Nintendo's ever-changing family of outside developers extends far beyond the walls of their R&D studios in Tokyo, not to mention Japan's borders and all other territorial boundaries. Their search for talent takes them wherever it's to be found, and the possibilities for the chosen few are, as reigning President Satoru Iwata would be sure to tell you, practically limitless.

Iwata was himself one of Nintendo's scouted talents after all, introduced to the company through one of their earliest and most trusted second-parties, HAL Laboratory. Like many, HAL's relationship began as an experiment, their first assignment being to repair the overdue NES Pinball, which they did prior to its release in 1984. Success there led to the development of Mother for NES and Kirby's Dream Land for Game Boy, during which time Nintendo kept a close eye on the company's people and practices, analyzing the potential of both the team and their constituent members.

It's a practice that's been continued ever since, Nintendo fostering talent and recognizing enthusiasm with nary a thought, initially at least, for the potential side effects. So when HAL almost went bankrupt with a cutting-edge office expansion in 1991, plunging them into $42 million of debt, Nintendo saw through the numbers and threw them a lifeline. In return for finance and support, HAL would live on to develop titles exclusively for Nintendo platforms - while Iwata, whose persistence is credited with securing the whole arrangement, was instated as company chairman.
Android phones do not infringe on Gemalto's patents, US court rules
The way Android phones run Java apps is no patent infringement, a US court ruled
Loek Essers (IDG News Service) on 20 June, 2014 20:48
Android devices running Java applications do not infringe on patents belonging to SIM card maker Gemalto, the U.S. Court of Appeals for the Federal Circuit ruled.

Gemalto sued Google, Motorola, HTC and Samsung Electronics in 2010 in the U.S. District Court for the Eastern District of Texas, alleging that the companies infringed on three of its patents when running Java applications on their Android phones.

Before Gemalto's invention, personal computers with microprocessors could run Java applications, but these computers required a substantial amount of memory which was located on chips separate from the processor chip, called off-chip memory. Gemalto's technology enabled devices with less memory to run applications written in high-level programming languages such as Java.

To do that, applications are converted from Java into another format that is suitable for computing devices with less memory. The applications are stored in the memory of the chip containing the embedded processor that executes the application. The processor, however, cannot run the converted application directly and requires an interpreter to translate it into instructions the processor can execute.

The interpreter is also stored in on-chip memory, which is necessary to run a Java application. Both the application and the interpreter must fit within the constraints of the platform for storage and execution.

Gemalto alleged that the defendants infringed on this technology when their smartphones run the Android operating system and Java applications that are converted using the Android software development kit. The companies, however, argued they don't infringe because they rely on off-chip memory to run Java applications.

The District Court sided with the smartphone makers in a summary judgment, ruling that the accused devices didn't infringe because Gemalto did not dispute that they store program instructions off-chip and access those off-chip instructions to run the accused applications.
Gemalto did not challenge the court's findings on direct infringement, but rather argued that the devices indirectly infringe on the patents because they run Java programs in substantially the same way as described in the patents, the Court of Appeals said in its judgment.

According to Gemalto, the accused devices infringe on the patents when they temporarily load program instructions from off-chip memory into on-chip cache memory before execution. However, because cache memory cannot store applications when a device is turned off, the District Court concluded that cache memory is substantially different from permanent memory and not equivalent for infringement purposes.

Gemalto appealed the case. The company argued that on-chip cache memory is equivalent to on-chip memory permanently storing applications. Applications are loaded into on-chip cache memory before execution 97 percent of the time, according to Gemalto, which argued that the difference between 97 percent and 100 percent is insubstantial.

However, this argument is too general, the Court of Appeals said, adding that the cache memory functionality that is the basis for Gemalto's theory was employed by microprocessor-based systems at the time of the invention.

"We agree with the district court that the accused devices do not infringe ... due to their use of cache memory," the Court of Appeals said Thursday, affirming the district court's judgment of no infringement.

Gemalto said it was disappointed by the judgment, but said the decision has no impact on its licensing business.
Countries agree international aviation emissions pact
14 October 2016
The UN’s civil aviation body has approved a ground-breaking international scheme to help offset some emissions from air travel. The so-called “Carbon Offsetting and Reduction Scheme for International Aviation,” or CORSIA for short, is a market-based mechanism that will address any annual increase in carbon dioxide emissions above 2020 levels.
The news was announced on Thursday 6 October following nearly two weeks of meetings in the Canadian city of Montreal, which is the headquarters of the International Civil Aviation Organization (ICAO). The UN agency holds its highest level of meetings – the ICAO Council – every three years.
Under CORSIA, countries can choose to take part in a pilot phase starting in 2021. Afterward, they can opt into a first voluntary phase, which will kick off in 2024. A second phase will then apply from 2027 to 2035 to all states with the exception of countries claiming a minimal share in the aviation industry, as well as least developed countries (LDCs), small island developing states (SIDS), and landlocked developing countries (LLDCs), unless they wish to participate. During the pilot phase, countries will calculate airline operators’ offsetting requirements based on data for the overall sector. The first and second phase will then look at emissions by individual airlines. The text also stipulates that new entrants into the industry will be exempt from participation in the CORSIA for three years. Operators with a very low level of activity, along with humanitarian, medical, and firefighting operations, will be permanently exempt.
“It has taken a great deal of effort and understanding to reach this stage,” said ICAO Council President Dr. Olumuyiwa Benard Aliu in a press release last week. “We now have practical agreement and consensus on this issue backed by a large number of states who will voluntarily participate.”
As of last Thursday, 65 states representing more than 86.5 percent of international aviation activity had indicated plans to participate in the new scheme from its outset. This includes big emitters such as the EU, US, Canada, and Japan, but does not yet include Russia or India.
Emissions from air travel currently account for about two percent of the global total, but these are growing rapidly. While ICAO has set a target of carbon-neutral growth from 2020, some stakeholders argued last week that the new offsetting scheme was not ambitious enough to meet this goal.
According to the Environmental Defense Fund (EDF), current participants in the various phases should cover around 77 percent of air travel emissions growth between 2021 and 2035. A long journey
The CORSIA decision comes after ICAO members agreed in 2013 to outline an aviation emissions reduction platform within three years. The move also followed hot on the heels of ratification of the new international Paris Agreement on Climate Change last Tuesday. (See Bridges Weekly, 6 October 2016) [Editor's note: Bridges Weekly is ICTSD's flagship publication for trade and sustainable development news]
The Paris Agreement does not cover emissions from either international aviation or shipping. Assigning responsibility for tackling emissions from both sectors has long been considered a tricky subject given the cross-border nature of these activities.
For example, slow progress on the subject at ICAO had prompted the EU in 2012 to include international aviation emissions in its flagship Emissions Trading System (ETS). This required the surrender of permits for the entire duration of any flight landing in the bloc – even for those parts which took place outside the EU. The rule provoked a strong backlash from over two dozen countries including China and the US, who claimed that this amounted to a breach of sovereignty, would alter competitiveness, and could potentially be in violation of international trade rules.
The high-profile spat eventually saw the EU modify its ETS rules to require the submission of carbon permits only for intra-European flights. The bloc maintained at the time, however, that it would revisit this decision depending on whether this year’s ICAO Council was able to reach a satisfactory agreement. (See BioRes, 7 April 2014)
European officials have confirmed that this review will go forward, with the EU’s executive arm to examine the ICAO scheme and present their findings to the bloc’s parliamentarians and the European Council next year.
Several industry voices welcomed the recent ICAO outcome, having backed the push for a global scheme, as opposed to a patchwork of national measures that could be both costly and create an uneven playing field.
“The CORSIA agreement has turned years of preparation into an effective solution for airlines to manage their carbon footprint,” said Alexandre de Juniac, Director General and Chief Executive of the International Air Transport Association (IATA), in a press release last week.
ICAO estimates that the scheme could cost the airline industry between US$1.5 billion and US$6.2 billion in 2025, assuming carbon prices ranging from US$6-10 to US$20-33 per tonne of carbon dioxide equivalent (CO2e). The UN agency also predicts that this could rise to between US$5.3 billion and US$23.9 billion in 2035, assuming carbon prices of US$12-40 per tonne of CO2e. Carbon prices will be determined by the various offsetting projects selected.
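As a rough cross-check, dividing the projected costs by the assumed carbon prices gives the implied volume of offsets the industry would need to purchase. The back-of-envelope sketch below simply pairs the figures quoted above; it is illustrative only and not an official ICAO calculation.

# Implied offset volumes from ICAO's cost and carbon-price estimates above.
scenarios = {
    "2025 low":  (1.5e9, 6),    # US$1.5 billion at US$6/t CO2e
    "2025 high": (6.2e9, 33),   # US$6.2 billion at US$33/t CO2e
    "2035 low":  (5.3e9, 12),   # US$5.3 billion at US$12/t CO2e
    "2035 high": (23.9e9, 40),  # US$23.9 billion at US$40/t CO2e
}
for name, (cost_usd, price_per_t) in scenarios.items():
    mt = cost_usd / price_per_t / 1e6  # million tonnes CO2e
    print(f"{name}: ~{mt:,.0f} Mt CO2e offset")  # roughly 190-600 Mt

Details in progress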
According to experts, several key details will need to be clarified in order to bring the scheme into operation. These include developing systems for monitoring, reporting, and verifying emissions and determining the type of projects that qualify for offsets.

“Viewed globally, this is a landmark deal that addresses a gap in the plan to deliver the Paris Agreement, namely how to tackle the soaring emissions from international aviation. But there are gaps in coverage and many issues still to be decided that will determine its effectiveness,” Tim Johnson, representing the Aviation Environment Federation, told journalists last Thursday.
Starting in 2022, the ICAO Council will also review the implementation of the CORSIA every three years, making adjustments as appropriate. While the scheme is not currently linked to global temperature limits enshrined in the Paris climate deal, the review process would consider ways to improve the scheme that could support these objectives.
Emissions units generated by mechanisms under the UN Framework Convention on Climate Change (UNFCCC), under which the Paris pact sits, would be eligible for use in the CORSIA.
The UNFCCC has had several offsetting schemes in place under the existing Kyoto Protocol. The new Paris deal envisages the development of a mechanism that could include offsets, along with the development of provisions for the use of emissions sinks and reservoirs, such as through forest conservation.
Various experts consider that scaling up carbon pricing efforts around the world will be crucial for ensuring a timely transition to a low-carbon economy in order to avoid the worst impacts of climate change. They have therefore commended CORSIA for its potential spillover effects on carbon markets more broadly.
“Achieving carbon neutral growth from 2020 is a significant step in its own right. And with robust implementation, the market-based measure can serve as a springboard to greater ambition, not only for the aviation sector, but – through market linkages worldwide – also for emitting sectors more broadly,” said Nathaniel Keohane, Vice President of the EDF, commenting on the possible links between different market-based climate action initiatives.

Next emissions deal in line
Another emissions-related agreement may be announced by the end of this week, with a UN meeting underway in Kigali, Rwanda, to finalise talks on an amendment to the Montreal Protocol on Substances that Deplete the Ozone Layer.
The amendment would involve phasing down the production and consumption of hydrofluorocarbons (HFCs), a family of potent greenhouse gases used as coolants in air conditioners and refrigerators. The use of HFCs has grown in recent years, particularly as substitutes for ozone-depleting substances that are covered under the Montreal Protocol.
The Kigali meeting is due to conclude on 14 October, with negotiators currently grappling with topics such as when to “freeze” their levels of HFC production and consumption, along with the pace of the so-called “phase-down” and how to handle the costs of adapting to more sustainable coolants.
Over 100 countries had already signed on to a statement backing a draft version of the amendment by the time the meetings began. In anticipation of a successful accord, various countries and private donors have already started putting money toward a multilateral fund aimed at facilitating the transition toward using different coolants.
ICTSD reporting; “Reaction: Aviation climate deal agreed in Montreal,” CLIMATE HOME, 6 October 2016; “Over 190 Countries Adopt Plan to Offset Air Travel Emissions,” THE NEW YORK TIMES, 6 October 2016; “Why a UN Climate Deal on HFCs Matters,” CARBON BRIEF, 10 October 2016; “EU airline pollution curbs stay in the air until next year,” REUTERS, 11 October 2016.
Cluster spacecraft detects elusive space wind
A new study provides the first conclusive proof of the existence of a space wind first proposed theoretically over 20 years ago.
By analysing data from the European Space Agency’s Cluster spacecraft, researcher Iannis Dandouras detected this plasmaspheric wind, so-called because it contributes to the loss of material from the plasmasphere, a donut-shaped region extending above the Earth’s atmosphere. The results are published today in Annales Geophysicae, a journal of the European Geosciences Union (EGU).

“After long scrutiny of the data, there it was, a slow but steady wind, releasing about 1 kg of plasma every second into the outer magnetosphere: this corresponds to almost 90 tonnes every day. It was definitely one of the nicest surprises I’ve ever had!” said Dandouras of the Research Institute in Astrophysics and Planetology in Toulouse, France.

The plasmasphere is a region filled with charged particles that takes up the inner part of the Earth’s magnetosphere, which is dominated by the planet’s magnetic field. To detect the wind, Dandouras analysed the properties of these charged particles, using information collected in the plasmasphere by ESA’s Cluster spacecraft. Further, he developed a filtering technique to eliminate noise sources and to look for plasma motion along the radial direction, either directed at the Earth or outer space.

As detailed in the new Annales Geophysicae study, the data showed a steady and persistent wind carrying about a kilo of the plasmasphere’s material outwards each second at a speed of over 5,000 km/h. This plasma motion was present at all times, even when the Earth’s magnetic field was not being disturbed by energetic particles coming from the Sun.

Researchers predicted a space wind with these properties over 20 years ago: it is the result of an imbalance between the various forces that govern plasma motion. But direct detection eluded observation until now. “The plasmaspheric wind is a weak phenomenon, requiring for its detection sensitive instrumentation and detailed measurements of the particles in the plasmasphere and the way they move,” explains Dandouras, who is also the vice-president of the EGU Planetary and Solar System Sciences Division.

The wind contributes to the loss of material from the Earth’s top atmospheric layer and, at the same time, is a source of plasma for the outer magnetosphere above it. Dandouras explains: “The plasmaspheric wind is an important element in the mass budget of the plasmasphere, and has implications on how long it takes to refill this region after it is eroded following a disturbance of the planet’s magnetic field. Due to the plasmaspheric wind, supplying plasma – from the upper atmosphere below it – to refill the plasmasphere is like pouring matter into a leaky container.”

The plasmasphere, the most important plasma reservoir inside the magnetosphere, plays a crucial role in governing the dynamics of the Earth’s radiation belts. These present a radiation hazard to satellites and to astronauts travelling through them. The plasmasphere’s material is also responsible for introducing a delay in the propagation of GPS signals passing through it. “Understanding the various source and loss mechanisms of plasmaspheric material, and their dependence on the geomagnetic activity conditions, is thus essential for understanding the dynamics of the magnetosphere, and also for understanding the underlying physical mechanisms of some space weather phenomena,” says Dandouras.
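The mass-loss figures quoted above are easy to verify with simple unit arithmetic; the short sketch below uses only the rates reported in the study.

# Sanity-check the reported plasmaspheric wind figures.
rate_kg_per_s = 1.0                    # ~1 kg of plasma lost per second
seconds_per_day = 86_400
tonnes_per_day = rate_kg_per_s * seconds_per_day / 1000.0
print(f"Daily loss: {tonnes_per_day:.1f} tonnes")   # 86.4 t, "almost 90 tonnes"

speed_kmh = 5000.0                     # reported outflow speed, km/h
print(f"Speed: {speed_kmh / 3600.0:.2f} km/s")      # ~1.39 km/s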
Michael Pinnock, Editor-in-Chief of Annales Geophysicae, recognises the importance of the new result. “It is a very nice proof of the existence of the plasmaspheric wind. It’s a significant step forward in validating the theory. Models of the plasmasphere, whether for research purposes or space weather applications (e.g. GPS signal propagation), should now take this phenomenon into account,” he wrote in an email.

Similar winds could exist around other planets, providing a way for them to lose atmospheric material into space. Atmospheric escape plays a role in shaping a planet’s atmosphere and, hence, its habitability.

*More information*

This research is presented in the paper ‘Detection of a plasmaspheric wind in the Earth’s magnetosphere by the Cluster spacecraft’ to appear in the EGU open access journal Annales Geophysicae on 2 July 2013. Please mention the publication if reporting on this story and, if reporting online, include a link to the paper or to the journal website (http://www.annales-geophysicae.net/). The scientific article is available online, free of charge, from the publication date onwards, at http://www.ann-geophys.net/recent_papers.html. To obtain a copy of the paper before the publication date, please email Bárbara Ferreira at media@egu.eu.

The paper is authored by Iannis Dandouras of the Research Institute in Astrophysics and Planetology (IRAP), a joint institute of the French National Centre for Scientific Research (CNRS) and the Paul Sabatier University in Toulouse, France. The data was acquired by the CIS (Cluster Ion Spectrometry) experiment onboard ESA’s Cluster, a constellation of four spacecraft flying in formation around Earth.

The European Geosciences Union (http://www.egu.eu) is Europe’s premier geosciences union, dedicated to the pursuit of excellence in the Earth, planetary and space sciences for the benefit of humanity, worldwide. It is a non-profit interdisciplinary learned association of scientists founded in 2002. The EGU has a current portfolio of 15 diverse scientific journals, which use an innovative open access format, and organises a number of topical meetings, and education and outreach activities. Its annual General Assembly is the largest and most prominent European geosciences event, attracting over 11,000 scientists from all over the world. The meeting’s sessions cover a wide range of topics, including volcanology, planetary exploration, the Earth’s internal structure and atmosphere, climate, energy, and resources. The 2014 EGU General Assembly is taking place in Vienna, Austria from 27 April to 2 May 2014. For information regarding the press centre at the meeting and media registration, please check http://media.egu.eu closer to the time of the conference.

If you wish to receive our press releases via email, please use the Press Release Subscription Form at http://www.egu.eu/news/subscribe/. Subscribed journalists and other members of the media receive EGU press releases under embargo (if applicable) 24 hours in advance of public dissemination.

*Contact*

Iannis Dandouras
Research Institute in Astrophysics and Planetology (IRAP)
Toulouse, France
Tel: +33-5-6155-8320
Email: Iannis.Dandouras@irap.omp.eu

Bárbara Ferreira
EGU Media and Communications Manager
Munich, Germany
Tel: +49-89-2180-6703
Email: media@egu.eu

Further information:
http://www.egu.eu/news/66/cluster-spacecraft-detects-elusive-space-wind/
http://www.annales-geophysicae.net/
Mitchell residents to chat live via video with physicists underground
By news@mitchellrepublic.com
on Jun 18, 2013 at 4:59 a.m.
Mitchell residents can connect live with scientists deep underground this week during an event hosted by the local Rotary Club. The Deep Science lecture series will make its final stop of the summer at noon Thursday at the Mitchell Ramada. The event is open to the public.
The event will begin with a 15-minute slideshow overview of the lab, presented by an individual in the room and consisting of pictures and video, before moving into a live, two-way video Q-and-A session with physicists at the Davis Campus of the Sanford Lab. The lab is located at the former Homestake gold mine in Lead and is 4,850 feet underground, nearly a mile under the earth's surface.

"The entire purpose of a video conference is to make it be a conversation," said Bill Harlan, communications director of the Sanford Underground Research Facility. "At first, people are a little shy, and then they realize it does work. We've had some really interesting conversations."

The physicists will give a short walking tour of the facility, and will then answer questions on their experiences working underground and about the lab's experiments. The lab is involved in a worldwide race to detect dark matter, the most abundant form of matter in the universe, which has never been directly detected. The lab's dark matter search uses a xenon detector in the hope that a WIMP (a Weakly Interacting Massive Particle, the form scientists believe dark matter takes) will collide with a xenon atom. The lab's second experiment, the Majorana Demonstrator, studies the neutrino, a low-mass particle, by searching for an extremely rare nuclear decay.

"We want people to know these are two world-leading experiments, and they are exploring some of the most important questions facing 21st century physics," Harlan said. "Physicists and astrophysicists all over the world know about this project. We want to be able to say the same thing for the people in South Dakota."

Live video conferences with the underground physicists have been held in Huron, Aberdeen, Sioux Falls and Yankton, with more than a hundred audience members in attendance at several sessions. The free presentation is sponsored by the Mitchell Rotary Club. An optional lunch is available for $10.
How Global Warming, Hunting and Industrial Exploration Threatens the Polar Bear
Caroline Lennon February 11, 2013
In the 1960s an environmental, political and social movement emerged with the aim of protecting all species, living systems, and natural resources, worldwide. The movement encompasses wildlife conservation, which focuses on animal and plant species, as well as their natural habitat. The goal is to help protect them, not only for future generations and any value they may offer humans, but also for their own inherent significance.
When presenting population figures on a particular animal species conservationists often report how many are left “in the wild,” reminding us that some animals are held in captivity. In addition, as the human population continues to grow at an unprecedented rate and modern civilisation threatens all that is natural, the numbers we’re given usually show a decline in population. The polar bear has garnered a lot of attention in the past decade for these reasons, raising regular debate over whether enough is being done to prevent habitat loss and possible extinction.
In advance of the next meeting of the Convention on International Trade in Endangered Species (CITES) – scheduled to take place in Thailand in March – there has been a recent surge in support for new legislation to protect the polar bear. In 2010, at the last CITES summit, Canada and the European Union ignored pleas to “uplist” the polar bear and led a vote in favor of lighter regulation. Now, the United States is again proposing an upgraded status of Appendix I, which would impose an effective international ban on the trade in all polar bear parts.
HUNTING AND INTERNATIONAL TRADE
The polar bear’s habitat covers Canada; Denmark (Greenland); Norway (Svalbard); Russia; and the U.S. (Alaska), throughout which it has always played an important cultural and spiritual role in indigenous communities. Native subsistence hunting to provide raw materials was usually small enough in scale not to affect the population, but in the 1960s and 1970s large scale hunting raised international concern when numbers dropped significantly. As a direct result the hunting of polar bears was completely banned in Norway, while in Greenland; Russia; and the U.S. non-native hunting was banned and native subsistence hunting curtailed. In these countries populations rebounded after controls took effect, but Canada is yet to take any action.
Although its entire range hasn’t been comprehensively studied, based on the tag-and-track studies which have been conducted it is estimated there are 20,000-25,000 polar bears left in the wild. Of these, 16,000 live in Canada, where approximately 500 are killed annually. Growing interest from China and Russia, in particular, has led to an increase in demand for sport-hunting and trophies, with wealthy tourists paying up to $20,000 to hunt and $100,000 for a polar bear pelt.
GLOBAL WARMING, SEA ICE AND THE ARCTIC
Unfortunately, hunting is only the polar bear’s second greatest threat. In 2006 the International Union for Conservation in Nature (IUCN) upgraded the polar bear from a species of least concern to a vulnerable species, citing climate change as the main reason; followed in 2008 by the U.S. Department of the Interior listing it as a threatened species under the Endangered Species Act. In fact, the polar bear has unwittingly become the poster child of climate change, and not by chance.
Over the past 25 years the average temperature of the atmosphere and oceans has risen, causing the summer to lengthen and sea ice cover to decline. These changes gravely affect polar bears because of their dependence upon sea ice for food and habitat; and having evolved to become perfectly adapted to the Arctic Circle and Polar Basin, without this habitat they cannot survive. Based on current predictions, by 2050 only those polar bears living in the Canadian Arctic Islands and along the northern coast of Greenland, where sea ice is expected to remain, will be alive. This accounts for less than a third of the current population and, as sea ice retreats at a rate faster than previously predicted, even this could be optimistic.
INDUSTRIAL DEVELOPMENT AND OIL EXPLORATION
Regrettably, additional pressure is put on the ecological and environmental health of these delicate areas by industrial development. In Canada, Russia and the U.S., onshore oil exploration and extraction have taken place for many years, and the threat they pose is only increasing as companies push north, further into the polar bear’s habitat. The sensitive dens set up for pregnant females and those with cubs are easily disturbed, which can cause the mother to abandon her young, while an oil spill could affect the polar bear’s entire food supply. In addition, any oil ingested could cause fatal kidney failure; and if their fur came into contact with oil it would no longer provide insulation, increasing the chance of death from hypothermia.
In 1973 the International Agreement on the Conservation of Polar Bears was signed by Canada, Denmark, Norway, the U.S. and the former U.S.S.R. to manage preservation of the polar bear’s habitat and population, as well as conduct vital research. These countries agreed to ban hunting from aircraft and icebreakers, as well as place restrictions on commercial and recreational hunting, permitting only that by indigenous people using traditional methods. With other threats emerging, however, a lot more must be done.
The agreement holds these five countries accountable for taking appropriate action to protect the polar bear, but in reducing our own carbon footprint we can help to preserve the arctic marine ecosystem.
Minimise consumption: reduce, reuse and recycle; and when purchasing new products ensure they are both sustainable and made by an environmentally-friendly company. Furthermore, buy and cook only what food you’ll eat, and use only what water you need.
Be energy efficient: use energy-efficient appliances, bulbs and electronics, turning them off when not in use. If possible use conventional methods, i.e. washing dishes by hand, and use green power from utility suppliers that don’t exploit oil and gas reserves.
Please remember that species can only be conserved as part of their entire ecosystem, so do not support zoos. With any luck the above image should show how unnatural these environments are.
Photo credit: Jennifer Mairéad
FOR IMMEDIATE RELEASE
CONTACT: Mike Wendy
Esbin Hopeful That Implementation of National Broadband Plan Will Follow Proper Authority, Process

WASHINGTON D.C. — Today, the FCC released its National Broadband Plan. The Plan seeks to increase the availability of broadband services across America. The following statement may be attributed to Barbara Esbin, Senior Fellow & Director of the Center for Communications and Competition Policy at The Progress & Freedom Foundation:

Much of what we heard this morning about the Federal Communications Commission's National Broadband Plan was very encouraging. The Plan is bold in reach and outlook, and sets an agenda for further research and discussion, as well as targeted actions on the part of the FCC and other agencies of government. Perhaps of equal importance, it recognizes that "Today's broadband ecosystem is vibrant and healthy in many ways," suggesting that there is no need to fix that which is not broken with respect to our regulatory framework for the broadband Internet.
I am particularly pleased to read that the many contentious issues concerning the legal framework for the FCC's implementation of the Plan, including the ill-conceived request that the FCC simply re-classify all broadband Internet access services as Title II telecommunications services so that they may be more pervasively regulated, are not explicitly included among the recommendations contained in the Plan itself. Rather, the Plan commits only that the knotty legal and jurisdictional questions raised by these requests will be subject to further consideration by the FCC as it moves forward to implement those portions of the Plan that fall within its subject matter jurisdiction over wire and radio communications. This is how it should be.
Only the full Commission can vote proposed plans and policy goals into enforceable rules — through notice-and-comment rule making proceedings — and only where the proposals lie squarely within the authority delegated to the agency by Congress. Following correct procedures and recognizing the FCC's jurisdictional limitations is critical for the prompt achievement of the many important goals outlined in the broadband blueprint that the FCC has put forth. It is a hopeful omen that the Plan's authors have exercised a degree of regulatory humility with respect to the scope of the FCC's regulatory authority. Following the correct process will ensure that the FCC's efforts can be well-balanced, sustainable and effective.
Esbin is available for further comment. Please contact Mike Wendy mwendy@pff.org.
The Progress & Freedom Foundation is a market-oriented think tank that studies the digital revolution and its implications for public policy. It is a 501(c)(3) research & educational organization.
O2 Secure Wireless, Inc. to Launch Testing of New State of the Art Wireless Broadband Equipment in Palm Coast Florida
from O2 Secure Wireless, Inc. ST. AUGUSTINE, Fla., Aug. 27, 2012 /PRNewswire/ -- O2 Secure Wireless, Inc. (Pink Sheets: OTOW) announces that the Company is set to begin testing of new cutting edge wireless broadband equipment beginning in September, 2012.
The Company has recently completed a fiber backhaul installation at the corporate retail location in Palm Coast and will be installing new revolutionary advanced technology broadband equipment for testing.
The testing period is scheduled to conclude 60 days following installation. O2 Secure Wireless will be thoroughly testing the total coverage area and signal strength for indoor and outdoor coverage, as well as the different antennas and amplifiers. The Company will be testing many different configurations of the equipment. Once the equipment meets or exceeds the manufacturers' technical specifications, the Company will approve its purchase and begin building out the Palm Coast market area.

O2 Secure Wireless has a Service Network Agreement for the populated territory encompassing the city of Palm Coast, Florida, representing a market potential in the range of 55,000 households. The Company has been engaging in community events to build brand recognition and raise awareness. Together with FCC approval of its "COALS" applications, the recent announcement with BigHeadTV, and the execution of national resale agreements, these efforts give the Company a distinct advantage in the marketplace. The "triple play" packages are designed to be priced competitively, with median revenue per client estimated at $90 to $100 per month.
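For scale, the market and pricing figures above imply the following potential revenue. The penetration rates in this sketch are purely hypothetical assumptions for illustration, not Company projections.

# Back-of-envelope revenue potential for the Palm Coast territory.
households = 55_000              # market potential cited above
arpu_low, arpu_high = 90, 100    # estimated median revenue per client, US$/month

for penetration in (0.01, 0.05, 0.10):   # hypothetical take-up rates
    subs = households * penetration
    low = subs * arpu_low * 12           # annual revenue, low ARPU
    high = subs * arpu_high * 12         # annual revenue, high ARPU
    print(f"{penetration:.0%} penetration: {subs:,.0f} subscribers, "
          f"US${low:,.0f}-{high:,.0f} per year")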
"We are very excited to have reached this landmark in our strategy to build a presence in the Palm Coast territory. These tests will allow us to have 'real' coverage data, original coverage models indicated 12 sites were needed; we refined that down to 7 sites, but once the evaluation is complete we will be able to determine the precise coverage requirement in order to deliver the best quality product to the consumer. We anticipate that we will be able to announce many new developments as events begin to unfold and the project goes into full swing," stated Val Kazia, President, O2 Secure Wireless, Inc. About O2 Secure Wireless: O2 Secure Wireless is a Company that is currently developing numerous wireless tower facilities in the U.S. The Company is also instrumental in the development of wireless broadband communication services domestically. Under a recent merger with Earthcom Service Inc., the Company is currently being structured to provide affordable flat rate pre-paid wireless services in developing countries internationally.
Safe Harbor Act: This release may contain "forward-looking statements" within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E the Securities Exchange Act of 1934, as amended and such forward-looking statements are made pursuant to the safe harbor provisions of the Private Securities Litigation Reform Act of 1995. "Forward-looking statements" describe future expectations, plans, results, or strategies and are generally preceded by words such as "may," "future," "plan" or "planned," "will" or "should," "expected," "anticipates," "draft," "eventually" or "projected." You are cautioned that such statements are subject to a multitude of risks and uncertainties that could cause future circumstances, events, or results to differ materially from those projected in the forward-looking statements, including the risks that actual results may differ materially from those projected in the forward-looking statements as a result of various factors, and other risks identified in a company's annual report.
For more information visit our website at http://www.o2securewireless.com or contact Investor Relations: Gibraltan Financial (407)830-9777 SOURCE O2 Secure Wireless, Inc.
Rev'd Dr. Nevil Maskelyne
Britain's fifth Astronomer Royal, the Rev'd Dr. Nevil Maskelyne (1732-1811), was a central figure in the planning of the 1769 British ToV expeditions. He advised on the appropriate instruments, developed observing protocols, helped select the appropriate personnel for the observing teams, received and corrected reports of observations, and transmitted data to others. He appears to have done so with good humour and efficiency, for the most part. His importance for the realization of the Canadian ToV data cannot be overemphasised.
Before he was appointed Astronomer Royal, he himself was sent to observe the 1761 ToV from the Island of St. Helena in the South Atlantic. The single largest expenditure of the two astronomers manning that station was for alcohol!
Image courtesy of Specula astronomica minima (©Specula astronomica minima).
SMU’s David Blackwell Touts Nationwide Geothermal Energy Potential At Capitol Hill Science Briefing
SMU geothermal expert David Blackwell called unconventional geothermal techniques a potential game changer for US energy policy in a Capitol Hill briefing sponsored by the NSF, Discover Magazine, IEEE and ASME
SMU Geothermal energy expert David Blackwell gave a Capitol Hill briefing Tuesday, March 27, on the growing opportunities for geothermal energy production in the United States, calling "unconventional" geothermal techniques a potential game changer for U.S. energy policy.
Blackwell's presentation outlined the variety of techniques available for geothermal production of electricity, the accessibility of unconventional geothermal resources across vast portions of the United States and the opportunities for synergy with the oil and gas industry. Also speaking at the briefing were Karl Gawell, executive director of the geothermal energy association, and James Faulds, professor at the University of Nevada-Reno and director of the Nevada Bureau of Mines and Geology.
"This is a crucial time to do this briefing," said Blackwell, W. B. Hamilton Professor of Geophysics in SMU's Dedman College of Humanities and Sciences and one of the nation's foremost experts in geothermal mapping. "Everybody is worrying about energy right now."
The session was one in a series of continuing Congressional briefings on the science and technology needed to achieve the nation's energy goals, titled collectively, "The Road to the New Energy Economy." The briefing was organized by the National Science Foundation, DISCOVER Magazine, the Institute of Electrical and Electronics Engineers (IEEE) and the American Society of Mechanical Engineers (ASME). Senate Majority Leader Harry Reid of Nevada was honorary host for the March 27 briefing at the Senate Visitor's Center, which included congressional staffers, members of science and engineering associations, government, private and industry representatives.
SMU's geothermal energy research is at the forefront of the movement to expand geothermal energy production in the United States. Blackwell and Maria Richards, the SMU Geothermal Lab coordinator, released research in October that documents significant geothermal resources across the United States capable of producing more than three million megawatts of green power – 10 times the installed capacity of coal power plants today. Sophisticated mapping produced from the research, viewable via Google Earth at http://www.google.org/egs/, demonstrates that vast reserves of this green, renewable source of power generated from the Earth's heat are realistically accessible using current technology.
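The headline figure is easier to grasp with the units converted; the quick check below uses only the numbers quoted in the release.

# Convert the mapped geothermal potential and compare with coal.
geothermal_mw = 3_000_000                 # >3 million MW cited above
geothermal_gw = geothermal_mw / 1000.0    # 3,000 GW
implied_coal_gw = geothermal_gw / 10.0    # "10 times installed coal capacity"
print(f"Geothermal potential: {geothermal_gw:,.0f} GW")
print(f"Implied US coal capacity: ~{implied_coal_gw:,.0f} GW")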
Blackwell began his presentation by debunking the common misperception that geothermal energy is always dependent on hot fluids near the surface — as in the Geysers Field in California. New techniques are now available to produce electricity at much lower temperatures than occur in a geyser field, he said, and in areas without naturally occurring fluids. For example, enhanced geothermal energy systems (EGS) rely on injecting fluids to be heated by the earth into subsurface formations, sometimes created by hydraulic fracturing, or "fracking."
Blackwell noted the potential for synergy between geothermal energy production and the oil and gas industry, explaining that an area previously "fracked" for oil and gas production (creating an underground reservoir) is primed for the heating of fluids for geothermal energy production once the oil and gas plays out.
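As a rough illustration of how an EGS reservoir turns heat into electricity, the sketch below applies the standard sensible-heat relation (power = flow rate x specific heat x temperature gain) with hypothetical flow and temperature values; none of these numbers come from the briefing.

# Illustrative EGS output estimate: heat extracted by circulating water,
# then converted to electricity at a low-temperature plant efficiency.
flow_kg_s = 50.0        # hypothetical production flow rate, kg/s
delta_t = 100.0         # hypothetical temperature gain in the reservoir, K
c_p = 4186.0            # specific heat of water, J/(kg*K)
eta = 0.10              # assumed thermal-to-electric conversion efficiency

thermal_mw = flow_kg_s * c_p * delta_t / 1e6
electric_mw = eta * thermal_mw
print(f"Thermal power: {thermal_mw:.1f} MW")     # ~20.9 MW
print(f"Electric power: ~{electric_mw:.1f} MW")  # ~2.1 MW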
The SMU geothermal energy expert called these "unconventional" geothermal techniques a potential game changer for U.S. energy policy. Geothermal energy is a constant (baseload) source of power that does not change with weather conditions, unlike solar- and wind-powered energy sources. Blackwell noted that SMU's mapping shows that unconventional geothermal resources "are almost everywhere."
Blackwell closed his presentation with acknowledgment that site-specific studies and more demonstration projects are needed to make geothermal energy a strong partner in the new energy economy.
The briefing was taped and will be posted to the Science 360 website hosted by the National Science Foundation at a later date.
July 22, 2007: Extinct In Two Years?
Recently a Chinese dolphin species, the baiji or Yangtze River Dolphin, was declared "functionally extinct" – that is, scientists can't prove they're all dead, but they couldn't find a single one. It was wiped out by habitat destruction, accidental entanglement in nets, and deliberate illegal fishing. A North American porpoise is about to follow it.
The vaquita is the world's smallest porpoise. It is found only in the northern Gulf of California. Estimates of the surviving population range from 100 to 300. The vaquita, or Gulf of California Porpoise, is not a deliberate target of fishermen, but several dozen are killed every year by entanglement in fishing nets. The Mexican government, which controls its waters, has issued regulations prohibiting gill net use, but does little to enforce them. As a result, fishing, and destruction of porpoises, continues. Researchers extrapolating from reported vaquita deaths estimated that 78 vaquita die each year from gill nets – more than one every week. At that rate, the species probably has about two years.
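The "about two years" figure follows directly from the numbers above; the simple extrapolation below ignores births and other mortality, so it is only a rough bound.

# Crude time-to-extinction extrapolation from bycatch mortality alone.
deaths_per_year = 78
for population in (100, 150, 300):     # surviving-population estimates above
    years = population / deaths_per_year
    print(f"Population {population}: ~{years:.1f} years at current bycatch")
# 100 -> ~1.3 years; 150 -> ~1.9 years; 300 -> ~3.8 years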
According to an article in the July-August 2007 issue of Natural History Magazine, $25 million would make it possible to eliminate all vaquita bycatch, by buying out fishermen or re-equipping them with fishing equipment that doesn't kill dolphins. U.S. and Mexican economists and marine researchers are working on such a program, but they don't have the funding to implement it. See www.vaquita.org for more about their group - or to contribute.
The authors of the Natural History article point out that $25 million would be a drop in the bucket to a corporate sponsor. Is there a corporation out there willing to direct a donation toward actually saving a species? Or to redirect part of their ad budget toward doing some good that would be worth advertising?
There's also a vaquita article at the Cetacean Society Incorporated website. As it points out: for the species to survive, one thing has to happen. The deadly nets, already illegal, have to come out of the water now. That would require the government of Mexico to enforce its regulations. As I write this, I do not know what the most effective way might be for citizens of Mexico to address this issue with their government, or for citizens of other countries to address this issue in a way that will reach the government of Mexico. – Steve Jackson