The Burrowing Owls of the Salton Sea
by Floris van Breugel January 15, 2010
There she stood, poking her head out of the burrow, keeping an eye on me. After a few minutes she crept slowly out and sidled up to her mate to get a better look. Owls are curious creatures—particularly Burrowing Owls, which make their homes in small tunnels in the dry open areas of western North America as well as in Florida, Cuba, and a number of nearby islands. As we stared at each other, we seemed to connect in a way that is difficult with any other kind of bird. Like humans, owls have both eyes in front of their faces, so it seemed like I was photographing two miniature, feathered people. Their behaviors were equally varied, and full of character. The female crouched low to the ground, blending in with the dry grasses and peering at me with her huge yellow eyes, while her mate hopped around on his long skinny legs.
The Burrowing Owl’s scientific name, Athene cunicularia, refers to the Greek deity Athena, goddess of wisdom, who was attended by an owl. Unlike most other owls, this species will hunt throughout the day. Most of their foraging takes place in the early mornings and late evenings, but they will catch anything from grubs to small birds while hanging around outside their burrows during the day. Over time, in addition to wisdom, the goddess Athena also came to symbolize war and the liberal arts, and if any creature can bring all of these characteristics together, it is the owl. Although the diminutive Burrowing Owl might seem to have a rather goofy demeanor compared with larger owls, such as Great Horned and Great Gray owls, it has all the ferocity and stealth of a skilled hunter.
As the sun went down, the pair of owls suddenly took off in complete silence— a feat made possible by their specially evolved feathers. Their primaries are serrated like a comb on the leading edge, which helps break up the flow of air, and the tattered trailing edges of their wings, along with their soft body feathers, further muffle the sound. The fluffy feathers that make them such effective predators also contribute to their cute appearance.
The second part of their name, cunicularia, comes from the Latin word cunicularis, meaning “pertaining to the rabbit.” Although I’m sure a Burrowing Owl would love to eat a tiny young bunny for dinner, the birds probably received the name because they reside in tunnels, like rabbits. Burrowing Owls often nest in colonies, using multiple burrows to increase the chances of their families’ survival in case a predator tries to eat the young. They generally use holes dug by small mammals or tortoises, although they sometimes shape their own holes to some extent. They are very picky when it comes to selecting a nesting location, preferring soft sand and dirt in relatively open areas for good visibility, with an entrance above ground level to avoid flooding.
The pair I was photographing, along with several others, was nesting on the fringes of agricultural fields alongside Southern California’s famed Salton Sea, about 80 miles east of San Diego and 30 miles north of the Mexican border. With the expansion of residential and commercial developments over the past few years, the number of burrowing sites and the amount of hunting habitat have been decreasing significantly, but the State of California, local farmers, and the nearby Sonny Bono National Wildlife Refuge have been helping the owls by installing artificial plastic burrows. These makeshift homes have been quite successful in such unlikely places as golf courses, airport grasslands, and agricultural fields. The Burrowing Owls make their homes alongside the cement agricultural irrigation ditches there, sometimes no more than three feet from the road. The Salton Sea is now one of the most reliable places to see the tiny owls.
The area is much more than Burrowing Owl habitat; it’s a hotspot for migrating waterfowl along the Pacific Flyway. At first glance, you wouldn’t expect it to be such a great place for birds, with all the agricultural fields, power plants, and hunters. But the area supports one of the most diverse collections of birdlife in the United States, including a number of endangered species.
The story of the Salton Sea’s formation as a wildlife hotspot is ironic. Thousands of years ago, the Gulf of California extended all the way into the Imperial and Coachella valleys, but a natural dam formed, cutting off these valleys from the rest of the gulf. With the high evaporation rate of this desert landscape, the water soon disappeared, leaving behind a salty desert basin. In the early 1900s the Colorado River was diverted to provide water to the Salton Sea basin for agricultural development. Then, in 1905, a structural failure allowed the water to flow freely into the valley for two years. Since then, the water level has been maintained by runoff from the surrounding mountains and countless farms bordering the lake.
As the water from the Colorado River was diverted, the natural marshlands shrank, and birds began favoring the new marshes appearing at the edges of the young Salton Sea. As farming grew, the vast body of water expanded from the runoff, taking over those marshes and leaving precious little space for the birds. The lake now occupies about 380 square miles, and varies in depth to a maximum of 51 feet. Levees have been installed to prevent further encroachment on the precious marshland between the sea and farms.
The Salton Sea gradually increased in size and salinity, and several marine species were introduced from the Gulf of California. Over the years, the salinity has further increased, and it is now 25 percent saltier than the Pacific Ocean. This has started to threaten the fish species, an important food source for the migrating birds. So what once turned the arid landscape into a haven for thousands of migrating waterfowl is now threatening to destroy that delicate ecosystem. Unless something changes drastically soon, scientists predict that by 2015 all the sport fish will have died off, and many species of birds that nest there will leave shortly after.
While the state is still brainstorming for ideas, some people have started working to make a difference. One such person is Debi Livesay, a former journalist who has been working with the Torres Martinez Tribe to create a wetland for the birds. They started with seven ponds, with a range of depths and up to 20 acres in size, and now the area supports a plentiful collection of plants and fish and numerous species of birds. This is a complicated ecosystem, and it is difficult to predict what will ultimately happen, but this experiment has so far been wildly successful, and she plans to expand the artificial wetlands.
All of these developments directly affect migrating and resident shorebirds, but the entire ecosystem is interconnected. Increased pollution and salinity in the lake will slowly kill off various species in the food chain, and the effects may eventually trickle down to the Burrowing Owls. So far, however, the owls seem to be thriving, largely thanks to efforts on their behalf by the state and by local farmers. The owls prefer to nest under the edges of cement irrigation troughs that line the agricultural fields. The farmers need to keep these areas clear of vegetation to keep the troughs from clogging up, and the owls like these open spaces. Before removing weeds or doing other maintenance work, researchers clearly mark the areas with active burrows and then avoid them to minimize disturbance to the birds.
But nearby commercial and residential development has permanently disrupted many areas with potential burrowing habitat for the birds. To minimize this loss, new state regulations require that every time a burrowing area is destroyed, new artificial burrows must be installed and several acres of foraging space must be set aside. To coax the birds to move into the new areas, artificial burrows are set up before breeding season, and then the birds’ existing burrows are closed off with one-way trap doors. This way the owls can leave but no longer enter their old burrows. Then, over the course of a single night, an entire colony of owls moves to the artificial burrows.
These artificial burrows are simple boxes or wide tubes dug into the ground, with the entrance sticking up in a small mound. They work very well, and many pairs of owls have moved into them. In addition to the artificial burrows installed to replace burrows being destroyed, the wildlife refuge and local farmers have been putting in extra artificial burrows, which is not surprising. These owls benefit farmers, eating many rodents and insects that might otherwise damage crops.
To determine how well these programs are working and to get a better idea of how the owl populations are changing, the Imperial County Water District has been conducting a two-year survey that will finish soon. A statewide survey a few years ago showed a slight decline in Burrowing Owl populations, which corresponds to the trend many photographers had noticed in the Salton Sea area. It is possible, though, that the birds’ population is simply in a natural state of flux. And with all the attention they are getting, many people hope their population will remain stable or even increase.
Photo captions (all photos by Floris van Breugel):
- Although they're small and cute, Burrowing Owls are fierce hunters of rodents and insects.
- In California's Salton Sea, Burrowing Owls live in an agricultural landscape.
- A pair of Burrowing Owls stares down the photographer from behind some leafy cover.
- On agricultural lands, Burrowing Owls often take to culverts instead of earthen burrows.
- The Black-necked Stilt is another common bird of the Salton Sea.
- A Burrowing Owl stands at attention at its burrow.
It is interesting to note that this entire ecosystem was artificial from the beginning. What started out as a desert was accidentally turned into a thriving wildlife sanctuary. Now the area hangs in a delicate balance, both ecologically and politically. These days conservation is a game of compromises, and the Salton Sea area is a balance of desert and oasis, farmland and open spaces, and hunting grounds and wildlife sanctuaries. Ensuring peaceful coexistence among all these interests can be difficult. To keep the geese away from agricultural fields, for example, farmers plant crops in the wildlife refuge during the off season to give the birds something to eat. In adjacent fields, hunters are allowed to shoot waterfowl, and fees from their hunting licenses and waterfowl stamps help fund the wildlife refuge itself.
The geothermal power plants operated by the CalEnergy Company, scattered throughout the southern part of the Salton Sea, are another example of ecological and economic balance. They provide environmentally sound renewable energy but are another source of habitat destruction and pollution (though far less so than coal-fired power plants).
Despite the unnatural and in many ways depressing landscape, if you get a chance to visit the Salton Sea, I highly recommend it—especially in winter, when countless migratory birds arrive to wait out the cold northern winter. My favorites, the Burrowing Owls, are best seen from December through April. They nest from March through April, and can often be seen posing outside their burrows. During the summer it may be harder to spot them, and the intense desert heat is unpleasant during the day.
As my visit wound down, an incredible sunset unfolded overhead. Bright pink light illuminated a landscape that supported the Burrowing Owls, hunters returning from their day’s shoot, countless waterfowl making their home on the lake, and the geothermal power plant churning its turbines to provide renewable power—all surrounded by the agricultural fields growing our food. The moment revealed the importance, and the difficulty, of maintaining a balance among competing interests.
In the fading light, I photographed the owls one last time—huddled together in the low vegetation. They were certainly a happy couple, and I wish them the best of luck on their hunts through the farm fields. Maybe I’ll find their kids next year, posing outside of their own burrows.
Floris van Breugel is an award-winning nature photographer based in California. See more of his work at his website.
Contacts: Karen McNulty Walsh, (631) 344-8350, or Peter Genzer, (631) 344-3174
The following news release was issued by Virginia Commonwealth University. X-ray diffraction data for this study were collected and analyzed at the National Synchrotron Light Source at the U.S. Department of Energy's Brookhaven National Laboratory. Wayne Hendrickson of Brookhaven's Photon Sciences Directorate and Columbia University and Qun Liu of Photon Sciences are co-authors on the paper.
Researchers Gain Insight into Key Protein Linked to Development of Cancers and Neurodegenerative Disorders
Virginia Commonwealth University researchers studying Hsp70, a key molecular player responsible for protein homeostasis, have uncovered how it binds ATP, the molecule responsible for intracellular energy transfer, to enhance its overall activity and efficiency, details that have not previously been well understood.
Heat shock proteins, particularly the 70-kilodalton heat shock proteins (Hsp70), are important for cellular processes such as protein folding and protecting cells from stress. Hsp70 is also involved in protein assembly, degradation, and transport. Imbalances in protein homeostasis have previously been found to contribute to the onset of neurodegenerative diseases and cancers.
In the study, published this week in the Online First section of Nature Structural & Molecular Biology, a publication of the Nature journal family, researchers conducted a biochemical analysis of the structure to learn how ATP binding allosterically opens the polypeptide-binding site. In order for Hsp70 to do its job of regulating its binding to unfolded polypeptide substrates, it gains energy from the process of ATP hydrolysis. ATP is a molecule responsible for intracellular energy transfer.
The team found that when Hsp70 binds ATP it promotes the allosteric opening of the polypeptide binding site.
"Due to their essential roles in protein trafficking and proper folding since mis-folded proteins can disrupt cell function, Hsp70s are inextricably linked to the development of cancers, aging and neurodegenerative disorders," said Qinglian Liu, Ph.D., assistant professor in the Department of Physiology and Biophysics in the VCU School of Medicine.
"Understanding the structural properties at the atomic level and molecular working of Hsp70s will pave the foundation for designing efficient and potent small molecule drugs to specifically modulate the function of Hsp70s. The small molecule drugs may become novel and efficient treatments for cancers or neurodegenerative disorders," she said.
Liu said that the team's structural and biochemical analysis revealed how Hsp70s use ATP to open their peptide substrate binding site and thus regulate their ability in binding peptide substrates. "These findings help us understand at the atomic level how Hsp70s function in maintaining the well-being of cellular proteins, such as folding, assembly, transport and degradation," said Liu.
According to Liu, future work will move the team in two directions. First, based on this published work, they aim to design specific and potent modulators for Hsp70s and test their potential in treating cancers or neurodegenerative disorders. A second focus will be to study how Hsp70s cooperate with their Hsp40 partners to achieve their optimum activity in maintaining protein homeostasis.
VCU collaborated with Brookhaven National Laboratory, the Department of Biochemistry and Molecular Biophysics at Columbia University in New York. All the work was conducted at VCU, with the exception of the X-ray diffraction data collection and analysis, which was done at Brookhaven's National Synchrotron Light Source.
This work was supported by startup funds from the VCU School of Medicine, a New Scholar Award in Aging from the Ellison Medical Foundation, grant number AG-NS-0587-09, and a Grant-In-Aid Award from the American Heart Association, grant number 11GRNT7460003.
The study is titled "Allosteric opening of the polypeptide-binding site when an Hsp70 binds ATP."
‘Will We Become God?’ Asks Science Channel and Morgan Freeman
by Kate O'Hare on July 7, 2014
On Wednesday, July 9 (10 p.m. ET/PT, 9 p.m. Central), Science Channel’s “Through the Wormhole” series, with host Morgan Freeman, asks the question, “Will We Become God?”
The description for the episode reads:
Humanity’s potential seems limitless. But could we become as powerful as God? Scientific breakthroughs grant our species seemingly divine abilities. Biologists tinkering with DNA are figuring out ways to grow new life forms, while neuroscientists try to create artificial consciousness. Statisticians around the world are using big data to predict the future, and computer scientists have discovered a ‘God algorithm’ that could solve any global problem in an instant.
But to truly become God, we not only have to be all-knowing, but all-being. Quantum physicists are figuring out how to teleport matter at the speed of light!
What, does this mean humans will be able to create a rock so big even they can’t lift it?
I can’t wait.
What should Catholics do when presented with yet another science program that can’t resist asking questions about the nature of God?
First, be flattered. As Oscar Wilde (a convert to Catholicism before he died), observed, “There is only one thing in the world worse than being talked about, and that is not being talked about.” If they’re talking about God, they’re thinking about God. And thinking about God can lead to all sorts of other things, including believing in God (in God and the Holy Spirit’s own good time, of course).
Second, the answer is obviously “NO,” so we can look at questions like this from our own perspective, without fearing the conclusion. We realize humans can’t become God, but we also know they persist in trying, and we’ve got a pretty clear idea what it looks like when they do. So, view with a critical eye, comment (politely but confidently) on Facebook and Twitter, leave a message on the show’s Website.
While you’re at it, drop a line to host Morgan Freeman on Twitter. He hasn’t posted anything in a while, but he did play God in “Bruce Almighty” and its sequel, “Evan Almighty,” and he has discussed the subject of God in a recent interview for The Daily Beast, so he must be used to the topic.
(But don’t necessarily expect him to agree with the Catholic view of deity, since he’s quoted in the piece as saying, “The highest power is the human mind. That’s where God came from, and my belief in God is my belief in myself,” and “I think that God and the Devil are one. They’re not one in the same, but they’re in the same body, and it depends on which one of them surfaces.”)
Third, realize that the notion that faith stands in opposition to science and reason, however misguided that is from our point of view, is dogma for a lot of non-religious types. And, for some mean-spirited or unscrupulous folks, it’s bait to get less sophisticated Christians to spout ridicule-ready things that can then be plastered on Facebook and Twitter memes.
But we Catholics have a huge body of scholarship and thought on the subjects of faith, reason and science, along with some great apologetics sites and modern authors that cut to the chase, so Google before you speak or write.
When in doubt, one can always start with Aquinas (I just heard him quoted in “Madam Secretary,” a CBS drama about a non-Hillaryesque Secretary of State, more on that at a later date).
A good place to start is this Weekly General Audience address by Pope Emeritus Benedict XVI, from June of 2010, on “St. Thomas Aquinas: The Harmony of Faith and Reason,” as reported in the National Catholic Register.
He says, “In short, Thomas Aquinas demonstrated that a natural harmony exists between Christian faith and reason. This was Thomas’ great achievement. In that moment of a clash between two cultures — a moment in which it seemed that faith would have to capitulate to reason — Thomas demonstrated that the two go together: what seemed to be reason incompatible with faith was not reason, and what seemed to be faith was not faith insofar as it was opposed to true rationality. Thus, he created a new synthesis, which shaped culture throughout the following centuries.”
And if you need a short course on Aquinas, here’s one, just 50 pages long, courtesy of scholar Dr. Taylor Marshall, in a downloadable PDF.
The views expressed here are those of the author, and do not necessarily represent the views of CatholicVote.org.
A native of the Adirondacks and Saratoga Springs in northern New York State, journalist and fiction writer Kate O'Hare now lives in Los Angeles, where she's on a never-ending quest to find a parish in the L.A. Archdiocese with orthodox preaching, excellent traditional music and parking.
2016-40/3982/en_head.json.gz/6753 | General FEA Software Mesh Refinement High-Performance Computing (HPC) Finite Element Method Electrical The Joule Heating Effect Induction Heating Microwave Heating Piezoelectric Effect Piezoresistive Effect Electromechanical Effects Mechanical Aeroacoustics The Joule-Thomson Effect Thermal Expansion and Thermal Stresses Acoustic-Structure Interaction Material Fatigue Fluid Navier-Stokes Equations Boussinesq Approximation Nonisothermal Flow Fluid-Structure Interaction Poroelasticity Squeezed and Sliding Films The Marangoni Effect Chemical What Is Mass Transfer? What Is Ionic Migration? What Is Convection? What Is Diffusion? Convection-Diffusion Equation Diffusion Coefficient Diffusion Equation Transport Phenomena Poroelasticity
Fluid Poroelasticity Defining Poroelasticity
Poroelasticity is the term used to describe the interaction between fluid flow and solid deformation within a porous medium. As their name indicates, porous materials are solid structures that contain pores or voids. This type of material is typically associated with natural objects, such as rocks and soils, as well as biological tissues, foams, ceramics, and paper products.
When an external load is applied to a porous medium, the volume fraction of the pores is affected. The fluid-filled pores experience a change in pressure under this mechanical stress, which, in turn, leads to fluid motion. As a reaction to this change in pore volume, the solid material shifts and deforms elastically.
Figure (Biot poroelasticity): from left to right, plots show sediment displacement after two years, five years, and ten years.
Modeling Fluid Flow in Porous Materials
Modeling poroelasticity requires the coupling of two laws. The first of these is Darcy's law, which describes the relation between fluid motion and pressure within a porous medium. According to this law, the fluid velocity is directly proportional to the pressure difference over a given distance and to the permeability of the porous material, and inversely proportional to the fluid's viscosity. The second describes the structural displacement of the porous matrix. Together, this coupled physics is known as Biot poroelasticity.
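In symbols, one common way to write the coupling just described is the following. This is a general sketch: symbols and sign conventions vary between references, and it is not taken verbatim from any particular software's documentation.

```latex
% Darcy's law: flux q driven by the pore-pressure gradient,
% scaled by permeability kappa and fluid viscosity mu.
\mathbf{q} = -\frac{\kappa}{\mu}\,\nabla p

% Fluid mass balance: S is a storage coefficient, Q_m a source term,
% alpha the Biot-Willis coefficient, eps_vol the volumetric strain.
S\,\frac{\partial p}{\partial t} + \nabla\cdot\mathbf{q}
  = Q_m - \alpha\,\frac{\partial \varepsilon_{\mathrm{vol}}}{\partial t}

% Stress balance: total stress = elastic stress minus pore pressure.
\boldsymbol{\sigma} = \boldsymbol{\sigma}'(\mathbf{u}) - \alpha\,p\,\mathbf{I}
```

The last two equations carry the two-way coupling: pore pressure loads the solid, and the solid's volumetric strain rate acts as a source or sink for the fluid, which is exactly the subsidence mechanism described in the reservoir example below.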
Real-World Application
Take an oil reservoir, for instance. Pore pressure decreases as fluid is pumped out and the reduction in pore pressure generates fluid movement. This reduces the in-situ stress, which leads to a gradual deformation in the overburden above the reservoir, causing layers to cave in or sink. This process is known as subsidence. As the images show, the deformation in porous materials progresses and becomes more pronounced over time.
Tech Exec's Family Inspires a Senior-Care Business
Gwen Moran
A few years ago, Kian Saneii watched his parents, both in their 70s, help manage the care of his nonagenarian grandmother. Seeing his father call her several times a day to check on her and remind her to take her medications, he thought, There has to be a better way.
He knew his family was facing an increasingly common dynamic. The graying of the U.S. is accelerating: Washington, D.C., research organization the Population Reference Bureau estimates that the country's 65-and-older population will grow from its current 40 million to 89 million by 2050. As the population ages, there will be more caregivers looking to manage seniors' needs while attempting to help them maintain some independence for as long as possible.
After nearly a year of research, in 2009 Saneii launched Independa, a technology firm that helps seniors (and their caregivers).
Independa's telecommunications platform works on telephones, computers and tablets. The basic service uses any telephone to deliver reminders about taking medications and attending appointments. The Angela platform focuses on connection; an individual can get a telephone reminder about a family member's birthday with a prompt to press a button to connect.
The Independa platforms allow people who know nothing about technology to use a touchscreen device or TV to connect to the internet, video chat or Facebook, reducing social isolation and possible incidences of depression. The platform also has games and puzzles for entertainment.
Before launching Independa, Saneii held key management roles at two Southern California tech companies, Websense and IPNet Solutions. His reputation in the local business community aided the firm, says Robert Holmen, managing director of Miramar Venture Partners in Corona del Mar, Calif. When Saneii began to look for seed funding, Holmen was immediately interested, both because of Saneii's business acumen and because medical IT is the focus of some of Miramar's investments. Independa closed a $1.6 million seed round of funding in September 2011, led by Miramar and City Hill Ventures, a San Diego venture capital firm. The investment team has authorized Independa to raise an additional $600,000 in seed funding thanks to ongoing interest from investors. In 2012, Independa will work on a $5 million Series A funding round.
Independa is using the funding for product development and to foster potential partnerships. Saneii is particularly enthusiastic about efforts to further develop the company's Artemis platform, which monitors measurements important for managing chronic disease, such as blood pressure and weight. The company is working on integrating this platform with wireless health devices and home sensors, such as blood pressure monitors, glucometers and scales. Once integrated, important health information will be reported to caretakers, insurance companies and healthcare providers. The service will also provide a communication method for seniors in distress to call for help.
"We don't think 'I've fallen and I can't get up' is the first line of defense anymore," Saneii says. "We know we can do a much better job as a society and use technology to help everyone's comfort and decrease the costs by keeping people independent longer."
Corrections & Amplifications: An earlier headline on this article incorrectly identified Independa as a franchise. It is a ventured-backed technology company.
Public Release: 10-May-2011
An enigmatic problem in marine ecology uncovered
Reef fishes and many other marine species live all their adulthood in one place but early in their lives, when they're eggs and larvae, spend a short period of time drifting and swimming in the open ocean. It seems intuitive that the duration of this open water period should determine the geographic extent over which species are found, as species that spend longer drifting at sea are likely to reach greater distances. Interestingly enough, numerous studies have consistently failed to find any relationship between the duration of the open water period and the geographic coverage of marine species. A new research paper has now solved this mystery.
"One of the most puzzling results in the study of reef fishes and other marine organisms that dwell sea-floor habitats as adults but drift in open water early in their lives is why their geographic coverage bears no relationship with the duration of the open water period," explains co-author Dr. Camilo Mora, post-doctoral fellow in the Department of Biological Science at Dalhousie University. "Since this idea was first proposed over 30 years ago, we've been scratching our heads trying to resolve this mystery by evaluating the relationship multiple times in different groups of species and regions. Yet we consistently we failed to find a noteworthy relationship."
In this new study, the team of researchers, which included marine ecologists, geneticists and ocean current modelers, first evaluated the possibility that the relationship between geographic extent and the duration of the open water period was compounded by the evolutionary age of species, whose effect has not been considered in previous studies. The rationale was that the age of a species should add to the geographic coverage of species as older species have had more time to expand geographically compared to younger species.
To evaluate this idea, the authors compiled the largest set of data yet assembled on evolutionary ages of reef fish species, and the duration of their open water periods and geographic extents. The analysis of this data showed, however, that even after taking evolutionary age into account there was still no relationship between geographic extent and the duration of the open water period.
"We expected that the effect of species ages could be the missing piece to resolve this puzzle," says co-author Dr. Denis Roy, post-doctoral associate in the Department of Biology/Marine Gene Probe Laboratory at Dalhousie University. "So we were a bit disappointed to find that neither the age of the species nor the duration of the open water period or both combined played an important role on the geographic extent of reef fishes."
"This result, about the limited effect of species ages, deepened our intrigue," says co-author Kate Crosby from Dalhousie University. "The only other thing we could think of was that perhaps reef habitats were so highly connected by ocean currents that species could reach all suitable habitats regardless of their open water period or time since they originated as new species."
To test this idea, the team took on the challenge of modeling the paths of fish larvae during the open water period over the world's tropical reefs. The authors used state-of-the-art models of ocean currents and compiled a worldwide set of data on marine habitats where reef fish dwell. Simulated larvae were released from all possible habitats and allowed to drift for times equal to the duration of their open water periods. The simulation required 600 computer processors running continuously for six months. The results revealed that the majority of reef habitats worldwide are so interconnected that species can quickly spread their geographic distribution pushed by ocean currents. This lack of constraints to the geographic expansion of species provides one of the first explanations for why geographic extent bears no relationship with the duration of the open water period.
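To give a feel for the kind of Lagrangian drift calculation described here, below is a deliberately minimal sketch in Python. Everything in it is hypothetical: the analytic current field, the release site, and the particle counts stand in for the ocean-model output and worldwide habitat data the authors actually used.

```python
import numpy as np

def current(x, y):
    """Hypothetical steady 2-D current field (m/s): a zonal jet plus a
    weak meridional component. Real studies advect particles through
    gridded ocean-model output instead of an analytic field."""
    u = 0.3 + 0.1 * np.sin(2 * np.pi * y / 500e3)   # eastward
    v = 0.05 * np.cos(2 * np.pi * x / 500e3)        # northward
    return u, v

def drift(start_xy, pld_days, dt_hours=6.0):
    """Advect one 'larva' with forward-Euler steps for its pelagic
    larval duration (PLD)."""
    x, y = start_xy
    dt = dt_hours * 3600.0
    for _ in range(int(pld_days * 24 / dt_hours)):
        u, v = current(x, y)
        x, y = x + u * dt, y + v * dt
    return x, y

rng = np.random.default_rng(0)
# Release 1,000 larvae from a single reef near the origin.
starts = rng.normal(0.0, 5e3, size=(1000, 2))
for pld in (10, 30, 60):  # open-water durations in days
    ends = np.array([drift(s, pld) for s in starts])
    dist = np.hypot(ends[:, 0], ends[:, 1]).mean() / 1e3
    print(f"PLD {pld:>2} d: mean displacement ~ {dist:.0f} km")
```

Even this toy version shows the qualitative point: longer open water periods carry larvae farther, but once typical displacements exceed the spacing between suitable habitats, extra drift time stops mattering to geographic extent.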
"An underlying assumption of the expected relationship between geographic extent and the duration of the open water period is that reef habitats are positioned in a gradient of isolation, which species can bridge only depending on how long the spend drifting in the open ocean," says co-author Dr. Eric Treml, post-doctoral fellow in the School of Biological Sciences at University of Queensland, Australia. "Our simulations of what happens during the open water period suggest that that assumption is just not valid. Given ocean currents, fish larvae can go almost anywhere."
"This is like having a 100 metre race between a car and a bike and giving them one hour to finish; the task is so easy that both vehicles will reach the finish line independent of their speeds," says co-author Jason Roberts at Duke University, U.S.A. "As for reef fishes, ocean currents provide such fast freeways that species can easily reach suitable reefs independent of the time they spend drifting in the open water."
"We've been able to provide new insight into why an intuitively important factor played no role in shaping the geographic extent of reef fish species," says co-author Derek Tittensor at Microsoft Research in Cambridge, U.K. "Given our results, however, a question that still needs to be answered is why all reef fish species are not found everywhere."
Dr. Camilo Mora
cmora@dal.ca
01-157-313-776-2282
http://www.dal.ca
Journal: Ecography
An Ocean of Knowledge
Thursday, 29 September 2011, 0:00 CEST. By Peter Kaiser, IAEA Division of Public Information. Staff of the IAEA's Environment Laboratories in Monaco contributed to this report.
Photo caption: The Laboratory began its work by studying radioactive substances in the marine environment and their effects on marine life. (Photo: IAEA)
On 10 March 1961, the IAEA concluded with the Principality of Monaco and the Oceanographic Institute, then directed by Jacques Cousteau, an agreement on a research project on the effects of radioactivity in the sea. The opening of the IAEA Marine Environment Laboratories in Monaco that same year marked the start of a new era for research into the marine environment.
Fifty years later, that cooperation has expanded significantly through collaboration between the Laboratories, now known as the Environment Laboratories, and international and regional organizations, as well as national laboratories.
During the past fifty years, the Principality of Monaco has been an active partner in the Laboratories' development. His Serene Highness Prince Albert takes a strong personal interest in the Laboratories' work.
Unique data derived from the application of nuclear and isotopic techniques improve scientists' knowledge of the seas and oceans and help to assess pollution. These studies support the sustainable development of the ocean. The research is buttressed by strategic partnerships with other UN agencies such as the Intergovernmental Oceanographic Commission, which also celebrates its 50th anniversary this year, as well as the United Nations Environment Programme, the United Nations Development Programme, the United Nations Educational, Scientific and Cultural Organization and the International Maritime Organization.
Many Member States' national laboratories rely upon the Laboratories' accurate analyses of sea water, sediment and marine life samples. Reference materials and methods produced by the laboratories have helped to improve the quality and reliability of analytical data in Member State laboratories for the past 50 years.
For instance, the Radiometrics Laboratory uses radionuclides as environmental tracers, in collaboration with leading research centres around the world, to quantify ocean circulation, the transport of pollutants in coastal ecosystems, sedimentation and submarine groundwater discharge. IAEA scientists have participated in collaborative research and assessment work on sites impacted by pollution throughout the world.
The Radioecology Laboratory studies the impacts of contaminants on seafood safety and the effect of climate change and ocean acidification on marine organisms, as well as the ocean's ability to sequester CO2. The Laboratories have been in the front line of the battle against harmful algal blooms, which are highly toxic to fish, shellfish and other marine life. They also represent a threat to human health and endanger the livelihoods of fishermen in almost every coastal region around the world. The Environment Laboratories have successfully promoted the use of a nuclear-based technique using receptor-binding assay for early detection and monitoring.
The Laboratories provided the essential scientific and analytical support for a landmark study of radioactive and non-radioactive pollutant levels in all principal seas. They have undertaken worldwide radioactivity baseline studies of the Atlantic, North and South Pacific, Indian, Arctic and Antarctic Oceans and the Far Eastern, Mediterranean, and Black Seas. Regional studies have been conducted in the Gulf, the Irish, Kara and Caspian Seas, New Caledonia and the Mururoa and Fangataufa Atolls.
The Marine Environmental Studies Laboratory focuses mainly on non-radioactive pollutants such as pesticides, polychlorinated biphenyls (PCBs), petroleum hydrocarbons, polycyclic aromatic hydrocarbons (PAHs), and antifouling paint booster biocides, but recently it has also dealt with radioactive contaminants. In cooperation with regional laboratories, the Laboratory provides training and implements marine monitoring programmes, while acting as the analytical support centre for regional organizations protecting marine environments.
In time of need, the laboratories can respond quickly to support Member States. After the accident at the Fukushima Daiichi nuclear power plant in Japan in March 2011, the Environment Laboratories worked actively with Japan, and with the IAEA Incident and Emergency Centre in Vienna, making staff available and analysing samples collected by IAEA radiation monitoring teams.
Currently, more than 80 Technical Cooperation projects are led by the laboratories, including a marine study on the impact of the Fukushima Daiichi radioactive releases in the Pacific Region.
Severe declines in Everglades mammals linked to invasive pythons, researchers find
New research links precipitous declines in formerly common mammals in Everglades National Park to the presence of invasive Burmese pythons.
John “J.D.” Willson holds a young Burmese python captured in Everglades National Park.
Credit: Image courtesy of Michael Dorcas
Collaborative research, led by Michael Dorcas of Davidson College and John "J.D." Willson of Virginia Tech's College of Natural Resources and Environment, has linked precipitous declines in formerly common mammals in Everglades National Park to the presence of invasive Burmese pythons.
The study, published on Jan. 30, 2012, in the Proceedings of the National Academy of Sciences, is the first to document the ecological impacts of this invasive species, and it strongly supports the conclusion that animal communities in the 1.5-million-acre park have been markedly altered within 11 years of the pythons' establishment. Mid-sized mammals are the most dramatically affected.
"Our research adds to the increasing evidence that predators, whether native or exotic, exert major influence on the structure of animal communities," said Willson. "The effects of declining mammal populations on the overall Everglades ecosystem, which extends well beyond the national park boundaries, are likely profound, but are probably complex and difficult to predict."
Willson is a post-doctoral researcher in the Wildlife Ecotoxicology and Physiological Ecology Program in the Department of Fish and Wildlife Conservation at Virginia Tech and is a co-author of the book "Invasive Pythons in the United States."
"Dr. Willson's recent work on pythons provides significant insights into the important roles that reptiles can play in community structure and ecosystem processes," said Associate Professor Bill Hopkins, who directs the ecotoxicology program. "Understanding how introduced predators like pythons influence community structure will ultimately prove critical to conserving important ecological systems like the Everglades."
The most severe declines, including a nearly complete disappearance of raccoons, rabbits, and opossums, have occurred in the remote southernmost regions of the park, where pythons have been established the longest. In this area, populations of raccoons dropped 99.3 percent, opossums 98.9 percent, and bobcats 87.5 percent. Marsh and cottontail rabbits, as well as foxes, were not seen at all.
"Pythons are wreaking havoc on one of America's most beautiful, treasured, and naturally bountiful ecosystems," said U.S. Geological Survey Director Marcia McNutt. "Right now, the only hope to halt further python invasion into new areas is swift, decisive, and deliberate human action."
The researchers collected their information via repeated systematic nighttime road surveys within Everglades National Park, counting both live and road-killed animals. Researchers traveled a total of nearly 39,000 miles from 2003 to 2011 and compared their findings with similar surveys conducted along the same roadways in 1996 and 1997 before pythons were recognized as established in the park.
The study's authors noted that the timing and geographic patterns of the documented mammal declines are consistent with the timing and geographic spread of pythons.
The authors also conducted surveys in ecologically similar areas north of the park where pythons have not yet been discovered. In those areas, mammal abundances were similar to those in the park before pythons proliferated. At sites where pythons have only recently been documented, however, mammal populations were reduced, though not to the dramatic extent observed within the park where pythons are well established.
"The magnitude of these declines underscores the apparent incredible density of pythons in Everglades National Park and justifies the argument for more intensive investigation into their ecological effects, as well as the development of effective control methods," said lead author Michael Dorcas, a professor in the Department of Biology at Davidson College in North Carolina, who co-authored "Invasive Pythons in the United States" with Willson. "Such severe declines in easily seen mammals bode poorly for the many species of conservation concern that are more difficult to sample but that may also be vulnerable to python predation."
The mammals that have declined most significantly have been regularly found in the stomachs of Burmese pythons removed from Everglades National Park and elsewhere in Florida. The authors noted that raccoons and opossums often forage for food near the water's edge, a habitat frequented by pythons in search of prey.
The authors suggested that one reason for such dramatic declines in such a short time is that these prey species are "naive" since such large snakes have not existed in the eastern United States for millions of years. Burmese pythons over 16 feet long have been found in the Everglades. In addition, some of the declining species could be both victims of being eaten by pythons and of having to compete with pythons for food.
"It took 30 years for the brown tree snake to be implicated in the nearly complete disappearance of mammals and birds on Guam; it has apparently taken only 11 years since pythons were recognized as being established in the Everglades for researchers to implicate pythons in the same kind of severe mammal declines," said Robert Reed, a U.S. Geological Survey scientist and a co-author of the paper. "It is possible that other mammal species, including at-risk ones, have declined as well because of python predation, but at this time, the status of those species is unknown."
The scientists noted that in their native range in Asia, pythons have been documented to consume leopards. Consequently, even large animals, including top predators, are susceptible to python predation. For example, pythons in the Everglades have been documented consuming alligators and full-grown deer. Likewise, the authors state that birds, including highly secretive birds such as rails, make up about a fourth of the diet of Everglades pythons, and declines in these species could be occurring without managers realizing it.
The authors found little support for alternative explanations for the mammal declines, such as disease or changes in habitat structure or water management regimes.
"This severe decline in mammals is of significant concern to the overall health of the park's large and complex ecosystem," said Everglades National Park Superintendent Dan Kimball. "We will continue to enhance our efforts to control and manage the non-native python and to better understand the impacts on the park."
"No incidents involving visitor safety and pythons have occurred in the park," Kimball continued. "Encounters with pythons are very rare; that said, visitors should be vigilant and report all python sightings to park rangers."
The U.S. Fish and Wildlife Service published a rule in the Federal Register on Jan. 23, 2012, that will ban the importation and interstate transportation of four non-native constrictor snakes (Burmese python, yellow anaconda, and northern and southern African pythons) that threaten the Everglades and other sensitive ecosystems. These snakes are being listed as injurious species under the Lacey Act. In addition, the U.S Fish and Wildlife Service will continue to consider listing as injurious five other species of nonnative snakes (reticulated python, boa constrictor, DeSchauensee's anaconda, green anaconda, and Beni anaconda).
The authors of the research paper, "Severe mammal declines coincide with proliferation of invasive Burmese pythons in Everglades National Park," are Michael E. Dorcas, Davidson College; John D. Willson, Virginia Tech; Robert N. Reed, U.S. Geological Survey; Ray W. Snow, National Park Service; Michael R. Rochford, University of Florida; Melissa A. Miller, Auburn University; Walter E. Meshaka Jr., State Museum of Pennsylvania; Paul T. Andreadis, Denison University; Frank J. Mazzotti, University of Florida; Christina M. Romagosa, Auburn University; and Kristen M. Hart, U.S. Geological Survey.
Materials provided by Virginia Tech. Note: Content may be edited for style and length.
M. E. Dorcas, J. D. Willson, R. N. Reed, R. W. Snow, M. R. Rochford, M. A. Miller, W. E. Meshaka, P. T. Andreadis, F. J. Mazzotti, C. M. Romagosa, K. M. Hart. Severe mammal declines coincide with proliferation of invasive Burmese pythons in Everglades National Park. Proceedings of the National Academy of Sciences, 2012; DOI: 10.1073/pnas.1115226109
Virginia Tech. "Severe declines in Everglades mammals linked to invasive pythons, researchers find." ScienceDaily. ScienceDaily, 31 January 2012. <www.sciencedaily.com/releases/2012/01/120131135205.htm>.
Virginia Tech. (2012, January 31). Severe declines in Everglades mammals linked to invasive pythons, researchers find. ScienceDaily. Retrieved October 1, 2016 from www.sciencedaily.com/releases/2012/01/120131135205.htm
Virginia Tech. "Severe declines in Everglades mammals linked to invasive pythons, researchers find." ScienceDaily. www.sciencedaily.com/releases/2012/01/120131135205.htm (accessed October 1, 2016).
Decline in amphibian populations
Seed predation
Severe Python Damage to Florida's Native Everglades Animals Documented in New Study
Jan. 30, 2012 Precipitous declines in formerly common mammals in Everglades National Park in Florida have been linked to the presence of invasive Burmese pythons, according to new research. The study, the first to ... read more Related Stories
Burmese Python Habitat Use Patterns May Help Control Efforts
Apr. 28, 2015 The largest and longest Burmese python tracking study of its kind -- here or in its native range -- is providing researchers and resource managers new information that may help target control efforts ... read more Burmese Pythons Pose Little Risk to People in Everglades, Study Suggests
Feb. 28, 2014 The estimated tens of thousands of Burmese pythons now populating the Everglades present a low risk to people in the park, according to a new study. The human risk assessment looked at five incidents ... read more New Threat to Birds Posed by Invasive Pythons in Florida
Apr. 5, 2012 Scientists have uncovered a new threat posed by invasive Burmese pythons in Florida and the Everglades: The snakes are not only eating the area's birds, but also the birds' eggs straight ... read more Salt Water Alone Unlikely to Halt Burmese Python Invasion
Jan. 4, 2012 Invasive Burmese python hatchlings from the Florida Everglades can withstand exposure to salt water long enough to potentially expand their range through ocean and estuarine ... read more Strange & Offbeat | 科技 |
Toll-Free Numbers not free, so says FCC
Got this data from the FCC, hope you enjoy:

Reference: TSR12, ID# 02360492

The following is the information you requested from the FCC Consumer Center. Thank you for your inquiry.

CONSUMER INFORMATION
Federal Communications Commission
445 12th Street, S.W.
Washington, DC 20554

CALLS MADE FROM PAYPHONES

The Communications Act requires the FCC to take actions to promote competition among payphone service providers and the widespread deployment of payphone services to the benefit of the general public. The Act also requires the FCC to ensure fair compensation to payphone service providers for each and every call placed from payphones. A payphone service provider is the person or entity who owns the payphone instrument, such as the local telephone company, an independent company, or the owner of the premises where the payphone is located. Payphone service providers are called "PSPs" in this brochure. This brochure explains the actions the FCC has taken to carry out its responsibilities.

Are The Coin Rates For Local Calls From Payphones Regulated?

No. Effective October 7, 1997, the FCC deregulated coin rates for all local calls made from payphones. Prior to 1996, most payphones were provided by local telephone companies and received indirect subsidies through the rates paid by consumers for other types of services. States regulated the coin rate for a local call. The resulting artificially low prices tended to discourage new companies from entering the payphone market and also limited the number of payphones available for the benefit of the public. In 1996, Congress required that payphones no longer be subsidized in order to encourage competition and the greater availability of payphones. The FCC determined that deregulating local coin rates and allowing the marketplace to set the price of local payphone calls is one of the essential steps needed to achieve the goals set by Congress.

Deregulation will allow PSPs to receive fair compensation for their services and will encourage the widespread placement of payphones. Also, the FCC anticipates that Americans will have greater access to emergency and public safety services. States may also choose to place public interest payphones in areas where payphones are necessary for health and safety reasons. The Commission intends to actively monitor the payphone marketplace by regularly meeting with representatives from the states, PSPs, and consumer advocates.

Must I Pay For An Emergency Call?

No. Calls made to emergency numbers, such as 911, and to the Telecommunications Relay Service, a service of use to people with disabilities, will be provided free of charge from payphones. You can also continue to reach an operator without depositing a coin.

Can I Still Make Toll-Free Calls From Payphones Without Depositing A Coin?

Yes. However, the Communications Act requires the FCC to establish a per-call compensation plan to ensure that all PSPs are fairly compensated for each and every completed intrastate and interstate call using their payphone -- except for emergency calls and telecommunications relay service calls for hearing-disabled individuals.
Prior to 1996, PSPs often received no compensation for completed intrastate and interstate calls -- including completed toll-free calls -- no matter how frequently callers used payphones to originate calls. The FCC carried out its responsibilities by adopting rules that require long distance telephone companies to compensate PSPs 24 cents (originally 28.4 cents) for each call they receive from payphones, except those calls for which the PSPs already collect compensation under a contractual arrangement. Payphone-originated calls that are unlikely to be the subject of a contract with the PSPs include calls to 800 telephone numbers or 10XXX access code calls which connect callers to long distance telephone companies.

The 24 cents per-call compensation rate is a default rate that can be reduced or increased at any time through an agreement between the long distance company and the PSP. The FCC encouraged long distance companies and PSPs to contract with each other for more economically efficient compensation rates.

Some long distance companies are advising consumers that the FCC decided that consumers making calls from payphones should pay a per-call charge to compensate the PSP. The FCC did not make such a decision. Long distance companies have significant leeway on how to compensate PSPs. The FCC left it to each long distance company to determine how it will recover the cost of compensating PSPs.

Tips For Consumers

Companies compete for your payphone business. Use your buying power wisely and shop around. If you think that the rate for placing a call from a payphone is too high, a less expensive payphone could be around the corner. Also let the PSP know that the rates are too high. It's in their best interest to meet the needs of their customers.

Contact your preferred long distance company and ask for instructions for placing calls through that company from a payphone. Also ask what rates or charges apply to calls placed from payphones. Let the company know if you believe their rates are too high. Then call other long distance companies and ask about their rates.

INFORMAL COMPLAINTS MAY BE SENT TO:
Federal Communications Commission
Consumer Information Bureau
Common Carrier Complaints
445 12th Street, S.W.
Washington, D.C. 20554

ALL CONSUMER COMPLAINTS MUST BE IN WRITING. No telephone complaints can be processed by the Bureau.

This is an unofficial announcement of Commission action. Release of the full text of a Commission order constitutes official action. See MCI v. FCC, 515 F 2d 385 (D.C. Cir. 1974).

Report No. CC 99-2 -- COMMON CARRIER ACTION -- January 28, 1999

COMMISSION RESOLVES PAYPHONE COMPENSATION ISSUES ON REMAND (CC Docket 96-128)

The Federal Communications Commission today adopted an order addressing payphone compensation issues remanded by the United States Court of Appeals for the District of Columbia Circuit (D.C. Circuit). In today's order, the Commission resolved compensation issues for so-called "dial-around" calls, which allow a consumer to use a long distance carrier other than the payphone's presubscribed carrier. Dial-around calls include long distance access code calls, such as those using familiar 10-10-NXX codes, as well as calls to toll free numbers. Today's order sets a rate of $.24 per call that long distance companies must pay to owners of payphones for delivery of these calls.
Prior to the 1996 Act, payphone owners received little or no compensation for these calls even though they were required by other provisions of the Act to allow consumers to access these services. In the Telecommunications Act of 1996, Congress directed the Commission to establish rules that benefit the public by promoting the widespread deployment of payphones in a competitive marketplace. While the Commission in prior orders largely has achieved the goals of the 1996 Act, the D.C. Circuit remanded the Commission's rules governing one payphone issue. The 1996 Act specifically requires the Commission to establish a per-call compensation plan to provide fair compensation for all calls made from payphones. The D.C. Circuit ordered the Commission to provide a better explanation of its per-call compensation plan.

In today's Order, the Commission reaffirmed that payphone compensation issues are best addressed in the marketplace by negotiations between long distance companies and payphone owners. In the absence of such a negotiated rate, however, the Commission established a default rate of $.24 per call. Although this rate represents a reduction from the $.284 rate remanded by the D.C. Circuit, the Commission explained that this reduction resulted primarily from the use of more up-to-date information concerning the costs of providing payphone service. The Commission emphasized that its decision would ensure that payphone owners are fairly compensated while satisfying Congress's mandate to promote the widespread deployment of payphone services to the benefit of the public.

In previous orders, the Commission calculated the cost of providing payphone service using a "top-down" method that subtracted certain costs from the prevalent price of a local payphone call. On appeal, the D.C. Circuit questioned this approach and found that the Commission had not adequately explained its reasoning in this area. In this order, the Commission decided to use a "bottom-up" method, in which the costs of providing payphone service are added together to calculate a fair compensation amount.

In previous Orders, the Commission already has taken steps to increase competition in the payphone industry. In those orders, for example, the Commission eliminated implicit subsidies that local telephone companies historically provided to their own payphones because these subsidies gave local telephone companies an unfair advantage over other payphone providers. The Commission also established non-structural safeguards to prevent Bell Operating Companies from discriminating in favor of their own payphones in the provision of local service, as well as other measures designed to place all providers of payphone services on an equal competitive footing. The Commission also deregulated the local coin rate for payphone calls to allow the competitive marketplace to determine the cost for such calls.

Action by the Commission January 28, 1999, by Third Report and Order and Order on Reconsideration of Second Report and Order (FCC 99-7). Chairman Kennard, Commissioners Ness, Furchtgott-Roth, Powell, and Tristani.

-FCC-

Common Carrier Bureau contact: Glenn Reynolds at (202) 418-0960.
News media contact: Emily Hoffnar at (202) 418-0253. TTY: (202) 418-0485.
Re: Toll-Free Numbers not free, so says FCC
SephiroX
****! SOO MUCH TO READ!
seph, stop with the ****in useless posts.
As far as the calls not being free, they still remain so to the consumer, regardless of what the LD companies want you to believe. I can understand the argument; PSPs get no compensation when you use their equipment to dial an 800 or 101XXXX number. But this assumes that by using the phone you're somehow "wearing it down" and need to pay for that. This, of course, is all theoretical, considering the LD companies have no problem servicing an 800# or 101XXXX call placed from your home. With a payphone, the LD companies and PSPs receive money for regular calls that are placed; why they assume they deserve extra for free calls (which the opposite side pays for, I might add) is beyond me. Apparently they believe that they have a right to make as much money as possible, regardless of how much they charge the consumer.

In response to the article's notion that "a cheaper payphone might be right around the corner," let me remind you that "around the corner" is probably another calling area. If you've been to NYC lately, you have COCOTs, and you have Verizon... choose. The same with most other parts of the country; you rarely see a different type of payphone right around the corner, at least not when one company owns virtually all of them.

Bottom line, free calls are free calls. In any other business it's up to the company to meet the demands of the consumer, NOT the other way around.
Well Rane, this really isn't much of an article; this was from the FCC, word for word. Now, true, phone companies should be meeting the demands of the customers. Do they have to? No. They know that competition is fairly limited in many respects, and convenience matters. I mean, if you have to make a call using a 1-800 number and you're out somewhere, you are going to have to use a payphone, and if they feel the urge to charge you, they will. "Toll-free" calls have to be paid for. If you use one of those phone cards you buy at the store, you get charged 25¢ to make the call to use it, and they make that perfectly clear. And why, you might ask? Well, if it's an AT&T phone and the card is from PHONEMIN+ (fictional company), who is the one making money? Phonemin+, and AT&T might not see a dime from that call (unless AT&T and Phonemin+ made an agreement), so they feel they should have the right to charge you because you are using their equipment.

Companies aren't meant to serve the public good; companies are all about $$. Payphones break down, and where will they get the money to fix them if they aren't making money off 1-800 calls? They would raise rates across the board, so everyone else gets screwed.
Oh believe me, I'm well aware of their logic. My only question is why we need to put new charges into place for something that doesn't benefit the customer any more than usual. If I make a call to an 800#, the person I'm calling fronts the bill (which is why so many companies can get screwed by wardialers that do nothing but call their 800# repeatedly). Granted, if the person buys the number from MCI and you use AT&T to make the call, AT&T isn't getting money... but do they ever? If I call that 800# from home, should I pay extra to cover their switch/line maintenance? Now they're telling me (or at least in that FCC article/memo/thing) I should have to pay from a payphone? Where's the sense in that? The call IS being paid for, albeit by the called party. I get no further benefits, and therefore see no reason to pay more.

The notion of using an equal access provider or phone card is a bit more clear: you must pay to use the phone company's equipment. But with regular calls already costing 50+ cents in most areas (so high because their customers have moved to cell phones... they neglect to mention that they too are most likely in the cell phone market and have not seen any loss; their profits have only moved to a new location), I can't help but wonder why charges still must be paid for these services. Of course, you're absolutely right, a business is designed to make money. My argument isn't with you Defcon, it's with the way these companies go about obtaining that money.
Yeah, I can understand your frustration with how they get the money. It's a weird law, but hey, it's what was felt to be in the best interest for everyone.

I do have something to say about a point you brought up:

Quote: "If I call that 800# from home, should I pay extra to cover their switch/line maintenance?"

From what I know, switch and line maintenance is already covered by the money you pay every month for local service, so that's taken care of. Now, when you use a payfone it's different, because you wouldn't be paying the fone company anything to use that 1-800 number.
Personally, I like to think they get enough money to cover that by charging people 4 dollars to call the next state.
Yeah, but you gotta deal with the **** wanna-be phreakers tearing open every line they see and abusing the telco; it's safe to say that we're paying for everything the dumbasses do.
hey, i resent that and i prefer to beige from a can, thank you very much :x
lmao at gizmo. Hey, does anyone know how old a payphone should be for the tone generators to work to get free calls? Because some of the new ones don't work with it.
Neo, the payphone has nothing to do with it at all. It's the line, switch, and switching system used to service the phone.
Partner Announcements
Windows to the Universe Facebook Group
Thoughts on This American Life: Kid Politicsby Roberta
Some of you may have recently heard the rebroadcast of the This American Life: Kid Politics interview I had with Ira Glass and Erin Gustafson last January, regarding climate change. I have some thoughts about that interview - the discussion itself, the outcome, and thoughts thereafter - that may be valuable to some of you, so I thought I'd share them.

Some background: Ira met Erin at a Glenn Beck rally in the fall of 2010 in D.C. She seemed like a lovely young lady (14 at the time), well-spoken and intelligent, and in the course of their conversation, it became apparent that Erin thought that global warming was a hoax, and that the scientists involved in promoting that view were all on the make. Some time later, Ira and his staff came up with the idea for a show to look at the question of asking young people to make adult decisions, and they thought climate change might be an example of that. The hour-long show includes two other parts. For my part of the interview, I was located here in Boulder, Ira was in New York, and Erin was in a studio in rural Virginia. We spoke together for 1.5 hours total, and only something like 10 minutes of that was spliced together, with narration (of course) to put together a story. Although I must say I didn't really realize what I was getting into when I blithely said "yes" when asked if I would be willing to talk with a 14-year-old climate skeptic (not realizing it was for radio), I am very happy that I did the interview. It was a very good experience, I think I learned a lot from it, and it has led to interesting post-interview contacts, discussions, and extended thinking about it on my part. I will share some of those thoughts with you below.

Before I do, though, I want to comment on one perspective that seems to be driven by the way the interview was presented in the 10-minute segment. It appears, in the shortened version broadcast, that Erin did not change her mind as a result of the conversation. That is not actually what happened. When we started the interview, Erin stated that she did not believe that the climate was changing, and that furthermore, she felt that the scientists promoting that view were making it up after receiving research money, i.e., they weren't credible. 1.5 hours later, her position had changed to "well, maybe climate is changing, but I'm not sure why" and "I want to learn about both sides". From my perspective, that felt like a significant achievement - and maybe all that could be expected from a 1.5-hour conversation. Also, Ira started off the interview with the statement, "Dr.
Johnson, this is your chance to try to convince Erin that climate change is real." My response to both was something like: "Hold on, let me be clear, it is not the responsibility of a science teacher to 'convince' a student of anything - their job is to prepare students with an understanding of science concepts and process skills, so that they can use these to analyze observations and make science-based conclusions using this toolset." Throughout the interview, I repeatedly mentioned that this is not about belief, but about observations and science, and I was happy that one of those statements made it into the broadcast.

Now, onto some post-interview thoughts I've had. I really enjoyed talking with Erin - she seemed like a lovely person - and she must be brave to take on such a project. She is clearly a good student, too, and has a loving family. Several of the points she made in the interview showed that she doesn't understand some climate change concepts and probably some important aspects of Earth science in general - but she has made the effort to do independent research. At one point in the interview, she said she wanted to see data from "both sides". I replied that there is an enormous amount of data that I'd be happy to point her to (which I did) online, and that she could look at the data herself - that the data were from authoritative web sites from places like NOAA, NASA, and NSF (and of course, I mentioned http://www.skepticalscience.com and Windows to the Universe). I also replied that there really isn't anything like a comparable amount of data on the "other" side, but that there are a lot of other websites, which are more focused on opinions, that include those views. It was clear, though, that for her, NOAA, NASA, and NSF are not authoritative, that she views them with suspicion.

This experience brought into clear focus the importance of "frames", and also really got me thinking about trust. As a scientist myself, and living in a world of scientists, I know that the lion's share of those working in this area are good, honorable, honest, and hardworking people - looking for what the scientific evidence is telling them, and not slanting that to try to get research funding. The very large majority of these folks would never think of making stuff up, not only because it is clearly wrong to do so, but because they would get caught through review, and their careers would be ruined. I know this - but it occurred to me, after speaking to Erin, that perhaps she doesn't (although I did mention it to Erin). I don't know, but it may well be that she does not know any scientists - in fact, there may not even be any of them in her community. Or maybe there are, but perhaps they do not share much about their work in their community.
That then led to thoughts about the importance of scientists being engaged in their communities. We all know how scientists are being encouraged to be involved with education and outreach, and there has been a similar focus on getting scientists prepared to talk with the media and to public groups. My point is a little different. My sense is that, perhaps out of frustration or understandable exhaustion, many scientists have tired of sharing what they do with their neighbors and in their social groups. Perhaps they think that people won't understand, or that they won't be interested. But if we don't share what we do, in a way that is understandable and interesting, how will our neighbors, friends, and communities learn about our science from people they trust? I fear that, by being reticent to share our science, we may have inadvertently set ourselves up to be easily classed as "the other" - someone that exists outside of the frame of "regular people" - someone too easy to not trust.
Let me briefly mention the importance of thinking carefully about and building on common values when you are reaching out with this content to different groups. Also, recognize the value of having a trusted third party in the conversation (Ira Glass in this case), and the value of patience and respect in our discourse. In fact, that's probably what made this interview so enjoyable.
Finally, you might be interested in a blog post that appeared shortly after the interview - http://www.sindark.com/2011/01/20/roberta-johnson-and-erin-gustafson/ - the discussion there (which I weighed in on a couple of times) is pretty interesting, and sometimes scary.
Hope all is going well for you in your new semester!

Celebrate Black History Month!
by Jennifer
February is Black History Month. Celebrate these important people and their culture in your science classroom by taking time to do the Earth Scientist Project with your students. This is a research, writing and presentation activity where students learn about scientists. It's also a great activity to use in encouraging teamwork. Here are some scientists you might want to focus on to celebrate Black History Month:
Evan B. Forde is an oceanographer at the National Oceanic and Atmospheric Administration's Atlantic Oceanographic and Meteorological Laboratory in Florida. He has been an oceanographer since 1973, and was the first black oceanographer to participate in research dives aboard the submersibles ALVIN, JOHNSON SEA LINK, and NEKTON GAMMA. His current research is aimed at understanding how hurricanes form and intensify, and he is also working extensively in science education.
Wangari Maathai is a Kenyan environmental activist, and the founder of the Green Belt Movement, an organization that promotes environmental conservation and community development. In Kenya, the Green Belt Movement works to organize poor rural women and promote the planting of new trees to fight deforestation and stop soil erosion. In 1971, Dr. Maathai became the first woman in East and Central Africa to earn a PhD, and for her efforts to protect the environment and the poor of Africa, she was awarded the 2004 Nobel Peace Prize.
Warren Washington is a scientist at the National Center for Atmospheric Research, where he is currently the head of the Climate Change Research Section. He has been a climate scientist for nearly 50 years, and has served as a key advisor to many different government agencies. From 2002-2006, Dr. Washington served as the Chairman of the National Science Board, which helps to oversee the National Science Foundation and advises the President and Congress on scientific matters. He has won many awards and honors over the course of his career, and is a nationally recognized expert on climate change.
Matthew Henson - Polar Explorer
by Jennifer
Matthew Henson was a polar explorer during the late 19th and early 20th centuries. He was one of five men (the other four were Inuit) who accompanied the famous American polar explorer Robert E. Peary in 1909 on the final stage of an expedition in which Peary (controversially) claimed to be the first to reach the North Pole. Henson, an African-American, spent 20 years making journeys in the Far North, and was highly respected by the Inuit for his command of their language and his sled dog driving skills. Peary once remarked that Henson "was more of an Eskimo than some of them."
Global Map of Ocean Salinity
by Jennifer
The new Aquarius instrument, launched as part of an Earth-orbiting satellite on June 10, 2011, produced its first global map of the salinity of the ocean surface. Surface salinity is the last of the major ocean surface quantities to be measured globally from space, and it provides scientists with a new tool to explore the connections between global rainfall, ocean currents, and climate change. Aquarius is now producing continuous observations of the global oceans in unprecedented detail, including extensive low-salinity regions associated with the outflow of major rivers.
Carnival Around the World
by Julia
February 21st marks the end of Carnival, which is a celebration that occurs annually in many countries around the world, particularly those with a history and culture in which Catholicism plays a major role. Carnival occurs just before the start of Lent, and is traditionally a time in which people feast and embrace one last time the things they will be giving up during the season of Lent.
Some historians think that the first carnival celebrations predate Christianity itself, and occurred over five thousand years ago in ancient Sumer and Egypt. Some believe that the ancient Roman festivals of Saturnalia (a festival devoted to the Roman god Saturn) and Bacchanalia and other pagan celebrations of spring may have been absorbed into the Carnival.
Some of the best-known carnival traditions date back to medieval Italy, spread to the rest of Catholic Europe, and were brought to America during the Spanish conquest. The best-known carnivals, drawing hundreds of thousands of visitors, happen in Venice, Italy; Rio de Janeiro, Brazil; and New Orleans, Louisiana (Mardi Gras).
Nine of the 10 Warmest Years on Record Have Occurred Since 2000
by Jennifer
The global average surface temperature in 2011 was the ninth warmest since 1880, according to NASA scientists. The finding continues a trend in which nine of the 10 warmest years in the modern meteorological record have occurred since the year 2000. Scientists expect this trend to continue for at least a few more years, since atmospheric carbon dioxide levels are at historic highs, solar activity is on the upswing, and the next El Niño will increase tropical Pacific temperatures. To read more about how scientists combined temperature data from more than 1,000 locations around the world to analyze global temperature change, visit the project's website (http://data.giss.nasa.gov/gistemp).
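For teachers who want to give students a feel for how such an analysis works, the key idea is that scientists combine temperature anomalies (each station's departure from its own long-term average) rather than raw temperatures, so that stations in cold and warm climates can be averaged together. The short sketch below illustrates only that one idea; it is not the actual GISS code, and the station readings in it are invented for the example.

    # Illustrative only: average temperature anomalies across stations.
    # The readings below are made-up mean annual temperatures (deg C).
    stations = {
        "Station A (cold climate)": [10.1, 10.4, 11.0],
        "Station B (warm climate)": [25.3, 25.5, 26.0],
    }
    # Each station's own long-term average serves as its baseline.
    base = {name: sum(temps) / len(temps) for name, temps in stations.items()}
    for i, year in enumerate([2009, 2010, 2011]):
        anomalies = [temps[i] - base[name] for name, temps in stations.items()]
        print(year, round(sum(anomalies) / len(anomalies), 2), "deg C anomaly")

The real analysis adds careful quality control, area weighting, and a fixed 1951-1980 base period, but the anomaly-averaging step shown here is the heart of it.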
Are Polar Bears on Thin Ice as the Earth Warms?
by Jennifer
Polar bears peer through cracks in the Arctic sea ice to look for ringed seals, their favorite food, in the waters below. Almost all of a polar bear's food comes from the sea and the floating sea ice is a perfect vantage point for the bears as they look for food. However, the amount of sea ice floating in the Arctic region is shrinking each year. The ice is melting as the Earth is warming. Within the next few decades, there may be no more sea ice in the Arctic Ocean during the summer. What will this mean for the polar bears? Click here to learn more about how one of the Arctic's most well known species is responding to climate change.
GRAIL Lunar Science Mission is Underway!
by Jennifer
As part of the GRAIL mission, twin spacecraft will map the Moon's gravitational field to better understand what lies beneath the lunar surface. The GRAIL spacecraft that achieved orbit around the Moon on New Year's Eve and New Year's Day actually have new names, thanks to elementary students in Bozeman, MT. Their winning entry, Ebb and Flow, was selected as part of a nationwide school contest that began in October 2011.
To read more about this project, visit the project website (http://www.nasa.gov/grail).
Earthquake Education Resources
by Jennifer
Haiti's 7.0 magnitude earthquake occurred on January 12, 2010. It's hard to believe it's been over two years since that heartbreaking disaster. Earthquakes in New Zealand struck in February 2011, bringing devastation to Christchurch and surrounding areas. Japan is still dealing with repercussions from the earthquakes that struck March 9-11, 2011.
More recently, an earthquake with magnitude 7.3 struck off the coast of Sumatra; fortunately, it brought very little damage.
Hazards like earthquakes are a natural part of Earth's processes. Learning more about how and why they happen, especially around the time of these earthquake anniversaries, can be a helpful way to connect students with our planet. And it is a reminder that the human experience and natural sciences are, perhaps, not so far apart.
The IRIS web site has excellent resources for teachers related to earthquakes, including PowerPoint presentations intended for use with middle school, high school, or college students. You can also turn to Windows to the Universe to learn more about earthquakes, including where earthquakes occur and why they happen. And for a hands-on plate tectonics experience, try the Snack Tectonics activity with your classes. In this activity, students make tasty models of plate tectonic motions and then eat the evidence!
NESTA and Windows to the Universe Sessions at NSTA in Indianapolis
by Jennifer
We are happy to announce our events for the NSTA in Indianapolis this coming March. If you plan to be at the NSTA conference, please join us at the following sessions:
NESTA Field Trip: From Glacial Till to Minerals that Thrill! (Buy tickets online!)
NESTA Board of Directors Meeting
8:00am-noon
Westin Indianapolis, Senate 3
NESTA Half Day Field Trip: Sampling Midwest Geology (Buy tickets online!)
NESTA Geology Share-a-Thon
Westin Indianapolis, Grand Ballroom 5
NESTA Atmospheres, Oceans, and Climate Change Share-a-Thon
11:00am-noon
NESTA Earth System Science Share-a-Thon
Drama in "Near Earth" Space: The Sun, Space Weather, and Earth's Magnetic Field as We Approach Solar Maximum!
Earth and Space Science Education Today in K�12: Status and Trends at the State and National Levels
Friends of Earth and Space Science Reception
Activities Across the Earth System
Strategies for Teaching About Charged Topics in the Earth Science Classroom
NESTA Earth and Space Science Educator Luncheon: Dust in the Wind - The Geological Record of Ancient Atmospheric Circulation (Buy tickets online!)
Westin Indianapolis, State
NESTA Astronomy, Space Science, and Planetology Share-a-Thon
NESTA Rock and Mineral Raffle
NESTA Annual Membership Meeting
It's Not Too Late to Sign Up for Share-a-Thons in Indianapolis!
by Jennifer
As you can see from looking through our session listings above, the National Earth Science Teachers Association will be very involved at the national NSTA conference in Indianapolis. Join this supportive teachers' network and you can meet other NESTA members at these NSTA conferences. These NESTA members have great ideas for teaching Earth science, and their enthusiasm for the geosciences is contagious! Other membership benefits include receiving The Earth Scientist (a quarterly journal), full voting privileges, access to members-only areas of the NESTA web site, a discount on Windows to the Universe Educator Membership, and the monthly e-mail newsletter, NESTA ENews, that shares new resources, opportunities, alerts, and upcoming events.
NESTA will host four Share-a-Thon sessions at the NSTA conference in Indianapolis. Whether you are a NESTA member or not, you can present at those Share-a-Thon sessions. It's not too late to sign up to be a presenter or a volunteer!
This is a fun opportunity to share your activities at a national conference and simultaneously have a chance to meet an extended group of colleagues. For people who are first-time presenters, this is an easy way to get some experience.
Not ready to share an activity? We also need volunteers to help behind-the-scenes. We need volunteers to check in presenters, greet attendees, make packets, and just generally help out. Each volunteer gets a complete set of Share-a-Thon materials.
If you are interested in being a part of a NESTA share-a-thon this March, please email our share-a-thon coordinator, Michelle Harris (michelle.harris@apsva.us). Share-a-thon presenters will need to register for the NSTA meeting, provide a complete name and address, and return a confirmation form in order to participate in the event (confirmation forms will be sent out after you have signed up with Michelle). If you have any other questions, please email Michelle for more details.
A Photo Album of the Inuit Experience from the Early 1900's
by Roberta
A spectacular collection of images documenting the Inuit experience at the turn of the last century is available on Windows to the Universe at http://www.windows2universe.org/earth/polar/inuit_image_gallery.html.
Check out these evocative images from the Library of Congress, and explore links to related content on our website.
Make Your Field Trip More Meaningful
by Jennifer
Many teachers take their classes on field trips in late winter and spring. What a wonderful idea! As a teacher, I faced the challenge of making those outings meaningful.
Use our Snapshot Exercise to have your students write about a select moment of a given field trip. We have a simple page for elementary school students where they can write down as many words as they can think of that have to do with what they see, hear, smell and touch. For middle-high school students, we have a large list of sensory adjectives that would be helpful in writing their snapshot!
This activity makes your field trip or outing more meaningful and encourages students to communicate about science.
Birthdays in February
by Julia
February 8 - Dmitri Mendeleev (1834-1907), a Russian chemist who created the first version of the periodic table of elements.
February 11 - Thomas Edison (1847-1931), a famous American inventor who patented over a thousand inventions, including the light bulb, phonograph and a motion picture machine.
February 12 - Charles Darwin (1809-1882), English naturalist whose book On the Origin of Species laid the basis of modern evolutionary theory.
February 15 - Galileo Galilei (1564-1642), an Italian physicist, astronomer and philosopher who was called "the father of modern science" by Albert Einstein.
February 19 - Nicholas Copernicus (1473-1543), a Polish astronomer and mathematician who introduced the heliocentric model of the universe.
February 25 - Maria Kirch (1670-1720), a German astronomer who discovered the comet of 1702.
February 28 - Linus Pauling (1901-1994), an American chemist who was awarded a Nobel Prize for his work in chemistry and another one for his efforts in stopping nuclear weapons testing.
In the Mood for Some Games?
by Jennifer
Did you know that the Windows to the Universe site has many educational games? Take a needed break and play one. Or recommend them to your students.
There's the Atmosphere and Clouds Word Search. Like the rest of the site, it's available at three different levels. How about a Jigsaw Puzzle? The Climate Crossword Puzzle is sure to challenge you. Are you in the mood for some Planet Sudoku?
Our Carbon Cycle Game even allows students to travel around the carbon cycle, and find out about carbon reservoirs, greenhouse gases, climate change, and more. They will answer quiz questions on their way.
We hope you and your students enjoy these games and many more!
Click here to submit your ideas to the newsletter

Death by Lava!
A piece written by Ardis Herrold, Earth Science teacher
There's a great discussion on a Wired Science blog written by Erik Klemetti entitled The Right (and Wrong) Way to Die When You Fall Into Lava. Read the blog post by clicking on this link.
This seems to suit the humor and interest level of many of our students. Near the end of the post, Klemetti suggests a classroom demonstration using motor oil and styrofoam. I found this interesting, at least for the purposes of reviewing the concept of density! I suggest using vegetable oil instead of motor oil, because it has a similar density to motor oil, but is a safer classroom alternative.
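To connect the demonstration back to the physics, Archimedes' principle says that the fraction of a floating object sitting below the surface is simply the ratio of the object's density to the fluid's density. The quick sketch below works that out with typical, approximate density values (the exact numbers vary, so treat them as illustrative):

    # Fraction submerged for a floating object = object density / fluid density.
    # Densities are rough textbook values in kg per cubic meter.
    def fraction_submerged(rho_object, rho_fluid):
        return min(rho_object / rho_fluid, 1.0)  # 1.0 means it sinks

    print(fraction_submerged(50, 920))     # styrofoam on vegetable oil: ~5% submerged
    print(fraction_submerged(1000, 3000))  # a person on basaltic lava: ~1/3 submerged

That second line is the blog post's point: because people are only about a third as dense as lava, an unlucky victim would land on the surface rather than sink into it.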
The Myth of Nibiru and the End of the World

My students are already aware of the calamity that will allegedly strike in 2012 that will result in the end of our planet and, of course, us along with it. I have been asked by my 8th grade students on numerous occasions if this catastrophe will actually occur, and it is probable that many science teachers have been or will be asked the same dire question.
For those not aware of this rapidly spreading rumor, a rogue planet that goes by the name Nibiru was discovered by the Mayans. According to the story, Nibiru has an orbital period of 3,600 years and is fast approaching our inner Solar System and subsequently will destroy our planet as it whips around the sun in December 2012!
Educators need to be aware that the Internet is alive with hundreds of websites feeding this Bad Astronomy to our students, and that many students take it quite seriously! I try to put their fears to rest with some Good Astronomy: if the Solar System is 4.6 billion years old and all the planets formed at the same time, then Nibiru has swung around the Sun about 1.3 million times (4.6 billion years divided by a 3,600-year orbit). Surely it would have done in our planet on one of its previous visits!
There are many websites such as the Skeptical Inquirer that are working hard to debunk this nonsense. Another good website for astronomical facts is www.badastronomy.com.

Kind Regards,
Nicholas, Middle School Teacher
98 Astronomy Apps
A piece written by Ardis Herrold, Earth Science teacher
An annotated overview of 98 astronomy applications for smart phones and tablets has been published in the online journal Astronomy Education Review. Compiled by Andrew Fraknoi (Foothill College), the list features a brief description and a direct URL for each app. You can access the article free-of-charge at this link.
The listing includes a variety of apps for displaying and explaining the night sky; astronomical clocks, calculators, and calendars; sky catalogs and observing planners; planet atlases and globes; citizen science tools and image displays; a directory of astronomy clubs in the U.S.; and even a graphic simulator for making galaxies collide. A number of the apps are free, and others cost just a dollar or two.
Astronomy Education Review is an online journal about astronomy education and outreach published by the American Astronomical Society. AER recently celebrated its 10th anniversary. You can find it at http://aer.aas.org.
Announcements from Partners
Click here to submit information about your program to the newsletter
Teachers in Geosciences Program from Mississippi State University

You can earn your Master of Science degree via distance learning through the Teachers in Geosciences program from Mississippi State University. All of the core Earth science courses are taught online, and the curriculum is designed around the Earth science content that is most relevant to K-12 educators. The program concludes with an 8- to 10-day capstone field course that is taught during the summer at a variety of locations including Yellowstone/Grand Tetons, Western Washington State, the Sierra, Central Arizona, Upstate NY, Lake Superior, the Bahamas, and the Great Plains Storm Chase.
This 12-course, 36-credit-hour graduate program is designed to take as little as two years to complete and includes courses in meteorology, geology, planetary science, oceanography, hydrology, and environmental geoscience. The program has alumni in all 50 states, and all students qualify for in-state tuition rates.
Please visit our website at www.distance.msstate.edu/geosciences/TIG/index.html or contact Joy Bailey, jbailey@aoce.msstate.edu, for additional information.
The 2nd Annual National Children's Book of the Year Contest

SPECIAL ANNOUNCEMENT from the National Association of Elementary School Principals!
The 2nd annual National Children's Book of the Year Contest is a great opportunity for writers.
Two winners will get their book published by national publisher Charlesbridge Publishing in Boston and have it endorsed by the NAESP and its 30,000 members, reaching hundreds of thousands of teachers and millions of students at the schools. The winners will be announced at the NAESP annual meeting this spring.
The contest is open to all authors.� Self-published manuscripts are accepted.� The two winners will be one children's picture book and one children's chapter book (from early reader to YA novel).
Teacher.net said, "This contest is an incredible opportunity to launch a writer's career."
The deadline for entries is March 1, 2012.
Enter at http://www.naesp.org/naesp-foundation/national-childrens-book-year-contest
Chasing Ice Premiere

"Chasing Ice" - a documentary film about one photographer's journey from a single magazine shoot to a five-year project recording climate change's impact on glaciers - premiered 23 January at the 2012 Sundance Film Festival. Photographer and AGU member James Balog, the subject of the film, stopped by AGU headquarters in Washington, D.C., in early January to share his tips for scientist-communicators. Watch the video interview on The Plainspoken Scientist: http://bit.ly/x4qkPn
It's Not Too Late for New Year's Resolutions

Happy New Year! It's always the right time to give a gift to the environment, and it's easy! Pick 5 for your New Year's resolution! Healthy resolutions easily partner with helping the environment. Get inspired. Get involved. Decide to act, share, and maintain. Here are a couple of ideas to get you started:
-Save energy and recycle more at your Super Bowl Party (or other holiday events). Visit http://www.epa.gov/osw/wycd/funfacts/holidays.htm
-Got new toys? eCycle your old cell phone, computer, or TV! Donating your used electronics benefits others by passing on ready-to-use or refurbished equipment to those who need it. Visit http://www.epa.gov/osw/partnerships/plugin
Women@NASA Website Encourages Girls to Pursue STEM Careers

NASA has expanded its Women@NASA website to include Aspire 2 Inspire, a new feature aimed at helping middle school girls explore education and careers in science, technology, engineering, and mathematics (STEM) fields.
The site features four short films and one overview film that explore the careers and backgrounds of early-career women who work for NASA in each of the STEM areas. A list of community organizations and NASA-affiliated outreach programs with a STEM emphasis is also available. The site also features four Twitter feeds where visiting girls can submit questions to the young women featured in the films.
Life of the Forest Posters

Learn about tree rings, seeds, leaves, bark, and needles, and learn how trees eat, drink, and breathe using these colorful posters from the International Paper Learning Center. Each 16 x 20 inch poster features photos and facts about the topic (e.g., did you know that "happy" trees produce evenly spaced tree rings?) and includes an accompanying handout. K-6 teachers can order a free poster set or download the posters directly from the web.
Science News for Kids

Can lizards learn? Will the Sun's cycle stay the same? What are aftershocks? Find answers to these questions and delve into more of life's curiosities at Science News for Kids. The site presents timely science stories categorized by subject, along with suggestions for hands-on activities, books, articles, and web resources.
Science News for Kids is run by the Society for Science and the Public.
Project BudBurst

Project BudBurst is a network of people across the United States that monitors plants as the seasons change. It is a national field campaign designed to engage the public in the collection of important ecological data based on the timing of leafing, flowering, and fruiting of plants (plant phenophases). Project BudBurst participants make careful observations of these plant phenophases. The data are collected in a consistent manner across the country so that scientists can use them to learn more about the responsiveness of individual plant species to changes in climate locally, regionally, and nationally. Project BudBurst began in 2007. Since then, thousands of people from all 50 states have participated.
Data are being collected now! It's free to participate, and Project BudBurst is open to people of all ages and abilities.
Third Rock Radio

Did you know you could listen to NASA's Third Rock Radio on your iPhone, iPod Touch, iPad, or Android?
"Now you can listen to great music in the same app that still provides�all of NASA's amazing content wherever you are," said Jerry Colen,�NASA App project manager at NASA's Ames Research Center.Try out the NASA app and listen to the Third Rock station.
New Resources from the Association of Fish and Wildlife Agencies

The Association of Fish and Wildlife Agencies (AFWA) has released three new products for educators to connect more people, especially youth, to the outdoors and to increase our nation's understanding of how fish and wildlife and their habitats are conserved. The three products are Benchmarks for Conservation Literacy, Outdoor Skills Education Handbook, and Sustainable Tomorrow - A Teachers Guidebook for Applying Systems Thinking to Environmental Education Curricula. Designed for teachers of grades 9-12, Sustainable Tomorrow uses lessons from Project WILD, Project WET, and Project Learning Tree.
NASA Challenges Students To Train Like An Astronaut

An engaging new NASA program brings the excitement of space exploration to children learning to live a healthy lifestyle. Inspired by First Lady Michelle Obama's Let's Move! initiative, NASA's Train Like an Astronaut program aims to increase opportunities both in and out of school for kids to become more physically and mentally active.

The program uses the excitement of space exploration and astronaut training to challenge, inspire, and educate kids to set physical fitness goals and practice fitness and proper nutrition. Kids will explore mission challenges, learn the science behind nutrition, and learn to train like an astronaut.
The activities align with national education standards that are part of the physical education and health curriculum in schools throughout the country. Teachers can easily modify the activities to create an environment that supports all learners. No special equipment is required, and the activities involve no heavy lifting. Although designed for 8-12 year olds, the program is for anyone who is curious about space exploration and what it takes to be an astronaut. Participants simply visit the website, find a favorite exercise, and get started.
USDA Stop the Beetle Web Site

We recently created a classroom activity called Changing Planet: Bark Beetle Outbreaks.
There is a lot of information available about the Emerald Ash Borer beetle as well. USDA even has a kids' corner where students can play a role in helping to protect ash trees. These creative tools and activities will enable you to learn more about the Emerald Ash Borer beetle (EAB) and protect our precious ash trees, all while having lots of fun.
NOAA Heritage Week 2012

NOAA's Heritage Week is taking place February 3-11 in Silver Spring, Maryland. Attend the guest speaker series, mini-talks, and an open house, and visit the Gateway to NOAA exhibit. You'll be amazed by what you learn!
National Green Week

National Green Week is February 6-10, 2012. The ways to participate are endless! Eco-Challenges for the week (and beyond!) include the Waste Free Snacks Challenge, the Green Energy Challenge, and implementation of a GEF recycling program for your school.
National Green Week, run by the Green Education Foundation, even has a special curriculum of themed, standards-based lessons organized for each grade.
Samsung Sustainable Energy Award

The National Environmental Education Foundation and Samsung are partnering to offer the $10,000 Sustainable Energy Award. $10,000 awards will be presented to the top three high schools that can demonstrate how they have engaged students and teachers in school-wide energy savings through the creative and innovative use of technology. Applications are due February 10, 2012. Apply now!
Schools across the nation are looking for creative ways to cut spending without compromising the quality of the education they deliver. Increasing energy efficiency offers many opportunities for meeting that challenge. It is estimated that America's primary and secondary schools spend more than $6 billion annually to power their facilities. But schools can reduce their costs by as much as 30 percent by implementing energy-saving measures.
Energy efficiency also provides an invaluable opportunity for hands-on learning for students inside their own school building. Through the application of science, technology, engineering, and math (STEM), students are empowered to identify solutions that cut spending and reduce energy use, giving them, and their entire community, a healthier environment and a sense of school pride. Technology, in particular, can improve school-wide energy efficiency through the use of energy-efficient fixtures such as solar panels and meters that monitor energy use. The use of tools such as smart meters allows schools to track energy and cost savings and integrate these activities into their lessons.
Great Backyard Bird Count

Citizen scientists - get ready, get set, count! The Great Backyard Bird Count is an annual four-day event that engages bird watchers of all ages in counting birds to create a real-time snapshot of where the birds are across the continent. Anyone can participate, from beginning bird watchers to experts. It takes as little as 15 minutes on one day, or you can count for as long as you like each day of the event. It's free, fun, and easy, and it helps the birds.
The 2012 GBBC will take place February 17-20. Kids can participate too!
Second Annual Green Schools National Conference

Be a part of the only national gathering of K-12 leaders and educators coming together to make their schools and districts green & healthy centers of academic excellence. This national conference will be held February 27-29, 2012, in Denver, Colorado.
The conference will include over 100 breakout sessions, exhibits, and chances to network with other like-minded peers from across the country. Find out more at http://www.greenschoolsnationalconference.org/index.php
Bob the Bunny's Environmental Cartoon Competition

Bob the Bunny's environmental competition is aimed at young adventurers aged 10-12 years old.
To enter, you form a team of 1 to 3 members, identify a local environmental issue, and create a cartoon strip illustrating the issue and actions that you might take to solve the problem. Submissions should be sent in by February 29, 2012. The winning team will be sent to the 2012 Volvo Adventure Final (an event you don't want to miss!). See the Volvo Adventure website for more details on how to register for the Bob the Bunny contest.
Student FIRST Robotics Competition

NASA is continuing its strong support for the annual FIRST Robotics Competition, which inspires student interest in science, technology, and mathematics. The agency is awarding grants totaling $1,386,500 for student teams in 37 states to participate in FIRST, or For Inspiration and Recognition of Science and Technology.

Each FIRST team receives an identical kit of parts and has six weeks to design and build a robot. Other than dimension and weight limitations and other technical restrictions, the look and function of the robot is up to each team. NASA volunteers support many teams throughout the process.

The competition is structured like a professional athletic event, and teams compete in an arena the size of a small basketball court. Robots must have offensive and defensive capabilities. Teams collaborate to complete tasks, while simultaneously preventing opposing teams from completing the same activity.

This year, 45 regional competitions will take place in the U.S., along with four additional international competitions in March and April. The FIRST Championship competition will be held in St. Louis in April.

For more information, visit: http://robotics.nasa.gov
Annual Great Moonbuggy Race - Register Soon!

NASA is challenging student inventors to gear up for the agency's 19th annual Great Moonbuggy Race. Registration is open for the engineering design and racing contest set to culminate in a two-day event in Huntsville, AL, on April 13-14, 2012.

Participating high schools, colleges, and universities may register up to two teams and two vehicles. Registration for U.S. teams closes Feb. 10. International registration has closed. For complete rules and to register, visit: http://moonbuggy.msfc.nasa.gov

The race is organized annually by NASA's Marshall Space Flight Center and is held at the U.S. Space & Rocket Center, both in Huntsville. Since 1994, NASA has challenged student teams to build and race human-powered rovers of their own design. These fast, lightweight moonbuggies address many of the same engineering challenges overcome by Apollo-era lunar rover developers at Marshall in the late 1960s.

Prizes are awarded to the three teams in each division with the fastest final times. NASA and industry sponsors present additional awards for engineering ingenuity, team spirit, and overcoming unique challenges -- such as the weekend's most memorable crash!

For images and additional information about past races, visit: http://www.nasa.gov/topics/moonmars/moonbuggy.html
National Environmental Education Week is April 15-21, 2012

National Environmental Education Week (EE Week) is April 15-21, 2012. Celebrate the environment as an engaging context for teaching STEM (Science, Technology, Engineering and Math) concepts and skills with the theme Greening STEM: The Environment as Inspiration for 21st Century Learning.
Those who register for EE Week 2012 will have access to a free educator webinar, planning toolkits, and several other perks (including discounts, giveaways, and special offers).
2012 Thacher Environmental Research Contest

From the massive Gulf oil spill to the continued decline of Arctic sea ice, satellites and other observing instruments have proved crucial this year in monitoring the many environmental changes -- both natural and human induced -- occurring on global, regional, and local scales.
The 2012 Thacher Environmental Research Contest, sponsored by the Institute for Global Environmental Strategies, challenges high school students (grades 9-12) to conduct innovative research on our changing planet using the latest geospatial tools and data, which in recent years have become increasingly accessible to the public. $3,500 in cash awards are available.
Eligible geospatial tools and data include satellite remote sensing, aerial photography, geographic information systems (GIS) and Global Positioning System (GPS). The main focus of the project must be on the application of the geospatial tool(s) or data to study a problem related to Earth's environment.
Entries are due April 16, 2012.
Student Rocket Teams Take NASA Launch Challenge

More than 500 students from middle schools, high schools, colleges, and universities in 29 states will show their rocketeering prowess in the 2011-12 NASA Student Launch Projects flight challenge. The teams will build and test large-scale rockets of their own design in April 2012. The student teams will vie for a variety of awards for engineering skill and ingenuity, team spirit, and vehicle design.

Each Student Launch Projects team will build a powerful rocket, complete with a working science or engineering payload, which the team must design, install, and activate during the rocket launch. The flight goal is to come as close as possible to an altitude of 1 mile, requiring a precise balance of aerodynamics, mass, and propulsive power.

In April, the teams will converge at Marshall, where NASA engineers will put the students' creations through the same kind of rigorous reviews and safety inspections applied to the nation's space launch vehicles. On April 21, 2012, students will fire their rockets toward the elusive 1-mile goal, operating onboard payloads and waiting for chutes to open, signaling a safe return to Earth.

For more information about the challenge, visit: http://education.msfc.nasa.gov/sli
NASA Selects Student Teams For Microgravity Research Flights

NASA has selected 24 undergraduate student teams to test science experiments under microgravity conditions. The teams will fly during 2012 as part of the agency's Reduced Gravity Education Flight Program (RGEFP).

The teams will design and build their experiments at NASA's Johnson Space Center in Houston and conduct tests aboard an aircraft modified to mimic a reduced-gravity environment. The aircraft will fly approximately 30 parabolas with roller-coaster-like climbs and dips to produce periods of weightlessness and hyper-gravity ranging from 0 to 2 g.

The RGEFP experience includes scientific research, experimental design, test operations, and outreach activities. The program supports NASA's goal of strengthening the nation's future workforce.

For more information about the Reduced Gravity Education Flight Program, visit: http://microgravityuniversity.jsc.nasa.gov
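As a rough classroom estimate of why each weightless period is so short: during the parabola the aircraft is essentially in free fall, so the zero-g time is set by how quickly gravity uses up the plane's upward vertical speed. The vertical speed used below is an assumed, illustrative value, not a figure from the flight program.

    # Idealized estimate of zero-g time in one parabola: the aircraft is in
    # free fall, so weightlessness lasts until gravity reverses the climb.
    g = 9.81          # m/s^2
    v_vertical = 110  # m/s upward at the start of the parabola (assumed value)
    t_zero_g = 2 * v_vertical / g  # time to go up and come back down
    print(round(t_zero_g, 1), "seconds")  # about 22 s of weightlessness

That simple model lands close to the roughly 20-25 seconds of weightlessness typically reported for parabolic flights, which is a nice check for students.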
The source of this material is Windows to the Universe, at http://windows2universe.org/ from the National Earth Science Teachers Association (NESTA). The Website was developed in part with the support of UCAR and NCAR,
where it resided from 2000 - 2010. © 2010 National Earth Science Teachers Association. Windows to the Universe® is a registered trademark of NESTA. All Rights Reserved. Site policies and disclaimer.
Email The elusive dream of producing clean, inexhaustible fuel to run everything from our cars to our cell phones may not be quite as illusive as we had thought. Scientists around the world are making significant progress toward using sunlight to split water into hydrogen and oxygen.
The goal is alluring because a cheap way to extract hydrogen from water would end energy shortages around the world while cleaning up the environment using two of nature's gifts, water and sunlight.
Sound too good to be true? Maybe it is, because the challenges are still great, but there are several reasons to be optimistic. Within the past few days, several research teams have reported progress toward reducing the cost of hydrogen production, currently a show stopper.
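The underlying chemistry is simple to state, even if doing it cheaply is not. Splitting water is the familiar textbook reaction

    2 H2O -> 2 H2 + O2

and it runs uphill: standard chemistry references put the energy that must be supplied at roughly 286 kilojoules per mole of hydrogen produced, about 237 kJ of which must arrive as electrical or chemical work rather than plain heat. That energy input, whether it comes from electricity or from concentrated sunlight, is exactly what all of the research described below is trying to supply more cheaply.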
Meanwhile, other teams are having some success in creating new materials that can trap and hold those tiny hydrogen atoms until they are ready to be used.
The competition is intense, for obvious reasons. We are still in the age of hydrocarbon fuels, because they have been easy to acquire and amazingly efficient in terms of the amount of potential energy in a given volume. It takes a lot of hydrogen to produce the same amount of useable energy as a gallon of gasoline, which, incidentally, has become extremely tough competition. But clearly oil is not inexhaustible, and those countries that have it can hold the rest of the world hostage. Something has to change.
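It is worth pinning that comparison down with round numbers. Per kilogram, hydrogen actually carries far more chemical energy than gasoline; the real problem is volume, because hydrogen is an extremely light gas. A back-of-the-envelope check, using standard approximate heating values and densities (treat the figures as illustrative):

    # Rough energy comparison: hydrogen vs. a gallon of gasoline.
    # All figures are approximate, textbook-level values.
    H2_MJ_PER_KG = 120.0        # lower heating value of hydrogen
    GASOLINE_MJ_PER_L = 32.0    # lower heating value of gasoline, per liter
    LITERS_PER_GALLON = 3.785
    H2_KG_PER_M3 = 0.09         # density of hydrogen gas at room conditions

    gallon_energy = GASOLINE_MJ_PER_L * LITERS_PER_GALLON   # ~121 MJ
    h2_needed_kg = gallon_energy / H2_MJ_PER_KG             # ~1 kg
    volume_m3 = h2_needed_kg / H2_KG_PER_M3                 # ~11 cubic meters
    print(round(gallon_energy), "MJ per gallon;",
          round(h2_needed_kg, 1), "kg of H2; about",
          round(volume_m3), "m^3 uncompressed")

So roughly one kilogram of hydrogen matches the energy in a gallon of gasoline, but at ambient pressure that kilogram fills about eleven cubic meters, which is why the storage problem discussed later in this article matters so much.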
Here are a few givens if hydrogen is to succeed. The cost of producing it must be slashed. More efficient ways to harness solar energy must be found, because sunlight is a critical component in any feasible large-scale method of splitting water. Hydrogen, like any other fuel, has safety risks that must be overcome.
Hydrogen Fuel Could Reduce Dependence on Oil, Fight Global Warming
Yet hydrogen is being "aggressively explored" as a fuel for passenger cars, according to the Department of Energy, partly because it "has the potential to dramatically reduce our dependence on imported oil." It also could help in the fight against global warming because no greenhouse gases would need to be released into the atmosphere.
DOE is funding much of the research in this country, and it has set goals that some had thought could not be reached. Eight research institutions are trying to meet DOE's target of bringing the cost of hydrogen production down to $6 per kilogram (2.2 pounds) by 2015 and $2-$3 per kilogram by 2025. So far, only one team has reached those goals, the University of Colorado at Boulder.
The Boulder team built an array of mirrors to concentrate the sun's rays and generate temperatures as high as 2,640 degrees Fahrenheit (about 1,450 degrees Celsius). That heat, directed on a thin film of metal ferrite created by the Boulder team, splits water at temperatures 482 degrees cooler than other technologies require.
In announcing the development, Alan Weimer of Boulder's chemical and biological engineering department said the lower temperature makes water-splitting more cost-effective and faster.
"It's pretty significant and it seems like there's a good shot for this to become mainstream in the southwest U.S. and other high isolation regions around the world," Weimer said.
Within days of that announcement several researchers reported progress in overcoming one of the main barriers to cost-effective production of hydrogen. Commercial hydrogen production today requires the use of platinum as a catalyst, which now sells for about $1,800 an ounce.
Is a Hydrogen Car in Your Future?
That's a show stopper for widespread use, and there isn't enough platinum in the world to produce enough hydrogen to power very many cars.
One team, at Australia's Monash University, discovered that birnessite, a common mineral that produces a black stain on rocks, also works as a catalyst for splitting water into hydrogen and oxygen. It mimics the way plants split water into fuel for growth, suggesting that there may be even more effective -- and cheaper -- substitutes for platinum.
Scientists at Stanford University are following a similar course. They are studying natural catalysts used by plants and other organisms to produce fuel for growth, and it looks like a common compound, molybdenum sulfide, might be "an inexpensive solution" for catalyzing hydrogen production, the researchers reported earlier this month. However, a different catalyst must be found to isolate the other component of water -- oxygen -- and the team is experimenting with other prospects. Left free to roam, oxygen will clog the chemical solar cell that powers the system.
Following a different course, scientists at Los Alamos National Laboratory have had some success experimenting with a compound of carbon, iron and cobalt as a catalyst.
But even if the cheap production of hydrogen is closer, how do you tame the stuff once you've got it? Hydrogen is the smallest atom, and thus it can squeeze in and out of every pore. You could use it today to run the family car, but it would seep past every hose clamp, and don't even think of keeping it in an ordinary gas tank. What is needed is some kind of matrix that is capable of holding a lot of hydrogen in a small place.
The hurdle here is to create seemingly solid stuff that has room for hydrogen. Rice University scientists say they have found that a class of material known as metallacarborane can store hydrogen better than the benchmarks set by DOE's hydrogen program for 2015. Other researchers at the University of California, Los Angeles, claim to have already solved the storage problem with a brand new material that works something like a sponge.
So if all of this pans out, is a hydrogen car in your future? It looks more likely now than it did just a couple of years ago. But success is not guaranteed. There is as yet no infrastructure to distribute hydrogen the way gasoline is distributed, but it seems reasonable that one would evolve fairly quickly if the potential is as great as so many scientists believe. However, there's a little problem with the competition.
Gasoline and diesel engines are getting better and better. So are batteries, leading to widespread acceptance of hybrids and electric vehicles. And sticker shock at the local hydrogen car dealership is likely to be awful as a brand new technology weaves its way through the marketplace.
So it's not going to happen tomorrow. But much progress is being made.
2016-40/3982/en_head.json.gz/6958 | By Ann-Christine Diaz - 1 hour ago
DigitalNext
The End of the Facebook 'Fan' As We Know It
We Still Lack Technology for Brands to Build Personal Relationships In Social Media
By Victoria Ransom. Published on June 28, 2012.
When the concept of a social media "fan" emerged a few years ago, it held out the promise of enabling meaningful, one-to-one conversations between brands and consumers at unprecedented scale. But that promise has yet to be delivered. Think about it: do you know whether your fans are moms, or sports enthusiasts or country-music aficionados? Do you know which ones are "superfans" who consistently engage with your programs, and do you systematically use that information to increase word of mouth?
Chances are you don't, because there hasn't been a scalable way to capture and use information about the "fans" you're engaging with on Facebook, Twitter and other social channels. And because marketers lack a deep understanding of their fans, they've been using social networks as another mass communication channel, broadcasting to "faceless," unknown masses.
Social-media marketing needs to move in a new direction that finally delivers on the promise of personalized interactions between brands and consumers. This will require new technologies that enable marketers to develop rich data profiles of the consumers they're interacting with on social networks. The model for this transformation will be the social data "system of record." Just as an organization's accounting system is its system of record for financial data and transactions, and its HR system is the system of record for personnel and employment data, social-data systems of record will become the central repository of all social data that is leveraged across other parts of the organization. With a system of record for social data, brands would be continuously aggregating, organizing and updating consumer data from across multiple sources -- everything from ad clicks and comments to public profile data on Facebook, LinkedIn and other networks. Based on these detailed profiles, marketers could then target content to consumers' specific interests, resulting in increased conversion rates and deeper relationships.
If you are a sporting goods manufacturer, for example, you'd be able to know which of your fans are snowboarders vs. skiers, which are active advocates of your brand vs. passive consumers, etc. -- so you can tailor and target your messages to those different consumers based on what you know about them.

These systems should share social-profile data with other business systems, so that organizations can leverage social data to inform and power business processes through every stage of the customer lifecycle, from pre-sale awareness and consideration, to in-store promotions, to post-sale support and loyalty management. For example, in a customer-support situation, knowledge that the customer is a "highly engaged" consumer and a major influencer in social media could trigger a higher level of support for that customer -- in real time, at the point of interaction -- to ensure a superior support experience and continued brand advocacy by that influential consumer.

The ability to have personalized interactions with consumers would also provide brands with a powerful lever to drive word of mouth at scale. If you know who your consumers are and if you understand their interests and social behavior, you can create content that is highly relevant and engaging, and therefore more likely to be shared with your consumers' personal networks.
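To make the "system of record" idea concrete, here is a minimal sketch of the aggregation step. The event fields, sources and the two-interaction "superfan" threshold are invented for illustration and do not describe any particular product:

```python
# Sketch: merge interaction events from several channels into one profile
# per consumer, then segment the profiles for targeted content.
from collections import defaultdict

profiles = defaultdict(lambda: {"interests": set(), "engagements": 0})

events = [  # imagine these aggregated from Facebook, Twitter, ad clicks, etc.
    {"user": "u1", "source": "facebook", "interest": "snowboarding"},
    {"user": "u1", "source": "ad_click", "interest": "snowboarding"},
    {"user": "u2", "source": "twitter",  "interest": "skiing"},
]

for event in events:
    profile = profiles[event["user"]]
    profile["interests"].add(event["interest"])
    profile["engagements"] += 1

# Segments drive targeting: superfans get advocacy campaigns,
# snowboarders get snowboarding content rather than skiing content.
superfans = [u for u, p in profiles.items() if p["engagements"] >= 2]
snowboarders = [u for u, p in profiles.items() if "snowboarding" in p["interests"]]
print(superfans, snowboarders)
```

The hard part in practice is not this merge but identity resolution -- knowing the same person is behind accounts on different networks -- and keeping the profiles current.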
In the first phase of social media marketing, brands understandably focused on creating a presence in social channels and building their base of fans and followers. The strategy was largely about growth -- it was a pure numbers game based on racking up the biggest fan count. Today it is no longer sufficient just to build a base of fans and followers. Brands need to focus on engagement, conversion and nurturing relationships with their consumers, by understanding who these people are.
Victoria Ransom is founder and CEO of Wildfire.
2016-40/3982/en_head.json.gz/6969 | AOL Gets a New CEO: Google Sales Boss Tim Armstrong (Plus the Whole Press Release)
Everyone who wondered why Randy Falco and Ron Grant were still running AOL gets an answer: Time Warner (TWX) was lining up their replacement.
Google (GOOG) sales chief Tim Armstrong becomes chairman and CEO of the troubled Web property, effective immediately.
The move is getting immediate cheers from current and former AOL employees I’ve talked to. The snap consensus is that anyone would have been better than Falco, a longtime NBC executive, and Grant, who was Time Warner CEO Jeff Bewkes’s chief lieutenant before being elevated to his role as President and COO of AOL.
But they’re particularly happy to see a sales guy running the organization: AOL once had a much admired sales operation. But in recent years, the group has been roiled, as a series of sales chiefs came and went. (From Kara Swisher, here are more details on the shakeup, and an interview with Armstrong. And here’s some early betting on Armstrong’s replacement at Google — former Doubleclick CEO David Rosenblatt has a lot of fans).
The current AOL sales chief, former Yahoo (YHOO) sales boss Greg Coleman, was installed just last month. He’s been deep into a reorg of his own.
It was desperately needed after AOL’s miserable performance in 2008, which concluded with a quarter that saw ad revenue drop 18 percent. But those plans may be up in the air now.
In any case, here is the full press release from Time Warner about the firing of Falco and Grant, after the jump:
NEW YORK, March 12, 2009–Tim Armstrong, Google Senior Vice President, has been named Chairman and CEO of AOL, LLC, Time Warner Inc. (NYSE:TWX) Chairman and CEO Jeff Bewkes announced today. Current AOL Chairman and CEO Randy Falco and President and COO Ron Grant plan to leave the company after a transition period.
In making the announcement Mr. Bewkes said: “Tim is the right executive to move AOL into the next phase of its evolution. At Google, Armstrong helped build one of the most successful media teams in the history of the Internet–helping to make Google the most popular online search advertising platform in the world for direct and brand marketers. He’s an advertising pioneer with a stellar reputation and proven track record. We are privileged to have him preside over AOL as its audience and programming businesses continue to grow and its advertising platform expands globally. He’ll also be helpful in helping Time Warner determine the optimal structure for AOL.”
Tim Armstrong said: “I’m very excited about the opportunities presented in leading AOL. AOL has a wide-ranging set of assets and audience. The company is well positioned to enhance those assets into a larger share of the Internet audience and advertiser communities. AOL and Google have been partners for years and I look forward to collaborating with Jeff Bewkes and his team as we explore the right structure and future for AOL.”
Mr. Bewkes added: “Randy led AOL in its transition from a subscription business to an audience business. Under Randy and Ron, AOL’s programming sites exhibited year-over-year growth in unique visitors for 23 consecutive months with many of its sites now in the top five of their categories. They also assembled Platform-A, the number one display ad network in the U.S. with a reach of more than 90%. They also aggressively cut costs as they restructured the Audience business portion of the company into three distinct operating units: People Networks, MediaGlow, and Platform-A. As Randy and Ron move on, they leave AOL with our gratitude and appreciation for remaking the company and bringing it to a new and promising level.”
Tim Armstrong was a member of Google’s Operating Committee and served as the president of the Americas Operations. Under the Americas Operations, Armstrong’s team managed publishers and advertisers’ relationships and platforms with some of the world’s most widely recognized media and agency brands. Armstrong started at Google in the year 2000 and opened the first office outside of the Mountain View, CA headquarters.
Mr. Armstrong joined Google from Snowball.com, where he was vice president of sales and strategic partnerships. Prior to his role at Snowball.com, he served as director of integrated sales & marketing at Starwave’s and Disney’s ABC/ESPN Internet Ventures, working across the companies’ Internet, TV, radio, and print properties. He started his career by co-founding and running a newspaper based in Boston, MA, before joining IDG to launch their first consumer Internet magazine, I-Way.
Mr. Armstrong sits on the boards of the Interactive Advertising Bureau (IAB), the Advertising Council, and the Advertising Research Foundation, and is a trustee at Connecticut College and Lawrence Academy. He is a member of Mayor Bloomberg’s MediaNYC 2020 committee. He is a graduate of Connecticut College, with a double major in economics and sociology.
2016-40/3982/en_head.json.gz/7008 | Robert Etheridge Jr, Curator, 1895-1919
Robert Etheridge led the Museum’s first scientific survey, an expedition to Lord Howe Island.
Robert Etheridge Jr, 1846–1920
Robert Etheridge Jr was born in Gloucestershire, England, and trained as a palaeontologist. He arrived in Australia in 1866 and spent the next five years as assistant field geologist to the Geological Survey of Victoria, and as a gold miner. Back in England he became palaeontologist to the Geological Survey of Scotland, and an assistant in the geology department of the British Museum.
First scientific survey
Etheridge returned to Australia in 1887 and worked both as assistant in palaeontology at the Australian Museum, and as palaeontologist to the Geological Survey of New South Wales. Shortly afterwards he led the Museum’s first scientific survey, an expedition to Lord Howe Island. In 1888 he explored the caves at the junction of the Murrumbidgee and Goodradigbee Rivers.
In 1893 he was acting curator of the Museum and became curator in 1895. He faced many difficulties in these early years. The economic depression of 1893 had led to severe cuts in the operational budget and staff, and yet there was a growing need for increased collection, exhibition and working space. Due to the shortage of attendants the new Geological Hall could only be open to the public on alternate weeks.
The first decade of the twentieth century saw increases in staff and museum space. The new south wing opened in 1910. Its new lecture theatre firmly established the museum’s educational role.
Prolific publisher
A dedicated scientist, Etheridge published over 350 papers. His contribution to Australian stratigraphy was substantial. While his primary interest remained palaeontology, a significant number of studies were on ethnological subjects. Etheridge set up the separate department of Ethnology in 1906, and oversaw the enrichment of the collections of artefacts from Australia and the Pacific. Etheridge wrote an Elementary Guide to the Exhibited Zoological Collection (1914). He also wrote two important papers on the museum’s early history.
Etheridge’s tenure was marked by disagreements with some of his senior staff, and a long-running dispute with the Museum’s secretary, Sutherland Sinclair, over executive power of the Museum. After Sinclair’s death in 1917, Etheridge was made director as well as curator. He received the Clarke memorial medal of the Royal Society of New South Wales in 1895, and the Mueller medal of the Australasian Association for the Advancement of Science in 1911. Etheridge died suddenly of pneumonia in 1920.
2016-40/3982/en_head.json.gz/7158 | News Opinion Polls Reader forums Lifestyles Sports Ads Classifieds Jobs Sections Extras Services Local columns
« We’ve become accustomed to m...
Summer: It’s almost over»
MSU research facility hits milestones
By Sen. Carl Levin
WASHINGTON - It's always good news when Congress takes big steps to help boost scientific research. And it's even better when those steps lean on Michigan's world-class research universities.

That's why two recent developments involving a major science investment at Michigan State University are so important: the Department of Energy's approval in early August of cost and construction timelines for the Facility for Rare Isotope Beams, and committee approval in both the Senate and House of appropriations bills that included full funding for FRIB in 2014. The $55 million in funding for FRIB, if it receives final approval, would clear the way for construction of the facility to begin next year.

Most importantly, these developments are good news for our quest to understand our natural world. FRIB is a $730 million project that will allow researchers to create and study rare elements that are not normally found on Earth. That holds enormous promise for helping physicists better understand the universe, and to harness the power of nuclear science for practical applications that can improve our standard of living, solve our energy challenges and grow the economy.
Research funding is a relatively small portion of the federal budget, but it pays for itself many times over. From radio and television to the Internet and Google, technological innovations that have changed the world got their start thanks, at least in part, to federal research funding.

What makes this project doubly exciting is what it says about Michigan and our role in promoting that kind of innovation. The Department of Energy selected MSU in 2008 to host FRIB, cementing the university's status as one of the world's most important centers of nuclear research. For more than 60 years, MSU has been pushing the frontiers of nuclear knowledge. And since 1980, MSU's National Superconducting Cyclotron Laboratory has conducted pioneering nuclear research through partnerships with the Department of Energy and other research institutions around the world. That kind of research leadership was vital to winning the competition for FRIB.

When complete, the facility will use incredibly powerful equipment to create atoms that don't naturally exist on Earth, and often remain intact for just fractions of a second. These atoms are of course far smaller than the eye can see, but by creating and studying them, scientists can answer questions about everything from the center of the universe's most powerful stars to treatments that can cure the most vexing human diseases.

FRIB will bring benefits to Michigan beyond this groundbreaking research. The project will create hundreds of permanent jobs in our state, as well as hundreds of jobs during construction, and contribute an estimated $1 billion to the Michigan economy in its first decade. The benefits, for Michigan and the country, should be obvious.

But the work of ensuring that FRIB meets these goals isn't over yet. The Department of Energy approval and actions by appropriations committees in both chambers of Congress are a good start, but I'll be working in the Senate with Sen. Debbie Stabenow and with members of Michigan's delegation in the House to make sure funding passes both chambers. And, because FRIB will take several years to complete, we will need to keep working to ensure that Congress provides the funding necessary to build and operate the facility.

Still, we've achieved another major milestone for FRIB, which is vital to America's preeminence in nuclear research and an important investment in Michigan. I'm grateful for this recognition of FRIB's value to the nation, and I will continue to work to ensure that this funding receives final approval and FRIB continues moving forward.
2016-40/3982/en_head.json.gz/7233 | Quantifying degassing-driven crystal growth in basaltic lavas
Applegarth, Louisa and Tuffen, Hugh and James, Mike R. and Cashman, Katharine V. and Pinkerton, Harry (2011) Quantifying degassing-driven crystal growth in basaltic lavas. Conference or workshop item. URI: http://eprints.lancs.ac.uk/id/eprint/52162

Abstract: As magma ascends and decompresses, volatile exsolution not only produces bubbles, but increases the liquidus temperature of the residual melt, resulting in an undercooling that can trigger crystallisation. In volcanic systems of intermediate composition, late-stage crystallisation and vesiculation in the shallow conduit have been shown to exert a strong control on eruptive style. These processes may be similarly important during subsurface and surface transport of basaltic melts. In recent experiments we demonstrated that the lag between degassing and crystallisation is sufficiently short that crystallisation as a consequence of degassing can be expected to occur in the conduit, depending on ascent rates. Up to 35% volume crystals were observed to grow as a result of the degassing of <1 wt% water. Degassing-induced crystallisation therefore has the potential to rapidly and profoundly change magma rheology before and during eruption, and so have a strong influence on the eruptive style.

The effects of degassing-induced crystallisation on rheology depend on crystal fraction, morphology and size distribution. Timescales of rheology changes also depend on crystal growth rates. Here we report on experiments designed to quantify these characteristics. We use a microscope with a heated stage to directly observe crystallisation events and record crystal growth at temperatures up to 1300 °C. Experiments are conducted on quenched (i.e. with near-eruptive volatile content) samples from Mt. Etna, Sicily, and Mauna Loa, Hawaii, and recorded with time lapse imaging. From these images, crystal growth rates as a result of degassing are measured, and the crystal contents, morphologies and size distributions at different stages of degassing determined. The undercooling experienced by the samples as a result of degassing can be estimated from the crystal morphology.

Crystal contents on eruption are much higher at Etna (~30%) than Hawaii (~2%), meaning the effects of degassing on samples with radically different initial textures can be observed. Comparing textures produced during degassing with those produced during cooling at different rates allows assessment of the contribution of degassing to textural evolution of the lava, and hence could provide a means of estimating the effect of degassing on magma rheology. This work has implications for the modelling of magma flow in conduits, and of the flow of lava after eruption.
2016-40/3982/en_head.json.gz/7234 | The Development of an Enhanced Electropalatography System for Speech research
Chiu, W.S.C.
(1995) The Development of an Enhanced Electropalatography System for Speech research.
: University of Southampton, Doctoral Thesis
To understand how speech is produced by individual human beings, it is fundamentally important to be able to determine exactly the three-dimensional shape of the vocal tract. The vocal tract is inaccessible, so its exact form is difficult to determine with live subjects. There is a wide variety of methods that provide information on the vocal tract shape. The technique of Electropalatography (EPG) is cheap, relatively simple, non-invasive and highly informative. Using EPG on its own, it is possible to deduce information about the shape, movement and position of tongue-palate contact during continuous speech. However, data provided by EPG is in the form of a two-dimensional representation in which all absolute positional information is lost.

This thesis describes the development of an enhanced Electropalatography (eEPG) system, which retains most of the advantages of EPG while overcoming some of the disadvantages by representing the three-dimensional (3D) shape of the palate. The eEPG system uses digitised palate shape data to display the tongue-palate contact pattern in 3D. The 3D palate shape is displayed on a Silicon Graphics workstation as a surface made up of polygons represented by a quadrilateral mesh. EPG contact patterns are superimposed onto the 3D palate shape by displaying the relevant polygons in a different colour. By using this system, differences in shape between individual palates, apparent on visual inspection of the actual palates, are also apparent in the image on screen. The contact patterns can be related more easily to articulatory features such as the alveolar ridge since the ridge is visible on the 3D display. Further, methods have been devised for computing absolute distances along paths lying on the palate surface. Combining this with calibrated palate shape data allows measurements accurate to 1 mm to be made between contact locations on the palate shape. These have been validated with manual measurements.

The sampling rate for EPG is 100 Hz and the data rate is equivalent to 62 bits per 10 ms. In the past few years, some coding (parameterization) methods have been introduced to try to reduce the amount of data while retaining the important aspects. Feature coding methods are proposed here and several parameters are investigated, expressed in terms of both conventional measures such as row number, and in absolute measures of distance and area (i.e. mm and mm2). Features studied include location of constriction and degree of constriction.

Finally, in order to reduce the amount of data while retaining the spatial information, composite frames that represent a series of EPG frames are computed. Measures of goodness of the composite frames that do and do not use 3D data are described. Some examples are given in which fricative data has been processed by generating a composite frame for the entire fricative, and computing an area estimate for each row of the composite frame using the assumption of a flat tongue.

This thesis demonstrates the current capability and inherent flexibility of the enhanced electropalatography system. In the future, the eEPG system can be extended to compute volume estimates, again using a flat-tongue model. By incorporating information on the tongue surface provided by other imaging methods such as ultrasound, more accurate area and volume estimates can be obtained.
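The composite-frame idea lends itself to a short illustration. Below is a minimal sketch in Python (not part of the thesis) that collapses a sequence of binary EPG contact frames into a single summary frame and derives a crude per-row openness measure under the flat-tongue assumption. The 8x8 grid, the 50% threshold and the 30 mm row width are illustrative choices, not values from the thesis:

```python
# Sketch: composite frame from a sequence of binary EPG contact frames.
import numpy as np

rng = np.random.default_rng(0)
frames = rng.random((25, 8, 8)) > 0.7   # 25 frames at 100 Hz = 250 ms of speech

contact_freq = frames.mean(axis=0)      # fraction of frames each electrode is on
composite = contact_freq >= 0.5         # keep electrodes contacted half the time or more

# Crude per-row opening estimate under a flat-tongue assumption:
# uncontacted electrodes in a row approximate the open channel width there.
ROW_WIDTH_MM = 30.0                     # assumed palate width at each row
open_width_mm = (1.0 - composite.mean(axis=1)) * ROW_WIDTH_MM

print(composite.astype(int))            # 8x8 summary contact pattern
print(open_width_mm)                    # estimated open width per row, front to back
```

A real implementation would replace the random frames with recorded EPG data and use the calibrated 3D palate mesh, rather than a fixed row width, to convert contacts into millimetres.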
2016-40/3982/en_head.json.gz/7244 | Flood risk science
Programme research areas
Fluvial Design Guide
SearchFluvial Design Guide OverviewUse of the Fluvial Design GuideEditorial TeamSteering GroupReviewers
Fluvial Design Guide - Chapter 8
Works in the river channel
8.1 Introduction

8.1.1 Striking the right balance

Successful – and indeed sustainable – engineering works are those that are planned, designed, operated and maintained with due regard for the environment in which they function. In this context, the term ‘environment’ is used in its broadest interpretation, including the physical forces that have to be resisted as well as the ecological, social and visual contexts. Works in a river or stream are no different in this regard; indeed it can be argued that the fluvial environment is one of the most challenging for engineers.

A common mistake in the past was to ‘over-engineer’ river works. Although such an approach may have achieved the hydraulic objectives (increased flood capacity, for example), this was often at the expense of other matters such as ecology and visual amenity. The quest to improve the hydraulic conveyance of river channels also created a huge maintenance obligation. This is because, left alone, a river tries to return to its ‘regime’ condition, in which flows significantly greater than the mean annual flood are not contained by the dominant channel. Today’s river managers must strive to achieve the right balance (Environment Agency, 2003).

This is not to say that under-engineering is the right approach. If the works cannot withstand the hydraulic forces imposed upon them, they will fail to achieve the desired objective and may damage the environment in the process. Nor is the complete abandonment of historic maintenance practices an appropriate response to environmental pressures, as this can also result in environmental degradation of a different kind (see Figure 8.1).

Figure 8.1 Inadequate maintenance? In the past this urban stream has been concrete-lined to improve its hydraulic capacity. But it has not been maintained regularly and the sediment in the bed now supports a lush growth, which acts to obstruct flow. The appropriate level of maintenance should be determined by the desired hydraulic capacity of the channel as well as by the ecological value of the stream. Complete clearance of the vegetation may not be appropriate. However, the removal of most of the vegetation, perhaps leaving a strip on the inside of the bend, could improve flood capacity without unduly damaging the ecological status.

Getting the right balance ensures that both hydraulic and environmental objectives are achieved. The stability of a natural channel (that is, its resistance to short-term change) is inextricably linked to its geomorphology and ecology. Aggressive channel ‘improvement’ works or vegetation clearance not only inflict severe damage on the ecology and visual amenity, but may also lead to erosion on the channel bed or banks, with consequential damage to adjacent infrastructure. It is therefore important that engineers and river managers:

- seek expert advice on geomorphology and ecology when contemplating significant works in rivers (see Chapters 3 and 4);
- consider the impact of any works on other users of the river or stream, from those who simply appreciate the fluvial environment to those who actively use it for sports and recreation (see Chapters 5 and 6).

Works in a river channel not only have the potential to alter the river environment permanently, but they can also have a significant impact during construction and when maintenance works are required. Consultation with all affected parties (stakeholders) in the early planning stages will therefore pay dividends later on.
In particular, it may be possible to programme works both to minimise the environmental impact and to avoid conflict with river users (such as anglers, boaters, canoeists and ramblers). When works in river channels are planned, opportunities for the enhancement of the fluvial environment should also be considered. Examples include:

- improvements to riverside access;
- the provision of fish ledges;
- installation of nesting pipes in the riverbank for sand martins;
- the removal of invasive plant species (see Chapter 4).

The overall aim should be to achieve the objectives in a manner that is sustainable, while adding to the environmental value whenever possible. There is useful guidance in an RSPB handbook (1994).

Since the starting point for channel improvement works, including vegetation removal and desilting, is usually the desire to increase (or maintain) flood conveyance, it is clearly important to define the channel capacity required. In its simplest form this means defining a flow (in m3/s – cubic metres per second) for given water levels along a reach of the channel. The recently developed ‘conveyance estimation system’ (CES) allows the user to estimate the flow capacity of any channel for a range of assumptions regarding channel maintenance; for details of the CES and its development, see the project website (http://www.river-conveyance.net). A simple illustration of this kind of capacity calculation is sketched at the end of this section.

8.1.2 Legal issues

The legal framework for works in rivers is outlined in Chapter 1. The fundamental legal issues are set out in Acts of Parliament which define the rights and responsibilities of all parties with respect to rivers and streams. More recently, the European Union has enacted the Water Framework Directive with the aim of securing and, where possible, improving the ecological status of watercourses throughout all Member States. Anyone contemplating works in any watercourse must therefore consider carefully the impact that such works may have, either directly or indirectly. In particular, the works should not cause damage or loss to other users of the river – in the reaches upstream and downstream as well as in the reach in question (see Box 8.1) – and there should be no long-term reduction in the ecological status of the watercourse. It is recognised that a temporary, short-term reduction may be inevitable during and immediately following the works. An environmental impact assessment (EIA) considers both short-term and long-term impacts of proposed works.

As a general rule, the first point of contact when considering river works should be the Environment Agency, which will be able to provide advice on current use of the river and any restrictions or legal requirements. This should then lead the way to establishing contact with all persons with rights or responsibilities for the reach of river in question, not least of all the riparian owners (the owners of land that abuts a watercourse, with ownership usually extending to its centreline).

Box 8.1 Consider and consult before starting work

In a recent legal case, one landowner (the plaintiff) sought compensation from another (the defendant) when works carried out on behalf of the defendant fundamentally changed the flow regime through the plaintiff’s land. The watercourse in question comprised two roughly parallel channels, one being the original course of the stream and the other a man-made channel constructed generations ago to feed water meadows.
The crude structure that regulated the division of flow was on the defendant’s land and he owned the land through which the man-made channel flowed. The plaintiff’s land was on the original channel downstream of the division structure. The defendant asked the Environment Agency to carry out maintenance works on his channel because it was overgrown. The work was carried out with the result that the majority of the stream flow was thereafter channelled through the defendant’s land, leaving the plaintiff with a mere trickle. One would have thought, perhaps, that this would be an easy problem to resolve, but the reality was different. The case lasted for several years and the Environment Agency (as executor of the maintenance works) was inevitably drawn into the legal battle. Two very important lessons come out of this example:

- Always consider fully the range of impacts that any proposed works might have before undertaking them.
- Always consult all stakeholders in advance of doing the work.

Anyone who intends to construct works in a watercourse must seek land drainage consent from the Environment Agency. This applies to weirs, culverts, sluices and any works that could have an impact on the flow or water level in the watercourse. Anyone wishing to take water from a watercourse (for irrigation, for example) must first obtain an abstraction licence from the Environment Agency or from the Internal Drainage Board (IDB) as appropriate for the specific area. The Environment Agency has particular responsibility for rivers designated as a ‘main river’. Other watercourses are generally the responsibility of the local council (or the IDB in an IDB area).

In the specific case of culverting a watercourse, Environment Agency policy is that watercourses should not be culverted except where there is no other viable option, because of the environmental degradation that would result. This policy is strongly reinforced by the Water Framework Directive. Similarly, the construction of weirs is generally discouraged because of the potential impact on fish migration.

Riparian owners have certain rights and responsibilities regarding use of the watercourse (Environment Agency, 2007). There may also be byelaws that define rights and responsibilities. There is no duty in common law for a landowner to improve the drainage capacity of a watercourse, but there is a responsibility to maintain the bed and banks and any trees and shrubs growing on the banks. The riparian landowners must also keep the channel clear of debris, including the removal of material that does not originate from their land. Figure 8.2 shows an example of neglect.

Figure 8.2 Fluvial neglect by riparian owners. Allowing a watercourse to deteriorate to this degree is courting disaster. There are potential hazards relating to pollution, obstruction of the flow, and health and safety. The tree on the right has been allowed to grow too large. The sheetpile wall is collapsing into the channel. The demolition debris invites vandalism. Although this is clearly a temporary state, the risks of flood damage, pollution or injury will persist until the channel has been restored to a more natural state with adequate capacity and stable banks.

A riparian landowner has the right to receive water in its natural quantity and quality, although it is often difficult to define what is meant by ‘natural’ in this context. By the same token, a riparian landowner has the responsibility to pass on flow without obstruction, pollution or diversion affecting the rights of others.
Although many of the fundamentals of the law affecting watercourses are straightforward and sensible, their interpretation can be legally complex. If there is any doubt about the legality of any proposed works – whether new works or the maintenance of existing works – the promoters of such works should seek legal advice at an early stage of their proposal.

8.1.3 Rivers as dynamic systems

Natural channels are subject to continuous change in response to a wide range of influences. Not least of these is the ever-changing flow pattern which reflects day-to-day weather conditions, seasonal changes, and longer term changes to the catchment and in the global climate. Figure 8.3 illustrates this hydrological variability with flow data from the River Trent. Fluvial designers ignore this dynamic nature of rivers at their peril.

Figure 8.3 Hydrological variability. This graph shows the changing flow conditions in the River Trent. The three bars for each month represent (from left to right) the long-term average flows, flows in 1998 (a wet year) and flows in 1996 (a dry year). The variability is pronounced, even more so when it is appreciated that these are monthly averages. Although the monthly average varies from a low of 25 m3/s to over 200 m3/s, mean daily flows (not shown here) fell as low as 15 m3/s in August 1976 and reached a peak of 1019 m3/s in November 2000. This natural variability has implications for construction, operation and maintenance activities in the river.

Not only does a natural watercourse exhibit variability in its flow regime, but the boundaries of the channel are also subject to change. The rate of change depends on a wide range of factors including:

- the nature of the soils through which the channel flows;
- climatic conditions;
- human activity.

In any given reach of channel, there tends to be a natural regime which defines the channel cross section (that is, the width, the depth and the slopes of the banks). Artificial changes to the channel cross section are generally temporary as the channel naturally reverts to its regime condition. Gravel-bed rivers can be particularly problematic, because large quantities of gravel can be deposited in a single flood event. This can result in any benefits achieved by dredging works being wiped out in a matter of hours.

Considerable research has been carried out into the regime state of natural channels (Nixon, 1959) and it is generally accepted that the regime cross section is one that is capable of conveying the mean annual flood. This makes it clear that engineering a channel to carry much bigger flows (for example, the flood with a 1% annual exceedance probability – see Section 2.4.1) represents a major change from the natural state.

An artificially deepened channel tends to silt up so that the bed level returns to its pre-dredged level (hence the cyclical nature of maintenance dredging in navigable rivers). Similarly, widening a river to achieve greater flow capacity often achieves only a temporary outcome, as shoaling tends to restore the natural width. Steepening a reach of channel (for example, by cutting off a meander loop) can introduce instability, leading to erosion of the bed or banks as the stream attempts to revert to its former regime state. An obstruction in a river channel also changes the natural regime, though over time a new regime may establish itself. Thus the construction of a weir in a river causes a backwater effect, and the resulting slower flow velocities encourage sedimentation upstream of the weir.
A new balance will be achieved after some time (possibly many years). Weirs and similar structures also create an obstruction for wildlife – especially fish – unless special measures are incorporated (such as a fishpass). Figure 8.4 shows an example of fluvial adaptation.

Figure 8.4 Fluvial adaptation. This large capacity culvert has been engineered to ensure that it does not cause a restriction to flood flows. But for most of the time flows in the stream are relatively small, with low velocities that encourage sedimentation. The islands of sediment become colonised with vegetation, making them erosion-resistant and prone to attracting more sedimentation. Regular maintenance is needed to clear out the sediment and vegetation to avoid loss of flood capacity. Provision of an access ramp may be required to facilitate sediment removal. A two-stage culvert (with the outer boxes at a higher level), with similar approach channel geometry, could help to reduce maintenance obligations.

With construction works in river channels, there is a wide range of potential adverse impacts that must be addressed in advance in order to avoid or mitigate the impacts. Of course, there are often opportunities for positive impacts associated with works in river channels. It is up to the promoters of the works to liaise with river users and local interest groups (anglers, conservationists, fisheries, navigation interests, etc) to explore the possibilities for mutual benefit. Table 8.1 indicates potential negative impacts but also includes references to potential benefits.

Table 8.1 Works in river channels – potential negative impacts

Nature of work: Construction of a structure in the watercourse (a weir, for example)
Potential negative impacts: Rise in flood level upstream. Obstruction to the passage of fish. Sedimentation upstream. Restriction to navigation.
Notes: These are all potentially permanent negative impacts which may require mitigating action (such as raising flood defences or creating a fishpass). Potential benefits include improved amenity, aeration of the water, navigation and micro-power generation.

Nature of work: Diversion of the watercourse (resulting in shorter stream length)
Potential negative impacts: Increase in stream slope, leading to erosion of bed and banks. Possible interference with agricultural drainage systems.
Notes: Can be addressed by erosion protection measures. Potential benefits include reduced maintenance requirement and freeing up land for other uses.

Nature of work: Widening of the watercourse
Potential negative impacts: Temporary loss of marginal and bankside vegetation. Damage to habitats (for example, water voles). Reduction in flow depth.
Notes: Natural streams tend to revert to regime width over time. Benefits include the potential for a wider range of habitats.

Nature of work: Deepening of the watercourse (dredging)
Potential negative impacts: Promotion of sedimentation (also depriving downstream reaches of sediment). Temporary damage to the ecology of the stream bed and hence the natural fauna. There are also negative impacts while the work is being carried out (increased sediment entrainment in the flow).
Notes: In any sediment-transporting stream, a deepened reach of channel tends to silt up. In gravel-transporting streams, reversion to the former bed level may be rapid (perhaps in the course of one flood). Dredged material has to be safely disposed of, and this may be expensive, particularly as the material may be contaminated. Benefits are improved flood capacity and increased depth for navigation.

Nature of work: Construction of erosion protection works on the bed or banks
Potential negative impacts: Damage to natural vegetation and habitats. Water pollution (avoid by using appropriate materials).
Notes: Mature trees and shrubs should be maintained where practicable. Benefits may include a reduction in regular maintenance liability and the security of adjacent land, property and infrastructure.
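As flagged in Section 8.1.1, here is a minimal sketch of the kind of capacity calculation discussed there, using Manning's equation for a simple rectangular cross section. The geometry, slope and roughness values are illustrative assumptions only; a real assessment would use surveyed cross sections and a tool such as the CES:

```python
# Back-of-the-envelope channel capacity via Manning's equation:
# Q = (A / n) * R^(2/3) * S^(1/2), with R = A / P.
def manning_discharge(width_m, depth_m, slope, n):
    area = width_m * depth_m                    # flow area A (m2)
    wetted_perimeter = width_m + 2.0 * depth_m  # P for a rectangular section (m)
    hydraulic_radius = area / wetted_perimeter  # R (m)
    return (area / n) * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

# The same channel, well maintained versus choked with vegetation
# (vegetation raises the effective Manning's n).
for n_value, label in [(0.035, "maintained"), (0.10, "heavily vegetated")]:
    q = manning_discharge(width_m=8.0, depth_m=1.5, slope=0.001, n=n_value)
    print(f"{label}: Q is roughly {q:.1f} m3/s")
```

The comparison makes the point running through this section: letting vegetation raise the effective roughness can cut a channel's capacity by a factor of two or more, which is why the required capacity, and the maintenance regime needed to sustain it, must be defined explicitly.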
2016-40/3982/en_head.json.gz/7252 | Federal Drive Interviews — March 5, 2013
CIO Council harnessing growing list of ‘world class’ IT programs
Joint Chiefs welcome House fiscal 2013 budget proposal
By Jared Serbu | @jserbuWFED | March 6, 2013 6:36 am
Members of the military’s Joint Chiefs of Staff told lawmakers Tuesday that the fiscal 2013 budget package the House intends to take to the floor this week would be a welcome alternative to the conflagration of budget shortfalls that face the government, and would go a considerable distance toward easing the military’s budget challenges in the current year.
House lawmakers are planning a Thursday vote on a funding package that would provide 2013 budgets for the Defense and Veterans Affairs departments while leaving the rest of the government in continuing resolution mode. The plan does not check off every box on the Joint Chiefs’ wish list, but they told the House Appropriations Committee that it would lift much of the burden they face this year: the potential for a full-year continuing resolution topped off by across-the-board cuts under sequestration. “It’s absolutely critical that we do this,” said Gen. Ray Odierno, the Army’s chief of staff. “It mitigates at least one-third of our problem.”
The other two-thirds for the Army are sequestration and unexpected costs involved in the war in Afghanistan this year, contributing to a projected shortfall of $18 billion in Army operation and maintenance accounts under current law.
The House budget plan got an even more enthusiastic reception from Adm. Jonathan Greenert, the chief of naval operations.

“For us, it’s almost night-and-day,” he said. “Right now, I’m $8.6 billion out of balance in my operations account. This would eliminate $4.6 billion of that right off the bat. What that means in simple terms is that, right now, we can put one carrier strike group and one amphibious ready group forward, and pretty much not much else in the rest of the world. If we get a bill, we can restore our covenant with the combatant commanders and get almost all that back.”

No relief for civilian agencies

Under the measure, DoD and VA would receive full budgets mostly along the lines of what they requested for the year. Across-the-board cuts required by sequestration would be left in place, but military leaders say the automatic spending cuts will be more manageable because the slicing would at least come from accounts that generally match up with the Pentagon’s current-year needs, rather than cutting from individual budget lines that are billions of dollars out of whack to begin with.

In the Navy, for example, a full-year CR copied-and-pasted from 2012 would provide several billion dollars more than the service requested for this year for weapons systems and several billion fewer than it needs for its operational accounts, which fund everything from combat training to civilian salaries to military healthcare to maintenance on military barracks. “It would leave us out of balance,” Greenert said.

Raw numbers aside, the military also would be relieved from the prohibition against initiating new contracts that it had already planned for. A ban on “new starts” is an inherent feature of operating under a continuing resolution.

“We’d get two carrier overhauls. We’d get the construction of a new carrier. We’d get new construction for submarines. We’d get all of our military construction. We don’t have any of that right now,” Greenert said. “All of that comes back, and of course, we’d get facility renovation, which we don’t have right now because we’ve had to put it off to pay for this imbalance.”

Gen. James Amos, the commandant of the Marine Corps, said a bona fide 2013 budget would let his service sign numerous contracts that have been waiting for a budget, including its final planned multi-year purchase of V-22 tilt-rotor aircraft. “If I had to buy those things one-at-a-time instead of doing a multi-year, it would cost the government an extra billion dollars at the end of the day,” he said.

More flexibility for budget planners?

The House bill doesn’t cancel sequestration for DoD or any other agency, but it does offer additional mechanisms to soften the blow of the automatic cuts, at least for Defense. Congress would give the Pentagon general authority to reprogram up to $4 billion between any of its accounts and an additional $3.5 billion in “special reprogramming authority” between its military construction accounts.

“It provides flexibility to the military, maybe not as much as they would like,” said Rep. Bill Young (R-Fla.), the chairman of the Appropriations Subcommittee on Defense. “But there’s flexibility in the plan that we’re moving this week.”

But with some narrow exceptions, that flexibility would not be available to the rest of the government. Besides DoD and VA, other agencies would continue to operate under both a full-year continuing resolution and sequestration, making the bill unpalatable to the Democratic majority in the Senate.
In the House, the bill met with significant skepticism among members of the Democratic minority, including those who serve on Defense-friendly committees, who would like some relief from sequestration for the rest of the government.

“Why can’t this include other agencies? We have a Homeland Security bill ready to go, and the same is true of many other bills,” said Rep. Sam Farr (D-Calif.).

Likewise, the bill does not appear to offer agencies such as the Department of Transportation the ability to apportion their program-by-program sequestration cuts in a more thought-out manner, said Rep. Rick Larsen (D-Wash.).

“A lot of people act like there’s a line-item in the budget for waste, fraud and abuse in every agency,” Larsen said. “The FAA has to cut about $627 million, but even if all of that were waste, fraud or abuse, they would only be allowed to cut 8-to-10 percent of that under sequestration. That just underscores the inflexibility of sequestration. This flexibility should apply to all agencies. I have folks who are making choices about housing vouchers in my district and choosing who is going to get Meals on Wheels. If it’s good enough for the goose, it’s good enough for the gander.”

The Pentagon said earlier this week that its 2013 cutbacks, under a scenario that included both sequestration and a continuing resolution, would include the closure of military commissaries one day per week, a potential reduction in the staff and funding available to run military healthcare facilities, and the furlough of about 15,000 school teachers on military bases out of the broader population of 780,000 civil servants in the military services who would be told to take an effective 20 percent salary cut for the remainder of the year.

No veto threat

For DoD, the enactment of a 2013 budget could change that, said Gen. Mark Welsh, the Air Force’s chief of staff, even if sequestration remains in effect. “It allows us to figure out a way around this idea of furloughing 180,000 great civilian airmen. We want no part of that,” he said.

The House proposal includes other items that make it unlikely to pass the Senate in its current form, including bans on funding for the implementation of the Affordable Care Act and the Dodd-Frank financial reform bill. Nonetheless, the White House, in its official response to the bill’s introduction, did not voice an explicit veto threat. OMB said only that it was “deeply concerned” about the measure.

“While the legislation includes the Department of Defense … the remainder of federal agencies are left to operate at last year’s level, which will impede their ability to provide services to Americans and efficiently allocate funding to key programs including those in infrastructure, clean energy, education, and research and development,” the White House said in a Statement of Administration Policy. “The administration looks forward to working with the Congress to refine the legislation to address these concerns.”
2016-40/3982/en_head.json.gz/7447 | Wilkinson Microwave Anisotropy Probe
How Did Structure Form in the Universe?
The Big Bang theory is widely considered to be a successful theory of cosmology, but the theory is incomplete. In its simplest form, the Big Bang theory assumes that matter and radiation are uniformly distributed throughout the universe and that general relativity is universally valid. While this can account for the existence of the cosmic microwave background radiation and explain the origin of the light elements, it does not explain the existence of stars, galaxies and large-scale structure. The famous "Deep Field Image" taken by the Hubble Space Telescope, shown below, provides a stunning view of such structure. How did these structures form? Most cosmologists believe that the galaxies that we observe today grew from the gravitational pull of small fluctuations in the nearly-uniform density of the early universe. These fluctuations leave an imprint in the cosmic microwave background radiation in the form of temperature fluctuations from point to point across the sky. The WMAP satellite measures these small fluctuations in the temperature of the cosmic microwave background radiation and in turn probes the early stages of structure formation.
Hubble Deep Field Image (see the HST press release describing this image)
The solution of the structure problem must be built into the framework of the Big Bang theory. WMAP's observations provide the type of data needed to form detailed theories to answer these questions.
Learn More About Structure Formation:
Visit our Universe 101 pages for more details on structure and the first objects in our universe.
This former GM plant site in Lansing, shown here in 2006, is among the locations being considered for a 20 MW solar array by the city's utility. (Photo by Keith Kris via Creative Commons)
Planned project would nearly double Michigan’s solar capacity
Written by Andy Balaskovitz | 03/11/2015
Developers and Michigan’s largest municipally owned utility could nearly double the state’s solar energy portfolio by partnering on what would be by far the largest single solar project in the state.
An official with the Lansing Board of Water and Light confirmed to Midwest Energy News on Wednesday that the utility has selected a developer for a 20 MW solar project.
The original request for proposals, which was sent out last summer and attracted more than a dozen responses, was for a 5 MW project. The state’s largest solar projects operating or under development are less than 1.5 MW, while roughly 23 MW of commercial-scale solar statewide is tracked by the Michigan Public Service Commission.
Solar currently makes up roughly 1 percent of Michigan’s 2,300 MW renewable energy portfolio, according to the PSC.
One of the sites in the running is a vacant industrial property formerly owned by General Motors less than two miles from the state Capitol building.
A power purchase agreement is yet to be finalized and the original proposed location — GM’s former Verlinden plant that was demolished nearly 10 years ago — still needs to be secured, but the BWL is intent on increasing the scale of its original proposal.
“We got a whole lot of bids, there was a lot of interest,” said George Stojic, the BWL’s executive director of planning and development. “It just made sense to scale this thing up.”
The utility selected Vermont-based groSolar to develop a 20 MW project. Stojic said it’s up to the developer to ultimately secure a location.
Steve Remen, groSolar’s executive vice president of business development, said the former GM site is one of several the company is considering in the Lansing area and the project could need more than one property.
“Oftentimes, those sites make very good sites for solar projects,” Remen said, referring to vacant industrial parcels that likely have legacy contamination. “It’s an excellent re-use of the property.”
Stojic said the cost of solar would be roughly $60 per MWh at the former GM site, but that could vary depending on the location.
The property is owned by RACER Trust, which was created to manage, clean up and market former GM properties after the company’s bankruptcy in 2009. RACER oversees more than 250 acres of vacant industrial land in the Lansing area. The proposed site is on 57 acres west of downtown Lansing where Pontiacs, Oldsmobiles and Chevrolets used to be assembled.
Stojic said it made sense economically to scale up the project based on data taken from the BWL’s smaller, 150 kW solar array. Also, the BWL has limited, point-to-point transmission, he said.
“We are a summer-peaking utility. Solar fits into that very well for two reasons: It’s there in the summertime if we need it and it helps offset transmission costs,” he said.
Stojic said he hopes details will be finalized and made public within the next couple of months.
“I wanted a fairly significantly sized project,” Stojic said. “I’m very excited about it.”
One thought on “Planned project would nearly double Michigan’s solar capacity”
Bob Fittro on 03/12/2015 at 9:13 am said:
I am wondering how this will affect people in the Northern counties?
NAU news
For NAU Mars researcher, a plaque in hand is worth an asteroid in the belt
June 3, 2014 — Nadine Barlow may be keeping her eyes on Mars, but a faint asteroid zipping along an orbit between the red planet and Jupiter has her name on it.
The asteroid, 15466 Barlow, was named in her honor by the International Astronomical Union. Now, years later, the Northern Arizona University professor of physics and astronomy finally has the official plaque.
Barlow received the plaque in March at the Lunar and Planetary Science Conference in Houston. Longtime collaborators Faith Vilas, from the Planetary Science Institute, and Joe Boyce, from the University of Hawaii, handled all the details, making sure Barlow would have something on her wall to signify the achievement.
“It seems to be a very well-behaved main belt asteroid and there’s nothing too exciting about it,” Barlow said, light-heartedly deflecting any suggestion that the space rock reflects her personality. “But what I think is really cool is that it was discovered here in Flagstaff at Lowell Observatory.”
That discovery took place at the Anderson Mesa Station in 1999, and the naming originated as a way to acknowledge Barlow’s place as one of the world’s foremost experts on Martian impact craters.
“It’s a very nice recognition of the work I’ve done, and the fact that my colleagues recognize it and are happy with it,” Barlow said.
That work continues at a prodigious pace and along numerous lines of pursuit. Research findings about low-aspect-ratio layered ejecta craters, on which Boyce collaborated, are to be published in the planetary science journal Icarus. Barlow recently participated in a panel discussion about recent discoveries from spacecraft exploration at the Brookings Institution in Washington, D.C., and she has also been busy working with a team of analysts within the Mars Exploration Program Analysis Group.
The researchers are analyzing “special regions” on Mars: areas where Martian life could potentially exist, or where terrestrial microbes could survive and propagate if they hitched a ride on a probe from Earth.
“I was asked to participate because of my expertise with impact craters and how they might tie in to heating up the environment,” Barlow said. She explained that near-surface ice is common on Mars, so a large enough impact in such an area could create a warm and wet environment hospitable to microbes.
“Spacecraft going to the surface of Mars are sterilized but the level of sterilization depends on where they’re actually going,” Barlow said. “There are a few areas where you’d have to be very cautious about introducing microbes.”
Barlow and the rest of the group expect to finish writing their findings this summer. But when she has the time and the proper telescope access, she wouldn’t mind catching a glimpse of a certain asteroid. 15466 Barlow isn’t expected to come any closer to Earth than 4.1 million miles, and that won’t be until 2166.
“At least it doesn’t look like it’s one of those asteroids that’s going to threaten the Earth,” Barlow said. “That’s the kind of asteroid you don’t want named after you.”
Buff Kennewick Man Had Coastal Diet
By Anna King
Oct 11, 2012
[Image: Final Kennewick Man facial reconstruction. Photo by Brittney Tatchell / Northwest News Network]
For nearly a decade, scientists and Northwest tribes fought bitterly over whether to bury or study the 9,500-year-old bones known as Kennewick Man. Now, after years of careful examination, scientists are releasing some of their findings to tribes at meetings this week in Central Washington. As correspondent Anna King reports, Kennewick Man grew up on the coast.

Kennewick Man was buff. I mean really – beefcake. So says Doug Owsley. He’s the head of physical anthropology at the Smithsonian’s National Museum of Natural History and led the study of the ancient remains. Owsley can read the bones like we might read a book. He looks for ridge lines that indicate which muscles Kennewick Man used the most and what he was doing with them. First off? He had muscular legs like a soccer player – likely from running, trudging and hunting.

Owsley: “In his leg structure he’s certainly accustomed to very rapid movement, quick movement, and you can read that in those muscle ridges.”

He also likely had killer arms, because he threw a spear with the aid of a lever-like tool. Owsley says Kennewick Man was so strong in his right arm he was like a pro baseball pitcher, and the bones show he got today’s equivalent of a career-ending sports injury.

Owsley: “If it happened to a contemporary baseball pitcher, they’d need surgery. And so it took off a piece of bone off the back side of the shoulder joint that would have been essentially loose. And I’m sure that caused great complications in his ability to throw.”

Owsley says Kennewick Man stood about 5’7” and weighed about 170 pounds. And he wasn’t any stranger to pain. The evidence shows "K-Man," as he’s known in Eastern Washington, got hit on the head a few times and stabbed with a basalt rock point that embedded in his hip.

Owsley’s research includes this big revelation: Kennewick Man wasn’t from the southeast Washington region along the Columbia River where he was found. Instead, Owsley said, he was from the coast. The scientists can tell from chemical tests on tiny bits of his bones and the enamel on his teeth that he ate mostly marine animals.

Owsley: “Once a tooth erupts it doesn’t change. So that tiny, tiny piece of tooth enamel, with just hitting it with the same sort of process, you can tell where he grew up as a child.”

Owsley and forensic artists came up with a new sculpture of what Kennewick Man looks like. The remains, known to tribes as the Ancient One, draw ancestry from the ancient peoples of Asia, Owsley says. The scientist describes the moment he looked at the new reconstruction of Kennewick Man’s face.

Owsley: “He’s so lifelike. And when you look at those eyes, those eyes have such a piercing glare. I think this man has a story to tell us. There is very little known about that time. And he’s a true messenger.”

Owsley has a new young-adult book out with author Sally Walker. It’s called “Their Skeletons Speak.” Owsley plans to release a much larger scientific text soon.
James Webb on Energy & Oil
Democratic Sr Senator
Strong supporter of nuclear power & all-of-the-above
The question really is how are we going to solve energy problems if you really want to address climate change? When I was in the Senate,
I was an all-of-the-above energy voter. We introduced legislation to bring in alternate energy as well as nuclear power. I'm a strong proponent of nuclear power. It is safe, it is clean.
Source: 2015 CNN Democratic primary debate in Las Vegas
To solve climate change, India and China must participate
We are not going to solve climate change simply with the laws here. We've done a good job in this country since 1970. If you look at China and India, they're the greatest polluters in the world.
Fifteen out of the 20 most polluted cities in the world are in one of those two countries. We need to solve this in a global way. It's a global problem and the so-called agreements that we have had with China are illusory.
Energy expansion: Keystone XL and off-coast drilling
While in the U.S. Senate, Webb voted for an amendment to at least temporarily block the Environmental Protection Agency from regulating greenhouse gas emissions, arguing that the nation's energy concerns were pressing and Congress needed to
have more input in regulation. He has strongly advocated energy expansion, including construction of the Keystone XL pipeline and drilling off the coast of Virginia.
Source: PBS News Hour "2016 Candidate Stands" series
Bush called for energy independence every year since 2001
This is the seventh time the president has mentioned energy independence in his state of the union message, but for the first time this exchange is taking place in a Congress led by the Democratic Party. We are looking for affirmative solutions that will
strengthen our nation by freeing us from our dependence on foreign oil, and spurring a wave of entrepreneurial growth in the form of alternate energy programs. We look forward to working with the president and his party to bring about these changes.
Source: Democratic response to 2007 State of the Union address
Support alternative sources instead of drilling ANWR
Webb supports expanding solar-, nuclear-, and ethanol-energy sources rather than allowing drilling in the Arctic National Wildlife Refuge.
Source: Jeanne Cummings, Wall Street Journal, p. A6
Voted NO on barring EPA from regulating greenhouse gases.
Voted YES on protecting middle-income taxpayers from a national energy tax.
Congressional Summary: On budget resolutions, it shall not be in order in the Senate to consider any bill or amendment that includes a National energy tax increase which would have widespread applicability on middle-income taxpayers.
The term "middle-income" taxpayers means single individuals with $200,000 or less in adjusted gross income and married couples filing jointly with $250,000 or less.
The term "widespread applicability" includes the definition with respect to individual income taxpayers.
The term "National energy tax increase" means any legislation that the Congressional Budget Office would score as leading to an increase in the costs of producing, generating or consuming energy.
Proponent's argument to vote Yes: Sen. LINDSEY GRAHAM (R, SC): The climate change proposal that was in the President's budget would create a massive tax increase on anybody who uses energy, and that would be every American middle-class family, which already has a tough time getting by. This [amendment creates a procedure to block] any bill that would raise the cost of energy on our middle-class families who are struggling to get by. I ask the Senate to rally around this concept. We can deal with climate change without passing a $3,000-per-household energy tax on the families of America who are having a hard time paying their bills.
Opponent's argument to vote No: No senators spoke against the amendment.
Reference: Graham Amendment; Bill S.Amdt.910 to S.Con.Res.13
Voted YES on requiring full Senate debate and vote on cap-and-trade.
Congressional Summary: AMENDMENT PURPOSE: To prohibit the use of reconciliation in the Senate for climate change legislation involving a cap and trade system.
Sec. 202 is amended by inserting at the end the following: "The Chairman of the Senate Committee on the Budget shall not revise the allocations in this resolution if the legislation is reported from any committee pursuant to sec. 310 of the Congressional Budget Act of 1974."
Proponent's argument to vote Yes: Sen. LINDSEY GRAHAM (R, SC): This idea to most people of a debate about reconciliation probably is mind-numbing and not very interesting. But there is a process in the Congress where you can take legislation and basically put it on a fast track. It is subject to 50 votes. The whole idea of the Senate kind of cooling things down has served the country well. In that regard, to end debate you need 60 votes. If 41 Senators are opposed to a piece of legislation, strongly enough to come to the floor every day and talk about it, that legislation doesn't go anywhere. If you took climate change and health care, two very controversial, big-ticket items, and put them on the reconciliation track, you would basically be doing a lot of damage to the role of the Senate in a constitutional democracy.
Senator Byrd, who is one of the smartest people to ever serve in the Senate about rules and parliamentary aspects of the Senate, said that to put climate change and health care reform in reconciliation is like "a freight train through Congress" and is "an outrage that must be resisted." Senator Conrad said: "I don't believe reconciliation was ever intended for this purpose."
I think both of them are right. Under the law, you cannot put Social Security into reconciliation because we know how controversial and difficult that is. I come here in support of the Johanns amendment that rejects that idea.
Opponent's argument to vote No: No senators spoke against the amendment.
Reference: Johanns Amendment; Bill S.Amdt.735 to S.Con.Res.13
Voted YES on tax incentives for energy production and conservation.
OnTheIssues.org Explanation: A "Cloture Motion" would end debate on the bill, and then allow a vote on passage. This motion failed (3/5ths of the Senators must vote YEA), based on objections of how the new incentives would be paid for.
Congressional Summary: A bill to amend the Internal Revenue Code of 1986 to provide tax incentives for energy production and conservation, to extend certain expiring provisions, and to provide individual income tax relief.
TITLE I--ENERGY TAX INCENTIVES
Sec. 102. Production credit for electricity produced from marine renewables.
Sec. 104. Credit for residential energy efficient property.
Sec. 106. New clean renewable energy bonds.
Part II--Carbon Mitigation Provisions
Sec. 112. Expansion and modification of coal gasification investment credit.
Sec. 115. Carbon audit of the tax code.
Sec. 121. Inclusion of cellulosic biofuel in bonus depreciation for biomass ethanol plant property.
Sec. 122. Credits for biodiesel and renewable diesel.
Sec. 124. Credit for new qualified plug-in electric drive motor vehicles.
Sec. 127. Transportation fringe benefit to bicycle commuters.
Sec. 146. Qualified green building and sustainable design projects.
Opponents' argument for voting NAY: Sen. SPECTER: H.R. 6049 would revive important tax provisions that expired at the end of 2007 and extend provisions that are set to expire at the end of 2008. I support extension of the R&D tax credit, the renewable energy tax incentives, and many other important provisions in this package.
Despite the positive elements of this legislation, the main sticking point is whether temporary extensions of tax relief should be offset with permanent tax increases elsewhere. The White House issued a statement recommending a Presidential veto of this bill in its current form. [Vote NAY to] allow the Senate to work its will and pass legislation that can be quickly signed by the President.
Reference: Renewable Energy and Job Creation Act; Bill HR6049
Voted YES on addressing CO2 emissions without considering India & China.
OnTheIssues.org Explanation: This is a motion on an omnibus spending bill, sending instructions to the committee resolving differences between the House and Senate versions of the bill. Sen. Boxer introduced this motion, and Sen. DeMint introduced a counter-motion. Voting for the Boxer motion means you favor Boxer's method over DeMint's method, which means speeding up Congressional action on global warming.
Opponents' argument for voting NAY: Sen. DeMINT: When we are talking about trade agreements, there needs to be a level playing field. This motion would prevent Congress from passing any law with new mandates on greenhouse gas emissions that would harm the U.S. economy or result in job loss unless both China and India had the same mandates--in other words, if we had a level playing field. It is not going to help the environment in the United States or the world if we pass mandates that raise the cost of doing business in our country, if we create mandates that do not exist in India or China.
Proponents' argument for voting YEA: Sen. BOXER: I rise to speak against the DeMint motion and in favor of the Boxer motion. The DeMint motion is a throwback to 10 years ago when everybody, including myself, was saying we better watch out and not do anything about global warming until the undeveloped world acts. We cannot do that anymore. This is a time when we need to stand up as the leading country in the world and say that we can fight global warming, and we can win this fight. But what happens with the DeMint motion, he gives China and India a veto power over what we should be doing. Imagine saying we are not going to do anything about human rights until China acts. Why would we give up our chance to take the mantle of leadership and finally grab hold of this issue? I cannot look into the eyes of my grandchildren and tell them: Sorry, I am giving over my proxy to China & India, and I can't do anything about it.
Reference: Motion to Instruct Conferees (China-India) re: S.Con.Res.70; Bill Motion to Instruct S.Con.Res70
Voted YES on removing oil & gas exploration subsidies.
Creating Long-term Energy Alternatives for the Nation (CLEAN) Act
Title I: Ending Subsidies for Big Oil Act--denying a deduction for income attributable to domestic production of oil, natural gas, or their related primary products.
Title II: Royalty Relief for American Consumers Act--to incorporate specified price thresholds for royalties on oil & gas leases in the Gulf of Mexico.
Title III: Strategic Energy Efficiency And Renewables Reserve--makes the Reserve available to accelerate the use of clean domestic renewable energy resources and alternative fuels.
Proponents support voting YES because: This legislation seeks to end the unwarranted tax breaks & subsidies which have been lavished on Big Oil over the last several years, at a time of record prices at the gas pump and record oil industry profits. Big Oil is hitting the American taxpayer not once, not twice, but three times. They are hitting them at the pump, they are hitting them through the Tax Code, and they are hitting them with royalty holidays put into oil in 1995 and again in 2005.
It is time to vote for the integrity of America's resources, to vote for the end of corporate welfare, to vote for a new era in the management of our public energy resources.
Opponents support voting NO because: I am wearing this red shirt today, because this shirt is the color of the bill that we are debating, communist red. It is a taking. It will go to court, and it should be decided in court. This bill will increase the competitive edge of foreign oil imported to this country. If the problem is foreign oil, why increase taxes and make it harder to produce American oil and gas? That makes no sense. We should insert taxes on all foreign oil imported. That would raise your money for renewable resources. But what we are doing here today is taxing our domestic oil. We are raising dollars supposedly for renewable resources, yet we are still burning fossil fuels.
Status: Bill passed, 65-27.
Reference: Creating Long-Term Energy Alternatives for the Nation (CLEAN); Bill H.R.6
Voted YES on making oil-producing and exporting cartels illegal.
Voting YES would amend the Sherman Anti-Trust Act to make oil-producing and exporting cartels illegal. It would be a violation for any foreign state: to limit the production or distribution of oil & natural gas; to set or maintain the price of oil & natural gas; or to otherwise take any action in restraint of trade for oil & natural gas; when such collective action has a direct, substantial, and reasonably foreseeable effect on the market, supply, price, or distribution of oil & natural gas in the US.
Proponents recommend voting YES because: Our NOPEC bill will authorize filing suit against nations that participate in a conspiracy to limit the supply, or fix the price, of oil. In addition, it will specify that the doctrines of sovereign immunity do not exempt nations that participate in oil cartels from basic antitrust law.
Opponents recommend voting NO because: No one likes OPEC. But this amendment, in my opinion, would make bad law. The Framers of the Constitution wisely assigned responsibility for formulating foreign policy and conducting foreign relations to the President and to the Congress, not to the law courts. The amendment before us has its roots in a lawsuit filed by the labor union nearly 30 years ago. The union at that time charged OPEC with price fixing in violation of our antitrust laws. The trial court dismissed the case on the ground that OPEC members are sovereign nations and are immune from suit. Adopting the amendment will undoubtedly be very popular, but it is also very unwise.
In addition, we here in the Senate ought to consider how enactment of this amendment might affect our relations with OPEC members. What will be the international repercussions when the US starts awarding judgments against foreign nations and attaching their assets in this country? Will other nations start to view our trade policies--such as our nuclear trade restrictions--as violations of their antitrust laws?
Reference: NOPEC Amendment to CLEAN Energy Act; Bill S.Amdt.1519 to H.R.6
Voted NO on factoring global warming into federal project planning.
The amendment would require the consideration of global climate change in planning, feasibility studies, & general reevaluation reports. It would require accounting for the costs & benefits from the impacts of global climate change on flood, storm, and drought risks; potential future impacts of global climate change-related weather events, such as increased hurricane activity, intensity, storm surge, sea level rise, and associated flooding; & would employ nonstructural approaches and design modifications to avoid or prevent impacts to streams, wetlands, and floodplains that provide natural flood and storm buffers.
Proponents recommend voting YES because: It just seems logical that we ask the Corps of Engineers to include in their analyses judgments about the potential impact of global climate change. All this amendment seeks to do, as a matter of common sense, is to ask the Army Corps of Engineers to factor climate change into their future plans. Secondly, we are making a statement here to finally recognize the reality of what is happening with respect to climate change.
Opponents recommend voting NO because: The same people today who are saying we are all going to die from global warming, just back in the middle 1970s were saying another ice age is coming and we are all going to die. Which way do you want it? If a surge of anthropogenic gases--this CO2, methane, or whatever it is--were causing a warming period, then around 1945 we would have a warming period because in the middle 1940s we had the greatest increase in greenhouse gases. But what happened? It did not precipitate a warming period. Peer-reviewed evidence shows that the sun has actually been driving the temperature change. You don't have to be a scientist to know that the Sun can have something to do with climate change. Implementing Kyoto would reduce the average annual household income nearly $2,700, at a time when the cost of all goods would rise sharply.
Reference: Kerry Amendment; Bill S.Amdt.1094 to H.R.1495
Sign on to UN Framework Convention on Climate Change.
Webb co-sponsored signing on to the UN Framework Convention on Climate Change.
Whereas there is a scientific consensus that the continued buildup of anthropogenic greenhouse gases in the atmosphere threatens the stability of the global climate;
Whereas there are significant long-term risks to the economy and the environment of the US from the temperature increases and climatic disruptions that are projected to result from increased greenhouse gas concentrations;
Whereas the US has the largest economy in the world and is also the largest emitter of greenhouse gases;
Whereas reducing greenhouse gas emissions to the levels necessary to avoid serious climatic disruption requires the introduction of new energy technologies and other climate-friendly technologies;
Whereas the development and sale of climate-friendly technologies in the US and internationally present economic opportunities for workers and businesses in the United States;
Whereas President Bush, in the State of the Union Address given in January 2006, called on the US to reduce its 'addiction' to oil and focus its attention on developing cleaner, renewable, and sustainable energy sources;
Now, therefore, be it Resolved, That it is the sense of the Senate that the United States should act to reduce the health, environmental, economic, and national security risks posed by global climate change and foster sustained economic growth through a new generation of technologies, by participating in negotiations under the United Nations Framework Convention on Climate Change, and leading efforts in other international fora, with the objective of securing United States participation in binding agreements that: establish mitigation commitments by all countries that are major emitters of greenhouse gases; establish flexible international mechanisms to minimize the cost of efforts by participating countries; and achieve a significant long-term reduction in global greenhouse gas emissions.
Source: S.RES.30/H.CON.RES.104 07-SR30 on Jan 16, 2007
Entropic Acquires Technology From PLX
By Mike Allen
Entropic Communications, a San Diego maker of hardware and software used in home entertainment systems, said it acquired the broadcast satellite intellectual property and technology from Sunnyvale-based PLX Technology Inc. for up to $8 million and a one-time licensing fee of $4 million.
The acquired assets are complementary to Entropic’s direct broadcast satellite (DBS) outdoor unit product portfolio, and will strengthen its position as the DBS market transitions to a satellite-to-Internet-protocol technology, the company said.
Entropic announced in late June that it expects a better second quarter than it originally forecast in April. The company raised its anticipated revenue for the quarter to a range of $81 million to $82 million, up from the earlier forecast revenue of $75 million to $77 million.
It also forecast earnings per share at break-even. Entropic will release its quarterly financial results Aug. 1.
Malta: the pros and cons of building a startup on the sunny island
Scheduit’s Jeffrey Romano writes about the strengths and weaknesses of the small startup ecosystem that’s currently brewing over in Malta.
By Jeffrey Romano. Jeffrey works at Malta-based startup Scheduit. He regularly contributes to a variety of business blogs covering startups, innovation and technology trends.
Updated February 17th, 2016. Editor’s note: this is a guest post from Jeffrey Romano, who works at Malta-based startup Scheduit.
When thinking about startup hubs in Europe, most people immediately thinks of Berlin, London, Paris, Stockholm, Dublin and increasingly, Lisbon.
One other promising startup location that doesn’t get much press is Malta. Few have noticed, but the tiny Mediterranean island has an intriguing opportunity to position itself uniquely on the European startup map.
For the past 15 years, Malta has built a strong economic foundation by focusing on growth in specific industries. In the online gambling sector, Malta is considered one of the best European locations, with market leaders like Betsson Group, Betfair and Tipico all based on the island. In finance, Malta’s stable banking sector helped the country get through the 2008 recession practically unscathed. And in ICT, Malta boasts an excellent infrastructure backed by an increasingly technical workforce; for the past three years, for example, Malta has had the leading e-government system in the EU.
Growing government support
One other major reason why the island has managed to grow and diversify its economy is due to generous (individual and business) tax benefits that successive governments have implemented and promoted. In reality though, such tax benefits are in competition with others such as the SEIS (Seed Enterprise Investment Scheme) in the UK.
Apart from tax benefits, the Maltese government provides other forms of assistance such as help with access to finance and internationalization. Although such schemes rarely make or break a business, having support such as financial assistance when participating in exhibitions abroad is always helpful.
Government is a key player and understanding startups is important in building a sustainable vision.
Andy Linnas, head of TAKEOFF, Malta’s leading startup incubator, is optimistic about the future of Malta’s ecosystem, pointing out that “companies like HotJar, Reaqta, DiscountIF and Oulala have already achieved impressive traction and we believe that they deserve Malta’s support and attention”. At the same time he points out that “if Malta wants to stay ahead in the new knowledge economy, it has to start thinking and planning in terms of five, ten, and twenty year perspectives”.
The context behind starting up in Malta
The benefits of starting up in Malta go beyond government initiatives. Local developers, for example, are technically competent, speak good English and are hard-working. Historically, the island has always relied on its people to be competitive, as it has no natural resources, and this attitude of grinding through work has permeated Malta’s culture.
On the other hand, Malta is only a country of 425,000 people and the size of the workforce is limited. Similar to other startup hubs, there aren’t enough developers available. However, Malta is a popular place to come and work in; the fantastic weather, lively nightlife and the fact that it is a family-friendly location has attracted many professionals to the island.
In the stressful life of startup founders, little things like waking up to a sunny day, living close to your office and being able to go out and converse with anyone (in English) can have positive productivity effects in the long term. In that respect, Malta is more similar to Silicon Valley than most other European cities.
However, there are aspects of Maltese culture that don’t make it ideal as a startup hub. The preference to settle down, get a mortgage and a stable job is still strongly prevalent, although things are changing.
An aversion to taking risks, together with the tendency towards ‘small thinking’ in how locals run their businesses, can be a dangerous combination. Although this is a common trait in Europe, it is even more serious in Malta where many locals have traditionally perceived themselves as inferior to their continental and American counterparts.
On the flip side, a developer in Malta costs significantly less than a developer in London. In fact, many things cost less in Malta when compared to the UK or continental Europe. From office space to food to entertainment, startups can get more bang for their buck by operating in Malta. Startups can make their money last longer. The lower operating costs can also mean that they need to ask for less venture capital and give up less equity early on.
The need to look beyond Maltese shores
Unfortunately, when it comes to finding venture capital, Malta is not the best place to be.
Although there are a few business angels and corporates prepared to invest in startups, access to finance is often brought up as a major business challenge.
Startup founders must turn their attention to events and contacts abroad in order to find opportunities to pitch experienced investors. This was the approach taken by Rawstream founder Brian Azzopardi. Brian and his startup were winners of Seedcamp’s London program in 2012 and used that as a platform to attract investment.
That said, the reality is that certain investors will only back startups that are close geographically and that limits the options for startups operating in Malta.
Attracting finance is not the only reason why Maltese startups need to travel abroad. Finding high quality networking opportunities is another reason. Although the size of the local startup community is growing, it is still in its infancy. In addition, most local businessmen are still relatively inexperienced with startups. We are only just starting to see the first local startups developing into ‘scale-ups’, so nobody in Malta can really speak about their experience scaling a business to ten million users or about successfully selling their company.
However, there are still opportunities to speak and learn from more experienced founders and investors. We are now beginning to see a number of speakers come to Malta to share their stories and lessons and individuals from the expat community are also coming forward and offering their expertise.
Furthermore, since Malta is well connected to major European airports, participating in the growing number of startup conferences around Europe provides founders with new networking options. One might argue that it is easier to validate and get feedback on your product when you are part of a lively startup ecosystem. That much is true. Yet at the same time, in many cases validation can be achieved by speaking to customers online and by investing in attending a few serious startup and industry events abroad.
Dr. Abdalla Kablan, founder of Scheduit, a Maltese startup that has introduced one of the world’s first social networks using artificial intelligence to match compatible professionals, points out that “we validated our idea by speaking to founders, prospects and investors that we met at international conferences abroad as well as in Malta. We are very happy with using Malta as our base due to the high calibre of professionals, as well as the strategic central location of Malta that enables us to travel flexibly to any summits around the world.”
In conclusion, Malta might not be a startup hub on the same level as bigger European cities. Yet it has an opportunity to attract early-stage startups that want to work in a location combining good weather, low operating costs and increasing government support.

With Maltese startups beginning to think big and take on the global market, it is inevitable that they will start attracting interest from abroad. Will Malta ever develop its own unicorn, similar to how Estonia gave birth to Skype? It’s way too early to say. But what is certain is that the option to start a company from a sunny island in the centre of the Mediterranean is one that more founders will seriously consider in the years to come.
the Breakthrough Institute
Clean Energy Stagnation
Growth in Renewables Outpaced by Fossil Fuels
Global energy consumption data shows that from 1965 to 1999, long before climate policy became fashionable, the proportion of carbon-free energy more than doubled, to more than 13 percent. Since then, there has been little if any progress in expanding the share of carbon-free energy in the global mix. Despite the rhetoric around the rise of renewable energy, this stagnation suggests that the policies employed to accelerate the decarbonization of the global economy have been largely ineffective.
July 09, 2013 | Roger Pielke Jr
The world was moving faster towards reducing its reliance on carbon-intensive energy consumption in the 1970s and 1980s than in the past several decades. In fact, over the past 20 years there has been little if any progress in expanding the share of carbon-free energy in the global mix. Despite the rhetoric around the rise of renewable energy, the data tells a far different story.
Policy makers around the world have frequently expressed their desire to reduce emissions of carbon dioxide enough to stabilize the amount in the atmosphere at a low level. Conceptually, the challenge is akin to stabilizing the amount of water in a bathtub by modulating the amount filling the tub from a spigot. If there is an open drain at the bottom letting a bit of water out, then stabilization of the water’s height occurs when the amount coming into the tub equals the amount draining out.

The carbon dioxide is akin to the water filling the bathtub, and the oceans and the land surface provide some take-up of carbon dioxide, serving like a small open drain at the bottom of the tub. For the stabilization of carbon dioxide, this means that emissions of carbon dioxide, which result primarily from the combustion of fossil fuels (oil, natural gas and coal), must be reduced by something like 80 percent or more.
However, instead of looking at the issue through the lens of emissions, another way to look at the challenge of stabilizing carbon dioxide in the atmosphere is through energy consumption. Whatever the total level of future energy supply turns out to be, to be consistent with stabilization – metaphorically stopping the rise of the water in the bathtub – the proportion of global energy that comes from carbon-free sources needs to exceed 90 percent. So how are we doing working towards that 90 percent?
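To make the bathtub picture concrete, here is a minimal stock-and-flow sketch. It is an illustration only: the parameter values are invented for clarity and are not taken from the article or from any climate model.

```python
# Toy "carbon bathtub": the atmospheric stock rises while the inflow
# (emissions) exceeds the outflow (uptake by oceans and land), and it
# stops rising only when the two roughly balance.

def simulate(years, emissions, uptake_rate=0.02, stock=850.0):
    """Advance the carbon stock one year at a time.

    emissions   -- annual inflow from fossil-fuel burning (GtC/yr)
    uptake_rate -- fraction of the stock above the pre-industrial
                   level drained each year (a crude placeholder)
    stock       -- carbon in the atmosphere (GtC)
    """
    PREINDUSTRIAL = 600.0  # rough pre-industrial stock, GtC
    for _ in range(years):
        drain = uptake_rate * (stock - PREINDUSTRIAL)
        stock += emissions - drain
    return stock

print(round(simulate(50, emissions=10.0)))  # tap left open: still rising after 50 years
print(round(simulate(50, emissions=2.0)))   # ~80% cut: the level stops rising and settles
```

Run with emissions held high, the water keeps rising over the whole horizon; cut the inflow by roughly 80 percent so that it is at or below the drain, and the level flattens out. That is the arithmetic behind the stabilization target described above.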
BP, in its excellent annual statistical report on world energy, provides data that allows us to answer this question. The figure above shows the proportion of global energy consumption that comes from carbon-free sources. These sources include nuclear, hydro, solar, wind, geothermal, and biomass. The graph shows that from 1965 to 1999 the proportion of carbon-free energy in global consumption more than doubled to more than 13 percent, coincident with nuclear power increasing by a factor of 100 and hydropower by a factor of 6.
However, since 1999 the proportion of carbon-free energy in the global mix has dropped slightly. In fact, 1999 was the peak year for non-carbon energy. From 1999 to 2012 consumption of nuclear power dropped by 2 percent. While solar has increased its contribution to consumption by a factor of 100 and wind by 25 from 1999 to 2012, these sources remain at about 1 percent of total global energy consumption, and are dwarfed by the resurgence of coal.
Much is often made of the rise of renewable energy, but the data tells a more sobering story. In the ten years that ended in 2012, the world added about 2,500 million metric tonnes of oil equivalent (in layman’s terms, a lot) to its total energy consumption. Of that increase, about 14 percent came from non-carbon sources. Compare that to the ten years ending in 2002, during which about 19 percent of the new energy consumption over the previous decade came from non-carbon sources. The figure above shows the proportion of annually added energy consumption that comes from carbon-free and carbon-intensive sources.
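For readers who want to reproduce this kind of share arithmetic from the BP workbook, a sketch follows. The column names and sample figures are stand-ins (the real spreadsheet layout differs); the point is simply that the carbon-free share is clean consumption divided by total consumption:

```python
# Carbon-free share of primary energy, computed from a consumption mix.
# Units are million tonnes of oil equivalent (Mtoe), as in the BP review;
# the sample numbers below are illustrative, not BP's actual figures.

CARBON_FREE = {"nuclear", "hydro", "wind", "solar", "geothermal", "biomass"}

def carbon_free_share(mix):
    """Fraction of total consumption coming from non-carbon sources."""
    total = sum(mix.values())
    clean = sum(v for source, v in mix.items() if source in CARBON_FREE)
    return clean / total

mix = {
    "oil": 4130, "gas": 2987, "coal": 3730,        # carbon-intensive
    "nuclear": 560, "hydro": 831, "wind": 118,     # carbon-free
    "solar": 21, "geothermal": 76, "biomass": 60,
}
print(f"carbon-free share: {carbon_free_share(mix):.1%}")  # ~13.3%
```

The same division applied year by year produces the stagnating ~13 percent line discussed in this article.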
The data shows that for several decades the world has seen a halt in progress towards less carbon-intensive energy consumption, at about 13 percent of the total global supply. This stagnation provides further evidence that the policies that have been employed to accelerate rates of decarbonization of the global economy have been largely ineffective. The world was moving faster towards decarbonizing its energy mix long before climate policy became fashionable. Why this was so and what the future might hold will be the subject of future posts in this continuing discussion.
I suppose when anything bumps into economic reality, economics win out. I notice that the US administration expects renewables to continue at between 9.3% and 10.8% of US energy generation between now and 2040.
I also note that NASA Climate Change research has been defunded by $3.8 billion while space research has been un-funded by a similar amount. It begins to look as if AGW is no longer as fashionable as it once was.
By Craig King on 2013 07 10
Sorry that should have been UP-funded for space research.
USA and China have greatly increased renewables, especially wind, but the world can’t get enough renewables to keep up with increased demand. The world’s on fire, ice caps are melting, and as they said on the Titanic, “And the band played on.” Sad.
By Randall Smith on 2013 07 10
The world may be on fire (or not), but what doesn’t work, doesn’t work, and no amount of panic mongering will make wind and solar substitute for our carbon energy sources.
We can’t live without energy, and wind and solar aren’t capable of providing the energy we need in sufficient amounts.
So, coal and gas and oil it is, until a technological breakthrough comes along to give us a better energy source.
Wasting more money on wind and solar won’t do the trick.
By Jacob on 2013 07 10
Actually we can, and wind and solar can too, but that is beside the point: what we can’t live without are functioning ecosystems, which provide air to breathe, liveable temperatures and a relatively stable weather system.
By Honegger Matthias on 2013 07 30
Jacob, we have the capacity in this day and age to replace the world’s power grid with solar alone. The sun radiates the earth with 10,000 times the energy that is used by the world’s consumers today. Just a few centralized locations can provide the world the 18 TW of electricity it needs. And the prospects of solar do not stop there. Passive systems have been tested and proven that use the winter sun’s angle of incidence to heat the inside of a house with minimal energy inputs, use the dynamics of convection currents to heat water without a tank, and, on a larger scale, use mirrors to generate power in a manner similar to coal burning. Many technologies exist beyond photovoltaic panels that can both produce electricity and make our use of it more efficient. What it needs is MORE funding to become even more efficient, and to be competitive with the ‘business as usual’ fuels.
By William Z on 2013 11 19
We often see figures about how much energy the earth receives from the sun in comparison to how much energy we need. That figure is about as useful as knowing the distance of Jupiter’s moon Titan from Jupiter.
Most of the energy the earth receives from the sun falls over the oceans and other areas where there is no practical way to utilize it. What is left is very diffuse which makes harvesting it very expensive. Moreover, it is intermittent.
Although there are situations in which harvesting solar energy is reasonable, it just is not practical to use solar energy as a major source of power for most large countries.
By Frank Eggers on 2013 11 26
William: The sun’s energy that meets our outer atmosphere is 1,370 watts per square meter. Only 10% of it reaches our earth’s surface, and only 1% is converted by PV into electricity; this works only 25% of the time at zero latitude, and as you increase latitude that 25% decreases drastically in the winter, when power is needed most. Transmitting power over long distances costs 7% per 100 km in losses. It is good to dream, but sometimes we have to use math, logic & reality.
By cosmos V on 2013 12 28
The BP chart of energy supply over time shows an earlier rise in non-carbon supply because of nuclear and hydro. Other sources of non-carbon were practically non-existent then. Today, hydro growth is limited to more efficient turbines rather than new dams, with a few exceptions such as China’s Three Gorges project. Nuclear is stalled because of cost and unresolved waste disposal issues.
As I see it, new supply needs to come from renewable sources. Progress is quite good in the developed world. Now to get the developing world headed in the right direction as well.
Critics will question this assessment, but consider that countries serious about renewable energy are supplying around 20% of their electric power from renewables. In our own country, investors are now flocking to giant power-purchase agreement projects that supply 200-400,000 homes with renewable energy. As usual, all eyes are on Warren Buffet:
http://www.sustainablebusiness.com/index.cfm/go/news.display/id/25018
By Lee James on 2013 07 10
You wrote, “Nuclear is stalled because of cost and unresolved waste disposal issues.”
These are problems only because R & D on better nuclear technologies was halted, else we would already be using better, safer, and more economical nuclear technologies.
What many people do not understand is that our present pressurized water reactors are not the only possible type of reactor. There are many possible types of reactors. The liquid fluoride thorium reactor (LFTR) looks especially promising but unfortunately, after a few years of successful testing with a prototype, R & D funds were cut off. Another promising approach was the integral fast reactor (IFR). Both approaches would generate only about 1% as much waste as our present horribly inefficient reactors and could even use our present nuclear waste for fuel.
Before taking a long motorcycle trip a few years ago, I strongly favored renewables, but then I saw many wind farms with stationary turbines. I began to wonder whether the intermittent nature of renewables had been considered. After spending countless hours searching, I could find no evidence indicating that renewables could provide the power required by most large prosperous countries.
We need power at all times, not just when the wind is blowing or the sun is shining.
The chart does not show an important factoid: nearly all of the non-carbon resource has been hydro and nuclear. Hydro has remained steady at around 6%; it was the addition of nuclear power that accounts for nearly all of the increase since 1965. Only in the last couple of years have renewables begun to make a dent in the numbers.
By Paul Lorenzini on 2013 07 11
Noted the same as Paul Lorenzini. Here’s the breakdown by energy source for those interested:
http://jmkorhonen.net/2013/07/12/the-stagnation-of-clean-energy-with-more-detail/
By J. M. Korhonen on 2013 07 12
I’m not sure what might make one think, as Craig puts it, that “AGW is not as fashionable as it once was”. The science is getting better, not worse, and public opinion in North America and the rest of the world is trending toward acknowledging human-influenced climate change. The IPCC and other developed-world scientific communities are coalescing around a consensus, not splintering. I can’t say for certain why NASA’s climate change budget was cut, probably something to do with large government deficits, or whether and where it was made up for in another area. Government budgets are complicated, messy things. What I can say with certainty is that it’s a faulty conclusion to equate a $3.8 bil cut with the idea that AGW is now “less fashionable”, whatever that does indeed mean. As for clean energy stagnation, the point that many have made about the fall of nuclear energy is on the nose. JM Korhonen provides a great link. Another question to be asked is simply the arithmetic of it all. Renewables are rising relative to where they were before; fossils are simply rising more. Greater consumption and more extraction from the developing world is likely to be the culprit.
By Eric on 2013 07 12
There are reasons for the increase in CO2 emissions.
Germany is phasing out nuclear power and building more coal-fired power plants to replace it. In addition, they are importing more power from France where almost 80% of it is generated with nuclear reactors, but of course that does not increase CO2 emissions. Their renewable energy stations are insufficient to do the job.
Japan is also increasing coal imports to replace nuclear power.
China has increased coal burning to generate more power. It is also, with the cooperation of the U.S. and other countries, doing R & D work on different nuclear technologies, especially on the liquid fluoride thorium reactor (LFTR). Because we halted R & D on the LFTR decades ago, we may end up buying the technology from China.
Other countries are burning more fossil fuels as their demand for power increases. This will continue until or unless we greatly expand generation using nuclear reactors, very preferably with better nuclear technology than we are now using.
Heh. After a certain point, usually reached fairly early on, the futility of shooting yourself in the lower limbs becomes evident.
By Brian H on 2013 07 14
The consensus isn’t what you think it is. Pew polls
Top Policy Priorities for 2012
% considering each as a “top priority” for the president and Congress this year
                          Five yrs ago   One yr ago    Today    5-yr
                            (Jan) %       (Jan) %     (Jan) %    chg
Economy                        68            87           86     +18
Jobs                           57            84           82     +25
Terrorism                      80            73           69     -11
Budget deficit                 53            64           69     +16
Social Security                64            66           68
Education                      69            66           65
Medicare                       63            61           61
Tax fairness                   --            --           61
Health care costs              68            61           60      -8
Energy                         57            50           52
Help poor and needy            55            52           52
Crime                          62            44           48     -14
Moral breakdown                47            43           44
Environment                    57            40           43     -14
Lobbyist influence             35            37           40
Illegal immigration            55            46           39     -16
Strengthening military         46            43           39      -7
Global trade                   34            34           38
Transportation                 --            33           30
Lower military spending        --            --           29
Campaign finance               24*           --           28
Global warming                 38            26           25     -13

PEW RESEARCH CENTER Jan. 11-16, 2012. * Campaign finance reform trend from Jan. 2004.
In 2013, climate change was still last, falling to an 8% rating as a “top priority”.
The Wisdom of Crowds is finally starting to bite.
Nov. 2009 Scientific American article by Mark Z. Jacobson & Mark A. Delucchi, “A Plan to Power 100 Percent of the Planet with Renewables”: http://www.scientificamerican.com/article.cfm?id=a-path-to-sustainable-energy-by-2030
Mark Jacobson’s plan includes no real economic or technical analysis for providing reliable, load-following electricity from such a heavy penetration of variable generation. It simply means nothing.
By Joe on 2013 09 20
Quite so.
At one time, I was somewhat opposed to nuclear power and strongly supported renewables. What changed my mind was a 5,500 mile motorcycle trip I took from here in Albuquerque to Savannah, Georgia. I saw many wind farms and noticed that in many, the blades were stationary. I began to wonder whether the intermittent nature of renewable power sources had been adequately considered. Previously I had incorrectly assumed that there would not be a push for renewables unless they had been demonstrated to be practical; I should have known better than to make such an assumption.
Upon returning, I spent countless hours searching for thorough and objective quantitative studies that indicated that renewables could adequately meet the power requirements of most large countries. I was not able to find even one such study!! It appears that we are expected, on faith alone, to spend countless billions of dollars on renewable systems.
A believable study might require putting wind and solar sensors in many places where installing wind farms and solar power systems would be practical, and transmitting the data to a central location for careful analysis. It would be necessary to demonstrate that for a period of several years, there would not be even a momentary lapse in the ability to supply the necessary power. There has been no such study.
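For what it's worth, the pass/fail test described above is easy to state precisely. The sketch below (in Python) is purely illustrative: the series, names, and numbers are hypothetical placeholders, not output from any real sensor network, and a genuine study would also have to model storage, transmission, and forecast error.

    # Illustrative "no lapse" check: does wind + solar ever fall short of demand?
    # All series are hypothetical hourly megawatt values, not real measurements.
    def coverage_shortfalls(wind_mw, solar_mw, demand_mw):
        """Return (hour, unmet MW) for every hour in which supply misses demand."""
        shortfalls = []
        for hour, (w, s, d) in enumerate(zip(wind_mw, solar_mw, demand_mw)):
            if w + s < d:
                shortfalls.append((hour, d - (w + s)))
        return shortfalls

    wind = [900, 400, 50]         # MW in three sample hours
    solar = [600, 100, 0]         # a calm, dark hour shows up immediately
    demand = [1200, 1200, 1200]
    print(coverage_shortfalls(wind, solar, demand))  # [(1, 700), (2, 1150)]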
What climate policies? What clean energy policy? Not one was named. Until you do name the policies that were such an abysmal failure, I’m inclined to think climate change and energy policies are just what we need to make real impacts. Pretty obvious you’re pro-fossil fuel industry.
By Anne Chastain on 2013 07 14
The Earth Summit in 1992 (https://en.wikipedia.org/wiki/United_Nations_Framework_Convention_on_Climate_Change)
which was followed by the Kyoto Protocol in 1997 (https://en.wikipedia.org/wiki/Kyoto_Protocol)
The goals of Kyoto on GHG emissions have been entirely missed overall. “World Bank (2010)[139] commented on how the Kyoto Protocol had only had a slight effect on curbing global emissions growth. The treaty was negotiated in 1997, but in 2006, energy-related carbon dioxide emissions had grown by 24%.”
See several graphs & charts in the article. Some countries have reduced their emissions by providing heavy subsidies to solar or wind installations. The most notable of those countries - Germany - seems to be in the process of defunding the subsidies. Time will tell what the outcome of that will be, but it is predictable that investment in new installations will decline there and across Europe.
By BJH on 2013 08 08
In response to public attitudes, Germany is reducing nuclear power generation. As a result, it is building more coal generating plants thereby increasing its reliance on coal. Making matters even worse, the kind of coal they are burning is lignite which is the dirtiest form of coal.
In addition, Germany is increasing its importation of electricity from France which produces most of its electricity from nuclear reactors.
Considering Germany’s unquestionable dedication to wind and solar power, and the fact that it is increasing its CO2 emissions, it is reasonable to question the practicality of renewable energy sources as a major source of power for most large countries.
Correct me if I’m wrong, but this chart doesn’t seem to jibe at all with the ones from IEA: http://blogs.scientificamerican.com/plugged-in/2013/07/10/the-global-outlook-for-renewable-power-in-one-graph/ Either one of them is wrong, or the growth in renewables comes entirely from biomass or other not-carbon-free renewable sources. Not that there is such a thing as an entirely “carbon-free” method of energy production if you look at the entire system.
By Sami on 2013 07 16
The charts seemingly don’t “jibe” as one relates to total energy use and the other relates just to electricity production.
By Duane Pendergast on 2013 12 29
No energy density: http://m.nationalreview.com/nro-energy/364885/wind-turbines-are-climate-change-scarecrows-robert-bryce
Moreover, windmills don’t work most of the time, as noted above: http://www.ieso.ca/imoweb/marketdata/windpower.asp
Compare installed capacity to power generated in real time. Wind and solar are ancient technologies which were abandoned by previous generations for a reason.
By Paul Bell on 2013 12 29
Roger, the rate of “decarbonization” that occurred in the 70s and 80s did so for reasons completely separate from carbon: little things like dollars and cents. “Decarbonization” has no value in the real world, so rational folks responding to market forces continue to basically not give a darn about how much carbon a kilowatt-hour of energy liberates to the atmosphere.
You started off with an analogy that didn’t hold true, and the article went downhill from there: “stabilizing the amount of water in a bathtub by modulating the amount filling the tub from a spigot. If there is an open drain at the bottom letting a bit of water out, then stabilization of the water’s height occurs when the amount coming into the tub equals the amount draining out.”
Each and every time I’ve filled up MY tub, it’s started empty.
By Frank From Texas on 2013 12 29
Watch Mayor Bloomberg from this week’s New York Tech Meetup
In case you haven’t heard of the New York Tech Meetup, it’s unlike any other tech meetup in the world. It’s in a town so terrifically proud of its dynamic and fast-growing tech scene that there’s literally a secondary market to snag tickets.
Every month, 9 or 10 companies get 3 to 5 minutes each to demo something cool to New York’s tech community of geeks, investors, entrepreneurs and hackers. This week was an incredibly big week for the NYTM community, with Mayor Michael Bloomberg making his first ever appearance at the Tuesday night event. After speaking about his own background as an entrepreneur, he focused on New York’s growing role in the country’s technology industry:
“I am not here to pitch you the Bloomberg terminal, although if you have $20,000 a month lying around, I will take it. I want to pitch you New York City, which is near and dear to my heart. Tech will define the 21st century and this crowd has the power to make New York our nation’s tech capital.”
Jessica Lawrence of New York Tech Meetup was kind enough to post his speech and other announcements, including the upcoming Raise Cache event in this video. (Mayor’s remarks start about 5 minutes in.)
Build Partnerships and Prevent Extinctions - April 29, 2015: Biodiversity loss on islands is occurring at an alarming rate, and invasive alien species on island ecosystems are the leading cause. In an effort to address this challenge, the Director of the U.S. Fish and Wildlife Service and the Chief Executive Officer of Island Conservation formally adopted a Memorandum of Understanding (MoU) during the most recent annual Trilateral Committee meeting.
Native American Tribes Pledge to Save the Monarch - Stone Hearth News (May 13, 2016): Leaders of seven Native American tribes in Oklahoma announced on Tuesday their commitment to planting specific vegetation for monarch butterflies, whose population has declined in recent years.
The Sound of Endangered Salmon Surviving - Science Daily (Feb. 1, 2016): With California in the fourth year of a historic drought, there is much controversy over how to supply cities, farms, and ecosystems with the water they need. Technology may help solve the puzzle.
Celebrating 100 Years of Migratory Bird Conservation - May 18, 2016: Today, the Trilateral Committee celebrates great milestones of collaborative conservation for birds among the three nations with the signing of the Letter of Intent Related to the Conservation of Migratory Birds and Their Habitats in the United Mexican States, the United States of America, and Canada. 2016 marks the 100th anniversary of the Convention between the United States and Great Britain (signed on behalf of Canada) for the protection of migratory birds.
The Canada/Mexico/U.S. Trilateral Committee
Canada, Mexico, and the U.S. share a wide array of ecosystems, habitats and species. They are also linked by strong economic, social and cultural ties. However, existing strategies for natural resources conservation have not adequately addressed increased development in the region, nor emerging problems such as climate change, toxic substances, rapidly-spreading wildlife diseases and invasive species. To more effectively address priorities of continental significance and boost the concerted efforts of the three countries of the North America bioregion, the Canada/Mexico/U.S. Trilateral Committee of Wildlife and Ecosystem Conservation and Management was established in 1995. The Trilateral Committee is headed by the directors of the Canadian Wildlife Service (CWS), the U.S. Fish and Wildlife Service (USFWS), and the Ministry of Environment and Natural Resources of Mexico (SEMARNAT).
New Amazon Web Services data center in Sydney added to Apica Agent Network
By Sven Hammar
Earlier this week, Amazon added a brand new Amazon Web Services (AWS) Region to its Global Infrastructure. The Asia Pacific (Sydney) Region, as it is being called, offers important benefits to organizations operating in this area. Companies can run their cloud-based applications through the new AWS data center to reduce latency and other performance issues for local users without having to own and operate their own costly infrastructure.
The new AWS Region is great news for Apica customers as well, who can now use the Amazon Cloud to test and monitor their websites and applications to see how they perform for users in Australia, New Zealand, and the broader Asia-Pacific (APAC) region. In fact, as an Amazon Web Services partner, we’re proud to offer performance management services, tools, and support from all nine AWS Regions (plus more than 100 other locations) across the globe to ensure an optimal user experience regardless of location. Soon, customers also will be able to load test from AWS Sydney to determine the capacity and scalability of their web apps.
If you’d like to add the AWS Asia Pacific (Sydney) Region Agent to your web performance monitoring, you can do so immediately within your Apica ProxySniffer/LoadTest portal.
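As an aside, a first-order version of the regional check described here takes only a few lines of code run from a host in the new region. The sketch below is illustrative only, not Apica's tooling or API: the URL is a placeholder, and the timing lumps DNS, TLS, and transfer together.

    # Crude latency probe, run from a machine in AWS ap-southeast-2 (Sydney).
    # Illustrative only; real monitoring does far more than this.
    import time
    import requests

    def probe(url, samples=5):
        timings_ms = []
        for _ in range(samples):
            start = time.monotonic()
            response = requests.get(url, timeout=10)
            response.raise_for_status()          # fail loudly on HTTP errors
            timings_ms.append((time.monotonic() - start) * 1000.0)
        return min(timings_ms), sum(timings_ms) / len(timings_ms)

    best, average = probe("https://www.example.com/")  # placeholder URL
    print(f"best {best:.0f} ms, average {average:.0f} ms over 5 requests")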
Stairway to Hologram: Led Zeppelin’s John Bonham to appear in son’s show?
posted by Craig Rosen | Yahoo! Stop The Presses!
(Yahoo!) - For Jason Bonham, his Led Zeppelin Experience tribute to his father's band isn't enough. He wants to take things a step further and rock out with a hologram version of his late father, Led Zeppelin drummer John Bonham.
In a recent chat with Legendary Rock Interviews, Jason revealed that he's planning to take his Zeppelin tribute up a notch. "It’s been fun and it’s important to me to have something in addition to the music, that was the whole point of it," he said. "When I first started doing this I was working with some of the people behind some of the biggest tribute tours like Pink Floyd Experience, the Beatles, and now I’m doing it. It’s imperative that I continue putting together the best shows and take it to the next level. I’m talking to people about holograms and my dream is to do the hologram drum solo with Dad next to me."
However, it remains to be seen if it’s technically possible to pull it off, since footage of John Bonham with Led Zeppelin, shot in the ’70s, might not be high-quality enough to convert into a hologram.
Outback Scramjet
A University of Queensland lab has supersonic success.
Luba Vangelova
A small group of Australian scientists made aviation history July 30 with the successful atmospheric test of a supersonic air-breathing engine in flight. Working with a budget most big science programs would consider petty cash, the team had researchers around the world rooting for them. Their road to success can only be called unique.
Swooping across the south Australian outback in a rented Cessna 180 last November, Allan Paull learned the hard way that an aeronautics career doesn’t teach you how to keep your lunch down while airborne. Make that barely airborne: At times Paull could have leaned out and high-fived pedestrians, had there been any in this vast wasteland. But the nerve-racking maneuvers allowed him to better scan the desert for his missing scramjet. And he had a keen incentive to find it, for the remains of his papal-mitre-shaped contraption could hold information that would help him fine-tune his second scramjet, which sat in pieces some 1,500 miles away in his lab at the University of Queensland. The low-level aerial search failed, so he and his co-workers next took turns strapping themselves to the roof rack of a rented Toyota Land Cruiser. Balancing like skiers, they scanned the ground on either side, as the driver jostled along the route the scramjet should have overflown. They saw plenty of shrubs but no scramjet. Time to recruit reinforcements for a third outing in late February. Someone had a brainstorm: Why not enlist University of Queensland zoologists who had performed aerial surveys for kangaroos in these parts? The zoologists were accustomed to scrutinizing the monotonous landscape from airplanes without reaching for sick bags.
When working on a tight budget, it helps to be creative. It’s not for nothing that Paull’s scramjet has been termed a “scrounge jet.” With a budget of less than $2 million (“pin money,” Paull says, compared to the $185 million NASA has for its hypersonic program), Paull’s four-person team managed to be at the forefront of research that may help pull off an aviation dream: inexpensive vehicles that can fly at speeds measured in miles per second rather than miles per hour.
Many see the scramjet (short for “supersonic combustion ramjet”) as the key. Deceptively simple in principle, a ramjet is essentially a duct that funnels onrushing air into a combustion chamber, where it mixes with fuel. Its distinguishing feature is the way in which it raises the pressure of the incoming air in order to make the fuel self-ignite. Rather than use turbine-powered fans to compress the air, the ramjet forces the air to slow down and essentially compress itself as it passes through the engine’s narrowing intake duct. The end result is the same: As they escape through the rear nozzle, the burning gases produce forward thrust.
The ramjet’s simplicity offers practical advantages. It has no moving parts and therefore fewer chances for failure. It’s not limited by turbine blades’ inability to withstand engine temperatures associated with flying above Mach 3. In fact, the ramjet can’t fly below Mach 3 (it therefore requires a conventional engine to reach that speed). But it too has its limitations: Slowing down the air to subsonic speeds generates extremely high temperatures. A ramjet can therefore operate up to only about Mach 6; to operate beyond, the engine requires so much structure that it becomes impractically heavy.
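A back-of-envelope calculation shows why (this is the standard compressible-flow relation; the numbers are illustrative, not figures from Paull's team). Fully slowing a stream of air moving at Mach M drives its temperature toward the stagnation value

    T_0 = T\left(1 + \frac{\gamma - 1}{2} M^2\right)

With gamma about 1.4 for air and a stratospheric static temperature of roughly 217 kelvin, Mach 6 gives a stagnation temperature of about 217 x (1 + 0.2 x 36), or roughly 1,780 kelvin, before a single drop of fuel is burned.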
In a scramjet, this problem is circumvented by slowing the air less dramatically, so that it passes through the combustion chamber at supersonic speed. A scramjet can therefore match rocket velocities, but unlike a rocket, it uses the air’s oxygen and so doesn’t have to carry tanks of oxidizer. The result, in theory, is a lighter (and therefore cheaper) craft capable of flying about three times faster than the long-standing speed record for rocket-powered aircraft, set by NASA’s X-15 in 1967: Mach 6.7. More tantalizing still, a scramjet’s upper speed limit is unknown.
On a typically hot and humid Brisbane summer day Paull, clad in short-sleeve shirt and shorts, receives me in his un-air-conditioned office at the University of Queensland. A cartoon-emblazoned punching bag sits wedged in the gap between the credenza and the window.
With wiry, sandy hair and blue eyes, the six-foot-three Paull could pass as Christopher Lloyd’s chilled-out, antipodean cousin. Leaning back in his chair, the 42-year-old reflects on his groundbreaking work. Speaking in a broad Aussie accent, he liberally punctuates the tale with rightos and underscores his drier observations with a slightly mischievous smile.
In 1985, after earning a graduate degree in applied mathematics, Paull netted himself a job crunching numbers for Ray Stalker, a University of Queensland space engineer of global renown who had designed and built one of the world’s most sophisticated wind tunnels at the university and used it for pioneering scramjet research. When Stalker suffered a stroke in the 1990s, Paull found himself in charge of the program. Progress remained hampered by a central limitation: Even the university’s most cutting-edge shock wave tunnel allowed a test window of only two milliseconds.
Aryaka Launches Cloud-Based Application Acceleration and WAN Optimization Platform; Raises $14 Million in Funding
Analysts Call First-of-its-Kind Service a Market Disruptor; Aryaka Customers Realize 150 Times Performance Improvement and 500 Percent Return on Investment
MILPITAS, Calif. – Sept. 21, 2010 – Aryaka Networks (www.aryaka.com), provider of the world’s first cloud-based application acceleration and WAN optimization solution, today announced it has officially launched the Aryaka™ platform. Aryaka’s solution enables enterprises to deploy WAN optimization in a matter of minutes and with no capital expenditures. Unlike traditional appliance-based or managed-service models that often take months to test and implement, Aryaka democratizes the world of application acceleration and WAN optimization by offering exemplary performance at a significantly lower cost. Aryaka delivers a guaranteed 99.999% network uptime as enterprises scale to accommodate new applications, users, and locations across the globe.
In addition, the company announced that it has raised $14 million, including Series A funding from Trinity Ventures, Mohr Davidow Ventures, Nexus Venture Partners and Stanford University.
“Aryaka’s approach is unique and has a chance to be a truly disruptive force in the market today,” said Bojan Simic, principal analyst with TRAC Research. “Aryaka is taking what has traditionally been a hardware play and turning it into a service, and, on top of that, they’re providing the connectivity – all in a single solution. This approach gives organizations more flexibility in today’s virtual and hybrid network environments, and it is easier to deploy and manage than the box-based model we’re seeing at the core of the other major players’ strategies. There has been a need and a market for this solution for many years, and Aryaka has addressed that need in the most comprehensive way.”
Aryaka’s lead lies in the fact that its network operates with the advantages of a private cloud combined with the economics of a public cloud, thereby offering enterprises the unprecedented ability to overcome the inherent challenges of appliance-based solutions. The forward-thinking enterprise looking to leverage the cloud now has access to the key enabler – the network – in addition to computing and storage resources. Aryaka gives enterprises, for the first time, a high-performance, highly available, scalable, application-aware network whenever and wherever they do business.
“The application acceleration and WAN optimization markets are growing quickly, thanks to increasing globalization and “applications everywhere” driven by mobility. There is an increasing demand for cloud-based solutions; services are faster and easier to deploy than equipment, and allow the avoidance of capital expenditure, important in today’s economy,” said Lydia Leong, research vice president for Gartner’s Technology and Services Provider Group.
For an organization spending $6,000 per month on dedicated connectivity for six locations across the globe, Aryaka would deliver cumulative benefits, over a period of three years, in excess of $1 million. This figure is calculated bearing in mind the network savings and productivity gains. This translates to a 567% ROI over three years.
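For readers who want to check the arithmetic, a toy model reproduces a figure in this range. Everything below is an assumption made for illustration, not Aryaka's pricing or methodology; the monthly service cost in particular is a made-up placeholder chosen to show how a number like 567% can arise.

    # Toy ROI model for the scenario in the release; all inputs are assumptions.
    months = 36
    total_benefit = 1_000_000            # ">$1 million" over three years
    monthly_service_cost = 4_200         # hypothetical placeholder, not real pricing

    total_cost = monthly_service_cost * months
    roi_percent = (total_benefit - total_cost) / total_cost * 100
    print(f"ROI over {months} months: {roi_percent:.0f}%")   # ~561% with these inputs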
“We’ve evaluated many approaches to resolving application performance issues and found that the unpredictability of future costs, combined with the significant IT management burden, has made the hardware solution untenable,” said Sarvesh Mahesh, CEO of Tavant Technologies. “In contrast, Aryaka’s solution was quick and easy to deploy, and today is providing the performance we need across our multiple locations in the U.S. and India. Plus, the solution can scale as we expand.”
With the total addressable market for WAN Optimization rapidly growing and projected to reach $4.27 billion globally by 2014, this industry-first, cloud-based solution addresses the underserved needs of mid-sized enterprises and broadens the availability, affordability and adoption of this technology. The explosive demand for WAN optimization stems from the market demand for solutions that enable strategic IT initiatives, including real-time collaboration for higher-quality, unified communications; data center consolidation; server and storage centralization; and disaster recovery. Aryaka is the only true innovation in the WAN optimization market in the last five years that addresses the business requirements for high capacity and real-time optimization of all applications.
Key features of the Aryaka solution include the following:
Revolutionary technology built for the cloud
Intelligent network with global points of presence
Cloud-based application acceleration and WAN optimization resulting in 150 times performance improvement
Ease of use, complemented by 24x7x365 support
Significant cost savings via a pay-as-you-go pricing model with no capex or maintenance fees
Reduced complexity – a single solution for connectivity and optimization
Immediate scalability – easily add or remove geographic locations as needed
Guaranteed performance – 99.999% network uptime
Enterprise-grade security, visibility and control
Aryaka delivers performance guarantees, which removes a significant burden from already overworked IT departments that have been tasked to do more with less. Organizations retain full and comprehensive views of their network and management capabilities through a web-based “MyAryaka” portal with executive dashboards. The solution offers comprehensive real-time data and alerts on the mix of applications, geographic locations, and performance and availability criteria, integrated into a single management console accessible from anywhere, anytime.
“Establishing connectivity and bandwidth among locations across the globe was a big challenge,” said Ajit Gupta, founder and CEO of Aryaka Networks. “We saw that existing WAN optimization solutions did not adequately address the needs of most enterprises. Organizations were deluged with appliances and burdened by their inherent cost and complexity. We are offering a fundamental architectural and business-model shift in how application acceleration and WAN optimization should be delivered. We bring headquarters closer to their branches. Our goal is to enhance enterprise collaboration and delight our customers, one branch at a time. We want the world to think beyond the box.”
The Aryaka solution is priced based on the link size connecting to the network and the associated geography. The pay-as-you-go model requires no capex and includes 24x7x365 support and management.
About Aryaka Networks
Founded in November 2008, Aryaka Networks, Inc. delivers cloud-based application acceleration and WAN optimization with true business results. Headquartered in Milpitas, California, with offices in Bangalore, India, Aryaka is funded by Trinity Ventures, Mohr Davidow Ventures, Nexus Venture Partners and Stanford University. Key advantages of the Aryaka solution include a secure, scalable and reliable application acceleration and WAN optimization platform as a service, rapid deployment, ease of integration, complete visibility and a significantly lower total cost of ownership. For more information, visit www.aryaka.com.
Avesthagen to earn $30 mn in 10 years from AvestaDHA
27 Aug 2013, BioSpectrum Bureau, BioSpectrum
New Delhi: Avesthagen entered into a license agreement with a leading specialty chemicals company for its 100 percent vegetarian AvestaDHA polyunsaturated omega-3 fatty acid for animal feed applications. Avesthagen will earn up to $30 million over the next decade from the non-exclusive agreement.
Avesthagen's innovative patented and trademarked technology allows commercial production of superior quality, 100 percent vegetarian AvestaDHA derived from microalgae found in the Indian Ocean.
DHA is naturally found in different types of fish oil and marine plankton, but it is anticipated that the supply from this source will be insufficient to meet growing demand. Industry analysts expect the algal DHA/EPA market alone to exceed $525 million in 2013 while growing at an annual rate of 24 percent. The total global market for products containing DHA exceeds $26 billion, according to Frost and Sullivan.
Avesthagen developed AvestaDHA with a grant from CSIR under the New Millennium Technology Leadership Initiative (NMITLI) scheme for production of cost-effective and renewable sources of DHA and other long-chain polyunsaturated fatty acids. The project also involved the National Institute of Oceanography (NIO), Goa; the Indian Institute of Integrative Medicine (IIIM), Jammu; and the Indian Institute of Chemical Technology (IICT), Hyderabad.
Dr Villoo Morawala Patell, founder and chairperson, Avesthagen, said, "This is one step towards our continuing commitment to discovery and commercialization of science-based products that promote health and well-being throughout life. AvestaDHA will serve as an acceptable and safe source of DHA for the much-neglected field of animal health."
Adds Dr Patell, "Avesthagen is open for global licensing, manufacturing and marketing alliances for human and medical nutrition, animal feed and many other applications. AvestaDHA is targeted at improving the nutritional health of the bottom of the pyramid, and Avesthagen invites like-minded government, private organizations and civil society to join in the mission."
Black Swift (Cypseloides niger)
Black-Capped Chickadee (Poecile atricapilla)
Ashy Storm-Petrel (Oceanodroma homochroa)
Bee Hummingbird (Mellisuga helenae)
Bald Eagle (Haliaeetus leucocephalus)
Atlantic Puffin (Fratercula arctica)
Anhinga (Anhinga anhinga)
Arctic Loon (Gavia arctica)
Andean Condors (Vultur gryphus)
American White Pelican (Pelecanus erythrorhynchos)
American Oystercatcher (Haematopus palliates)
American Kestrel (Falco sparverious)
American Goldfinch (Carduelis tristis)
American Coot (Fulica americana)
American Bittern (Botaurus lentiginosus)
African Fish Eagle (Haliaeetus vocifer)
American Avocets (Recurvirostra americana)
Acorn Woodpecker (Melanerpes formicivorus)
Evening Grosbeak (Coccothraustes vespertinus)
Emperor Goose (Chen canagica)
Elegant Trogon (Trogon elegans)
Eastern Bluebird (Sialia sialis)
Dark-Eyed Junco (Junco hyemalis)
Crested Caracara (Polyborus plancus)
Corn Crake (Crex crex)
Common Redpoll (Carduelis flammea)
Common Moorhen (Gallinula chloropus)
Common Loon (Gavia immer)
Collared Plover (Charadrius collaris)
Common Cuckoo (Cuculus canorus)
Cinnamon Teal (Anas cyanoptera)
Cattle Egret (Bubulcus ibis)
Cape Verde Shearwater (Calonectris edwardsii)
Cape May Warbler (Dendroica tigrina)
Common Barn Owl (Tyto alba)
Horned Grebe (Podiceps auritus)
Herald Petrel (Pterodroma arminjoniana)
Gyrfalcon (Falco rusticolus)
Green Parakeet (Aratinga holochlora)
Green Kingfisher (Chloroceryle americana)
Green Heron (Butorides virescens)
Greater Roadrunner (Geococcyx californianus)
Greater Flamingo (Phoenicopterus rubber)
Gray Hawk (Asturina nitida)
Great Blue Heron (Ardea herodias)
Golden Eagle (Aquila chrysaetos)
Gadwall (Anas strepera)
Florida Scrub-jay (Aphelocoma coerulescens)
Osprey (Pandion haliaetus)
Ostrich (Struthio camelus)
Northern Mockingbird (Mimus polyglottos)
Northern Jacana (Jacana spinosa)
Northern Harrier (Circus cyaneus)
Northern Gannet (Morus bassanus)
Northern Cardinal (Cardinalis cardinalis)
Neotropic Cormorant (Phalacrocorax brasilianus)
Muscovy Duck (Cairina moschata)
Mountain Quail (Oreortyx pictus)
Montezuma Quail (Cyrtonyx montezumae)
Mississippi Kite (Ictinia mississippiensis)
Merlin (Falco colombarius)
Marsh Sandpiper (Tringa stagnatilis)
Marabou Stork (Leptoptilos crumeniferus)
Manx Shearwater (Puffinus puffinus)
Mallard (Anas platyrhynchos)
Magnolia Warbler (Dendroica magnolia)
Little Stint (Calidris minuta)
Laysan Albatross (Diomedea immutabilis)
Kirtland’s Warbler (Dendroica kirtlandii)
King Eider (Somateria spectabilis)
Killdeer (Charadrius vociferus)
Turkey Vulture (Cathartes aura)
Snowy Owl (Bubo scandiacus)
Spruce Grouse (Falcipennis canadensis)
Steller’s Sea Eagle (Haliaeetus pelagicus)
Tufted Titmouse (Baeolophus bicolor)
Ross’s Goose (Chen rossii)
Ruby-Throated Hummingbirds (Archilochus colubris)
Ruddy Duck (Oxyura jamaicensis)
Scarlet Ibis (Eudocimus ruber)
Scarlet Tanager (Piranga olivacea)
Skylark (Alauda arvensis)
Roseate Spoonbill (Ajaia ajaja)
Rose-breasted Grosbeak (Pheucticus ludovicianus)
Rock Ptarmigan (Lagopus mutus)
Rock Pigeon (Columba livia)
Ring-necked Pheasant (Phasianus colchicus)
Purple Martin (Progne subis)
Purple Gallinule (Porphyrula martinica)
Piping Plover (Charadrius melodus)
Peregrine Falcon (Falco peregrinus)
Western Grebe (Aechmophorus occidentalis)
Wandering Albatross (Diomedea exulans)
Wood Thrushes (Hylocichla mustelina)
Wood Stork (Mycteria americana)
Wood Duck (Aix sponsa)
Whooping Crane (Grus americana)
Wild Turkey (Meleagris gallopavo)
Fork-tailed Drongos: Marvelous Mimics
April 25, 2015, by Editor, filed under Features
A recent study by evolutionary biologist Tom Flower of the University of Cape Town in South Africa has revealed that the African fork-tailed drongo mimics the alarm calls of other species as part of its food-gathering strategy. Wildlife observers in Africa have noted that the drongo is an accomplished thief, but it was thought that it was using its own alarm call to falsely alert other birds and meerkats that a predator was nearby, thereby causing them to drop their meal, which the drongo would swoop in and claim. It is estimated that the drongo steals more than twenty percent of its daily food. But the lengthy study carried out by Flower in the Kuruman River Reserve, located in the Kalahari Desert, yielded some astounding insight into the drongo’s ability to perfectly mimic a variety of bird and mammal species for its own advantage.
In the wild, birds and mammals often pay attention to other species in their environment when it comes to sounding the alarm. An extra pair of eyes and ears can be handy when it comes to safety. But as researchers have discovered, the drongo can’t be trusted. Perched high up in a tree, a drongo watches with keen interest as meerkats forage, and when one of them catches something, an insect or lizard, the drongo sounds its own alarm call, anticipating that the meerkat will drop its prey and head for cover. However, the foraging meerkats are likely to ignore the drongo after it has used its own alarm call a few times. Undaunted, the drongo will switch to the alarm call of another bird species, often with successful results. During the study, Flower and his colleagues tracked and recorded the calls of 42 drongos as they attempted to steal food from the same target. It was noted that of the 151 recorded incidents, the drongos switched to a different alarm call a total of 74 times. After giving its own alarm call without success, a drongo may give the alarm call of its target, which generally proved successful. Flower notes that he doubts the birds have ‘theory of mind’ – the ability to understand that another being has different beliefs and intentions – which is currently only attributable to humans. It’s more likely that they are responding to feedback, or have an ability to grasp cause and effect, and use this to their advantage. Nonetheless, this is another example of the keen intelligence of the feathered creatures that share our planet.
Some Fascinating Facts About Pelicans
February 18, 2014, by Editor, filed under Features
Based on the oldest recorded pelican fossil, found at Luberon in southeastern France and dating to the Early Oligocene, it has been deduced that pelicans have existed virtually unchanged for at least thirty million years. Fossils of several birds from the Pelecanus species have been identified elsewhere in the world – South Australia; Siwalik Hills, India; Bavaria, Germany; Idaho, United States; Odessa, Ukraine; and North Carolina, United States – backing up this claim. Today there are eight living pelican species distributed around the world, some of which are considered ‘vulnerable’ or ‘threatened’ by the IUCN, and all of which use their amazingly elastic pouches to catch fish.
With the exception of the brown pelican, which dives for fish and snatches it up in its bill, pelicans usually form cooperative groups for their fishing expeditions. They either swim along in a line or U-shape formation, beating their wings on the surface of the water to drive the fish into a group in the shallows where the pelicans scoop them up in their pouches. Contrary to popular belief, pelicans do not store fish in their pouches, but swallow them almost immediately upon catching them. Baby pelicans feed by retrieving fish from the throats of their parents. Pelicans are very social birds, traveling in flocks and breeding in colonies, either along the coastline or inland alongside rivers and lakes. The brown pelican (Pelecanus occidentalis) was at one time considered to be ‘vulnerable’ in North America – primarily due to poisoning by chemical pesticides such as the notorious DDT which devastated the populations of many seabirds – but recent reports indicate that significant recovery has taken place and the birds’ conservation status is now that of ‘least concern’. The Dalmation pelican (Pelecanus crispus), found in South-eastern Europe through to India and China, has the IUCN conservation status of ‘vulnerable’, while the Peruvian pelican (Pelecanus thagus) found on the Pacific Coast of South America, and the spot-billed pelican (Pelecanus philippensis) found in Southern Asia, are both considered to be ‘near threatened’. The other pelican species – pink-backed pelican (Pelecanus rufescens) found in Africa, Seychelles and southwestern Arabia; the American white pelican (Pelecanus erythrorhynchos) found in North America; the great white pelican (Pelecanus onocrotalus) found in the eastern Mediterranean, Malay Peninsula and South Africa; and the Australian pelican (Pelecanus conspicillatus) found in Australia, New Guinea, New Zealand, Bismarck Archipelago, Fiji and Walacea are all listed as being of ‘least concern’ from a conservation standpoint.
Montecasino Bird Gardens in South Africa
July 17, 2012, by Editor, filed under Features
Situated in the midst of the hustle and bustle of Johannesburg, South Africa, Montecasino Bird Gardens is home to more than sixty species of birds, along with a variety of small mammals, amphibians and reptiles from around the world. With pathways winding through lush gardens and a huge walk-through aviary, visitors can enjoy a tropical paradise and get back to nature without leaving the city.
One of the highlights of a visit to this award winning attraction is the Flight of Fantasy show which take place weekdays at 11h00 and 15h00, with an extra show at 13h00 on weekends and public holidays. Staged at the beautifully crafted Tuscan amphitheater, trainers guide talented and colorful birds through a forty minute performance that is both educational and entertaining, with (quite literally) the biggest star of the show being Oliver, the Southern White Pelican.
Features of Montecasino Bird Gardens include the largest collection of South African Cycads in the world, with 37 different species and over 750 plants, the oldest of which is estimated to be older than 2,500 years. The Lorikeet aviary offers visitors the opportunity to feed these colorful birds their favorite treat of nectar, while Macaws and Cockatoos roam freely in the park’s Parrot Gallery. In addition to a variety of frogs, the Frog Room features scorpions and spiders. Reptiles at Montecasino include a six-meter Reticulated Python as well as all of Southern Africa’s most venomous snakes, including the Black Mamba and Puff Adder. Resident mammals include Lemurs, Meerkats, Sloths and Blue Duikers.
Among the latest arrivals at Montecasino Bird Gardens are Laughing Kookaburras and Blue-Wing Kookaburras, Caribbean Flamingoes, Green-Naped Pheasant Pigeons and Keel-Billed Toucans. One of Montecasino’s ambassadors for conservation is Moholoholo the Cape Vulture, named for the rehabilitation center in Hoedspruit that nursed him back to health after he was poisoned by farmers attempting to eradicate predatory jackals. Moholoholo was the only survivor of his eighteen-member family. Through the dedication of the staff at Moholoholo Rehabilitation Center, the bird was taught to walk and fly again and now helps to educate the public on the necessity of conservation.
Visit the African Bird of Prey Sanctuary
January 31, 2012, by Editor, filed under Features
Established in 2006, the African Bird of Prey Sanctuary in South Africa cares for more than 180 birds representing 50 different raptor species. The sanctuary’s permanent residents have either been bred in captivity or have sustained injuries which significantly limit their chances of survival in the wild. Located close enough to both Durban and Pietermaritzburg to allow easy access for a day trip, the sanctuary offers unique insight into South Africa’s amazing predatory birds that is both educational and entertaining.
The sanctuary’s permanent residents include vultures, eagles, falcons, kestrels, goshawks, sparrowhawks, buzzards, hawks, kites and owls. Many of the birds have been named, with a record of their rescue story available to visitors. Eagles are rightly viewed as the mightiest of the birds of prey and the sanctuary’s Eagle Alley allows visitors a close up look at some of these majestic birds. Other sections of the sanctuary are Hoot Hollow for the owls; Honeycomb Habitats housing diurnal raptors; and the Vulture Hide with its eight indigenous vulture species, all of which are considered to be threatened.
In addition to being a popular tourism attraction, the African Bird of Prey Sanctuary is dedicated to ongoing research, including breeding and rehabilitation projects, with a view to conserving the birds in their natural South African environment. The Raptor Rescue operation run by the sanctuary is kept separate from the public area and is not open to visitors. If rescued birds are to be rehabilitated and released into the wild again, it is in their best interests not to be exposed to too many people. In addition to being stressful for them, too much interaction with humans could make the birds tame, thereby hampering their chances of survival in the wild. For research purposes birds are ringed before being released into a suitable habitat, if possible where they were found.
One of the most exciting features of the African Bird of Prey Sanctuary is the flying display, and visitors should be sure to plan their day to include one of these demonstrations, bearing in mind that they are weather dependent. Flying display times are Monday to Friday at 10:30am, and at 10:30am and 3pm on weekends and public holidays. As a privately funded conservation initiative, the African Bird of Prey Sanctuary relies on entrance fees to continue their work. So, why not support this worthy cause, and enjoy an outing you are not likely to forget.
Young Penguins Fitted with Monitors
July 19, 2011, by Editor, filed under News
The African Penguin, also referred to as the Jackass Penguin, might be a little awkward on land, but can definitely hold its own in the water as a very efficient hunter. Tourists who visit Cape Town, South Africa, and see the beauty of these birds do not realize that they are witnessing a rare sight: the population of these birds has decreased from approximately four million in the 1900s to just sixty thousand counted in the last census, done by the Southern African Foundation for the Conservation of Coastal Birds in 2010. This alarming decrease has led to the creation of a new project to protect these valuable birds.
Humans, as the story usually goes, had a great influence on the reduction in numbers of African Penguins: up until the 1960s, penguin eggs were harvested for human consumption. Another factor was the harvesting of guano, which was used as fertilizer but is crucial for adult penguins, as they use the hardened guano to make nest burrows. To add to the penguins’ problems, oil spills and overharvesting of anchovies and other fish species that form part of their diet have made their fight for survival even harder.
Scientists want to try and create artificial hatcheries to assist in the breeding of African Penguins for release, but to recreate the hatcheries efficiently, it is vital for them to have the correct information to understand the penguins better. In order to do this they have attached a transmitter, which is approximately the size of a matchbox, to baby penguins that are about ten weeks of age. The penguins are first placed in a pool so they can get used to swimming with the transmitter and then released into the ocean. One penguin has already been released, and a penguin named Richie is due for release. Scientists will be releasing approximately five penguins with transmitters.
Dr Richard Sherley, a key member of the scientific team from the University of Cape Town, commented that he hoped that the data collected would allow them to understand what influences breeding colonies in the choices they make and the early life of a penguin, as these questions have not been answered as yet. Lucy, which was the first penguin to be released, has already transmitted back data, which showed scientists that young penguins are able to swim approximately twenty-eight miles in one day. Sherley commented that because no-one really knows much about the early days and life of young penguins, it is crucial for them to collect this data to assist in their conservation projects. The transmitters will eventually fall off of the penguins, but it is hoped that by then enough information has been gathered to assist scientists in finding the ideal breeding site for a colony that can be protected and will be the site of the hatchery.
Gen Y Uses Mobile Tech to Hunt Down Meals
By Ned Smith, BusinessNewsDaily Senior Writer | November 12, 2012 11:41 am EST
Credit: Ordering meal from smartphone image via Shutterstock
When it comes to satisfying the munchies while on the move, Gen Y, the 18- to 34-year-olds of the millennial generation, hasn't evolved all that much beyond its caveman forebears, a new survey shows. They're still a tribe of hunters. The only difference between then and now is that millennials use their smartphones instead of spears to forage for their next meal.
The massive penetration of smartphones into the U.S. market has made the mobile internet a normal part of life for a large chunk of the population, according to eMarketer, a research company. And one of the key activities is finding their next meal.
While simply searching for a place to eat nearby is the most popular activity for mobile users, young users are far more likely than their elders to associate their mobile devices with food and develop deeper relationships with their food sources, the survey found. Nearly a third (32 percent) of millennials told the Technomic food researchers that they had checked menus on their phones.
Gen Xers, born between the '60s and '80s, were about half as likely to do so, and just 8 percent of baby boomers said the same. A similar generation gap was present for other digital activities associated with restaurants, including following them on social media and checking in via mobile apps.
The top activity for all age groups was looking up menus, probably because it is the most practical activity, eMarketer said. Following restaurants on social media or checking in via mobile apps, while a fun activity, just don't have the same payoff or measurably improve the dining experience.
One activity that does improve the dining experience— and one that's growing in popularity — is the ability to place an order ahead of time via mobile phone, an option most commonly offered by quick-service or fast-casual restaurants. Nearly a quarter of U.S. smartphone and tablet users reported having placed an order in this way in a September survey by Prosper Mobile Insights.
More than six in 10 respondents said ordering ahead of time improved their dining experience at least a little bit (including 29 percent who said their experience was “a lot better”).
Reach BusinessNewsDaily senior writer Ned Smith at nsmith@techmedianetwork.com. Follow him on Twitter @nedbsmith.
Appeared on: Wednesday, September 16, 2009
Sharp to Incorporate UV2A Technology into Production of LCD Panels
Sharp has developed a photo-alignment technology called UV2A Technology for LCD panels that can precisely control the alignment of liquid crystal molecules in a simple LCD panel structure.
Sharp today announced plans to fully incorporate this world-first technology as a core technology for the production of a new type of LCD panel that will significantly evolve LCD TVs to the next generation.
The ASV (Advanced Super View) technology thus far adopted by Sharp delivers high-resolution images to LCD TVs by precisely controlling the movement of liquid crystal molecules within a complex LCD panel structure. The next generation of TVs, however, must deliver very high performance in terms of image resolution and energy efficiency. That will require not merely an extension of past developments, but rather further innovation in LCD technology itself.
Thus Sharp has adopted a special material that responds to UV (ultraviolet) radiation and has developed UV2A Technology as a photo-alignment technology for the alignment film in LCD panels. This innovative technology, the result of combining proprietary materials developed by Sharp with UV exposure equipment and processing technologies, provides highly accurate control over the alignment of liquid crystal molecules in accordance with the direction of the radiation. It can be termed "pico-technology" in that it goes beyond nanotechnology to control the tilt angle of liquid crystal molecules, which are only around two nanometers in size, with an accuracy measured in picometers.
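To put those units in perspective (a back-of-envelope illustration, not a figure from Sharp): if a rod-like molecule of length L tilts through a small angle theta, the tip of the rod moves by approximately

    \delta \approx L \sin\theta \approx L\theta

so holding the tip of a 2 nm molecule to within about 1 pm corresponds to controlling theta to roughly 5 x 10^-4 radian, only a few hundredths of a degree.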
In addition to eliminating light leakage from the backlight, making it possible to display extremely deep blacks, this technology also enables higher aperture ratios in the LCD panel for the most efficient use of light from the backlight, thereby saving energy while displaying bright, vivid colors. Sharp?s UV2A Technology will also be ideal for enhancing the performance of high-definition 4Kx2K displays and 3D TVs, which are expected to form the next generation of TVs.
Sharp will be introducing this technology across the board for panels to be produced at the new LCD panel plant in Sakai and at Kameyama Plant No. 2.
Major Features of LCD Panels Using UV2A Photo-Alignment Technology
1. High contrast ratio of 5000:1 for the display of extremely deep blacks (contrast ratio 60% higher than that of conventional panels).
2. High optical efficiency for outstanding energy efficiency (aperture ratio 20% higher than that of conventional panels).
3. Fast response time ideal for next-generation 3D TVs (double the speed of conventional panels).
4. Improved production efficiency resulting from simplification of panel structure.
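A quick back-of-envelope check on the comparisons above (our arithmetic, not Sharp's): a 5000:1 contrast ratio that is 60% higher than before implies a conventional-panel baseline of about 5000 / 1.6, or roughly 3100:1, and an aperture ratio 20% higher passes roughly 1.2 times as much backlight for the same power, which is where the claimed energy savings come from.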
New steering tech for heavy equipment saves fuel, ups efficiency
Sep. 24, 2013 — Researchers at Purdue University have shown how to reduce fuel consumption while improving the efficiency of hydraulic steering systems in heavy construction equipment.
The new approach incorporates several innovations: It eliminates valves now needed to direct the flow of hydraulic fluid in steering systems and uses advanced algorithms and models to precisely control hydraulic pumps. New designs might also incorporate textured "microstructured" surfaces inside pumps to improve performance.
"Fuel consumption of heavy off-road equipment accounts for a significant portion of total global fuel usage, so improving efficiency is very important," said Monika Ivantysynova, Maha Fluid Power Systems Professor in Purdue's School of Mechanical Engineering. "It's also important from a commercial business point of view because money saved on fuel improves a company's bottom line."
Typical hydraulic systems in heavy equipment use a central "variable displacement pump" that delivers fluid, and valves that throttle the flow of fluid to linear and rotary "actuators" that move tools such as shovels, buckets and steering mechanisms. This throttling causes energy to be dissipated as heat and wasted.
In the new valveless design, each actuator has its own pump, eliminating the need for valves. The actuator motion can be precisely controlled by adjusting the pump displacement, which changes the amount of fluid being delivered to the actuator. Being able to adjust the pump displacement makes it possible to run the machinery's diesel engine at optimal speeds, resulting in additional fuel savings.
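The principle rests on two textbook hydraulics relations: an ideal pump delivers flow in proportion to its displacement and shaft speed (Q = D x n), and a cylinder extends at the flow divided by its piston area (v = Q / A). The Python sketch below is illustrative only; the function names and every number in it are assumptions for the example, not values from the Purdue prototype:

```python
import math

# Illustrative sketch of pump-controlled ("valveless") actuation.
# All numbers are assumed, not taken from the Purdue system.

def pump_flow_lpm(displacement_cc_per_rev, shaft_speed_rpm):
    """Ideal pump flow, Q = D * n, in liters per minute."""
    return displacement_cc_per_rev * shaft_speed_rpm / 1000.0

def cylinder_speed_ms(flow_lpm, bore_diameter_m):
    """Actuator extension speed, v = Q / A, in meters per second."""
    area_m2 = math.pi * (bore_diameter_m / 2.0) ** 2
    return (flow_lpm / 1000.0 / 60.0) / area_m2

ENGINE_RPM = 1800  # engine parked at an assumed efficient speed

for displacement in (10, 45, 90):  # cc/rev, swept from minimum to maximum
    q = pump_flow_lpm(displacement, ENGINE_RPM)
    v = cylinder_speed_ms(q, bore_diameter_m=0.08)
    print(f"D = {displacement:2d} cc/rev -> Q = {q:5.1f} L/min -> v = {v:.3f} m/s")
```

Sweeping displacement alone changes actuator speed directly, which is why the diesel engine can stay at a single efficient operating point instead of throttling flow through valves.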
Findings are detailed in a research paper being presented during the SAE 2013 Commercial Vehicle Engineering Congress on Oct. 1-3 in Rosemont, Ill. The paper was authored by doctoral student Naseem Daher and Ivantysynova, director of Purdue's Maha Fluid Power Research Center.
Present hydrostatic steering systems are plagued by poor energy efficiency, and industry is developing new "steer-by-wire technologies" to reduce fuel consumption and improve performance. However, the steer-by-wire systems being developed still require energy-wasting valves.
Testing the new "electro-hydraulic power steering system" on a front loader has shown a 15 percent fuel savings and 23 percent increased machine productivity, for a total fuel efficiency increase of 43 percent during steering maneuvers.
"The world's first pump-controlled steer-by-wire prototype machine is now readily available for further research and development," said Ivantysynova, who has a dual appointment in the Department of Agricultural and Biological Engineering.
In previous projects, Maha researchers have shown that valveless systems could reduce fuel consumption by 40 percent in an excavator equipped with the technology. Measurements on the same excavator prototype also showed 70 percent productivity improvement in terms of tons of soil removed per kilogram of fuel consumed.
The new steering system also may help reduce operator fatigue while improving safety by controlling the level of "steering-wheel torque feedback." Steer-by-wire technology removes all torque -- the twisting force required to turn the steering wheel. However, removing the torque is potentially dangerous because the driver lacks the tactile feedback needed to properly control the vehicle.
In the new system, torque feedback is regulated according to parameters such as steering wheel angle and turning speed, vehicle speed and the angle of a rotating joint that connects the vehicle's two subframes.
New thermodynamic modeling by the group also has found that steel parts in the pump undergo significant deformations from high heat during operation.
"The deformation due to heat can be as large as the thickness of the lubricating film, and this is very important," she said. "We have developed the only code that models these lubricating interfaces under extreme heat and high pressure."
The research paper includes details of the system's layout, the hardware and electronic controller developed through the use of modeling. The researchers developed and used the modeling to simulate the system's performance.
The Purdue laboratory is working with industry partners on applied research projects, said Anthony Franklin, the Maha lab's manager.
"Our prototypes are very close to commercial prototypes, so they are readily adaptable to machines now in use and can be easily industrialized when manufacturersdecide to make the transition into valveless systems," he said.
About 25 students are involved in research in the center.
The Maha Fluid Power Research Center is part of the Engineering Research Center for Compact and Efficient Fluid Power, funded by the National Science Foundation, participating companies and universities.
Dead Space 3: Big News Coming Tomorrow
Visceral Games has a big announcement to make regarding Dead Space 3. They're giving us a day to get ready for this news. In the meantime, they've released a teaser image.
The teaser shot shows Isaac Clarke and his new co-op partner Sgt. John Carver heading through a darkened facility together. There's nothing really remarkable about this picture. If this is supposed to be a clue to tomorrow's news, it's not much of one.
It's possible that Visceral is about to release another gameplay trailer for DS3. It's been over a month since the last one. That's not a long time to wait between trailers, but updates on the game should start coming at a quicker rate as we approach February.
Another possibility is that they're going to announce a demo for the game. Game releases slow to a crawl in December and January so Visceral could get some exposure for their game by releasing a free taste.
Have any good (or bad) ideas about what the announcement could be? Let us know in the comments section below.
Stories from Climate Central's Science Journalists and Content Partners
Katrina: Lasting Climate Lessons for a Sinking City
By Bobby Magill
This week marks a decade since Hurricane Katrina spun violently toward the coasts of Louisiana and Mississippi, ravaging both states when it barreled ashore on Aug. 29, 2005. Katrina taught New Orleans and the Gulf Coast many lessons about how vulnerable the region is to natural disaster, especially to sea level rise and storm surge made worse by climate change. But a more complex, man-made problem also threatens New Orleans and it was captured in the indelible images taken in the aftermath of the hurricane, when miasmal flood waters submerged up to 80 percent of the city: as sea levels rise, the Crescent City is sinking.
About 80 percent of New Orleans flooded when Katrina barreled ashore in 2005.
Credit: Kelly Garbato/flickr
New Orleans flooded because the levees protecting it broke after the hurricane struck. The water stayed put, however, because the city is in a bowl dipping below sea level — and that bowl is getting deeper, sinking at a rate of up to 4 feet a century, primarily because the surrounding swamps were drained so the metro area could be expanded.
Accounting for the land’s subsidence, the sea level in southeast Louisiana is expected to rise by more than 20 inches by 2050. That, coupled with increased tropical storm intensity driven by climate change — and the inexorable disappearance of the coastal wetlands that act as a storm surge buffer — has put New Orleans in a precarious position in a warming world.
“As sea level rises, the vulnerability of the land that is exposed to the ocean is higher if the land is sinking,” Virginia Burkett, a lead author of the Intergovernmental Panel on Climate Change’s Fifth Assessment Report and the chief scientist for global change at the U.S. Geological Survey, said. “The rate of subsidence of the land’s surface here in Louisiana is two to three times the global rate of mean sea level rise.”
In other words, it’s as if the sea level in southeastern Louisiana is rising three times as fast as the global average.
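The arithmetic behind that comparison is straightforward: the rise a coastal community actually experiences ("relative" sea-level rise) is the global rise plus local subsidence. A toy calculation, assuming a global rate of roughly 3 millimeters per year (a commonly cited satellite-era figure) and the two-to-three-times multiplier Burkett describes:

```python
# Toy numbers: the ~3 mm/yr global rate is an assumption for illustration;
# the 2-3x subsidence multiplier comes from the quote above.
GLOBAL_MM_PER_YR = 3.0

for multiple in (2, 3):
    subsidence = multiple * GLOBAL_MM_PER_YR
    relative = GLOBAL_MM_PER_YR + subsidence
    print(f"subsidence {subsidence:.0f} mm/yr -> relative rise {relative:.0f} mm/yr")

# For scale, "up to 4 feet a century" of sinking is about 12 mm/yr by itself:
print(f"{4 * 304.8 / 100:.1f} mm/yr")
```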
Much of New Orleans, except for the oldest areas situated on higher ground, was built around the turn of the 20th Century on swampy soil which was drained by engineers. In the long run, that caused the soil to compact, or subside, forcing anything on the surface — houses, for example — to sink.
“As soon as you start draining these wetlands, they start sinking like crazy,” Tobjorn Tornqvist, a geology professor specializing in sea level and climate at Tulane University in New Orleans, said. “This added a lot to the (Katrina) disaster. If you imagine a situation where the land had not subsided due to this artificial drainage, there would still be a lot of flooding, but the water would have drained relatively fast.”
Another reason the region is sinking is that humans re-engineered the Mississippi River, shutting off its natural land-building process in southeastern Louisiana.
It worked like this: For millennia, the Mississippi River flooded, depositing layers of sediment in the wetlands of the Mississippi Delta, creating new land by layering the region with new soil. That soil naturally compacts and sinks under its own weight, but with each flood and new layer of sediment, the land was replenished and its elevation stayed more or less the same.
That process ended when the U.S. Army Corps of Engineers built levees and dams along the river to preserve and solidify the Mississippi’s famously meandering main channel for shipping, trapping some of the river’s rich sediment upstream and sending the rest into the deep waters of the Gulf of Mexico, forcing it off the edge of the Continental Shelf.
All of those local factors are aiding and abetting the submergence of southern Louisiana along with global factors such as accelerating sea level rise and increasing storm intensity because of climate change, Burkett said.
“Overnight, during hurricanes Rita and Katrina, about 215 square miles of land were lost — natural lands eroded away overnight,” she said. “That would leave the coast more vulnerable to the next storm. Meanwhile, 10 years later, there has been a tremendous effort by the state to restore the barrier islands that attenuate the storm surge, to restore some of the marshes, to get the sediment out of the river into to the marshes to make them more resilient.”
Since Katrina, the U.S. Army Corps of Engineers has rebuilt and strengthened the levees around New Orleans, and the state has begun restoring some of the barrier islands that protect the region. It is also considering a $4 billion plan to divert sediment from the Mississippi River as a way to restore some of the river's land-building capabilities.
The diversions, likely accomplished with gates installed in levees along the river's shore, are still in the design and approval phase and are facing pushback from fishing communities and oyster farmers, said Chip Kline, director of coastal activities for Louisiana Gov. Bobby Jindal and chairman of the Louisiana Coastal Protection and Restoration Authority.
“The logic behind these diversions is if they built this state, then they should absolutely be used in restoring the state,” Kline said. “Every single credible study (and) planning effort has called for reconnecting the river to coastal wetlands. The Mississippi River is really the lifeline to the land loss crisis down here.”
However effective higher levees, coastal restoration and sediment diversions from the Mississippi River may be, the endgame for New Orleans likely rests in humanity’s ability to slow the progress of climate change.
“Ultimately, climate change is our biggest problem,” Tornqvist said. “You can divert sediment as much as you want. If sea level starts ramping up to rates that approach five to 10 millimeters per year, well, it’s going to be an incredibly difficult situation.”
Large river diversions that help create new wetlands are likely to take decades to be effective against future storm surges.
“In the meantime, if sea levels start to ramp up, all these diversions are going to do is delay the point of no return basically by a couple decades,” Tornqvist said. “It will be increasingly clear that the city is not going to survive. Then you will get into issues like, what are we going to do? Are we going to relocate the city? How that’s going to play out, it’s hard to predict. What’s really critical is what kind of action will be taken within the next decades.”
Topic: International Space Station: Benefits for Humanity (NASA Publication 2015-01-001-JSC)
posted 07-07-2015 08:24 AM

NASA release

NASA Book Shows How Space Station Research Offers "Benefits for Humanity"

A new book from NASA is showing how research aboard the International Space Station helps improve lives on Earth while advancing NASA's ambitious human exploration goals.

NASA will release "Benefits for Humanity" online and in print at the fourth annual International Space Station Research and Development Conference, which is being held Tuesday through Thursday in Boston. The book highlights benefits in a number of key areas including human health, disaster relief and education programs to inspire future scientists, engineers and space explorers.

"Some 250 miles overhead, astronauts are conducting critical research not possible on Earth, which makes tremendous advances in our lives while helping to expand human presence beyond low Earth orbit," said William Gerstenmaier, NASA associate administrator for Human Exploration and Operations. "Since 2012, this research has been carried to orbit by our U.S. commercial cargo providers Orbital ATK and SpaceX. Both companies will return to flight soon, having learned from recent challenges to perform even stronger. In the next few years, SpaceX and Boeing will send our crews to orbit from the United States, increasing the size of space station crews to seven, doubling the amount of crew time to conduct research for all of humanity."

The space station, which has been continuously occupied since November 2000, has been visited by more than 200 people and a variety of international and commercial spacecraft. It is an unprecedented success in global cooperation to build and operate a research platform in space. In a partnership between five member space agencies representing 15 countries, it advances a unified goal to utilize the orbiting laboratory for the betterment of humanity. The partner agencies include NASA, the Russian Federal Space Agency (Roscosmos), the Japan Aerospace Exploration Agency (JAXA), the European Space Agency (ESA) and the Canadian Space Agency (CSA).

"People do not realize how much their lives today have been made better by the space station," said Julie Robinson, NASA International Space Station chief scientist. "You would be surprised to know that station research has resulted in devices that can help control asthma and sensor systems that significantly improve our ability to monitor the Earth and respond to natural hazards and catastrophes, among many other discoveries."

Scientists use the Japanese Experiment Module (JEM), also known as Kibo, to research effective drugs that may improve the lives of patients suffering around the globe.

"The International Space Station and Kibo remind me of a computer," said Kazuyuki Tasaki, deputy director of the JAXA JEM Utilization Center. "After being invented, the computer disseminated diverse public knowledge applicable in many fields, such as computing, simulation, word processing, games and the Internet. The space station and Kibo also offer huge potential for benefitting humankind."

Since 2010, the Vessel-ID System, installed on ESA's Columbus module, has improved the ship tracking ability of coast guards around the world and even aided rescue services for a lone shipwreck survivor stranded in the North Sea.

"The International Space Station with its European Columbus laboratory is steadily producing lots of important research results which are relevant for many areas of life on Earth," said Martin Zell, head of ESA's Space Station Utilisation and Support. "Experimental demonstration of new technologies, as well as the interaction between astronauts and younger generations on Earth for educational activities are invaluable benefits from the permanent human space laboratory in low-Earth orbit."

CSA's robotic heavy-lifters aboard the space shuttle and station, Canadarm, Canadarm2 and the Special Purpose Dexterous Manipulator (Dextre), inspired medical technology that is changing the lives of patients on Earth.

"Technologies developed for the assembly and maintenance of the station are helping to save lives here on Earth," said Nicole Buckley, CSA chief scientist, Life Sciences and ISS utilization. "The Canadian robotics system that helped build and now operates on the International Space Station has led to tools that give doctors new ways to detect cancer, operate on sick children, and perform neurosurgery on patients once considered to be inoperable."

In addition to the updated benefits book, NASA released its third iteration of the International Space Station Reference Guide, which explains what the space station does and how it works. This release focuses on the station's capabilities to perform pioneering science in its microgravity environment. To date, 83 countries have taken part in more than 1,700 experiments and educational efforts on this world-class laboratory in space.

The Center for the Advancement of Science in Space (CASIS), which is hosting the conference in cooperation with the American Astronautical Society and NASA, is releasing a new research-focused, interactive website that provides tools, information and resources to give researchers a competitive edge sending new investigations to the space station.
GoesTo11
Member

Posts: 1215
From: Denver, CO USA
Registered: Jun 2004
posted 07-07-2015 09:43 PM

Robert (or anyone else), do you know if the third edition ISS guide will be produced in a print edition as well? Thank you!
RIM, India To Continue Discussions Over BlackBerry Monitoring: Reports
by Joseph F. Kovar on August 26, 2010, 10:09 pm EDT
Officials from India's government and RIM executives are planning to meet on Friday to discuss India's demand that the country can have the ability to monitor communications sent over BlackBerry devices in that country.
The two sides on Thursday met to discuss India's request to be able to monitor Research In Motion's BlackBerry corporate e-mail service after the Indian government set an August 31 deadline for a solution, the Dow Jones News Service reported.
After Friday's meeting, which includes officials of India's Intelligence Bureau and the National Technical Research Organization, Indian officials will discuss the issue over the weekend to prepare the government's response, Dow Jones reported.
Meanwhile, RIM on Thursday offered to lead a group of wireless industry companies to discuss the security concerns raised by the government of India, MarketWatch reported.
RIM has been facing questions from a number of developing countries, including India, Saudi Arabia, and the United Arab Emirates, over their concerns that the level of security encryption of its BlackBerry is so high that they are unable to monitor emails sent from the device.
These countries cite national security issues, including the ability to thwart terrorist activities, as reasons for needing that monitoring capability. RIM has responded in the past that it has no "back door" or encryption key that can be provided to allow such monitoring.
RIM has told India that placing BlackBerry networking infrastructure in India would not answer that country's concerns because the network would still work the same, and that shutting down the service in India would not work because other wireless messaging networks have similar security technologies, MarketWatch reported.
Climate change conversation gets boost from expert speaker network
UCAR building nationwide stable of experts on hot topic
By Charlie Brennan, Camera Staff Writer
Posted: 04/09/2014 10:27:16 AM MDT

More information
What: Climate Voices science speakers network
Info: climatevoices.org
It seems everyone is talking about climate change these days — but who can you trust?
Boulder's University Corporation for Atmospheric Research believes it has the answer. More than 160 of them, at this point.
UCAR, which manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation, has launched what it calls ClimateVoices.org, a database of experts from around the country who will be available to speak to groups about one of the hottest topics affecting the planet today.
"What we're trying to do with Climate Voices is to give context-setting presentations about the science when they meet with public groups," said Cindy Schmidt, UCAR adviser for climate outreach.
"But then the very important thing for them to do is to begin conversations with the public about climate science, about climate change, about the possible effects of climate change, what they might be seeing in their own region of the country, and, as citizens, what they might want to do about this in their community."
'Yes, climate change is happening'

Events at the national and even the global level are coming together to supercharge the climate change discussion.
The Working Group III report on the ongoing Fifth Assessment of the Intergovernmental Panel on Climate Change, which will address the mitigation of climate change, is due to be released early Sunday in Berlin. That comes right on the heels of the Working Group II report of the IPCC, which came late last month and included contributions from several Boulder scientists.
In addition to that, the National Climate Assessment, with a focus on climate change in the United States instead of the global perspective of the IPCC reports, will be coming from the U.S. Global Change Research Program late this month or early in May.
"We know that 2014 is a year of opportunity and progress when it comes to climate change," Kathy Calvin, president and CEO of the U.N. Foundation, said in a news release. "By bringing expert voices to the forefront, Climate Voices will ensure that science is at the heart of the worldwide discussion this year about what can and should be done to confront this issue."
"This shouldn't be a debate about whether climate change is happening," Schmidt said. "We are starting from the scientific results, saying, 'Yes, climate change is happening and human beings have something to do with it.'"
The idea of developing a resource base for people who can help lead community-level discussions came from Tim Wirth, the former U.S. senator from Colorado who sits on the board of the United Nations Foundation, and brought the proposal to UCAR.
'The people who have the facts'

By both recruiting scientists from all 50 states through the UCAR consortium of more than 100 colleges and universities and encouraging those in the scientific community to sign up, a database of more than 160 experts can now be accessed at climatevoices.org.
"Members of the public can go there and contact them and ask them to come to their particular groups," Schmidt said. "We're talking about libraries, chambers of commerce, service organizations, faith-based institutions, all sorts of organizations can reach out and use this resource; even if there is a group of neighbors, they could get together in their living rooms and invite one of the scientists to to come talk them."
All visitors to Climate Voices need to do is go the site, enter a city and state or ZIP code, after which they will be greeted by a list of participating scientists in their area, and a tab offering means by which an invitation to that speaker can be issued.
Schmidt emphasized that although climate change is often a politically charged topic, what UCAR is presenting is not an advocacy program.
"That's why we're asking scientists to do this, because they are the people who have the facts about the climate science, and they can relay those facts to their own neighbors and fellow citizens," Schmidt said.
Contact Camera Staff Writer Charlie Brennan at 303-473-1327, brennanc@dailycamera.com or twitter.com/chasbrennan.
Oracle to Buy Sun Microsystems
Shane McGlaun (Blog) - April 20, 2009 12:10 PM
IBM's loss is Oracle's gain
Sun has been shopping around for a buyer to help it turn around its falling profits and margins for a while. IBM and Sun were in talks over a potential deal -- IBM offered Sun $9.40 per share and the offer was met with resistance by Sun's board.
That resistance led IBM to walk away from the negotiations. The Wall Street Journal reports that Oracle has now agreed to purchase Sun for $9.50 per share. The value of the transaction is $5.6 billion and excludes Sun's cash and debt. Sun reportedly had about $2.6 billion in cash and short-term investments and $700 million in long-term debt as of December 28, 2008.
A Sun/Oracle merger makes sense with Sun servers being sold with Oracle database software for a long time. Buying Sun will allow Oracle to offer complete solutions of hardware and software to businesses looking for a one-stop shop.
Other than the issue with IBM offering less than Sun's board wanted for the company, reports had IBM being concerned about antitrust issues stemming from the purchase. Oracle is believed to have less of an antitrust issue since it has fewer businesses that compete directly with Sun.
Oracle executives believe that the purchase will pay off quickly for the company despite the fact that Sun has been posting losses for the last three quarters. Sun is expected to add more than $1.5 billion to Oracle's operating profit excluding charges and other items in the first year, with that number growing to over $2 billion the second year after the purchase.
The Wall Street Journal reports that some analysts were stunned by the purchase. AMR Research analyst Bruce Richardson said, "The last thing you expected was a database-software company to buy a hardware customer base."
RE: A dime more a share...
MySQL can't really be compared to Oracle Lite. MySQL is used alongside Oracle all the time; Google and Facebook do it behind the scenes, to name a few. Oracle Lite is a stripped-down version of Oracle that lacks basic features that MySQL has had for a very long time.
DOJ Files Paperwork to Kill AT&T/T-Mobile Merger; AT&T, Sprint Respond
Brandon Hill & Jason Mick - August 31, 2011 11:21 AM
“AT&T’s elimination of T-Mobile as an independent, low- priced rival would remove a significant competitive force from the market." -- U.S. DOJ (Source: Miramax Films)
U.S. Justice Department says combined company would break antitrust laws
AT&T Inc.'s (T) bid to acquire rival U.S. carrier Deutsche Telekom AG's (ETR:DTE) T-Mobile USA may have slammed into a roadblock today, with Bloomberg reporting that the U.S. Department of Justice had filed paperwork to block the acquisition. Back in March, AT&T first announced its intentions to purchase T-Mobile for a whopping $39 billion USD ($25 billion USD in cash, the remainder in stock). Reportedly AT&T had been negotiating the purchase behind closed doors since January.

The deal would mark the latest consolidation in the race by America's top carriers Verizon Communications, Inc. (VZ) and AT&T to gobble up would-be competitors. Had the deal been approved, it would have granted AT&T and Verizon control over more than 80 percent of American cell phone contracts -- a "duopoly" according to some. AT&T would have approximately 130 million subscribers, while Verizon would have 97 million subscribers.

Verizon and AT&T often marched in lock step when it came to pricing and contract terms, as witnessed by both companies' decisions to kill "unlimited" smart phone internet privileges for new customers [1][2].

AT&T Chairman and CEO Randall Stephenson championed the deal, stating that it would fulfill President Obama's vision to blanket the nation with high-speed wireless access:

This transaction delivers significant customer, shareowner and public benefits that are available at this level only from the combination of these two companies with complementary network technologies, spectrum positions and operations. We are confident in our ability to execute a seamless integration, and with additional spectrum and network capabilities, we can better meet our customers' current demands, build for the future and help achieve the President's goals for a high-speed, wirelessly connected America.

However, the majority of AT&T and T-Mobile customers we interviewed expressed concern about the deal -- particularly T-Mobile customers, who feared increases from T-Mobile's budget prices.

The deal would have left Sprint Nextel Corp. (S), which bitterly opposed the deal, in a distant third place, with 52 million customers. Sprint thus far has refused to join the informal Verizon-AT&T coalition when it comes to contract terms -- it continues to offer unlimited internet. Reportedly the U.S. Justice Department sided with Sprint in the end, filing paperwork to block the merger, citing antitrust concerns. "AT&T's elimination of T-Mobile as an independent, low-priced rival would remove a significant competitive force from the market," said the filing [press release].

"The combination of AT&T and T-Mobile would result in tens of millions of consumers all across the United States facing higher prices, fewer choices and lower quality products for mobile wireless services," said Deputy Attorney General James M. Cole. "Consumers across the country, including those in rural areas and those with lower incomes, benefit from competition among the nation's wireless carriers, particularly the four remaining national carriers. This lawsuit seeks to ensure that everyone can continue to receive the benefits of that competition."

If the U.S. Government makes good on its intentions and officially blocks the sale of T-Mobile, AT&T has promised to pay Deutsche Telekom $3 billion USD in cash. In addition, Deutsche Telekom would receive an additional $2 billion USD worth of spectrum and secure a $1 billion USD roaming agreement, reports Reuters.

Both AT&T and Sprint have responded to the actions by the DOJ. Up first is Wayne Watts, AT&T Senior Executive Vice President and General Counsel:

We are surprised and disappointed by today's action, particularly since we have met repeatedly with the Department of Justice and there was no indication from the DOJ that this action was being contemplated.

We plan to ask for an expedited hearing so the enormous benefits of this merger can be fully reviewed. The DOJ has the burden of proving alleged anti-competitive effects and we intend to vigorously contest this matter in court.

At the end of the day, we believe facts will guide any final decision and the facts are clear. This merger will:

- Help solve our nation's spectrum exhaust situation and improve wireless service for millions.
- Allow AT&T to expand 4G LTE mobile broadband to another 55 million Americans, or 97% of the population.
- Result in billions of additional investment and tens of thousands of jobs, at a time when our nation needs them most.

We remain confident that this merger is in the best interest of consumers and our country, and the facts will prevail in court.

Next up is Vonya B. McCann, senior vice president of Government Affairs for Sprint:

The DOJ today delivered a decisive victory for consumers, competition and our country. By filing suit to block AT&T's proposed takeover of T-Mobile, the DOJ has put consumers' interests first. Sprint applauds the DOJ for conducting a careful and thorough review and for reaching a just decision – one which will ensure that consumers continue to reap the benefits of a competitive U.S. wireless industry. Contrary to AT&T's assertions, today's action will preserve American jobs, strengthen the American economy, and encourage innovation....
RE: Really???
Nutzo
quote: Pure capitalism results in robber barons and workers barely above slave status. Unless you are a robber baron I don't see why anyone would want pure capitalism. You guys listen to too much fox news propaganda.

Wrong. What you are referring to is closer to Mercantilism. The robber barons cannot exist without support from the government to hold onto their monopolies. Pure capitalism, with limited laws & regulations (like in the US constitution) provided the best opportunity and the highest overall standard of living for the middle class.
quote: Pure capitalism, with limited laws & regulations (like in the US constitution) provided the best opportunity and the highest overall standard of living for the middle class.

So-called pure capitalism is what's giving us high unemployment today. It's given us a so-called global market. I don't know how "global" a market is when everyone is free to sell their crap to us because we have so little regulation, yet we can't sell diddly or own property in other countries. They already know the truth; we Americans are just so stupid that we refuse to see.
Pure capitalism, with limited laws & regulations = capitalism. You don't need the "Pure" in it.
Study: Greenhouse Gas Emissions Have Caused Irreversible Effects on Sea-Level Rise
(Source: climatepedia.org)
Emissions up until this point have ensured an irreversible sea-level rise of 1.1 meters by the year 3000
A new study has found that it's too late to reverse the effects that greenhouse gas emissions will have on sea levels over the next thousand years -- but we could lessen the impact of these effects if proper changes are made. According to research by scientists at Vrije Universiteit Brussel, Manchester Metropolitan University and the Université catholique de Louvain, greenhouse gas emissions produced up to this point have ensured an irreversible sea-level rise of 1.1 meters by the year 3000. This number could increase, they warn, if no action is taken to reduce these levels -- and the effects could extend thousands of years into the future.

The research team came to this conclusion by modeling sea-level changes over thousands of years while including all of our planet's ice sheets and warming of the oceans in its projections. This includes glaciers, ice caps and the Greenland and Antarctic ice sheets. The team said this has never been done before.

Using a climate modeling system called LOVECLIM, the team analyzed several scenarios over the next thousand years. It found that there will be a sea-level rise of at least 1.1 meters by the year 3000, but if certain other emissions scenarios were followed, it could increase to 2.1, 4.1 or even 6.8 meters. The study also found that the Greenland ice sheet was the cause of over half of the sea-level rise, while thermal expansion of the ocean came in second place and glaciers/ice came in third.

"Ice sheets are very slow components in the climate system; they respond on time scales of thousands of years," said Professor Philippe Huybrechts, co-author of the study. "Together with the long lifetime of greenhouse gases in the atmosphere, this inertia is the real poison on the climate system; anything we do now that changes the forcing in the climate system will necessarily have long consequences for the ice sheets and sea level."
"Ultimately, the current polar ice sheets store about 65 metres of equivalent sea level and if climatic warming will be severe and long-lasting, all ice will eventually melt. Mankind should limit the concentration of greenhouse gases at the lowest possible level as soon as possible. The only realistic option is a drastic reduction of the emissions. The lower the ultimate warming will be, the less severe the ultimate consequences will be." This study was published in Environmental Research Letters. Source: Science Daily Comments Threshold -1
RE: Irreversible
Milliamp
If you want to go that far back you can't ignore the 99.9% extinction rate..
Denigrate
Great point. Many mass extinction events throughout history, but somehow it's only humans who cause extinction events today. We are destroying the Earth's diversity, and it'll never recover.
icemansims
Never recover? Good god, you don't understand at all, do you? Our environment is changing. With or without us many species die off and develop almost on an annual basis. Most of this happens at the microscopic level so you never even notice. The point, however, is that nothing is EVER recoverable.

Is the planet warming? Yes.
Is there evidence that human influence is accelerating that (solar output accounts for more)? Equivocally, yes.
Does that mean we're "destroying the planet"? No.
Does that mean we could make the planet's environment hostile to humanity's needs? Yes.

At very worst, we'll kill off ourselves. Life will continue. All species die off in their own time, usually around ~10 million years, according to fossil record. More will take their place. Everything changes.
WLee40
Great points. I wish I could vote you up, but I already made a post and you said many things I was trying to say.
freedom4556
This. A thousand times this. Pretty much sums up my whole viewpoint on "climate change". Everybody needs to just accept the bigger picture and move on.
San Francisco Can No Longer Warn Cell Phone Customers About Radiation Hazards
Tiffany Kaiser - May 9, 2013 11:26 AM
The city Board of Supervisors voted in favor of a settlement on Tuesday
The city of San Francisco has lost the right to place warnings about cell phone radiation levels in its retail stores.
San Francisco was the first U.S. city to pass an ordinance that required retailers to warn consumers about cell phone radiation before they made the purchase. However, the city lost a court battle with the Cellular Telecommunications Industry Association (CTIA) -- which is comprised of companies like AT&T, Verizon, Samsung and Apple -- and will now lift the ordinance from city retailers.
The city Board of Supervisors voted in favor of a settlement on Tuesday, where San Francisco agrees to drop the ordinance. In exchange, the CTIA will waive its claims for attorney fees that would have amounted to about $500,000. "I am for pushing the envelope on something as important as this, but I think the legal reality is such that if we do not approve this settlement, we're talking about having to pay half a million in legal fees," Supervisor David Campos said. "It's a very tough situation, but the last thing I want is to have the general fund give half a million dollars to lawyers in this case."
The ordinance went into effect in 2011, requiring retailers to tell customers that gadgets like cell phones emit potentially cancer-causing radiation. The city even wanted to post that the World Health Organization (WHO) deemed cell phones "possibly carcinogenic," but a judge blocked this part of the ordinance, since WHO said more research was needed to back that claim. The CTIA filed a lawsuit against the city, saying that the ordinance violated the industry's First Amendment rights. A ruling in the 9th U.S. Circuit Court of Appeals last year said that San Francisco had to prove that scientists agreed with its claims about cell phone radiation. It also had to prove that the FCC no longer believed they were safe for consumers to use.
San Francisco decided to take a settlement offer in the end with the CTIA. Sources: NBC News, SF Gate
RE: First amendment rights?
LucyDoggie
Not true, BRB29!

The World Health Organization's panel of the top scientists from around the world reviewed all the studies to date. They determined by a vote of 29 to 1 to classify cell phone radiation as a possible carcinogen based upon an increased risk of brain cancer after 10 years of use for an average of only 30 minutes a day.

The majority of the published studies done independently of industry's funding DO show increased risk of brain cancer, reduced fertility, DNA damage, acoustic neuroma, malignant salivary gland cancer, etc. etc. etc.

We all love our cell phones and can't imagine life without them now. But, there are safer ways to use cell phones, especially for children - why is the industry suing San Francisco when all they tried to do was to mandate that cell phone retailers disclose the consumer warnings being hidden in all user manuals AND about safer ways to use cell phones, especially with respect to children?

What are they REALLY hiding?
Extrasolar Planet's Missing Water Discovered
Michael Hoffman (Blog) - April 12, 2007 11:16 AM
The debate on whether or not Osiris has water in its atmosphere continues
A new analysis again suggests that gas giant HD209458b currently has water in its atmosphere. The planet -- nicknamed Osiris -- is 150 light years away from Earth, located in the Pegasus constellation. The planet was first detected in late November 1999, with the help of astronomical spectroscopy. The hot, Jupiter-like gaseous planet has been the target of research once scientists believed water could be located somewhere on the planet. Three teams of scientists previously believed there could be water in the planet's atmosphere, but those ideas were questioned after the NASA Spitzer Space Telescope was unable to provide evidence.

Travis Barman, an astronomer working at Lowell Observatory, believes he has discovered the missing water after analyzing the light from a star when it passes through HD209458b's atmosphere. Barman and researchers from Harvard University measured the light coming from Osiris as it reached the furthest part of the 3.5-day orbit it makes around the star. With the help of the Hubble Space Telescope, it was possible to further study water absorption in the planet's atmosphere. Each time the planet passes its parent star, it is possible to analyze how the atmosphere absorbs light passing from the star through the atmosphere. Scientists will continue to study and conduct research to either confirm or deny Barman's research.
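The method at work here is transmission spectroscopy: during a transit the planet blocks a fraction of starlight equal to the square of the planet-to-star radius ratio, and at wavelengths where an atmospheric gas such as water absorbs, the planet's effective radius (and hence the transit depth) is slightly larger. A rough Python illustration, with radii assumed for the example rather than taken from Barman's measurements:

```python
# Illustrative only: the radii below are assumptions, and the "water band"
# bump is invented to show the effect, not a measured value.
R_SUN_KM = 695_700.0
R_JUPITER_KM = 71_492.0

r_star = 1.15 * R_SUN_KM  # assumed host-star radius

radii = {
    "continuum":  1.35 * R_JUPITER_KM,  # assumed radius outside water bands
    "water band": 1.37 * R_JUPITER_KM,  # assumed, slightly larger in a band
}

for label, r_planet in radii.items():
    depth = (r_planet / r_star) ** 2
    print(f"{label:>10}: {depth * 100:.3f}% of starlight blocked")
```

Measuring that tiny wavelength-dependent change in transit depth is what lets astronomers infer which gases the atmosphere contains.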
RE: more on the topic
the presence of water has nothing to do with carbon or silicon-based life. The presence of water merely creates an environment much, much more stable for life to form in, increasing its probability greatly. For simple life molecules to form into larger living organisms, they need nearly stable temperatures and a form of locomotion. Water spreads out heat so that changes in temperatures on a planet occur slowest in the largest bodies of water (kinda like our oceans) and the fluid motion of water provides a locomotion for these molecules which otherwise could not propel themselves to interact with other life-like molecules. The other great benefit of water is that since most basic life molecules are hydrophobic (ie, they are repelled by water, like oil is), it causes all the life molecules to concentrate themselves together (like drops of oil in water), increasing their chances of encountering whatever other life molecules they might need to encounter to survive. Of course this could occur without water, but water is a very good way to accomplish those three (and other) things which help the growth of life-like molecules.
> "the presence of water has nothing to do with carbon or silicone based life"Untrue. Water is hugely important in the development of carbon-based life. But in silicon biochemistry it plays a much smaller part. For instance, let's look at the benefits you list. Water provides temperature averaging...from 0-100C, at least. That's also the most chemically active range for carbon-based compounds. But the temperature range for silicon is far wider....and long-chain silicon molecules are stable at much higher temperatures.How about the fact that most "basic life molecules" in the carbon chain are hydrophobic? True...but the basic building blocks of the carbon cycle (CO2 and oxygen) are hydrophillic, and thus water disperses them freely, both providing them to and dispersing them from organisms within it. But most silicon-based analogues to these (such as silicon dioxide) are hydrophobic as well, which means a "silicon cycle" could never develop within an aqueous enviroment....it would need some other agent.This is why worlds without water are (so we think) unlikely to develop carbon-based life. But silicon-based life? If its possible at all, its very possible without water. Parent | 科技 |
PC ports explained: Get to know the back of your computer
Almost any modern communication need can be handled with a wireless solution. File transfer, streaming video, peripheral connections – all of these can be accomplished without a physical connection. The future is now.
Yet the port persists. No, more than that: It’s alive and well. Take a gander at your home office and you’ll likely find wires of all sorts leading to various connections: USB, HDMI, DVI, Thunderbolt, the list goes on.
Physical connections are still the quickest, most reliable way to transfer data. Which means it’s still important to know what goes where, and why. Let’s clear the air and make room for some modern knowledge of old-fashioned connectivity.
The Universal Serial Bus would make a good role model for super-villains everywhere. It pledged to take over the world. Then it did so. It took well over a decade, but it has happened. FireWire is basically obsolete. External SATA is nearly extinct. Only Thunderbolt may provide a serious challenge – but it’s years away from widespread adoption.
Modern USB essentially comes in two forms – USB 2.0 and USB 3.0. The ports look the same and are compatible with each other, which is great. Except it makes separating the two difficult. Manufacturers the world over have tried to resolve this: the standard says that USB 3.0 ports should be blue or should be identified by the SuperSpeed USB 3.0 logo (see below).
If it’s not blue or identified by this logo, it’s not USB 3.0. Or at least it shouldn’t be. We’ve yet to encounter a computer that failed to identify USB 3.0 ports by at least logo, but we have run into a couple (both laptops) that didn’t use blue.
The main difference between the standards is speed. The maximum bandwidth of 3.0 is over 10 times higher than 2.0. This doesn’t mean transfer speeds are ten times better in the real world, but there is a huge difference. You’ll see much quicker file transfers with a USB 3.0 drive plugged in to a 3.0 port. Transfer speeds are not better if you plug a 3.0 drive into a 2.0 port. Data can still be transferred, but only at 2.0 speeds.
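The headline signaling rates make the gap concrete: USB 2.0 tops out at 480 Mbit/s, while USB 3.0 is specified at 5 Gbit/s. Real-world throughput falls well short of both figures, but the ratio holds, as a quick calculation for a hypothetical 25 GB transfer shows:

```python
# Theoretical best case; actual drives and controllers are slower.
FILE_GB = 25  # an assumed file size for the example

for name, mbit_per_s in (("USB 2.0", 480), ("USB 3.0", 5000)):
    seconds = FILE_GB * 8000 / mbit_per_s  # gigabytes -> megabits
    print(f"{name}: about {seconds / 60:.1f} minutes at wire speed")
```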
FireWire was developed by Apple to solve the lack of high-speed connections available to peripherals during the early 90s. Speed was given high priority, and it showed in the resulting standard. FireWire went through several revisions, and each was consistently quicker than USB.
Until now. USB 3.0 has upped the ante, and instead of calling, FireWire’s supporters have chosen to fold. It was probably a wise move. FireWire never gained the widespread appeal of USB. Losing its performance advantage made it nearly obsolete.
Still, many people have an older camera or peripheral which must be connected via FireWire. If you’re among this crowd you will need to plan on using adapters in the future. FireWire support is near extinction in the laptop space and nearly dead among desktops as well.
eSATA is related to the common SATA standard that's used by nearly all modern hard drives, but is designed for external peripherals. It takes advantage of SATA's excellent bandwidth to provide fast transfer speeds.
Sounds great, right? But there are a couple problems. One is a lack of support for power in the standard. USB and FireWire are both capable of powering devices, which is why most USB peripherals and storage devices don’t need external power. There’s no support for that in eSATA. A work-around port called eSATAp fixes that, but this port is rare and not part of the official SATA standard.
Another issue is the standard's maximum cable length. SATA was built for use in computers, so the cables only had to work over short distances. This means the maximum length of cable is six feet, six inches. Longer cables can be made, of course — it wouldn't cause the space-time continuum to collapse — but they also wouldn't be guaranteed to work.
Thunderbolt is a new type of connection developed by Intel under the codename Light Peak. As that name suggests, Thunderbolt was initially intended to be a fiber-optic connection capable of 10 Gbit/s (twice the bandwidth of USB 3.0) but Intel engineers figured out how to accomplish this goal using only copper wire. This made Thunderbolt less expensive and gave it the ability to deliver power, a critical trait for any connection that dreams of widespread adoption. In fact, Thunderbolt can deliver a whopping 10 watts, which is over twice as much as USB 3.0.
This connection also doubles as a DisplayPort 1.2-compatible A/V connection. It’s possible to daisy-chain up to seven different devices (both displays and peripherals) off one Thunderbolt port, though there are limitations based on the types of devices connected.
Thunderbolt seems set to one day replace USB 3.0, but for now it remains expensive and only a handful of companies have adopted it. Apple was first to include it on production PCs. Other manufacturers are beginning to follow this lead, but only on high-end products. Even if you do have the port, there’s not much to connect to it besides DisplayPort-compatible monitors and a small (but growing) selection of external hard drives.
If you plan to buy a computer in 2014, consider this a must-have. For now it’s a great technology that needs to gain market acceptance.
Now it’s time to jump away from the general-purpose connections and start talking about those dedicated only to audio and video. DVI seems like a good place to start.
DVI is the old man of modern video connections. Its bloodline dates all the way back to 1999 and it didn’t see widespread acceptance until 2002 and 2003. Since then, it has resisted several different attempts to completely replace it, though its strength does seem to be fading.
Because it was developed as a successor to VGA, this connection can handle analog signals. That’s not going to be a factor for most readers but it may be worth noting if you still have an old VGA monitor kicking around. DVI also can output audio when paired with an appropriate video card and a DVI-to-HDMI adapter.
DisplayPort was one of two A/V connections (the other being HDMI) developed in the middle of last decade. This connection was developed specifically with computer monitors in mind and is meant to be the full-digital replacement for DVI.
On paper, DisplayPort is a technical masterpiece. It has a maximum data rate of up to 18 Gbit/s in best fighting form. Like its sibling, Thunderbolt, DisplayPort allows for daisy-chain configurations. It's possible to run up to four 1080p displays with a single DisplayPort connection. Another nice advantage is cable length: The spec supports up to three meters in copper and fifteen with fiber-optic – but be warned, those cables are expensive.
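The four-display claim is easy to sanity-check. An uncompressed 1080p stream at 60 Hz and 24 bits per pixel needs roughly 3 Gbit/s of raw pixel data; real display timings add blanking overhead on top of that, which this sketch ignores:

```python
width, height, refresh_hz, bits_per_pixel = 1920, 1080, 60, 24

per_display = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"one 1080p60 display: {per_display:.2f} Gbit/s")
print(f"four daisy-chained:  {4 * per_display:.2f} Gbit/s (vs ~18 Gbit/s available)")
```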
This connection is very good, but only if you have a monitor that supports it. Many inexpensive monitors don’t. The consumer television market is the culprit. Consumers usually know of HDMI, but few know of DisplayPort, which makes it hard to sell. Even so, this connection’s compatibility with Thunderbolt may make it the video standard of the future.
The High Definition Multimedia Interface began production in 2003 as a replacement to all earlier A/V connections. It was built to be a do-it-all cable combining uncompressed audio and video for maximum picture quality.
Computers were never the focus. HDMI was developed for the expected surge of high-definition televisions. But the traits that make HDMI good for televisions also make it good for computers. This connection can handle audio and video with one cable. Better still, the connector is thin and flat, making HDMI great for laptops and other small systems.
All of these advantages also apply to DisplayPort, a connection that has several additional traits that make it technically superior to HDMI. Despite this, HDMI is more common. It’s often standard on inexpensive monitors and on laptops.
Despite being technically inferior in some ways, HDMI is more than adequate for most users. It's a simple, easy plug that can handle high display resolutions. Its downsides, such as the inability to daisy-chain and shorter cable lengths, usually aren't a concern.
Most computers now have wireless Internet available, yet Ethernet persists and is used in millions of homes worldwide. This simple connector, which looks a bit like a phone jack, has served the needs of networks for three decades.
Ethernet is most often used to connect to the Internet. It usually doesn’t offer any effective bandwidth advantage because the bandwidth of a strong wireless connection will almost certainly exceed the bandwidth of your Internet connection. Ethernet is more reliable, however. There’s no need to worry about signal interference, concrete walls and other obstructions.
Ethernet’s speed can be used to its full potential on a home network if appropriate routers and cables are used. Two computers networked with Gigabit Ethernet can transfer data at high speeds over relatively long distances.
Copper still rules
Ports matter, and we’ll probably be dealing with them for some time. Wireless bandwidth is now technically capable of handling HD video and can provide excellent data transfer rates, but expensive adapters are still required and reliability isn’t perfect in all environments. Cables, and the ports they plug in to, remain a cheap, reliable, simple solution. Hopefully this guide has helped you understand the galaxy of ports that are commonly used.
[Image credit: USB Connection Port, Nejron Photo/Shutterstock] | 科技 |
Intel Merrifield-powered smartphone coming in 2014, but still no news on 4G LTE connectivity
By Andy Boxall
The bulk of Intel’s Computex 2013 keynote presentation may have been about the new Haswell processor, and its presence in tablet/laptop (Taptop? Lablet?) two-in-one devices, but VP for Sales and Marketing Tom Kilroy also talked briefly about the company’s mobile plans. Specifically, he introduced a reference design smartphone running the new Merrifield mobile processor.
Merrifield is the successor to the Medfield and Clover Trail+ chips currently used in Intel’s smartphones, and is designed to take on the might of the latest ARM-based chips found in the majority of phones around the world. Kilroy claimed the Merrifield chip will offer a 50 percent performance improvement over the current generation mobile Atom chips, and would use four times less energy too.
The Merrifield reference design phone Kilroy pulled from his pocket is coming out next year, and he promised we would see the first examples at Mobile World Congress 2014. Intel found several buyers for its Medfield reference phone, including Indian manufacturer Lava, and the UK network Orange, so it's reasonable to expect the Merrifield device to be snapped up by someone prior to the show.
However, there is still a question mark over when Intel smartphones will incorporate 4G LTE connectivity. Kilroy admitted, during a Q&A session after the presentation, that not having 4G LTE was the reason Intel phones weren’t making any headway in the U.S., as carriers aren’t particularly interested in stocking them without it. Intel does have plans to bring 4G LTE to its tablets, which it will do by using the new 22nm Bay Trail-T chips, based on the Silvermont architecture, which will be out by the end of this year.
As Merrifield chips are also built on Silvermont architecture, perhaps it won't be long until 4G LTE comes to Intel-powered phones, although it wasn't clear if the 4G radio would be integrated into the Bay Trail-T, or if it would be separate – the latter being almost essential if it's to be used in a phone. Sadly, no further information was shared on Intel's 4G mobile plans, so it looks like we'll have to wait until MWC next year for more details.
A Modest View of Some Technical Issues
Ronald L. Larsen
U.S. Defense Advanced Research Projects Agency
(DARPA)
rlarsen@darpa.mil
D-Lib Magazine, April 1997
On March 9-11, 1997, the National Science Foundation (NSF) sponsored a
"Planning Workshop for Research in Distributed Knowledge Environments
(DKE's)." This story is based on one of two plenary papers given on March
10, 1997. The
second was given by William Y. Arms and also
appears in this issue. All slides, transcripts, and workshop notes will be
made available shortly by the University of Michigan, School of Information.
Herein I present a modest examination of seven technical assumptions which I believe have historically and substantially influenced the development of networked information systems. I suggest that these assumptions still influence our ability to conceive of innovative designs and services, while their validity is increasingly becoming open to challenge. Especially in planning long term fundamental research, we need to step beyond the constraints of the past and the present, and examine afresh the options for the future.
My intent here is to consider "challengeable" assumptions. I make no claim of completeness, nor even of appropriateness. Instead, I hope to challenge you to refine my list or to come up with your own. Regardless of the specific list, the underlying purpose here is to explore the effects of relaxing some of our long-held assumptions, and to consider also the potential counter-effects, or unanticipated outcomes, which may result.
Assumption 1: Computing and communication resources are scarce and inflexible
When resources are scarce or costly, their utilization is carefully managed, monitored, and often mediated. Access to commercial information services from university and corporate libraries, for example, is frequently mediated by a reference librarian. This is primarily due to a combination of cost, complexity, and variety of the underlying services available. Mediation by a trained professional is seen as the means of providing value-added, patron-oriented services while controlling the costs of on-line services. Queries to these systems can be quite cryptic and laborious to construct, and responses may be voluminous and costly (particularly for the ill-formed or ill-informed query).
Relaxing this assumption enables rethinking the manner of interaction between the user and the information source, with the potential for removing the need for a professional mediator. These older systems are built on an underlying assumption of a narrowband, text-based query interface. The typical query is a sequence of, perhaps, 20 - 50 characters which has its roots in dial-up technologies capable of delivering a few hundred characters per second. But today's networks and the networks of the future are many orders of magnitude faster than this. Broadband, active networks accessed through high performance workstations offer the potential of semantically and contextually rich query expression and interaction with the information space.
The corollary to this relaxed assumption is that "more is better," that more bandwidth and higher levels of performance will necessarily improve one's information access. But common experience on the Web suggests the situation is more complex. Duplex bandwidth not only expands the user's access to potentially useful information, but also expands the user's availability (and potential vulnerability) to others. The risks include increased exposure to materials of marginal interest, as well as materials with no enduring value whatsoever, such as junk mail.
As John Cherniavsky indicated in his remarks, it is not hard to envision a world 20 years from now in which computing and communication resources will be essentially unlimited, particularly in comparison to that which is commonly available today. In 1980, I participated in a National Aeronautics and Space Administration/American Society for Engineering Education (NASA/ASEE) summer study considering the progress that could be achieved in space exploration and utilization over a 50-year time period, unconstrained by the fiscal realities of the time. The only constraints imposed were those of scientific and engineering discipline. Namely, anything proposed had to be rigorously investigated and substantiated for feasibility (not affordability). At first blush, this may seem a somewhat ludicrous approach. But the strategy served to clear our collective consciences of traditional resource constraints, particularly time and money. The only remaining constraint was the intellectual power required to envision a different future. What emerged was an extraordinarily creative exploration of interstellar exploration, lunar and asteroidal mining, and earth resources monitoring. It is this kind of attitude and approach which may shed the greater light on the future of distributed knowledge environments.
I suggest that computing and communication resources are not scarce in the future we envision. I also suggest that money is not the constraining resource we typically assume. (Hmmm, have I really gone off the deep end?) I contend that this world is idea-constrained, not resource-constrained. If we put a dynamite idea on the table, one that sweeps others away, the resources will be there to support it.
What kinds of ideas have this potential? Resource constraints introduce intermediation in information retrieval -- intermediation to create viable queries that result in manageable lists of "hits." What happens if we are as unconstrained in our ability to state a query as we are in response to getting the material back. What if we could pose queries that contain not only descriptions of the subject matter being sought, but also the context of the inquiry and the type of information being sought? If one could do that in a much richer way than we are able to do today, then, perhaps, our information retrieval systems would be sufficiently well-informed as to our needs to avoid 200,000 responses to a simple query.
Assumption 2: Metrics focus research productively.
The effect of this assumption is that incremental advances dominate community attention, leaving qualitative breakthroughs at risk -- tantamount to buying research as commodity yard goods. The result is that once-useful metrics, such as precision and recall, bias continuing research in information retrieval, despite the fact that these metrics arose in the context of batch processing.
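For reference, the two classic measures are conventionally defined over a single query's result set as follows (a standard formulation added here for context; the notation is mine, not Larsen's, with R the set of relevant documents and A the set actually retrieved):

$$\mathrm{precision} = \frac{|R \cap A|}{|A|}, \qquad \mathrm{recall} = \frac{|R \cap A|}{|R|}$$

Both were designed to score a fixed, batch-mode result set, which is part of why they transfer awkwardly to interactive, heterogeneous environments.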
The challenge here is to relax our traditional approaches to metrics, to seek ways of transforming these familiar points of light into expanded fields of dreams, to find new metrics more appropriate to global, heterogeneous, interactive environments. The objective is to look beyond the familiar community-wide measures of performance and to qualitatively expand potential areas for exploration. But the risk is that inadequate charts of the new territory leave both explorers and pioneers at risk. Metrics are required, but to rigidly bind a community to metrics grounded in prior generations is to foreclose serious exploration of qualitative breakthroughs.
Assumption 3: Better search engines yield better search.
Search engines owe much of their historic development to an implicit assumption of a well-organized, relatively homogeneous collection (the type of collection one would typically find in a library or commercial abstracting & indexing database, for example). The Web violates this assumption. Information sources and resources on the Web are highly diverse, distributed, and heterogeneous, with greatly varying content and quality. The "end-game strategy" of search (alternatively viewed as the "hunter/gatherer" model of information seeking) loses its effectiveness as information volume and source heterogeneity grow. Increased document and information density resists discrimination by traditional search technologies.
Relaxing this assumption suggests considering other "orthogonal" attributes of the information space, such as context-based value and trans-media semantic similarity measures. Understanding search as the end-game exposes the assumption that the user has gotten to the point where specific results can be specified, sought, and identified. It raises the question about what the opening game and mid-game might be. Making a stronger search engine merely focuses more intently on the back end of the information seeking process, when the more striking contemporary problems may exist at the front end. Metaphorically speaking, if we think of search engines as magnifying lenses passing over piles of sand, looking for just the right grains of sand, the inevitable result of dramatic increases in the quantity of sand is that increasing quantities of sand will meet the selection attributes of the lens, resulting in potentially many more grains of sand of marginal relevance within the field of view. As information density increases, there is little more that a search engine can do than to register all of the objects which share a terminological attribute with the stated query.
So, what is the opening or mid game? I don't claim extraordinary insight here, but I do look for orthogonal dimensions to the problem. Consider value-based measures, for example. Can we imagine an information retrieval environment which considers the context of the user's needs? Can we envision trans-media semantic similarity measures, in which the intellectual content of an image, a graph, or a formula would weigh as heavily as the words used in the text? Can we deploy a network-based peer review process comparable to that upon which our traditional scholarly journals depend?
What do we risk in considering such factors? That information seekers will need to confront increasing complexity with a degree of increased sophistication. The well-known paradigm of search would give way to a rich toolbox of filters on orthogonal measures of content, context, and value.
Assumption 4: The objective is to find the correct answer.
If one assumes that the typical user is seeking the answer to a well-formed question in a global information space, then one can be led down a path of ever-increasing complexity involving content- and context-sensitive multi-dimensional, trans-lingual search among semantically interoperable heterogeneous repositories with result ranking, relevance feedback, and so on. The inevitable result is that system complexity, intended to serve the user's needs for more refined information tools, instead confounds all but the most sophisticated users in their well-intentioned search for information. Increased complexity of search tools is not likely to significantly assist the average Web searcher, whose queries rarely include more than two terms.
Recasting the objective to perceiving information spaces at variable resolutions and levels of abstraction may serve the needs of many Web-based information seekers more effectively. Such an approach recasts the objective from one of finding an answer to one of understanding an information space. The risk in such an approach is that the focus may shift to browsing haystacks when the requirement demands seeking needles.
So is the typical user of networked information seeking the "right answer"? Perhaps, but in most cases, I would surmise not. Assuming so leads to complexity beyond anything we ever had before (recall that the typical query on the Web is only one or two words).
Relaxing this assumption involves recasting the objective and the practice of seeking information as a process of working through levels of abstraction, rather than attempting to zero in and drill down to a particular piece of information. The alternative is to present to the user a Gestalt view of the information space, and to provide a sense of the way it is laid out, rather than jumping right to the end game of search. "Fly-through" metaphors come to mind as an alternative, but these raise immediate questions of the dimensions and character of the field of view, in addition to the means by which the user does, indeed, drill down to the materials most relevant to the problem being addressed.
Assumption 5: The correct answer lies in the information.
If one assumes that an answer exists, and that answer can be found in the body of information being searched, then this leads to a focus on information artifacts. As a result, correlations that require collaborative expertise among individuals interacting with information may be missed. Relaxing this assumption leads to a requirement for seamless interoperability among searching, authoring, and collaboration facilities, with the derivative requirement for these capabilities to be integral to Distributed Knowledge Environments (DKEs). One of the open problems here is to satisfy the complex quality of service (QoS) requirements for fixed and mobile, synchronous and asynchronous interoperation. Some of these requirements are being explored further in DARPA's programs in advanced networking, global-mobile communication systems, and intelligent collaboration and visualization.
Distributed knowledge environments are composed, at least, of collaboration, information analysis, and authoring facilities. If a user is to find, interact, and collaborate with both information resources and people in a network environment, then quality of service issues become very significant, as the system must seamlessly integrate synchronous as well as asynchronous sources of information and services.
Assumption 6: Search is the place to start.
The effect of this (historically valid) assumption is to focus, prematurely, on inadequate analytic tools for global, distributed, heterogeneous information sources. The result is that despite its potential, the Web remains largely unusable for vast numbers of serious professionals. Relaxing this assumption requires exploring new metaphors and algorithms for hierarchical abstraction, analysis, and visualization. Rigorously ill-defined but instinctively appealing concepts such as semantic signal processing suggest directions to pursue here. Early investigations in this direction indicate the need for new mathematical concepts and constructs. An emerging view is to consider the expert information analyst as a master craftsman, armed with a library of analytic, discrimination, and visualization tools to explore n-dimensional information spaces, seeking appropriate chunking, correlation, and visualization primitives. The risk is obvious: the theoretical foundations may be too weak. But this is not cause to be daunted, but, instead, cause to develop the necessary foundations, theories, and techniques.
Relaxing this assumption requires being open to new metaphors and algorithms for hierarchical abstraction, analysis, and visualization. The search metaphor (which I will characterize as a one-size-fits-all solution), for example, may need to make room for a more flexible metaphor (such as the toolbox metaphor used in signal processing). A richer set of tools gives the analyst a richer opportunity for discrimination along dimensions relevant to the immediate problem. A good expert, using the right kind of tools, could discriminate the signal (and, hence, the information) they need from the vast and diverse resources available.
So, we have begun wrestling with the question of how some of these ideas from signal processing and related disciplines might influence our thinking on the future of digital libraries and information retrieval, and have begun playing with concepts like "semantic wavelets" and "semantic signal processing", with little rigor behind what these terms actually mean. Casual reflection on the process we are engaged in suggests an evolutionary process, for which the immediate next steps are to transform the information professional from a hunter/gatherer to a master craftsman.
Assumption 7: Distributed Knowledge Environments (DKEs) are for everyone.
This assumption is very attractive to those seeking justification for federal investment. The result, however, is an inordinate emphasis on near-term results for low-end users, at the expense of long-term progress derived from the strategic opportunities represented by high-end users. Progress at the high-end enables the longer term mass deployment of new capability, and truly challenges the technology. Relaxing this assumption opens the way for building DKEs for elite teams which are highly mobile and distributed. While these are the technical challenges which are strategically relevant, they also risk a perception that DKE's focus on the elite requirements of the highly trained, rather than the broad-based requirements of the masses. The oft-neglected secondary effect, however, is that those capabilities developed for elite teams quickly become part of the infrastructure available to all.
Conclusion: Out of the Box
We are in Santa Fe because we need people who can think out of the box. And we need your help to identify the opportunities, clarify the challenges, and define the "generation after next" tools. I recall Bill's comments on "the book" - of course, we all still carry books. On the way out here I was reading Geoffrey Nunberg's The Future of the Book. Despite the title, little is said about the technological future of that artifact which we now know as "the book." Why, for example, must it be embodied exclusively in paper? What precludes a book from having digitally-rendered representations of its content between its covers? Just as a network-delivered digital rendition of an intellectual work can contain a wide diversity of materials, from text through multi-media, what precludes a more traditional print text from including materials in an appropriately encapsulated rendering, say, on the inside of the cover? Or on a digitally active paper? Ultimately, can we envision a book-like artifact that is, in fact, independent of paper? I firmly believe that the future has room for digitally rendered, as well as physically rendered (as in "books") containers for the intellectual output of humans. But whereas some would cast the digital artifact in counterpoint and in competition with the physical artifacts, it seems more likely to me that these will increasingly comprise a broad spectrum of information resources, more blurred by similarity than distinguished by difference. Our distributed knowledge environments must inevitably reflect both this continuum and this diversity.
hdl:cnri.dlib/april97-larsen
Business 0 Updated at 1:15 pm, March 4th, 2013 By: ABC Digital Share This Story
We MatchediStockphoto/Thinkstock(WASHINGTON) — The White House has responded to an online petition to make cellphone unlocking legal, and that should make the 100,000-plus people who signed it very happy. The Obama administration says it’s time to legalize the practice, which lets you to take your phone with you if you switch carriers, but was banned in January by the Copyright Office and the Library of Congress.
“The White House agrees with the 114,000+ of you who believe that consumers should be able to unlock their cell phones without risking criminal or other penalties,” R. David Edelman, White House Senior Advisor for Internet, Innovation and Privacy, wrote on the White House Petitions Blog. “In fact, we believe the same principle should also apply to tablets, which are increasingly similar to smart phones.” Typically, if you sign a contract with a wireless carrier, you get a phone at reduced (or no) price as long as you stay with them. Unlocking your phone involves a software alteration and requires a new SIM card — not a big deal if it hadn’t been banned. The White House says the ability to bring your phone to another carrier or network is “crucial for protecting consumer choice” and is important in making sure America maintains its “vibrant, competitive wireless market.” The Obama administration said it is now planning to address the issue and would support legislative fixes to say that “neither criminal law nor technological locks should prevent consumers from switching carriers when they are no longer bound by a service agreement or other obligation.” FCC Chairman Julius Genachowski also supports the efforts. The “ban raises competition concerns; it raises innovation concerns,” Genachowski said last week. Unlocking a cellphone was legal until Jan. 26. In October 2012 the U.S. Copyright Office and Library of Congress decided to remove phone unlocking as an exemption under the Digital Millennium Copyright Act (DMCA). Edelman mentions in his post that the Library of Congress agrees that the process for the exemption is a “rigid and imperfect fit for this telecommunications issue.” “Both the Librarian of Congress and the Register of Copyrights value our colleagues in the administration and the thoughtful discussions we have had with them on this issue,” the Library of Congress said in a statement Monday. “We also agree with the administration that the question of locked cell phones has implications for telecommunications policy and that it would benefit from review and resolution in that context.” Sina Khanifar, the 27-year-old Californian who started the petition on the White House “We the People” website, said, “This is a big victory for consumers, and I’m glad to have played a part in it. A lot of people reacted skeptically when I originally started the petition, with lots of comments to the effect of ‘petitions don’t do anything.’ The optimist in me is really glad to have proved them wrong. The White House just showed that they really do listen, and that they’re willing to take action.” Bradley Shear, a social media and technology attorney in the Washington, D.C. area, said that this just isn’t a simple reversal. “I think it’s a very sound approach that the administration is taking,” Shear said. “But due to the stalemate in Washington over other matters, it might be very hard to pass a bipartisan legislative fix to this issue.” Still, Shear said he believes the involvement of the White House and the FCC is a big move for the phone unlocking movement. “This is a game-changer, now that it has the attention of all the powers that be. It will now be a much larger conversation about this issue,” he added.
Mesoplodon bidens is a toothed whale and can be recognised as such by the single blowhole and the presence of teeth (rather than baleen). It is a member of the beaked whale family with the characteristic V-shaped crease on the throat and the short dorsal fin set relatively far back. Sowerby's beaked whale is a small beaked whale that can reach up to 5.5 m in length. The lower jaw has a single pair of teeth (exposed only in adult males). The forehead rises at a shallow angle and has a slight bump. It has a distinct beak and the mouthline is curved down at rear. Sowerby's beaked whale has a charcoal grey dorsal and lateral colouration with a lighter belly. Adults may also have light grey spots on the body and are often covered with scratches and scars.Sowerby's beaked whale may be confused with True's beaked whale Mesoplodon mirus but can be recognised by a slight bump on the forehead and a slightly longer beak. Sowerby's beaked whales are usually found either alone or in groups of up to 10 individuals. Little is known about their behaviour although tail-slapping has been recorded. Dives may last up to 15 minutes long (Kinze, 2002).
Dana Campbell selected "Description" to show in Overview on "Mesoplodon bidens (Sowerby, 1804)".
Mesoplodon bidens (Sowerby, 1804)
Appears under "Comprehensive Description"
©1998-2011, The Marine Biological Association of the United Kingdom
Marine Life Information Network
Editor: Dr Harvey Tyler-Walters
Author: Morvan Barnes
Morvan Barnes 2008. Mesoplodon bidens. Sowerby's beaked whale. Marine Life Information Network: Biology and Sensitivity Key Information Sub-programme [on-line]. Plymouth: Marine Biological Association of the United Kingdom. [cited 26/01/2011]. Available from:
Microsoft's indie game parity clause remains in place so that Xbox One owners feel "first class", Xbox boss Phil Spencer has stated.Speaking on the Inner Circle podcast, Spencer said the policy was designed to dissuade developers from leaving Xbox One versions of their games until later - something which leaves owners of Microsoft's console feeling left out.Microsoft-owned Minecraft will continue to be updated for other platforms, although it's unknown whether Xbox versions will gain unique features."I'll be honest, the thing I worry about is I look at all the people who buy an Xbox and invest their time and money in Xbox One," Spencer said. "Millions of people own Xbox One and I want those people to feel like they're first class, because they are. "When a third party game comes out it comes out on all platforms at the same time. When indie games come out, I want them to come out and I want Xbox to to feel like it is a first class citizen when an indie game launches."The parity clause is designed so that indie games do not appear on a different platform first before Xbox One. Or, if they do, that the version that launches for Xbox is different somehow."For me the parity thing is, if you own Xbox One I want to work for you to make sure that when great content launches, if it's coming to Xbox One and another platform, you get it at the same time as everybody else does," Spencer added. But this does not always have to apply, he conceded. When it is a matter of resources - when a small studio simply does not have the budget or staff to juggle simultaneous launches on multiple platforms - Microsoft is open to a discussion."I have a lot of friends who run small indie studios, and I get that timelines around when... they just can't get both games done at the same time or all 3 games, 4 games depending on how many platforms they're supporting," Spencer continued. "So I [have been] just saying 'let's have a conversation', and it's worked. Today, I think we've done a good job working with the indies when they have had strict parity concerns if it's just a dev issue for them." The purpose of the parity clause is not to penalise indie developers who don't enough the resources for an Xbox One version, he concluded, but to discourage studios who might actively choose to release an Xbox One edition at a later date (for example if they have signed an exclusivity deal with Sony)."I don't want somebody to come in and just think 'I'm going to go do a special game on one platform and then I'll get to Xbox whenever I get to it.' I don't think that's right," Spencer concluded."As Xbox one customers we want good games as they come out on both platforms. But I also get that for some guys they just can't afford the time to get both done. So we have entered into the conversations with people as they are launching it and I feel pretty good about the plan." Buy Xbox One from Amazon [?] Games featured in this article
About Tom Phillips
Tom joined Eurogamer in 2010. He writes lots of news, some of the puns and all the stealth Destiny articles. @tomphillipsEG on Twitter. From the web Sponsored links by Taboola
The 10 most popular stories of the day, delivered at 5pm UK time. Never miss a thing. Comments (229)
Should you add an SSD to your Xbox One? What you need and what it'll get you. Loading... hold tight! | 科技 |
Microsoft Gives IP Indemnity to Its Embedded System OEMs
By Peter Galli | Posted 2006-02-09
Redmond says it intends to strengthen and broaden the intellectual property protections it offers manufacturers and distributors.
Microsoft has moved to strengthen and broaden the intellectual property protections it offers those original equipment manufacturers and distributors across the globe that build and sell devices powered by its Windows Embedded and Windows Mobile software. The Redmond, Wash.-based software maker will announce Feb. 9 that this strengthened IP protection now includes the defense of OEMs and distributors against IP claims in every country Microsoft distributes or markets its Windows Embedded and Windows Mobile products; protection of patent, copyright, trademark and trade secret claims based on that software; and the removal of the monetary cap related to defense costs.
That move brings the IP protections provided to the more than 4,000 OEMs and distributors of Microsoft's embedded mobile software, which is used in phones, ATM machines, retail point-of-sale systems, GPS systems, industrial robots and thin clients, in line with those already offered for Microsoft's server and client products, partners and customers, David Kaefer, the director of Microsoft's Intellectual Property Licensing Group, told eWEEK in an interview.
The move was welcomed by Boris Metlitsky, the senior vice president of product strategy and development at Symbol Technologies, who said that this is another tool to speed up the deployment of its latest enterprise mobility solutions and provide its customers with technologies that increase their competitiveness in their respective markets.
"By extending IP protection to the embedded and mobile device manufacturing space, Microsoft is helping ensure the integrity of our offerings and is allowing us to focus on the next wave of innovations," Melitsky said in a statement.
This latest move was part of Microsoft's long-term plan to standardize its indemnification for channel partners and end customers, Kaefer said, adding that the process has been under way since 2003, when Microsoft removed monetary caps for its volume licensees and customers, after hearing that this was the top customer satisfaction issue for them.
"Then, in November 2004, we offered all customers the level of indemnification we were offering our volume licensees. We then switched gears into the partner community and realized that we had all these varying levels of indemnification depending on whether you were an OEM or a system builder or an ISV partner, and by last summer we had brought just about everyone up to one common bar, but we had difficulty slipstreaming in a lot of the folks in the embedded world," he said.
Read more here about Microsoft's move to indemnify its PC OEM partners against IP attacks.
The biggest challenge for Microsoft with regard to the embedded market was that it needs great flexibility in the type of software it puts on devices and often needs to modify that software, in large part because many of the devices had small footprints and the software has to be fitted onto that.
"This speaks to why indemnification is so important for this channel: It is one where we allow changes to our software and this makes offering indemnification more challenging," Kaefer said.
Partners were covered under this latest indemnity unless "the partner alters a piece of source code and that piece of source code is the basis of an infringement claim. While the partner is still indemnified for all the unmodified source code, Microsoft does not indemnify them for the modified source code," he said.
This was reflected in the language used in the indemnity policy, which states that Microsoft's obligations "will not apply to the extent that the claim or adverse final judgment is based on your altering the covered software," Kaefer said.
So, if a partner alters source code block X and then an IP holder makes a claim that source code block X infringes its rights, this would not be covered because the partner has authored the allegedly infringing code and not Microsoft, he said.
Peter Galli has been a financial/technology reporter for 12 years at leading publications in South Africa, the UK and the US. He has been Investment Editor of South Africa's Business Day Newspaper, the sister publication of the Financial Times of London. He was also Group Financial Communications Manager for First National Bank, the second largest banking group in South Africa, before moving on to become Executive News Editor of Business Report, the largest daily financial newspaper in South Africa, owned by the global Independent Newspapers group. He was responsible for a national reporting team of 20 based in four bureaus. He also edited and contributed to its weekly technology page, and launched a financial and technology radio service supplying daily news bulletins to the national broadcaster, the South African Broadcasting Corporation, which were then distributed to some 50 radio stations across the country. He was then transferred to San Francisco as Business Report's U.S. Correspondent to cover Silicon Valley, trade and finance between the US, Europe and emerging markets like South Africa. After serving that role for more than two years, he joined eWeek as a Senior Editor, covering software platforms in August 2000. He has comprehensively covered Microsoft and its Windows and .Net platforms, as well as the many legal challenges it has faced. He has also focused on Sun Microsystems and its Solaris operating environment, Java and Unix offerings. He covers developments in the open source community, particularly around the Linux kernel and the effects it will have on the enterprise. He has written extensively about new products for the Linux and Unix platforms, the development of open standards and critically looked at the potential Linux has to offer an alternative operating system and platform to Windows, .Net and Unix-based solutions like Solaris. His interviews with senior industry executives include Microsoft CEO Steve Ballmer, Linus Torvalds, the original developer of the Linux operating system, Sun CEO Scott McNealy, and Bill Zeitler, a senior vice president at IBM. For numerous examples of his writing you can search under his name at the eWEEK Website at www.eweek.com.
Google Cloud Platform Gains HIPAA Agreement SupportGoogle Cloud Platform has added support for HIPAA-mandated Business Associates Agreements (BAAs), which will help health care organizations comply with the Health Insurance Portability and Accountability Act when using the Google Cloud Platform for their applications.
The BAA support, which will allow organizations to build and deploy their health care applications on the Google Cloud Platform and comply with the HIPAA requirements, was announced by Matthew O'Connor, product manager for the unit, in a Feb. 5 post on the Google Cloud Platform Blog.
"To serve developers who want to build these applications on Google's infrastructure, we're announcing support for Business Associates Agreements (BAAs) for our customers," wrote O'Connor. "A BAA is the contract between a Covered Entity (you, the developer) and their Business Associate (Google) covering the handling of HIPAA-protected information."
HIPAA is the federal law that establishes standards around privacy, security and breach notification in the handling and storage of health care records for patients. The establishment of HIPAA regulations has meant that patient records are more secure and must be handled with more care by health care organizations.
"When you're building a healthcare-related application, not only do you need the right code and a reliable user experience, sometimes it feels like you need to be a lawyer too" since the rise of HIPAA rules, wrote O'Connor. "Often, there are several additional steps to take into consideration."
When using Google Cloud Platform for applications, that can mean even more checklists and compliance issues.
"When building in the cloud, it can be challenging to ensure that you're complying with these regulations," wrote O'Connor.
In 2013, Google began entering into BAAs with Google Apps customers to support their HIPAA-regulated data, according to O'Connor.
Google already features compliance steps in its Cloud Platform and Google Enterprise services for other business users who require it, according to O'Connor, including ISO 27001, which is one of the most widely recognized, internationally accepted independent security standards. "After earning ISO 27001 for Google Apps in 2012, we renewed our certification again last year for Google Apps and received the certification for Google Cloud Platform," he wrote.Google Cloud Platform Gains HIPAA Agreement SupportGoogle Cloud Platform also includes support for the SOC2, the SSAE 16 Type II audits and for its international counterpart, the ISAE 3402 Type II audit, so that users can document and verify the data protections in place for their services, wrote O'Connor.
"We've successfully completed these audits for Google Apps every year since 2008 (when the audits were known by their previous incarnation, SAS 70) and we did so again last year for Google Apps and Google Cloud Platform," he wrote.
Google is always working to expand the customer features of its Google Cloud Platform offering. In November 2013, Google began a program to show game developers how they could build scalable games using the platform so they could expand their games to more users as needed.
In late October 2013, Google replaced its old Google API Console with a new, expanded and redesigned Google Cloud Console to help developers organize and use the more than 60 APIs now offered by Google. The new Google Cloud Console makes managing the over 60 Google APIs housed within easier than ever, according to Google. Soon the new cloud console will be set as the default choice for the console by Google, though users will have the ability to revert back to the old version.
Also in October, Google released several technical papers to help cloud developers learn more about the development tools it offers through its Google Compute Engine services. The papers, "Overview of Google Compute Engine for Cloud Developers" and "Building High Availability Applications on Google Compute Engine," offer insights and details about how the platform can be used and developed for business users.
In September 2013, Google unveiled its second version update of the Google App Engine since August, with the latest release 1.8.4 including a host of features that the company says will make it more flexible and simpler for developers to use for their applications. Included in 1.8.4 is support for Dynamic Web Projects in Eclipse to better support Google Cloud Endpoints and App Engine Backends, as well as fixes for several bugs. One other important new feature is the ability of Google App Engine to handle differential snapshots of a Google Compute Engine persistent disk, so that only the most recently changed data is updated.
The August launch of the previous App Engine 1.8.3 was also accompanied by deeper features for Google Compute Engine and the Google Cloud Datastore as the search giant continues to add functions and robustness to the Google Cloud Platform.
The new tools included Layer 3 load balancing for Google Compute Engine and improvements to the PHP runtime in the latest Google App Engine release. The Layer 3 load balancing capabilities were a key addition in the Google Compute Engine, to provide Google-scale throughput and fault tolerance to manage Internet applications.
In July 2013, Google unveiled several new features in the Google Cloud Storage environment to make it easier for developers to manage, access and upload data into the cloud. Those new capabilities included automatic deletion policies, regional buckets and faster uploads as part of a wide range of services.
In June 2013, Google unveiled a new Cloud Playground environment where developers can quickly try out ideas on a whim, without having to commit to setting up a local development environment that's safe for testing coding experiments outside of the production infrastructure. The Cloud Playground is slated as a place where application developers can try out all kinds of things, from sample code to viewing how production APIs will behave, in a safe, controlled place without having to manage the testing environment, according to Google. The new Cloud Playground initially supported only Python 2.7 App Engine apps.
Kent is a former chief executive of Universal McCann, a media-buying unit of the Interpublic Group that is not connected to Universal Music. Neville Hobson, a spokesman for SpiralFrog, said the company hopes to pursue licensing deals with the other major record companies � Sony BMG, EMI and Warner Music � to augment its deal with Universal Music, a unit of Vivendi. SpiralFrog did not disclose the terms of its licensing agreement with Universal Music. A spokesman for the company said Universal would be compensated for the use of its copyrighted songs by sharing in advertising revenue, but gave no details. Universal�s many record labels control about one-quarter of the worldwide market for recorded music. The license agreement with SpiralFrog includes all recordings to which Universal has North American rights. Given the fragmentation of the digital music business � the hundreds of would-be challengers to iTunes mainly have minuscule shares of the market � analysts said that new services like SpiralFrog would face difficult challenges, despite the lure of �free� music. �Few service providers are currently in a position to provide the large audiences that advertisers require, and few pure music providers have the heritage of building a business funded by advertising,� said Michele Mackenzie, principal analyst at Ovum, a telecommunications and Internet consulting firm. The music industry must also manage its relationship with Apple carefully, analysts said. SpiralFrog took pains to discourage talk that its free-with-advertising model would threaten Apple�s pay-per-song service. �This is certainly not being pitched as a challenger to iTunes,� Mr. Hobson said. �It�s a very different model. It�s complementary to iTunes.� http://www.nytimes.com/2006/08/29/bu...syahoo&emc=rss All times are GMT -4. The time now is 10:12 AM. A TV Guide Digital Network Site -- ©2002-2013 by TVGuide Online, Inc. | 科技 |
2016-40/3982/en_head.json.gz/8513 | Geo-referenced database
Dams, and their associated reservoirs, provide the ability to store water for later use, provide hydropower and provide some level of protection from extreme precipitation events. If designed correctly, dams allow water to be available at times when in its absence it would not be available, therefore increasing exploitable renewable water resources. This is particularly important for countries in which the available water during the wet and dry seasons varies significantly. Dams may also allow for the excess runoff that would normally flow to the ocean without being used to become available for use. However, dams and reservoirs, especially large ones, also can have negative impacts on human societies, requiring resettlement and leading to social disruption. Dams also change the river network and flow regulation is considered one of the main negative ecological consequences of dams and reservoirs. Also, stored water may evaporate at a greater rate than free-flowing water. In short, dams have pros and cons, such that their design characteristics need to be evaluated carefully.
AQUASTAT gathers detailed information about dams in each country during country update processes. AQUASTAT�s data was an important input into the Global Reservoirs and Dams (GRanD) database, especially for African dams. The work on this database was coordinated by the Global Water System Project, in partnership with several organizations. An article has been published in 2011 in the Journal Frontiers in Ecology and the Environment.
Total dam capacities are introduced into the AQUASTAT main country database, and additional details are provided through this page. This interactive visualization has been prepared to provide a sense of context to the global distribution of dams and the variability in their capacities. Please click on the image to the left using the Chrome, Firefox 4 or Webkit Nightly web browsers. The cross-browser limitation is explained by the fact that this visualization was prepared using Chrome experiments.
AQUASTAT has information on around 14 500 dams worldwide, but this visualization of course can only include dams that have values for 'latitude', 'longitude' and 'capacity', which are around 8 300. However, all dams are listed in the files below.
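For anyone working with these downloads programmatically, a minimal sketch of that filtering step might look like this (the file name and column labels are assumptions for illustration, not the actual AQUASTAT schema; the workbooks place the data on the first sheet and the legend on the second):

```python
import pandas as pd

# Hypothetical file name; substitute the regional or country workbook you downloaded.
dams = pd.read_excel("aquastat_dams.xlsx", sheet_name=0)

# Keep only dams usable in the visualization: latitude, longitude and
# capacity must all be present.
mappable = dams.dropna(subset=["latitude", "longitude", "capacity"])
print(len(dams), "dams listed;", len(mappable), "have coordinates and capacity")
```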
Region | Geo-referenced dams database | Notes and References | Dams in GeoNetwork | Dams in Google Earth
Middle East (West Asia) | | | |
Central Asia | | | - | -
Southern and Eastern Asia | | | - | -
Europe | | | - | -
Oceania | | | - | -
Northern America | | | - | -
Central America and Caribbean | | | - | -
Southern America | | | - | -
(A "-" marks an item that is still a work in progress.)
Geo-referenced dams databases: The dam databases are provided as Excel files. Each file has two sheets: the first sheet contains the database and the second sheet contains the legend. These databases, in their present format, are neither complete nor can be considered error-free. They correspond to the best available information at the time of the study.
Notes and References: Explanatory document providing specific information about the references used, and brief notes on the more complicated dams contained in the Excel spreadsheet.
Dams in GeoNetwork: GeoNetwork is an open-source repository that allows users to share geographically referenced thematic information between different organizations.
Dams in Google Earth: A Google Earth .kml file, suitable for viewing basins, rivers and dams in Google Earth. Click on each point for information about each dam. *
Analysis: A document in which the importance of dams is reflected upon. *
* Due to the complexity involved, published reports and maps are not updated frequently and may not reflect updates. The files with information on dams for each country can also be downloaded separately.
Evaporation from artificial lakes and reservoirs
Dams and their associated reservoirs provide many services, including water storage, flow regulation, navigation, hydropower, in-stream and off-stream uses, flood protection, amongst others. However, these artificial lakes and reservoirs evaporate more water than the natural surface water flow before the dam was built, because dams generally increase the surface area of the body of water. This means that more water is exposed to air and direct sunlight, thus increasing evaporation. This "lost" water is referred to as consumed, because it is removed from the system. In some cases, this water consumption can be quite substantial.
Due to its importance, AQUASTAT has estimated the evaporation for all artificial lakes and reservoirs that are available in the above geo-referenced dams database. This exercise is a very rough estimation, with many limitations, and it thus should be considered as an 'order of magnitude' study only. For more information on the methodology, see the technical note prepared on Evaporation from artificial lakes and reservoirs. As always, AQUASTAT welcomes feedback which would help improving the information provided.
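As a rough illustration of what such an 'order of magnitude' estimate involves (the figures below are invented for the example and are not AQUASTAT data):

```python
# Order-of-magnitude evaporation estimate for a single reservoir.
surface_area_km2 = 250.0          # assumed reservoir surface area
evaporation_mm_per_year = 1500.0  # assumed local open-water evaporation rate

# 1 mm of water over 1 km2 equals 1,000 m3, i.e. 0.001 million m3.
volume_million_m3 = surface_area_km2 * evaporation_mm_per_year * 0.001
print(f"~{volume_million_m3:.0f} million m3 evaporated per year")  # ~375
```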
The figure below shows the evaporation by region, resulting from the study.
Click the chart to magnify. Other charts, showing evaporation from artificial lakes and reservoirs together with agricultural, industrial and municipal water withdrawal, are available in the theme Water uses. More information on the different types of water uses can be found in the technical note Disambiguation of water statistics.
06.10.2013 :: 10:36AM EDT @mthwgeek
Sony controls a number of key IPs for its consoles and handhelds. We’ve already seen what Sony and Polyphony Digital has in store for Gran Turismo, but what about that more mainstream, family-friendly IP LittleBigPlanet?
Media Molecule is busy working away on the PS Vita game Tearaway, so it was unclear whether a new LittleBigPlanet game was in the works for the launch of the PS4 or to help bolster PS3 sales. However, a resume reveals that LittleBigPlanet 3 is in development, but Media Molecule have entrusted the game with Sumo Digital.
It’s the resume of animator/artist James Kearlsey that has given the game away. He lists LittleBigPlanet 3 as a project he has been working on while at the studio. That resume has now been taking down, but not before a screenshot of the listing was captured.
What remains unclear is which console LBP3 is intended for. I’d assume it’s a PS4 title, but then it was easy to assume Gran Turismo 6 would be a next-gen game, too. As the PS3 has already had two LBP games, I’d put money on this being a strong candidate for release on PS4. And if it is, then that’s a strong title to have if it launches during the first year of the new console being on sale.
Sumo Digital also has a great track record of game releases. As well as handling development of the LittleBigPlanet 2 Cross Controller Pack, the studio has also produced Sonic & All-Stars Racing, a couple of F1 games, and managed to produce a number of great Outrun games for Sega. LittleBigPlanet is clearly in safe hands.
Facebook Twitter Linkedin Pinterest Reddit LittleBigPlanetMedia MoleculePS VitaPS3PS4SonySumo Digital Speak Your Mind | 科技 |
Publication - April 1, 2008
Safe reactors are a myth. An accident can occur in any nuclear reactor, causing the release of large quantities of deadly radiation into the environment. Even during normal operation, radioactive materials are regularly discharged into the air and water. The nuclear industry was suffering serious nuclear accidents long before the catastrophic Chernobyl accident in 1986. Today, the industry is still plagued with incidents, accidents and near-misses. Aging of nuclear reactors, embrittlement of metals, corrosion and fatigue are endemic throughout the world’s nuclear industry. At the same time, nuclear operators are continually trying to reduce costs due to greater competition in the electricity market and the need to meet shareholder expectations.
The following examples show the world is never far away from the next nuclear catastrophe.
Japan 1999: Two workers at the Tokai-mura nuclear fuel plant received lethal doses of radiation. A year later, it was revealed that vital safety data and inspections had been manipulated to avoid expensive repairs and long closures. Japan 2004: Despite claims that the nuclear industry and government had adopted higher safety standards, a steam explosion at the Mihama reactor killed five workers. In 2006, a district court ordered the shut down of a nuclear reactor as it could not withstand severe earthquakes. All of Japan’s reactors are sitting on top of one of the world’s most active geological faults. Ohio 2002: A catastrophic accident at the David-Besse reactor was avoided when it was discovered that corrosion had come close to penetrating the vital pressure vessel, which could lead to a complete reactor core meltdown. Ten years earlier, Greenpeace filed a complaint to the U.S. nuclear regulator warning of the risk of corrosion at all U.S. nuclear power plants. The warning was ignored. Following the discovery at David-Besse, the reactor was shut down for two years (costing $600 million US), but then given a licence to operate until 2017.
France 2003: The French nuclear safety agency activated its emergency response centre following torrential rainfall along the lower Rhone River and the emergency shutdown of two reactors (Cruas-3 and 4) due to flood damage.
England 2000: Cumbria's Sellafield nuclear fuel processing site was found by government inspectors to have a fundamental failure of safety culture, but only after public disclosure of violations of quality control and safety standards at its Sellafield MOX Plant. This helped convince the government of Ireland to launch a legal challenge against the British government at the United Nations International Court in Hamburg on the issue of nuclear safety at Sellafield.
These are just a few examples of a global problem. In 2005, Greenpeace updated its international reactor hazards study. One conclusion was that the standard nuclear reactors (light water), the most common type operating today, could release up to 10 times more radioactivity in an accident than the Chernobyl disaster.
Sat May 10th, 2014 10:05pmNews By Chris Winters Herald Writer
OSO — The sounds of the countryside were everywhere.
Birds tweeted in the trees, a breeze blew through the leaves and brush. A rooster crowed. The North Fork Stillaguamish River softly rushed through the valley.
The landscape, however, consisted of a square mile of upended terrain left by the March 22 mudslide that wiped out the Steelhead Haven community, killing 41 people and leaving two missing and presumed buried under tons of earth and debris.
On a small hill on the south edge of the slide area sat a survey station. Dale Topham, the supervisor of Snohomish County’s geotechnical engineering group, and surveyor party chief Carl Hagaman last week were taking measurements of several “targets,” two-foot-square markers, each with a bright red “X,” that were placed on the edges of the new ridge line and down below on piles of debris.
“Looks like Seven is moving around a lot,” Hagaman said. “Every time I look at it there’s a new branch in front of it.”
Target seven was on a tree about halfway up the northeastern edge of the slide. The handheld device that communicates with the survey gun mounted on a tripod showed that the target had moved 5.3 feet to the south and 3.1 feet to the west.
On the other hand, targets at the very top of the slope, three-quarters of a mile away and barely visible, had only moved a couple thousandths of a foot since the last measurements were taken a few days earlier.
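For readers wondering how surveyors turn those two components into a single movement figure, the arithmetic is just a vector sum. Here is a minimal sketch: the 5.3- and 3.1-foot figures come from the measurements described above, while the bearing convention is our own assumption about how such a reading would be reported.

```python
import math

# Survey-target displacement reported in the article: target seven moved
# 5.3 feet to the south and 3.1 feet to the west between readings.
south_ft = 5.3
west_ft = 3.1

# Horizontal displacement is the vector sum of the two components.
magnitude_ft = math.hypot(south_ft, west_ft)

# Direction expressed as degrees west of due south (a common survey bearing).
bearing_deg = math.degrees(math.atan2(west_ft, south_ft))

print(f"Total horizontal movement: {magnitude_ft:.1f} ft")  # ~6.1 ft
print(f"Bearing: S {bearing_deg:.0f}\u00b0 W")              # ~S 30\u00b0 W
```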
Dark, vertical striations in the exposed slide showed where soil still sloughs down. Occasionally, Topham said, a larger chunk of earth calves off the scarp and tumbles down.
“You can hear it from here,” Hagaman said.
Work at the mudslide has entered a new phase. With the active search for victims finished, scientists now are working to find out what happened that Saturday morning when the mountain came down.
It will take time for researchers to fully understand what triggered the slide above Steelhead Haven, which sent a deadly torrent of slurry across the valley and up the slopes on the far side.
In coming months, teams of geologists, hydrologists, geomorphologists, engineers and other researchers will visit the site and observe, measure, record and analyze.
Then they’ll come back and do it again and again.
The challenge then becomes making that abstract data accessible and understandable to the public, some of whom are wondering why they weren’t warned, to government planners reconsidering land-use policies, and to politicians who are being asked to account for decisions made over decades.
Surprising magnitude
Geologists have known for years that the hillside that came down March 22 was unstable. A 1999 draft report prepared for the Army Corps of Engineers points out that in the past century, there have been periods of alternating heavy slide activity and dormancy on the hillside.
Slides at Hazel partially blocked the North Fork Stillaguamish River in 1951, 1967, and 2006, and there have been several smaller events over the years.
“This kind of landslide is a tough one to categorize,” said Daniel Miller, the geologist who wrote the 1999 draft report.
“No one was able to anticipate the kind of runout that it would have,” Miller said.
His report suggested that the slope had the “potential for a large catastrophic failure,” which could involve runout of 880 feet, similar to what happened in the 1967 slide.
But that was based on assumptions such as an estimate that a slide would consist of 7,095 cubic yards of soil. The March 22 slide was estimated to have displaced 10 million cubic yards.
Miller’s draft report made clear that more study was needed to make accurate predictions of the slide’s behavior.
“I currently have no basis for estimating the probable rate or timing of future landslide activity,” Miller wrote.
The fact that debris ran out a mile across the floor of the valley was a major surprise even to scientists who specialize in debris-flow analysis.
Richard Iverson, a senior research hydrologist at the U.S. Geological Survey’s Cascades Volcano Observatory in Vancouver, Wash., has been building mathematical models of debris flows for 20 years.
The data that have gone into that modeling have come from other landslides around the state, especially those triggered by the 1980 eruption of Mount St. Helens, which Iverson said showed some similar behaviors to the Oso slide in that it ran fast and far.
“It’s no surprise to any geologist that that slope failed. The surprise was what it did when it did fail,” he said.
The big question Iverson hopes to answer concerns the mobility of landslides, and he suspects the answer lies partly in understanding the role played by the geological process known as liquefaction, in which soil loses its solid consistency and behaves like a liquid.
Liquefaction is most commonly associated with earthquakes and is caused by seismic waves causing the soil to vibrate and collapse under its own weight.
In landslides, soil saturation during extreme weather plays a role in creating conditions ripe for a slide, but it is gravity pulling material down the hill that gives the slide its energy.
Identifying those conditions and pinpointing the threshold at which liquefaction occurs will require studying the amount of water that was already in the soil, the depth at which the liquefaction occurred at Oso and the characteristics of the surrounding terrain and the river that influenced the debris flow once it started moving.
Some of this data can be obtained from surface measurements, but getting the full picture will require gathering as much information as possible, including digging in the dirt, what Iverson called “good old-fashioned field geology.”
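One standard back-of-the-envelope way researchers characterize the mobility Iverson describes is the ratio of fall height (H) to runout length (L): the lower the ratio, the more mobile the flow. A rough sketch of that calculation follows. The mile-long runout comes from the article; the fall height is a hypothetical placeholder, not a measured value from the Oso site.

```python
import math

runout_L_m = 1609.0      # "debris ran out a mile" (article); 1 mile is ~1609 m
fall_height_H_m = 180.0  # HYPOTHETICAL fall height, for illustration only

mobility = fall_height_H_m / runout_L_m           # H/L ratio (dimensionless)
travel_angle = math.degrees(math.atan(mobility))  # the "travel angle"

print(f"H/L mobility ratio: {mobility:.2f}")      # ~0.11 with these numbers
print(f"Travel angle: {travel_angle:.1f} deg")
# Small dry landslides often show H/L around 0.5; ratios near 0.1 indicate
# the kind of unusually mobile, liquefied flow the researchers describe.
```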
Immediate payoff
The USGS’s technical understanding of slide behavior has already had real-world effect in Oso.
The agency has spent years not just building complex mathematical models but testing how different material flows at a facility in central Oregon where it operates a 300-foot-long slide flume.
Repeated tests on different materials, and watching how they behave and mix during a slide, provided knowledge that ultimately led rescuers to search for victims of the slide near the edges of the zone and not where their homes used to be, Iverson said.
When Iverson visited the slide zone during the first week after the event, he saw affirmation that lab work and modeling could yield real-world results.
Remote-sensing imagery of the terrain before and after the slide enabled the USGS to add new data to the mathematical model.
“Within a couple days, the model results were also being used in the search efforts,” Iverson said.
Down on the floor of the valley, Dale Topham pointed out thick, gray blocks of claylike soil.
They are lacustrine deposits, he said, the material that formed the bed of an ancient lake that filled the Stillaguamish Valley during the last glacial period.
On the slope of the slide, horizontal striation shows where those clay layers were buried. Now they’re tossed around with the tumbled hillocks of sandier glacial till in a seemingly random pattern on the valley floor, mixed in with broken trees and pieces of cars, houses and personal items.
Up the slopes and on top of the bluff, the USGS and the transportation department have installed remote monitoring equipment that can track minute movements in the earth.
Around the edges of the slide zone, standing trees show the level where liquid gray mud washed up before settling back to the floor.
“Where you see how high it’s splashed in some areas, it’s very impressive. It witnesses the fluidity of the flow,” Topham said.
Researchers will be looking at all of those things this summer. Some of it, such as the mud splashes on the tree trunks, will need to be documented before it washes away. Grass, and later trees, will start growing in the new terrain, obscuring other details of the slide.
One team of researchers from the Geotechnical Extreme Events Reconnaissance (GEER) Association will soon start to document that evidence before it gets washed away or covered by vegetation.
GEER sends in a rapid-response team that doesn’t design studies of its own. It gathers data to share with other scientists.
“The mission is to collect data and give it away,” said David Montgomery, a professor of earth and space sciences at the University of Washington who was asked to join the team investigating the Oso slide.
Another team of scientists from the University of Illinois at Urbana-Champaign’s Department of Civil and Environmental Engineering is expected to arrive later this month. That group studies manmade and natural disasters with a goal of producing data that can be used in public policy.
One question that team would like to answer is whether the size of the Oso slide could have been predicted at all.
“If we could have predicted the size and extent, what would have been needed beforehand to predict the size and extent of the slide?” said Timothy Stark, the Illinois team’s lead researcher.
An ultimate analysis
In the end, “we would like to have a comprehensive analysis of the slide,” said Steve Thomsen, public works director for Snohomish County.
Doing the field work to produce that report might cost $2 million to $3 million, he said, noting that a source of money hasn’t been identified yet.
“Land-use decisions wouldn’t be a part of the report, but it could inform policy decisions down the road,” Thomsen said.
At the same time, a team of researchers from the Federal Emergency Management Agency and the U.S. Army Corps of Engineers, and another from the state Department of Transportation, have been working to assess the danger of flooding to people still living and working in the area and to the highway.
Miller, the geologist who studied the slide area earlier, hopes research in Oso will help create a new understanding of landslide risks throughout Washington.
Thomsen pointed out that while hydrologists have developed models for 25-year and 100-year floods, there is no similar rating system for landslides.
Doing scientific work in the debris field will be challenging, not just because of the dynamic nature of the slide zone and the river but because of reminders — of lives lost, the persistent threat of flooding, even further disintegration of the slope.
“It’s the old analogy of trying to repair the car while driving down the road,” Thomsen said.
It’s complicated, there are many moving parts, but for scientists, the slide is also an opportunity to advance knowledge in their fields.
“The scientists are really hungry for data points,” he said.
Chris Winters: 425-374-4165; cwinters@heraldnet.com.
Discussion in 'Playback Devices' started by Sanjay Gupta, Nov 22, 2008.
Is it just me or this another product that seems to have only one reason for it's existence, ie. to further confuse an already confused consumer. As if there was a dearth of absolutely redundant codecs, DTS adds to the list of other such codecs, such as Dolby Digital +, DTS-HD High Resolution Audio.
Member since July - August 1997
Clinton McClure
Casual Enthusiast
What's next Sanjay, caffeine-free DTS-HD Master Audio Essential plus iron? I've been a longtime supporter of DTS but this really doesn't make sense. If they think the change is necessary, why not just do it silently? We really do not need another codec name added to the packaging.
Bob_L
Bob Lindstrom
Am I wrong, or does this appear to be a hardware-licensing issue? Perhaps a way to extend to manufacturers the ability to include a DTS-compatible player-based technology in their Blu-ray players that doesn't cost as much as full DTS backward-compatibility?
Bianca Bosker
Executive Tech Editor, The Huffington Post
As we reported in July, TED (Technology, Entertainment Design), in partnership with The Paley Center for Media, is launching a brand new TED event: TEDWomen, a two-day conference taking place in December that will focus on innovation and ideas by women and girls worldwide. TEDWomen has been met a range of reactions, from "it's about time" excitement to concern. One blogger worries the event will "encourage further segregation." Salon asked, "Does the world need TEDWomen?" and suggested TED might instead increase the number of female speakers at its existing events.
Pat Mitchell, President and CEO of The Paley Center and the host of TEDWomen, took some time to respond to our questions about the event. Read on for her take on why it's happening, why now, why not TEDMen, and what she hopes the conference will accomplish. What do you think of TEDWomen? Who would you like to see speak there? Share your thoughts in the comments below.
Huffington Post: Why has TED, in conjunction with the Paley Center, decided to launch TEDWomen?
Pat Mitchell: Chris Anderson, TED's curator, and I have been talking for several years about a TED conference that would focus on women and girls and we agreed that the time was right to capture an evolving narrative about women and girls in the unique way that the TED format offers. I really admired how they produced two specialized events -- TEDIndia in 2009 and TEDGlobal in Africa (2007) -- and believed that a similar opportunity had emerged to turn the TED lens on the stories of women and girls as architects of change around the world and across all sectors, to focus on how their ideas and innovations were shaping and reshaping the future. At The Paley Center for Media, through our programs on the role of media, we witnessed the growing interest in the ways that women work, think, learn and lead and the impact of their ideas across the globe and across the media landscape as well as all other sectors of life and work. We agreed that the two institutions together had an opportunity to produce a conference with significance.
HP: Why now?
PM: In my opinion, there's never been a better time. Investing in women and girls may once have been considered a radical notion or even a waste of resources, but in most places in the world today, women and girls are increasingly recognized as a critical link to greater prosperity, political stability, better health and public policy. In the West, of course, generations of educated, empowered women are moving into leadership across all sectors and the impact is measurable. It's an important moment in the evolution of the story of how women and girls in new, and sometimes, old ways are the architects of change across sectors and countries.
HP: Why not TEDMen?
PM: It's an irresistible question, isn't it? But embedded in that question is a dangerous assumption: People tend to assume that the balance between the sexes is a zero-sum game, that when women win, men lose. But it's simply not true. In fact, it's quite the opposite: When women win, we all win. This is one of the key reasons that women are such effective change agents.
HP: One online commenter wondered whether TEDWomen was in danger of blurring the lines between "idea sharing" (TED defines its mission as "ideas worth spreading") and cause advocacy. What do you think?
PM: It sounds like this online commentator reacted to the name without reading about the event! TEDWomen isn't championing a cause; it's surfacing and sharing some of the most important ideas of our time. Our focus is on women as change agents, innovators and idea champions, and I think people will be both inspired and surprised by the program. We're exploring some fascinating territory! For example, there's been a flood of data in recent years showing how investment in women and girls in developing nations leads to economic growth, public health improvement, political stability... Why is that? How does it work? What ideas are these women championing? These are profound questions that matter to all of us.
HP: What is the mission of the conference?
Pat Mitchell: Now, I attend many women's conferences -- in fact, I went to six on four different continents in one month last year. The increasing numbers of these forums all over the world indicates to me a new awareness of the roles women and girls are playing in bringing new ideas and innovations to their communities and countries. These forums are also ways to discuss the challenges that remain for women to achieve their fullest potential. TEDWomen will focus on the ideas and innovations championed by women and girls. These cover everything from community development to economic growth to biodynamic farming to robotics to medical treatments to the use of technology for personal safety and peace making. Men and women speakers will take the TEDWomen stage with ideas that are reshaping our future, and matter deeply to all of us.
HP: Some have wondered why TED is launching a distinct TEDWomen event, instead of focusing on increasing the number of women speakers at its existing conferences. What's your take on why TEDWomen is necessary? Are there plans to increase the number of female speakers at other TED conferences? If so, how?
PM: Thank you for asking that question! There are a few assumptions there, which we'd like to address head-on. First, the intent behind the conference is to explore in depth a subject we find fascinating and timely. We're seeking out talks about women and girls (not just by them). As with every TED, the speaker program will include men and women. And of course, TED will continue to invite extraordinary women to speak at all of their events.
It's important to understand that TED didn't launch TEDWomen to segregate women attendees or speakers outside the main conference, nor as an alternative to putting forward a balanced speaker program at other events. As my TED colleague June Cohen has pointed out, this was already a priority for TED. The launch of TEDWomen marks an enthusiastic "yes/and," not an "either/or." Let's also look at the numbers. Over the past two years, TED Conferences have featured 30-40% women speakers. This isn't ideal, but it's actually much more balanced than many other, similar conferences, and obviously a priority for them. TED2009 had 38% women speakers. TEDGlobal, held this month in Oxford, had 30% women speakers, and they were an extraordinary bunch -- "Half the Sky" co-author Sheryl WuDunn, Women for Women International founder Zainab Salbi, Kiva co-founder Jessica Jackley... also powerhouses like novelist Elif Shafak and musician Annie Lennox. There were similar lineups at TED2010 and TEDIndia, and many more remarkable women booked for TED2011. I know that TED is striving for a balanced program in all their conferences, and will continue to do so.
HP: Will TEDWomen be an annual, ongoing conference, or 2010 only?
PM: At the moment, we're focused on creating the most extraordinary 2-day event we can imagine. Ask me again in December about the encore...
6 Jan 2000 Buy One, Get Five Free
Sales Push Offers Five Free Movies with DVD Player Purchase By IGN Staff They really want you to buy a DVD player. The fact that DVD players have, in their first three years of availability, outsold CD players and VCRs in their first three years by four to one and five to one, respectively, appears not to be good enough. They want everybody, yes everybody, to have one. So apparently a whole mess of Hollywood studios have partnered up, as have a slew of DVD player manufacturers, to join forces and offer you, yes, you, five free DVDs with the purchase of a player. No, seriously, they are. And, remarkably, the deal seems to be oddly free of fine print or hidden details. Here's the deal: if you buy a DVD player-- any DVD player, any model-- from JVC, Panasonic, Philips, Pioneer, Samsung, Sony, RCA, ProScan, GE, Toshiba, or Zenith between the dates of February 19, 2000 and May 30, 2000, you get five free DVDs: Fools Rush In, The Mask Platinum Series, Get Shorty, The Jackal: Collector's Edition, and Analyze This. Of course, if you don't care about any of those movies, this isn't such a hot deal-- five of crap is still crap-- but unless you really object to any of them, it could be a nice incentive to buy a player: you'll have a jump-start to your library with five movies. This is an expansion on a deal that they had going for the last few months... they've just updated the movies and are offering more. Just so you know. -- Your DVD Editor IGN Recommends | 科技 |
Foraging Habitat Pre...
Foraging Habitat Preferences of Vespertilionid Bats in Britain
Allyson L. Walsh and Stephen Harris
Vol. 33, No. 3 (Jun., 1996), pp. 508-518
Description: Journal of Applied Ecology publishes novel papers that apply ecological concepts, theories, models and methods to the management of biological resources in their widest sense. The editors encourage contributions that use applied ecological problems to test and develop basic ecological theory, although there must be clear potential for improving management. The journal includes all major themes in applied ecology: conservation biology, global change, environmental pollution, wildlife and habitat management, land use and management, aquatic resources, restoration ecology, and the management of pests, weeds and disease. Articles that interact with related fields are welcomed providing that their relevance to applied ecology is clear. Further details are available at www.journalofappliedecology.org. JSTOR provides a digital archive of the print version of The Journal of Applied Ecology. The electronic version of The Journal of Applied Ecology is available at http://www3.interscience.wiley.com/journal/117972213/home. Authorised users may be able to access the full text articles at this site.
1. The selection of foraging habitats by vespertilionid bats in Britain was quantified using a stratified sample of 1030 1-km squares. 2. Quantitative analysis of habitat preference and avoidance on a large scale indicated that bats were flexible, yet consistent in their habitat use across contrasting landscapes. Habitats associated with broadleaved woodland and water were most preferred, while arable land, moorland and improved grassland were strongly avoided. Linear vegetation features were selected in all landscapes, demonstrating the importance of habitat continuity to bats. The availability of preferred habitats was low and patchy in all landscapes, indicating that bats have specialized habitat requirements. Differences in habitat selection between landscape types and possible factors influencing habitat selection are discussed. 3. Habitat selection analysed on a local scale demonstrated the same consistent preference for woodland, riparian and corridor habitats across 19 discrete land classes. 4. Management policies for bats in Britain should endeavour to preserve and enhance the availability of woodland, water margins and linear corridor habitats. Lack of continuity of the landscape, loss and fragmentation of habitat patches plus deterioration of the quality of such patches may pose a threat to bat populations.
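For readers curious how "preference" and "avoidance" are quantified in studies like this, one common approach (not necessarily the exact method these authors used) compares the proportion of foraging observations in a habitat with the proportion of the landscape that habitat occupies, for example via Ivlev's electivity index. All habitat figures below are invented for illustration.

```python
# Ivlev's electivity index: E = (u - a) / (u + a), where u is the
# proportion of use and a is the proportion of availability.
# E > 0 suggests preference, E < 0 avoidance. All numbers are hypothetical.

habitats = {
    # habitat: (proportion of bat passes, proportion of land area)
    "broadleaved woodland": (0.35, 0.10),
    "water margins":        (0.30, 0.05),
    "improved grassland":   (0.10, 0.40),
    "arable land":          (0.05, 0.30),
}

for name, (use, avail) in habitats.items():
    electivity = (use - avail) / (use + avail)
    verdict = "preferred" if electivity > 0 else "avoided"
    print(f"{name:22s} E = {electivity:+.2f} ({verdict})")
```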
Journal of Applied Ecology © 1996 British Ecological Society
Speaking of handheld devices, why is Nokia still pushing that N-Gage? They had a huge booth at E3, but their games, graphics, and hardware looked second-rate compared to the Sony and Gizmondo. Yeah, Gizmondo. What a name.another handheld gaming, music, video device this time originating from the UK . Their booth was interesting with a bunch of break dancers performing, DJs spinning records, MCs trying to rhyme, and booth babes passing out flyers. From the looks of the titles they have in development, their marketing seemed to focus squarely on the Urban Gamer which is cool.I'm just wondering who will survive these clone wars - er, I mean handheld wars. My money is on Sony & Nintendo - ones who have been committed to gaming for the longest who also have deepest pockets. Leisure Suit Larry EverQuest II If there was one thing I noticed as a big trend in the software that was presented at E3 was that it seemed like the game developers were very conservative this year with the titles..Sure there was the Leisure Suit Larry game that looks hella funny, but for the most part, the tittles of the show seemed to be all sequels (and or movie tie-ins). Everquest 2; Halo 2; Doom 3; Half-Life 2; Matrix On-line, Lord of the Rings: TT; The Sims sequel; Rollercoaster Tycoon 4; etc. I must say that all of these games looked visually stunning and will push the bounds of the newest video card, but I wanted to see more original titles like Leisure Suit Larry. More:
Phantom Console? Entrance
On another hardware note, we stopped by the Phantom booth to take a look at the phantom console..which isn't like your typical console, but more like a TV/Cable add-on. When we were there they were going through a canned E3 demo of how games are able to be downloaded and played. (Anyone else thinking of Hard|OCP's Article?)We asked what them specifically about their claim to being able to play "any" game and they simply said they "had a lot of partners". We shall see. I'm a bit skeptical about the whole thing and no one that I talked to actually saw a live demo of how a person chooses a game to download and play it on their set. Matrix Online
Ready To RUMBLE! Matrix Online: Due no later than November looked detailed and played extremely smooth (on an ATI, btw)
Ready to Rumble: Have no idea about anything about the game, but I woulda let the booth babes pin me in less time for Roy Jones to get knocked out.
Lord of the Rings? I'm Scared Lord of the Rings: Pretty good looking game play that is capped with bonus live video from the film every time you pass a major level.
Leaving The Expo
That's it for our coverage of the 2004 Electronic Entertainment Expo! As always the crowd was huge and the level of enthusiasm was high on the opening day. We got to see all the latest titles, and if all the games come out on time the last half of 2004 should be awesome for gamers of all genres. With all the vendors and exhibits present, it's almost impossible for one to spend time with all the developers, so you have to pick and choose who you want to talk to. I'm actually looking forward to going next year already!
Mountain View, Calif., 01/14/2009 -- Savi, a Lockheed Martin [NYSE: LMT] company, has gone live with an automated asset and inventory tracking system at the Sierra Army Depot in northern California. The depot is a 59-square-mile complex encompassing about 1,200 buildings that serves as an Expeditionary Logistics Center for the storage, maintenance, assembly and containerization of operational stocks and other items – from depot to foxhole.The Depot Total Asset Visibility and Inventory Management System cuts operational costs and improves efficiency by enabling personnel to reduce hours spent searching for containers, major supplies and asset inventory as they move on, through, and off the facilities. Savi's total asset management solution also improves asset inventory utilization, and – via automated alerts – speed the monitoring of environmental conditions of medical supplies in the DEPMED area of a major depot.The fully integrated solution leverages the Savi SmartChain® Enterprise Platform and Savi Asset Management Application, which manage real-time information of supplies affixed with active Radio Frequency Identification (RFID) tags. "Our command and control platform, which is tied to Automatic Identification Technologies, provides more than just real-time visibility," said David Stephens, chief executive officer of Savi Technology. "We're honored to be called upon by the U.S. Department of Defense to support the readiness and operational performance of mission-critical storage and logistics activities at the Sierra Army Depot."The Savi solution at the depot is designed to enhance the visibility and management of assets such as containers, trailers, generators and water purification units. In addition, the solution utilizes sensors that monitor the environmental conditions of DEPMEDS, which are "hospitals in a container" and include necessary medical supplies and equipment required for rapid deployment into the field of operations.As a wholly owned subsidiary of Lockheed Martin [NYSE: LMT], Savi is a leading provider of active Radio Frequency Identification (RFID) and other Automatic Identification Technology-based solutions that improve the visibility, management and security of supply chain assets, shipments and consignments. For more information, visit www.savi.com.Headquartered in Bethesda, MD, Lockheed Martin is a global security company that employs about 140,000 people worldwide and is principally engaged in the research, design, development, manufacture, integration and sustainment of advanced technology systems, products and services. The corporation reported 2007 sales of $41.9 billion. MEDIA CONTACTS | 科技 |
The Life Issue
By Harvey Elliott View this Author's Spotlight
This item has not been rated yet Price:
The biannual magazine of the Students for the Exploration and Development of Space (SEDS). "The Life Issue" focuses on the Mars Science Laboratory (MSL), a robotic rover named “Curiosity,” which is now on its way to the red planet. In this issue, you will find a discussion of what it is that makes a planet habitable, an overview of the MSL rover, and a review of Gale Crater - the landing site selected for this flagship mission. In addition to all that, we'll introduce you to the new SEDS-USA executive board and look back on the 2011 High-Powered Rocketry Competition.
About harvey.elliott Harvey is a PhD student at the University of Michigan, studying Space and Planetary Science under Dr. Nilton Renno. His research focuses on the planetary conditions for life, with most of that effort going toward the robotic exploration of Mars. Harvey is the Director of Publications at SEDS-USA and was a founding member of SEDS@UM where he remains highly active on the executive board. | 科技 |
updated 05:20 pm EDT, Wed August 8, 2012
First applications on sale through service September 5th
Valve is set to open its Steam store up for non-gaming content. The formal announcement confirms speculation that was fueled last month through the company's mobile apps, which would see Steam gain categories for various productivity and creativity genres. The first non-gaming titles are slated to begin digital distribution on September 5, in a move that evolves Steam into a general app store beyond its original gaming focus. Launch titles will apparently make use of Steamworks features already used in games on the service, such as simplified installation, automatic updates, and a per-user DRM system. Users will also be able to save their files through the Steam Cloud as well as the desktop, a system that is currently used by gamers playing the same game in multiple locations. Steam's 40 million users are "interested in more than playing games," according to Mark Richardson, speaking on behalf of Valve. "They have told us they would like to have more of their software on Steam."
The recent Steam Greenlight initiative allows the community to decide what games will get released onto the Steam store, through direct interaction with developers. It remains unclear if the non-gaming titles will be included in upcoming Greenlight programs.
Additional details have yet to be announced, though the mobile apps showed a wide range of categories for software related to accounting, audio production, photo editing, and software editing, among others. Users were unable to navigate into the categories to view specific titles, however. [via The Verge] Gallery | 科技 |
Northrop Grumman Flight Test New Radar Antenna for B-2 Bomber
Microwave Journal
Northrop Grumman, Raytheon Closer to Flight Testing of New Radar Anetnna
Northrop Grumman, working closely with Raytheon, has begun flight-testing a new radar antenna on the B-2 stealth bomber that, combined with other upgrades, will enhance the aircraft’s ability to respond to emerging worldwide threats. Testing of the active, electronically scanned array (AESA) antenna on the B-2 represents a milestone for this radar modernization program because it allows engineers to determine, for the first time, how the radar performs under actual conditions. Northrop Grumman is the prime contractor for the B-2, which remains the only long-range, large-payload aircraft that can penetrate deep into protected airspace. Combined with superior airspace control to be provided by the F-22 raptor and global mobility provided by tanker aircraft, the B-2 will ensure an effective US response to threats anywhere in the world. “The radar modernization program is one improvement the Air Force and Northrop Grumman are working on to enhance the B-2’s capabilities,” said Dave Mazur, vice president of Long Range Strike and B-2 program manager for Northrop Grumman’s Integrated Systems sector. “The B-2’s combination of long range, large payload and survivability makes it a unique strike asset and the upgrades will ensure the aircraft remains just as effective in the future.” “Raytheon’s B-2 AESA radar system is performing well so far during the flight test phase,” said Erv Grau, vice president for the Air Combat Avionics Group of Raytheon Space and Airborne Systems. “Integrating our advanced technology onto the platform is critical to ensure the B-2 is not only equipped to deal effectively with a variety of future threats but also has the capability to act as a critical node on the network as the battlespace continues to evolve.” The B-2 radar work is part of a $382 M system development and demonstration contract awarded by the Air Force in 2004. During this phase, Northrop Grumman and Raytheon are developing and testing the radar and will install additional systems on operational B-2 aircraft of the 509th Bomb Wing at Whiteman Air Force Base, MO. This phase will be followed by production to field the new radar and install the antenna into the B-2 fleet. Recent Articles by Microwave Journal
Exclusive product previews for IMS2016
Microwave Journal names event coordinator | 科技 |
[image-62]
For a week every summer a small airfield in central Wisconsin is an aviator's dream world. It's been that way for more than half a century, since what is now called EAA AirVenture started as a way to celebrate men and women who fly experimental aircraft.
It's grown so much since 1953 that Wittman Regional Airport, the home of the Experimental Aircraft Association, becomes the busiest airport in the country for that week according to the Federal Aviation Administration. That's pretty amazing since it normally doesn't even have scheduled airline service.
Among the aircraft expected to fly into the airfield this year will be a research aircraft from NASA's Dryden Flight Research Center in Edwards Air Force Base, Calif. A NASA Gulfstream III aircraft will land at EAA AirVenture and be parked for public viewing at Aeroshell Square, perhaps not far from a huge Airbus 380 or Virgin Galactic's WhiteKnightTwo spacecraft. The G-III serves as multi-role testbed for a variety of flight research experiments. The aircraft's pilot will be available to answer questions.
And they aren't the only NASA researchers and engineers who will talk to members of the public at the air show about everything from uncrewed air vehicles, past and future moon missions to how the space shuttle flies.
This year marks a special anniversary for NASA and the rest of the world — 40 years since humans first walked on the moon. To commemorate the occasion visitors to EAA AirVenture will be able to see a piece of the lunar surface in person. A moon rock picked up by astronaut Edgar Mitchell in 1971 during the Apollo XIV mission is a star attraction at the NASA pavilion.
This year we're celebrating not only our historic landing on the moon 40 years ago, but looking forward to the next generation of moon missions," said Jim Hull, NASA exhibits manager. "Last month we launched the Lunar Reconnaissance Orbiter. It's circling the moon right now, transmitting images. Then this fall the Lunar Crater Observation and Sensing Satellite will impact the moon looking for water ice."
The Oshkosh exhibit reflects the country's plans to return to the moon. Outside the building are two huge inflatables that represent a lunar habitat concept and the Orion crew capsule. Inside visitors can learn more about robotic moon missions and the systems that will rocket astronauts to the lunar surface from engineers from the Marshall Space Flight Center in Huntsville, Ala.
From the moon, air show participants are able to move onto Mars and a full-scale replica of one of the Mars Exploration Rovers in front of a three-dimensional Martian landscape.
No NASA presentation at an air show is complete without a look at NASA's contributions to aeronautics. Not only do exhibits feature a number of NASA-developed aviation technologies that are now common in airplanes, a special education area allows youngsters to make and take their own ring wing gliders and offer other hands-on activities.
But by far one of the most popular stops at the NASA building is the area known as the NASA craftsmen. Technicians from NASA's Glenn Research Center in Cleveland and Langley Research center in Hampton, Va., show off some of the models and tools researchers use to advance aerospace design.
› View 'NASA at EAA' Gallery› Visit EAA AirVenture Site
Kathy Barnstorff
› Back To Top Please enable JavaScript to view the comments powered by Disqus.
Image Token: [image-47] Inflatable exhibits of a lunar habitat concept and an Orion Crew capsule attract a lot of attention to the NASA Pavilion at the EAA AirVenture air show in Oshkosh, Wisc.Image Credit: NASA / Kathy BarnstorffImage Token: [image-62]Feature Link: View Larger Image Matt Shezifi of Livermore, Calif., tries out a demonstration that shows how astronauts use tools in space.Image Credit: NASA / Kathy BarnstorffImage Token: [image-78]Feature Link: View Larger Image LOADING... Page Last Updated: August 14th, 2013 | 科技 |
Nerve•
Aug. 30, 2010 ••Tweet
Despite studies showing its cities are bursting with people who have little personal space, Japan has a declining birthrate and population. (U.S. births are also on the decline.) The government's solution? Start an internet dating site.
That's right: the Japanese government wants more couples to have children once their married, so they are planning to get involved in the development of the country's relationships from the beginning. In doing so, they will attempt to influence it all the way to the point of Octomomism:
Called the Fukui Marriage-Hunting Cafe, the website makes no attempt to disguise its purpose. And, as if wedded bliss were not its own reward, authorities will offer cash or gifts to couples who tie the knot. "Our goal is to first help people meet each other and then support them as they get married and raise children," says Akemi Iwakabe, deputy director of Fukui's Children & Families division.
At 1.34 children per woman, Japan's fertility rate is one of the lowest in the world, well below the 2.1 that is considered the minimum for a developed nation to maintain a constant population. That means the pool of workers and consumers is shrinking, while the ranks of pensioners are swelling. About 23 percent of the population is over 65, the highest ratio among the 62 countries tracked by Bloomberg. "It's difficult to breathe life back into an economy without children, without young people," says Naoki Iizuka, an economist at Mizuho Securities in Tokyo. "When an area like this keeps aging, the public finances of that government won't last." [Business Week]
Maybe our own government should get in on this, encouraging intelligent people to have kids and paying members of the Texas state legislature to not have any. To avoid Idiocracy, of course.
Love & SexPoliticsscannerWeb You May Also Like New Dating Site Ventures Beyond The Personal Profile
Lunchtime Link Love: Online Dating Mistakes Guys Make Every Day | 科技 |
https://www.facebook.com/pages/University-of-South-Carolina-Herbarium/357630248199
Photo by Linda Lee
This mystery plant is aquatic, mostly seen in wet places like ditches and ponds or sometimes as a component of soggy, floating mats. Which plants display the showiest, most flamboyant flowers? Some will insist that they are the various orchid species.
The orchid family truly is a giant group, easily the largest plant family in the world, in terms of number of different species. Orchids as a family cover the earth � almost. They are known from all but the coldest parts of the planet.
Many are epiphytic, or growing on the branches of trees, but quite a large number, too, are terrestrial, at home on the ground. (Some are even weeds.) Orchids typically have sheathing leaves on the stems, which are alternating, one at each node. There is a tremendous variety of flower shapes, but they all follow a basic theme.
Two very interesting things for some people to realize are that orchid species aren�t all tropical, and that there are plenty of these species that don�t have big, showy corsage-quality blossoms.
In fact, some of these species have flowers that are very tiny and inconspicuous. Something else: all orchid species produce a dry capsule as a fruit, and it will be packed with lots of lots of extremely tiny seeds: probably the smallest seeds of ant plant group.
Native, or wild, orchids are always a crowd-pleaser. In the Southeastern United States, there are plenty of native orchid species, and some of these have relatively large, spectacular flowers.
Among these striking orchids are the lady-slippers, grass-pinks, whorled pogonia, rosebud orchid, bog-rose and showy orchids. Other orchids in our area have flowers that are a bit more modest. This week�s Mystery Plant is a species in the latter group.
It is a bit unusual in that it is aquatic, mostly seen in very wet places, often in ditches or ponds, sometimes as a component of soggy, floating mats. For some reason it seems to like golf-course ponds.
It occurs from southern Virginia all the way to eastern Texas, and then south into South America. In our area, it is a fairly common wetland plant, but it�s often overlooked. The stems bear many leaves, which tightly them.
The sword-shaped leaves themselves are bright green, or sometimes yellowish. In fact, the flowers tend to be greenish, sharing the color of the foliage, and so the flowers tend to be somewhat inconspicuous.
These flowers are typical of orchids, though, in bearing three sepals and three petals. Each of the two upper petals is cleft into a pair of narrow segments. The third, lowest petal is also deeply divided, but into three very narrow, wiggly, thread-like portions.
The whole effect of all this is that the flowers, which are crowded into a spike, appear something like little green spiders crawling around.
The plants often develop slender, pale runners which can produce new flowering stems. This water-lover is blooming now, and will continue until frost.
It can be expected in just about all of the coastal plain counties of Georgia and South Carolina, and the more southern of those in North Carolina.
John Nelson is the curator of the A. C. Moore Herbarium at the University of South Carolina, in the Department of Biological Sciences, Columbia. As a public service, the Herbarium offers free plant identifications. For more information, visit www.herbarium.org or call 803-777-8196, or email nelson@sc.edu.
Answer: �Water-spider orchid,� Habenaria repens | 科技 |
Nanotechnology can be used to improve the properties of commercial products, to kill bacteria, fight odours and more. But what happens when these products are discarded and these incomprehensibly small particles are released into the environment? Can these same qualities that kill bacteria in athletic garments, washing machines and refrigerators have an unintended detrimental impact on health and the environment as well?
RSS From: The Research Council of Norway
Nanosilver is used in various applications such as antibacterial bandages. (Photo: Shutterstock)
Millionths of a millimeter
Nanoparticle use is posing new questions to researchers studying environmentally hazardous substances. The size of nanoparticles is actually measured in millionths of a millimetre. They are far too small for the examination methods used with common chemicals.
The first challenge is to find a way to track where these particles end up. Unless they can be located it will not be possible to determine much about their impact. So what needs to be done to track them down?
“Simply put, it’s easier to find a needle in a haystack when that needle is radioactive,” says Dr Deborah H. Oughton. Dr Oughton is a professor at the Norwegian University of Life Sciences (UMB), specialising in nuclear chemistry. As part of a joint research project involving UMB, the Norwegian Institute for Water Research (NIVA), Bioforsk and international research partners, she has headed the effort to develop methods for tracing nanoparticles by rendering them radioactive. The project is part of the Research Council of Norway’s programme on Norwegian Environmental Research towards 2015 (MILJO2015).
Difficult to discover
“Nanoparticles are so miniscule that they are hard to find using methods common for other environmentally hazardous substances. Researchers looking for potential effects of nanoparticle contamination often resort to the use of unrealistically high concentrations of particles in their experiments,” Dr Oughton states.
But there are several drawbacks to such an approach. First of all, the properties of particles change when concentrations become so dense. Secondly, such methods reveal very little about the spread patterns of the nanoparticles, break down over time or the capacity of nanoparticles to accumulate in concentrations presumed to normally occur in nature.
“This is why we wanted to test if it was possible to track nanoparticles using radioactivity,” Dr Oughton explains.
“It’s easier to find the needle in the haystack when it’s radioactive,” asserts Deborah Oughton at the Norwegian University of Life Sciences (UMB). (Photo: Privat)
Radioactivity as a marker
Dr Oughton has her background in nuclear chemistry and took the idea that radioactivity could be used as a marker from this and several other fields. Related methods are already in use for studies of radioactive environmental contamination as well as in medical diagnostics.
“We looked at methods used in other fields and worked our way towards something which would work for nanoparticles. The idea is that when particles are radioactive, they can be traced. Our trials demonstrate that we can obtain a large amount of new and valuable information using this method even with a very low concentration of particles,” says Dr Oughton.
Highly dangerous to fish
Dr Oughton and colleague Dr Erik Joner of Bioforsk, among others, came up with a method in which earthworms were fed horse dung containing radioactive nanoparticles of silver, cobalt and uranium. Subsequently, they were able to study the uptake and accumulation of the nanoparticles by observing how the radioactivity was distributed and then comparing their observations with physiological findings. In other experiments from the same project, fish were exposed to various concentrations of nanoparticles.
“One of our discoveries showed that nanoparticles can accumulate in different parts of an organism. In salmon, we witnessed that certain nanoparticles affected gill function and had a severely toxic effect. The presence of surprisingly low concentrations of certain types of nanosilvers led to gill failure and resulting death of the fish.”
The study used lake water in order to optimise the relevance of the findings. The water in many lakes in Norway is relatively low in calcium. We found that this increases the amount of time nanoparticles stay in the water, says Dr Oughton.
The finding that nanoparticles may have detrimental effects on fish gives cause for concern as the presence of nanosilver has previously been detected in waste water from sewage purification plants. Nanosilver is also widely used in clothing, and studies show that washing clothes releases nanosilver into the water drainage. Nanosilver is even used in washing machines themselves in many countries, although this is not allowed in Norway.
Researchers discovered that certain types of nanosilver adversely affected salmon gill function and that the toxic effect was greater than expected. (Photo: Shutterstock)
Nanoparticles can release ions over long periods
The researchers also found out new information about the long-term behaviour of nanoparticles in soil.
"Nanoparticles break down over time through the slow release of ions. For some nanoparticles, these ions are the agents responsible for toxic effects on organisms. This gradual leakage of ions means that free nanoparticles continue to pollute the environment over a long period of time,” Dr Oughton states.
After feeding earthworms horse dung containing radioactive nanoparticles, researchers were able to detect the radiation and follow its trail. (Photo: Shutterstock)
Some more toxic than others
The researchers also discovered differences among nanoparticles. Nanosilver was the most toxic of the group.
“Some types of nanosilver had a greater toxic effect than others. It is important for governments and industry to learn more about the risks involved with the various types of nanoparticles. Our research findings, along with other research on the environmental impact of nanoparticles, mean that we will soon know enough to know how to regulate use in order to prevent damage to the environment,” says Dr Oughton.
Growing market, budding research activity
The use of nanoparticles has increased greatly in recent years. Areas of application include cosmetics, clothing, toys and food. The use of nanosilver as an anti-bacterial coating in refrigerators, sports clothing and bandages is among the most common uses.
A great deal of research on the effects of nanosilver and other nanoparticles on health and the environment is being conducted both in Norway and other countries.
“The results from our research project have been published internationally and have attracted interest from many countries. We are currently collaborating with researchers in France, among others, to develop measurement methods based on radioactive markers. At the same time, research institutions in Norway and the EU have also indicated their interest. The work is being continued under several projects funded by the Research Council of Norway and European research institutions,” Dr Oughton points out.
“Technological developments must be carried out responsibly,” asserts Arvid Hallén. (Photo: Sverre Jarild)
Integrated research efforts
Nanotechnology and new materials have been a priority area in the Research Council since it established the research programme, Nanotechnology and New Materials (NANOMAT) under the Large-scale Programme Initiative in 2002. Research activities will be continued under the new large-scale programme, Nanotechnology and Advanced Materials (NANO2021).
“Nanotechnology can open some very exciting doors that could be instrumental in solving major global challenges. This is why we are establishing a new programme to generate knowledge and value creation within the areas of energy, the environment, health and sustainable use of natural resources,” says Director General Arvid Hallén of the Research Council of Norway.
“At the same time, we will take steps to ensure that technological developments are carried out responsibly, in a manner that benefits both individuals and society as a whole. Risks related to the growing use of nanotechnology and advanced materials will be a key focus for research under the new programme,” he adds.
Other Research Council programmes also address the same issues but from a different perspective.
“The purpose of the environmental research programme, MILJO2015, is to look at unintended impacts affecting the natural environment, for example nanoparticles and other materials added to products we use,” explains Mr Hallén. “This makes it possible to view the research in an overall perspective, ensuring the best possible foundation for future innovation and regulation,” he concludes.
Facts about the project
Project title: “Development of methods for tracing nanoparticles in the environment”
Project period: 2008–2010
Norwegian institutions: Norwegian University of Life Sciences (UMB), Norwegian Institute for Water Research (NIVA), Bioforsk
Institutions in other countries: Purdue University, University of Antwerp, Catalan Institute of Nanotechnology (Barcelona).
Project manager: Deborah Helen Oughton, professor at UMB. Email: deborah.oughton@umb.no
Funding: Research Council of Norway’s programme on Norwegian Environmental Research towards 2015 (MILJO2015)
From: http://www.forskningsradet.no/servlet/Satellite?c=Nyhet&pagename=miljo2015/Hovedsidemal&cid=1253979295951
Scatterings
Mirrors Convey Sunshine to Dark Valley Town
Patricia Daukantas
Giant mirrors to lighten up dark valley town in winter.
Since its founding a century ago, a factory town in Norway has gone without direct sunlight for six months of each year—until now. A new set of mountaintop mirrors is beaming sunshine into the town square.
Rjukan, a community of 3,500 in the Telemark region of southern Norway, sits in a deep valley running from east to west. The town was founded to provide homes for workers at a massive hydroelectric plant and other factories that operated for most of the 20th century. The price that its residents paid for their prosperity was a half-year in the shadow of the mountains, from 28 September to 12 March.
In 1913, the founder of both the Norsk Hydro company and Rjukan, Sam Eyde, proposed placing a mirror up in the mountains to bounce sunlight into the town during winter. No one followed up on his idea, although the company built an aerial tramway to take Rjukan residents up to the mountain peaks to get some solar photons.
Ten years ago, artist Martin Andersen moved to Rjukan and noted the lack of sun. After years of fundraising, his campaign to build a Solspiel or “sun mirror” concluded at the end of October, when it went into operation.
The Solspiel consists of three solar-powered, computer-controlled mirrors, each 17 m² in area, sitting at an altitude of 742 m above sea level, or about 450 m above the Rjukan town square. Together they reflect more than 80 percent of the gathered light down to the town square, to make a sunny patch in which residents can bask.
Andersen served as project manager for the design phase of the Solspiel project, according to Karin Rø, the community's tourist manager. Engineering firms Devotek (Norway) and Bilfinger (Germany) managed the 5-million-krone (U.S. $810,000) project with heliostat technology from Solar Tower Systems GmbH (Germany), mostly used for solar-power farms.
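The article doesn't describe the control software, but the geometry every heliostat implements is simple: a flat mirror's normal must bisect the direction to the sun and the direction to the target. A sketch of that aiming law — the coordinate frame and sample numbers are assumptions for illustration, not Rjukan's actual parameters:

```python
import numpy as np

def mirror_normal(sun_dir, target_dir):
    """Unit normal a flat heliostat must hold so that light arriving from
    sun_dir is reflected toward target_dir. Both vectors point away from
    the mirror; by the law of reflection the normal bisects them."""
    s = sun_dir / np.linalg.norm(sun_dir)
    t = target_dir / np.linalg.norm(target_dir)
    n = s + t
    return n / np.linalg.norm(n)

# Illustration only: sun 5 degrees above the horizon, town square about
# 450 m below and 600 m out from the mirror ledge (x east, y south, z up).
sun = np.array([0.0, np.cos(np.radians(5.0)), np.sin(np.radians(5.0))])
square = np.array([0.0, 600.0, -450.0])
print(mirror_normal(sun, square))
```

A tracking controller simply re-evaluates this normal as the sun moves, which is how the mirrors can keep the bright patch parked on the square through the day.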
“Looking up to the mirrors is like looking up to the sun—you are not able to do that for more than a half second,” Rø says. No warning sign has been posted, but townspeople seem to understand the danger of staring directly into the reflected beam.
Since the Solspiel is new, there are no plans yet to add more mirrors to bring sunlight to a wider area of Rjukan. “But you never know,” Rø adds. “This was a crazy idea we have realized. We are proud to have made it, and we have got a lot of attention!”
Publish Date: 13 November 2013
http://paloaltoonline.com/print/story/print/2011/11/25/merchants-seek-to-cash-in-on-internet-chatter
News - November 25, 2011
Merchants seek to cash in on Internet chatter
Palo Alto start-up creates rewards program for loyal shoppers
by Cyrus Hedayati
For years, small businesses have been cautiously looking for ways to convert their online publicity into sales. Recently, the dominant trend among consumers has been "daily deal" websites like Groupon and LivingSocial, which offer a stream of one-time-only price promotions to lure shoppers into stores. This month, Groupon successfully made its initial public offering debut, selling $700 million in shares.
But other entrepreneurs, such as Palo Alto-based PunchTab, are betting on another model to help merchants leverage the Internet. Rather than seeking new customers through deals, they're trying to ramp up the loyalty of the store's current shoppers.
"It's about getting someone who's spending $20 to spend $30," said Ranjith Kumaran, founder of PunchTab, "rather than getting a bunch of new shoppers to come in and buy a bunch of stuff."
Rewards programs are nothing new, from frequent-flier miles to "buy 10 sandwiches, get one free" cards. But a handful of start-ups, including San Mateo-based Chatterfly, CrowdTwist, Badgeville and PunchTab, are moving that model online. "You know when you go to your favorite restaurant and they have that little fish bowl" for a drawing of customers' business cards, Kumaran said. "We're trying to take that and apply it online. ... It's really the secret of reinforcing behaviors that you're already seeing."
PunchTab has created a mobile-phone app, called PunchTab Local, that rewards consumers for actions they take both online and off, with a focus on reaching loyal shoppers through social media. Customers can gain points for tweeting about the business, promoting it with a Facebook "like," or through repeat purchases. The goal for the business is to amplify its promotions by encouraging customers to share them online. The goal for the customer is to earn a reward, such as a gift card, discount or product.
To get online shoppers into the stores themselves, the app awards more points, known as "entries," for actions taken closer to the cash register. Customers earn 10 entries if they check in to the store using PunchTab Local.
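PunchTab's full point schedule isn't public, but the mechanics described above reduce to a small action-to-entries ledger. A sketch in which every value except the 10-entry check-in is an assumption:

```python
# All point values are assumptions for illustration except the check-in,
# which the article puts at 10 entries.
ENTRY_VALUES = {"checkin": 10, "facebook_like": 2, "tweet": 2, "purchase": 5}

def award(ledger, user, action):
    """Credit a user with entries for an action; unknown actions earn nothing."""
    ledger[user] = ledger.get(user, 0) + ENTRY_VALUES.get(action, 0)
    return ledger[user]

ledger = {}
for action in ("facebook_like", "tweet", "checkin"):
    award(ledger, "alice", action)
print(ledger["alice"])  # 14 entries toward the merchant's reward drawing
```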
The app is free for merchants to use, making it cost-effective for small businesses, unlike many of the "daily deal" sites, said Kumaran, whose company will move to its new El Camino office in December. Groupon promotions end up losing money for close to one-third of the small businesses that run them, according to a recent study conducted by Rice University Associate Professor of Management Utpal Dholakia. The study also found that close to 80 percent of "daily deal" users were new customers, but significantly fewer spent beyond the deal's value or returned to make a purchase at full price.
"A lot of the merchants say, 'The daily deals guys have been here, and we're not sure that applies to our model,'" said Kumaran, who also founded the digital-file delivery site YouSendIt. "But we always say that we're not trying to create some new kind of behavior in customers, we just want to amplify what they're already doing."
For their part, the "daily deals" sites are looking for ways to incorporate rewards programs into their business models to encourage customer loyalty. LevelUp, which launched this year, offers a series of increasingly better giveaways from the same merchant in an effort to turn one-time, daily-deals-shoppers into repeat customers.
"People forget that it's a young, nascent space that's going through a lot of changes," Kumaran said of his "daily deal" competitors. "Our focus from the beginning was, the user who sticks with you for years is going to be the one who creates the most value for you." Keith Wilson, owner of the Boardroom bar and restaurant in San Francisco, has been using PunchTab Local to market his business.
His customers use the app to recommend the restaurant to their friends on Facebook, earning entries for themselves, he said. "I'm always trying to get those eyeballs on my web page so that when people go out and they're deciding where to go, they think, 'Let's go to the Boardroom,'" he said. "You want a good base of regulars — that way there's always a few seats in the bar, and then everything else builds from there."
With PunchTab, "It's not just getting random people in to get cheap deals," he said.
In exchange for PunchTab users' loyalty to his bar, Wilson is offering the chance to win a $20 gift certificate.
Freelance writer Cyrus Hedayati can be emailed at cyrus.hedayati@gmail.com. | 科技 |
Fire Damages Internet Archive Scanning Building
By Chloe Albanesius
November 7, 2013 12:15pm EST
Early on Wednesday morning, a fire broke out at a scanning center at San Francisco's Internet Archive, destroying about $600,000 worth of high-end digitization equipment.
The fire occurred at around 3:30 a.m. local time, when the building was empty, so no one was hurt, and the Internet Archive said that no data was lost. "Some physical materials were in the scanning center because they were being digitized, but most were in a separate locked room or in our physical archive and were not lost," the group said in a statement. "Of those materials we did unfortunately lose, about half had already been digitized. We are working with our library partners now to assess."
The main building was not affected except for damage to the electrical run, which cut power to some servers for a time. But scanning equipment was damaged, and the group will have to repair or rebuild the scanning building. As a result, the Internet Archive is asking for any donations people might be able to provide, as well as assistance with scanning as it recovers.
"This episode has reminded us that digitizing and making copies are good strategies for both access and preservation," the group said. "We have copies of the data in the Internet Archive in multiple locations, so even if our main building had been involved in the fire we still would not have lost the amazing content we have all worked so hard to collect."
The Internet Archive thanked the San Francisco Fire Department for being "fast and great," as well as its city supervisor and a representative of the mayor's office, who "have come by to check up on us."
The Internet Archive, founded in 1996, describes itself as an Internet library. As PCMag noted in 2008, the site's showcase is the Wayback Machine, which hosts snapshots of websites throughout time.
Last year, the organization said it would make more than 1 million pieces of archived Internet content available via BitTorrent. A month later, it launched an information database called TV News Search & Borrow.
Germany's IFA show, debut ground for numerous new products, hits 90 years old
Martyn Williams
When German radio manufacturers gathered in 1924 to show off their products, just a year after regular broadcasting began in the country, they probably didn't imagine they were sowing the seeds for what eventually would become IFA.

Despite war, the rise of new communications technologies and the fall of European consumer electronics companies to Asian rivals, Berlin's annual "funkausstellung" (still today, the "FA" in IFA) has grown to become one of the world's biggest consumer electronics shows. This year's event kicks off this week in Berlin.

IFA 2014 is likely to be notable for a number of smartwatches that are expected to debut. They will join a long line of products that have been introduced at IFA, beginning in the 1920s with those radios.

Television arrived at the show in 1928, but continued to be dominated by radio. One of the earliest pictures from IFA shows Albert Einstein inspecting radio sets at the event. He delivered the opening address at what was then called the "7th Great German Radio and Phonograph Show."

The rise of the Nazi party in Germany in the 1930s meant for several years the event was organized by the Ministry for Public Enlightenment and Propaganda, and the show came to a temporary halt in 1939 after its 16th occurrence when Europe was plunged into World War II.

Following the end of the war and the division of Germany, the show left Berlin and was held sporadically from 1950 in Dusseldorf and then in Frankfurt. These years marked the first big expansion of consumer electronics as FM radio began, television became more widespread, transistorized equipment appeared and inventions like the audio cassette and remote control began appearing.

In 1967, the wandering event returned to West Berlin and was the launchpad for a technology that wowed consumers at the time: color television. Berlin Mayor Willy Brandt used the event to switch on broadcasting in the city, a couple of years before the same happened in East Germany.

As the pace of development picked up in the consumer electronics industry, so did the number of product launches that took place at IFA. The show saw teletext (1977), the compact disc (1979), Radio Data System (1987), widescreen TV (1989), MiniDisc and MP3 (1991), DVD (1997) and digital TV (2003). It wasn't until 1995 that organizers say IFA became a truly multimedia show, but to visitors then the future of consumer electronics must have been clear: small, digital and networked.

As wave after wave of new products began flooding onto the market, manufacturers could no longer be held to two-year product cycles and IFA organizers made one of their most important decisions: to hold the show annually.

That decision, from 2005, led to a defection over the next few years of major consumer electronics companies from Cebit, a large IT show held annually in Hanover during the cold month of March, to the warm Berlin sunshine of early September. What made matters worse for Cebit was that phone makers were also decamping to Mobile World Congress, held annually in Barcelona in February.

For now, IFA remains one of the two major consumer electronics shows in the world. The other is January's CES in Las Vegas, which in comparison is much younger. It began in 1967.
Ed Whittingham
Executive Director
Ed Whittingham is the Executive Director of the Pembina Institute, Canada's leading energy and environment think tank. The Pembina Institute advocates for strong, effective policies to support Canada's clean energy transition. Its 45 staff work out of four regional offices across Canada on a $4 million annual budget. In 2011 Ed was named to the Clean50 list, which honours 50 outstanding contributors to sustainable development and clean capitalism in Canada, and in 2016 he was named to Alberta Venture magazine's list of Alberta's 50 Most Influential People.
Through his work Ed advises governments, regulators, companies, and research networks and civil society on clean energy. He regularly speaks on climate/energy policy and emerging trends. Ed’s affiliations include Leadership Development at The Banff Centre, the World Economic Forum’s Global Agenda Council on the Future of Oil and Gas, Smart Prosperity and Shell Global’s External Review Committee.
Ed holds an International MBA from York University’s Schulich School of Business, where he specialized in corporate sustainability and international business. His interest in international issues began when he spent a year in Japan as part of the Rotary International Youth Exchange Program. During his graduate studies he was a Social Sciences and Humanities Research Council of Canada scholar, an Export Development Canada scholar and a visiting researcher at the United Nations Environment Programme’s Japan branch. From 2007-2008 he served as an Alcoa Foundation Conservation and Sustainability Practitioner Fellow for his research on the U.S. Climate Action Partnership.
Ed has been profiled in the Globe and Mail, the National Post and Alberta Oil Magazine, and his op-eds have been published in newspapers and magazines across Canada and internationally.
Ed Whittingham is available for speaking engagements.
Contact Ed Whittingham
cell: 403-899-0578 • tweet: @edwhittingham
Ed Whittingham's Recent Publications
Shining a spotlight on Alberta’s efforts to tackle climate change
Reflecting on the Pembina Institute's 2016 Climate Summit
Sept. 27, 2016 - By Ed Whittingham
Alberta’s action to transition from fossil fuels is bold, but the province is not alone. Around the world the movement toward a greener future is gaining momentum. The province has joined jurisdictions at the forefront of the transition, and the success of this year’s Alberta Climate Summit speaks volumes about Alberta’s commitment to become a leader in this movement.
An opportunity for continental climate leadership
June 29, 2016 - By Ed Whittingham, Andrew Steer, Marcela López-Vallejo
By establishing an ambitious North American climate agenda, President Obama, President Peña Nieto, and Prime Minister Trudeau could bolster their environmental legacies and help ensure the prosperity of our continent—and our planet—for current and future generations.
Why a carbon price alone would not make Ontario’s climate plan work
June 20, 2016 - By Ed Whittingham, Tim Gray, Sidney Ribaux
A price signal is one of the most efficient measures to change behaviour. But it isn’t the only one needed, especially for essential goods and services like energy and transportation.
U.S.-Canada methane deal: Small investment, big payoff (op-ed)
March 12, 2016 - By Ed Whittingham, Fred Krupp
If we’re going to prevent this catastrophe, we need to accelerate the solutions. One such solution is controlling methane emissions from the oil and gas industry. Natural gas is mostly methane, an extremely potent contributor to climate change that accounts for one-quarter of the world’s current warming. Globally, the oil-and-gas sector is our largest industrial source of methane emissions.
Celebrating crucial climate progress in Canada's oil and gas sector
March 10, 2016 - By Ed Whittingham
Prime Minister Trudeau has announced that Canada will reduce methane emissions in its oil and gas sector by 40 to 45 per cent below 2012 levels by 2025. The announcement marks an important milestone on an issue that the Pembina Institute has been working on for decades.
Nanotechnology New Ventures Competition
March 25 @ 8:00 AM - 5:00 PM - Burton Morgan Center
A joint initiative sponsored by Purdue University and the University of Notre Dame in conjunction with the State of Indiana's Midwest Institute of Nanoelectronics Discovery (MIND)
Purdue University and the University of Notre Dame are proud partners with the state of Indiana, through the Indiana Economic Development Corporation, in an initiative to promote nanotechnology discoveries and new ventures in Indiana. One component of this initiative is a Nanotechnology New Ventures Competition. The Nanotechnology New Ventures Competition aims to foster translational research and accelerate the commercialization of intellectual property in the nanotechnology arena within the state of Indiana. In doing so, the Competition exemplifies the spirit of Discovery-to-Delivery by generating entrepreneurship opportunities and driving economic development of nanotechnology research. Participants compete for cash prizes totaling $57,000.
New Venture Workshop
The Midwest Institute of Nanoelectronics Discovery will provide two sessions on new venture planning on October 29th at Purdue University and November 30th at the University of Notre Dame. Each identical session will be a two-hour, focused session introducing the essentials of the business model and venture planning. Critical needs for success will be discussed and relevant supporting resources will be identified.

Jim Davis is the John F. O'Shaughnessy Professor of Family Enterprises and associate professor of strategic management in the Mendoza College of Business at the University of Notre Dame. He has been at the University of Notre Dame since 1991. He launched and directed the Gigot Center for Entrepreneurial studies at Notre Dame from 1998 through 2008. He earned his Ph.D. from the University of Iowa. He has worked with many major national and multinational corporations throughout the world on strategic planning and positioning. He has been a secondary education teacher, a school psychologist and a regional mental health coordinator for the Head Start Program. His primary research interests include strategy, trust, stewardship theory, social capital, corporate governance and family business. He has five children.

Jon Gortat is a Technology Project Manager with the Purdue Research Foundation, Office of Technology Commercialization and interim director of Purdue University's Emerging Innovations Fund. Prior to that, he has more than 5 years experience working in industrial pharmacy research and project management. He has been a small business consultant for several companies and non-profit organizations in the areas of market positioning, pricing analysis, exit financing, and corporate strategy. He holds an MBA from the Krannert Graduate School of Management at Purdue University and a baccalaureate of science degree in Chemical Engineering from Purdue University.

Contact Details
Jackie Lanter — lanter@purdue.edu, 49-41335
Researchers develop new technique for probing subsurface electronic structure
Wed, 01/15/2014 - 8:30am
by Lynn Yarris, Lawrence Berkeley National Laboratory
"The interface is the device," Nobel laureate Herbert Kroemer famously observed, referring to the remarkable properties to be found at the junctures where layers of different materials meet. In today's burgeoning world of nanotechnology, the interfaces between layers of metal oxides are becoming increasingly prominent, with applications in such high-tech favorites as spintronics, high-temperature superconductors, ferroelectrics and multiferroics. Realizing the vast potential of these metal oxide interfaces, especially those buried in subsurface layers, will require detailed knowledge of their electronic structure.
A new technique from an international team of researchers working at Lawrence Berkeley National Laboratory (Berkeley Lab)’s Advanced Light Source (ALS) promises to deliver the goods. In a study led by Charles Fadley, a physicist who holds joint appointments with Berkeley Lab’s Materials Sciences Div. and the Univ. of California Davis, where he is a prof. of physics, the team combined two well-established techniques for studying electronic structure in crystalline materials into a new technique that is optimized for examining electronic properties at subsurface interfaces. They call this new technique SWARPES, for Standing Wave Angle-Resolved Photoemission Spectroscopy.
“SWARPES allows us for the first time to selectively study buried interfaces with either soft or hard x-rays,” Fadley says. “The technique can be applied to any multilayer prototype device structure in spintronics, strongly correlated/high-TC superconductors, or semiconductor electronics. The only limitations are that the sample has to have a high degree of crystalline order, and has to be grown on a nanoscale multilayer mirror suitable for generating an x-ray standing wave.”
As the name indicates, SWARPES combines the use of standing waves of x-rays with ARPES, the technique of choice for studying electronic structure. A standing wave is a vibrational pattern created when two waves of identical wavelength interfere with one another: one is the incident x-ray and the other is the x-ray reflected by a mirror. Interactions between standing waves and core-level electrons reveal much about the properties of each atomic species in a sample. ARPES from the outer valence levels is the long-standing spectroscopic workhorse for the study of electronic structure. X-rays striking a material surface or interface cause the photoemission of electrons at angles and kinetic energies that can be measured to obtain detailed electronic energy levels of the sample. While an extremely powerful tool, ARPES, a soft x-ray technique, is primarily limited to the study of near-surface atoms. Its harder x-ray cousin, HARPES, makes use of more energetic x-rays to effectively probe subsurface interfaces, but the addition of the standing wave capability provides a much desired depth selectivity.
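For readers who want the quantitative core of ARPES, the measured emission angle and kinetic energy map directly onto the electron's in-plane crystal momentum. A minimal sketch of that textbook conversion (not code from the study):

```python
import numpy as np

K_CONST = 0.5123  # sqrt(2 * m_e) / hbar, in inverse angstroms per sqrt(eV)

def k_parallel(ekin_ev, theta_deg):
    """In-plane crystal momentum (1/angstrom) of a photoelectron detected
    at polar angle theta with kinetic energy ekin_ev:
    k_par = 0.5123 * sqrt(Ekin[eV]) * sin(theta)."""
    return K_CONST * np.sqrt(ekin_ev) * np.sin(np.radians(theta_deg))

print(k_parallel(100.0, 15.0))  # ~1.33 inverse angstroms
```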
“The standing wave can be moved up and down in a sample simply by rocking the angle of incidence around the Bragg angle of the mirror,” says Alexander Gray, a former member of Fadley’s UC Davis research group and affiliate with Berkeley Lab’s Materials Sciences Div., who is now a postdoctoral associate at Stanford/SLAC. “Observing an interface between a ferromagnetic conductor (lanthanum strontium manganite) and an insulator (strontium titanate), which constitute a magnetic tunnel junction used in spintronic logic circuits, we’ve shown that changes in the electronic structure can be reliably measured, and that these changes are semi-quantitatively predicted by theory at several levels. Our results point to a much wider use of SWARPES in the future for studying the electronic properties of buried interfaces of many different kinds.”
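Gray's description — rocking the incidence angle through the mirror's Bragg angle to slide the standing wave vertically — can be made concrete with the usual standing-wave intensity expression. The reflectivity and period below are placeholders, not parameters from the experiment:

```python
import numpy as np

def sw_intensity(z, d_period, reflectivity=0.08, phase=0.0):
    """Relative x-ray standing-wave intensity at depth z for a multilayer
    mirror of period d_period (same units as z). The phase shifts as the
    incidence angle is rocked through the Bragg angle, which translates
    the antinodes vertically through the sample."""
    r = reflectivity
    return 1.0 + r + 2.0 * np.sqrt(r) * np.cos(phase - 2.0 * np.pi * z / d_period)

depths = np.linspace(0.0, 8.0, 5)          # nm below the surface
print(sw_intensity(depths, d_period=4.0))  # one full period every 4 nm
```

Scanning the phase term (set by the rocking angle) moves the antinodes through the buried interface, which is what gives SWARPES its depth selectivity.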
Fadley, Gray and their collaborators carried out their SWARPES tests at ALS Beamline 7.0.1. The Advanced Light Source is a U.S. Dept. of Energy (DOE) national user facility and Beamline 7.0.1 features a premier endstation for determining the electronic structure of metals, semiconductors and insulators.
Results of this study appear in Europhysics Letters (EPL).
Source: Lawrence Berkeley National Laboratory
Global Energy Meet Agrees to Roadmap on Renewables
By Lindsay Beck
BEIJING -- Environment officials from around the world agreed in Beijing on Tuesday to work to increase reliance on renewable sources of energy, underscoring a commitment to renewables after oil prices hit record highs.
The draft statement stopped short of setting a firm goal but it recommended the U.N. Commission on Sustainable Development consider the launch of a 10-year framework to "substantially increase the use of renewable energy."
The Beijing Declaration was the culmination of a two-day international conference that was a follow-up to meetings in Johannesburg in 2002 and last year in Bonn that aim to promote cooperation on renewable energy.
"The 10-year framework is much more specific than Bonn. They now have an official request of the UN Commission that feeds back into the UN system," Christine Woerlen, of the Global Environment Facility, told Reuters on the sidelines of the meeting.
The statement also did not set a target for investment in the renewables sector, though it stressed the need for funds for research and development, support for commercialization of new technologies and the transfer of technologies from rich nations to poor.
"Targets and timetables do matter. But there is a dispirited feeling that the U.S. just rejects multilateral target-setting for the time being," said James Cameron of Climate Change Capital, a UK-based merchant bank that focuses on energy and the environment.
Nonetheless, he said the commitment to renewable forms of energy such as solar and wind power was growing.
"Years ago, there wasn't the same solidarity about exposure to oil price risk, exposure to climate risk, the manifest air pollution problems. Those are powerful confluences," he said.
The world will need massive investment in infrastructure to meet surging energy demand, otherwise it will face soaring greenhouse gas emissions, increased dependence on the volatile Middle East for fuel and even higher prices, the International Energy Agency said in a long-term outlook on Monday.
Global investment in renewable energy hit a record $30 billion last year, accounting for 20-25 percent of all investment in the power industry, according to a Worldwatch Institute report released on Sunday.
Although renewable forms of energy are still more expensive than coal and oil, the Beijing Declaration acknowledged that record high global oil prices were focusing attention on alternative sources of power.
"We also note with concern that recent trends in the world energy market, especially the doubling of oil prices in less than two years, has increased the economic risk of relying primarily on imported energy and a volatile world energy market," it said.
On Monday, China, which is the world's second-largest emitter of greenhouse gases after the United States, raised its target for renewable energy, saying it should account for 15 percent of national consumption by 2020.
Some delegates said the conference was only meaningful if there was a commitment to similar targets globally and more concrete pledges on technology transfer.
"For the developing world, we want concrete terms," said Ramialiarisoa Harivelo, of Madagascar's Energy Ministry.
"We don't want declaration after declaration." | 科技 |
Seascape with methane plumes
by John Michael Greer, originally published by The Archdruid Report
Apr 26, 2012
In the wake of last week’s post, I’d meant to plunge straight into the next part of this sequence of posts and talk about the unraveling of American politics. Still, it’s worth remembering that the twilight of America’s global empire is merely an incident in the greater trajectory of the end of the industrial age, and part of that greater trajectory may just have come into sight over the last week.

Some background might be in order. For several years now, it has been possible for ships to sail from the northern Atlantic to the northern Pacific via the Arctic Ocean in late summer and early autumn. In the great days of European maritime exploration, any number of expeditions wrecked themselves in Arctic ice in futile attempts to find the fabled Northwest Passage; now, for the first time in recorded history, it’s a routine trip for a freighter, and as often as not the route is blue water all the way without an ice floe in sight. (Somehow global warming denialists never get around to talking about this.)

Last autumn, though, crew members aboard several ships reported seeing, for the first time, patches of sea that appeared to be bubbling, and initial tests indicated that the bubbles were methane. This was a source of some concern, since methane is a far more powerful greenhouse gas than carbon dioxide, there’s a great deal of it trapped in formerly frozen sediments in the Arctic, and the risk of massive methane releases from the polar regions has played a substantial role in the last decade or so of discussions of the risks of global warming.

Word of the bubbling ocean up north got briefly into the media, and provoked a fascinating response. The New York Times, for example, published a story that mentioned the reports, and then insisted in strident terms that reputable scientists had proven that the methane plumes were perfectly normal, part of the Arctic Ocean’s slow response to the warming that followed the end of the last ice age. This same “nothing to see here, move along” attitude duly appeared elsewhere in the media. What makes this fascinating is that the New York Times, not that many years earlier, carried bucketloads of stories about the threat of climate change, including stories that warned about the risk that the thawing out of the Arctic might release plumes of methane into the atmosphere.

Weirdly, this same reversal seems to have guided the response – or more precisely the nonresponse – of the climate change activist community to these same reports. It might seem reasonable to expect that global warming activists would have leapt on these initial reports as ammunition for their cause; when initial estimates suggested that global warming would melt the glaciers of the Himalayas and deprive India of much of its water supply, certainly, a great deal was made of those claims. Still, that’s not what happened. Instead, a great many people who a few years ago were busily talking about the terrible risk of methane releases from the Arctic suddenly found something else to discuss once those methane releases stopped being a purely theoretical possibility.

Fast forward to this spring. After yet another unseasonably warm Arctic winter, Russian scientists are busy studying the methane releases reported last fall, and initial reports – well, let’s understate things considerably and call them “rather troubling.” Areas of open water up to a kilometer across are fizzing with methane, a condition that one experienced Arctic researcher, Dr. Igor Semiletov, described as completely unprecedented.
Another team of researchers, flying a plane with methane sensors over the disintegrating ice cap, has tracked plumes of methane rising into the atmosphere wherever the ice is broken. The amounts detected, they comment, are significant enough to affect global climate. Is this unsettling news being splashed around by the same mainstream media that, only a few years ago, were somberly warning about the risks of global climate change, and trumpeted from the rooftops by climate change activists as proof that their warnings were justified? Not that I’ve heard. In fact, according to recent media reports, James Lovelock – creator of the Gaia hypothesis and author of books painting worst-case global warming scenarios in spectacularly lurid terms – has just announced that, well, actually, he overstated things dramatically, so did other climate activists such as Al Gore, and global warming actually won’t be as bad as all that.

In order to make sense of this curious reversal, it’s going to be necessary to take a hard look at some of the less creditable dimensions of the climate change movement. I should say first that as far as I can tell, the great majority of ordinary people who got involved in the climate change movement were guided by the most sincere and sensible motives. Dumping billions of tons of fossil carbon into the atmosphere was a dumb idea all along; pretending that all that carbon could be dumped there without disrupting the subtle and complex balance of the world’s climate was even dumber; and the response to those paired stupidities included a great deal that was praiseworthy. Equally, as far as I can tell, the great majority of scientists whose efforts have helped to prove the reality of anthropogenic climate change have produced honest and competent research, and even the minority that hasn’t met this standard rarely managed to rise, or rather sink, to the levels of cherrypicking, obfuscation, and outright fiction routinely found in climate change denialist literature.

That being said, there’s more going on in the world of climate change activism than the honest concern of citizens and the honest labor of researchers, and it’s past time to examine the reasons why the climate change movement got so large and accomplished so little. In the process, we’ll be touching on issues that bear directly on the broader theme I’ve been developing in the last few months, because the rise and fall of climate change activism over the last decade or so has an uncomfortably great deal to do with the mechanisms of empire and the balance of power in a strained and fraying global political system.

Until the end of the 1990s, climate change was simply one more captive issue in the internal politics of industrial nations. The political role of captive issues, and the captive constituencies that correspond to them, is too rarely discussed these days. In the United States, for example, environmental protection is one of the captive issues of the Democratic Party; that party mouths slogans about the environment, and even though those slogans are rarely if ever followed up by concrete policies, environmentalists are expected to vote Democratic, since the Republicans are supposed to be so much worse, and willingly play the part of bogeyman. The Republican party, in turn, works the same good cop-bad cop routine on its own captive constituencies, such as gun owners and Christian fundamentalists, and counts on the Democrats to act out the bogeyman’s role in turn.
It’s an ingenious system for neutralizing potential protest, and it plays a major role in maintaining business as usual in the world’s democratic societies. After the year 2000, though, global climate change got coopted on a grander scale, as the rise of a handful of nonwestern nations to great power status put growing pressure on the United States and its allies. China is the most widely recognized of these, but India and Brazil are also emerging powers; meanwhile Russia, which was briefly subjected to an Anglo-American wealth pump after the collapse of Communism and nearly got bled dry, managed to extract itself in the late 1990s and has been clawing its way back to great power status since then. Faced with these rising or resurgent powers – the BRIC (Brazil, Russia, India, China) nations, as they were called – the United States and its inner circle of allies have tried a number of gambits to keep them in their former places. Historically speaking, war is the usual method for settling such issues, but that isn’t a useful option this time around. Even if nuclear weapons weren’t an issue, and of course they are, I suspect too many people in the Pentagon still remember what happened the last time the US military went head to head with the People’s Liberation Army. (Readers who have no idea what I’m talking about will want to read up on the Korean War.) That left trade policy as the next logical line of defense, and so the late 1990s saw a series of attempts by the US and its allies to use global free trade treaties to put the rest of the world at a permanent economic disadvantage. That effort ran into solid resistance at the 1999 World Trade Organization ministerial talks in Seattle, and collapsed completely four years later. Those of my readers who remember how the WTO talks at Cancun in 2003 crashed and burned may have experienced deja vu when the climate talks at Copenhagen in 2009 did exactly the same thing. The resemblance is not accidental. In the years leading up to the Copenhagen climate talks, the US and its allies argued that it was necessary to replace the Kyoto protocols of 1997 – which mostly restricted carbon emissions from the industrial nations – with a new set that would apply to industrializing countries as well. This was fair enough in the abstract, but the devil was in the details: in this case, the quotas that would place China, India, and other industrializing nations at a permanent disadvantage, and grandfather in the much higher per capita carbon emissions of the United States, Europe and Japan. Environmental rhetoric has been used for such purposes often enough in the past. One of my college ecology textbooks, copyright 1981, mentions ruefully that attempts to pressure Third World nations into enacting strict environmental protections had come to be recognized by those nations as simply one more round of attempts to keep them in a state of permanent economic dependence. While there was more going on than this – the environmental movement in general, like the climate change activist movement in particular, has always included a large number of idealists with the purest of motives – it’s a safe bet that the Third World nations were broadly correct in their assessment, as none of the industrial nations that exerted the pressure ever proposed, let’s say, to forbid their own nationals from exporting environmentally destructive products to the Third World. 
The stakes at Copenhagen, in other words, were rather different from those discussed in the media, and the outcome could have been predicted from the debacle six years earlier at Cancun. When it became clear to the major players that the United States and its allies were not going to get what they wanted, the entire process fell apart, leaving China to seize the initiative and offer a face-saving compromise that committed neither bloc to any limits that matter. Afterwards, since climate change had failed to keep the BRIC nations at bay, the US dropped the issue like a hot rock; the financial hangover of the housing bubble made climate change lose its appeal to the Democratic Party; and activists suddenly discovered that what they thought was a rising groundswell of support was simply the result of being temporarily funded and used for somebody else’s political advantage. Claims that large-scale methane releases from the warming Arctic would send the planet’s climate spinning out of control played a significant role in both the domestic and the international rhetoric of climate change during the time the movement was coopted, and got dropped along with the movement once it was no longer useful. The same claims, though, also played a broader role in mobilizing citizen activism and scientific concern, and the reasons why nobody outside the corridors of power is talking about the methane plumes deserve some attention as well. What’s at work here is the basic structure of contemporary activism itself. Pick nearly any issue that inspires activism nowadays, and you’ll find that it fits into a strict and stereotyped narrative. It centers on something bad that’s going to get much worse if nothing is done, and the “much worse” generally ends up described in ever more luridly apocalyptic terms as the movement proceeds. Victory for the movement, in turn, is defined for all practical purposes as preventing the worst case scenarios the movement itself offers up; high-level abstractions such as “peace” and “justice” get a lot of play, but it’s very rare for there to be any kind of meaningful vision of a goal to be sought, much less a pragmatic plan for getting there. Opposing the bad, for all practical purposes, replaces seeking the good. Those of my readers who followed the discussion of the tactics of magic in last autumn’s Archdruid Report will doubtless be able to think of several good reasons why this approach is problematic, but there’s another dimension to the problem. In contemporary activism, the worst case scenarios that play so large a part in the rhetoric are there to pressure people into supporting the movement. In climate change activism, certainly, that was the case. Read James Lovelock’s more recent and strident books, or any of the good-sized bookshelf of parallel literature, and you’ll find the claim that failing to support the climate change movement amounts to dooming the planet to a hothouse future in which, by 2100, the sole surviving human beings are a few “breeding pairs” – that’s Lovelock’s phrase – huddled around the tropical shores of the Arctic Ocean, with catastrophic methane releases from the Arctic regions among the driving forces behind that lurid scenario. It’s a compelling image, but once methane plumes actually start boiling up through the waters of the Arctic Ocean, you’ve just lost your rationale for further activism – or, really, for anything else short of jumping off the nearest bridge. 
That’s the dilemma in which the news from the Arctic has landed climate activists. Having by and large bought into the idea that once the methane starts rising, it’s all over, they have very few options left. It’s a self-created dilemma, though, because methane releases aren’t a new thing in the planet’s history. If it’s true that, as George Santayana said, those who forget their history are condemned to repeat it, it’s equally true that those who forget their paleoecology are condemned not to notice that they’re repeating it – and in this case, as in many others, it helps to have a good basic knowledge of what happened the last time large scale methane releases coincided with a period of planetary warming.

That wasn’t that long ago, as it happened. The end of the last ice age saw sharp increases in methane concentrations in the atmosphere, the rapid melting of continental glaciers, and a steep rise in global temperature that peaked around 6,000 years ago at levels considerably higher than they are today. A controversial theory, the “clathrate gun” hypothesis, argues that the warming was triggered by massive methane releases from the oceans. Whether or not that was the major factor, ice cores from Greenland document rising levels of methane in the air around the same time as the stunningly sudden global warming – an increase of more than 15°F in global average temperatures in less than a decade – that triggered the final collapse of the great ice sheets.

The first point to grasp from this is that methane releases aren’t the end of the world. Our ancestors got through the last rounds of it without any sign of massive dieoff, and it’s been argued that the nearly worldwide legends of a great flood may embody a dim folk memory of the vast postglacial floods that took place as the ice melted and the seas rose. For that matter, during most of Earth’s history, the planet has been much hotter than it is now; only a few tens of millions of years ago – yes, that’s practically an eyeblink in deep time – crocodiles sunned themselves on the subtropical shores of Canada’s north coast, at a time when Canada was nearly as close to the North Pole as it is today.

Thus Lovelock’s extreme scenario deserves the label of “alarmist” that he himself put on it in the interview cited above. On the other hand, that doesn’t mean that a methane spike in the Arctic can simply be ignored. Since the dim folk memories that might be embodied in flood legends are the only records we’ve got for the human experience of abrupt global warming, we simply don’t know how fast the temperature shift might affect, for example, the already unstable Greenland ice sheet, which contains enough water to raise sea level worldwide by around 30 feet. Some theoretical models argue that Greenland’s ice will melt slowly, while others argue that water pooling beneath the ice could cause huge sections of it to slide off into the sea in short order, filling the North Atlantic first with icebergs, then with meltwater. Which model is correct? Only Gaia knows, and she ain’t telling. Equally, we don’t know whether the melting of the Greenland ice sheet will make nearby continental shelves unstable, as it did the last time around, and reproduce the same set of conditions that caused gargantuan tsunamis at the end of the last ice age.
There’s abundant evidence for these; one of them, according to recent research, flooded the North Sea and carved the English Channel in a single day around 8000 years ago; we don’t know how soon those might become a factor around the Atlantic basin, or even if they will. It’s unsettling to realize that we may have no way of finding out until the first one hits.

All that’s certain at this point is that something potentially very troubling is happening in Arctic waters, and the possibility that it might have destructive consequences on a local, regional, or continental scale can’t be ruled out. Panic is the least useful response I can think of, so I’ll say this very quietly: if the news from Arctic waters in the months and years to come suggests that things are moving in the wrong direction, and those of my readers who live close to the shores of the northern Atlantic basin happen to have the opportunity to move inland or to higher ground, it might not be unreasonable to do so.

****************

On a different topic, the folks at Scarlet Imprint tell me that they’ve still got a few remaining unsold copies of the handbound deluxe “Black Gold” edition of my book The Blood of the Earth: An Essay on Magic and Peak Oil. I know it’s a chunk of money, but there’s something to be said for a book crafted to standards high enough that it’ll still be readable long after industrial civilization has faded into memory. If that interests you, might be worth considering.

****************

End of the World of the Week #19

Nostradamus, who’s featured in the last two weekly Ends of the World here, has also had a remarkable track record for inspiring false prophecies in others – and I’m not just thinking of the cheap tabloids that trot out newly manufactured prophecies with his name on them every few months. Many Nostradamus researchers have embarrassed themselves once they moved from trying to force-fit quatrains onto the past, and attempted to use the French prophet’s writings to anticipate the future.

One example is Henry C. Roberts, whose The Complete Prophecies of Nostradamus saw print in 1994. After careful study of the quatrains, Roberts came to believe that Nostradamus had infallibly predicted a dramatic event in the near future: the election of Edward Kennedy as president of the United States. (You’ll find this prediction on pages 210 and 218 of Roberts’ book.) Any chance Roberts might have had at a reputation for infallibility went away when Kennedy died in 2009, having never gotten closer to the White House than a failed 1980 run for the Democratic nomination.

Oddly enough, a failed Nostradamus prophecy concerning Edward Kennedy also featured in pop musician Al Stewart’s 1973 piece Nostradamus:

In the new lands of America three brothers now shall come to power
Two alone are born to rule but all must die before their hour

It’s not hard to figure out who’s being discussed, but Edward Kennedy died at the age of 77.

—story from Apocalypse Not
Franklin Museum's gadgets, presentations bring visitors
PHILADELPHIA – Setting Ben Franklin loose in the new Philadelphia museum dedicated to his accomplishments would be as electric as giving him another kite to fly during a lightning storm. He would get a charge out of it.
The Benjamin Franklin Museum's techno gadgets and virtual presentations bring visitors up to speed on one of Philadelphia's most famous residents in a style that would wow Franklin himself.

Strolling from room to room of the underground museum in Franklin Court, he could tap plenty of touch screens, chuckle along with the animated and amusing film segments told in his own voice and play matching games about his life.

But the museum is much more than techno-thrills of playing "Yankee Doodle Dandy" on a virtual armonica (a musical instrument Franklin invented) or seeing your name appear on a computer screen, upside-down and backward in an old-style type, as if Franklin hand-set it for you during his days as a printer.

The new museum is a total "re-imagining" of the former Underground Museum built in 1976. It was two years in the making, with funding from the National Park Service, charitable organizations, Pennsylvania and Philadelphia.

It visits all aspects of Franklin's life (citizen, printer, inventor, author, statesman and philosopher), surveys his accomplishments and sums them up in a style that's engaging and easy to understand.

Not far from Independence Hall, the Liberty Bell, the President's House and other Independence National Historic Park sites, the new highlight of Franklin Court is well worth a visit.

The museum, which opened Aug. 24, corrects common misconceptions. Franklin did not discover electricity, as many people think. However, he invented the lightning rod and discovered the importance of grounding it. Internationally known for additional work with electricity, he also invented the way to store electricity and called it a "battery."

Although he never was president of the United States, he served as "president" of Pennsylvania in the days before governors took charge.

You'll find fascinating stories about Franklin's always-on brain that was churning with ideas and inventions to improve daily life. When he identified a problem, he found a solution. He was having difficulty with his vision. Eureka! He created bifocals. Houses in the 1700s were smoky and cold. Eureka again! He created the Franklin stove. He couldn't reach books on the top shelves in his library. He created a long-handled reacher. You get the idea.

Animated, cartoon-like clips offer humorous and interesting anecdotes drawn from Franklin's letters and are told in his "voice."

Did you know Franklin flew a kite for a totally different experiment? While swimming, he used one to harness the wind and effortlessly cross a mile-wide pond. He also created the first swim fins and hand-paddles, which he noted were awkward but did help him swim faster.

Fascinated by whirlwinds (mini tornadoes), Franklin rode his horse in pursuit of one for nearly three-quarters of a mile. He stopped only because he feared he or his horse would be hurt by the branches and debris spewing from it.
The museum even incorporates Franklin's love of gray squirrels, which were called "skuggs" in colonial times. Visitors will meet Skuggs, a "tour guide" whose image is used to direct parents and children along "paths" to family activities throughout the museum.

"The museum is for visitors from 3 to 103," says curator Page Talbott. "We want them to come away, not feeling as if they have been taught, but as if they've just had an engaging encounter with Franklin and have come to know him better."

Some highlights of the museum's 45-item artifact collection: a family Bible that Franklin bought for his daughter; a mastodon tooth fossil from Franklin's collection; a sedan chair used to transport him to the Constitutional Convention (the convention's oldest member, Franklin suffered terrible pain from gout) and a glass generator Franklin designed and used for his electrical experiments.

Although the pieces are small, Franklin's chess set is one of the artifacts that looms large in his life. He observed, "Life is a kind of chess," and claimed playing the game made him a better representative for the colonists and diplomat for the United States. Why? He said it honed his skills to think strategically, anticipate moves during negotiations and check himself from making rash decisions. An observer wrote: "His passion for late-night chess games was checked only by his supplies of candles."

The museum steers away from a heavy time-lined approach to telling Franklin's story. Instead, you'll learn of his times and accomplishments as they exemplify his most outstanding character traits, including "ardent and dutiful," "ambitious and rebellious," "motivated to improve," "curious and full of wonder" and "strategic and persuasive."

The one-story museum's main entrance and gift shop are on the west side of Franklin Court and next to the "ghost house" structures representing Franklin's home and print shop. (His Philadelphia home was demolished in 1812.) But the heart of the museum is underground and arranged to suggest different rooms of Franklin's house.

Deciding how to tell Franklin's story was difficult, according to Cynthia McLeod, superintendent of Independence National Historical Park. Trying to quantify Franklin's greatness, McLeod struggled to find one modern figure whose qualities could match Franklin's. "The best I could do was create a composite from the best of Steve Jobs, Bill Clinton, Henry Kissinger, Ted Turner, Katherine Graham, Jon Stewart …" Her voice trails off.

Museum displays show Franklin lived the advice he gave, including, "Be frugal and industrious and you will be free." He mastered the printing trade, lived on a tight budget in his early days and chose a healthier, less-expensive diet to become a stronger, more productive worker. It all paid off. He retired at 42. Although he told his mother, "I'd rather be useful than rich," he was both.

Taking in the rest of the rooms and their contents exemplifies another of his sage sayings: "If you don't want to be forgotten as soon as you are dead and rotten, either write things worth reading or do things worth the writing."
Notes McLeod, "It's amazing to realize a man born 307 years ago could be so recognized and relevant today."

However, there's one place in the museum that could leave his admirers in a quandary about following his advice. Would it be smarter to save the money, or spend the money for one of the gift shop's piggy banks, emblazoned with: "A penny saved is a penny earned"?
Update: Apple says 2 new iPhone models coming this year
AP with KPCC staff | AP
Apple CEO Tim Cook speaks about the new iPhone during an Apple product announcement at the Apple campus on September 10, 2013 in Cupertino, California. The company launched two new iPhone models that will run iOS 7. The 5C is made from a hard-coated polycarbonate and comes in five colors. The 5S comes in three colors, features a fingerprint sensor, has an upgraded camera, and contains an A7 chip. Justin Sullivan/Getty Images
Apple CEO Tim Cook revealed the company plans to release two new iPhone designs later this year so it can serve more customers. The iPhone 5C will be cheaper, at $99 for a 16 gigabyte model and $199 for a 32 gigabyte model. Apple's new mobile operating system, iOS 7, is slated for release Sept. 18.
10:54 a.m.: Apple says 2 new iPhone models coming this year
8:53 a.m.: Apple's next big thing may be lower-priced iPhone
Apple says it is planning to release two new iPhone designs later this year so it can serve more customers.
One is the iPhone 5C, which will be available in five colors — green, blue, yellow, pink and white. CEO Tim Cook calls it "more fun and colorful" than any other iPhone. The 5C has a 4-inch Retina display and is powered by Apple's A6 chip. It also has an 8 megapixel camera, live photo filters and a rear cover that lights up.
The iPhone 5C will cost $99 for a 16 gigabyte model and $199 for a 32 gigabyte model with a two-year wireless contract.
LA Times reporter Chris O'Brien live-tweeted the event at Apple's headquarters in Cupertino, Calif.:
Federighi: iOS7 coming.....Sept. 18!
— Chris O'Brien (@obrien) September 10, 2013
iPhone 5C starts at $99 for 16GB. Five colors. And they're "Android Free" Schiller says.
Cook: iOS7. Coming later this month. It will become world's most popular operating system. #iPhone5S #iPhone2013 $AAPL — Chris O'Brien (@obrien) September 10, 2013
— Barbara Ortutay and Michael Liedtke, AP technology writers; KPCC staff
8:53 a.m.: Apple's much-anticipated update to its lineup of iPhones may leave the impression that the technology pioneer's focus has shifted toward making more affordable products rather than engineering innovative breakthroughs.
In keeping with its tight-lipped ways, Apple Inc. hasn't disclosed what's on the agenda for the coming-out party scheduled to begin at 10 a.m. PDT at its Cupertino, Calif., headquarters today. But this is the time of year that Apple typically shows off the latest generation of its iPhone, a device that has reshaped the way people use computers since its debut in 2007. Apple took the wraps off the iPhone 5, the current model, last September. The company has never waited longer than a year to update the iPhone, which has generated $88 billion in revenue during the past year.
Last year's unveiling of the iPhone 5 was held at the Yerba Buena Convention Center in downtown San Francisco, according to Mashable. Today's event will be held at Apple's headquarters in Cupertino.

Apple's timetable for rolling out products has vexed many investors who have watched the company's growth slow and profit margins decrease. Meanwhile, a bevy of smartphone makers, most of whom rely on Google Inc.'s free Android software, release wave after wave of devices that cost less than the iPhone. Those concerns are reflected in Apple's stock price, which has declined nearly 30 percent since peaking at $705.07 at about the same time the iPhone 5 went on sale last year. The Standard & Poor's 500 index has risen about 14 percent during the same stretch.
RELATED: Apple and Samsung's war for the smartphone market
Even though Apple's market value of roughly $460 billion is more than any other company in the world, the deterioration in its stock price is escalating the pressure on CEO Tim Cook to prove he's the right leader to carry on the legacy of co-founder Steve Jobs. Since Cook became CEO two years ago, Apple has only pushed out new versions of products developed under Jobs, raising questions about whether the company's technological vision has become blurred under the new regime.
In public appearances, Cook has repeatedly said Apple is working on some exciting breakthroughs, but he hasn't revealed details. The company is believed to be working on a so-called "smartwatch" that would work like a wrist-bound smartphone. Samsung Electronics, one of Apple's biggest rivals, introduced its own $300 smartwatch called Gear last week, as did Sony and Qualcomm Inc. It's unclear whether a smartwatch will be on Apple's Tuesday agenda.
The company isn't expected to reveal the latest model of its tablet computer, the iPad, until later in the fall. Apple introduced a smaller, less expensive version of the iPad last year in response to the success of more compact and cheaper tablets running on the Android system.
This year's refresh of the iPhone line may address the growing popularity of cheaper Android phones. Based on leaks from suppliers, it appears Apple is poised to release a less elaborate and less expensive version of the iPhone in an attempt to appeal to consumers too frugal or too poor to pay for the high-end model that sells for more than $600 without a wireless contract.
If reports published in technology blogs and newspapers pan out, the stripped-down iPhone will be called the "5C" and be housed in plastic casing that will be offered in a variety of colors instead of an aluminum casing.
Apple declined to comment, but an invitation for Tuesday's event fed the multi-hued speculation swirling around the less expensive iPhone. The invitation was filled with colored bubbles and predicted, "This should brighten everyone's day."
If it introduces a cheaper iPhone, Apple might end production of the iPhone 4 and iPhone 4S that were released in 2010 and 2011, respectively. Those models have been sold at a discount to the iPhone 5, a factor that has lowered the average price Apple has fetched for its phones.
A new version of the high-end iPhone also is expected to be revealed today. The top-of-the-line model, expected to be dubbed the "5S," will be the first to be sold with Apple's revamped mobile software, iOS 7, already installed. The new system, which will automatically update apps installed on the device, can be downloaded on the iPhone 4 and later models, as well as on the tablets beginning with the iPad 2.
The redesigned software announced in June relies on simple graphical elements in neon and pastel colors. Gone is the effort to make the icons look like three-dimensional, embossed objects — a tactic known as "skeuomorphism," that was favored by Jobs. This will be the second iPhone model that Apple has released since Jobs' death in October 2011.
Besides running on iOS 7, the upgraded iPhone may include technology that enables its owner to unlock the device with a fingerprint instead of a four-digit code. There is also speculation that the high-end iPhone will be sold in a golden color to supplement the product line's more prosaic choice of black or white.
"One of the big questions is whether Apple is going to push the envelope on the iPhone or do they feel they have pretty much gone as far as they can go on the smartphone side of things?" said Gartner Inc. analyst Carolina Milanesi.
If there is a gold iPhone, it would be the latest sign of Apple's intensifying focus on China — a market where hundreds of millions of Internet-connected devices are eventually expected to be sold as the standard of living improves in the world's most populous country. The color gold is considered to be a sign of good fortune in China.
A less expensive iPhone would also help Apple boost sales in China and other less-developed countries where people don't have as much disposable income as in the U.S. and Europe.
In an unusual move, Apple has invited media to another event in Beijing that will be held a few hours after the gathering at its headquarters is scheduled to adjourn. The Beijing event has fed speculation that Apple has lined up a deal to sell its new iPhones through China Mobile, the country's largest wireless carrier. It is an alliance that Cook has been openly courting. The Wall Street Journal last week cited anonymous people who said Apple is preparing to ship iPhones to China Mobile.
Although Apple still touts the iPhone as the best of its breed, the device has been losing some of its panache among consumers.
In the three months ending in June, Apple sold 31 million iPhones worldwide compared to 187 million Android phones made by the likes of Samsung, HTC and LG Electronics, according to the research firm International Data Corp. That left the iPhone with 13 percent of the global market, down from 17 percent at the same time last year. Android phones held a 79 percent share, up from 69 percent last year, according to IDC.
In the brains of mice grow the cells of man / Embryonic implants mature into neurons to help fight diseases
Carl T. Hall, Chronicle Science Writer
Published 4:00 am, Tuesday, December 13, 2005
Researchers in San Diego have designed mice containing fully functional human nerve cells as a novel way to study and potentially treat neurodegenerative diseases such as Parkinson's and Alzheimer's. The neurons were formed in the brains of mice that had been injected with human embryonic stem cells as 2-week-old embryos. Studies at the Salk Institute for Biological Sciences in La Jolla showed that the human cells migrated throughout the mouse brain and took on the traits of their mouse-cell neighbors. The results present direct evidence that primitive human stem cells can be cultured in the lab, be injected into an animal, and then develop into a particular type of desired cell.
The report appears in this week's Proceedings of the National Academy of Sciences. Scientists said it was the first time cultured human embryonic stem cells have been shown to develop into a particular type of cell in the body of another living species.

Creation of a so-called "mouse-human chimeric nervous system" stops well short of spawning a mouse with a human-like cerebral cortex. In fact, all the brain structures of the four mice used in the Salk experiments had been formed before the human cells were injected, and less than 0.1 percent of the mice brain cells were found to be of human origin.

Yet the new experiments approach an ethical divide that makes some observers squeamish. The term chimera comes from a creature in Greek mythology that had the head of a lion on the body of a goat. Such monstrous connotations have been tamed somewhat in the modern era of pig valves for heart patients and protein drugs manufactured in hamster cells. Still, some critics of human embryonic stem cell research argue that more attention should be paid to the ethics and potential dangers of cutting-edge biomedical research.

"Where are the lines, and how do we decide where the lines are?" wondered Jennifer Lahl, a bioethicist at the Center for Bioethics and Culture Network in Oakland. "What if someone decides to start doing this for art? I'm glad science has progressed to the level that we can do this incredible stuff, but we also have to be a lot more thoughtful about it."

The bioethical implications aside, the findings may have broad interest in the stem cell field because they suggest that stem cells respond to much the same signals in mice as in humans -- if not all mammals. Previous development or "differentiation" studies of stem cells into the various cell types of the body have been done primarily in laboratory dishes. Researchers now hope to discover how brain disorders develop out of the mysterious interaction of neurons and their surroundings.

Two types of experiments are envisioned: Healthy human cells might be injected into animal brains carrying human-like genetic disorders; and cells carrying disease traits, perhaps derived through cloning techniques from cells of living patients, might be put into normal animal brains.

"Is it a diseased environment that influences nondiseased cells, or are diseased cells hurting a healthy, intact environment?" asked Fred Gage, a Salk Institute professor and co-head of the institute's Laboratory of Genetics, who was senior author of the new study.

Similarly, if a drug candidate is found, it might first be given to chimeric mice to study the drug's effect on human cells in a living system before proceeding to human trials.

Gage said care was taken to ensure that the human-mouse brain experiments were done ethically and noted that the experimental design was approved in advance by an independent ethics review board sponsored by the Salk Institute. Begun in 2003, the work predated the appearance of a National Academies report on stem cell ethics, which has become the tentative guideline of stem cell programs including the California Institute for Regenerative Medicine, created by Proposition 71 in 2004.

Gage said a special embryonic stem cell ethics review panel was set up at Salk as suggested by the National Academies, and approved the experiments after they were done. Also, in accord with the guidelines, the mice in the experiments were isolated in separate cages so they couldn't breed, to avoid the possibility of creating lines of chimeric animals.
This was done even though it would be highly unlikely that human cells -- which in this case were injected directly into the fluid-filled ventricles of the mouse brains -- would alter mouse reproductive cells.

Despite the precautions, leading bioethics experts said careful reviews by someone outside the laboratory are critical before such experiments are conducted. "This kind of research should not be done without proper ethical oversight," said David Magnus, director of the Stanford Center for Biomedical Ethics. "These are very, very tricky issues." He noted that mice are routinely bred with human immune systems, but the nervous system is another matter. In this set of experiments, he said, the important question would be not whether the mice had human cells, but whether any "human-like structures were starting to show up in the mouse brain."

The experiments by Gage and his lead collaborator, Alysson Muotri, also of the Salk Institute, may be reassuring on that point. Their work found that a developing mouse brain can tolerate a few human cells, and remain a mouse brain, because the human cells proved to be incapable of restructuring the host.
PSN outage, two weeks and counting
After 16 days of the PlayStation Network downtime, we take a look back at the various time frames we've been given, and what the service restoration will mean for you.
When the PlayStation Network was pulled on April 20, no one could have predicted how long the outage would last, least of all Sony. Now it's been down for over two weeks -- 16 days so far -- and it's looking increasingly like we'll continue to be out over the weekend. Sony assured that the network entered the final stages of internal testing last night. But if the company is hesitant to flip the switch just before the weekend (which would be understandable given the security headaches so far), we'll be looking at almost three straight weeks of PlayStation Network downtime.

Even in those 16 days so far, we've seen a flurry of news breaking on an almost daily basis. It's ironic at this point, but Sony has actually offered multiple targets to get the network back online, many of them too ambitious. First it was "a day or two" on April 21, which was obviously off. We didn't know at the time, but Sony was in the midst of hiring consulting firms to investigate the breach.

By April 26, Sony claimed to be aiming to have some services back online within a week. Then, in an update on April 30, we heard that services were to be restored this week. It's too early to call that final target as missed yet.

When the service does return, you can expect the Welcome Back program to begin, including a free month of PlayStation Plus, and so-far-undisclosed free content. You'll also be required to change your password, using the PlayStation 3 that is associated with the account, for good measure. This morning, Sony also outlined identity theft insurance for all affected users.

Check out our in-depth timeline for more details on the PlayStation Network data breach. You can also hear Garnett and the crew discussing the ongoing PSN plight on this week's Weekend Confirmed.

Steve Watts
Why We Should Hold off on Manned Space Exploration for Now
Slate | Future Tense: The citizen's guide to the future | Sept. 19, 2013, 8:08 AM
FROM SLATE, NEW AMERICA, AND ASU
To Boldly Go Nowhere, for Now
Why we should hit pause on manned space exploration.
By Srikanth Saripalli
Future space explorers should be somewhere between human astronauts and robots like NASA's Curiosity rover on Mars. Courtesy of NASA/JPL-Caltech/MSSS

On Aug. 20, NASA's administrator formally welcomed the newest candidates of the astronaut corps and released a space exploration roadmap that includes robotic and human missions to destinations that include near-Earth asteroids, the moon, and Mars.
But given the success (both scientific and in the popular imagination) of Curiosity on Mars, we have to wonder: Is human space exploration really necessary? Can’t we just send robots for exploration and let them do the dangerous work?
Most of the arguments in favor of manned space exploration boil down to the following: a) We need to explore space using people since keeping the entire human race on a single piece of rock is a bad strategy, and even if we send robots first, people would have to make the journey eventually; and b) humans can explore much better than robots. Both these arguments are very near-sighted—in large part because they assume that robots aren’t going to get any better. They also fail to recognize that technology may radically change humans in the next century or so.
The first claim is based on the assumption that placing all our bets on Earth is a bad strategy. That is probably true. But there are already folks who are willing to be vitrified so that they can be immortal by transplanting their brain into a fresh (robotic) body. The Russian billionaire Dmitry Itskov hopes to do so by 2035 or 2045. Cryonics, or the science of preserving human beings, has been endorsed by numerous scientists. This is fringe science, to be sure. But even if one does not believe that we will have fully robotic bodies in the next 20 or 30 years, it is not far-fetched to think that at least some of us might be a combination of robotic and human systems—yes, cyborgs—in 100 years or so. Researchers like professor Kevin Warwick of the United Kingdom have been working on such brain-computer interfaces for the past decade. Ray Kurzweil in his book The Singularity Is Near predicts that human beings will soon "transcend biology" and traverse the universe as immortal cyborgs. This has far-reaching implications for space travel: One can imagine cyborgs (with human consciousness) that are able to explore inhabitable planets such as Venus and Jupiter or can travel for centuries to the furthest galaxies.
Given that the future of our bodies is uncertain, it makes more sense to send robots with intelligence to other planets and galaxies. Nature has built us a certain way—we are best-suited for our planet "Earth." Future space explorers will quickly realize that the human body is not the perfect machine for these environments. We will also want to explore other planets such as Venus and maybe even think about living on those planets. Rather than make those planets habitable, does it not make sense to purposefully evolve ourselves such that we are habitable in those worlds?
The second argument in favor of manned space exploration—that human eyes can be more thorough—is based on the past robotic and human missions to the moon. Several articles in the popular press have argued that humans on the moon have produced far more scientific data than the robots on Mars. While this is true, the robots that have been used till now are not at all "autonomous" or "intelligent" in any sense. They are complex machines that are controlled carefully from Earth; each instruction and move made by these rovers on Mars is first tested carefully and then uploaded. These are no different from the industrial welding machines of automobile plants or the drones used in Afghanistan. Indeed, we are very far from having autonomous robots on planetary missions, but such machines are being built in university labs every day. Robot Magellans (with scientific skills to boot) could be here long before colonists take off for Mars.
A third argument that is rarely discussed, but that everyone agrees on, is that human exploration of space provides a valuable public relations opportunity. Contrary to popular belief, there never has been a groundswell of popular support from the general public for the space program. Even during the Apollo era, more people were against the space program than for it. Getting robots into space costs a lot less than humans and is safer —so we can keep the space program going without creating budgetary battles.
So what will the future space exploration robots look like? They will look nothing like the rovers that are on Mars today. While NASA is interested in sending big missions with large robots to accomplish tasks, I believe future robots will be smaller, "distributed," and much cheaper. To understand this, let us look at the current computing environment: We have moved from supercomputers to using distributed computing; from large monolithic data warehouses to saving data in the cloud; from using laptops to tablets and our smartphones. The future of space exploration is going to be the same—we will transition from large, heavy robots and satellites to "nanosats" and small, networked robots. We will use hundreds or thousands of cheap, small "sensor networks" that can be deployed on planetary bodies. These will form a self-organizing network that can quickly explore areas of interest and also organize themselves into larger machines that can mine metals or develop new vehicles for future exploration.
Astronauts may be able to capture the imagination better now than a personality-less robot. But for humanity’s long-term goals of exploration, science and eventual survival, the “evolving” robot may be the better bet.
This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.
Srikanth Saripalli is an assistant professor in the School of Earth and Space Exploration at Arizona State University. He can be reached at srikanth.saripalli@asu.edu. | 科技 |
Feds announce $82 million for nuclear energy research
Rough Friday: Heat Advisory, Smog Advisory, Severe T-storm Risk Later
Weather Goes into "Beast Mode" - Coldest Week of Winter Coming
The benefits of fracking are clear. The impacts remain a mystery.
It's hard to discuss the issue rationally when definitive, objective studies are lacking. By Kate Galbraith
Ralph Wilson • Associated Press
In this April 23, 2010, photo, workers moved a section of well casing into place at a natural gas well site in Pennsylvania. So vast is the wealth of natural gas locked into dense rock deep beneath Pennsylvania, New York, West Virginia and Ohio that some geologists estimate it’s enough to supply the entire East Coast for 50 years. But freeing it requires a powerful drilling process called hydraulic fracturing or “fracking,” using millions of gallons of water brewed with toxic chemicals that some fear threaten to pollute water above and below ground, deplete aquifers and perhaps endanger human health and the environment.
With Russia menacing Ukraine and Europe with its natural-gas heft, the cry has gone out from British Prime Minister David Cameron, the Wall Street Journal, and even (implicitly) U.S. President Obama: More fracking! If only the European Union would stop importing a third of its natural gas from Russia, the argument goes, it would be easier to impose sterner sanctions and go beyond grandly booting Russia from the G-8. Fracking sounds like a simple and smart solution. Not only can the United States export liquefied shale gas to Europe, but Europe can also help itself diversify by embracing a technology that taps homegrown reserves. “You cannot just rely on other people’s energy,” Obama reportedly told E.U. leaders.
The trouble, of course, is that much of Europe, especially the western half, doesn’t want to frack. France (which has considerable reserves) has banned it, Germany has effectively done the same, and Cameron’s enthusiasm has been slowed in the United Kingdom by not-in-my-backyard environmental protests. As Conservative MP Nick Herbert (who’s not reflexively against fracking) put it last year, fracking has sparked a “fear of the unknown.”
Ah, those pesky known unknowns! Herbert actually nailed the problem. So, here’s a way to help spread fracking: Banish the unknowns. There is still so much uncertainty and hence controversy surrounding fracking, even in the shale-crazed United States, that other countries inevitably have qualms about adopting the technology even as they hanker for its benefits. Fracking, aka hydraulic fracturing, involves shooting water, sand and chemicals beneath the earth to break rock and extract oil or gas. People living in shale-rich areas have raised concerns about air pollution, potential groundwater contamination and even earthquakes. Here’s Herbert again: “People understand the national arguments about the need for secure and cheap energy, but they don’t know how much this is going to damage the local environment.” Exactly.
Definitive, comprehensive, objective studies of fracking are needed to help both ourselves and our allies think rationally about fracking and how it stacks up to the alternatives, like renewable energy, nuclear power, coal, or the cheap-gas trough of Vladimir Putin. Alas, such studies are elusive — and those that exist are quickly challenged by one side or another. As ProPublica has written, “A long-term systematic study of the adverse effects of gas drilling on communities has yet to be undertaken.” That’s a notable omission, given that shale accounted for one-third of U.S. natural gas production in 2011 and is rising quickly.
Fracking is a complex, multistage procedure that can affect the environment in many ways, each of which deserves careful independent review. From an environmental perspective, the key difference from conventional drilling is the amount of liquid involved. Fracking uses a mix of water, sand and chemicals to blast rock and extract oil or gas. That liquid, often several million gallons or more per oil or gas well, must be acquired, transported and used in the frack job. Leftover wastewater must be stored and then disposed of, usually by injection into an underground formation where it is supposed to remain in perpetuity. (Recycling of this excess liquid is still in its infancy.)
If a spill occurs or the liquid seeps into the ground during any of these steps, that’s a problem. Strange things can happen. An official who oversees groundwater in part of West Texas told me that in a few instances, salty water from underground has unexpectedly shot up out of abandoned old oil wells. What he describes is like something out of a sci-fi movie, only real. “They’ll be in a field where they are pumping some of these old wells,” he said, “and they have an injection in one part of it, and all of a sudden something happens and there’s this big leak and it shoots up though the well, and the neighbor’s water well starts getting salty.” It’s basically a mini-geyser of brine.
Other, more typical fracking concerns include air pollution from gas storage or well sites, as chemicals like hydrogen sulfide or benzene are released; methane leaks from natural gas infrastructure; wasteful flaring (that is, the burning of excess natural gas that comes up with oil), and earthquakes that could be caused in a few areas by the underground disposal of frack water.
How often do things actually go wrong, things like brine shooting out of an old well or earthquakes resulting from underground injections? How many pollutants enter the air, and how dangerous are they? Frankly, we don’t know many of the answers. An eight-month investigation by the Center for Public Integrity, InsideClimate News, and the Weather Channel found that in Texas, the top oil- and gas-producing state, the air-monitoring system in a major fracking region known as the Eagle Ford Shale “is so flawed that the state knows almost nothing about the extent of the [air] pollution” in the area.
Fragments of data on fracking do exist. For example, a new study by British and American academics in the journal Marine and Petroleum Geology calculates that 6.3 percent of 8,030 inspected gas wells in Pennsylvania’s Marcellus Shale experienced structural problems between 2005 and 2013. That’s useful information, but it only takes account of one state and one type of problem (albeit an important one). In Europe, the researchers said, little equivalent public data exists on the structural problems of onshore oil and gas wells. (Because geology can vary substantially from place to place, data from as many areas as possible is needed in the public domain. This geologic variation also means that interested groups are sure to challenge studies as inapplicable to other regions.)
The U.S. Environmental Protection Agency expects to complete a study of how fracking impacts water in 2016, two years behind schedule. It will include consolidated information on spills of fracking-related fluid, meaning problems like leaking storage pits and spills from trucks. This is material we need, but even the EPA is finding it hard to pull the data together, according to its latest progress report. For example, in frack-frenzied Texas no database exists on accidents related to hydraulic fracturing. Oil and gas regulators keep data on spills such as the recent Galveston barge collision, but they do not tally chemical spills linked to fracking, according to the EPA report. Wyoming and Colorado, among others, do not break out hydraulic fracturing data on accidents either. An industry website, FracFocus.org, contains some information about fracked wells (unrelated to accidents), but it is partial — especially as it relates to chemical disclosures — as well as voluntary and difficult to pull data sets from.
Case studies are needed, too, and the EPA is performing some. But this is hard. For one thing, the geology is complex, and fracking cannot be studied up close without industry cooperation. Both Ohio State University and the University of Tennessee have courted controversy by considering contracting with — and accepting fees from — drilling companies that would work on university lands.
The influence of oil and gas money has a long reach into academic institutes, not to mention state government. “ ‘Frackademia’ has become the preferred term to describe the new partnerships forming between academia and the fracking industry,” Cary Nelson, a professor at the University of Illinois, Urbana-Champaign, wrote in the Times Higher Education last year. (Similarly, the industry challenges studies in which academics are perceived to have an environmentalist bias.) When I covered oil and gas in Texas between 2010 and 2013, one of the hot topics was the amount of water used in fracking. Fracking can use 4 million to 6 million gallons of water per well, or more, so at a time when drought was hitting Texas hard, that naturally came under scrutiny. For journalists, it was frustrating that the major study on the subject (performed by University of Texas researchers with the imprimatur of the state government’s water board) was funded by an oil and gas association. The 2013 study found that less than 1 percent of annual Texas water use went into fracking. But a subsequent San Antonio Express-News analysis found that the figures for the Eagle Ford Shale, the major new formation in Texas, “far outpace[d]” certain estimates in the industry-funded study.
The benefits of fracking are clear. It has been a giant step toward energy independence for the United States, and it can be for the rest of the world. Everyone wants the jobs it brings, the wealth and tax revenues it produces, and the energy it provides. It’s cleaner-burning than coal, though the dynamic between those two fuels is complex. But it’s time for an honest, levelheaded conversation involving scientists, the federal and state governments, and the public about what we know and what we don’t know about its environmental impacts. We need to collect data and make it available, and we need to figure out how to get answers for the many remaining unknowns, so that countries can decide how to regulate fracking or indeed whether to allow it. Businesses hate uncertainty, as the saying goes — so ending these environmental uncertainties might just help the oil and gas industry by allowing it to make a clear case to the public in the United States and abroad.
This will require cooperation on all sides and, of course, money. In the ideal world, the public and disinterested groups would provide funding. Another suggestion comes from Nelson, the University of Illinois, Urbana-Champaign, professor, who has recommended a levy on drilling companies and the creation of a pool of independent resources for study grants.
“But there is no time!” the cries ring out as the Russian wolf stands on the doorstep, baying. The clock is ticking, yes, but it’s also true that homegrown shale gas in Europe cannot fill the gap in the near term; it may take a decade for it to be extracted in meaningful quantities. That’s plenty of time for study and analysis to lay the groundwork for long-term development of an extraordinary resource. My great hope is to get beyond the juvenile conversation we’re having now — the echoes of which are heard worldwide — in which environmentalists holler loosely, “Fracking contaminates groundwater!” To which the industry — taking the term “fracking” to mean the specific process of rock-breaking, perhaps the least of the risks — responds, “No, it doesn’t!”
Even with more information and continued pressure from Russia, Western Europe still may not be tempted by fracking. At its core, fracking is a mini-industrial operation that often takes place near homes. If the wealthy can avoid it, they will, because the disruption in their back yards will not be worth it. There are also other barriers to shale gas development in Europe, such as the cost of drilling, Europe’s relatively high population density, and the ownership structure of mineral rights, as my friend Russell Gold of the Wall Street Journal (and author of the forthcoming fracking tome “The Boom”) recently explained. The quality of European shales are still uncertain, though France, Poland, Norway and the Netherlands, as well as Ukraine, are among the countries believed to have substantial reserves. But if Britain or Poland wants to proceed, they deserve to have as much information as possible about what lies ahead.
If those of us here in the United States don’t have all the information ourselves — and we (cue the chest-thumping) invented fracking — how are our allies expected to figure it out?
Kate Galbraith is a San Francisco-based journalist who writes about energy and climate. She is co-author of “The Great Texas Wind Rush.” She wrote this article for Foreign Policy.
Save Your Kid's Hearing with These Headphones
posted by Suzanne Kantra on April 06, 2011
Planes, trains and automobiles can be challenging environments for little ones. So when the books, plastic action figures and other assorted toys fail to entertain my 2 ½-year-old son, I’m ready with an iPod Touch loaded with Blues Clues and a pair of kid-friendly headphones.
The kid-friendly bit is essential. According to a U.S. government survey, 12.5% of children ages 6 to 19 (approximately 5.2 million children) have permanent damage caused by exposure to loud noises.
Noise-induced hearing loss occurs when kids have been exposed briefly to a very loud noise or over time to noisy environments. Most MP3 players max out at about 103 decibels (dB), though some can reach sound levels of up to 120dB, which is like standing 100 feet behind the engine of a jet plane as it's taking off. According to the National Institute on Deafness and Other Communication Disorders (NIDCD), more than a minute of exposure to 110 dB (ex. a chain saw) risks permanent hearing loss, as can 15 minutes at 100dB or prolonged exposure at or above 85dB.
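Those limits follow a simple pattern: each 3dB increase in level roughly halves the safe listening time. Here is a minimal sketch of that rule of thumb in Python; the NIOSH-style 3dB exchange rate starting from eight hours at 85dB is my assumption, since the guidance above only quotes individual limits.

```python
# Rough safe-exposure estimate using an assumed NIOSH-style 3 dB rule:
# 8 hours at 85 dB, with the allowed time halving for every 3 dB above that.

def safe_minutes(level_db: float) -> float:
    """Approximate safe daily listening time, in minutes."""
    return (8 * 60) / (2 ** ((level_db - 85) / 3))

for level in (85, 100, 103, 110):
    print(f"{level} dB: about {safe_minutes(level):5.1f} min/day")
# 85 dB -> 480 min, 100 dB -> 15 min, 103 dB -> 7.5 min, 110 dB -> ~1.5 min,
# which lines up with the NIDCD figures quoted above.
```

On that scale, a player maxing out at 103dB would use up a safe day's listening in under 10 minutes, which is why capping the volume matters.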
For my 2 ½-year-old, I like the Kidz Gear Wired Headphones ($20). These traditional-style headphones have a built-in volume control, fit him comfortably and deliver good sound. There's also a wireless version ($35) that works with infrared-based car entertainment systems. Since he can easily turn the volume all the way up on the headphones, I also make use of my iPod Touch's "volume limit" feature (found under settings) to set the maximum volume. About 75 percent is the maximum recommended limit.
My 6-year-old son is very active and tends to walk around our home with his game system, so I want to make sure he's aware of his environment. So the perfect choice for him is the Mad Catz Airdrives Fit Interactive for Kids ($15), which is rated at a maximum of 80dB at the inner ear. The speaker sits outside the ear canal, so he can hear what's going on around him, and is held in place with a wire that loops over the ear and can be cinched for a perfect fit.
Since my 9-year-old daughter likes to retreat to her room with a movie or her music, I chose an in-ear style for her, the Logitech Ultimate Ears Loud Enough Volume Limiting Earphones ($19). The earphone physically blocks ambient sound, so she’s not tempted to crank the volume. Plus, the earphones themselves knock 20dB off any sound source.
To test your child’s headphones and MP3 player, try them on and crank the player to full volume. If you can’t hear someone talking to you in the same room, the player is too loud. The National Institutes of Health also suggests taking periodic 15 to 20 minute breaks when listening at high volume to let the inner ear recover. Frankly it’s good advice for all of us. | 科技 |
Destructive bamboo becomes home space invader
Invasive bamboo sprouts at the Bozrah home of Robin Arcarese.
This spring, Robin Arcarese began to realize her home was being invaded.

One day, she looked up at the roof of the garage of the Bozrah home she and her husband purchased 18 months ago, and noticed some leafy stalks poking through the shingles.

"I said, 'What's coming out of my roof?'" she recalled Monday.

Over the next few months, she found the same stalks growing up out of her cement walkway, through the siding of her Cape and under the foundation of the garage.

"We had to have the siding removed and replaced," she said.

The invader was yellow groove bamboo, a native of China that a neighbor had planted in his yard some seven years earlier. Unless contained by special barriers more than 2 feet deep, it can spread rapidly underground by sending out rhizomes that produce new shoots. Fully grown, the bamboo, also known as running bamboo or giant timber bamboo, can grow 40 feet tall and spread out 15 to 20 feet per year, according to Caryn Rickel of Seymour, founder of the Institute of Invasive Bamboo Research. At her home and two other properties she owns, she said, it's invaded a septic system, driveway, gazebo and other areas.

Tonight in Orange, Rickel and others hoping for state action to curb invasive bamboo will call attention to their concerns at a meeting at 7 at the High Plains Community Center. They're hoping their stories about the damage bamboo has caused to their properties will persuade the state lawmakers invited to the meeting to introduce legislation in the 2013 session. The problem, she noted, has prompted more than a dozen communities around the country to enact or consider bans.

"It's very, very hard to kill without risking human health and the environment," Rickel said. "We feel it should be banned."

Donna Ellis, co-chairwoman of the state's Invasive Plant Working Group and senior extension educator at the University of Connecticut, said there are more than 100 locations around the state where running bamboo is a problem. Often planted as a natural fence along property lines, she noted that it is different from clumping bamboo, which does not have the same aggressive growth habits and is not considered a problem. It is also different from the small ornamental plants known as lucky bamboo.

One of those who plans to attend tonight's meeting is Daniel Wade of East Lyme. In April, he said, a neighbor planted 16 running bamboo stalks, each about 25 feet high, along the property line as the latest volley in an ongoing dispute. Wade said he contacted the town for help, only to be told that since there is no law against yellow grove bamboo, there was nothing officials could do.

"The state should pass a law preventing people from growing it outside of containers," Wade said. "People shouldn't use it as a weapon to destroy a neighbor's property. It would be best if the state just got rid of it entirely, because it's not native."

Last month, the state's Invasive Plants Council considered adding yellow grove bamboo to the official list of plants considered invasive, but found it didn't meet all criteria of the state's legal definition for invasives, said Bill Hyatt, chairman of the council and chief of the Bureau of Natural Resources at the state Department of Energy and Environmental Protection.
Still, after hearing from affected property owners and visiting several sites, the council was convinced that action is needed.

"We did recognize that considerable property damage is being caused by this plant," he said.

It voted to send a recommendation to the legislature for a new law that would require education for bamboo sellers about the plant's aggressive ways and that it be contained within pots or barriers set into the ground.

"We also support assigning liability to property owners for its spread (to adjacent properties)," Hyatt said.

But because the legislation would only apply to in-state nurseries and not to the out-of-state online sellers, there is also a need to educate the general public about yellow grove bamboo, he added.

"The challenge is to make people aware of what they're buying," he said.

One recent effort to educate the public came from the Connecticut Nursery and Landscape Association. About a year ago, it developed a tag it recommends nursery owners use on yellow grove bamboo plants for sale. The tag warns buyers that the plant spreads rapidly and that a barrier 28 to 30 inches deep and 2 inches above the soil should be placed around the plants. Concrete, fiberglass, polyethylene or metal are the materials recommended for the barrier.

Bob Heffernan, executive director of the association, said he's heard the horror stories, but doesn't believe an outright ban is necessary.

"When it's properly contained, it doesn't go anywhere," he said. "But there is a personal responsibility (on the part of the property owner) to construct a barrier."

At Arcarese's property and her neighbor's, the bamboo is now gone, thanks to the work of Dennis Rogan, owner of an excavating company. About two weeks ago, Rogan said, he finished digging it up with a backhoe, removing the soil up to 16 inches deep where it grew, sifting out all the rhizomes and sending them to an incinerator. New soil was brought in.

For the first several years on the neighbor's property, he said, "it was a pretty plant. Then all of a sudden it turned into this monster."

He alerted Arcarese and her neighbor to keep watch for any shoots next spring and attack them immediately.

He estimates he collected about five cubic yards of rhizomes from an area about 75 by 45 feet.

"It was a pretty considerable effort," he said.

j.benson@theday.com
Trade in the Digital Age: Can e-Residency be an enabler for Asia-Pacific Developing Countries? (Trade Insights: Issue No. 17) | 20 Apr 2016 | Working paper series
The advent of the digital age in international trade has opened new possibilities for countries at all stages of development. Digital trade can support the achievement of the United Nations Sustainable Development Goals (SDGs) and increase economic prosperity worldwide. However, many developing economies, and particularly least developed countries, often lack the digital infrastructure and legal and policy frameworks to enable their citizens to seize these opportunities.

Building e-resilience: Enhancing the role of ICTs for Disaster Risk Management (DRM) | 12 Apr 2016 | Books
This report highlighted some emerging technologies, such as the use of Big Data for DRM purposes. It is one that is still being explored but has so far demonstrated immense potential. However, along with it come significant challenges that have to be overcome in order to truly benefit from real-time use of MNBD. Utilizing new sources of data such as MNBD and even social media for assisting in predicting emerging trends and shocks, as well as for building greater resilience, is still an emergent field.

Working Paper Series (SD/WP/02/April 2016): Asymmetries in International Merchandise Trade Statistics: A case study of selected countries in Asia-Pacific | 5 Apr 2016 | Working paper series
This working paper introduces the concept of bilateral asymmetries in international merchandise trade statistics (IMTS), i.e. the discrepancies that can be seen in reported bilateral trade flows between trading partners. Such discrepancies mean that the value of exports reported by one country does not equal the value of imports reported by its partner, also called mirror data. These discrepancies impact bilateral trade balances and other economic variables reliant upon the trade balance. (A small numerical sketch of one way to measure such asymmetries appears after this list.)

Transformations for Sustainable Development: Promoting Environmental Sustainability in Asia and the Pacific | 3 Apr 2016 | Books
Asia and the Pacific is a dynamic region. Regional megatrends, such as urbanization, economic and trade integration and rising incomes and changing consumption patterns, are transforming its societies and economies while multiplying the environmental challenges.

UNNExT Handbook on Implementing UN/CEFACT e-Business standards in agricultural trade | 31 Mar 2016 | Books
This handbook presents a general framework for the implementation of e-Business standards in the agrifood sector. The handbook looks specifically at four e-Business standards developed by UN/CEFACT in the areas of electronic phytosanitary certificates; electronic reporting of sustainable fishery management; electronic exchange of laboratory analysis results; and management and exchange of certificates for trade in CITES controlled species.

Impacts of Imported Technology in Asia-Pacific Developing Countries: Evidence from Firm-Level Data (Trade Insights: Issue No. 16) | 24 Mar 2016 | Working paper series
The expansion of technological capabilities among firms in developing countries has often been linked to international integration. Access to larger pools of higher-quality intermediate inputs, as well as the opportunity to employ technology developed in other countries, can stimulate firms to undertake innovative activities and develop new products.
This note explores these linkages making use of a firm-level dataset obtained from the World Bank Enterprise Surveys containing information on 22,466 firms across 19 Asia-Pacific economies and 18 industrial sectors.

The United Nations World Water Development Report 2016 - "Water and Jobs" | 23 Mar 2016 | Books
Water is an essential component of national and local economies, and is needed to create and maintain jobs across all sectors of the economy. Half of the global workforce is employed in eight water and natural resource-dependent industries: agriculture, forestry, fisheries, energy, resource-intensive manufacturing, recycling, building and transport.

Disasters in Asia and the Pacific: 2015 Year in Review | 10 Mar 2016 | Books
This study is part of an annual series developed by the Information and Communications Technology and Disaster Risk Reduction Division of ESCAP. It provides a yearly overview of natural disasters in the Asia-Pacific region and their impacts.

Building e-Resilience in Mongolia: Enhancing the Role of Information and Communications Technology for Disaster Risk Management | 4 Mar 2016 | Books
The Information and Communications Technology and Disaster Risk Reduction Division (IDD) of the United Nations Economic and Social Commission for Asia and the Pacific (ESCAP) has conducted a series of research studies on building e-resilience that examine the use of information and communications technology (ICT) for disaster risk reduction (DRR) in selected Asia-Pacific countries.

Building e-Resilience in Sri Lanka: Enhancing the Role of Information and Communications Technology for Disaster Risk Management | 4 Mar 2016 | Books
Disasters affect multiple facets of human life. Therefore, disaster risk management (DRM) requires multiple mechanisms across different silos in order to prepare for and deal with all types of disasters. The multiple mechanisms will most definitely require collaboration at the international or regional level, and coordination with government at the national and local levels, with community organizations and with individuals. In all these instances, effective communication is critical.
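To make the notion of a bilateral asymmetry concrete, here is a minimal sketch in Python. The figures and the relative-discrepancy index below are illustrative assumptions of mine, not numbers or methods taken from the ESCAP working paper.

```python
# Illustrative only: toy figures, not data from the ESCAP working paper.
# Exports that country A reports to B rarely match the imports B reports
# from A (the "mirror data"), so the two flows can be compared directly.

def asymmetry_index(reported_exports: float, mirror_imports: float) -> float:
    """Relative discrepancy between a flow and its mirror, in (-2, 2)."""
    mean_flow = (reported_exports + mirror_imports) / 2
    return (reported_exports - mirror_imports) / mean_flow

# Toy example: A reports $95m of exports to B; B reports $110m of imports from A.
idx = asymmetry_index(95.0, 110.0)
print(f"Bilateral asymmetry: {idx:+.1%}")  # -> -14.6%
```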
PC gamer market in “tatters,” says Molyneux
Thursday, 2 October 2008 20:23 GMT
Speaking to Videogamer, Lionhead boss Peter Molyneux issued a damning verdict on the state of the PC gaming industry, saying it's in "tatters".
“If you look at the gamer market on PC, I’ll be quite honest with you, it’s in tatters,” he said.
“There aren’t that many releases on PC. There are some high points like Crysis and what Blizzard is doing, but other than that you are restricted to The Sims and World of Warcraft, they seem to be dominating the PC side.”
The developer spoke directly about DRM issues currently affecting the platform.
“I would say while me as a player hates any restrictions, [but] I can understand that publishers need to do something to give them the confidence to make games for the PC, to spend the huge amounts of money necessary to spend on development and to get their return,” he said.
“Anything that may give them more confidence on the PC means that ultimately we as gamers will come out better off because they will invest more in the game.”
More through the link. | 科技 |
Thinking of Check-Ins As Searches That Aren't Going to Google
Chris Crum | 10.27.2010 | Business
Businesses are still struggling with finding the right social media strategies, let alone strategies for check-in apps like Foursquare, Gowalla, and the recently launched Facebook Places. WebProNews spoke with Lawrence Coburn, CEO of geolocation app provider DoubleDutch, about where this industry is headed and what it means for businesses looking to take advantage.

"I think it's still early days," he tells us. "I think a huge step for the whole industry was when Facebook Places launched a check-in. And I think the big question that we all need to answer is like 'is the check-in becoming a gesture that's gonna be as common as a status update?' which is, you know, Twitter's thing...I think it is. I think we've passed critical mass, especially with Facebook in the game."

"What Facebook did is they launched a very basic service, and with a couple exceptions," he adds. "They did some pretty cool stuff, but for the most part, it's very basic."

Facebook has over half a billion users, so naturally, when the company launched Facebook Places, a lot of questions surfaced about other check-in services. Would they be able to compete with such a monster?

"It looks to me like Foursquare has weathered the storm with Facebook's initial launch, and I don't think Foursquare's going anywhere," Coburn says. "I think they're gonna keep growing. I do think that there's gonna be a shake-out, like right now there's probably 10 or 15 consumer-facing check-in apps, and I don't know if the market can support all those identical apps, but I think we'll see some fragmentation as well."

"I think the real competition from the Facebook ecosystem is gonna come from third-party developers like Zynga that build on top of the Facebook Places API, because you'll notice with Facebook Places that they haven't done anything with like virtual goods like badges or points, and these are some of the main attractions of Foursquare and Gowalla," he adds. "So they've just left that open, but I know that third-party developers won't be so shy, and they'll come in and make games, because there's a lot of good game developers on Facebook."

Games are one thing, and there are ways businesses can take advantage of games themselves, but is there more to this phenomenon than just games? Real business applications?

"I think that location is so fundamental that it has its chance to be its own mega-hit in its own right," Coburn says. "Think about it. You want to know where your friends are. You want to know where your family is. You need to know where your co-workers are....To me it's almost more fundamental than like a status update, which if you had to pitch Twitter to me on paper now, if I had never seen it, it would be a tough sell, but it worked. It became like a communication channel in its own right."

"Right now you hear about a lot of campaigns that big brands are doing with Foursquare, in terms of giving away free stuff, I think Gowalla gave away a bunch of New Jersey Nets tickets. Then there's like deals with Starbucks and I think, Brightkite," he says. "It's all interesting stuff, but I think there's a lot of experimentation, trying to see where the value is for big brands...We do know this: if a consumer tells you (a brand, a company) where they are, it's a big deal."

"It's almost like a search query that isn't going to Google," he continues. "It's like they're telling you 'I'm here, what do you have for me?' and that's an opportunity for brands and marketers, and I think we're gonna figure it out as to where the monetary value is."
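Coburn's "search query that isn't going to Google" framing is easy to picture in code. The sketch below is purely hypothetical: the venue IDs, offer data and handle_checkin helper are invented for illustration and are not part of any real check-in API.

```python
# Hypothetical sketch: treating an incoming check-in like a search query.
# All names and data below are invented for illustration.

OFFERS_BY_VENUE = {
    "coffee-shop-123": ["Free pastry with any latte before 11 a.m."],
    "bookstore-456": ["10% off staff picks for anyone checked in today"],
}

def handle_checkin(user_id: str, venue_id: str) -> list:
    """A check-in is an implicit query: 'I'm here, what do you have for me?'"""
    # A real system would rank offers by user history, time of day, etc.
    return OFFERS_BY_VENUE.get(venue_id, [])

print(handle_checkin("user-42", "coffee-shop-123"))
```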
Have you found interesting ways to use check-in apps for your business? Comment here.
Chris Crum has been a part of the WebProNews team and the iEntry Network of B2B Publications since 2003.
American Geophysical Union and Wiley-Blackwell Announce Publishing Partnership
The American Geophysical Union, the world’s leading society of Earth and space science, and Wiley-Blackwell, the scientific, medical, technical and scholarly business of John Wiley & Sons, Inc. (NYSE: JWa and JWb), a global provider of content and content-enabled services in research, professional development, and education, announced today that the AGU has selected Wiley-Blackwell as its publishing partner for its portfolio of journals and books. The new partnership will be effective January 2013, subject to completion of a publishing agreement in accordance with a Memorandum of Understanding signed last week.
The AGU, a scientific society with a membership of more than 60,000 Earth and space scientists worldwide, publishes 17 journals for the global research community and an extensive book program in Earth and space science under the Geopress imprint.
“We are delighted that AGU has selected Wiley-Blackwell as their publishing partner. Wiley-Blackwell and AGU are developing an extraordinary partnership that will emphasize strategic development, provide flexible and innovative delivery solutions, and expand access and discoverability of AGU’s Earth and space science publications,” said Stephen M. Smith, Wiley’s President and Chief Executive Officer. He continued, “This partnership offers a rare opportunity to create a powerful resource for Earth and space science researchers and professionals by combining Wiley-Blackwell’s publishing experience and vision, powerful digital platform, and wide global reach with AGU’s high quality content, editorial strength, brand reputation, and extensive author base and membership.”
“AGU is very pleased and excited to enter into this partnership with Wiley-Blackwell,” said Michael J. McPhaden, AGU President. “We have thoroughly considered how best to transform AGU’s publishing program to better serve the scientific community as well as position the program for the future. Wiley-Blackwell’s commitment to partnering with scientific societies and cultural fit with AGU based on shared organizational values and strategic priorities promise to make this a powerful partnership.”
Subject to the finalization of the agreement, Wiley-Blackwell will assume responsibility for AGU’s journal publishing activities in January 2013, collaborating closely on new product and delivery models, subscription fulfillment, production, content management, sales and marketing, and distribution. AGU will retain control and ownership of all its journals and the scientific aspects of publishing including editorial control and oversight by AGU governance. A separate agreement is being negotiated for partnership on AGU’s book publishing program. Both will help to support AGU’s wide array of member services and programs to advance scholarly communication.
Founded in 1807, John Wiley & Sons, Inc. has been a valued source of information and understanding for more than 200 years, helping people around the world meet their needs and fulfill their aspirations. Wiley and its acquired companies have published the works of more than 450 Nobel laureates in all categories: Literature, Economics, Physiology or Medicine, Physics, Chemistry, and Peace. Our core businesses publish scientific, technical, medical, and scholarly journals, encyclopedias, books, and online products and services; professional/trade books, subscription products, training materials, and online applications and Web sites; and educational materials for undergraduate and graduate students and lifelong learners. Wiley's global headquarters are located in Hoboken, New Jersey, with operations in the U.S., Europe, Asia, Canada, and Australia. The Company's Web site can be accessed at http://www.wiley.com. The Company is listed on the New York Stock Exchange under the symbols JWa and JWb.

About AGU
The American Geophysical Union is a not-for-profit, professional, scientific organization with more than 60,000 members representing over 148 countries. Established in 1919 and headquartered in Washington, D.C., AGU advances the Earth and space sciences through its scholarly publications, meetings and conferences, and outreach programs. The society’s Fall Meeting, held annually in San Francisco, CA, is the world’s largest gathering of Earth and space scientists. For additional information visit www.agu.org.
Diodes.
G.W. Poulos
Edited By: Daniel Lindley
The diode forward voltage is the drop in electrical voltage that occurs when electrical current is conducted through a diode. Diodes are two-lead semiconductor devices that conduct an electrical signal in one direction but not the other. When a diode is conducting electricity, it is said to be forward biased and it consumes a small amount of the voltage passing through it in the process. The amount of voltage used by the diode itself when forward biased is called the diode forward voltage, diode voltage, or diode voltage drop.
Diodes are constructed of two pieces of the same type of material fused together with a lead attached to each end. One piece of the material, called the cathode, has an additive that makes it negatively charged. The other piece, called the anode, has an additive that makes it positively charged. When these two pieces are fused together, they exchange electrons at the point where they meet, which then becomes balanced, having neither a positive nor a negative charge. This area is called the depletion layer. Ad
If a negative voltage is applied to the anode of a standard silicon diode, the depletion layer widens, creating an electrical field that opposes the flow of current. A diode in this condition is said to be reverse biased. As a result, no electrical current can pass through the diode, which blocks all of the applied voltage. Hence, the voltage drop, or diode reverse voltage, is 100% of the voltage applied. On the other hand, if a positive voltage is applied to the anode, charge carriers are pushed toward the junction and the depletion layer narrows. Once the applied voltage is strong enough to overcome the depletion layer, the diode becomes forward biased and begins to conduct electrical current; however, the electrical force needed to overcome the depletion layer consumes a small amount of the applied voltage. This used voltage is the diode forward voltage, which is typically about 0.7 volts in a standard silicon diode.
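In everyday circuit work this behavior is usually captured by the constant-drop model: treat a conducting silicon diode as a fixed 0.7-volt loss and apply Ohm's law to whatever voltage remains. The snippet below is a minimal sketch; the supply voltage and resistor value are made-up example numbers, not values from the article.

```python
# Constant-drop model: a forward-biased silicon diode consumes ~0.7 V,
# and the series resistor sees whatever voltage is left over.
V_SUPPLY = 5.0     # supply voltage, volts (example value)
V_FORWARD = 0.7    # typical silicon forward voltage, volts
R_SERIES = 330.0   # series resistance, ohms (example value)

if V_SUPPLY > V_FORWARD:
    current = (V_SUPPLY - V_FORWARD) / R_SERIES  # Ohm's law on the resistor
    print(f"Diode current: {current * 1000:.1f} mA")  # about 13.0 mA here
else:
    # Below the forward voltage, the diode conducts almost no current.
    print("Supply is below the forward voltage; negligible current flows.")
```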
The diode forward voltage varies from one type of diode to another, depending on the base material used, the amount of charge added to the anode and cathode of the diode, and the diode’s intended application. In applications dealing with very low voltages, special diodes (such as Schottky diodes) are used that have very thin depletion layers, weak anodes, and consequently very small diode forward voltages. Likewise, there are special diodes (such as Zener diodes) whose reverse voltage drop is less than 100% of the applied voltage, allowing them to conduct electricity even when in a reverse biased condition.
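The variation described above can be made concrete with the Shockley diode equation, which ties forward voltage to current through the device's saturation current, a quantity that differs by orders of magnitude between diode types. In the sketch below, the saturation currents are assumed, order-of-magnitude values chosen only to contrast a standard silicon diode with a low-drop (Schottky-style) diode; they do not come from the article.

```python
import math

def forward_voltage(i_f, i_s, n=1.0, vt=0.0259):
    """Invert the Shockley equation I = Is*(exp(V/(n*Vt)) - 1) to get V."""
    return n * vt * math.log(i_f / i_s + 1.0)

I_F = 0.010  # 10 mA forward current (example operating point)

# Assumed order-of-magnitude saturation currents:
silicon_is = 1e-12    # typical small-signal silicon diode, amperes
low_drop_is = 1e-7    # Schottky-style low-drop diode, amperes

print(f"Silicon diode:  {forward_voltage(I_F, silicon_is):.2f} V")   # ~0.60 V
print(f"Low-drop diode: {forward_voltage(I_F, low_drop_is):.2f} V")  # ~0.30 V
```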
The common thread weaving among them of breathtaking alterations in consciousness associated with the experiences -- sensations of leaving the body, of flying through the air or being "carried along by the wind," and receiving "startling and novel insights into the nature of reality" that reverberated thereafter with profound, life-changing effects. ~ Susan M. Watkins
Unidentified flying object (commonly abbreviated as UFO or U.F.O.) is the popular term for any aerial phenomenon whose cause cannot be easily or immediately identified. Both military and civilian research shows that a significant majority of UFO sightings have been identified after further investigation, either explicitly or indirectly through the presence of clear and simple explanatory factors.
Some years ago I had a conversation with a layman about flying saucers — because I am scientific I know all about flying saucers! I said "I don't think there are flying saucers". So my antagonist said, "Is it impossible that there are flying saucers? Can you prove that it's impossible?" "No", I said, "I can't prove it's impossible. It's just very unlikely". At that he said, "You are very unscientific. If you can't prove it impossible then how can you say that it's unlikely?" But that is the way that is scientific. It is scientific only to say what is more likely and what less likely, and not to be proving all the time the possible and impossible. To define what I mean, I might have said to him, "Listen, I mean that from my knowledge of the world that I see around me, I think that it is much more likely that the reports of flying saucers are the results of the known irrational characteristics of terrestrial intelligence than of the unknown rational efforts of extra-terrestrial intelligence." It is just more likely. That is all.
Richard Feynman in The Character of Physical Law (1964)
Anyway, I have to argue about flying saucers on the beach with people, you know. And I was interested in this: they keep arguing that it is possible. And that's true. It is possible. They do not appreciate that the problem is not to demonstrate whether it's possible or not but whether it's going on or not.
Richard Feynman in The Meaning of It All : Thoughts of a Citizen Scientist (1998)
It is important to state here -- though evidence will be considered in detail later on -- that all three women have either had "dreams" or normal recollections of having been shown, at later times, tiny offspring whose appearance suggests they are something other than completely human . . . that they are in fact hybrids, partly human and partly what we must call, for want of a better term, alien. It is unthinkable and unbelievable -- yet the evidence points in that direction. An ongoing and systematic breeding experiment must be considered one of the central purposes of UFO abductions.
Budd Hopkins in Intruders: The Incredible Visitations at Copley Woods, p. 130
In this unexpected scenario, the UFO occupants -- despite their obvious technological superiority -- are desperate for both human genetic material and the ability to feel human emotions -- particularly maternal emotions. Unlikely though it may seem, it is possible that the very survival of these extraterrestrials depends upon their success in absorbing chemical and psychological properties received from human abductees.
Recently, the press has been filled with reports of sightings of flying saucers. While we need not give credence to these stories, they allow our imagination to speculate on how visitors from outer space would judge us. I am afraid they would be stupefied at our conduct. They would observe that for death planning we spend billions to create engines and strategies for war. They would also observe that we spend millions to prevent death by disease and other causes. Finally they would observe that we spend paltry sums for population planning, even though its spontaneous growth is an urgent threat to life on our planet. Our visitors from outer space could be forgiven if they reported home that our planet is inhabited by a race of insane men whose future is bleak and uncertain.
Dr. Martin Luther King Jr. Upon Accepting The Planned Parenthood Federation of America Margaret Sanger Award (May 5, 1966)
"Abductees," Eva said, "are souls that have, for their individual purposes and reasons, chosen the probability of physical form." But through their experiences they are "regaining their memory of source . . . The process of abduction is one form of such, of regaining memory." The abduction "experience itself," Eva said, "is a mechanism to remove" the "structures that impede the reconnection with source," and to purify the physical vehicle in such a way to serve to regain better memory and to bring knowledge to others."
John E. Mack in Abduction: Human Encounters With Aliens , p 258-259
Do you know how many times we've come close to world war three over a flock of geese on a computer screen?
The Joker, Batman The Killing Joke, written by Alan Moore
Hypnotism will become more and more a tool of scientific investigation. Telepathy will be proven without a doubt, and utilized, sadly enough in the beginning, for purposes of war and intrigue. Nevertheless telepathy will enable your race to make its first contact with alien intelligence.
Jane Roberts, in The Early Sessions: Book 2, Session 45, Page 21
When science progresses on various planes, then such visitations become less accidental and more planned. However, since the inhabitants of each plane are bound by the particular materialized patterns of their 'home,' they bring this pattern of camouflaged vitality with them. Certain kinds of science cannot operate without it. When the inhabitants of a plane have learned mental science patterns, then they are to a great degree freed from the more regular camouflage patterns . . . the flying saucer appearances come from a system much more advanced in technological sciences than yours. However, this is still not a mental science plane. Therefore, the camouflage paraphernalia appears, more or less visible, to your astonishment. So strong is this tendency for vitality to change from one apparent form to another, that what you have here in your flying saucers is something that is actually not of your plane nor of the plane of its origins. What happens is this: When the 'flying saucer' starts out toward its destination, the atoms and molecules that compose it (and which are themselves formed by vitality) are more or less aligned according to the pattern inflicted upon it by its own territory. As it enters your plane, a distortion occurs. The actual structure of the craft is caught in a dilemma of form. It is caught between transforming itself completely into earth's particular camouflage pattern, and retaining its original pattern.
Jane Roberts, in Seth, Dreams & Projections of Consciousness, p. 101-102
What struck me more than the book's UFO stories, however, was the common thread weaving among them of breathtaking alterations in consciousness associated with the experiences -- sensations of leaving the body, of flying through the air or being "carried along by the wind," and receiving "startling and novel insights into the nature of reality" that reverberated thereafter with profound, life-changing effects.
Susan M. Watkins, in Speaking of Jane Roberts, p. 2 (2001)
(Gardner) writes about various kinds of cranks with the conscious superiority of the scientist, and in most cases one can share his sense of the victory of reason. But after half a dozen chapters this non-stop superiority begins to irritate; you begin to wonder about the standards that make him so certain he is always right. He asserts that the scientist, unlike the crank, does his best to remain open-minded. So how can he be so sure that no sane person has ever seen a flying saucer, or used a dowsing rod to locate water? And that all the people he disagrees with are unbalanced fanatics? A colleague of the positivist philosopher A. J. Ayer once remarked wryly "I wish I was as certain of anything as he seems to be about everything". Martin Gardner produces the same feeling.
Colin Wilson in The Quest For Wilhelm Reich , pp. 2-3
These left me in no doubt that something was trying to communicate with us, but that direct communication would be counterproductive. It seemed to be an important part of the scheme to create a sense of mystery.
Colin Wilson in Alien Dawn, p. 352 (1998)
Alion Science and Technology, an employee-owned technology solutions company, was awarded a five-year task-order from the U.S. Environmental Protection Agency, worth a maximum of $72 million, to continue its work providing research support to assess exposure to atmospheric pollutants and other materials that have the potential to cause adverse human health effects.
Alion supports EPA’s Office of Research and Development by assisting with the conduct of research that characterizes human pollutant exposures across a broad spectrum. Subcontractor RTI International is a key partner in this effort. Results will be applied to practical efforts related to air quality measurement, pollution control and homeland security. “We support the continuing improvement in our understanding of how people become exposed to air pollutants,” explained Chris Amos, Alion senior vice president and manager of the company’s Technology Solutions Group. “Alion has a long history with EPA, dating back to 1973, in conducting atmospheric sciences and environmental methods development research, with an increased emphasis on studying human exposure. Ultimately, the research we are conducting now may help prevent negative health effects from pollutants.” Under the contract, Alion will develop and support human exposure models and support research in aerosol and human exposure studies. Pollutants and other compounds will be sampled and analyzed in air, water, food, dust, soil and biological media, and tracked through inhalation, dermal exposure and ingestion. Other subcontractors on the five-year contract include The McConnell Group and Alpha-Gamma Technologies. Alion Science and Technology is an employee-owned technology solutions company delivering technical expertise and operational support to the Department of Defense, civilian government agencies and commercial customers. Building on almost 75 years of R&D and engineering expertise, Alion brings innovation and insight to multiple business areas: naval architecture & marine engineering; defense operations; modeling & simulation; technology integration; information technology & wireless communications; and energy and environmental sciences.
Tetra Tech, Inc. has been awarded a five-year, $19.7 million contract with the U.S. Environmental Protection Agency’s landfill methane outreach program.
Methane is a potent greenhouse gas that, when properly managed and captured, can be used to fuel power plants, manufacturing facilities, vehicles, and homes. The EPA’s program aims to reduce methane emissions from landfills by encouraging recovery and beneficial use of landfill gas as an energy source.

Working with the EPA’s Office of Air and Radiation, Climate Change Division, Tetra Tech will provide technical support for a wide variety of landfill program activities in the United States and internationally. Tetra Tech will help EPA conduct feasibility assessments; review policies and analyze markets; support tracking and reporting programs; develop emissions inventory protocols; conduct training and outreach; and support technology transfer and demonstration projects, among other activities.

With approximately 12,000 employees worldwide, Tetra Tech’s capabilities span the entire project life cycle.
Cities turn the corner on 'smart city' investment
By Brian RobinsonJan 30, 2014
This year cities around the world start getting smart, according to an analysis by IDC Government Insights, which studied trends in the smart city movement to use data analytics and IT in improving civic life.
Up to now, says IDC, cities have been mostly nibbling at the edges of the smart city concept. In a full realization of the idea, cities would use embedded sensors and mobile and cloud technologies to understand the costs and efficiencies of traffic flows, water management and other infrastructure problems. The goal is a more livable city, one that improves services for its citizens and offers competitive advantages for attracting tourists and industry. In 2014, IDC predicts, some 15 percent of cities around the world will be in what it calls the “opportunistic phase” of smart city maturity. Technology investments — some needed to support this smart city push, others a consequence of it — will result.
For one thing, IDC says, somewhere between 15 and 20 percent of traditional IT spending will be redirected to the cloud.
That’s not required for cities to go smart, according to Ruthbea Yesner Clarke, IDC’s smart city strategies director, but it saves city governments from having to switch money from capital to operational expenditures.
“I haven’t come across a city government yet that thinks that’s a brilliant way to do business,” she said, noting that “operational budgets can be cut at someone’s whim.” Also, she said, “when you are looking at such things as the sensors you need for a smart city and you simply want to manage all of that and not buy, then cloud becomes an enabler in moving to the next innovative thing quickly and easily, and there are obvious business models for that.”
For many cities, cloud might be the only way they can become smart. Cloud is what lets cities big and small leapfrog the state of the art without going through the painful process of getting there, according to Jesse Berst, chairman of the Smart Cities Council.
“If you are a small or medium-sized city in particular, you could never afford to custom develop these kind of sophisticated [smart city] applications,” he said, in a recent interview with GCN. “You could never afford the staff for that, or the hardware or such things as the 24/7 disaster proof server farms you will need.”
There will also be a big increase in spending on the Internet of Things during 2014, IDC predicts, to a total of $265 billion worldwide. The term is a current buzz phrase used to describe the spread of embedded sensors and how they are interconnected so the data they collect can be analyzed. In a smart city context, sensors connected to the IP network would be used to measure traffic flow or energy use in buildings.
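To make the concept concrete, a city IoT deployment ultimately reduces to many small devices emitting timestamped, location-tagged readings over the network for later analysis. The sketch below is hypothetical: the field names and the traffic-sensor scenario are invented for illustration and are not drawn from any particular city's system.

```python
import json
from datetime import datetime, timezone

def traffic_sensor_reading(sensor_id, vehicles_per_min, lat, lon):
    """Package one traffic-flow measurement as a JSON message.

    All field names here are hypothetical; real deployments define their
    own schemas and typically publish over protocols such as MQTT or HTTP.
    """
    return json.dumps({
        "sensor_id": sensor_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "location": {"lat": lat, "lon": lon},
        "vehicles_per_minute": vehicles_per_min,
    })

# One reading from an imaginary intersection sensor.
print(traffic_sensor_reading("intersection-042", 37, 36.17, -115.14))
```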
Over five years, cities are looking at an average yearly IT spending growth rate of 11 percent, which “is pretty high compared to a ‘normal’ IT spend,” IDC’s Clarke said. The rise is being driven by the aspiration of cities to reduce waste and make cost savings, as well as to improve public safety and transportation and better prepare for weather disasters.
“There are value propositions that are almost impossible to ignore, and that means cities have to find ways to pay for all of this,” Clarke said. “That’s where the cloud and mobile apps help to push them over the edge in being able to adopt all of these things.”
Mobile will occupy a central part of the smart city movement, Clarke believes, as cities try to shift more and more to e-government and self service. “That will cause a lot of changes to what’s happening in city IT,” Clarke said. “Because you can’t have the traditional [online] user interface and workflow when you are talking about getting usability out of native mobile apps.”
Brian Robinson is a freelance technology writer for GCN.
Lee Kermode tampa
Need to start with better as built information look at photogrammetry from 3dsi. That will lower risk and expense of transforming to a smart city. | 科技 |
Scientific Investigations Report 2006-5218
Michael T. Pavelko,
Jorn Hoffmann,
and Nancy A. Damar
The U.S. Geological Survey, in cooperation with the Nevada Department of Conservation and Natural Resources-Division of Water Resources and the Las Vegas Valley Water District, compiled 44 individual interferograms and 1 stacked interferogram comprising 29 satellite synthetic aperture radar acquisitions of Las Vegas Valley, Nevada, from 1992 to 1999. The interferograms, which depict short-term, seasonal, and long-term trends in land subsidence and uplift, are viewable with an interactive map. The interferograms show that land subsidence and uplift generally occur in localized areas, are responsive to ground-water pumpage and artificial recharge, and, in part, are fault controlled. Information from these interferograms can be used by water and land managers to mitigate land subsidence and associated damage.
Land subsidence attributed to ground-water pumpage has been documented in Las Vegas Valley since the 1940s. Damage to roads, buildings, and other engineered structures has been associated with this land subsidence. Land uplift attributed to artificial recharge and reduced pumping has been documented since the 1990s. Measuring these land-surface changes with traditional benchmark and Global Positioning System surveys can be costly and time consuming, and results typically are spatially and temporally sparse. Interferograms are relatively inexpensive and provide temporal and spatial resolutions previously not achievable.
The interferograms are viewable with an interactive map. Landsat images from 1993 and 2000 are viewable for frames of reference to locate areas of interest and help determine land use. A stacked interferogram for 1992-99 is viewable to visualize the cumulative vertical displacement for the period represented by the individual interferograms. The interactive map enables users to identify and estimate the magnitude of vertical displacement, visually analyze deformation trends, and view interferograms and Landsat images side by side. The interferograms and Landsat images are available for download, in formats for use with Geographic Information System software.
Scientific Investigations Report
Time Range Start:
Time Range End:
URL: http://pubs.er.usgs.gov/publication/sir20065218 | 科技 |