A study by electrical supply company Crescent Electric (CESCO) reveals that Louisiana is the cheapest state in the US in which to mine Bitcoin. Digital currency mining requires a lot of electric power, and power rates differ from state to state. Based on CESCO’s latest study of the cost of cryptocurrency mining across the US, it is currently cheapest to mine Bitcoin in Louisiana -- an electricity rate of 9.87 cents per kilowatt-hour puts the average cost of mining one Bitcoin at $3,224. This is significantly cheaper than the price of Bitcoin itself, which was trading at around $12,000 per coin as of press time. Where else in the US is it cheap to mine? In its study, CESCO estimated the cost of Bitcoin mining based on the wattage consumption of the three most popular mining rigs -- the AntMiner S9, the AntMiner S7, and the Avalon 6 -- as well as the average number of days each rig takes to mine a token. These figures were then multiplied by the average electricity rate in each state. After Louisiana, the states rounding out the top five lowest costs to mine Bitcoin are Idaho ($3,289 per token), Washington ($3,309), Tennessee ($3,443) and Arkansas ($3,505). The study also names the most expensive states for digital currency mining. The list of costliest states is led by Hawaii, with an average mining cost of $9,483 per coin. Rounding out the top five states with the highest Bitcoin mining costs are Alaska ($7,059), Connecticut ($6,951), Massachusetts ($6,674) and New Hampshire ($6,425). The growing interest in cryptocurrency has been accompanied by growing concern over the energy required to mine it, particularly Bitcoin. Such concerns have recently been countered by a report that put cryptocurrency mining in the larger context of overall energy consumption.
A new study has named Louisiana as the cheapest state in the US in which to mine bitcoin. Electrical supply company Crescent Electric based its calculation on the cost of electricity in each state, the power requirements of the equipment needed, and the average length of time taken to mine a token. This produced a figure of $3,224 per bitcoin for Louisiana, with the most expensive places being Hawaii, at $9,483, and Alaska at $7,059. All of these figures are notably less than the current trading price of bitcoin.
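The cost model described in the article above (rig power draw, multiplied by the time it takes to mine one token, multiplied by the local electricity rate) can be sketched in a few lines. The rig wattage, days-per-coin figure and the Hawaii rate below are illustrative assumptions, not inputs taken from the CESCO study; only the general formula and the Louisiana rate come from the article.

```python
# Back-of-the-envelope version of the per-coin estimate described above:
# cost = rig draw (kW) x 24 h/day x days to mine one coin x electricity rate ($/kWh).
# Rig draw, days-per-coin and the Hawaii rate are illustrative assumptions.

def cost_per_coin(rig_watts: float, days_per_coin: float, rate_usd_per_kwh: float) -> float:
    """Electricity cost (USD) to mine a single Bitcoin with one rig."""
    kwh_used = (rig_watts / 1000.0) * 24.0 * days_per_coin
    return kwh_used * rate_usd_per_kwh

if __name__ == "__main__":
    # Hypothetical AntMiner S9-class rig: ~1,375 W, ~900 days per coin at early-2018 difficulty.
    for state, rate in [("Louisiana", 0.0987), ("Hawaii", 0.30)]:  # $/kWh; Hawaii rate assumed
        print(f"{state}: ~${cost_per_coin(1375, 900, rate):,.0f} per coin")
```

With those assumed inputs the sketch lands in the same ballpark as the figures reported above (roughly $3,000 per coin for Louisiana and $9,000 for Hawaii), which is the point of the exercise rather than a reproduction of CESCO's exact numbers.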
Although the name of the complex was changed to Spring Creek Towers several years ago, it is still widely known as Starrett City. A massive development, it has its own power plant, schools, recreation center and ZIP code. The sale has garnered some notoriety not just because of its size but also because President Trump has a small stake in the complex. Carol G. Deane, the managing partner of Starrett City Associates, who was behind the sale, had argued in court that she balanced the need to satisfy shareholders with a deal that could win government approval and preserve Starrett City as a home for low- and moderate-income New Yorkers. More than 70 percent of the limited partners and beneficial owners approved the deal in September. “We are pleased with the decision denying plaintiffs’ efforts to derail the sale of Spring Creek Towers,” Ms. Deane said in a statement Tuesday, “but not surprised because of the care we put into the process and choosing a buyer who is committed to maintaining the development as affordable and a quality place to live for the 15,000 residents who call it home.” Joshua D.N. Hess, a lawyer for the dissidents, said Tuesday that they were reviewing the judge’s order and their options, which could include suing for damages. Mr. Deane, the complex’s original developer, died in 2010; Ms. Deane was his third wife. He had tried to sell the complex to the highest bidder during a debt-fueled real estate boom in 2007, but the deal fell apart amid sharp criticism from city, state and federal officials, as well as tenants.
The sale of a huge apartment complex in New York has been cleared by a judge after it was challenged in court. Starrett City in Brooklyn is the largest federally subsidised housing development in the United States, with 5,581 apartments on a 145-acre site. It is being sold for $905m by the widow of its original developer, but the transaction has been opposed by a rival bidder, backed by a partner in the complex. The Supreme Court of the state of New York has now dismissed its objection, but the transaction still requires approval by state and federal officials.
L'Oreal will expand a media ownership strategy it piloted in Mexico to other Spanish-speaking countries to generate first-party cookies from its customer base. The idea is similar to sponsored content, but instead of working with a brand-name media company on a story package, for the past year L’Oreal has developed fiufiu, a kind of pop-up media brand that turns out social media aggregated lists, influencer columns and work by the content generation startup Cultura Colectiva. Fiufiu has a website and a Facebook presence. During some months it generates unique visitor numbers on par with well-known beauty magazines, said L'Oreal Hispanic countries CMO Andres Amezquita. The company could generate page views and engagement if it paired social influencer campaigns with beauty magazine sponsored content deals, but with fiufiu L’Oreal gets first-party cookies and the opportunities to request email addresses that come with page ownership. “Part of our programmatic approach is to have a lot of data and understand our consumers,” Amezquita said. “The idea is to be able to have a one-on-one conversation at the scale of mass communication, but to do that in the current environment, we need to capture cookies.” Recent policy changes at the government and operating system levels, such as GDPR in Europe and Apple’s Safari Intelligent Tracking Prevention, block access to cookie data for marketers. Younger customers, who make up most of the audience of L’Oreal’s social-driven site, also respond better to in-article recommendations than the hard sell of ads on the page, Amezquita said, and every page is another chance to gather an email address or send someone to an L’Oreal product page. Consumer brands need first-party cookies to advance their online advertising. As ad tech companies that do audience targeting or retargeting lose access to cookie data, brands will need to bring their own data to continue running data-driven campaigns. For instance, L’Oreal can use fiufiu to generate retargeting audiences and extrapolate its first-party data through lookalike models scaled for programmatic, Amezquita said. “If you look at the industry right now, it’s a moment where content, CRM and digital advertising are becoming one,” he said.
Cosmetics firm L'Oreal is set to expand a media strategy piloted in Mexico to other Spanish-speaking nations, according to CMO Andres Amezquita. Using its media brand Fiufiu, L'Oreal plans to capture first-party cookies to promote its online advertising. Brands will have to take more ownership of their data and apply similar models to those employed by the adtech companies to target their audience effectively.
Though it has billions in the bank, Scripps Health will pursue layoffs in 2018 as part of a reorganization strategy that emphasizes lower costs and greater reliance on caring for patients outside of its five hospitals. In a recent memo to all of the health system’s 15,000 employees and 3,000 affiliated doctors, Chris Van Gorder, Scripps’ chief executive officer, says that cuts are necessary to remain competitive in a health care world where health insurance companies increasingly consider low prices as a main factor in contracting and patients are more often shopping around for services as deductibles increase. Scripps missed its annual budget by $20 million last year for the first time in 15 years, Van Gorder said in an interview this week. It was a wake-up call, he said, that added urgency to the need to both cut costs and re-think how the private not-for-profit health network does business. “We’ve got to shift our organizational structures around to be able to deal with the new world of health care delivery, find ways of lowering our costs significantly,” Van Gorder said. “If we don’t, we will not be able to compete.” Scripps is far from the only large health care system to announce cost-cutting measures in recent months. Advocate Health, Illinois’ largest health care provider with 12 hospitals and 35,000 employees, announced in May that it would try to cut more than $200 million in costs due to flat revenue projections. Tenet Health, the nation’s third-largest medical chain with 77 hospitals, announced in October that it will eliminate a middle layer of management after posting a $56 million loss on a 1.4 percent single-quarter revenue decline, according to industry news source Modern Healthcare. Things haven’t gotten quite so bad yet at Scripps. A look at Scripps’ bond filings makes it clear that this is not a company teetering on the edge of insolvency. Far from it. Scripps, the audited financial statements for the 2017 budget year show, has banked about $2.6 billion in unrestricted cash and other investments. However, Scripps has recently seen its operating margin, the percentage of revenue left over after all the bills are paid, shrink significantly from about 9 percent in 2012 to 2.3 percent in 2017. Financial statements show that Scripps’ bottom line was significantly bolstered this year by investment income. A roaring stock market helped significantly increase the value of its holdings, pushing total profitability up to $350 million for the year, a number that is 25 percent greater than last year’s. How can an organization be making money, have a significant financial cushion and yet still be planning for layoffs in the coming year? Van Gorder said the basic fact is that the operation of the organization must continue to bring in more revenue than it spends, and declining operating margins must be addressed even if revenue from the stock market is currently masking those declines. “As strong as we are on the bottom, bottom line, the trends at the top end are changing, and we have to adjust to them,” Van Gorder said. [Photo caption: Chris Van Gorder, president and chief executive of Scripps Health, speaks to the editorial board of The San Diego Union-Tribune in 2012. (John Gastaldo)] Still, with more than $2 billion in the bank, couldn’t Scripps afford to burn some savings and stave off layoffs? Van Gorder said that it’s not responsible management to fix a recurring problem with savings.
And, having a hefty balance sheet, he added, is necessary to get favorable interest rates from lenders as Scripps moves to execute a recently announced $2.6 billion building plan that will replace Scripps Mercy Hospital in Hillcrest, add a new patient tower at Scripps Memorial Hospital La Jolla, and upgrade facilities in Encinitas, Chula Vista and Torrey Pines. These upgrades and replacements, the executive added, are made more urgent than they would otherwise be by a state law that requires all hospitals to meet certain seismic requirements by 2030 or cease operation. In addition to borrowing and revenue from philanthropy, Scripps plans to tap its savings to fund its building plan. That plan, Van Gorder said, will continue but will be undertaken with the knowledge that insurance companies want to avoid paying the higher prices charged by hospitals whenever possible. That means pulling back on the previous tendency to include plenty of space inside hospital complexes for patients whose medical needs don’t require them to be admitted for an overnight stay. Scripps has already started on this path with the purchase of Imaging Healthcare Specialists in 2015. The company operates stand-alone imaging centers that offer cheaper scans than are available in the outpatient centers attached to the region’s major hospitals. “We’re seeing a huge shift in the ambulatory side,” Van Gorder said. “We’re now, for example, doing total joint replacement in ambulatory surgery centers. A year ago, that didn’t take place. The experts are telling me and others that you’re going to see a huge jump in that as technology continues to improve … that’s going to pull a whole lot of utilization away from hospitals, and hospitals that haven’t prepared for that shift are going to be in deep trouble in the not-too-distant future. That’s why we’re trying to make this shift proactively.” Scripps’ bond filings do show that it has seen a significant shift in its business over the last five years. From 2011 to 2016, the most recent year for which aggregated patient data is available, Scripps reported that inpatient discharges, the total number of days that patients spent in its hospitals, and surgeries performed in hospitals all decreased slightly. Meanwhile, the number of surgeries performed at its outpatient surgery centers increased 69 percent. Visits to Scripps Clinic and Scripps Coastal Medical Group increased 14 percent and 19 percent respectively, and emergency visits were up 21 percent. Scripps’ first step in its reorganization plan to better address the shift from inpatient to outpatient care is to collapse its ranks of hospital management. Instead of having a chief executive officer to manage operations at each of its five hospitals, the plan is to have two executives handle those duties, one for hospitals in La Jolla and Encinitas and another to oversee operations in San Diego and Chula Vista. A third executive will be in charge of all ancillary services, including operation of the medical offices, outpatient surgery centers and other facilities that Scripps owns or leases throughout the region. This consolidation will not require any layoffs. Gary Fybel, currently the chief executive officer at Scripps Memorial Hospital La Jolla, and Robin Brown, chief executive officer at Scripps Green Hospital, will retire. The northern chief executive officer position has been awarded to Carl Etter, current chief executive officer of Scripps Memorial Hospital Encinitas.
The south position will be handled by Tom Gammiere, who currently runs Scripps Mercy Hospital in Hillcrest. Lisa Thakur, currently corporate vice president of operating rooms, pharmacy and supply chain, will fill the ancillary services post, and her previous job will not be filled. Van Gorder said further personnel cuts are coming and are intended to save $30 million during the current budget year ending Sept. 30, 2018. Cuts should save $40 million in subsequent years, he added. Most will be focused on administrative job classifications. “There will be layoffs,” Van Gorder said. “I don’t like it, but it has to be done for me to protect the organization and our ability to take care of our community into the future.” So far, he said, there are no specifics to share on which particular jobs are most at risk. He added that, as the current reorganization effort takes shape, hiring is also anticipated. More workers will be needed to provide quicker service to patients. “What I want to do at the point of service is support our nursing staff with more paid professional staff,” Van Gorder said. Gerard Anderson, a professor of health policy and management at Johns Hopkins School of Medicine in Baltimore, said the current trend of cost-cutting at large health care systems does indeed appear to be driven by decreases in reimbursement by Medicare and private health insurance companies. “In almost every single hospital they’re losing some amount of money in operating margin but making it up in spades when you add in their investment income,” Anderson said. The researcher said he was a bit disturbed to see organizations flush with stock market earnings discuss cuts, especially if those cuts are to employees who directly care for patients. “You’re seeing layoffs but I don’t necessarily understand why you need to lay someone off when your margin is something like 11 percent after non-operating income is added into the mix,” Anderson said. But what about the need to shift business strategy and reduce administrative expenses as reimbursement falls? Anderson said he can see that point as long as it’s true. “If they really are doing this in the non-patient-care area, I think that makes sense,” he said. “In general, we’ve seen more across-the-board cuts than cuts targeted specifically to areas of management and administration. The salaries in management have grown faster than they have for clinicians.”
The San Diego-based non-profit health care system Scripps Health is set to pursue layoffs as part of a 2018 restructuring. Scripps missed its annual budget by $20 million and intends to target lower costs and a greater reliance on caring for patients outside its hospitals. The company’s CEO, Chris Van Gorder, said in a memo that the cuts are necessary as insurers and patients increasingly seek lower prices while health costs rise. The announcement by Scripps follows a cost-cutting trend among large health care organizations; both Advocate Health of Illinois and Tenet Health announced similar measures following disappointing results. Although Scripps holds a $2.6 billion balance in cash and investments and posted total profits of $350 million, up 25 percent on the previous year, its operating margin has slimmed from 9 percent in 2012 to 2.3 percent in 2017. The firm believes it would not be responsible management to use savings to cover a recurring problem, especially as it needs favorable interest rates from lenders for its $2.6 billion building project. Alongside the personnel cuts intended to save $30 million in the current budget year, there will be a consolidation of management positions.
Beneath the waves, oxygen disappears. As plastic waste pollutes the oceans and fish stocks decline, unseen below the surface another problem grows: deoxygenation. Breitburg et al. review the evidence for the downward trajectory of oxygen levels in increasing areas of the open ocean and coastal waters. Rising nutrient loads coupled with climate change—each resulting from human activities—are changing ocean biogeochemistry and increasing oxygen consumption. This results in destabilization of sediments and fundamental shifts in the availability of key nutrients. In the short term, some compensatory effects may result in improvements in local fisheries, such as in cases where stocks are squeezed between the surface and elevated oxygen minimum zones. In the longer term, these conditions are unsustainable and may result in ecosystem collapses, which ultimately will cause societal and economic harm. Science, this issue p. eaam7240. Structured Abstract. BACKGROUND: Oxygen concentrations in both the open ocean and coastal waters have been declining since at least the middle of the 20th century. This oxygen loss, or deoxygenation, is one of the most important changes occurring in an ocean increasingly modified by human activities that have raised temperatures, CO2 levels, and nutrient inputs and have altered the abundances and distributions of marine species. Oxygen is fundamental to biological and biogeochemical processes in the ocean. Its decline can cause major changes in ocean productivity, biodiversity, and biogeochemical cycles. Analyses of direct measurements at sites around the world indicate that oxygen-minimum zones in the open ocean have expanded by several million square kilometers and that hundreds of coastal sites now have oxygen concentrations low enough to limit the distribution and abundance of animal populations and alter the cycling of important nutrients. ADVANCES: In the open ocean, global warming, which is primarily caused by increased greenhouse gas emissions, is considered the primary cause of ongoing deoxygenation. Numerical models project further oxygen declines during the 21st century, even with ambitious emission reductions. Rising global temperatures decrease oxygen solubility in water, increase the rate of oxygen consumption via respiration, and are predicted to reduce the introduction of oxygen from the atmosphere and surface waters into the ocean interior by increasing stratification and weakening ocean overturning circulation. In estuaries and other coastal systems strongly influenced by their watershed, oxygen declines have been caused by increased loadings of nutrients (nitrogen and phosphorus) and organic matter, primarily from agriculture; sewage; and the combustion of fossil fuels. In many regions, further increases in nitrogen discharges to coastal waters are projected as human populations and agricultural production rise. Climate change exacerbates oxygen decline in coastal systems through similar mechanisms as those in the open ocean, as well as by increasing nutrient delivery from watersheds that will experience increased precipitation. Expansion of low-oxygen zones can increase production of N2O, a potent greenhouse gas; reduce eukaryote biodiversity; alter the structure of food webs; and negatively affect food security and livelihoods. Both acidification and increasing temperature are mechanistically linked with the process of deoxygenation and combine with low-oxygen conditions to affect biogeochemical, physiological, and ecological processes.
However, an important paradox to consider in predicting large-scale effects of future deoxygenation is that high levels of productivity in nutrient-enriched coastal systems and upwelling areas associated with oxygen-minimum zones also support some of the world’s most prolific fisheries. OUTLOOK: Major advances have been made toward understanding patterns, drivers, and consequences of ocean deoxygenation, but there is a need to improve predictions at large spatial and temporal scales important to ecosystem services provided by the ocean. Improved numerical models of oceanographic processes that control oxygen depletion and the large-scale influence of altered biogeochemical cycles are needed to better predict the magnitude and spatial patterns of deoxygenation in the open ocean, as well as feedbacks to climate. Developing and verifying the next generation of these models will require increased in situ observations and improved mechanistic understanding on a variety of scales. Models useful for managing nutrient loads can simulate oxygen loss in coastal waters with some skill, but their ability to project future oxygen loss is often hampered by insufficient data and climate model projections on drivers at appropriate temporal and spatial scales. Predicting deoxygenation-induced changes in ecosystem services and human welfare requires scaling effects that are measured on individual organisms to populations, food webs, and fisheries stocks; considering combined effects of deoxygenation and other ocean stressors; and placing an increased research emphasis on developing nations. Reducing the impacts of other stressors may provide some protection to species negatively affected by low-oxygen conditions. Ultimately, though, limiting deoxygenation and its negative effects will necessitate a substantial global decrease in greenhouse gas emissions, as well as reductions in nutrient discharges to coastal waters. Low and declining oxygen levels in the open ocean and coastal waters affect processes ranging from biogeochemistry to food security. The global map indicates coastal sites where anthropogenic nutrients have exacerbated or caused O2 declines to <2 mg liter−1 (<63 μmol liter−1) (red dots), as well as ocean oxygen-minimum zones at 300 m of depth (blue shaded regions). [Map created from data provided by R. Diaz, updated by members of the GO2NE network, and downloaded from the World Ocean Atlas 2009]. Abstract: Oxygen is fundamental to life. Not only is it essential for the survival of individual animals, but it regulates global cycles of major nutrients and carbon. The oxygen content of the open ocean and coastal waters has been declining for at least the past half-century, largely because of human activities that have increased global temperatures and nutrients discharged to coastal waters. These changes have accelerated consumption of oxygen by microbial respiration, reduced solubility of oxygen in water, and reduced the rate of oxygen resupply from the atmosphere to the ocean interior, with a wide range of biological and ecological consequences. Further research is needed to understand and predict long-term, global- and regional-scale oxygen changes and their effects on marine and estuarine fisheries and ecosystems. Oxygen levels have been decreasing in the open ocean and coastal waters since at least the middle of the 20th century (1–3). This ocean deoxygenation ranks among the most important changes occurring in marine ecosystems (1, 4–6) (Figs. 1 and 2).
The oxygen content of the ocean constrains productivity, biodiversity, and biogeochemical cycles. Major extinction events in Earth’s history have been associated with warm climates and oxygen-deficient oceans (7), and under current trajectories, anthropogenic activities could drive the ocean toward widespread oxygen deficiency within the next thousand years (8). In this Review, we refer to “coastal waters” as systems that are strongly influenced by their watershed, and the “open ocean” as waters in which such influences are secondary. Fig. 1 Oxygen has declined in both the open ocean and coastal waters during the past half-century. (A) Coastal waters where oxygen concentrations ≤61 μmol kg−1 (63 μmol liter−1 or 2 mg liter−1) have been reported (red) (8, 12). [Map created from data in (8) and updated by R. Diaz and authors] (B) Change in oxygen content of the global ocean in mol O2 m−2 decade−1 (9). Most of the coastal systems shown here reported their first incidence of low oxygen levels after 1960. In some cases, low oxygen may have occurred earlier but was not detected or reported. In other systems (such as the Baltic Sea) that reported low levels of oxygen before 1960, low-oxygen areas have become more extensive and severe (59). Dashed-dotted, dashed, and solid lines delineate boundaries with oxygen concentrations <80, 40, and 20 μmol kg−1, respectively, at any depth within the water column (9). [Reproduced from (9)] Fig. 2 Dissolved oxygen concentrations in the open ocean and the Baltic Sea. (A) Oxygen levels at a depth of 300 m in the open ocean. Major eastern boundary and Arabian Sea upwelling zones, where oxygen concentrations are lowest, are shown in magenta, but low oxygen levels can be detected in areas other than these major OMZs. At this depth, large areas of global ocean water have O2 concentrations <100 μmol liter−1 (outlined and indicated in red). ETNP, eastern tropical North Pacific; ETSP, eastern tropical South Pacific; ETSA, eastern tropical South Atlantic; AS, Arabian Sea. [Max Planck Institute for Marine Microbiology, based on data from the World Ocean Atlas 2009] (B) Oxygen levels at the bottom of the Baltic Sea during 2012 (59). In recent years, low-oxygen areas have expanded to 60,000 km2 as a result of limited exchange, high anthropogenic nutrient loads, and warming waters (59) (red, O2 concentration ≤63 μmol liter−1 [2 mg liter−1]; black, anoxia). [Reproduced from (59)] The open ocean lost an estimated 2%, or 4.8 ± 2.1 petamoles (77 billion metric tons), of its oxygen over the past 50 years (9). Open-ocean oxygen-minimum zones (OMZs) have expanded by an area about the size of the European Union (4.5 million km2, based on water with <70 μmol kg−1 oxygen at 200 m of depth) (10), and the volume of water completely devoid of oxygen (anoxic) has more than quadrupled over the same period (9). Upwelling of oxygen-depleted water has intensified in severity and duration along some coasts, with serious biological consequences (11). Since 1950, more than 500 sites in coastal waters have reported oxygen concentrations ≤2 mg liter−1 (=63 μmol liter−1 or ≅61 μmol kg−1), a threshold often used to delineate hypoxia (3, 12) (Fig. 1A). Fewer than 10% of these systems were known to have hypoxia before 1950. Many more water bodies may be affected, especially in developing nations where available monitoring data can be sparse and inadequately accessed even for waters receiving high levels of untreated human and agricultural waste.
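The hypoxia threshold quoted above is just a unit conversion: 2 mg of O2 per liter divided by the molar mass of O2 (about 32 g per mole) gives roughly 63 μmol per liter, and dividing by a typical seawater density of about 1.025 kg per liter gives roughly 61 μmol per kg. A minimal sketch of that arithmetic follows; the molar mass and density are standard reference values, not numbers taken from the review.

```python
# Convert a dissolved-oxygen threshold between the units used in the text.
# Assumes O2 molar mass ~31.998 g/mol and a typical seawater density of ~1.025 kg/L.

O2_MOLAR_MASS_G_PER_MOL = 31.998
SEAWATER_DENSITY_KG_PER_L = 1.025  # varies slightly with temperature and salinity

def mg_per_l_to_umol_per_l(mg_per_l: float) -> float:
    """mg O2 per liter -> micromoles O2 per liter."""
    return mg_per_l / O2_MOLAR_MASS_G_PER_MOL * 1000.0

def umol_per_l_to_umol_per_kg(umol_per_l: float) -> float:
    """micromoles per liter -> micromoles per kilogram of seawater."""
    return umol_per_l / SEAWATER_DENSITY_KG_PER_L

threshold_mg_l = 2.0                                 # common hypoxia threshold
umol_l = mg_per_l_to_umol_per_l(threshold_mg_l)      # ~62.5, i.e. ~63 umol/L
umol_kg = umol_per_l_to_umol_per_kg(umol_l)          # ~61 umol/kg
print(f"{threshold_mg_l} mg/L = {umol_l:.1f} umol/L = {umol_kg:.1f} umol/kg")
```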
Oxygen continues to decline in some coastal systems despite substantial reductions in nutrient loads, which have improved other water quality metrics (such as levels of chlorophyll a) that are sensitive to nutrient enrichment (13). Oxygen is naturally low or absent where biological oxygen consumption through respiration exceeds the rate of oxygen supplied by physical transport, air-sea fluxes, and photosynthesis for sufficient periods of time. A large variety of such systems exist, including the OMZs of the open ocean, the cores of some mode-water eddies, coastal upwelling zones, deep basins of semi-enclosed seas, deep fjords, and shallow productive waters with restricted circulation (14, 15). Whether natural or anthropogenically driven, however, low oxygen levels and anoxia leave a strong imprint on biogeochemical and ecological processes. Electron acceptors, such as Fe(III) and sulfate, that replace oxygen as conditions become anoxic yield less energy than aerobic respiration and constrain ecosystem energetics (16). Biodiversity, eukaryotic biomass, and energy-intensive ecological interactions such as predation are reduced (17–19), and energy is increasingly transferred to microbes (3, 16). As oxygen depletion becomes more severe, persistent, and widespread, a greater fraction of the ocean is losing its ability to support high-biomass, diverse animal assemblages and provide important ecosystem services. But the paradox is that these areas, sometimes called dead zones, are far from dead. Instead they contribute to some of the world’s most productive fisheries harvested in the adjacent, oxygenated waters (20–22) and host thriving microbial assemblages that utilize a diversity of biogeochemical pathways (16). Eukaryote organisms that use low-oxygen habitats have evolved physiological and behavioral adaptations that enable them to extract, transport, and store sufficient oxygen, maintain aerobic metabolism, and reduce energy demand (23–26). Fishes, for example, adjust ventilation rate, cardiac activity, hemoglobin content, and O2 binding and remodel gill morphology to increase lamellar surface area (27). For some small taxa, including nematodes and polychaetes, high surface area–to–volume ratios enhance diffusion and contribute to hypoxia tolerance (26). Metabolic depression (23, 25, 28) and high H2S tolerance (24) are also key adaptations by organisms to hypoxic and anoxic environments. Causes of oxygen decline Global warming as a cause of oxygen loss in the open ocean The discovery of widespread oxygen loss in the open ocean during the past 50 years depended on repeated hydrographic observations that revealed oxygen declines at locations ranging from the northeast Pacific (29) and northern Atlantic (30) to tropical oceans (2). Greenhouse gas–driven global warming is the likely ultimate cause of this ongoing deoxygenation in many parts of the open ocean (31). For the upper ocean over the period 1958–2015, oxygen and heat content are highly correlated with sharp increases in both deoxygenation and ocean heat content, beginning in the mid-1980s (32). Ocean warming reduces the solubility of oxygen. Decreasing solubility is estimated to account for ~15% of current total global oxygen loss and >50% of the oxygen loss in the upper 1000 m of the ocean (9, 33). Warming also raises metabolic rates, thus accelerating the rate of oxygen consumption.
Therefore, decomposition of sinking particles occurs faster, and remineralization of these particles is shifted toward shallower depths (34), resulting in a spatial redistribution but not necessarily a change in the magnitude of oxygen loss. Intensified stratification may account for the remaining 85% of global ocean oxygen loss by reducing ventilation—the transport of oxygen into the ocean interior—and by affecting the supply of nutrients controlling production of organic matter and its subsequent sinking out of the surface ocean. Warming exerts a direct influence on thermal stratification and indirectly enhances salinity-driven stratification through its effects on ice melt and precipitation. Increased stratification alters the mainly wind-driven circulation in the upper few hundred meters of the ocean and slows the deep overturning circulation (9). Reduced ventilation, which may also be influenced by decadal to multidecadal oscillations in atmospheric forcing patterns (35), has strong subsurface manifestations at relatively shallow ocean depths (100 to 300 m) in the low- to mid-latitude oceans and less pronounced signatures down to a few thousand meters at high latitudes. Oxygen declines closer to shore have also been found in some systems, including the California Current and lower Saint Lawrence Estuary, where the relative strengths of various currents have changed and remineralization has increased (36, 37). There is general agreement between numerical models and observations about the total amount of oxygen loss in the surface ocean (38). There is also consensus that direct solubility effects do not explain the majority of oceanic oxygen decline (31). However, numerical models consistently simulate a decline in the total global ocean oxygen inventory equal to only about half that of the most recent observation-based estimate and also predict different spatial patterns of oxygen decline or, in some cases, increase (9, 31, 39). These discrepancies are most marked in the tropical thermocline (40). This is problematic for predictions of future deoxygenation, as these regions host large open-ocean OMZs, where a further decline in oxygen levels could have large impacts on ecosystems and biogeochemistry (Fig. 2A). It is also unclear how much ocean oxygen decline can be attributed to alterations in ventilation versus respiration. Mechanisms other than greenhouse gas–driven global warming may be at play in the observed ocean oxygen decline that are not well represented in current ocean models. For example, internal oscillations in the climate system, such as the Pacific Decadal Oscillation, affect ventilation processes and, eventually, oxygen distributions (35). Models predict that warming will strengthen winds that favor upwelling and the resulting transport of deeper waters onto upper slope and shelf environments in some coastal areas (41, 42), especially at high latitudes within upwelling systems that form along the eastern boundary of ocean basins (43). The predicted magnitude and direction of change is not uniform, however, either within individual large upwelling systems or among different systems. Upwelling in the southern Humboldt, southern Benguela, and northern Canary Eastern Boundary upwelling systems is predicted to increase in both duration and intensity by the end of the 21st century (43). Where the oxygen content of subsurface source waters declines, upwelling introduces water to the shelf that is both lower in oxygen and higher in CO2.
Along the central Oregon coast of the United States in 2006, for example, anoxic waters upwelled to depths of <50 m within 2 km of shore, persisted for 4 months, and resulted in large-scale mortality of benthic macro-invertebrates (11). There are no prior records of such severe oxygen depletion over the continental shelf or within the OMZ in this area (11). Nutrient enrichment of coastal waters Sewage discharges have been known to deplete oxygen concentrations in estuaries since at least the late 1800s (44), and by the mid 1900s the link to agricultural fertilizer runoff was discussed (45). Nevertheless, the number and severity of hypoxic sites has continued to increase (Fig. 2B). The human population has nearly tripled since 1950 (46). Agricultural production has greatly increased to feed this growing population and meet demands for increased consumption of animal protein, resulting in a 10-fold increase in global fertilizer use over the same period (47). Nitrogen discharges from rivers to coastal waters increased by 43% in just 30 years from 1970 to 2000 (48), with more than three times as much nitrogen derived from agriculture as from sewage (49). Eutrophication occurs when nutrients (primarily N and P) and biomass from human waste and agriculture, as well as N deposition from fossil fuel combustion, stimulate the growth of algae and increase algal biomass. The enhanced primary and secondary production in surface waters increases the delivery rate of degradable organic matter to bottom waters where microbial decomposition by aerobic respiration consumes oxygen. Once oxygen levels are low, behavioral and biogeochemical feedbacks can hinder a return to higher-oxygen conditions (50). For example, burrowing invertebrates that introduce oxygen to sediments die or fail to recruit, and sediment phosphorus is released, fueling additional biological production in the water column and eventual increased oxygen consumption. Coastal systems vary substantially in their susceptibility to developing low oxygen concentrations. Low rates of vertical exchange within the water column reduce rates of oxygen resupply (51), and long water-retention times favor the accumulation of phytoplankton biomass (14) and its eventual subsurface degradation. Chesapeake Bay develops hypoxia and anoxia that persist for several months during late spring through early autumn and cover up to 30% of the system area. In contrast, the nearby Delaware Bay, which has weaker stratification and a shorter retention time, does not develop hypoxia, in spite of similar nutrient loads (52). Manila Bay is adjacent to a megacity and also receives similar loads on an annual basis, but it becomes hypoxic principally during the wet southwest monsoon period, when rainfall increases nutrient loads and stratification (53). Low oxygen in coastal waters and semi-enclosed seas can persist for minutes to thousands of years and may extend over spatial scales ranging from less than one to many thousands of square kilometers. Both local and remote drivers lead to temporal and spatial variations in hypoxia. Local weather can influence oxygen depletion in very shallow water through wind mixing and the effect of cloud cover on photosynthesis (54). 
At larger spatial scales, variations in wind direction and speed (55), precipitation and nutrient loads (56), sea surface temperature (57), and nutrient content of water masses transported into bottom layers of stratified coastal systems contribute to interannual and longer-period variations in hypoxic volume, duration, and rate of deoxygenation (14). Climate change in coastal waters Warming is predicted to exacerbate oxygen depletion in many nutrient-enriched coastal systems through mechanisms similar to those of the open ocean: increased intensity and duration of stratification, decreased oxygen solubility, and accelerated respiration (4, 58, 59). The current rate of oxygen decline in coastal areas exceeds that of the open ocean (60), however, likely reflecting the combined effects of increased warming of shallow water and higher concentrations of nutrients. Higher air temperatures can result in earlier onset and longer durations of hypoxia in eutrophic systems through effects on the seasonal timing of stratification and the rate of oxygen decline (58). An ensemble modeling study of the Baltic Sea projects declining oxygen under all but the most aggressive nutrient-reduction plans, owing to increased precipitation and consequent nutrient loads, decreased flux of oxygen from the atmosphere, and increased internal nutrient cycling. Even aggressive nutrient reduction is projected to yield far less benefit under climate change than under current conditions (61). Because of regional variations in the effects of global warming on precipitation and winds, the rate and direction of change in oxygen content is expected to vary among individual coastal water bodies (4, 58). Where precipitation increases, both stratification and nutrient discharges are expected to increase, with the reverse occurring in regions where precipitation decreases. Changes in seasonal patterns of precipitation and rates of evaporation can also be important. Coastal wetlands that remove nutrients before they reach open water are predicted to be lost as sea levels rise, decreasing capacity to remove excess nitrogen, but the rate of wetland inundation and the ability of wetlands to migrate landward will vary. Effects of ocean deoxygenation Oxygen influences biological and biogeochemical processes at their most fundamental level (Fig. 3). As research is conducted in more habitats and using new tools and approaches, the range of effects of deoxygenation that have been identified, and the understanding of the mechanisms behind those effects, has increased substantially. Although 2 mg liter−1 (61 μmol kg−1) is a useful threshold for defining hypoxia when the goal is to quantify the number of systems or the spatial extent of oxygen-depleted waters, a more appropriate approach when considering biological and ecological effects is to simply define hypoxia as oxygen levels sufficiently low to affect key or sensitive processes. Organisms have widely varying oxygen tolerances, even in shallow coastal systems (19). In addition, because temperature affects not only oxygen supply (through its effect on solubility and diffusion) but also the respiratory demand by organisms, oxygen limitation for organisms is better expressed as a critical oxygen partial pressure below which specific organisms exhibit reduced metabolic functions than in terms of oxygen concentration (62, 63). Fig. 3 Life and death at low oxygen levels. (A) Animals using low-oxygen habitats exhibit a range of physiological, morphological, and behavioral adaptations. 
For example, terebellid worms (Neoamphitrite sp., Annelida) with large branchiae and high hemoglobin levels can survive in the extremely low oxygen levels found at 400 m depth in the Costa Rica Canyon. (B) Fish kills in aquaculture pens in Bolinao, Philippines, had major economic and health consequences for the local population. (C) The ctenophore Mnemiopsis leidyi is more tolerant of low oxygen than trophically equivalent fishes in its native habitat in the Chesapeake Bay and can use hypoxic areas from which fish are excluded. (D) A low-oxygen event caused extensive mortality of corals and associated organisms in Bocas del Toro, Panama. These events may be a more important source of mortality in coral reefs than previously assumed. Biological responses Ocean deoxygenation influences life processes from genes to emergent properties of ecosystems (Fig. 4). All obligate aerobic organisms have limits to the severity or duration of oxygen depletion for which they can compensate. Low oxygen levels can reduce survival and growth and alter behavior of individual organisms (3, 4, 26, 64). Reproduction can be impaired by reduced energy allocation to gamete production, as well as interference with gametogenesis, neuroendocrine function, and hormone production, and can ultimately affect populations and fisheries (65–67). Exposure to hypoxia can trigger epigenetic changes expressed in future generations, even if these generations are not exposed to hypoxia (68). Brief, repeated exposure to low oxygen can alter immune responses, increase disease, and reduce growth (69, 70). Fig. 4 Oxygen exerts a strong control over biological and biogeochemical processes in the open ocean and coastal waters. Whether oxygen patterns change over space, as with increasing depth, or over time, as the effects of nutrients and warming become more pronounced, animal diversity, biomass, and productivity decline with decreasing levels of oxygen. At the edge of low-oxygen zones, where nutrients are high and predators and their prey are concentrated into an oxygenated habitat, productivity can be very high, but even brief exposures to low oxygen levels can have strong negative effects. (Top) Well-oxygenated coral reef with abundant fish and invertebrate assemblages. (Middle) Low-oxygen event in Mobile Bay, United States, in which crabs and fish crowd into extreme shallows where oxygen levels are highest. (Bottom) Anoxic mud devoid of macrofauna. In both oceanic and coastal systems, vertical and horizontal distributions of organisms follow oxygen gradients and discontinuities, and migratory behavior is constrained in response to both oxygen availability and the ways that oxygen alters the distributions of predators and prey (64, 71). Because oxygen tolerances and behavioral responses to low oxygen levels vary among species, taxa, trophic groups, and with mobility (19), encounter rates, feeding opportunities, and the structure of marine food webs change.
Movement to avoid low oxygen can result in lost feeding opportunities on low-oxygen–tolerant prey and can increase energy expended in swimming (19, 70). Hypoxia effects on vision, a function that is highly oxygen intensive, may contribute to these constraints, in part through changing light requirements (72). The presence and expansion of low water-column oxygen reduces diel migration depths, compressing vertical habitat and shoaling distributions of fishery species and their prey (73–75). For pelagic species, habitat compression can increase vulnerability to predation as animals are restricted to shallower, better-lit waters and can increase vulnerability to fishing by predictably aggregating individuals at shallower or lateral edges of low-oxygen zones (71, 76–78). For demersal species, hypoxia-induced habitat compression can lead to crowding and increased competition for prey (73), potentially resulting in decreased body condition of important fishery species such as Baltic cod (79). In contrast, migration into and out of hypoxic waters can allow some animals to utilize oxygen-depleted habitats for predator avoidance or to feed on hypoxia-tolerant prey, and then to return to more highly oxygenated depths or locations (23, 80). Habitat compression may also enhance trophic efficiency in upwelling regions, contributing to their extraordinary fish productivity (20, 21). Some hypoxia-tolerant fish and invertebrate species expand their ranges as OMZs expand (28, 81), and their predators and competitors are excluded. Multiple stressors Deoxygenation is mechanistically linked to other ocean stressors, including warming (82) and acidification (83), and thus it is often their combined effects that shape marine ecosystems (84, 85). Because hypoxia limits energy acquisition, it is especially likely to exacerbate effects of co-occurring stressors that increase energy demands (65). The thermal tolerance of ectotherms is limited by their capacity to meet the oxygen demands of aerobic metabolism (62). Increased temperature elevates oxygen demand while simultaneously reducing oxygen supply, thus expanding the area of the oceans and coastal waters where oxygen is insufficient. Through this mechanism, ocean warming is predicted to result in shifts in the distribution of fishes and invertebrates poleward by tens to hundreds of kilometers per decade, shifts into deeper waters, and local extinctions (63, 86). Models project that warming combined with even modest O2 declines (<10 μmol kg−1) can cause declines in important fishery species that are sensitive to low oxygen levels (87). Physiological oxygen limitation in warming waters is also predicted to reduce maximum sizes of many fish species, including some that support important fisheries (88). Increased respiration that causes deoxygenation also amplifies the problem of ocean acidification because the by-product of aerobic respiration is CO2. Temporal and spatial variations in oxygen in subpycnocline and shallow eutrophic waters are accompanied by correlated fluctuations in CO2. In highly productive estuarine, coastal, and upwelling regions, oxygen concentrations and pH can exhibit extreme fluctuations episodically and on diel, tidal, lunar, and seasonal cycles (83, 89). Elevated CO2 can sometimes decrease the oxygen affinity of respiratory proteins (90), reduce tolerance to low oxygen by increasing the metabolic cost of maintaining acid-base balance (91), and reduce responses to low oxygen that would otherwise increase survival (92).
Neither the occurrence nor the magnitude of cases in which acidification exacerbates the effects of low oxygen are currently predictable (83). Other covarying factors, such as nutrients and fisheries dynamics, can mask or compensate for effects of deoxygenation, complicating management decisions. Fisheries management is designed to adjust effort and catch as population abundance changes (93). Thus, direct and indirect effects of deoxygenation on a harvested population may not be easily traceable in monitoring or catch data because management actions adjust for the loss in abundance. In addition, high nutrient loads can stimulate production in a habitat that remains well oxygenated, at least partially offsetting lost production within a hypoxic habitat (52). Total landings of finfish, cephalopods, and large mobile decapods are positively correlated with nitrogen loads (22), in spite of hypoxia in bottom waters (52). The conflation of habitat loss and nutrient enrichment is prominent in upwelling zones, as well as eutrophic coastal waters. Increased upwelling of nutrient-rich, oxygen-depleted waters from the 1820s to the 20th century has increased primary and fish productivity off the coast of Peru, for example (94). However, there are limits to the extent of hypoxia that can form before total system-wide fishery landings decline. In addition, individual species dependent on
Ocean dead zones, which contain no oxygen, have more than quadrupled in volume since 1950, while the number of coastal sites with very low oxygen has increased roughly tenfold, according to the first comprehensive analysis of these areas. Most marine species cannot survive in such conditions, and a continuation of these trends could ultimately lead to mass extinction, endangering the livelihoods of millions of people. Large-scale deoxygenation is driven by climate change from burning fossil fuels, together with nutrient run-off into coastal waters; as waters warm, they hold less oxygen.
The second phase of a development by Anwyl and Redrow Homes, providing 151 homes off Middlewich Road in Sandbach, is set to be discussed by Cheshire East’s Southern planning committee next week. Anwyl and Redrow have made two applications on the site: a reserved matters application for 126 homes, and a full planning application for 25 houses at the site’s southern end. The wider 39-acre site was granted outline planning permission for up to 280 homes, alongside public open space and highways improvements, in 2012, and a reserved matters planning application for the first phase of 154 houses was approved in 2015. The latest reserved matters application includes a mix of 74 four-bedroom homes; 26 three-beds; 21 two-beds; four one-beds; and one five-bed house. The four-bed units are expected to vary in price between £264,000 and £475,000. Recommending the application for approval, Cheshire East planning officers said the proposals would deliver “much needed affordable housing provision” and “would help in the Council’s delivery of five-year housing land supply”. Cheshire East planning officers have stipulated that 30% of the homes – around 38 units – should be provided as affordable homes under the application’s Section 106 agreement. A contribution of £514,000 towards education services was already secured as part of the outline planning permission granted in 2012. The recommendation to approve has been put forward despite objections from Sandbach Town Council, which argued the housing in the second phase was “far too dense” and that it offered “no green space of any significance”. The Town Council also criticised the application for not providing any bungalows “for older residents who wish to downsize”. However, planning officers said the development’s open space was already covered under the outline application, which provides a six-acre park on the site. Cheshire East planners also recommended the full planning application, covering 25 homes, for approval. The homes on designated open countryside land are in addition to the 280 houses approved as part of 2012’s outline planning application. These will provide a mix of 17 four-bed homes; four three-beds; three two-beds; and a single five-bedroom house, with prices for a four-bed house expected to be in a similar range to those in the wider development. Planning officers said the additional homes would “not have a detrimental impact upon residential amenity” in the area, and recommended the scheme for approval, subject to agreeing affordable homes on the site, as well as a £120,000 contribution towards local education provision. The professional team for the development includes Astle Planning & Design.
The second phase of a development by UK housebuilder Redrow Homes and Anwyl in Sandbach, Cheshire is set to be examined by Cheshire East’s Southern planning committee. The latest reserved matters application, covering 126 homes, has been recommended for approval by Cheshire East planning officers, but Sandbach town council has complained about the lack of green space and inadequate provision for downsizers.
British wind farms generated more electricity than coal plants on more than 75% of days this year, an analysis of energy figures has shown. Solar also outperformed coal more than half the time, the data provided by website MyGridGB revealed. Overall, renewables provided more power than coal plants on 315 days in 2017, figures up to 12 December showed. Wind beat coal on 263 days, and solar outperformed the fossil fuel on 180 days. Between April and August inclusive, coal generation exceeded solar on only 10 days. In total, renewables generated more than three times as much electricity as coal over the year to 12 December. The figures – provided by BM Reports and Sheffield University – reflect a year in which a number of green records have been set for the power sector, including the first full day without any coal power in the system, record solar generation and tumbling prices for new offshore wind farms. The government has committed to phasing out coal power that does not have technology to capture and permanently store its carbon emissions by 2025, as part of efforts to meet targets on greenhouse gases. The focus now turns to gas, with daily output from wind outstripping gas on only two days of the year, and renewables overall – including wind, solar, biomass and hydropower – beating the fossil fuel on just 23 days. Dr Andrew Crossland from MyGridGB and the Durham Energy Institute said: “The government has focused on reducing coal use which now supplies less than 7% of our electricity. However, if we continue to use gas at the rate that we do, then Britain will miss carbon targets and be dangerously exposed to supply and price risks in the international gas markets. “Clearly, refreshed government support for low-carbon alternatives is now needed to avoid price and supply shocks for our heat and electricity supplies.” Emma Pinchbeck, executive director at industry body RenewableUK, said the decision to phase out coal was being made possible by a homegrown renewables industry “coming into its own”. She added: “We want to see more boldness from the Conservative government. In 2018, the government should move to allow onshore wind, now the cheapest form of power for consumers, to be developed in parts of the UK where it is wanted, and agree an ambitious sector deal with the offshore wind industry. “The new year could be the first in a golden age for UK renewables.”
The UK had its greenest ever year in electricity production in 2017, breaking 13 different renewable energy records. Figures from BM Reports and Sheffield University showed that renewables produced more electricity than coal power stations on 315 days last year, and April saw the first day with no coal-fired power used in the UK. Coal now supplies less than 7% of the UK's electricity, and the government has a target to phase it out by 2025. The UK has halved carbon emissions in electricity production since 2012, and the increase in renewable power is expected to continue in 2018.
"Coral reefs cover less than 0.1% of the world's oceans and yet they house a third of all marine biodiversity. And the oceans cover 70% of our planet so they're housing a huge amount of the biodiversity of our planet. So, anyone who cares about extinction, about biodiversity, needs to worry about the future of coral reefs."
Tropical coral reefs across the world, on which millions of livelihoods depend and which are home to a third of all marine biodiversity, are under threat from repeated deadly bouts of warmer water, according to new research. The study of 100 reefs reveals that the interval between bleaching events, when unusually warm water causes coral to eject algae with often fatal consequences, has fallen from once in every 25-30 years in the 1980s, to once in every six years. The researchers have called for greater efforts to reduce the emissions of greenhouse gases to combat the warming.
The Indian government's policy think tank, Niti Aayog, is testing the waters for employing blockchain technology in education, health and agriculture, several media reports stated. The top government think tank is developing a proof of concept to take advantage of the new technology in key sectors, a senior government official told The Economic Times on condition of anonymity. The think tank, along with blockchain startup Proffer, which was founded by MIT and Harvard graduates, held a blockchain hackathon from 10 November to 13 November 2017 at IIT Delhi, a report in YourStory said in November last year. About 1,900 students from the IITs, MIT, Harvard University, UC Berkeley College of Engineering, and top engineering institutions around the world participated in the event. AgroChain, a blockchain-based marketplace that helps farmers and consumers through co-operative farming, bagged the first prize at the competition, the report added. The marketplace was developed by students from the Indian Institute of Information Technology and Management-Kerala (IIITM-K). Niti Aayog has also been working on developing a country-wide blockchain network called IndiaChain, which looks to reduce corruption and fraud and maximise the transparency of transactions, a November 2017 report in technology news website Factor Daily stated. The think tank is also expected to connect the blockchain infrastructure to IndiaStack—the country's digital identification database, the report added. Blockchain technology uses cryptographic tools to create an open and decentralised body of data, which can include bank transactions and the like. The data record can be verified by anyone involved in the transaction and information can be tracked via a secure network.
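The tamper-evident record-keeping described above can be illustrated with a minimal hash-chain sketch in Python (an illustration of the general principle only, not IndiaChain's actual design, which has not been published; the record names are invented): each block commits to the hash of the previous block, so altering an earlier entry breaks every later link.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    """Create a block that commits to the previous block's hash."""
    return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

# Build a tiny chain of illustrative records.
chain = []
prev = "0" * 64  # genesis placeholder
for record in ["land record #1", "subsidy payment #7", "exam certificate #42"]:
    block = make_block(record, prev)
    chain.append(block)
    prev = block_hash(block)

# Tampering with an early block breaks the link to every later block.
chain[0]["data"] = "land record #1 (forged)"
print(block_hash(chain[0]) == chain[1]["prev_hash"])  # False: the edit is detectable
```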
India is testing blockchain applications in education, health and agriculture, among other sectors of the economy, and is working on a proof-of-concept platform, according to an anonymous senior government official. Government think tank Niti Aayog co-hosted a blockchain hackathon alongside start-up Proffer in November. Reports the same month revealed the think tank was also developing a fraud-resistant transaction platform called IndiaChain, which is expected to be linked to national digital identification database IndiaStack.
China’s first space lab will crash down to earth in the coming months, but don’t worry: The odds of any debris hitting a person are astronomically small. The Aerospace Corporation, a California nonprofit, estimates the Tiangong-1 will enter the atmosphere in mid-March. Sent up in 2011, the “Heavenly Palace,” as it’s also known, has witnessed a number of milestones as China races to become a space superpower. For example in 2012 the country sent its first female astronaut, Liu Yang, as part of the team behind the first successful manual docking with the lab. The station was designed with a two-year lifespan, but authorities extended its service life by two and a half years to conduct more experiments. In September 2016, China’s Manned Space Engineering Office announced that the Tiangong-1 would re-enter the atmosphere around the latter half of 2017, which many interpreted to mean the lab had fallen into an uncontrolled orbit. (Satellite trackers suggest the lab has been that way since at least June 2016, notes the Aerospace Corporation.) The agency said most of the lab would burn up in the fall, adding that it would release its updated forecasts of the descent, internationally if necessary. Although it’s hard to predict where surviving pieces might land, based on Tiangong-1’s inclination, the Aerospace Corporation estimates the station will re-enter somewhere between the latitudes of 43° N and 43° S, an area largely covered by ocean, but also traversing countries including the US, Brazil, and China itself. According to the latest available trajectory data (link in Chinese), the Tiangong-1 is orbiting at an average height of 287 km (178 miles), about 100 km lower than it was in September 2016. Though it’s not uncommon for spacecraft and satellites to re-enter the atmosphere, rarely does it lead to injury or destruction of property. The largest manmade object to re-enter was Russia’s Mir space station, with surviving fragments falling into the Pacific east of New Zealand in 2001. Whereas the Tiangong-1 weighs 8,500 kg (18,739 lbs), the Mir weighed 120,000 kg. The Mir, though, was still under control when it entered the atmosphere. Letting objects enter uncontrolled is not considered a best practice. One (still remote) danger from the Tiangong-1, according to the Aerospace Corporation, is that someone will find and pick up a piece of its debris that’s covered in a corrosive substance. While the odds of getting hit by space debris are absurdly remote, it happened to at least one woman. In 1997 Lottie Williams was strolling through a park in Tulsa, Oklahoma when a piece of light metal measuring about 6 inches (15.2 cm) glanced off her shoulder. NASA later confirmed the timing and location were consistent with the re-entry and breakup of a second-stage Delta rocket, the main wreckage of which was found a few hundred miles away in Texas. Williams wasn’t injured, but she’s thought to be the only person ever hit by space debris. You won’t likely become the second. Correction: An earlier version of this story stated the Mir space station re-entered the atmosphere in 2013 instead of 2001.
China's space lab Tiangong-1 is estimated to crash to Earth in March, with the Aerospace Corporation predicting it will re-enter between the latitudes of 43°N and 43°S. Although the area is largely covered by ocean, it also takes in parts of the US, Brazil and China. China said it will release updated forecasts of the craft's descent; the expectation is that some debris will hit Earth, while the rest will burn up on re-entry. As we previously noted, many believe China's space authority lost control of the 8.5-tonne lab and is allowing it to enter Earth's atmosphere "naturally".
In Nagoya, Japan, a city that once held an entire museum dedicated to robotics, a hospital will soon add robots developed by Toyota to its medical staff. No, they won't be scrubbing in for surgery: In February, the Nagoya University Hospital will deploy four bots to ferry medicine and test samples between floors for a year. The robots are essentially mobile refrigerators with a 90-liter capacity that rely on radar and cameras to zoom through the hospital. Should they run into humans, they're programmed to dodge them or politely voice 'Excuse me, please let me pass,' according to The Asahi Shimbun. Staff can summon the robots and assign a destination for their medical payload using a tablet. Nagoya built the robot system in partnership with Toyota Industries, a subsidiary of the automaker that produces auto parts and electronics. The trial will run the robots between 5pm and 8am during the night shift, when fewer people are walking the floors. Should the trial go well, the facility may choose to deploy more units.
A hospital in Japan is deploying robots to deliver supplies around the building. The Nagoya University Hospital is to use four of the devices to run during the night shift from 5pm to 8am, when there are fewer people using the corridors. The robots have been developed in partnership with Toyota Industries, and use radar and cameras to guide themselves around the hospital. They are essentially mobile refrigerators that carry medical supplies, and staff can use a tablet device to summon them and set their destination.
First of all, my role is the Global Head of Product at Datscha (I was one of the founding employees), working with everything from strategy and business development down to what bugs [we should focus on] in the upcoming sprint. Datscha has been around for 20-plus years, but we still keep a very outspoken startup mentality and an effective organization, enabling us to operate in three markets (Sweden, Finland and the UK) with only 45 employees. In addition to my role at Datscha, I’m also a Partner in Stronghold Invest (the sole owner of Datscha), where we own, among others, the largest property consulting firm in the Nordics (Newsec with 2000 employees and 31 million sqm under management) and the most successful private equity real estate firm in Northern Europe with real estate assets under management of approximately €3.5 billion. Furthermore, Stronghold is an active #PropTech investor.
The commercial property market will become more reliant on data and more transparent, according to the global head of product at Swedish company Datscha, Magnus Svantegård, who was one of the founding employees of the commercial property company over 20 years ago. He said the current handling of properties as investments was based too much on gut feeling and small networks of people. Datscha has 45 employees in Sweden, Finland and the UK, but retains a "start-up mentality", with a focus on property technology, according to Svantegård.
A worker pulls carts full of customer orders along the floor inside the million-square-foot Amazon distribution warehouse that opened last fall in Fall River, MA. Land fit for future fulfillment centers for the likes of Amazon and Walmart saw huge spikes in prices last year, according to real estate services firm CBRE. In a trend largely stemming from the growth of e-commerce players across the U.S., some plots of land now cost twice the amount they did a year ago, the group found. This is especially true in major markets, including Atlanta and Houston. In surveying 10 U.S. markets, CBRE found the average price for "large industrial parcels" (50 to 100 acres) now sits at more than $100,000 per acre, up from about $50,000 a year ago. Industrial land plots of five to 10 acres, which typically house infill distribution centers for completing "last-mile" deliveries, saw their prices soar to more than $250,000 per acre by the end of 2017, up from roughly $200,000 a year ago, according to CBRE. Located in more bustling metropolitan settings, these warehouses must help retailers serve consumers closer to their homes. To be sure, industry experts say that despite an uptick in construction of late, there's still a long way to go before supply aligns with demand.
The average price for large industrial plots of land of between 50 and 100 acres doubled last year from $50,000 to $100,000 per acre, thanks to increased demand for data hubs and distribution centres, according to a survey by CBRE. In an examination of 10 US markets, plots of between five and 10 acres, suitable for "last-mile" depots, cost $250,000 per acre by the end of last year, an increase of $50,000 on 2016. David Egan, the global head of CBRE's Industrial & Logistics Research division, said that demand is not likely to drop in the near future.
London's skyline is rapidly changing, as a visualisation from the City of London Corp. forecasting the City skyline in 2026 shows. In the shorter term, 2018 is set to be a pivotal year for the London office market. Supply of new space will begin to drop after 2017's cyclical peak, according to Deloitte Real Estate, and 44% of the space set to be completed this year is already leased. But with Brexit negotiations at their most delicate, demand will be at its most skittish. Here are the five biggest office schemes opening in 2018 and who is occupying them, according to Deloitte. In terms of lettings, some are doing significantly better than others. 70 Farringdon St. — Goldman's new HQ Goldman Sachs broke ground on its 825K SF London headquarters before the U.K. voted to leave the EU. As Brexit has been negotiated, there has been speculation Goldman would move staff to Frankfurt and thus not occupy the entire building — especially after Chief Executive Lloyd Blankfein started trolling the U.K. government on Twitter. There is no sign of it looking to sublease any space yet ahead of the building completing in the third quarter. The International Quarter — the new home of the Financial Conduct Authority The Financial Conduct Authority pre-let 425K SF of the 515K SF building at the International Quarter in Stratford. Lendlease and LCR are delivering the scheme in the middle of 2018. The FCA's pre-let in building S5 convinced Deutsche Asset Management to pay £370M for a building in what is still an emerging office location. 10 Fenchurch Ave. — a new HQ for M&G Investments M&G, the investment division of insurer Prudential, in 2014 signed for 11 of the 13 floors at the 398K SF 10 Fenchurch Ave., which is being developed by Greycoat and CORE on behalf of Italian insurance company Generali. The Scalpel — two-thirds still to be leased The Scalpel at 52 Lime St. in the City is being built by the development arm of U.S. insurance company WR Berkley. The company will occupy 81K SF of the 387K SF building, and financial and insurance firms Axis and BPL have taken space, but 61% of the building is unlet. The 35-storey building is scheduled to complete in the second quarter.
Five major London office projects are due to open this year. Goldman Sachs's 825,000 sq ft London headquarters at 70 Farringdon Street was the subject of subletting speculation after rumours circulated that the firm would move staff to Frankfurt post-Brexit, while the International Quarter in Stratford represents a location gamble for Deutsche Asset Management. M&G Investments has leased 11 of the 13 floors at 10 Fenchurch Avenue. However, The Scalpel at 52 Lime Street is barely more than one-third leased and 70 St Mary Axe, also known as The Can of Ham, has yet to secure a tenant.
Xi'an is the latest Chinese city to accept Alipay on its subway system, according to local reports. The system started to accept Alipay as of Jan. 1. As part of the initiative, riders can participate in a program meant to encourage more "green" ways of travel such as public transit. Once an Alipay user accumulates a certain amount of green "energy" from using the mobile wallet to pay for subway fares, Alipay's partners, such as the Alxa SEE foundation, will plant a real tree in areas suffering from desertification upon the user's request. Alipay is now accepted on public transport in more than 30 Chinese cities, including Hangzhou, Wuhan, Tianjin, Qingdao and Guangzhou. Zhengzhou became the first Chinese city to adopt mobile payments in its subway system in September, followed by Beijing and Shanghai.
Chinese payment system Alipay is now accepted on public transport across more than 30 cities, after Xi'an's subway became the latest addition this month. The scheme also includes an initiative to plant trees in regions suffering desertification, with Alipay users accumulating 'green' points every time they pay for subway travel using the wallet. Zhengzhou was the first Chinese city to allow mobile payments on its subway system last year, followed by Beijing and Shanghai.
The leading Chinese messaging app said it doesn’t store users’ chat history, after a top businessman said WeChat was ‘watching’ users WeChat, China’s most popular messaging application, has denied “storing” users’ messages, following accusations by one of the country’s top businessmen that the Tencent Holdings-owned firm was spying on its users. “WeChat does not store any users’ chat history. That is only stored in users’ mobiles, computers and other terminals,” WeChat said in a post on the platform. The statement comes after Li Shufu, chairman of Geely Holdings, which owns the worldwide Volvo and Lotus car brands, was quoted by local media on Monday as saying Tencent chairman Ma Huateng “must be watching all our WeChats every day”. Geely Holdings is one of China’s largest car manufacturers, and one of the few major companies without ties to the country’s government. It has owned Volvo since 2010, British taxi maker The London Electric Vehicle Company since 2013 and took a majority stake in British sports car maker Lotus Cars last year. ‘Misunderstanding’ In its carefully worded response, WeChat said Li’s remarks were the result of a “misunderstanding”. “WeChat will not use any content from user chats for big data analysis,” the firm said in its post. “Because of WeChat’s technical model that does not store or analyse user chats, the rumour that ‘we are watching your WeChat everyday’ is pure misunderstanding.” WeChat, like all social media firms operating in China, is legally required to censor public posts the country’s Communist Party designates as illegal, and its privacy policy says it may need to retain and disclose users’ information in response to government or law enforcement requests. In a 2016 report, Amnesty International ranked Tencent zero out of 100 on various privacy criteria, noting it was the only company on the list that “has not stated publicly that it will not grant government requests to access encrypted messages by building a ‘backdoor’”. Cyber laws Tencent is the only Chinese company on Amnesty’s list, which also includes Japan’s Viber and Line and South Korea’s Kakao, as well as services from companies such as Facebook, Apple, Telegram and Google. Last September China’s internet regulator announced a new rule making chat group administrators and companies accountable for breaches of content laws. The regulator also fined firms including Tencent, Baidu and Weibo for censorship lapses and demanded they improve content auditing measures. Last June China brought into force the restrictive Cyber Security Law (CSL), which mandates certain companies to hold data within the country and to undergo on-site security reviews.
WeChat has said a claim that the company was storing users' chat history was a "misunderstanding". In a blog post, WeChat said it "will not use any content from user chats for big data analysis". The comments followed media quotes from Li Shufu, chairman of Geely Holdings, who said Tencent chairman Ma Huateng "must be watching all our WeChats every day". Tencent scored 0 out of 100 for various privacy issues in a 2016 report by Amnesty International.
From a tech perspective, Bitcoin seems to be just getting started: 2018 promises to be the year that a number of highly anticipated projects are either launched or adopted. In many ways, 2017 was Bitcoin’s best year yet. Most obviously, increased adoption made the pioneering cryptocurrency’s exchange rate skyrocket from under $1000 to well over 10 times that value. But from a tech perspective, things seem to be just getting started: 2018 promises to be the year that a number of highly anticipated projects are either launched or adopted. Here’s a brief overview of some of the most promising upcoming technological developments to keep an eye on in the new year. Cheaper Transactions with Segregated Witness and a New Address Format Segregated Witness (SegWit) was one of Bitcoin’s biggest protocol upgrades to date — if not the biggest. Activated in August 2017, it fixed the long-standing malleability bug, in turn better enabling second-layer protocols. Additionally, SegWit replaced Bitcoin’s block size limit with a block weight limit, allowing for increased transaction throughput on the network, thereby lowering fees per transaction. However, adoption of the upgrade has been off to a relatively slow start. While some wallets and services are utilizing the added block space offered by SegWit, many others are not yet doing so. This means that, while Bitcoin is technically capable of supporting between two and four megabytes worth of transactions per ten minutes, it barely exceeds 1.1 megabytes. This is set to change in 2018. For one, the Bitcoin Core wallet interface will allow users to accept and send SegWit transactions. Bitcoin Core 0.16, scheduled for May 2018 (though this may be moved forward), will most likely realize this through a new address format known as “bech32,” which also has some technical advantages that limit risks and mistakes (for example, those caused by typos). “To spend coins from the P2SH format currently used for SegWit, users need to reveal a redeem script in the transaction,” Bitcoin Core and Blockstream developer Dr. Pieter Wuille, who also co-designed the bech32 address format, told Bitcoin Magazine. “With native SegWit outputs this is no longer necessary, which means transactions take up less data. Recipients of SegWit transactions will be able to spend these coins at a lower cost.” Perhaps even more importantly, several major Bitcoin services — like Coinbase — plan to upgrade to SegWit in 2018 as well. Since such services account for a large chunk of all transactions on the Bitcoin network, this could significantly decrease network congestion, thereby decreasing average transaction fees and confirmation times, even for those who do not use these services. The Lightning Network Rolling Out on Bitcoin’s Mainnet While further SegWit adoption should provide immediate relief of fee pressure and confirmation times, truly meaningful long-term scalability will likely be achieved with second-layer solutions built on top of Bitcoin’s blockchain. One of the most highly anticipated solutions in this regard — especially for lower value transactions — is the lightning network. This overlay network, first proposed by Joseph Poon and Tadge Dryja in 2015, promises to enable near-free transactions and instant confirmations, all while leveraging Bitcoin’s security. The solution has been under active development for about two years now, with major efforts by ACINQ, Blockstream and Lightning Labs.
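Before moving on to Lightning in more detail, the block weight accounting described in the SegWit section above can be made concrete with a small Python sketch. The 4x discount for non-witness data follows the SegWit specification (BIP 141); the byte counts and fee rate below are made-up figures for illustration, not taken from a real transaction.

```python
from math import ceil

def tx_weight(base_size: int, witness_size: int) -> int:
    """BIP 141 weight: non-witness bytes count 4x, witness bytes count 1x.

    The consensus limit is 4,000,000 weight units per block, replacing the
    old 1,000,000-byte block size limit.
    """
    return 4 * base_size + witness_size

def virtual_size(base_size: int, witness_size: int) -> int:
    """Virtual size in vbytes; fees are typically quoted per vbyte."""
    return ceil(tx_weight(base_size, witness_size) / 4)

# Illustrative figures for a simple one-input, two-output payment.
legacy_vsize = virtual_size(base_size=226, witness_size=0)    # all data is non-witness
segwit_vsize = virtual_size(base_size=140, witness_size=110)  # signatures moved to witness

fee_rate = 50  # satoshis per vbyte, an assumed figure for the example
print(legacy_vsize, segwit_vsize)                        # 226 vs 168 vbytes
print(legacy_vsize * fee_rate, segwit_vsize * fee_rate)  # fee comparison at that rate
```

Because witness bytes are discounted, a transaction that moves its signature data into the witness pays less at the same fee rate, which is why wider SegWit adoption is expected to ease fee pressure across the network.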
Progress on the scaling layer has been significant all throughout 2017, with early releases of different but compatible software implementations, usable wallet interfaces and test transactions happening both on Bitcoin’s testnet and even on Bitcoin’s mainnet on a regular basis now. “I'd say we have solved the main technical problems and have a relatively good idea on how to improve on the current system,” Christian Decker, lightning developer at Blockstream, told Bitcoin Magazine. “One last hurdle that's worth mentioning is the network topology: We'd like to steer the network formation to be as decentralized as possible.” Given the current state of development, adoption of the lightning network should only increase throughout 2018 — not just among developers, but increasingly among end users as well. “Integration and testing will be the next major step forward,” Lightning Labs CEO Elizabeth Stark agreed, noting: “Some exchanges and wallets are already working on it.” Increased Privacy Through TumbleBit and ZeroLink While it is sometimes misrepresented as such, Bitcoin is not really private right now. All transactions are included in the public blockchain for anyone to see, and transaction data analysis can reveal a lot about who owns what, who transacts with whom and more. While there are solutions available to increase privacy right now — like straightforward bitcoin mixers — these usually have significant drawbacks: They often require trusted parties or have privacy leaks. This situation could be improved significantly in 2018. Two of the most promising projects in this domain — TumbleBit and ZeroLink — are both getting close to mainnet deployment. TumbleBit was first proposed in 2016 by a group of researchers led by Ethan Heilman. It is essentially a coin-mixing protocol that uses a tumbler to create payment channels from all participants to all participants in a single mixing session. Everyone effectively receives different bitcoins than what they started with, breaking the trail of ownership for all. And importantly, TumbleBit utilizes clever cryptographic tricks to ensure that the tumbler can’t establish a link between users either. An initial implementation of the TumbleBit protocol was coded by NBitcoin developer Nicolas Dorier in early 2017. His work was picked up by Ádám Ficsór as well as other developers, and blockchain platform Stratis announced it would implement the technology in its upcoming Breeze wallet, which also supports Bitcoin, by March 2018. Recently, in mid-December 2017, Stratis released TumbleBit integration in this wallet in beta. The other promising solution, ZeroLink, is an older concept: it was first proposed (not under the same name) by Bitcoin Core contributor and Blockstream CTO Gregory Maxwell, back in 2013. Not unlike TumbleBit, ZeroLink utilizes a central server to connect all users but without being able to link their transactions. As opposed to TumbleBit, however, it creates a single (CoinJoin) transaction between all participants, which makes the solution significantly cheaper. This idea seemed to have been forgotten for some years until Ficsór (indeed, the same Ficsór that worked on TumbleBit) rediscovered it earlier this year. He switched his efforts from TumbleBit to a new ZeroLink project and has since finished an initial ZeroLink implementation.
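The CoinJoin idea at the heart of ZeroLink can be sketched very roughly as follows (a toy model only: it omits the blind-signature registration, fees, signing and networking that the real protocol relies on, and all names are invented). Several participants contribute inputs and equal-value outputs to a single joint transaction, so an outside observer cannot tell which input paid which output.

```python
import random
from dataclasses import dataclass, field

@dataclass
class CoinJoinRound:
    """Toy CoinJoin round: pool equal-denomination outputs into one transaction."""
    denomination: int = 100_000_000  # 1 BTC in satoshis
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)

    def register(self, input_utxo: str, fresh_address: str) -> None:
        # In ZeroLink the output is registered via a blind signature so the
        # coordinator cannot link it to the input; here we simply collect both.
        self.inputs.append(input_utxo)
        self.outputs.append((fresh_address, self.denomination))

    def build_transaction(self) -> dict:
        # Shuffling removes positional correlation between inputs and outputs.
        random.shuffle(self.inputs)
        random.shuffle(self.outputs)
        return {"inputs": self.inputs, "outputs": self.outputs}

round_ = CoinJoinRound()
round_.register("alice_utxo:0", "addr_a9f3")
round_.register("bob_utxo:1", "addr_77c2")
round_.register("carol_utxo:0", "addr_d410")
print(round_.build_transaction())  # one joint tx; equal outputs hide the mapping
```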
Ficsór recently ran some tests with his ZeroLink implementation, and while results showed that his implementation needs improvement, Ficsór considers it likely that it will be properly usable within months. “I could throw it out in the open right now and let people mix,” he told Bitcoin Magazine. “There is no risk of money loss at any point during the mix, and many mixing rounds were executing correctly. It is just some users would encounter some bugs I am not comfortable with fixing on the fly.” More Sidechains, More Adoption Sidechains are alternative blockchains but with coins pegged one-to-one to specific bitcoins. This allows users to effectively “move” bitcoins to chains that operate under entirely different rules and means that Bitcoin and all its sidechains only use the “original” 21 million coins embedded in the Bitcoin protocol. A sidechain could then, for example, allow for faster confirmations, or more privacy, or extended smart contract capabilities, or just about anything else that altcoins are used for today. The concept was first proposed by Blockstream CEO Dr. Adam Back and others back in 2014; it formed the basis around which Blockstream was first founded. Blockstream itself also launched the Liquid sidechain, which allows for instant transactions between — in particular — Bitcoin exchanges. Liquid is currently still in beta but could see its 1.0 release in 2018. Another highly anticipated sidechain that has been in development for some time is RSK. RSK is set to enable support of Turing-complete smart contracts, hence bringing the flexibility of Ethereum to Bitcoin. RSK is currently in closed beta, with RSK Labs cofounder Sergio Demian Lerner suggesting a public release could follow soon. Further, Bloq scientist Paul Sztorc recently finished a rough implementation of his drivechain project. While both Liquid and RSK for now apply a “federated” model, where the sidechain is secured by a group of semi-trusted “gatekeepers,” drivechains would be secured by bitcoin miners. If drivechains are deployed in 2018, the first iteration of such a sidechain could well be “Bitcoin Extended”: essentially a “big block” version of Bitcoin to allow for more transaction throughput. That said, reception of the proposal on the Bitcoin development mailing list and within Bitcoin’s development community has been mixed so far. Since drivechains do need a soft-fork protocol upgrade, the contention does make the future of drivechains a bit more uncertain. “Miners could activate drivechains tomorrow, but they often outsource their understanding of ‘what software is good’,” Sztorc told Bitcoin Magazine. “So they'll either have to decide for themselves that it is good, or it would have to make it into a Bitcoin release.” A Schnorr Signatures Proposal Schnorr signatures, named after their inventor Claus-Peter Schnorr, are considered by many cryptographers to be the best type of cryptographic signature in the field. They offer a strong level of correctness, do not suffer from malleability, are relatively fast to verify and enable useful features, thanks to their mathematical properties. Now, with the activation of Segregated Witness, it could be relatively easy to implement Schnorr signatures on the Bitcoin protocol. Perhaps the biggest advantage of the Schnorr signature algorithm is that multiple signatures can be aggregated into a single signature. In the context of Bitcoin, this means that one signature can prove ownership of multiple Bitcoin addresses (really, “inputs”).
Since many transactions send coins from multiple inputs, having to include only one signature per transaction should significantly benefit Bitcoin’s scalability. Analysis based on historical transactions suggests it would save an average of 25 percent per transaction, which would increase Bitcoin’s maximum transaction capacity by about 33 percent. Beyond that, Schnorr signatures could enable even more. For example, with Schnorr, it should also be possible to aggregate different signatures from a multi-signature transaction, which requires multiple signatures to spend the same input. This could, in turn, make CoinJoin a cheaper alternative to regular transactions for participants, thereby incentivizing more private use of Bitcoin. Eventually the mathematical properties of Schnorr signatures could even enable more advanced applications, such as smart contracts utilizing “Scriptless Scripts.” Speaking to Bitcoin Magazine, Wuille confirmed that there will probably be a concrete Bitcoin Improvement Proposal for Schnorr signatures in 2018. “We might, as a first step, propose an upgrade to support Schnorr signatures without aggregation,” he said. “This would be a bit more straightforward to implement and already offers benefits. Then a proposal to add aggregation would follow later.” Whether Schnorr signatures will already be adopted and used on Bitcoin’s mainnet is harder to predict. It will require a soft fork protocol upgrade, and much depends on the peer review and testing process.
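The linearity that makes this aggregation possible can be seen in a toy Schnorr implementation (a sketch for intuition only: it works in a multiplicative group modulo a prime rather than Bitcoin's secp256k1 curve, and the naive two-key aggregation shown at the end is not safe against rogue-key attacks, which real proposals such as MuSig are designed to address).

```python
import hashlib
import secrets

# Toy Schnorr over the multiplicative group mod a prime; illustration only.
P = 2**255 - 19          # a large prime modulus, chosen arbitrarily for the demo
G = 5                    # base element; fine for demonstrating the algebra

def H(*parts) -> int:
    """Hash arbitrary values into an integer challenge."""
    data = b"|".join(str(p).encode() for p in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

def keygen():
    x = secrets.randbelow(2**128)        # private key
    return x, pow(G, x, P)               # (private, public = G^x mod P)

def sign(x: int, X: int, msg: str):
    k = secrets.randbelow(2**128)        # per-signature nonce
    R = pow(G, k, P)
    e = H(R, X, msg)
    s = k + e * x                        # linear in the nonce and the key
    return R, s

def verify(X: int, msg: str, R: int, s: int) -> bool:
    e = H(R, X, msg)
    return pow(G, s, P) == (R * pow(X, e, P)) % P

# Single signature.
x1, X1 = keygen()
R1, s1 = sign(x1, X1, "pay 1 BTC to Bob")
print(verify(X1, "pay 1 BTC to Bob", R1, s1))   # True

# Naive 2-of-2 aggregation on one message: because s is linear, the sum of the
# partial signatures verifies against the product of the nonces and keys.
x2, X2 = keygen()
X_agg = (X1 * X2) % P
k1, k2 = secrets.randbelow(2**128), secrets.randbelow(2**128)
R_agg = (pow(G, k1, P) * pow(G, k2, P)) % P
e = H(R_agg, X_agg, "spend both inputs")
s_agg = (k1 + e * x1) + (k2 + e * x2)
print(pow(G, s_agg, P) == (R_agg * pow(X_agg, e, P)) % P)   # True
```

The key observation is that s = k + e*x is linear, so sums of signature shares verify against products of public nonces and public keys, which is what lets one compact signature cover many inputs.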
The technology underpinning bitcoin is set for major changes in 2018, with several projects scheduled. Among them is the wider adoption of the segwit upgrade, originally activated in August, helped by the new bech32 address format expected in the Bitcoin Core 0.16 release scheduled for May. Also expected is the launch of bitcoin's Lightning Network, offering secure, instant confirmations and near-free transfers, while privacy solutions ZeroLink and TumbleBit are set to make the network more private. Activation of segwit will also make it easier to improve cryptographic signatures by facilitating the use of Schnorr signatures, which would reduce transaction costs and increase bitcoin's maximum capacity.
Publishers have a lot to gripe about when it comes to Facebook, from the platform choking off their referral traffic, dominating digital advertising and giving them whiplash with its constantly changing video strategy. But what if it got even worse? In 2018, Facebook could take a step further and separate news from the news feed. It’s not a crazy idea. The platform tested a newsless news feed, called the Explore Feed, in six countries outside the U.S., causing a major publisher freakout. (Facebook said it didn’t expect to roll out the test further.) In the past year, Facebook also launched Watch, a TV-like video tab; and prioritized Facebook Groups, communities for people who share interests or characteristics — also underscoring the idea of separating user interaction from other media content. Other platforms have made moves to separate users’ messages from media and brands’ content. Snapchat redesigned its app to separate users’ feeds from brands’ content. Instagram is testing a private messaging app, which would take peer-to-peer chat out of the main app. Twitter has its Moments tab, a dedicated home for news and entertainment stories. Fundamental to the success of platforms like Twitter and Facebook is keeping users happy, and as such, they’re always running experiments to see if changes will get people to return more often and stay longer. Given a lot of news is negative or controversial, a feed with no news (unless it’s shared by a user) could be less contentious and more enjoyable for users. And another group that likes less controversy, of course, is another important Facebook constituency: advertisers. “Sometimes people get really annoyed and confused when they’re reading about their cousin’s bar mitzvah or whatever and they see a very serious story afterward,” said Andrew Montalenti, CTO and co-founder of web analytics firm Parsely. “All of the platforms, what they’re really concerned about with fake news is that I think you kind of draw on a bank account of trust with the user. If you come across that stuff too much, you declare it to be a problem, and you stop using it. So they have to play this delicate balance — ‘We can’t show you too many ads or show you too much spammy content.’” Another factor is the fake-news imbroglio that blew up in Facebook’s face in the past year, leading lawmakers to threaten regulation. Facebook responded by trying to police fake news, which has proved to be a challenge. Further de-emphasizing news or taking it out of the feed altogether is one way to deal with the problem. As to the Explore test, Facebook said: “There is no current plan to roll this out beyond these test countries or to charge pages on Facebook to pay for all their distribution in News Feed or Explore.” That was cold comfort to those publishers who depend on the news feed to reach audiences, though. As much as Facebook has declined in reach, it’s still a significant source of traffic for many publishers, which have already seen their direct traffic from Facebook decline in recent months, if not years, as Facebook has prioritized users’ posts and video content in the news feed. Some publishers whose audience strategy is closely tied to Facebook and follow the company closely are starting to consider the possibility of a newsless news feed. An executive at a traditional publishing company said this is “definitely on our minds” given the company gets a “ton of traffic from Facebook,” and it’s a risk the company has to think about in the next few years. 
“It would be a seismic shift,” said another publishing exec. “There’s good reason to be concerned if publishers’ content becomes separated out of the main news feed,” said Vivian Schiller, a former Twitter news executive. “Their criteria [for the Explore test] was about user experience. That’s their business. But it’s hard to imagine this not having a deleterious effect on publishers.” There are other reasons for Facebook to go in this direction. Facebook could make an exception for publishers and other commercial content providers that pay to be in the news feed, which could mean more revenue for Facebook. Separating news from the feed also could give Facebook a way to test a potential new product, similar to how it took Messenger out of the site and made it its own app, Schiller said. Of course, none of this is a fait accompli. There’s good reason to think Facebook will keep news in the feed. Scrolling through the news feed is the core daily habit for most Facebook users. It’s what Facebook uses to promote its many other products, like the Watch video tab and Marketplace. It’s hard to get people to toggle from the news feed to other places on Facebook. That said, even if a newsless news feed doesn’t materialize, publishers have to adapt. Facebook and Google are here to stay, and Facebook has proven time and time again that it’s not always going to act in publishers’ interests. Publishers have to take matters into their own hands, and take advantage of other audience and revenue opportunities.
Online publishers are considering strategies for a future in which Facebook removes news stories from its main feed, known as the news feed. The social network last year experimented with a separate newsfeed called Explore in six countries outside the US, but said it didn't plan to roll this out further. However, a number of other online platforms including Twitter and Snapchat have separated news stories from user-generated content to some degree, leading to speculation Facebook will follow suit. Publishers have already seen their direct traffic from Facebook decline in recent months as user content and videos have taken precedence.
HANGZHOU, Dec. 23 (Xinhua) -- A team of researchers from Zhejiang University has developed a new type of aluminum-graphene battery that can be charged in seconds, instead of hours. The team, led by professor Gao Chao, from the Department of Polymer Science and Engineering of Zhejiang University, designed a battery using metallic aluminum as the anode and graphene films as the cathode. The battery could work well after a quarter of a million cycles and can be fully charged in seconds. Experiments show that the battery retains 91 percent of its original capacity after 250,000 recharges, surpassing all previous batteries in terms of cycle life. In quick charge mode, the battery can be fully charged in 1.1 seconds, according to Gao. The finding was detailed in a paper recently published in Science Advances. The assembled battery also works well in a temperature range of minus 40 to 120 degrees Celsius. It can be folded, and does not explode when exposed to fire. However, the aluminum-ion battery cannot compete with commonly-used Li-ion batteries in terms of energy density, or the amount of energy a battery can store relative to its size, according to Gao. "It is still costly to make such battery. Commercial production of the battery can only be possible until we can find cheaper electrolyte," Gao said.
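Assuming the capacity fade is spread evenly across cycles (an illustrative simplification; the paper does not state the fade model), the reported figures imply a strikingly small loss per charge-discharge cycle:

```python
# Implied average per-cycle retention if 91% of capacity remains after 250,000
# cycles, assuming a uniform geometric fade.
cycles = 250_000
final_retention = 0.91
per_cycle = final_retention ** (1 / cycles)
print(per_cycle)              # ~0.9999996, i.e. roughly 0.00004% of capacity lost per cycle
print((1 - per_cycle) * 1e6)  # loss per cycle in parts per million (~0.38 ppm)
```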
Scientists from Zhejiang University have developed an aluminium-graphene battery that can be charged in seconds, rather than hours. The battery is able to maintain 91% of its original capacity after 250,000 charges and could be fully charged in 1.1 seconds. The researchers also claimed the battery can work in temperatures ranging from minus 40C to 120C. However, they added that the battery does not have the energy density of lithium-ion batteries and is at present too expensive for commercial production.
The automation of repetitive tasks in construction and manufacturing has been around for some time. Last month, Panasonic introduced an agricultural robot at Tokyo’s International Robot Exhibition that could have implications for workers in the fruit-picking business. Harvesting tomatoes is more complicated than you might think. Each fruit has to be plucked from the vine once it is ripe enough, not before. It’s also a delicate operation: tomatoes bruise easily and a single scratch in one can lead to a whole box going bad, fast. Panasonic robot harvests tomatoes To handle the perception and dexterity-related challenges that come with fruit picking, Panasonic’s new robot relies on a combination of camera, range image sensor and artificial intelligence technologies. First, it recognizes which tomatoes are ready to be picked. Then, it performs a precise cut-and-catch technique to move each tomato from vine to bucket. The robot can be mounted on a rail, enabling it to slide along one vine from start to finish. In terms of speed, Panasonic expects the robot to perform at least as well as a human, harvesting at an average pace of 10 tomatoes per minute. However, as the robot doesn’t need breaks, pay rises or sick days, it’s easy to see where the attraction might lie in terms of wider efficiency gains. Panasonic has so far only demonstrated its harvesting robot and no announcement has yet been made regarding its readiness for market or cost. With great dexterity comes great responsibility The rise of computer vision and faster, more agile robots has made complex tasks accessible to automation. Tomato picking is just one example. Last month, Ocado released footage of a new bagging robot, capable of picking products and carefully placing them into shopping bags based on the shape and weight of each item. This level of processing and dexterity could pave the way for applications that go far beyond monotonous tasks in agriculture and retail.
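Panasonic has not published how its vision system decides ripeness. As a very rough illustration of the "recognise ripe fruit, then pick" pipeline described above, a colour-based heuristic over an RGB image patch might look like the following toy sketch (the threshold and test patches are invented; a production system would rely on trained models plus the range sensor's depth data rather than a fixed colour rule):

```python
import numpy as np

def ripeness_score(rgb_patch: np.ndarray) -> float:
    """Toy ripeness heuristic: how strongly red dominates green in a fruit patch.

    rgb_patch is an (H, W, 3) array of floats in [0, 1].
    """
    r = rgb_patch[..., 0].mean()
    g = rgb_patch[..., 1].mean()
    return float((r - g) / (r + g + 1e-6))   # close to +1 for red, -1 for green

def should_pick(rgb_patch: np.ndarray, threshold: float = 0.25) -> bool:
    """Decide whether the fruit in this patch is ripe enough to cut."""
    return ripeness_score(rgb_patch) >= threshold

# Simulated patches: a ripe (red-dominant) and an unripe (green-dominant) tomato.
ripe = np.dstack([np.full((8, 8), 0.8), np.full((8, 8), 0.2), np.full((8, 8), 0.1)])
unripe = np.dstack([np.full((8, 8), 0.3), np.full((8, 8), 0.7), np.full((8, 8), 0.2)])
print(should_pick(ripe), should_pick(unripe))   # True False
```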
A robot that can autonomously harvest tomatoes has been developed by Japanese electronics firm Panasonic. The device uses a camera, range-image sensors and artificial intelligence (AI) systems to detect which fruit are ripe for picking and cut them from the plant. Mounted on a rail alongside the vine, the company said the robot can perform as well as a human picker, averaging a pace of 10 tomatoes per minute. The device has been publicly demonstrated but Panasonic has made no announcement about its likely cost or plans for its production.
A British campaign group set up by young venture capitalists to boost diverse representation in VC is expanding to the US. Diversity VC will be backed in the US by partners and associates at Female Founders Fund, Entrepreneur First, and financially by law firm Cooley. 2017 was a shocking year for US venture capital, with high-profile figures such as Uber investor Shervin Pishevar, 500startups founder Dave McClure, Binary Capital cofounder Justin Caldbeck and many others accused of sexual harassment. A British initiative set up to tackle venture capital's monoculture will launch in the US, where the industry is currently reeling from multiple sexual harassment scandals. Diversity VC was set up by five young venture capitalists in March to highlight the fact that most British venture capital investors are middle-class white men, with a knock-on effect on which startups get funding. It released a landmark report in May showing that almost half of British VC firms had no women on their investment teams. Diversity VC founder and investor Check Warner told Business Insider that the group had decided to expand to the US after multiple requests from venture capital associates. "What's been lacking in conversations in the US is that basis in fact, [research into] why we have such a big problem," Warner said. "There wasn't anything comparable to what we had done. "People got in touch to say 'It would be great to have something like this, a system and process to start uncovering this.'" Warner and her cofounders set up Diversity VC only a few months before a wave of harassment scandals hit US venture firms. Six women accused Binary Capital cofounder Justin Caldbeck of inappropriate behaviour, resulting in his leave of absence from the firm. Subsequent scandals engulfed Uber investor Shervin Pishevar, 500 Startups founder Dave McClure, and Tesla investor Steve Jurvetson. "It's been interesting to see these things come out," said Warner. "It's reflective of the culture not including people, and having this sense of entitlement. You don't have checks and balances on behaviour. "The lack of female diversity around the table is a contributing factor to people getting away with what they got away with." Like its UK counterpart, the US chapter of Diversity VC will promote greater diversity in venture capital through four initiatives, including helping underrepresented groups build a VC network, getting minority interns into VC firms, and publishing data about diverse representation. The group has backing from Female Founders Fund partner Sutian Dong, Insight Venture Partners associate Juliet Bailin, and Entrepreneur First's US head of funding Matt Wichrowski. Law firm Cooley is giving financial backing.
Diversity VC, a UK initiative established last year by five young venture capitalists to tackle the lack of diversity within their industry, is set to launch in the US. The group released a report in May that revealed almost 50% of UK venture capital firms had no women working on their investment teams. Diversity VC in the US will get financial backing from law firm Cooley and support from Insight Venture Partners, Entrepreneur First and Female Founders Fund. Diversity VC said the US launch was in response to requests from workers in the US industry.
Ocean dead zones with zero oxygen have quadrupled in size since 1950, scientists have warned, while the number of very low oxygen sites near coasts has multiplied tenfold. Most sea creatures cannot survive in these zones and current trends would lead to mass extinction in the long run, risking dire consequences for the hundreds of millions of people who depend on the sea. Climate change driven by fossil fuel burning is the cause of the large-scale deoxygenation, as warmer waters hold less oxygen. The coastal dead zones result from fertiliser and sewage running off the land and into the seas. The analysis, published in the journal Science, is the first comprehensive study of these areas, and states: “Major extinction events in Earth’s history have been associated with warm climates and oxygen-deficient oceans.” Denise Breitburg, of the Smithsonian Environmental Research Center in the US, who led the analysis, said: “Under the current trajectory that is where we would be headed. But the consequences to humans of staying on that trajectory are so dire that it is hard to imagine we would go quite that far down that path.” “This is a problem we can solve,” Breitburg said. “Halting climate change requires a global effort, but even local actions can help with nutrient-driven oxygen decline.” She pointed to recoveries in Chesapeake Bay in the US and the Thames river in the UK, where better farm and sewage practices led to dead zones disappearing. However, Prof Robert Diaz at the Virginia Institute of Marine Science, who reviewed the new study, said: “Right now, the increasing expansion of coastal dead zones and decline in open ocean oxygen are not priority problems for governments around the world. Unfortunately, it will take severe and persistent mortality of fisheries for the seriousness of low oxygen to be realised.” The oceans feed more than 500 million people, especially in poorer nations, and provide jobs for 350 million people. But at least 500 dead zones have now been reported near coasts, up from fewer than 50 in 1950. Lack of monitoring in many regions means the true number may be much higher. The open ocean has natural low oxygen areas, usually off the west coast of continents due to the way the rotation of the Earth affects ocean currents. But these dead zones have expanded dramatically, increasing by millions of square kilometres since 1950, roughly equivalent to the area of the European Union. Furthermore, the level of oxygen in all ocean waters is falling, with 2% – 77bn tonnes – being lost since 1950. This can reduce growth, impair reproduction and increase disease, the scientists warn. One irony is that warmer waters not only hold less oxygen but also mean marine organisms have to breathe faster, using up oxygen more quickly. There are also dangerous feedback mechanisms. Microbes that proliferate at very low oxygen levels produce lots of nitrous oxide, a greenhouse gas that is 300 times more potent than carbon dioxide. In coastal regions, fertiliser, manure and sewage pollution cause algal blooms and when the algae decompose oxygen is sucked out of the water. However, in some places, the algae can lead to more food for fish and increase catches around the dead zones.
This may not be sustainable though, said Breitburg: “There is a lot of concern that we are really changing the way these systems function and that the overall resilience of these systems may be reduced.” The new analysis was produced by an international working group created in 2016 by Unesco’s Intergovernmental Oceanographic Commission. The commission’s Kirsten Isensee said: “Ocean deoxygenation is taking place all over the world as a result of the human footprint, therefore we also need to address it globally.” Lucia von Reusner, campaign director of the campaign group, Mighty Earth, which recently exposed a link between the dead zone in the Gulf of Mexico and large scale meat production, said: “These dead zones will continue to expand unless the major meat companies that dominate our global agricultural system start cleaning up their supply chains to keep pollution out of our waters.” Diaz said the speed of ocean suffocation already seen was breathtaking: “No other variable of such ecological importance to coastal ecosystems has changed so drastically in such a short period of time from human activities as dissolved oxygen.” He said the need for urgent action is best summarised by the motto of the American Lung Association: “If you can’t breathe, nothing else matters.”
Ocean dead zones, which contain no oxygen, have become four times larger since 1950, while the number of areas with very low oxygen close to coasts has increased tenfold, according to the first comprehensive analysis of these areas. Most marine species cannot exist in such conditions, and the continuation of such trends would result in mass extinction, endangering the livelihood of millions of people. Large-scale deoxygenation is the result of climate change caused by burning fossil fuels; as waters warm, they contain less oxygen.
Online retail and e-commerce giant Amazon is reportedly on the verge of making its first investment in an insurtech start-up, with the company said to be close to finalising an investment in online-only insurance start-up Acko. Acko wants to disrupt India’s insurance industry through a digital-only platform, having raised $30 million and recently received in-principle approval from the financial market regulators in India. Amazon and Indian rival Flipkart had both been pursuing an investment in Acko, it has been widely reported, but it is now thought that Amazon is close to signing a term-sheet for the investment and a partnership deal with Acko. It’s said that the arrangement will see Amazon acting as an online distributor for Acko’s insurance products, selling a range of financial products. The potential for Amazon to enter the insurance space has been much-discussed in recent months, including in our article from November, Incumbents could be relegated, if tech giants come for re/insurance. Now it appears Amazon is close to taking the sensible step of investing in and partnering with an insurtech start-up, in order to gain the ability to add insurance products to its retail offering, with the firm stepping into the sale of financial products for the first time. Targeting India first is also a smart move, as the burgeoning financial services market there has a strong focus on technology and take-up rates of insurance products are rising all the time. If Amazon can crack selling insurance online to the Indian market, it will stand it in good stead to break into more established markets such as the United States and Europe. Of course, if Amazon does move into insurance meaningfully it will likely only be a matter of time before other tech giants such as Google follow suit with their own integrated e-commerce offerings. It’s also been reported that Flipkart is readying its own entry into insurance sales online, with the establishment of a new entity to focus on financial services and venture investing.
Amazon is reportedly finalising an investment in Indian insurtech firm Acko. The arrangement would see Amazon acting as a distribution platform for the online-only insurer. Amazon's biggest rival in India, Flipkart, was also reportedly considering an investment in Acko. The start-up has raised $30m to date and has provisional approval to operate from India's financial markets regulator. 
Some smartphone games have been found to be using software that taps your device's microphone to track users' TV watching habits and collect data for advertisers. According to a recent New York Times report, more than 250 games on the Google Play Store use software from a company called Alphonso that uses the smartphone's mic to listen for audio signals in TV ads and shows. The data collected is then sold to advertisers for ad targeting and analysis. NYT reports that the software is used in games, some of which are geared towards children. Although the software does not record human conversations, it can detect sounds even when a phone is stowed away in someone's pocket and the apps are running in the background. Alphonso's chief executive Ashish Chordia told NYT that the company has also worked with film studios to analyse viewers' movie-watching habits in theaters as well. "A lot of the folks will go and turn off their phone, but a small portion of people don't and put it in their pocket," Chordia said. "In those cases, we are able to pick up in a small sample who is watching the show or the movie." While most apps seemed to be available in the Google Play Store, the Times noted that some were on Apple's App Store as well. Although the software's activities are creepy, some of the apps do disclose its tracking of "TV viewership details" in their descriptions under the "read more" button and software use policies. Both Apple and Google require apps to get explicit permission from users in order to access certain phone features such as the camera, microphone, location, photo gallery and more. However, most users don't usually read the disclosure and are often unaware they have agreed to let the app access their phone's microphone. "The consumer is opting in knowingly and can opt out any time," Chordia said. He also noted that the company's disclosures comply with Federal Trade Commission guidelines and offer instructions for users to opt-out of the software on its website. He added that the firm does not approve of its software being used in apps targeting children. However, it has been found integrated into a number of games such as "Teeth Fixed" and "Zap Balloons" by India-based KLAP Edutainment. A simple search for "Alphonso software" and "Alphonso automated" on the Play Store yields numerous apps that integrate the software. One game called "Dream Run" by Dumadu Games - which has been downloaded and installed by about 5000 to 10,000 users - discloses under a "Read More" button that it is integrated with Alphonso Automated Content Recognition (ACR) software. "With your permission provided at the time of downloading the app, the ACR software receives short duration audio samples from the microphone on your device," the disclosure reads. "Access to the microphone is allowed only with your consent, and the audio samples do not leave your device but are instead hashed into digital 'audio signatures.' "The audio signatures are compared to commercial content that is playing on your television, including content from set-top-boxes, media players, gaming consoles, broadcast, or another video source (e.g., TV shows, streaming programs, advertisements, etc.)." The revelation does seem to echo the years-long conspiracy theory that apps by major tech giants such as Facebook tap into users' smartphone mics to secretly listen in on conversations and offer up relevant ads. Facebook has long tried to dismiss the speculation.
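Alphonso has not disclosed its matching algorithm, but the general automated content recognition (ACR) approach described in the app disclosures, in which short audio samples are condensed into compact "signatures" and compared against known TV content, can be sketched roughly like this (a simplified illustration, not Alphonso's implementation; real systems use far more robust features and fuzzy matching):

```python
import hashlib
import numpy as np

def audio_signature(samples: np.ndarray, frame: int = 2048) -> str:
    """Toy ACR fingerprint: hash the dominant frequency bin of each audio frame.

    Real systems use far more robust features (spectral peak constellations,
    noise tolerance, time alignment); this only shows the hashing idea.
    """
    peaks = []
    for start in range(0, len(samples) - frame, frame):
        spectrum = np.abs(np.fft.rfft(samples[start:start + frame]))
        peaks.append(int(spectrum.argmax()))           # dominant bin per frame
    return hashlib.sha256(str(peaks).encode()).hexdigest()

# Two captures of the "same ad" (a 440 Hz tone at a 16 kHz sample rate) produce
# the same signature, which a server could match against a catalogue of content.
t = np.arange(0, 2.0, 1 / 16_000)
ad_on_tv = np.sin(2 * np.pi * 440 * t)
phone_capture = 0.5 * np.sin(2 * np.pi * 440 * t)      # quieter, same content
print(audio_signature(ad_on_tv) == audio_signature(phone_capture))   # True
```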
"We have to be really careful as we have more devices capturing more information in living rooms and bedrooms and on the street and in other people's homes that the public is not blindsided and surprised by things," Dave Morgan, the founder and CEO of Simulmedia that works with marketers on targeted ads, told the Times. "It's not what's legal. It is what's not creepy."
More than 250 games sold in the Google Play Store and Apple's App Store have been found to contain software that uses smartphone microphones to track users' TV-watching habits and sell the data on to advertisers. A New York Times report said the software, developed by a company called Alphonso, often goes undetected by users who do not read their phone software use policies, where it is detailed. Alphonso CEO Ashish Chordia said its activity was compliant with Federal Trade Commission guidelines and users could opt out at any time.
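Alphonso has not published its matching algorithm, so the following is only a rough sketch of the general approach the disclosure describes: short audio samples are hashed into compact "audio signatures" on the device and compared against signatures of known TV content. The frame size, function names and toy reference database below are hypothetical illustrations, not Alphonso's implementation.

```python
# Illustrative sketch only: Alphonso's actual ACR system is proprietary.
# It shows the general idea described in the disclosure: hash short audio
# samples into compact "signatures" and compare them against known content.
# All names and parameters here (fingerprint, match, the frame size, the
# toy ad) are hypothetical.

import hashlib
import numpy as np

FRAME = 1024  # samples per analysis frame (assumed)


def fingerprint(samples, rate=16000):
    """Hash the dominant frequency of each frame into a short hex signature."""
    sigs = []
    for start in range(0, len(samples) - FRAME, FRAME):
        frame = samples[start:start + FRAME]
        spectrum = np.abs(np.fft.rfft(frame))
        peak_bin = int(np.argmax(spectrum))            # dominant frequency bin
        peak_hz = round(peak_bin * rate / FRAME, -1)   # quantise to ~10 Hz
        sigs.append(hashlib.sha1(str(peak_hz).encode()).hexdigest()[:8])
    return sigs


def match(sample_sigs, reference_db):
    """Return the reference title sharing the most signature hashes, if any."""
    best_title, best_hits = None, 0
    for title, ref_sigs in reference_db.items():
        hits = len(ref_sigs.intersection(sample_sigs))
        if hits > best_hits:
            best_title, best_hits = title, hits
    return best_title


if __name__ == "__main__":
    rate = 16000
    t = np.arange(rate * 2) / rate                     # two seconds of audio
    ad_audio = np.sin(2 * np.pi * 440 * t)             # stand-in for a TV ad
    reference_db = {"soda_ad_2018": set(fingerprint(ad_audio, rate))}
    mic_capture = ad_audio + 0.01 * np.random.randn(len(ad_audio))
    print(match(fingerprint(mic_capture, rate), reference_db))  # soda_ad_2018
```

In a real system the signatures would be far more robust (multiple spectral peaks, time offsets, noise tolerance), but the privacy-relevant point is the same as in the disclosure: only the hashes, not the raw audio, need to leave the device.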
In 2007, the tragic case of 12-year-old Deamonte Driver gained national attention when the Maryland boy died from an untreated tooth infection because his family couldn’t find a dentist who would treat him. Instead of an $80 procedure that could have prevented Deamonte’s death, his saga turned into a series of hospital visits that came too late, ending with the needless loss of his life, but also costing taxpayers tens of thousands of dollars through Medicaid. And yet, little has changed across the U.S. since then when it comes to dental care access. There has been a serious market failure, harming lives and raising costs. Fortunately, a market solution now exists, if only states will adopt it: dental therapists.
Poor dental health affects tens of millions of Americans, with 63 million living in areas designated as dental shortage areas and suffering from decaying teeth, toothaches and chronic dental pain. Beyond the suffering itself, poor dental health damages work productivity across all age ranges, and only around one in three dentists accepts Medicaid patients. States are now attempting to overcome the barriers that stop both adults and children from receiving proper dental care, at little cost to the taxpayer. Policy advocates Grover Norquist and Don Berwick see a simple solution, one backed by more than 75 percent of Democrats and Republicans alike: allowing dentists to hire a range of professionals, from dental therapists to nurse practitioners, who can supply preventive care and perform routine procedures that do not require a dentist. This free-market approach would let dental therapists operate without unnecessary government barriers, treat more patients and give small practices room to expand. Tragedies like the death of Deamonte Driver, who died from an untreated tooth infection, would be less likely if dental therapists were able to operate. Dental therapists are far cheaper than dentists, provide high-quality care, are well trained and readily available, having existed in more than 50 countries for over a century. Should states turn to dental therapists, they would have a simple way to help patients and businesses and to combat rising health care costs.
Investment into Cardiff's commercial property sector reached record levels in 2017 at more than £400m, according to international real estate advisory firm Savills. In 2016 investment into the city totalled £298m. The research by Savills shows that volumes were heavily skewed this year by the activity at the Central Square regeneration scheme, which amounted to a combined £224.7m. In total, investment into the Cardiff office sector surpassed £350m, which represents a record high. Prime yields for Cardiff office investments fell 75 basis points from the beginning of the year to 5.50% as strong investor interest resulted in downward pressure. The second largest sector in terms of investment levels in 2017 was the leisure sector with investment reaching £43m. The sector was dominated by the £20.5m acquisition of Stadium Plaza by Naissance Capital Real Estate and the £22.1m purchase of the Clayton hotel by M&G Real Estate. Ross Griffin, director of investment at Savills Cardiff, said: "Cardiff remains a popular investment destination, particularly for those looking to place their money into the regional office market. The development at Central Square has significantly boosted the volumes for this year, providing an attractive opportunity. "Looking ahead to 2018 we expect to see continued activity from institutions on both the buy and sell side as they look for long term income. Overseas capital will also be active, particularly for prime assets at attractive yields."
Investment in Cardiff's commercial property sector reached record levels in 2017, at more than £400m ($542m), compared with £298m in 2016, according to Savills. Ross Griffin, director of investment at Savills Cardiff, said the figure was boosted by the £224.7m invested in the Central Square regeneration scheme. The data also revealed total investment in the office sector reached £350m, while prime yields for office investments dropped 75 basis points to 5.50%.
Office take-up in the fourth quarter of 2017 in the central Birmingham office market totalled 354,530 sq ft in 49 deals, according to figures compiled by the Birmingham Office Market Forum. When added to the 81 deals totalling 650,542 sq ft recorded in the first three quarters of the year, the 2017 year-end total take-up amounts to 1,005,072 sq ft in 130 deals. This compares with 692,729 sq ft in 139 deals for 2016; 970,458 sq ft in 132 deals for 2015; 713,460 sq ft in 148 deals for 2014; and 664,147 sq ft in 128 deals for 2013. The 2017 outcome makes it a record year for take-up, beating the previous high seen in 2015. Office take-up was boosted by the emergence of HS2-related demand together with the Government committing to the largest prelet seen in the city for a decade. In addition, key larger transactions were concluded in the Professional Services and Serviced Office sectors. Breaking 1 million sq ft of office take-up for the first time is extremely positive for Birmingham during the current period of unprecedented development activity and further regeneration, visible across the BOMF area. It is also particularly encouraging bearing in mind the slow first half of the year following on from the dip seen in the previous year's total. For further information please contact the author of the report Jonathan Carmalt, Director, Office Agency, JLL on 0121 214 9935 or email [email protected] Birmingham Office Market Forum was established in 2007 to present a co-ordinated voice to investors, developers and occupiers about Birmingham's city centre office market. The Forum brings together the city's leading office agents and Business Birmingham. For a list of member firms or further information visit their website
Office take-up in Birmingham during 2017 broke the one million sq ft barrier for the first time, in spite of a slow first half to the year, according to data from the city's Office Market Forum. The year saw a total of 130 deals, beating the previous record of 970,458 sq ft set in 2015. The figures were given a fillip by demand linked to the planned HS2 high-speed rail link, and the UK government committing to the city's biggest pre-let in 10 years.
This omission was noted by popular share investing website The Motley Fool, with British analyst GA Chester writing that previously, various figures Purplebricks gave made it possible to at least estimate the number. "My calculations of the average sale price suggested that either the company was cornering the market in trailer park homes sales or that a rather large proportion of instructions weren't being converted to completions," Mr Chester wrote. "Obviously, if you're charging a fixed fee but fail to complete the sale in too many cases, you're not going to have a sustainable business in the longer term," he added. Purplebricks' Australian website shows its agents have sold 2247 homes since September 2016 out of total listings of 3495. This suggests a clearance rate of 70-75 per cent, if recent listings are excluded. Also of concern is that while Purplebricks has continued to ramp up its British advertising spending, UK revenue growth has halved in the past two years, from 154 per cent in the first half of 2016-17 to 77 per cent in the second half of 2017-18. "For me, this trend appears ominous for the market's future top and bottom-line growth expectations," Mr Chester said. Investors nervous Despite these issues, Neil Woodford, Britain's most high-profile fund manager, remains a strong backer of Purplebricks, with his Woodford Investment Management Ltd retaining a 27 per cent stake having bought into the float. However, investors are clearly nervous – the Purplebricks share price dropped 6 per cent in September when it was briefly, incorrectly reported that Woodford had reduced its stake to just 2.99 per cent. Old Mutual is another backer of Purplebricks – the insurance and banking group recently increased its stake in the company to 12.6 per cent from 11.1 per cent. In its latest interim results released in mid-December, Purplebricks reported that total group revenue more than doubled to £46.8 million with its British business posting a healthy £3.2 million operating profit. However, losses at its Australian business more than doubled to £5.1 million after it spent £5.7 million on marketing the brand locally. Purplebricks upgraded its UK revenue guidance 5 per cent to £84 million as part of its interim results. The company said it was on course to achieve full-year revenue guidance of £12 million ($20.8 million) in Australia. Chief executive and co-founder Michael Bruce said the company's Australian business was "on track" and performing ahead of expectations. "Our progress in Australia has been exciting and encouraging. Our market share in Australia is greater than our market share was in the UK at the same time in its evolution," Mr Bruce said. The Australian Financial Review has reported on a few notable successes by Purplebricks estate agents, including veteran property executive Bryce Mitchelson, the managing director of $500 million childcare trust Arena REIT, selling a three-bedroom Edwardian home in Elsternwick for more than $1.7 million after just two weeks with Purplebricks and saving an estimated $38,000 in commission fees. In another high-profile result, Purplebricks saved property developer David Fam more than $61,000 on the $3.1 million sale of his Sutherland Shire mansion. However, the Financial Review has also highlighted what happens when Purplebricks does not achieve a result, leaving a customer with a big bill and a house that is still for sale.
This was the case for Sydney woman Kerryn Lehmann who ended up owing Purplebricks $12,000 when her four-bedroom riverfront home in Como in the Sutherland Shire failed to sell.
UK hybrid estate agency Purplebricks revealed a £3.2m ($4.3m) operating profit in its mid-December interim results, and has revised up its revenue guidance by 5% to £84m. However, the results did not include data on how many properties had been sold in the six months to October, prompting some to query the company's £4.16 share price and question if its up-front agent fee business model was sustainable. The firm's Australian arm was on target to achieve its full-year revenue guidance of £12m, despite incurring losses of £5.1m, the results showed.
The system came into effect on January 1, 2018. The measure has been introduced to improve water quality in the country by limiting phosphate production from dairy cattle manure and promoting a shift to land-based farming. The EC said that given the high density of dairy cattle in the Netherlands, the phosphate contained in dairy cattle manure represents a significant environmental concern. In addition to the main environmental objectives, the system also provides support for young farmers and is intended to have a positive effect on grazing and grassland. Trading rights Dairy farms will be awarded phosphate rights for free and will only be allowed to produce phosphate from dairy cattle manure corresponding to the phosphate production rights they hold. At the end of each calendar year, farms will be required to demonstrate that they have sufficient phosphate rights to justify the amount of phosphate produced by their dairy cattle manure. Dairy farms, including new entrants, can acquire phosphate rights on the market, as phosphate rights will be traded. When a transaction occurs, 10% of the traded rights will be withheld and kept in a 'phosphate bank.' This is intended to encourage the development of more land-based dairy farming by providing temporary, non-tradable rights to "land-based farms," which can fully absorb on their land all the phosphate from their own manure production. Based on the environmental objectives the system aims to achieve, the European Commission concluded that the system is in line with the EU rules for environmental State aid.
The European Commission has given the go-ahead to a trading system for phosphate rights for dairy cattle in the Netherlands, aimed at improving the country's water quality by limiting phosphate production from dairy cattle manure and encouraging a move to land-based farming. Dairy farmers will receive phosphate rights for free and will be obligated each year to prove they have sufficient rights to justify the quantity of phosphate produced by their manure. Phosphate rights can be obtained on the market, with 10% of the traded rights held back to promote the development of more land-based dairy farming.
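The trading mechanics described above (a free initial allocation, 10% of every traded amount withheld into the phosphate bank, and a year-end check that rights cover actual production) can be illustrated with a minimal sketch. The registry class, farm names and quantities are hypothetical and not part of the Dutch scheme's actual systems.

```python
# Minimal sketch of the trading rule described above: when phosphate rights
# change hands, 10% of the traded amount is withheld into a "phosphate bank",
# and each farm must hold rights covering its year-end phosphate production.
# The class, farm names and quantities are hypothetical illustrations.

BANK_SHARE = 0.10


class PhosphateRegistry:
    def __init__(self):
        self.rights = {}          # farm -> kg of phosphate rights held
        self.phosphate_bank = 0.0

    def award(self, farm, kg):
        """Free initial allocation of rights to a farm."""
        self.rights[farm] = self.rights.get(farm, 0.0) + kg

    def trade(self, seller, buyer, kg):
        """Transfer rights; 10% of the traded amount goes to the bank."""
        if self.rights.get(seller, 0.0) < kg:
            raise ValueError(f"{seller} does not hold {kg} kg of rights")
        withheld = kg * BANK_SHARE
        self.rights[seller] -= kg
        self.rights[buyer] = self.rights.get(buyer, 0.0) + kg - withheld
        self.phosphate_bank += withheld

    def compliant(self, farm, produced_kg):
        """Year-end check: does the farm hold enough rights for its production?"""
        return self.rights.get(farm, 0.0) >= produced_kg


registry = PhosphateRegistry()
registry.award("farm_a", 5000)              # initial free allocation
registry.trade("farm_a", "farm_b", 1000)    # farm_b receives 900, bank gets 100
print(registry.compliant("farm_b", 850))    # True
print(registry.phosphate_bank)              # 100.0
```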
"But, on the other hand, we would be foolish to rule anything out. We know that Asia-Pacific will be a very important market and we know a lot of the global growth in the future will come from there."
The UK's International Trade Secretary, Liam Fox, has said the UK could feasibly join the Trans-Pacific Partnership (TPP), remarking that "we would be foolish to rule anything out". The organisation is made up of Australia, Mexico, New Zealand, Canada, Chile, Japan, Singapore, Brunei, Peru, Vietnam and Malaysia - with Donald Trump pulling the US out last year - and is currently being renegotiated under the new name of the Comprehensive and Progressive Agreement for Trans-Pacific Partnership. Its aims are to lower both tariff and non-tariff barriers to trade and to provide a forum to settle international disputes.
HONG KONG, Jan. 3, 2018 /PRNewswire/ -- For an even better property searching experience, GoHome.com.hk, part of REA Group, has recently launched a new website with refined functionalities and layout improving the user experience. People can now search for their dream property and find out the latest property insights about the area, property prices and information about the property through one simple click. The new website has a new search result page and property details page, offering a new property and serviced apartment section and comprehensive secondary property listings. Responding to Hong Kong consumer demand, GoHome.com.hk has introduced a new mobile responsive function which allows layouts to be automatically fitted for multiscreen devices such as desktops, tablets and mobile devices, meaning the search for an ideal home is now even easier when you're on the go. Ms. Kerry Wong, Chief Executive Officer, Greater China Region, REA Group, said "GoHome.com.hk is the place that people use to find their perfect home. We've focused on improving the experience so people can now effortlessly explore and search for their ideal properties using specific criteria anywhere and anytime they want to." "By providing comprehensive and timely property information, we're changing the way our customers and consumers better understand property insights and trends by giving them access to the latest information in addition to searching for the perfect property through the one portal," said Ms Wong. Across its global network, REA Group's purpose is to change the way people experience property through delivering the best property insights and information on their websites and creating the most engaging consumer experience to help people find their perfect place more quickly and easily. For media queries, please contact: REA Group (Hong Kong): Ms. Hermia Chan, Tel.: +852 3965 4326 / +852 9386 0166, Email: hermia.chan@rea-group.com; Vis Communications Consultancy Limited: Mr. Felix Poon, Tel.: +852 2804 2388 / +852 9202 2885, Email: felix@vis-pr.com. About GoHome.com.hk GoHome.com.hk is Hong Kong's leading online property platform. Since 1999, GoHome.com.hk has been focused on providing value-added search experiences for the property-related industry and market in Hong Kong, Greater China and ASEAN. GoHome.com.hk was named "Property Portal of the Year" by Marketing Magazine in 2011, 2012 and 2013, "Best Property Developer Partner – Most Comprehensive Property Website" by Capital Magazine in 2013, 2014 and 2015, as well as "Outstanding Online Property Information Platform" at the Hong Kong Digital Brand Awards by Metro Broadcast Corporation Limited and CHKCI in 2017. About REA Group Limited REA Group Limited ACN 068 349 066 (ASX:REA) ("REA Group") is a multinational digital advertising business specialising in property. REA Group operates Australia's leading residential and commercial property websites, realestate.com.au and realcommercial.com.au, Chinese property site myfun.com and a number of property portals in Asia via its ownership of iProperty Group. REA Group also has a significant shareholding in US-based Move, Inc and PropTiger in India. Within Hong Kong, REA Group Asia operates GoHome.com.hk, squarefoot.com.hk and SMART Expo. The brands aim to provide consumers with extensive local and overseas property news, listings and investment opportunities while offering property and home-related advertisers a one-stop, multi-platform solution. SOURCE GoHome.com.hk, part of REA Group
Hong Kong online property platform GoHome.com.hk, part of REA Group, has overhauled its website to improve its customer experience. As well as featuring a new search result and property details page, the site now includes a property and serviced-apartment listings section, and automatically adapts to whatever device is being used to access it.
Two-thirds of Americans believe robots will soon take over the majority of tasks currently done by humans. Swedes, on the other hand, are not concerned about new technology. "No, I'm afraid of old technology," the Swedish minister for employment and integration, Ylva Johansson, told the New York Times. "The jobs disappear, and then we train people for new jobs. We won't protect jobs. But we will protect workers." A recent survey by the European Commission found that 80 percent of Swedes have a positive view of robots and AI. Why such enthusiasm? Swedish citizens tend to trust that their government and the companies they work for will take care of them, and they see automation as a way to improve business efficiency. Since Swedish employees actually do benefit from increased profits by getting higher wages, a win for companies is a win for workers. As the Times points out, the American tendency to worry about robots' replacing human workers is driven by the severe consequences of losing a job in the U.S. The risk of losing health insurance and a steady income makes people reluctant to leave jobs in favor of new career options or training. Sweden's free health care, education, and job transition programs dampen the risk of such undertakings—which may be why people in the country are mostly happy to pay income tax rates of up to nearly 60 percent. The U.S., by contrast, provides almost none of these services. The difference is especially stark in the area of employment assistance: the U.S. spends only about 0.1 percent of GDP on programs designed to help people deal with changes in the workplace (see "The Relentless Pace of Automation").
A majority of Swedish people have a positive view of the rise of robots and artificial intelligence, while the majority of Americans are concerned about it. The European Commission published a survey which found that 80% of respondents from Sweden had a positive view of such technology, higher than the European average of 61%, and lower only than Denmark (82%) and the Netherlands (81%). A separate survey from the Pew Research Centre in the United States found that 72% of adults there were worried about the technology.
In Illinois, researchers from the University of Illinois at Urbana-Champaign are engineering sugarcane plants, called lipidcane, to produce more oil as well as sugar. Growing lipidcane containing 20 percent oil would be five times more profitable per acre than soybeans, the main feedstock currently used to make biodiesel in the United States, and twice as profitable per acre as corn, according to their research. They estimate that compared to soybeans, lipidcane containing 5 percent oil could produce four times more jet fuel per acre of land. Lipidcane with 20 percent oil could produce more than 15 times more jet fuel per acre. Researchers estimate that if 23 million acres in the southeastern United States were devoted to lipidcane with 20 percent oil, the crop could produce 65 percent of the U.S. jet fuel supply at a cost to airlines of US$5.31 per gallon, which is less than bio-jet fuel produced from algae or other oil crops.
Researchers at the University of Illinois at Urbana-Champaign are tinkering with sugarcane plants to create a more cost-effective feedstock for aviation biofuels than corn or soybeans. According to the team, lipidcane containing 20% oil is twice as profitable per acre as corn, five times as profitable as soybeans, and could yield over 15 times more jet fuel per acre. The researchers also estimated that 23 million acres of lipidcane could produce 65% of the jet fuel used to supply US airlines, at a cost of $5.31 per gallon, cheaper than other biofuels.
by Laurie Sullivan @lauriesullivan, January 2, 2018 Virtual assistants have marketers scrambling to figure out how to optimize content as companies like Amazon begin testing voice-triggered audio search advertisements. Reports surfaced Tuesday that Amazon has been speaking to consumer packaged goods companies such as Clorox and Procter & Gamble to develop advertisements. The CPG companies would promote their products on Echo devices powered by the Alexa voice assistant. Early discussions have centered on whether companies would pay for higher placement if a user searches for a product on the device, similar to how paid-search works on Google, according to CNBC, which cited "people." This should not come as a surprise to marketers preparing to optimize content for voice searches. The ads are being described as what the industry refers to as sponsored content. For example, if someone asks Alexa how to clean up a spill, Amazon's voice assistant would respond to the query by naming a specific brand that paid for the sponsorship or bid a higher price to serve the voice advertisement. Advertisers are focused on search placement on Alexa and on other hubs because voice assistants typically only provide one answer per consumer query. Amazon has hinted at launching a voice-operated advertisement platform for sponsored products. And last week, reports surfaced that Amazon is testing several new advertising units and types. Another offering would allow brands to target users based on past shopping behavior or perhaps shopping behavior at the Whole Foods market. In May 2017, eMarketer estimated that the number of people in the U.S. using voice-enabled speakers would more than double to 36 million, with Amazon capturing nearly 71% of the market.
Amazon is testing audio advertisements for its voice-activated virtual assistant, Alexa. It is reported to have approached a number of companies to develop adverts that would be promoted on its Echo devices, with advertisers able to pay to optimise the placement of their products. The development would mean customers who ask for help with domestic problems could have branded products suggested to them or be played adverts for such products. The use of voice-activated virtual assistants in the US is predicted to grow significantly, making them attractive to advertisers as they often only offer one answer to a query.
Here's some climate-change news that President Trump will have trouble ignoring: Earth's junk food is in danger of losing a crucial ingredient. Scientists now predict that chocolate — which POTUS will sometimes eat to celebrate making important military decisions — could become impossible to grow in the coming decades because of hotter temperatures and less rain in regions where cacao plants are cultivated. The year 2050 is when they predict that people will be forced to satisfy a sweet tooth with toffee or caramels seasoned with tears. Like coffee plants and wine grapevines, cacao is a finicky tree. It only grows well in rain-forest land that's within 20 degrees of the equator. Half of the world's chocolate is produced in Côte d'Ivoire and Ghana, where the plants thrive at around 300 to 850 feet above sea level and under dependably humid weather conditions. But by 2050, researchers say that rising temperatures could push the optimal cultivation zone "uphill," to as high as 1,500 feet. Thankfully, a team from UC Berkeley is working on a possible fix. It's actually part of a new partnership that Mars announced last year. The M&M's and Snickers maker is investing $1 billion into a variety of efforts to fight climate change, and the scientists in a plant-genomics program at Berkeley hope to develop hardier cacao plants that won't wilt or rot at their current altitude. Berkeley's gene-editing technology, called CRISPR, has been in the works for a while, though when it gets attention, it's almost always for the potential to eliminate genetic diseases or (sort of on the extreme end of this) build "designer babies." But creator Jennifer Doudna tells Business Insider that the "most profound" application will likely be saving food.
Chocolate could disappear by 2050 because of higher temperatures and less rainfall in regions where cacao trees are grown, according to the US's National Oceanic and Atmospheric Administration. Cacao plants only grow well in rainforest regions that are near the equator, with half of the world's chocolate being produced in Côte d’Ivoire and Ghana at up to 850 ft above sea level. Rising temperatures are pushing growing regions to about 1,000 ft above sea level and into mountainous terrain. UC Berkeley is working with Mars to generate new strains of cacao plants using CRISPR gene-editing technology.
California-based asset management firm Reality Shares Advisors announced on Wednesday that its advisory board now includes six blockchain and cryptocurrency executives. The members are: Erik Voorhees, the founder of Coinapult and CEO of ShapeShift; Dr. Garrick Hileman, a research fellow at the University of Cambridge and researcher at the London School of Economics; Jeff Garzik, the co-founder and CEO of Bloq, a blockchain enterprise software company; Matthew Roszak, the founding partner of Tally Capital, a private investment firm focused on blockchain-enabled technology; Steve Beauregard, the founder and former CEO of leading blockchain payment processor GoCoin; and Derin Cag, the founder of Richtopia and co-founder of Blockchain Age, a research center and digital data consultancy for blockchain technology. While sharing more details about the board, Eric Ervin, CEO of Reality Shares, stated: "In recognizing the tremendous growth potential for blockchain technology while still in its infancy, this advisory board seeks to infuse our investment products and decisions with the knowledge and research of credible thought leaders in the space." Ervin then added: "Our newly-formed advisory board is comprised of well-regarded influencers at the forefront of blockchain innovation who are deeply entrenched in the disruptive technologies and ideas propelling the distributed ledger and cryptocurrency revolution." Founded in 2011, Reality Shares is described as an innovative asset management firm, ETF issuer, and index provider. The firm noted that its goal is to democratize the world's best investing ideas, using systematic quantitative methods to deliver products and solutions that support a range of investing objectives, such as diversification, lower correlation, risk mitigation, or unique market exposures.
Asset management company Reality Shares Advisors has appointed six blockchain and cryptocurrency executives to its advisory board to "infuse its investment products and decisions with the knowledge and research of credible thought leaders", according to CEO Eric Ervin. The appointees include Jeff Garzik, the CEO of blockchain enterprise software company Bloq, Derin Cag, co-founder of research centre Blockchain Age, and Steve Beauregard, former CEO of payment processor GoCoin. Reality Shares Advisors focuses on ETF and index investments.
CubeSats, low-cost, bite-sized satellites inspired by the tubes used to hold Beanie Babies, were invented in 1999 as educational tools. Their creators — engineering professors Bob Twiggs and Jordi Puig-Suari — hoped building satellites the size and shape of Rubik's Cubes would help students of all ages learn how to design and engineer efficiently. Now, aerospace suppliers and governments across the globe see the tools as the future of space commercialization and deep space exploration. They want to turn CubeSats into tools for low Earth orbit activities like telecommunications and reconnaissance. Companies like SpaceX, Virgin Galactic, Boeing and Airbus, for instance, want to create a space internet — a network of thousands of CubeSats that provide high speed broadband to remote parts of the world. And people like Paulo Lozano, director of the Space Propulsion Lab at the Massachusetts Institute of Technology, say sending the tiny satellites to asteroids could help improve space research (or even save the planet from an asteroid attack, he said). "Instead of going to an asteroid every five, 10 years the traditional way, release a fleet of these tiny little CubeSats and visit 100 asteroids because it's so cheap," he said. "Because some of these asteroids, especially the very small ones, have the potential to collide with the Earth. Detecting them in time is important [for stopping them], but also knowing their composition." Over the first decade of the CubeSat era, universities dominated the landscape, sending two of every three devices into space. Today, commercial companies and militaries have taken over, launching 70 percent of CubeSats in the last five years. But there's still one big problem: CubeSats can't move once they're in space — which limits their survival to months or years and makes them dangerous. "One of the big limitations in CubeSats is that they are launched as secondary payload. Once they are in space, they cannot move," Lozano said. Of the 750 or so CubeSats sent into space so far, almost all have lacked their own propulsion systems. The tiny satellites are transported alongside regular cargo, and then flung into space. But without their own rockets, the CubeSats cannot maneuver on their own. Most fall slowly back to Earth, but some remain in orbit for years, where they join the other 100 million pieces of space debris that are at risk of colliding with other satellites and space stations. The U.S. Air Force, whose Joint Space Operation Center monitors more than 23,000 orbiting objects larger than four inches in diameter, issues about 700,000 of these collision warnings to satellite owners per year. Imagine what would happen if thousands of CubeSats were added to the fray. What CubeSats need to stay in space are mini boosters, and scientists like Lozano are racing to build them. Moving with static Lozano's early work focused on big chemical rockets — the kind that you see strapped to space shuttles or on SpaceX missions. He knew these conventional rockets require huge fuel tanks — too big to be carried by CubeSats. Meanwhile, government standards limit how much chemical propellant can be carried by secondary cargo like CubeSats in order to prevent accidental explosions.
"You don't have a lot of leeway in what you're allowed to bring up because if your satellite blows up and you're the secondary payload, the primary people are going to be really angry," said Kristina Lemmer, a mechanical and aerospace engineer at Western Michigan University, who isn't involved with Lozano's research. So, Lozano needed an alternative. His inspiration: static electricity and tiny drops of salt water. Static electricity is caused by an imbalance between positive and negative charges in an object. Rub a balloon on your sweater, and its rubber surface becomes covered in negative charge (electrons). Place the balloon near your positively charged hair, and it tugs on the strands until you have a misshapen mohawk. Lozano's team designed a set of mini thrusters that rely on the same principle. The devices create an electric field that tugs on the charged particles in salt water until they peel off. The result is a spray made of charged molecules called ions. This ion spray doesn't create a lot of force. It's always less than a millinewton, which is akin to the force produced when a mosquito lands on your arm. But the spray moves very fast, and even a small action creates a reaction in the frictionless vacuum of outer space. Use this to move ions in one direction, and a CubeSat will move uber fast in the other. Lozano said the best chemical rockets produce a fiery exhaust that moves at about 9,000 miles per hour. His electrospray thrusters can go more than 111,000 miles per hour, he said. The thrusters, which look like computer microchips, are the size of quarters. The chips contain a grid of 500 needles — each a custom-built nozzle for spewing ions. His team tests them inside a large vacuum chamber at their lab in Boston. "In an ideal situation, all of the ions would have the same energy, but the physics of these ion beams makes it so that some ions have less energy than others," said Catherine Miller, an MIT doctoral student and NASA Space Technology Research Fellow. By studying how energy is distributed among the ion beams, she can calculate and standardize how each thruster will perform. Leading the space charge Only three propulsion boosters for CubeSats have been successfully demoed in space, Lemmer said. Lozano's system was one of them, through a partnership with the Aerospace Corporation in 2015. But Lemmer, who published a comprehensive review of CubeSat propulsion systems last year, said Lozano's ion engines stand out because each one can produce so much thrust. "Dr. Lozano's system is probably the frontrunner for the possibility for deep space missions," Lemmer said. "In order to go interplanetary, you're going to have to have an electric propulsion system because they are so much more efficient." Folks like NASA have counted on the high efficiency of ion engines in the past, such as with the Dawn mission to the asteroids Vesta and Ceres. That journey would have been impossible without the Dawn's high velocity ion engine. But the Dawn mission cost half a billion dollars. Commercial CubeSats can cost as little as $100,000 — and this price is dropping. Even children are building CubeSats at their elementary schools. While Lozano's electrospray thrusters don't work exactly like the Dawn's ion engines, Lemmer said the advantages are the same.
You can carry less propellant — Lozano’s fuel tanks are the size of sugar cubes — but move more efficiently. What’s next? With another demonstration scheduled in early 2018, Lozano is dreaming big. He hopes his tiny thrusters can help CubeSats reach Mars or send them on asteroid scouting voyages. “Since they are so small, you can actually land on the asteroid with these rockets and take off again,” Lozano said. Though nobody over the last 1,000 years has died because of an asteroid strike, as far as anyone knows, the chances are still disturbing. Your odds of being killed by an asteroid in your lifetime — one in 700,000 — rival death by flood or earthquake in the U.S. By launching a fleet of CubeSats, scientists could learn the chemical compositions of asteroids, which could be the key to destroying or redirecting them. An asteroid made of silicon, for instance, would be much tougher to stop than one made of iron. Lemmer said CubeSats with propulsion could also provide a cheaper way to test new space technologies. “Right now, if you want to put a new technology on a NASA satellite, it’s years in the making to run sufficient tests,” Lemmer said. Instead, “if you launch a new technology on a CubeSat and show that it works in space without bad things happening, then you can more easily translate into a NASA mission down the road.”
Engineers are racing to create propulsion methods that would allow CubeSats, 10 cm-sided satellite cubes, to move independently and in large numbers through space, enabling cheaper exploration of asteroids and planets. Among the projects is Paulo Lozano's electrospray propulsion system, which uses electric fields and tiny drops of salt water to produce an ion exhaust travelling at more than 111,000 mph. The device from Lozano, who is the director of the Space Propulsion Lab at the Massachusetts Institute of Technology, will undergo another demonstration in the coming months and is one of only three CubeSat propulsion systems to have been demonstrated in space.
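To see why the quoted exhaust speeds matter, the Tsiolkovsky rocket equation, delta-v = v_e * ln(m_initial / m_final), gives a back-of-the-envelope comparison. The ~9,000 mph and ~111,000 mph figures come from the article; the CubeSat mass and sugar-cube-scale propellant load below are assumed, illustrative values rather than numbers from Lozano's lab.

```python
# Back-of-the-envelope use of the Tsiolkovsky rocket equation,
#   delta_v = v_exhaust * ln(m_initial / m_final),
# to compare the exhaust speeds quoted in the article (~9,000 mph chemical vs
# ~111,000 mph electrospray). The 4 kg CubeSat mass and 50 g propellant load
# are assumed, illustrative figures, not values from Lozano's group.

import math

MPH_TO_MS = 0.44704


def delta_v(exhaust_mph, dry_mass_kg, propellant_kg):
    """Achievable velocity change (m/s) for a given propellant load."""
    v_e = exhaust_mph * MPH_TO_MS
    return v_e * math.log((dry_mass_kg + propellant_kg) / dry_mass_kg)


cubesat_dry = 4.0   # kg, assumed 3U-class CubeSat
propellant = 0.05   # kg, roughly a sugar-cube-scale fuel load

for label, v_mph in [("chemical (~9,000 mph exhaust)", 9_000),
                     ("electrospray (~111,000 mph exhaust)", 111_000)]:
    print(f"{label}: delta-v = {delta_v(v_mph, cubesat_dry, propellant):.1f} m/s")
# The electrospray case yields roughly 12x the delta-v from the same
# propellant mass, which is the efficiency advantage Lemmer describes
# for deep space missions.
```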
Approximately one hundred years ago, Erwin "Cannonball" Baker began driving cross-country, as quickly as possible, in anything he could get his hands on. His point: to demonstrate the reliability, range, and ease of refueling internal combustion cars. On Thursday, December 28th, 2017, Alex Roy joined Daniel Zorrilla, a Tesla Model 3 owner, to test the range and reliability of that vehicle—which happens to be one of the first delivered Model 3 customer cars. The pair departed the Portofino Inn in Redondo Beach, California; their final destination was the Red Ball garage in New York City. The two completed the cross-country drive in 50 hours and 16 minutes, setting a new electric Cannonball Run record. Total time: 50 hours, 16 minutes, 32 seconds; total mileage: 2,860 miles; charging cost: $100.95.
One of the first Tesla Model 3 customer cars has set a new record for the Cannonball Run in an electric vehicle. The car made the trip from Redondo Beach, California to New York City, a total journey mileage of 2,860 miles, in 50 hours and 16 minutes. The total charging cost for the duration of the Cannonball Run came in at $100.95.
Pioneering new technology is set to accelerate the global quest for crop improvement in a development which echoes the Green Revolution of the post war period. The speed breeding platform developed by teams at the John Innes Centre, University of Queensland and University of Sydney, uses a glasshouse or an artificial environment with enhanced lighting to create intense day-long regimes to speed up the search for better performing crops. Using the technique, the team has achieved wheat generation from seed to seed in just 8 weeks. These results appear today in Nature Plants. This means that it is now possible to grow as many as 6 generations of wheat every year -- a threefold increase on the shuttle-breeding techniques currently used by breeders and researchers. Dr Brande Wulff of the John Innes Centre, Norwich, a lead author on the paper, explains why speed is of the essence: "Globally, we face a huge challenge in breeding higher yielding and more resilient crops. Being able to cycle through more generations in less time will allow us to more rapidly create and test genetic combinations, looking for the best combinations for different environments." For many years the improvement rates of several staple crops have stalled, leading to a significant impediment in the quest to feed the growing global population and address the impacts of climate change. Speed breeding, says Dr Wulff, offers a potential new solution to a global challenge for the 21st century. "People said you may be able to cycle plants fast, but they will look tiny and insignificant, and only set a few seed. In fact, the new technology creates plants that look better and are healthier than those using standard conditions. One colleague could not believe it when he first saw the results." The exciting breakthrough has the potential to rank, in terms of impact, alongside the shuttle-breeding techniques introduced after the second world war as part of the green revolution. Dr Wulff goes on to say: "I would like to think that in 10 years from now you could walk into a field and point to plants whose attributes and traits were developed using this technology." This technique uses fully controlled growth environments and can also be scaled up to work in a standard glass house. It uses LED lights optimised to aid photosynthesis in intensive regimes of up to 22 hours per day. LED lights significantly reduce the cost compared to sodium vapour lamps which have long been in widespread use but are ineffective because they generate much heat and emit poor quality light. The international team also prove that the speed breeding technique can be used for a range of important crops. They have achieved up to 6 generations per year for bread wheat, durum wheat, barley, pea, and chickpea; and four generations for canola (a form of rapeseed). This is a significant increase compared with widely used commercial breeding techniques. Speed breeding, when employed alongside conventional field-based techniques, can be an important tool to enable advances in understanding the genetics of crops. "Speed breeding as a platform can be combined with lots of other technologies such as CRISPR gene editing to get to the end result faster," explains Dr Lee Hickey from the University of Queensland. The study shows that traits such as plant pathogen interactions, plant shape and structure, and flowering time can be studied in detail and repeated using the technology.
The speed breeding technology has been welcomed by wheat breeders who have become early adopters. Ruth Bryant, Wheat Pathologist at RAGT Seeds Ltd, Essex, UK, said: "Breeders are always looking for ways to speed up the process of getting a variety to market so we are really interested in the concept of speed breeding. We are working closely with Dr Wulff's group at the John Innes Centre to develop this method in a commercial setting." Dr Allan Rattey, a wheat crop breeder with Australian company Dow AgroSciences, has used the technology to breed wheat with tolerance to pre-harvest sprouting (PHS), a major problem in Australia. "Environmental control for effective PHS screening and the long time taken to cycle through several cycles of recurrent selection were major bottlenecks. The speed breeding and targeted selection platform have driven major gains for both of these areas of concern."
UK and Australian scientists have developed a speed-breeding technique that could enable crops to be harvested every eight weeks, according to an article in Nature Plants. LED lights were used to create day-long regimes in fully controlled growth environments, with success rates the team believes could lead to six generations of bread and durum wheat, as well as barley, peas, and chickpeas, each year. The research, by the John Innes Centre in the UK, the University of Queensland and University of Sydney, is aimed at finding ways to feed the growing human population.
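A quick check of the arithmetic behind the headline claim: an eight-week seed-to-seed cycle under the 22-hour lighting regime allows roughly six wheat generations per year, a threefold increase on conventional cycles. The conventional cycle length used for comparison below is an assumed, illustrative figure.

```python
# Quick check of the "6 generations per year" arithmetic: an 8-week
# seed-to-seed cycle under the 22-hour LED regime versus a slower
# conventional glasshouse cycle. The 20-week conventional figure is an
# assumed illustration, not a number from the paper.

WEEKS_PER_YEAR = 52


def generations_per_year(cycle_weeks):
    return WEEKS_PER_YEAR // cycle_weeks


print(generations_per_year(8))    # speed breeding: 6 generations per year
print(generations_per_year(20))   # assumed ~20-week conventional cycle: 2
```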
Workspace provider Regus has opened two new business centres in Liverpool, meaning that it now has three locations in the city. The centres are located on Merchants Court on Derby Square and at 1 Mann Island. Regus also has a site in Exchange Flags. Richard Morris, UK chief executive of Regus, said: "Demand for flexible workspace in Liverpool is booming so the city was a natural choice for our expansion plans. "The city is well-connected and offers excellent value and it's increasingly attracting investment and visitors from across the world. "We expect our new centres to be popular with a wide range of users including local small businesses, start-ups and remote workers as well as national firms opening satellite offices and global businesses establishing a footprint in the area."
Shared office provider Regus has added two new business centres to its portfolio in Liverpool. The new sites are located at Merchants Court, on Derby Square, and at 1 Mann Island, alongside the docks – both close to its Exchange Flags city-centre offices. Regus UK Chief Executive Richard Morris said the expansion is in response to growing demand for flexible office space in Liverpool, which is attracting investors from around the world.
Several Democratic-led states are looking to implement state-level individual mandates for insurance coverage in an effort to reduce the prominence of bare counties and failing risk pools due to the end of the Affordable Care Act's individual mandate in 2019 and other instabilities surrounding the law. California, New York, Maryland, Connecticut, and Washington state are all considering pursuing state individual mandates for insurance coverage when their state legislatures come into session in early 2018, ...
Following on from the Republican tax bill, several Democratic-led states are looking to implement state-level individual mandates to overcome bare counties and the prospect of failing risk pools. California, Connecticut, New York, Maryland and Washington State are all considering the move when their state legislatures come into session in early 2018. It is likely that the states will attempt to implement a model similar to RomneyCare, introduced in Massachusetts in 2006, and the ACA's individual mandate. These states do not require federal approval for the move, as the mandate penalty is a tax, and so have the ability to implement their own version of the Obamacare mandate. The move is unlikely to extend beyond these states, where partisan pushback is far more probable because state legislatures often skew to the right. California, in particular, is looking at a state individual mandate to overcome the uncertainty at federal level surrounding Obamacare. Maryland, too, would likely introduce an individual mandate; Massachusetts could fall back on its original scheme; and Washington State looks the most complicated.
McDonald's UK has pledged to give its employees their biggest pay rise in ten years, Mirror Online can reveal. The move comes into force on January 22 this year and is banded by position, region, and age. Only company-owned McDonald's restaurants (about a quarter of branches in Britain) are affected. A staff member at a McDonald's branch in London shared with Mirror Online a company notice put up by management on Tuesday night. The employee, who falls into the 21-24 category and has asked to remain anonymous, said in a private Facebook post: "WE WON THIS. Biggest pay rise for 10 years! If 0.001% going on strike can win this imagine what more can do!" They told Mirror Online: "[We've been told] pay will be raised, with some crew over 25 even getting £10 an hour! "Everyone's pay has gone up. It's not loads, but it's a win! My pay was around £7.45 and now it will be £7.95. It's the biggest raise in ten years." McDonald's recommends starting rates to managers. For perspective, under 18s currently get around £5.10 per hour, while those over 25 usually start on £7.60. McDonald's has confirmed to us that the wages on the pamphlet are correct. Now, 16-17 year-olds will join on a minimum wage of £5.75, while crew over the age of 25 will initially receive £8 per hour. The decision comes after last year's strikes – a British first for McDonald's – that saw staff from two branches stage a 24-hour protest. Workers at a branch in Cambridge, and another in Crayford, south east London, made history on Monday September 4 after repeated claims of poor working conditions, zero-hour contracts, and low pay. Some staff talked about "extreme stress" and even "bullying". Cambridge restaurant crew member Steve Day, who took part in the strikes, told Mirror Online: "Obviously we welcome this. It's brilliant and a step in the right direction. And it's good McDonald's are finally listening to us. "But there's much more to be done. It's still not really enough money to live on. Wages have been stagnant for so long, and this is McDonald's just buckling under a bit of pressure. "When the CEO gets £8,000 per hour [according to Steve] we think we should maybe get a little more. The burgers and fries don't cook themselves – we keep him in a job." The 25-year-old, whose wage will rise from £7.65 to £8 per hour, suggested more could be achieved were a greater proportion of the workforce organised. "It shows what an impact a small number of us can have. A tiny number did this – tiny, but not insignificant. I think we can do more." Steve, who's originally from Yorkshire and has worked for the company for nearly six months, also told us that he works around 35-40 hours a week on a zero-hour contract, and would like to be given better job security. More can be done While today is a small victory, the 30 strikers initially wanted to see their [crew member] wages rise to £10 per hour from around £7.50. McDonald's management had earlier in the year promised to give permanent positions to workers on zero-hour contracts. It's not clear whether this has been implemented.
The fast food workers who took action were at the time represented by the Bakers, Food and Allied Workers Union (BFAWU). A representative for the union called the strike a "historic step", said it would give employees its full support, and noted that a ballot had previously seen 95.7 per cent in favour of striking. At the time, Lewis Baker, who then worked at the Crayford McDonald's, one of the restaurants at which workers took action, wrote a blog post explaining the strike. He said: "We have been left with no choice but to strike. It's our only real option. We need to raise awareness over our working conditions and the way we are treated in McDonald's. "I – like many others – have had [my] grievances ignored by the company, time and time again." Labour leader Jeremy Corbyn said: "Congratulations to McDonald's workers and @BFAWU1 for winning pay rises but the fight for £10 an hour is not over. "We achieve more together than we can alone, which is why we should all join a trade union." McDonald's employs around 85,000 people in the UK. A spokesman told Mirror Online: "Reward and recognition for our people and their contribution is a key priority, and to ensure we can attract and retain the best people, we regularly review pay and benefits. "While our franchisees set their own pay rates, we have recommended an increase across all age bands for our hourly employees to be implemented from 22 January."
McDonald's UK will introduce its largest pay rise in a decade following the first-ever strikes in the country’s branches in September last year. The increases, which will be implemented from 22 January, mean that 16-17 year-olds will start working for the fast food company on a minimum wage of £5.75 ($7.77), up from £5.10. Those aged over 25 will receive an initial wage of £8 per hour, up from £7.60. The strikes, which took place in two branches in Cambridge and London, were called in protest at poor working conditions, low pay and the use of zero-hour contracts.
Not enough time for recovery Coral bleaching occurs when stressful conditions result in the expulsion of the algal partner from the coral. Before anthropogenic climate warming, such events were relatively rare, allowing for recovery of the reef between events. Hughes et al. looked at 100 reefs globally and found that the average interval between bleaching events is now less than half what it was before. Such narrow recovery windows do not allow for full recovery. Furthermore, warming events such as El Niño are warmer than previously, as are general ocean conditions. Such changes are likely to make it more and more difficult for reefs to recover between stressful events. Science, this issue p. 80 Abstract Tropical reef systems are transitioning to a new era in which the interval between recurrent bouts of coral bleaching is too short for a full recovery of mature assemblages. We analyzed bleaching records at 100 globally distributed reef locations from 1980 to 2016. The median return time between pairs of severe bleaching events has diminished steadily since 1980 and is now only 6 years. As global warming has progressed, tropical sea surface temperatures are warmer now during current La Niña conditions than they were during El Niño events three decades ago. Consequently, as we transition to the Anthropocene, coral bleaching is occurring more frequently in all El Niño–Southern Oscillation phases, increasing the likelihood of annual bleaching in the coming decades. The average surface temperature of Earth has risen by close to 1°C since the 1880s (1), and global temperatures in 2015 and 2016 were the warmest since instrumental record keeping began in the 19th century (2). Recurrent regional-scale (>1000 km) bleaching and mortality of corals is a modern phenomenon caused by anthropogenic global warming (3–10). Bleaching before the 1980s was recorded only at a local scale of a few tens of kilometers because of small-scale stressors such as freshwater inundation, sedimentation, or unusually cold or hot weather (3–5). The modern emergence of regional-scale bleaching is also evident from the growth bands of old Caribbean corals: synchronous distortions of skeletal deposition (stress bands) along a 400-km stretch of the Mesoamerican Reef have only been found after recent hot conditions, confirming that regional-scale heat stress is a modern phenomenon caused by anthropogenic global warming (10). Bleaching occurs when the density of algal symbionts, or zooxanthellae (Symbiodinium spp.), in the tissues of a coral host diminishes as a result of environmental stress, revealing the underlying white skeleton of the coral (8). Bleached corals are physiologically and nutritionally compromised, and prolonged bleaching over several months leads to high levels of coral mortality (11, 12). Global climate modeling and satellite observations also indicate that the thermal conditions for coral bleaching are becoming more prevalent (13, 14), leading to predictions that localities now considered to be thermal refugia could disappear by midcentury (15). Although several global databases of bleaching records are available (notably ReefBase, reefbase.org), they suffer from intermittent or lapsed maintenance and from uneven sampling effort across both years and locations (7). The time spans of five earlier global studies of coral bleaching range from 1870 to 1990 (3), 1960 to 2002 (4), 1973 to 2006 (5), 1980 to 2005 (6), and 1985 to 2010 (7).
Here we compiled de novo the history of recurrent bleaching from 1980 to 2016 for 100 globally distributed coral reef locations in 54 countries using a standardized protocol to examine patterns in the timing, recurrence, and intensity of bleaching episodes, including the latest global bleaching event from 2015 to 2016 (table S1). This approach avoids the bias of the continuous addition of new sites in open-access databases and retains the same range of spatial scales through time (fig. S1). A bleaching record in our analysis consists of three elements: the location, from 1 to 100; the year; and the binary presence or absence of bleaching. Our findings reveal that coral reefs have entered the distinctive human-dominated era characterized as the Anthropocene (16–18), in which the frequency and intensity of bleaching events is rapidly approaching unsustainable levels. At the spatial scale we examined (fig. S1), the number of years between recurrent severe bleaching events has diminished fivefold in the past four decades, from once every 25 to 30 years in the early 1980s to once every 5.9 years in 2016. Across the 100 locations, we scored 300 bleaching episodes as severe, i.e., >30% of corals bleached at a scale of tens to hundreds of kilometers, and a further 312 as moderate (<30% of corals bleached). Our analysis indicates that coral reefs have moved from a period before 1980 when regional-scale bleaching was exceedingly rare or absent (3–5) to an intermediary phase beginning in the 1980s when global warming increased the thermal stress of strong El Niño events, leading to global bleaching events. Finally, in the past two decades, many additional regional-scale bleaching events have also occurred outside of El Niño conditions, affecting more and more former spatial refuges and threatening the future viability of coral reefs. Increasingly, climate-driven bleaching is occurring in all El Niño–Southern Oscillation (ENSO) phases, because as global warming progresses, average tropical sea surface temperatures are warmer today under La Niña conditions than they were during El Niño events only three decades ago (Fig. 1). Since 1980, 58% of severe bleaching events have been recorded during four strong El Niño periods (1982–1983, 1997–1998, 2009–2010, and 2015–2016) (Fig. 2A), with the remaining 42% occurring during hot summers in other ENSO phases. Inevitably, the link between El Niño as the predominant trigger of mass bleaching (3–5) is diminishing as global warming continues (Fig. 1) and as summer temperature thresholds for bleaching are increasingly exceeded throughout all ENSO phases. Fig. 1 Global warming throughout ENSO cycles. Sea surface temperature anomalies from 1871 to 2016, relative to a 1961–1990 baseline, averaged across 1670 1° latitude–by–1° longitude boxes containing coral reefs between latitudes of 31°N and 31°S. Data points differentiate El Niño (red triangles), La Niña (blue triangles), and ENSO neutral periods (black squares). Ninety-five percent confidence intervals are shown for nonlinear regression fits for years with El Niño and La Niña conditions (red and blue shading, respectively; overlap is shown in purple). Fig. 2 Temporal patterns of recurrent coral bleaching. (A) Number of 100 pantropical locations that have bleached each year from 1980 to 2016. Black bars indicate severe bleaching affecting >30% of corals, and white bars depict moderate bleaching of <30% of corals. 
(B) Cumulative number of severe and total bleaching events since 1980 (red; right axis) and the depletion of locations that remain free of any bleaching or of severe bleaching over time (blue; left axis). (C) Frequency distribution of the number of severe (black) and total bleaching events (red) per location. (D) Frequency distribution of return times (number of years) between successive severe bleaching events from 1980 to 1999 (white bars) and from 2000 to 2016 (black bars).

The 2015–2016 bleaching event affected 75% of the globally distributed locations we examined (Figs. 2A and 3) and is therefore comparable in scale to the then-unprecedented 1997–1998 event, when 74% of the same 100 locations bleached. In both periods, sea surface temperatures were the warmest on record in all major coral reef regions (2, 19). As the geographic footprint of recurrent bleaching spreads, fewer and fewer potential refuges from global warming remain untouched (Fig. 2B), and only 6 of the 100 locations we examined have escaped severe bleaching so far (Fig. 2B and table S1). This result is conservative because of type 2 errors (false negatives) in our analyses, where bleaching could have occurred but was not recorded.

Fig. 3 The global extent of mass bleaching of corals in 2015 and 2016. Symbols show the 100 reef locations that were assessed: red circles, severe bleaching affecting >30% of corals; orange circles, moderate bleaching affecting <30% of corals; and blue circles, no substantial bleaching recorded. See table S1 for further details.

After the extreme bleaching recorded from 2015 to 2016, the median number of severe bleaching events experienced across our study locations since 1980 is now three (Fig. 2C). Eighty-eight percent of the locations that bleached in 1997–1998 have since bleached severely at least once again. Since 1980, 31% of reef locations have experienced four or more (up to nine) severe bleaching events (Fig. 2C), as well as many moderate episodes (table S1). Globally, the annual risk of bleaching (both severe and more moderate events) has increased at a rate of approximately 3.9% per annum (fig. S2), from an expected 8% of locations in the early 1980s to 31% in 2016. Similarly, the annual risk of severe bleaching has also increased, at a slightly faster rate of 4.3% per annum, from an expected 4% of locations in the early 1980s to 17% in 2016 (fig. S2). This trend corresponds to a 4.6-fold reduction in the estimated return times of severe events, from once every 27 years in the early 1980s to once every 5.9 years in 2016. Thirty-three percent of return times between recurrent severe bleaching events since 2000 have been just 1, 2, or 3 years (Fig. 2D).

Our analysis also reveals strong geographic patterns in the timing, severity, and return times of mass bleaching (Fig. 4). The Western Atlantic, which has warmed earlier than elsewhere (13, 19), began to experience regular bleaching sooner, with an average of 4.1 events per location before 1998, compared with 0.4 to 1.6 in other regions (Fig. 4 and fig. S2). Furthermore, widespread bleaching (affecting >50% of locations) has now occurred seven times since 1980 in the Western Atlantic, compared with three times for both Australasia and the Indian Ocean, and only twice in the Pacific. Over the entire period, the number of bleaching events has been highest in the Western Atlantic, with an average of 10 events per location, two to three times more than in other regions (Fig. 4).
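The bleaching records described above (a location from 1 to 100, a year, and whether bleaching occurred) lend themselves to a simple return-time calculation: sort each location's severe events by year and take the gaps between successive years. The sketch below is illustrative only and is not the authors' analysis pipeline; the records are invented (the real data are tabulated in table S1), and the final reciprocal calculation is a rough back-of-envelope check on the risk and return-time figures quoted above.

```python
# Illustrative sketch only: these records are invented; the real data are in table S1.
# Each record is (location_id, year, severity), with location_id from 1 to 100 and
# severity "severe" (>30% of corals bleached) or "moderate" (<30%).
from collections import defaultdict

records = [
    (1, 1983, "severe"), (1, 1998, "severe"), (1, 2010, "moderate"), (1, 2016, "severe"),
    (2, 1998, "severe"), (2, 2002, "moderate"), (2, 2016, "severe"),
]

# Gather the years of severe events at each location.
severe_years = defaultdict(list)
for loc, year, severity in records:
    if severity == "severe":
        severe_years[loc].append(year)

# Return times = gaps (in years) between successive severe events at the same location.
return_times = []
for years in severe_years.values():
    years.sort()
    return_times.extend(later - earlier for earlier, later in zip(years, years[1:]))

print(sorted(return_times))  # [15, 18, 18] for the toy records above

# Rough check: the expected return time is approximately the reciprocal of the annual
# probability of a severe event (~4% of locations per year in the early 1980s,
# ~17% in 2016, per the figures quoted above).
for annual_risk in (0.04, 0.17):
    print(f"annual risk {annual_risk:.0%} -> expected return time ~{1 / annual_risk:.1f} years")
# ~25 and ~5.9 years -- broadly consistent with the reported decline from ~27 to
# 5.9 years (the published estimates come from fitted trends, so the reciprocal
# is only a rough consistency check).
```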
Fig. 4 Geographic variation in the timing and intensity of coral bleaching from 1980 to 2016. (A) Australasia (32 locations). (B) Indian Ocean (24 locations). (C) Pacific Ocean (22 locations). (D) Western Atlantic (22 locations). For each region, black bars indicate the percentage of locations that experienced severe bleaching, affecting >30% of corals. White bars indicate the percentage of locations per region with additional moderate bleaching affecting <30% of corals.

In the 1980s, bleaching risk was highest in the Western Atlantic followed by the Pacific, with the Indian Ocean and Australasia having the lowest bleaching risk. However, bleaching risk increased most strongly over time in Australasia and the Middle East, at an intermediate rate in the Pacific, and slowly in the Western Atlantic (Fig. 4, fig. S3B, and tables S2 and S3). The return times between pairs of severe bleaching events are declining in all regions (fig. S3C), with the exception of the Western Atlantic, where most locations have escaped a major bleaching event from 2010 to 2016 (Fig. 2D). We tested the hypothesis that the number of bleaching events that have occurred so far at each location is positively related to the level of postindustrial warming of sea surface temperatures that has been experienced there (fig. S4). However, we found no significant relationship for any of the four geographic regions, consistent with each bleaching event being caused by a short-lived episode of extreme heat (12, 19, 20) that is superimposed on much smaller long-term warming trends. Hence, the long-term predictions of future average warming of sea surface temperatures (13) are also unlikely to provide an accurate projection of bleaching risk or the location of spatial refuges over the next century.

In the coming years and decades, climate change will inevitably continue to increase the number of extreme heating events on coral reefs and further drive down the return times between them. Our analysis indicates that we are already approaching a scenario in which every hot summer, with or without an El Niño event, has the potential to cause bleaching and mortality at a regional scale. The time between recurrent events is increasingly too short to allow a full recovery of mature coral assemblages, which generally takes from 10 to 15 years for the fastest growing species and far longer for the full complement of life histories and morphologies of older assemblages (21–24). Areas that have so far escaped severe bleaching are likely to decline further in number (Fig. 2B), and the size of spatial refuges will diminish. These impacts are already underway, with an increase in average global temperature of close to 1°C. Hence, 1.5° or 2°C of warming above preindustrial conditions will inevitably contribute to further degradation of the world’s coral reefs (14). The future condition of reefs, and the ecosystem services they provide to people, will depend critically on the trajectory of global emissions and on our diminishing capacity to build resilience to recurrent high-frequency bleaching through management of local stressors (18) before the next bleaching event occurs.

Supplementary Materials
www.sciencemag.org/content/359/6371/80/suppl/DC1
Materials and Methods
Figs. S1 to S4
Tables S1 to S3
References (25–29)
http://www.sciencemag.org/about/science-licenses-journal-article-reuse
This is an article distributed under the terms of the Science Journals Default License.
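The regional contrasts discussed for Fig. 4 (for example, an average of 10 bleaching events per location in the Western Atlantic) rest on a simple aggregation: count events per location, then average those counts within each region. Below is a minimal, hypothetical sketch of that kind of tally, in the same record format as the earlier sketch; the records and the location-to-region mapping are invented (the real assignments are listed in table S1), and the code is not the authors' analysis.

```python
# Illustrative sketch only: the records and region assignments are invented;
# the real 100 locations and their regions are listed in table S1.
from collections import defaultdict
from statistics import mean

# Hypothetical location -> region mapping (the paper groups locations into
# Australasia, Indian Ocean, Pacific, and Western Atlantic).
region_of = {1: "Australasia", 2: "Western Atlantic", 3: "Western Atlantic"}

# (location_id, year, severity) records, as in the earlier sketch.
records = [
    (1, 1998, "severe"), (1, 2016, "severe"),
    (2, 1987, "severe"), (2, 1995, "moderate"), (2, 1998, "severe"), (2, 2005, "severe"),
    (3, 1983, "severe"), (3, 1998, "severe"), (3, 2010, "moderate"),
]

# Tally bleaching events (severe and moderate) per location.
events_per_location = defaultdict(int)
for loc, _year, _severity in records:
    events_per_location[loc] += 1

# Average the per-location counts within each region, mirroring the
# "average number of events per location" comparison in Fig. 4.
counts_by_region = defaultdict(list)
for loc, count in events_per_location.items():
    counts_by_region[region_of[loc]].append(count)

for region, counts in sorted(counts_by_region.items()):
    print(f"{region}: {mean(counts):.1f} events per location")
# Australasia: 2.0 events per location
# Western Atlantic: 3.5 events per location
```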
Acknowledgments: Major funding for this research was provided by the Australian Research Council’s Centre of Excellence Program (CE140100020). The contents of this manuscript are solely the opinions of the authors and do not constitute a statement of policy, decision, or position on behalf of the National Oceanic and Atmospheric Administration or the U.S. government. Data reported in this paper are tabulated in the supplementary materials.
Tropical coral reefs across the world, on which millions of livelihoods depend and which are home to a third of all marine biodiversity, are under threat from repeated deadly bouts of warmer water, according to new research. The study of 100 reefs reveals that the interval between bleaching events, when unusually warm water causes coral to eject its algae with often fatal consequences, has fallen from once every 25–30 years in the 1980s to once every six years. The researchers have called for greater efforts to reduce greenhouse gas emissions to combat the warming.
NASA's Flight Opportunities program is already flying experiments on Blue Origin's New Shepard vehicle, but researchers and companies alike want NASA to also fund experiments with people on board.

BROOMFIELD, Colo. — As commercial suborbital vehicles capable of carrying both payloads and people prepare to enter service, NASA officials say they're willing to consider allowing agency-funded researchers to fly on those vehicles.

In an interview after a speech at the Next-Generation Suborbital Researchers Conference here Dec. 19, Steve Jurczyk, NASA associate administrator for space technology, said the agency would be open to allowing researchers funded by NASA's Flight Opportunities program to fly on suborbital spacecraft to carry out their experiments.

"As principal investigators propose, both internal to NASA and external, we'll do the same kind of process that we do with Zero G," he said, referring to the company that performs parabolic aircraft flights. Zero G flies investigations as part of the Flight Opportunities program, with researchers flying on the aircraft with their experiments.

Zero G's aircraft, a Boeing 727, is regulated by the Federal Aviation Administration. Jurczyk said that, in addition to the FAA oversight, NASA's Armstrong Flight Research Center performs an evaluation of the aircraft for investigations selected by the Flight Opportunities program for flights on it. "It just ensures that our grantees and contractors are safe to fly, and then we allow them to go fly," he said in a speech at the conference.

A similar procedure is not yet in place for suborbital vehicles, but Jurczyk said the agency would be open to finding some process analogous to that used for Zero G. "Moving forward, as these capabilities start coming online, we’ll figure it out," he said in the interview.

His comments come four and a half years after another agency official opened the door to flying people on commercial suborbital vehicles through the Flight Opportunities program. Speaking at the same conference in June 2013, Lori Garver, NASA deputy administrator at the time, said that past prohibitions on flying people would be lifted. "We absolutely do not want to rule out paying for research that could be done by an individual spaceflight participant — a researcher or payload specialist — on these vehicles in the future," Garver said then. "That could open up a lot more opportunities."

That announcement took the program by surprise, with its managers saying at the time that they had yet to craft a policy for allowing people to fly with their experiments. Development of such a policy suffered years of delays, in part because of Garver's departure from NASA just a few months after her announcement, as well as extended delays in the development of commercial suborbital vehicles capable of carrying people. "It mostly resulted in a bunch of ostriches sticking heads in the sand for a few years," said Erika Wagner, business development manager at Blue Origin, during a panel discussion at the conference Dec. 18.

Blue Origin's New Shepard vehicle is already carrying research payloads, including for Flight Opportunities, but without people on board. However, the vehicle will be able to support missions carrying payloads and people in the future. Virgin Galactic’s SpaceShipTwo vehicle will also fly research payloads accompanied by a payload specialist.
Wagner said she has seen some progress as both companies' vehicles advance through flight testing. "The heads are back out. They're looking around trying to understand what really are the barriers, what is the liability regime."

Those liability issues today, she said, prevent NASA civil servants from flying on the Zero G aircraft, even though outside researchers whose experiments are funded by NASA are able to do so. Jurczyk, in his speech at the conference, said that’s because they would have to sign a liability waiver to do so. "Right now, that’s just NASA policy. We don't have a strong mission need to do that," he said. "That's current policy. I’m not saying it's going to be policy forever and ever."

Scientists who would like to fly experiments on suborbital vehicles argue that such missions are analogous to fieldwork — oftentimes hazardous — performed in other fields. "Marine biologists and marine geologists get to put themselves in that very same operationally risky environment by going to the bottom of the ocean, to a deep sea vent," said Dan Durda, a planetary scientist at the Southwest Research Institute, during the Dec. 18 panel. "These vehicles offer us, as space scientists, that opportunity to get into the field the way that biologists and geologists do."

Advocates of commercial suborbital research, such as the Commercial Spaceflight Federation’s Suborbital Applications Research Group, have been pushing to allow NASA to fund human-tended experiments. "They're working quietly to get the word out that there are very definite needs for human-tended payloads," said Steven Collicott, a Purdue University professor, in a conference speech Dec. 19. "We've heard some encouraging words and we’re working quietly to try and move that ahead."

Others at the conference noted a decades-old precedent that suggests existing barriers to flying NASA-funded researchers on commercial suborbital vehicles can be overcome. In the 1980s, several payload specialists flew on the space shuttle, including Charles Walker, a McDonnell Douglas engineer who was part of three shuttle missions. Walker, in the Dec. 18 panel discussion, noted that on those shuttle missions he and his family signed liability waivers. He supported similar approaches to allow researchers to fly on commercial suborbital vehicles. "The environments opened up by suborbital flight and, at a greater scale, orbital flight, are laboratory environments," he said. "You should be there to maximize the answers that are coming out of the conduct in that environment."

This story was provided by SpaceNews, dedicated to covering all aspects of the space industry.
NASA is looking at allowing researchers from the agency on board commercial suborbital flights, according to Steve Jurczyk, NASA associate administrator for space technology. Blue Origin's New Shepard craft and Virgin Galactic’s SpaceShipTwo are among the vehicles that could carry researchers on suborbital flights. Liability issues on the relatively dangerous missions have so far deterred NASA from allowing staff aboard them.