A study by electrical supply company Crescent Electric (CESCO) reveals that Louisiana is the cheapest state in the US in which to mine Bitcoin. Digital currency mining requires a great deal of electric power, and rates differ from state to state. Based on CESCO’s latest study of the cost of cryptocurrency mining across the US, it is currently cheapest to mine Bitcoin in Louisiana, where electricity at 9.87 cents per kilowatt-hour puts the average cost of mining one Bitcoin at $3,224. That is significantly cheaper than the price of Bitcoin itself, which was trading at around $12,000 per coin as of press time. Where else in the US is it cheap to mine? In its study, CESCO estimated the cost of Bitcoin mining based on the wattage consumption of the three most popular mining rigs, namely the AntMiner S9, the AntMiner S7 and the Avalon 6, as well as the average number of days each rig takes to mine a token. These figures were then multiplied by the average electricity rate in each state. After Louisiana, the states rounding out the five lowest-cost places to mine Bitcoin are Idaho ($3,289 per token), Washington ($3,309), Tennessee ($3,443) and Arkansas ($3,505). The study also names the most expensive states for digital currency mining. That list is led by Hawaii, with an average mining cost of $9,483 per coin. Rounding out the five states with the highest Bitcoin mining costs are Alaska ($7,059), Connecticut ($6,951), Massachusetts ($6,674) and New Hampshire ($6,425). The growing interest in cryptocurrency has been accompanied by growing concern over the energy required to mine it, Bitcoin in particular. Such concerns have recently been challenged by a report that put cryptocurrency mining in the larger context of overall energy consumption.
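As a rough consistency check on those figures, dividing the Louisiana cost estimate by the quoted electricity rate gives the amount of energy the study implicitly assumes goes into each coin, and that same figure can be used to back out the rates implied for other states. The short Python sketch below uses only the numbers quoted above and assumes, as the article describes, that the same per-coin energy consumption was applied to every state's rate.

rate_louisiana = 0.0987   # dollars per kWh, the Louisiana electricity rate cited above
cost_louisiana = 3224     # dollars per coin, CESCO's Louisiana estimate

# Implied electricity consumed per coin (averaged across the three rigs in the study):
kwh_per_coin = cost_louisiana / rate_louisiana
print(f"Implied energy per coin: ~{kwh_per_coin:,.0f} kWh")   # about 32,700 kWh

# Assuming the study applied the same per-coin energy figure in every state,
# Hawaii's $9,483 estimate implies an electricity rate of roughly $0.29 per kWh:
cost_hawaii = 9483
print(f"Implied Hawaii rate: ~${cost_hawaii / kwh_per_coin:.2f} per kWh")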
A new study has named Louisiana as the cheapest state in the US in which to mine bitcoin. Electrical supply company Crescent Electric based its calculation on the cost of electricity in each state, the power requirements of the equipment needed, and the average length of time taken to mine a token. This produced a figure of $3,224 per bitcoin for Louisiana, with the most expensive places being Hawaii, at $9,483, and Alaska at $7,059. All of these figures are notably less than the current trading price of bitcoin.
Although the name of the complex was changed to Spring Creek Towers several years ago, it is still widely known as Starrett City. A massive development, it has its own power plant, schools, recreation center and ZIP code. The sale has garnered some notoriety not just because of its size but also because President Trump has a small stake in the complex. Carol G. Deane, the managing partner of Starrett City Associates, who was behind the sale, had argued in court that she balanced the need to satisfy shareholders with a deal that could win government approval and preserve Starrett City as a home for low- and moderate-income New Yorkers. More than 70 percent of the limited partners and beneficial owners approved the deal in September. “We are pleased with the decision denying plaintiffs’ efforts to derail the sale of Spring Creek Towers,” Ms. Deane said in a statement Tuesday, “but not surprised because of the care we put into the process and choosing a buyer who is committed to maintaining the development as affordable and a quality place to live for the 15,000 residents who call it home.” Joshua D.N. Hess, a lawyer for the dissidents, said Tuesday that they were reviewing the judge’s order and their options, which could include suing for damages. Ms. Deane’s late husband, the complex’s original developer, died in 2010; she was his third wife. He had tried to sell the complex to the highest bidder during a debt-fueled real estate boom in 2007, but the deal fell apart amid sharp criticism from city, state and federal officials, as well as tenants.
The sale of a huge apartment complex in New York has been cleared by a judge after it was challenged in court. Starrett City in Brooklyn is the largest federally subsidised housing development in the United States, with 5,581 apartments on a 145-acre site. It is being sold for $905m by the widow of its original developer, but the transaction has been opposed by a rival bidder, backed by a partner in the complex. The Supreme Court of the state of New York has now dismissed its objection, but the transaction still requires approval by state and federal officials.
L'Oreal will expand a media ownership strategy it piloted in Mexico to other Spanish-speaking countries to generate first-party cookies from its customer base. The idea is similar to sponsored content, but instead of working with a brand-name media company on a story package, for the past year L’Oreal has developed fiufiu, a kind of pop-up media brand that turns out social media aggregated lists, influencer columns and work by the content generation startup Cultura Colectiva. Fiufiu has a website and a Facebook presence. During some months it generates unique visitor numbers on par with well-known beauty magazines, said L'Oreal Hispanic countries CMO Andres Amezquita. The company could generate page views and engagement if it paired social influencer campaigns with beauty magazine sponsored content deals, but with fiufiu L’Oreal gets first-party cookies and the opportunities to request email addresses that come with page ownership. “Part of our programmatic approach is to have a lot of data and understand our consumers,” Amezquita said. “The idea is to be able to have a one-on-one conversation at the scale of mass communication, but to do that in the current environment, we need to capture cookies.” Recent policy changes at the government and operating system levels, such as GDPR in Europe and Apple’s Safari Intelligent Tracking Prevention, block access to cookie data for marketers. Younger customers, who make up most of the audience for the social-driven L’Oreal site, also respond better to in-article recommendations than to the hard sell of on-page ads, Amezquita said, and every page is another chance to gather an email address or send someone to an L’Oreal product page. Consumer brands need first-party cookies to advance their online advertising. Since ad tech companies that do audience targeting or retargeting are losing access to first-party data, brands will need to bring their own data to continue running data-driven campaigns. For instance, L’Oreal can use fiufiu to generate retargeting audiences and extrapolate its first-party data through lookalike models scaled for programmatic, Amezquita said. “If you look at industry right now, it’s a moment where content, CRM and digital advertising are becoming one,” he said.
Cosmetics firm L'Oreal is set to expand a media strategy piloted in Mexico to other Spanish-speaking nations, according to CMO Andres Amezquita. Using its media brand Fiufiu, L'Oreal plans to capture first-party cookies to promote its online advertising. Brands will have to take more ownership of their data and apply similar models to those employed by the adtech companies to target their audience effectively.
Though it has billions in the bank, Scripps Health will pursue layoffs in 2018 as part of a reorganization strategy that emphasizes lower costs and greater reliance on caring for patients outside of its five hospitals. In a recent memo to all of the health system’s 15,000 employees and 3,000 affiliated doctors, Chris Van Gorder, Scripps’ chief executive officer, says that cuts are necessary to remain competitive in a health care world where health insurance companies increasingly consider low prices as a main factor in contracting and patients are more often shopping around for services as deductibles increase. Scripps missed its annual budget by $20 million last year for the first time in 15 years, Van Gorder said in an interview this week. It was a wake-up call, he said, that added urgency to the need to both cut costs and also re-think how the private not-for-profit health network does business. “We’ve got to shift our organizational structures around to be able to deal with the new world of health care delivery, find ways of lowering our costs significantly,” Van Gorder said. “If we don’t, we will not be able to compete.” Scripps is far from the only large health care system to announce cost-cutting measures in recent months. Advocate Health, Illinois’ largest health care provider with 12 hospitals and 35,000 employees, announced in May that it would try to cut more than $200 million in costs due to flat revenue projections. Tenet Health, the nation’s third-largest medical chain with 77 hospitals, announced in October that it will eliminate a middle layer of management after posting a $56 million loss on a 1.4 percent single-quarter revenue decline, according to industry news source Modern Healthcare. Things haven’t gotten quite so bad yet at Scripps. A look at Scripps’ bond industry financial filings makes it clear that this is not a company teetering on the edge of insolvency. Far from it. Scripps, the audited financial statements for the 2017 budget year show, has banked about $2.6 billion in unrestricted cash and other investments. However, Scripps has recently seen its operating margin, the percentage of revenue left over after all the bills are paid, shrink significantly from about 9 percent in 2012 to 2.3 percent in 2017. Financial statements show that Scripps’ bottom line was significantly bolstered this year by investment income. A roaring stock market helped significantly increase the value of its holdings, pushing total profitability up to $350 million for the year, a number that is 25 percent greater than last year. How can an organization be making money, have a significant financial cushion and yet still be planning for layoffs in the coming year? Van Gorder said the basic fact is that operation of the organization must continue to bring in more revenue than it spends and declining operating margins must be addressed even if revenue from the stock market is currently masking those declines. “As strong as we are on the bottom, bottom line, the trends at the top end are changing, and we have to adjust to them,” Van Gorder said. Still, with more than $2 billion in the bank, couldn’t Scripps afford to burn some savings and stave off layoffs? Van Gorder said that it’s not responsible management to fix a recurring problem with savings.
And, having a hefty balance sheet, he added, is necessary to get favorable interest rates from lenders as Scripps moves to execute a recently announced $2.6 billion building plan that will replace Scripps Mercy Hospital in Hillcrest, add a new patient tower at Scripps Memorial Hospital, La Jolla, and upgrade facilities in Encinitas, Chula Vista and Torrey Pines. These upgrades and replacements, the executive added, are made more urgent than they would otherwise be due to a state law that requires all hospitals to meet certain seismic requirements by 2030 or cease operation. In addition to borrowing and revenue from philanthropy, Scripps plans to tap its savings to fund its building plan. That plan, Van Gorder said, will continue but will be undertaken with the knowledge that insurance companies want to avoid paying the higher prices charged by hospitals whenever possible. That means pulling back on the previous tendency to devote plenty of space inside hospital complexes to patients whose medical needs don’t require them to be admitted for an overnight stay. Scripps has already started on this path with the purchase of Imaging Healthcare Specialists in 2015. The company operates stand-alone imaging centers that offer cheaper scans than are available in the outpatient centers attached to the region’s major hospitals. “We’re seeing a huge shift in the ambulatory side,” Van Gorder said. “We’re now, for example, doing total joint replacement in ambulatory surgery centers. A year ago, that didn’t take place. The experts are telling me and others that you’re going to see a huge jump in that as technology continues to improve … that’s going to pull a whole lot of utilization away from hospitals and hospitals that haven’t prepared for that shift are going to be in deep trouble in the not-too-distant future. That’s why we’re trying to make this shift proactively.” Scripps’ bond filings do show that it has seen a significant shift in its business over the last five years. From 2011 to 2016, the most recent year for which aggregated patient data is available, Scripps reported that inpatient discharges, the total number of days that patients spent in its hospitals, and surgeries performed in hospitals all decreased slightly. Meanwhile, the number of surgeries performed at its outpatient surgery centers increased 69 percent. Visits to Scripps Clinic and Scripps Coastal Medical Group increased 14 percent and 19 percent respectively and emergency visits were up 21 percent. Scripps’ first step in its reorganization plan to better address the shift from inpatient to outpatient is to collapse its ranks of hospital management. Instead of having a chief executive officer to manage operations at each of its five hospitals, the plan is to have two executives handle those duties, one for hospitals in La Jolla and Encinitas and another to oversee operations in San Diego and Chula Vista. A third executive will be in charge of all ancillary services, including operation of the medical offices, outpatient surgery centers and other facilities that Scripps owns or leases throughout the region. This consolidation will not require any layoffs. Gary Fybel, currently the chief executive officer at Scripps Memorial Hospital La Jolla, and Robin Brown, chief executive officer at Scripps Green Hospital, will retire. The northern chief executive officer position has been awarded to Carl Etter, current chief executive officer of Scripps Memorial Hospital Encinitas.
The south position will be handled by Tom Gammiere, who currently runs Scripps Mercy Hospital in Hillcrest. Lisa Thakur, currently corporate vice president of operating rooms, pharmacy and supply chain, will fill the ancillary services post, and her previous job will not be filled. Van Gorder said further personnel cuts are coming and are intended to save $30 million during the current budget year ending Sept. 30, 2018. Cuts should save $40 million in subsequent years, he added. Most will be focused on administrative job classifications. “There will be layoffs,” Van Gorder said. “I don’t like it, but it has to be done for me to protect the organization and our ability to take care of our community into the future.” So far, he said, there are no specifics to share on which particular jobs are most at risk. He added that, as the current reorganization effort takes shape, hiring is also anticipated. More workers will be needed to provide quicker service to patients. “What I want to do at the point of service is support our nursing staff with more paid professional staff,” Van Gorder said. Gerard Anderson, a professor of health policy and management at Johns Hopkins School of Medicine in Baltimore, said the current trend of cost-cutting at large health care systems does indeed appear to be driven by decreases in reimbursement by Medicare and private health insurance companies. “In almost every single hospital they’re losing some amount of money in operating margin but making it up in spades when you add in their investment income,” Anderson said. The researcher said he was a bit disturbed to see organizations flush with stock market earnings discuss cuts, especially if those cuts are to employees who directly care for patients. “You’re seeing layoffs but I don’t necessarily understand why you need to lay someone off when your margin is something like 11 percent after non-operating income is added into the mix,” Anderson said. But what about the need to shift business strategy and reduce administrative expenses as reimbursement falls? Anderson said he can see that point as long as it’s true. “If they really are doing this in the non-patient-care area, I think that makes sense,” he said. “In general, we’ve seen more across-the-board cuts than targeted specifically to areas of management and administration. The salaries in management have grown faster than they have for clinicians.”
The San Diego-based non-profit health care system Scripps Health is set to pursue layoffs in a restructuring process in 2018. Scripps missed its annual budget by $20 million and intends to target lower costs and a greater dependence on caring for patients outside of its hospitals. The company’s CEO, Chris Van Gorder, revealed in a memo that the cuts are necessary with insurers and patients increasingly targeting lower prices as health costs rise. The announcement by Scripps follows a cost-cutting trend among large health care organizations; both Advocate Health of Illinois and Tenet Health announced similar measures following disappointing results. Although Scripps maintains a $2.6 billion balance in cash and investments, and total profitability this year of $350 million, its operating margin has slimmed from 9 percent in 2012 to 2.3 percent in 2017. The firm believes that it would not be responsible management to use savings to cover a recurring problem, especially as it needs favorable interest rates from lenders on its $2.6 billion building project. Alongside the personnel cuts intended to save $30 million in the current budget year, there will be a consolidation of management positions.
Beneath the waves, oxygen disappears As plastic waste pollutes the oceans and fish stocks decline, unseen below the surface another problem grows: deoxygenation. Breitburg et al. review the evidence for the downward trajectory of oxygen levels in increasing areas of the open ocean and coastal waters. Rising nutrient loads coupled with climate change—each resulting from human activities—are changing ocean biogeochemistry and increasing oxygen consumption. This results in destabilization of sediments and fundamental shifts in the availability of key nutrients. In the short term, some compensatory effects may result in improvements in local fisheries, such as in cases where stocks are squeezed between the surface and elevated oxygen minimum zones. In the longer term, these conditions are unsustainable and may result in ecosystem collapses, which ultimately will cause societal and economic harm. Science, this issue p. eaam7240 Structured Abstract BACKGROUND Oxygen concentrations in both the open ocean and coastal waters have been declining since at least the middle of the 20th century. This oxygen loss, or deoxygenation, is one of the most important changes occurring in an ocean increasingly modified by human activities that have raised temperatures, CO2 levels, and nutrient inputs and have altered the abundances and distributions of marine species. Oxygen is fundamental to biological and biogeochemical processes in the ocean. Its decline can cause major changes in ocean productivity, biodiversity, and biogeochemical cycles. Analyses of direct measurements at sites around the world indicate that oxygen-minimum zones in the open ocean have expanded by several million square kilometers and that hundreds of coastal sites now have oxygen concentrations low enough to limit the distribution and abundance of animal populations and alter the cycling of important nutrients. ADVANCES In the open ocean, global warming, which is primarily caused by increased greenhouse gas emissions, is considered the primary cause of ongoing deoxygenation. Numerical models project further oxygen declines during the 21st century, even with ambitious emission reductions. Rising global temperatures decrease oxygen solubility in water, increase the rate of oxygen consumption via respiration, and are predicted to reduce the introduction of oxygen from the atmosphere and surface waters into the ocean interior by increasing stratification and weakening ocean overturning circulation. In estuaries and other coastal systems strongly influenced by their watershed, oxygen declines have been caused by increased loadings of nutrients (nitrogen and phosphorus) and organic matter, primarily from agriculture; sewage; and the combustion of fossil fuels. In many regions, further increases in nitrogen discharges to coastal waters are projected as human populations and agricultural production rise. Climate change exacerbates oxygen decline in coastal systems through similar mechanisms as those in the open ocean, as well as by increasing nutrient delivery from watersheds that will experience increased precipitation. Expansion of low-oxygen zones can increase production of N2O, a potent greenhouse gas; reduce eukaryote biodiversity; alter the structure of food webs; and negatively affect food security and livelihoods. Both acidification and increasing temperature are mechanistically linked with the process of deoxygenation and combine with low-oxygen conditions to affect biogeochemical, physiological, and ecological processes.
However, an important paradox to consider in predicting large-scale effects of future deoxygenation is that high levels of productivity in nutrient-enriched coastal systems and upwelling areas associated with oxygen-minimum zones also support some of the world’s most prolific fisheries. OUTLOOK Major advances have been made toward understanding patterns, drivers, and consequences of ocean deoxygenation, but there is a need to improve predictions at large spatial and temporal scales important to ecosystem services provided by the ocean. Improved numerical models of oceanographic processes that control oxygen depletion and the large-scale influence of altered biogeochemical cycles are needed to better predict the magnitude and spatial patterns of deoxygenation in the open ocean, as well as feedbacks to climate. Developing and verifying the next generation of these models will require increased in situ observations and improved mechanistic understanding on a variety of scales. Models useful for managing nutrient loads can simulate oxygen loss in coastal waters with some skill, but their ability to project future oxygen loss is often hampered by insufficient data and climate model projections on drivers at appropriate temporal and spatial scales. Predicting deoxygenation-induced changes in ecosystem services and human welfare requires scaling effects that are measured on individual organisms to populations, food webs, and fisheries stocks; considering combined effects of deoxygenation and other ocean stressors; and placing an increased research emphasis on developing nations. Reducing the impacts of other stressors may provide some protection to species negatively affected by low-oxygen conditions. Ultimately, though, limiting deoxygenation and its negative effects will necessitate a substantial global decrease in greenhouse gas emissions, as well as reductions in nutrient discharges to coastal waters. Low and declining oxygen levels in the open ocean and coastal waters affect processes ranging from biogeochemistry to food security. The global map indicates coastal sites where anthropogenic nutrients have exacerbated or caused O2 declines to <2 mg liter−1 (<63 μmol liter−1) (red dots), as well as ocean oxygen-minimum zones at 300 m of depth (blue shaded regions). [Map created from data provided by R. Diaz, updated by members of the GO2NE network, and downloaded from the World Ocean Atlas 2009]. Abstract Oxygen is fundamental to life. Not only is it essential for the survival of individual animals, but it regulates global cycles of major nutrients and carbon. The oxygen content of the open ocean and coastal waters has been declining for at least the past half-century, largely because of human activities that have increased global temperatures and nutrients discharged to coastal waters. These changes have accelerated consumption of oxygen by microbial respiration, reduced solubility of oxygen in water, and reduced the rate of oxygen resupply from the atmosphere to the ocean interior, with a wide range of biological and ecological consequences. Further research is needed to understand and predict long-term, global- and regional-scale oxygen changes and their effects on marine and estuarine fisheries and ecosystems. Oxygen levels have been decreasing in the open ocean and coastal waters since at least the middle of the 20th century (1–3). This ocean deoxygenation ranks among the most important changes occurring in marine ecosystems (1, 4–6) (Figs. 1 and 2).
The oxygen content of the ocean constrains productivity, biodiversity, and biogeochemical cycles. Major extinction events in Earth’s history have been associated with warm climates and oxygen-deficient oceans (7), and under current trajectories, anthropogenic activities could drive the ocean toward widespread oxygen deficiency within the next thousand years (8). In this Review, we refer to “coastal waters” as systems that are strongly influenced by their watershed, and the “open ocean” as waters in which such influences are secondary. Fig. 1 Oxygen has declined in both the open ocean and coastal waters during the past half-century. (A) Coastal waters where oxygen concentrations ≤61 μmol kg−1 (63 μmol liter−1 or 2 mg liter−1) have been reported (red) (8, 12). [Map created from data in (8) and updated by R. Diaz and authors] (B) Change in oxygen content of the global ocean in mol O2 m−2 decade−1 (9). Most of the coastal systems shown here reported their first incidence of low oxygen levels after 1960. In some cases, low oxygen may have occurred earlier but was not detected or reported. In other systems (such as the Baltic Sea) that reported low levels of oxygen before 1960, low-oxygen areas have become more extensive and severe (59). Dashed-dotted, dashed, and solid lines delineate boundaries with oxygen concentrations <80, 40, and 20 μmol kg−1, respectively, at any depth within the water column (9). [Reproduced from (9)] Fig. 2 Dissolved oxygen concentrations in the open ocean and the Baltic Sea. (A) Oxygen levels at a depth of 300 m in the open ocean. Major eastern boundary and Arabian Sea upwelling zones, where oxygen concentrations are lowest, are shown in magenta, but low oxygen levels can be detected in areas other than these major OMZs. At this depth, large areas of global ocean water have O2 concentrations <100 μmol liter−1 (outlined and indicated in red). ETNP, eastern tropical North Pacific; ETSP, eastern tropical South Pacific; ETSA, eastern tropical South Atlantic; AS, Arabian Sea. [Max Planck Institute for Marine Microbiology, based on data from the World Ocean Atlas 2009] (B) Oxygen levels at the bottom of the Baltic Sea during 2012 (59). In recent years, low-oxygen areas have expanded to 60,000 km2 as a result of limited exchange, high anthropogenic nutrient loads, and warming waters (59) (red, O2 concentration ≤63 μmol liter−1 [2 mg liter−1]; black, anoxia). [Reproduced from (59)] The open ocean lost an estimated 2%, or 4.8 ± 2.1 petamoles (77 billion metric tons), of its oxygen over the past 50 years (9). Open-ocean oxygen-minimum zones (OMZs) have expanded by an area about the size of the European Union (4.5 million km2, based on water with <70 μmol kg−1 oxygen at 200 m of depth) (10), and the volume of water completely devoid of oxygen (anoxic) has more than quadrupled over the same period (9). Upwelling of oxygen-depleted water has intensified in severity and duration along some coasts, with serious biological consequences (11). Since 1950, more than 500 sites in coastal waters have reported oxygen concentrations ≤2 mg liter−1 (=63 μmol liter−1 or ≅61 μmol kg−1), a threshold often used to delineate hypoxia (3, 12) (Fig. 1A). Fewer than 10% of these systems were known to have hypoxia before 1950. Many more water bodies may be affected, especially in developing nations where available monitoring data can be sparse and inadequately accessed even for waters receiving high levels of untreated human and agricultural waste.
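The three equivalent hypoxia thresholds quoted above (2 mg liter−1, 63 μmol liter−1, ≅61 μmol kg−1) follow from straightforward unit conversion. The short Python sketch below is a numerical check, assuming the molar mass of O2 (32 g per mole) and a typical seawater density of about 1.025 kg per liter.

O2_MOLAR_MASS_G_PER_MOL = 32.0       # molar mass of O2
SEAWATER_DENSITY_KG_PER_L = 1.025    # typical value; varies with temperature and salinity

def mg_per_l_to_umol_per_l(mg_per_l: float) -> float:
    """Convert dissolved O2 from mg/L to micromoles per liter."""
    return mg_per_l / O2_MOLAR_MASS_G_PER_MOL * 1000.0

def umol_per_l_to_umol_per_kg(umol_per_l: float) -> float:
    """Convert micromoles per liter to micromoles per kilogram of seawater."""
    return umol_per_l / SEAWATER_DENSITY_KG_PER_L

umol_l = mg_per_l_to_umol_per_l(2.0)          # ~62.5 umol/L, the "63" quoted above
umol_kg = umol_per_l_to_umol_per_kg(umol_l)   # ~61 umol/kg
print(round(umol_l, 1), round(umol_kg, 1))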
Oxygen continues to decline in some coastal systems despite substantial reductions in nutrient loads, which have improved other water quality metrics (such as levels of chlorophyll a) that are sensitive to nutrient enrichment (13). Oxygen is naturally low or absent where biological oxygen consumption through respiration exceeds the rate of oxygen supplied by physical transport, air-sea fluxes, and photosynthesis for sufficient periods of time. A large variety of such systems exist, including the OMZs of the open ocean, the cores of some mode-water eddies, coastal upwelling zones, deep basins of semi-enclosed seas, deep fjords, and shallow productive waters with restricted circulation (14, 15). Whether natural or anthropogenically driven, however, low oxygen levels and anoxia leave a strong imprint on biogeochemical and ecological processes. Electron acceptors, such as Fe(III) and sulfate, that replace oxygen as conditions become anoxic yield less energy than aerobic respiration and constrain ecosystem energetics (16). Biodiversity, eukaryotic biomass, and energy-intensive ecological interactions such as predation are reduced (17–19), and energy is increasingly transferred to microbes (3, 16). As oxygen depletion becomes more severe, persistent, and widespread, a greater fraction of the ocean is losing its ability to support high-biomass, diverse animal assemblages and provide important ecosystem services. But the paradox is that these areas, sometimes called dead zones, are far from dead. Instead they contribute to some of the world’s most productive fisheries harvested in the adjacent, oxygenated waters (20–22) and host thriving microbial assemblages that utilize a diversity of biogeochemical pathways (16). Eukaryote organisms that use low-oxygen habitats have evolved physiological and behavioral adaptations that enable them to extract, transport, and store sufficient oxygen, maintain aerobic metabolism, and reduce energy demand (23–26). Fishes, for example, adjust ventilation rate, cardiac activity, hemoglobin content, and O2 binding and remodel gill morphology to increase lamellar surface area (27). For some small taxa, including nematodes and polychaetes, high surface area–to–volume ratios enhance diffusion and contribute to hypoxia tolerance (26). Metabolic depression (23, 25, 28) and high H2S tolerance (24) are also key adaptations by organisms to hypoxic and anoxic environments. Causes of oxygen decline Global warming as a cause of oxygen loss in the open ocean The discovery of widespread oxygen loss in the open ocean during the past 50 years depended on repeated hydrographic observations that revealed oxygen declines at locations ranging from the northeast Pacific (29) and northern Atlantic (30) to tropical oceans (2). Greenhouse gas–driven global warming is the likely ultimate cause of this ongoing deoxygenation in many parts of the open ocean (31). For the upper ocean over the period 1958–2015, oxygen and heat content are highly correlated with sharp increases in both deoxygenation and ocean heat content, beginning in the mid-1980s (32). Ocean warming reduces the solubility of oxygen. Decreasing solubility is estimated to account for ~15% of current total global oxygen loss and >50% of the oxygen loss in the upper 1000 m of the ocean (9, 33). Warming also raises metabolic rates, thus accelerating the rate of oxygen consumption.
Therefore, decomposition of sinking particles occurs faster, and remineralization of these particles is shifted toward shallower depths (34), resulting in a spatial redistribution but not necessarily a change in the magnitude of oxygen loss. Intensified stratification may account for the remaining 85% of global ocean oxygen loss by reducing ventilation—the transport of oxygen into the ocean interior—and by affecting the supply of nutrients controlling production of organic matter and its subsequent sinking out of the surface ocean. Warming exerts a direct influence on thermal stratification and indirectly enhances salinity-driven stratification through its effects on ice melt and precipitation. Increased stratification alters the mainly wind-driven circulation in the upper few hundred meters of the ocean and slows the deep overturning circulation (9). Reduced ventilation, which may also be influenced by decadal to multidecadal oscillations in atmospheric forcing patterns (35), has strong subsurface manifestations at relatively shallow ocean depths (100 to 300 m) in the low- to mid-latitude oceans and less pronounced signatures down to a few thousand meters at high latitudes. Oxygen declines closer to shore have also been found in some systems, including the California Current and lower Saint Lawrence Estuary, where the relative strengths of various currents have changed and remineralization has increased (36, 37). There is general agreement between numerical models and observations about the total amount of oxygen loss in the surface ocean (38). There is also consensus that direct solubility effects do not explain the majority of oceanic oxygen decline (31). However, numerical models consistently simulate a decline in the total global ocean oxygen inventory equal to only about half that of the most recent observation-based estimate and also predict different spatial patterns of oxygen decline or, in some cases, increase (9, 31, 39). These discrepancies are most marked in the tropical thermocline (40). This is problematic for predictions of future deoxygenation, as these regions host large open-ocean OMZs, where a further decline in oxygen levels could have large impacts on ecosystems and biogeochemistry (Fig. 2A). It is also unclear how much ocean oxygen decline can be attributed to alterations in ventilation versus respiration. Mechanisms other than greenhouse gas–driven global warming may be at play in the observed ocean oxygen decline that are not well represented in current ocean models. For example, internal oscillations in the climate system, such as the Pacific Decadal Oscillation, affect ventilation processes and, eventually, oxygen distributions (35). Models predict that warming will strengthen winds that favor upwelling and the resulting transport of deeper waters onto upper slope and shelf environments in some coastal areas (41, 42), especially at high latitudes within upwelling systems that form along the eastern boundary of ocean basins (43). The predicted magnitude and direction of change is not uniform, however, either within individual large upwelling systems or among different systems. Upwelling in the southern Humboldt, southern Benguela, and northern Canary Eastern Boundary upwelling systems is predicted to increase in both duration and intensity by the end of the 21st century (43). Where the oxygen content of subsurface source waters declines, upwelling introduces water to the shelf that is both lower in oxygen and higher in CO2.
Along the central Oregon coast of the United States in 2006, for example, anoxic waters upwelled to depths of <50 m within 2 km of shore, persisted for 4 months, and resulted in large-scale mortality of benthic macro-invertebrates (11). There are no prior records of such severe oxygen depletion over the continental shelf or within the OMZ in this area (11). Nutrient enrichment of coastal waters Sewage discharges have been known to deplete oxygen concentrations in estuaries since at least the late 1800s (44), and by the mid-1900s the link to agricultural fertilizer runoff was discussed (45). Nevertheless, the number and severity of hypoxic sites have continued to increase (Fig. 2B). The human population has nearly tripled since 1950 (46). Agricultural production has greatly increased to feed this growing population and meet demands for increased consumption of animal protein, resulting in a 10-fold increase in global fertilizer use over the same period (47). Nitrogen discharges from rivers to coastal waters increased by 43% in just 30 years from 1970 to 2000 (48), with more than three times as much nitrogen derived from agriculture as from sewage (49). Eutrophication occurs when nutrients (primarily N and P) and biomass from human waste and agriculture, as well as N deposition from fossil fuel combustion, stimulate the growth of algae and increase algal biomass. The enhanced primary and secondary production in surface waters increases the delivery rate of degradable organic matter to bottom waters where microbial decomposition by aerobic respiration consumes oxygen. Once oxygen levels are low, behavioral and biogeochemical feedbacks can hinder a return to higher-oxygen conditions (50). For example, burrowing invertebrates that introduce oxygen to sediments die or fail to recruit, and sediment phosphorus is released, fueling additional biological production in the water column and eventual increased oxygen consumption. Coastal systems vary substantially in their susceptibility to developing low oxygen concentrations. Low rates of vertical exchange within the water column reduce rates of oxygen resupply (51), and long water-retention times favor the accumulation of phytoplankton biomass (14) and its eventual subsurface degradation. Chesapeake Bay develops hypoxia and anoxia that persist for several months during late spring through early autumn and cover up to 30% of the system area. In contrast, the nearby Delaware Bay, which has weaker stratification and a shorter retention time, does not develop hypoxia, in spite of similar nutrient loads (52). Manila Bay is adjacent to a megacity and also receives similar loads on an annual basis, but it becomes hypoxic principally during the wet southwest monsoon period, when rainfall increases nutrient loads and stratification (53). Low oxygen in coastal waters and semi-enclosed seas can persist for minutes to thousands of years and may extend over spatial scales ranging from less than one to many thousands of square kilometers. Both local and remote drivers lead to temporal and spatial variations in hypoxia. Local weather can influence oxygen depletion in very shallow water through wind mixing and the effect of cloud cover on photosynthesis (54).
At larger spatial scales, variations in wind direction and speed (55), precipitation and nutrient loads (56), sea surface temperature (57), and nutrient content of water masses transported into bottom layers of stratified coastal systems contribute to interannual and longer-period variations in hypoxic volume, duration, and rate of deoxygenation (14). Climate change in coastal waters Warming is predicted to exacerbate oxygen depletion in many nutrient-enriched coastal systems through mechanisms similar to those of the open ocean: increased intensity and duration of stratification, decreased oxygen solubility, and accelerated respiration (4, 58, 59). The current rate of oxygen decline in coastal areas exceeds that of the open ocean (60), however, likely reflecting the combined effects of increased warming of shallow water and higher concentrations of nutrients. Higher air temperatures can result in earlier onset and longer durations of hypoxia in eutrophic systems through effects on the seasonal timing of stratification and the rate of oxygen decline (58). An ensemble modeling study of the Baltic Sea projects declining oxygen under all but the most aggressive nutrient-reduction plans, owing to increased precipitation and consequent nutrient loads, decreased flux of oxygen from the atmosphere, and increased internal nutrient cycling. Even aggressive nutrient reduction is projected to yield far less benefit under climate change than under current conditions (61). Because of regional variations in the effects of global warming on precipitation and winds, the rate and direction of change in oxygen content is expected to vary among individual coastal water bodies (4, 58). Where precipitation increases, both stratification and nutrient discharges are expected to increase, with the reverse occurring in regions where precipitation decreases. Changes in seasonal patterns of precipitation and rates of evaporation can also be important. Coastal wetlands that remove nutrients before they reach open water are predicted to be lost as sea levels rise, decreasing capacity to remove excess nitrogen, but the rate of wetland inundation and the ability of wetlands to migrate landward will vary. Effects of ocean deoxygenation Oxygen influences biological and biogeochemical processes at their most fundamental level (Fig. 3). As research is conducted in more habitats and using new tools and approaches, the range of effects of deoxygenation that have been identified, and the understanding of the mechanisms behind those effects, has increased substantially. Although 2 mg liter−1 (61 μmol kg−1) is a useful threshold for defining hypoxia when the goal is to quantify the number of systems or the spatial extent of oxygen-depleted waters, a more appropriate approach when considering biological and ecological effects is to simply define hypoxia as oxygen levels sufficiently low to affect key or sensitive processes. Organisms have widely varying oxygen tolerances, even in shallow coastal systems (19). In addition, because temperature affects not only oxygen supply (through its effect on solubility and diffusion) but also the respiratory demand by organisms, oxygen limitation for organisms is better expressed as a critical oxygen partial pressure below which specific organisms exhibit reduced metabolic functions than in terms of oxygen concentration (62, 63). Fig. 3 Life and death at low oxygen levels. (A) Animals using low-oxygen habitats exhibit a range of physiological, morphological, and behavioral adaptations. 
For example, terebellid worms (Neoamphitrite sp., Annelida) with large branchiae and high hemoglobin levels can survive in the extremely low oxygen levels found at 400 m depth in the Costa Rica Canyon. (B) Fish kills in aquaculture pens in Bolinao, Philippines, had major economic and health consequences for the local population. (C) The ctenophore Mnemiopsis leidyi is more tolerant of low oxygen than trophically equivalent fishes in its native habitat in the Chesapeake Bay and can use hypoxic areas from which fish are excluded. (D) A low-oxygen event caused extensive mortality of corals and associated organisms in Bocas del Toro, Panama. These events may be a more important source of mortality in coral reefs than previously assumed. Biological responses Ocean deoxygenation influences life processes from genes to emergent properties of ecosystems (Fig. 4). All obligate aerobic organisms have limits to the severity or duration of oxygen depletion for which they can compensate. Low oxygen levels can reduce survival and growth and alter behavior of individual organisms (3, 4, 26, 64). Reproduction can be impaired by reduced energy allocation to gamete production, as well as interference with gametogenesis, neuroendocrine function, and hormone production, and can ultimately affect populations and fisheries (65–67). Exposure to hypoxia can trigger epigenetic changes expressed in future generations, even if these generations are not exposed to hypoxia (68). Brief, repeated exposure to low oxygen can alter immune responses, increase disease, and reduce growth (69, 70). Fig. 4 Oxygen exerts a strong control over biological and biogeochemical processes in the open ocean and coastal waters. Whether oxygen patterns change over space, as with increasing depth, or over time, as the effects of nutrients and warming become more pronounced, animal diversity, biomass, and productivity decline with decreasing levels of oxygen. At the edge of low-oxygen zones, where nutrients are high and predators and their prey are concentrated into an oxygenated habitat, productivity can be very high, but even brief exposures to low oxygen levels can have strong negative effects. (Top) Well-oxygenated coral reef with abundant fish and invertebrate assemblages. (Middle) Low-oxygen event in Mobile Bay, United States, in which crabs and fish crowd into extreme shallows where oxygen levels are highest. (Bottom) Anoxic mud devoid of macrofauna. In both oceanic and coastal systems, vertical and horizontal distributions of organisms follow oxygen gradients and discontinuities, and migratory behavior is constrained in response to both oxygen availability and the ways that oxygen alters the distributions of predators and prey (64, 71). Because oxygen tolerances and behavioral responses to low oxygen levels vary among species, taxa, trophic groups, and with mobility (19), encounter rates, feeding opportunities, and the structure of marine food webs change.
Movement to avoid low oxygen can result in lost feeding opportunities on low-oxygen–tolerant prey and can increase energy expended in swimming (19, 70). Hypoxia effects on vision, a function that is highly oxygen intensive, may contribute to these constraints, in part through changing light requirements (72). The presence and expansion of low water-column oxygen reduces diel migration depths, compressing vertical habitat and shoaling distributions of fishery species and their prey (73–75). For pelagic species, habitat compression can increase vulnerability to predation as animals are restricted to shallower, better-lit waters and can increase vulnerability to fishing by predictably aggregating individuals at shallower or lateral edges of low-oxygen zones (71, 76–78). For demersal species, hypoxia-induced habitat compression can lead to crowding and increased competition for prey (73), potentially resulting in decreased body condition of important fishery species such as Baltic cod (79). In contrast, migration into and out of hypoxic waters can allow some animals to utilize oxygen-depleted habitats for predator avoidance or to feed on hypoxia-tolerant prey, and then to return to more highly oxygenated depths or locations (23, 80). Habitat compression may also enhance trophic efficiency in upwelling regions, contributing to their extraordinary fish productivity (20, 21). Some hypoxia-tolerant fish and invertebrate species expand their ranges as OMZs expand (28, 81), and their predators and competitors are excluded. Multiple stressors Deoxygenation is mechanistically linked to other ocean stressors, including warming (82) and acidification (83), and thus it is often their combined effects that shape marine ecosystems (84, 85). Because hypoxia limits energy acquisition, it is especially likely to exacerbate effects of co-occurring stressors that increase energy demands (65). The thermal tolerance of ectotherms is limited by their capacity to meet the oxygen demands of aerobic metabolism (62). Increased temperature elevates oxygen demand while simultaneously reducing oxygen supply, thus expanding the area of the oceans and coastal waters where oxygen is insufficient. Through this mechanism, ocean warming is predicted to result in shifts in the distribution of fishes and invertebrates poleward by tens to hundreds of kilometers per decade, shifts into deeper waters, and local extinctions (63, 86). Models project that warming combined with even modest O2 declines (<10 μmol kg−1) can cause declines in important fishery species that are sensitive to low oxygen levels (87). Physiological oxygen limitation in warming waters is also predicted to reduce maximum sizes of many fish species, including some that support important fisheries (88). Increased respiration that causes deoxygenation also amplifies the problem of ocean acidification because the by-product of aerobic respiration is CO2. Temporal and spatial variations in oxygen in subpycnocline and shallow eutrophic waters are accompanied by correlated fluctuations in CO2. In highly productive estuarine, coastal, and upwelling regions, oxygen concentrations and pH can exhibit extreme fluctuations episodically and on diel, tidal, lunar, and seasonal cycles (83, 89). Elevated CO2 can sometimes decrease the oxygen affinity of respiratory proteins (90), reduce tolerance to low oxygen by increasing the metabolic cost of maintaining acid-base balance (91), and reduce responses to low oxygen that would otherwise increase survival (92).
Neither the occurrence nor the magnitude of cases in which acidification exacerbates the effects of low oxygen are currently predictable (83). Other covarying factors, such as nutrients and fisheries dynamics, can mask or compensate for effects of deoxygenation, complicating management decisions. Fisheries management is designed to adjust effort and catch as population abundance changes (93). Thus, direct and indirect effects of deoxygenation on a harvested population may not be easily traceable in monitoring or catch data because management actions adjust for the loss in abundance. In addition, high nutrient loads can stimulate production in a habitat that remains well oxygenated, at least partially offsetting lost production within a hypoxic habitat (52). Total landings of finfish, cephalopods, and large mobile decapods are positively correlated with nitrogen loads (22), in spite of hypoxia in bottom waters (52). The conflation of habitat loss and nutrient enrichment is prominent in upwelling zones, as well as eutrophic coastal waters. Increased upwelling of nutrient-rich, oxygen-depleted waters from the 1820s to the 20th century has increased primary and fish productivity off the coast of Peru, for example (94). However, there are limits to the extent of hypoxia that can form before total system-wide fishery landings decline. In addition, individual species dependent on
Ocean dead zones, which contain no oxygen, have become four times larger since 1950, while the number of areas with very low oxygen close to coasts has increased tenfold, according to the first comprehensive analysis of these areas. Most marine species cannot exist in such conditions, and the continuation of such trends would result in mass extinction, endangering the livelihood of millions of people. Large-scale deoxygenation is the result of climate change caused by burning fossil fuels; as waters warm, they contain less oxygen.
The second phase of a development by Anwyl and Redrow Homes, providing 151 homes off Middlewich Road in Sandbach, is set to be discussed by Cheshire East’s Southern planning committee next week. Anwyl and Redrow have made two applications on the site: a reserved matters application for 126 homes, and a full planning application for 25 houses at the site’s southern end. The wider 39-acre site was granted outline planning permission for up to 280 homes, alongside public open space and highways improvements, in 2012, and a reserved matters planning application for the first phase of 154 houses was approved in 2015. The latest reserved matters application includes a mix of 74 four-bedroom homes; 26 three-beds; 21 two-beds; four one-beds; and one five-bed house. The four-bed units are expected to vary in price between £264,000 and £475,000. Recommending the application for approval, Cheshire East planning officers said the proposals would provide “much needed affordable housing provision” and “would help in the Council’s delivery of five-year housing land supply”. Cheshire East planning officers have stipulated that 30% of the homes – around 38 units – should be provided as affordable homes under the application’s Section 106 agreement. A contribution of £514,000 towards education services was already secured as part of the outline planning permission, granted in 2012. The recommendation to approve has been put forward despite objections from Sandbach Town Council, which argued the housing was “far too dense” in the second phase, and that it offered “no green space of any significance”. The Town Council also criticised the application for not providing any bungalows “for older residents who wish to downsize”. However, planning officers said the development’s open space was already covered under the outline application, which provides a six-acre park on the site. Cheshire East planners also recommended the full planning application, covering 25 homes, for approval. The homes on designated open countryside land are in addition to the 280 houses approved as part of 2012’s outline planning application. These will provide a mix of 17 four-bed homes; four three-beds; three two-beds; and a single five-bedroom house, with prices for a four-bed house expected to be in a similar range to those for the wider development. Planning officers said the additional homes would “not have a detrimental impact upon residential amenity” in the area, and recommended the scheme for approval, subject to agreeing affordable homes on the site, as well as a £120,000 provision towards local education provision. The professional team for the development includes Astle Planning & Design.
The second phase of a development by UK housebuilder Redrow Homes and Anwyl in Sandbach, Cheshire is set to be examined by Cheshire East’s Southern planning committee. The latest reserved matters application, covering 126 homes, has been recommended for approval by Cheshire East planning officers, but Sandbach town council has complained about the lack of green space and inadequate provision for downsizers.
British wind farms generated more electricity than coal plants on more than 75% of days this year, an analysis of energy figures has shown. Solar also outperformed coal more than half the time, the data provided by website MyGridGB revealed. Overall, renewables provided more power than coal plants on 315 days in 2017, figures up to 12 December showed. Wind beat coal on 263 days, and solar outperformed the fossil fuel on 180 days. Between April and August inclusive, coal generation exceeded solar on only 10 days. In total, renewables generated more than three times the amount of electricity as coal over the year to 12 December. The figures – provided by BM Reports and Sheffield University – reflect a year in which a number of green records have been set for the power sector, including the first full day without any coal power in the system, record solar generation and tumbling prices for new offshore wind farms. The government has committed to phasing out coal power that does not have technology to capture and permanently store its carbon emissions by 2025, as part of efforts to meet targets on greenhouse gases. The focus now turns to gas, with daily output from wind outstripping gas on only two days of the year, and renewables overall – including wind, solar, biomass and hydropower – beating the fossil fuel on just 23 days. Dr Andrew Crossland from MyGridGB and the Durham Energy Institute said: “The government has focused on reducing coal use which now supplies less than 7% of our electricity. However, if we continue to use gas at the rate that we do, then Britain will miss carbon targets and be dangerously exposed to supply and price risks in the international gas markets. “Clearly, refreshed government support for low-carbon alternatives is now needed to avoid price and supply shocks for our heat and electricity supplies.” Emma Pinchbeck, executive director at industry body RenewableUK, said the decision to phase out coal was being made possible by a homegrown renewables industry “coming into its own”. She added: “We want to see more boldness from the Conservative government. In 2018, the government should move to allow onshore wind, now the cheapest form of power for consumers, to be developed in parts of the UK where it is wanted, and agree an ambitious sector deal with the offshore wind industry. “The new year could be the first in a golden age for UK renewables.”
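For readers who want to reproduce this kind of tally, the Python sketch below outlines the day-count comparison described above. It assumes a hypothetical CSV of daily generation totals with columns named date, wind, solar, biomass, hydro, coal and gas; it is not MyGridGB's actual data or code.

import pandas as pd

# Hypothetical CSV of daily generation totals (GWh); not MyGridGB's actual format.
df = pd.read_csv("daily_generation_2017.csv", parse_dates=["date"])

renewables = df[["wind", "solar", "biomass", "hydro"]].sum(axis=1)

print("Days wind beat coal:      ", int((df["wind"] > df["coal"]).sum()))
print("Days solar beat coal:     ", int((df["solar"] > df["coal"]).sum()))
print("Days renewables beat coal:", int((renewables > df["coal"]).sum()))
print("Days renewables beat gas: ", int((renewables > df["gas"]).sum()))
print("Renewables vs coal, year: ", round(renewables.sum() / df["coal"].sum(), 1), "x")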
The UK had its greenest ever year in electricity production in 2017, breaking 13 different renewable energy records. Figures from BM Reports and Sheffield University showed that renewables produced more electricity than coal power stations on 315 days last year, and April saw the first day with no coal-fired power used in the UK. Coal now supplies less than 7% of the UK's electricity, and the government has a target to phase it out by 2025. The UK has halved carbon emissions in electricity production since 2012, and the increase in renewable power is expected to continue in 2018.
"Coral reefs cover less than 0.1% of the world's oceans and yet they house a third of all marine biodiversity. And the oceans cover 70% of our planet so they're housing a huge amount of the biodiversity of our planet. So, anyone who cares about extinction, about biodiversity, needs to worry about the future of coral reefs."
Tropical coral reefs across the world, on which millions of livelihoods depend and which are home to a third of all marine biodiversity, are under threat from repeated deadly bouts of warmer water, according to new research. The study of 100 reefs reveals that the interval between bleaching events, when unusually warm water causes coral to eject algae with often fatal consequences, has fallen from once in every 25-30 years in the 1980s, to once in every six years. The researchers have called for greater efforts to reduce the emissions of greenhouse gases to combat the warming.
The Indian government's policy think tank, Niti Aayog, is testing the waters to employ blockchain technology in education, health and agriculture, several media reports stated. The top government think tank is developing a proof of concept to take advantage of the new technology in key sectors, a senior government official told The Economic Times on condition of anonymity. The think tank, along with blockchain startup Proffer, which was founded by MIT and Harvard graduates, held a blockchain hackathon from 10 November to 13 November 2017 at IIT Delhi, a report in YourStory said in November last year. About 1,900 students from the IITs, MIT, Harvard University, UC Berkeley College of Engineering, and top engineering institutions around the world participated in the event. AgroChain, a blockchain-based marketplace that helps farmers and consumers through co-operative farming, bagged the first prize at the competition, the report added. The marketplace was developed by students from the Indian Institute of Information Technology and Management-Kerala (IIITM-K). Niti Aayog has also been working on developing a country-wide blockchain network called IndiaChain, which looks to reduce corruption and fraud and maximise the transparency of transactions, a November 2017 report on technology news website Factor Daily stated. The think tank is also expected to connect the blockchain infrastructure to IndiaStack, the country's digital identification database, the report added. Blockchain technology uses cryptographic tools to create an open and decentralised body of data, which can include bank transactions and the like. The data record can be verified by anyone involved in the transaction, and information can be tracked via a secure network.
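The closing description of blockchain as an open, hash-verified record can be made concrete with a toy example. The Python sketch below is purely illustrative and assumes nothing about IndiaChain's actual design; the block fields and record strings are invented for the example.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def new_block(prev_hash: str, records: list) -> dict:
    """Create a block that commits to the previous block via its hash."""
    return {"timestamp": time.time(), "prev_hash": prev_hash, "records": records}

# Build a tiny chain: a genesis block, then one block of example records.
genesis = new_block("0" * 64, ["genesis"])
block_1 = new_block(block_hash(genesis), ["land record #1", "degree certificate #42"])

# Anyone holding the chain can verify the linkage; tampering with the
# genesis block would change its hash and break block_1's prev_hash.
assert block_1["prev_hash"] == block_hash(genesis)
print(block_hash(block_1))
```

The point of the exercise is only that each record batch commits to everything before it, which is what makes the data "verifiable by anyone involved in the transaction".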
India is testing blockchain applications in education, health and agriculture, among other sectors of the economy, and is working on a proof-of-concept platform, according to an anonymous senior government official. Government think tank Niti Aayog co-hosted a blockchain hackathon alongside start-up Proffer in November. Reports the same month revealed the think tank was also developing a fraud-resistant transaction platform called IndiaChain, which is expected to be linked to national digital identification database IndiaStack.
China’s first space lab will crash down to earth in the coming months, but don’t worry: The odds of any debris hitting a person are astronomically small. The Aerospace Corporation, a California nonprofit, estimates the Tiangong-1 will enter the atmosphere in mid-March. Sent up in 2011, the “Heavenly Palace,” as it’s also known, has witnessed a number of milestones as China races to become a space superpower. For example, in 2012 the country sent its first female astronaut, Liu Yang, as part of the team behind the first successful manual docking with the lab. The station was designed with a two-year lifespan, but authorities extended its service life by two and a half years to conduct more experiments. In September 2016, China’s Manned Space Engineering Office announced that the Tiangong-1 would re-enter the atmosphere around the latter half of 2017, which many interpreted to mean the lab had fallen into an uncontrolled orbit. (Satellite trackers suggest the lab has been that way since at least June 2016, notes the Aerospace Corporation.) The agency said most of the lab would burn up during its descent, adding that it would release its updated forecasts of the descent, internationally if necessary. Although it’s hard to predict where surviving pieces might land, based on Tiangong-1’s inclination, the Aerospace Corporation estimates the station will re-enter somewhere between the latitudes of 43° N and 43° S, an area largely covered by ocean, but also traversing countries including the US, Brazil, and China itself. According to the latest available trajectory data (link in Chinese), the Tiangong-1 is orbiting at an average height of 287 km (178 miles), about 100 km lower than it was in September 2016. Though it’s not uncommon for spacecraft and satellites to re-enter the atmosphere, rarely does it lead to injury or destruction of property. The largest manmade object to re-enter was Russia’s Mir space station, with surviving fragments falling into the Pacific east of New Zealand in 2001. Whereas the Tiangong-1 weighs 8,500 kg (18,739 lbs), the Mir weighed 120,000 kg. The Mir, though, was still under control when it entered the atmosphere. Letting objects enter uncontrolled is not considered a best practice. One (still remote) danger from the Tiangong-1, according to the Aerospace Corporation, is that someone will find and pick up a piece of its debris that’s covered in a corrosive substance. While the odds of getting hit by space debris are absurdly remote, it happened to at least one woman. In 1997 Lottie Williams was strolling through a park in Tulsa, Oklahoma when a piece of light metal measuring about 6 inches (15.2 cm) glanced off her shoulder. NASA later confirmed the timing and location were consistent with the re-entry and breakup of a second-stage Delta rocket, the main wreckage of which was found a few hundred miles away in Texas. Williams wasn’t injured, but she’s thought to be the only person ever hit by space debris. You won’t likely become the second. Correction: An earlier version of this story stated the Mir space station re-entered the atmosphere in 2013 instead of 2001.
China's space lab Tiangong-1 is estimated to crash to Earth in March, with the Aerospace Corporation predicting it will re-enter between the latitudes of 43°N and 43°S. Although that band is largely covered by ocean, it also crosses countries including the US, Brazil and China. China said it will release updated forecasts of the craft's descent; the expectation is that some debris will hit Earth, while the rest will burn up on re-entry. As we previously noted, many believe China's space authority lost control of the 8.5-tonne lab and is allowing it to enter Earth's atmosphere "naturally".
In Nagoya, Japan, a city that once held an entire museum dedicated to robotics, a hospital will soon add robots developed by Toyota to its medical staff. No, they won't be scrubbing in for surgery: In February, the Nagoya University Hospital will deploy four bots to ferry medicine and test samples between floors for a year. The robots are essentially mobile refrigerators with a 90-liter capacity that rely on radar and cameras to zoom through the hospital. Should they run into humans, they're programmed to dodge them or politely voice 'Excuse me, please let me pass,' according to The Asahi Shimbun. Staff can summon the robots and assign a destination for their medical payload using a tablet. Nagoya built the robot system in partnership with Toyota Industries, a subsidiary of the automaker that produces auto parts and electronics. The trial will run the robots between 5pm and 8am, during the night shift when fewer people are walking the floors. Should the trial go well, the facility may choose to deploy more units.
A hospital in Japan is deploying robots to deliver supplies around the building. The Nagoya University Hospital is to use four of the devices during the night shift from 5pm to 8am, when there are fewer people using the corridors. The robots have been developed in partnership with Toyota Industries, and use radar and cameras to guide themselves around the hospital. They are essentially mobile refrigerators for carrying medical supplies, and staff can use a tablet device to summon them and set their destination.
First of all, my role is the Global Head of Product at Datscha (I was one of the founding employees), working with everything from strategy and business development down to what bugs [we should focus on] in the upcoming sprint. Datscha has been around for 20-plus years, but we still keep a very outspoken startup mentality and an effective organization, enabling us to be in three markets: Sweden, Finland and the UK, with only 45 employees. In addition to my role at Datscha, I’m also a Partner in Stronghold Invest (the sole owner of Datscha), where we own, among others, the largest property consulting firm in the Nordics (Newsec, with 2,000 employees and 31 million sqm under management) and the most successful private equity real estate firm in Northern Europe, with real estate assets under management of approximately €3.5 billion. Furthermore, Stronghold is an active #PropTech investor.
The commercial property market will become more reliant on data and more transparent, according to the global head of product at Swedish company Datscha, Magnus Svantegård, who was one of the founding employees of the commercial property company over 20 years ago. He said the current handling of properties as investments was based too much on gut feeling and small networks of people. Datscha has 45 employees in Sweden, Finland and the UK, but retains a "start-up mentality", with a focus on property technology, according to Svantegård.
Land fit for future fulfillment centers for the likes of Amazon and Walmart saw huge spikes in prices last year, according to real estate services firm CBRE. In a trend largely stemming from the growth of e-commerce players across the U.S., some plots of land now cost twice the amount they did a year ago, the group found. This is especially true in major markets, including Atlanta and Houston. In surveying 10 U.S. markets, CBRE found the average price for "large industrial parcels" (50 to 100 acres) now sits at more than $100,000 per acre, up from about $50,000 a year ago. Industrial land plots of five to 10 acres, which typically house infill distribution centers for completing "last-mile" deliveries, saw their prices soar to more than $250,000 per acre by the end of 2017, up from roughly $200,000 a year ago, according to CBRE. Located in more bustling metropolitan settings, these warehouses must help retailers serve consumers closer to their homes. To be sure, industry experts say that despite an uptick in construction of late, there's still a long way to go before supply aligns with demand.
The average price for large industrial plots of land of between 50 and 100 acres doubled last year from $50,000 to $100,000 per acre, thanks to increased demand for data hubs and distribution centres, according to a survey by CBRE. In an examination of 10 US markets, plots of between five and 10 acres, suitable for "last-mile" depots, cost $250,000 per acre by the end of last year, an increase of $50,000 on 2016. David Egan, the global head of CBRE's Industrial & Logistics Research division, said that demand is not likely to drop in the near future.
London's skyline is rapidly changing, as an image from the City of London Corp. forecasting the City skyline in 2026 shows. In the shorter term, 2018 is set to be a pivotal year for the London office market. Supply of new space will begin to drop after 2017's cyclical peak, according to Deloitte Real Estate, and 44% of the space set to be completed this year is already leased. But with Brexit negotiations at their most delicate, demand will be at its most skittish. Here are the five biggest office schemes opening in 2018 and who is occupying them, according to Deloitte. In terms of lettings, some are doing significantly better than others. 70 Farringdon St. — Goldman's new HQ: Goldman Sachs broke ground on its 825K SF London headquarters before the U.K. voted to leave the EU. As Brexit has been negotiated, there has been speculation Goldman would move staff to Frankfurt and thus not occupy the entire building — especially after Chief Executive Lloyd Blankfein started trolling the U.K. government on Twitter. There is no sign of it looking to sublease any space yet ahead of the building completing in the third quarter. The International Quarter — the new home of the Financial Conduct Authority: The Financial Conduct Authority pre-let 425K SF of the 515K SF building at the International Quarter in Stratford. Lendlease and LCR are delivering the scheme in the middle of 2018. The FCA's pre-let in building S5 convinced Deutsche Asset Management to pay £370M for a building in what is still an emerging office location. 10 Fenchurch Ave. — a new HQ for M&G Investments: M&G, the investment division of insurer Prudential, in 2014 signed for 11 of the 13 floors at the 398K SF 10 Fenchurch Ave., which is being developed by Greycoat and CORE on behalf of Italian insurance company Generali. The Scalpel — two-thirds still to be leased: The Scalpel at 52 Lime St. in the City is being built by the development arm of U.S. insurance company WR Berkley. The company will occupy 81K SF of the 387K SF building, and financial and insurance firms Axis and BPL have taken space, but 61% of the building is unlet. The 35-storey building is scheduled to complete in the second quarter.
Five major London office projects are due to open this year. Goldman Sachs's 825,000 sq ft London headquarters at 70 Farringdon Street was the subject of subletting speculation after rumours circulated that the firm would move staff to Frankfurt post-Brexit, while the International Quarter in Stratford represents a location gamble for Deutsche Asset Management. M&G Investments has leased 11 of the 13 floors at 10 Fenchurch Avenue. However, The Scalpel at 52 Lime Street is barely more than one-third leased and 70 St Mary Axe, also known as The Can of Ham, has yet to secure a tenant.
Xi'an is the latest Chinese city to accept Alipay on its subway system, according to local reports. The system started to accept Alipay as of Jan. 1. As part of the initiative, riders can participate in a program meant to encourage more "green" ways of travel such as public transit. Once an Alipay user accumulates a certain amount of green "energy" from using the mobile wallet to pay for subway fares, Alipay's partners, such as the Alxa SEE foundation, will plant a real tree in areas suffering from desertification upon the user's request. Alipay is now accepted on public transport in more than 30 Chinese cities, including Hangzhou, Wuhan, Tianjin, Qingdao and Guangzhou. Zhengzhou became the first Chinese city to adopt mobile payments in its subway system in September, followed by Beijing and Shanghai.
Chinese payment system Alipay is now accepted on public transport in more than 30 cities, after Xi'an's subway became the latest addition this month. The scheme also includes an initiative to plant trees in regions suffering desertification, with Alipay users accumulating 'green' points every time they pay for subway travel using the wallet. Zhengzhou was the first city to accept mobile payments on its subway system last year, followed by Beijing and Shanghai.
The leading Chinese messaging app said it doesn’t store users’ chat history, after a top businessman said WeChat was ‘watching’ users. WeChat, China’s most popular messaging application, has denied “storing” users’ messages, following accusations by one of the country’s top businessmen that the Tencent Holdings-owned firm was spying on its users. “WeChat does not store any users’ chat history. That is only stored in users’ mobiles, computers and other terminals,” WeChat said in a post on the platform. The statement comes after Li Shufu, chairman of Geely Holdings, which owns the worldwide Volvo and Lotus car brands, was quoted by local media on Monday as saying Tencent chairman Ma Huateng “must be watching all our WeChats every day”. Geely Holdings is one of China’s largest car manufacturers, and one of the few major companies without ties to the country’s government. It has owned Volvo since 2010 and British taxi maker The London Electric Vehicle Company since 2013, and took a majority stake in British sports car maker Lotus Cars last year. ‘Misunderstanding’ In its carefully worded response, WeChat said Li’s remarks were the result of a “misunderstanding”. “WeChat will not use any content from user chats for big data analysis,” the firm said in its post. “Because of WeChat’s technical model that does not store or analyse user chats, the rumour that ‘we are watching your WeChat everyday’ is pure misunderstanding.” WeChat, like all social media firms operating in China, is legally required to censor public posts the country’s Communist Party designates as illegal, and its privacy policy says it may need to retain and disclose users’ information in response to government or law enforcement requests. In a 2016 report, Amnesty International ranked Tencent zero out of 100 on various privacy criteria, noting it was the only company on the list that “has not stated publicly that it will not grant government requests to access encrypted messages by building a ‘backdoor'”. Cyber laws Tencent is the only Chinese company on Amnesty’s list, which also includes Japan’s Viber and Line and South Korea’s Kakao, as well as services from US-based companies such as Facebook, Apple, Telegram and Google. Last September China’s internet regulator announced a new rule making chat group administrators and companies accountable for breaches of content laws. The regulator also fined firms including Tencent, Baidu and Weibo for censorship lapses and demanded they improve content auditing measures. Last June China brought into force the restrictive Cyber Security Law (CSL), which mandates certain companies to hold data within the country and to undergo on-site security reviews.
WeChat has said a claim that the company was storing users' chat history was a "misunderstanding". In a blog post, WeChat said it "will not use any content from user chats for big data analysis". The comments followed media quotes from Li Shufu, chairman of Geely Holdings, who said Tencent chairman Ma Huateng "must be watching all our WeChats every day". Tencent scored 0 out of 100 for various privacy issues in a 2016 report by Amnesty International.
From a tech perspective, Bitcoin seems to be just getting started: 2018 promises to be the year that a number of highly anticipated projects are either launched or adopted. In many ways, 2017 was Bitcoin’s best year yet. Most obviously, increased adoption made the pioneering cryptocurrency’s exchange rate skyrocket from under $1000 to well over 10 times that value. But from a tech perspective, things seem to be just getting started: 2018 promises to be the year that a number of highly anticipated projects are either launched or adopted. Here’s a brief overview of some of the most promising upcoming technological developments to keep an eye on in the new year. Cheaper Transactions with Segregated Witness and a New Address Format Segregated Witness (SegWit) was one of Bitcoin’s biggest — if not the biggest — protocol upgrade to date. Activated in August 2017, it fixed the long-standing malleability bug, in turn better enabling second-layer protocols. Additionally, SegWit replaced Bitcoin’s block size limit with a block weight limit, allowing for increased transaction throughput on the network, thereby lowering fees per transaction. However, adoption of the upgrade has been off to a relatively slow start. While some wallets and services are utilizing the added block space offered by SegWit, many others are not yet doing so. This means that, while Bitcoin is technically capable of supporting between two and four megabytes worth of transactions per ten minutes, it barely exceeds 1.1 megabytes. This is set to change in 2018. For one, the Bitcoin Core wallet interface will allow users to accept and send SegWit transactions. Bitcoin Core 0.16, scheduled for May 2018 (though this may be moved forward), will most likely realize this through a new address format known as “bech32,” which also has some technical advantages that limit risks and mistakes (for example, those caused by typos). “To spend coins from the P2SH format currently used for SegWit, users need to reveal a redeem script in the transaction,” Bitcoin Core and Blockstream developer Dr. Pieter Wuille, who also co-designed the bech32 address format, told Bitcoin Magazine. “With native SegWit outputs this is no longer necessary, which means transactions take up less data. Recipients of SegWit transactions will be able to spend these coins at a lower cost.” Perhaps even more importantly, several major Bitcoin services — like Coinbase — plan to upgrade to SegWit in 2018 as well. Since such services account for a large chunk of all transactions on the Bitcoin network, this could significantly decrease network congestion, thereby decreasing average transaction fees and confirmation times, even for those who do not use these services. The Lightning Network Rolling Out on Bitcoin’s Mainnet While further SegWit adoption should provide immediate relief of fee pressure and confirmation times, truly meaningful long-term scalability will likely be achieved with second-layer solutions built on top of Bitcoin’s blockchain. One of the most highly anticipated solutions in this regard — especially for lower value transactions — is the lightning network. This overlay network, first proposed by Joseph Poon and Tadge Dryja in 2015, promises to enable near-free transactions and instant confirmations, all while leveraging Bitcoin’s security. The solution has been under active development for about two years now, with major efforts by ACINQ, Blockstream and Lightning Labs.
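The block weight rule mentioned above is what makes SegWit transactions cheaper: witness data is discounted relative to other transaction data. Under BIP141, weight is the non-witness (base) size times three plus the total size, and fees are commonly quoted per virtual byte, i.e. weight divided by four, rounded up. The Python sketch below walks through that arithmetic with hypothetical byte counts chosen only for illustration.

```python
import math

def tx_weight(base_size: int, total_size: int) -> int:
    """BIP141: weight = non-witness (base) size * 3 + total serialized size."""
    return base_size * 3 + total_size

def virtual_size(weight: int) -> int:
    """Fees are commonly quoted per virtual byte: weight / 4, rounded up."""
    return math.ceil(weight / 4)

# Hypothetical byte counts, for illustration only.
legacy_base, legacy_total = 250, 250   # no witness data
segwit_base, segwit_total = 150, 260   # signatures moved into the witness

for label, base, total in [("legacy", legacy_base, legacy_total),
                           ("segwit", segwit_base, segwit_total)]:
    weight = tx_weight(base, total)
    print(f"{label}: weight={weight}, vsize={virtual_size(weight)} vbytes")

# At the same fee rate per vbyte, the SegWit spender pays less,
# even though its total serialized size is slightly larger.
```

The same discount is why a block can carry "between two and four megabytes worth of transactions" once wallets actually move their signature data into the witness.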
Progress on the scaling layer has been significant all throughout 2017, with early software releases of different but compatible software implementations, usable wallet interfaces and test transactions happening both on Bitcoin’s testnet and even on Bitcoin’s mainnet on a regular basis now. “I'd say we have solved the main technical problems and have a relatively good idea on how to improve on the current system,” Christian Decker, lightning developer at Blockstream, told Bitcoin Magazine. “One last hurdle that's worth mentioning is the network topology: We'd like to steer the network formation to be as decentralized as possible.” Given the current state of development, adoption of the lightning network should only increase throughout 2018 — not just among developers, but increasingly among end users as well. “Integration and testing will be the next major step forward,” Lightning Labs CEO Elizabeth Stark agreed, noting: “Some exchanges and wallets are already working on it.” Increased Privacy Through TumbleBit and ZeroLink While it is sometimes misrepresented as such, Bitcoin is not really private right now. All transactions are included in the public blockchain for anyone to see, and transaction data analysis can reveal a lot about who owns what, who transacts with whom and more. While there are solutions available to increase privacy right now — like straightforward bitcoin mixers — these usually have significant drawbacks: They often require trusted parties or have privacy leaks. This situation could be improved significantly in 2018. Two of the most promising projects in this domain — TumbleBit and ZeroLink — are both getting close to mainnet deployment. TumbleBit was first proposed in 2016 by a group of researchers led by Ethan Heilman. It is essentially a coin-mixing protocol that uses a tumbler to create payment channels from all participants to all participants in a single mixing session. Everyone effectively receives different bitcoins than what they started with, breaking the trail of ownership for all. And importantly, TumbleBit utilizes clever cryptographic tricks to ensure that the tumbler can’t establish a link between users either. An initial implementation of the TumbleBit protocol was coded by NBitcoin developer Nicolas Dorier in early 2017. His work was picked up by Ádám Ficsór as well as other developers, and blockchain platform Stratis announced it would implement the technology in its upcoming Breeze wallet, which also supports Bitcoin, by March 2018. Recently, in mid-December 2017, Stratis released TumbleBit integration in this wallet in beta. The other promising solution, ZeroLink, is an older concept: it was first proposed (not under the same name) by Bitcoin Core contributor and Blockstream CTO Gregory Maxwell, back in 2013. Not unlike TumbleBit, ZeroLink utilizes a central server to connect all users but without being able to link their transactions. As opposed to TumbleBit, however, it creates a single (CoinJoin) transaction between all participants, which makes the solution significantly cheaper. This idea seemed to have been forgotten for some years until Ficsór (indeed, the same Ficsór that worked on TumbleBit) rediscovered it earlier this year. He switched his efforts from TumbleBit to a new ZeroLink project and has since finished an initial ZeroLink implementation.
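Conceptually, the CoinJoin transaction ZeroLink coordinates is a single transaction that combines many users' inputs and pays each participant an equal-valued output at a fresh address, so an outside observer cannot match outputs to inputs. The toy Python sketch below shows only that structure; fees, change outputs, blind output registration and signing, which the real protocol handles, are deliberately left out, and all names and values are invented.

```python
import random

def build_coinjoin(user_inputs: dict, denomination: int) -> dict:
    """
    Toy CoinJoin: each user contributes one input and receives one
    equal-valued output at a fresh address. Fees, change and signatures
    are deliberately omitted.
    """
    inputs = [{"user": user, "utxo": utxo} for user, utxo in user_inputs.items()]
    outputs = [{"address": f"fresh_addr_{i}", "value": denomination}
               for i, _ in enumerate(user_inputs)]
    random.shuffle(inputs)
    random.shuffle(outputs)  # equal values plus shuffling break the input/output link
    return {"inputs": inputs, "outputs": outputs}

tx = build_coinjoin({"alice": "utxo_a", "bob": "utxo_b", "carol": "utxo_c"},
                    denomination=10_000_000)  # 0.1 BTC in satoshis, hypothetical
print(len(tx["inputs"]), "inputs,", len(tx["outputs"]), "equal-valued outputs")
```

Because all participants share one transaction rather than a web of payment channels, the on-chain footprint, and hence the cost per participant, is smaller than in a TumbleBit-style mix.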
Ficsór recently ran some tests with his ZeroLink implementation, and while results showed that his implementation needs improvement, Ficsór considers it likely that it will be properly usable within months. “I could throw it out in the open right now and let people mix,” he told Bitcoin Magazine. "There is no risk of money loss at any point during the mix, and many mixing rounds were executing correctly. It is just some users would encounter some bugs I am not comfortable with fixing on the fly.” More Sidechains, More Adoption Sidechains are alternative blockchains but with coins pegged one-to-one to specific bitcoins. This allows users to effectively “move” bitcoins to chains that operate under entirely different rules and means that Bitcoin and all its sidechains only use the “original” 21 million coins embedded in the Bitcoin protocol. A sidechain could then, for example, allow for faster confirmations, or more privacy, or extended smart contract capabilities, or just about anything else that altcoins are used for today. The concept was first proposed by Blockstream CEO Dr. Adam Back and others back in 2014; it formed the basis around which Blockstream was first founded. Blockstream itself also launched the Liquid sidechain, which allows for instant transactions between — in particular — Bitcoin exchanges. Liquid is currently still in beta but could see its 1.0 release in 2018. Another highly anticipated sidechain that has been in development for some time is RSK. RSK is set to enable support of Turing-complete smart contracts, hence bringing the flexibility of Ethereum to Bitcoin. RSK is currently in closed beta, with RSK Labs cofounder Sergio Demian Lerner suggesting a public release could follow soon. Further, Bloq scientist Paul Sztorc recently finished a rough implementation of his drivechain project. Where both Liquid and RSK for now apply a “federated” model, in which the sidechain is secured by a group of semi-trusted “gatekeepers,” drivechains would be secured by bitcoin miners. If drivechains are deployed in 2018, the first iteration of such a sidechain could well be “Bitcoin Extended:” essentially a “big block" version of Bitcoin to allow for more transaction throughput. That said, reception of the proposal on the Bitcoin development mailing list and within Bitcoin’s development community has been mixed so far. Since drivechains do need a soft-fork protocol upgrade, the contention does make the future of drivechains a bit more uncertain. “Miners could activate drivechains tomorrow, but they often outsource their understanding of ‘what software is good’,” Sztorc told Bitcoin Magazine. “So they'll either have to decide for themselves that it is good, or it would have to make it into a Bitcoin release.” A Schnorr Signatures Proposal Schnorr signatures, named after their inventor Claus-Peter Schnorr, are considered by many cryptographers to be the best type of cryptographic signature in the field. They offer a strong level of correctness, do not suffer from malleability, are relatively fast to verify and enable useful features, thanks to their mathematical properties. Now, with the activation of Segregated Witness, it could be relatively easy to implement Schnorr signatures on the Bitcoin protocol. Perhaps the biggest advantage of the Schnorr signature algorithm is that multiple signatures can be aggregated into a single signature. In the context of Bitcoin, this means that one signature can prove ownership of multiple Bitcoin addresses (really, “inputs”).
Since many transactions send coins from multiple inputs, having to include only one signature per transaction should significantly benefit Bitcoin’s scalability. Analysis based on historical transactions suggests it would save an average of 25 percent per transaction, which would increase Bitcoin’s maximum transaction capacity by about 33 percent. Further on, Schnorr signatures could enable even more. For example, with Schnorr, it should also be possible to aggregate different signatures from a multi-signature transaction, which requires multiple signatures to spend the same input. This could, in turn, make CoinJoin a cheaper alternative to regular transactions for participants, thereby incentivizing more private use of Bitcoin. Eventually the mathematical properties of Schnorr signatures could even enable more advanced applications, such as smart contracts utilizing “Scriptless Scripts.” Speaking to Bitcoin Magazine, Wuille confirmed that there will probably be a concrete Bitcoin Improvement Proposal for Schnorr signatures in 2018. “We might, as a first step, propose an upgrade to support Schnorr signatures without aggregation,” he said. “This would be a bit more straightforward to implement and already offers benefits. Then a proposal to add aggregation would follow later.” Whether Schnorr signatures will already be adopted and used on Bitcoin’s mainnet is harder to predict. It will require a soft fork protocol upgrade, and much depends on the peer review and testing process.
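The two percentages quoted above are linked by simple arithmetic: if aggregation shrinks the average transaction by roughly a quarter, about a third more transactions fit into the same block weight, as the short calculation below shows.

```python
avg_size_saving = 0.25                         # estimated average per-transaction saving
capacity_gain = 1 / (1 - avg_size_saving) - 1  # extra transactions per block of fixed weight
print(f"{capacity_gain:.0%}")                  # -> 33%
```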
The technology underpinning bitcoin is set for major changes in 2018, with several projects scheduled. Among them is the wider adoption of the SegWit upgrade, activated in August 2017, helped by the new bech32 address format due in the Bitcoin Core 0.16 release. Also expected is the launch of bitcoin's Lightning Network, offering instant confirmations and near-free transfers while leveraging bitcoin's security, while privacy solutions ZeroLink and TumbleBit are set to make the network more private. Activation of SegWit will also make it easier to improve cryptographic signatures by facilitating the use of Schnorr signatures, which would reduce transaction costs and increase bitcoin's maximum capacity.
Publishers have a lot to gripe about when it comes to Facebook, from the platform choking off their referral traffic, dominating digital advertising and giving them whiplash with its constantly changing video strategy. But what if it got even worse? In 2018, Facebook could go a step further and separate news from the news feed. It’s not a crazy idea. The platform tested a newsless news feed, called the Explore Feed, in six countries outside the U.S., causing a major publisher freakout. (Facebook said it didn’t expect to roll out the test further.) In the past year, Facebook also launched Watch, a TV-like video tab; and prioritized Facebook Groups, communities for people who share interests or characteristics — also underscoring the idea of separating user interaction from other media content. Other platforms have made moves to separate users’ messages from media and brands’ content. Snapchat redesigned its app to separate users’ feeds from brands’ content. Instagram is testing a private messaging app, which would take peer-to-peer chat out of the main app. Twitter has its Moments tab, a dedicated home for news and entertainment stories. Fundamental to the success of platforms like Twitter and Facebook is keeping users happy, and as such, they’re always running experiments to see if changes will get people to return more often and stay longer. Given a lot of news is negative or controversial, a feed with no news (unless it’s shared by a user) could be less contentious and more enjoyable for users. And another group that likes less controversy, of course, is another important Facebook constituency: advertisers. “Sometimes people get really annoyed and confused when they’re reading about their cousin’s bar mitzvah or whatever and they see a very serious story afterward,” said Andrew Montalenti, CTO and co-founder of web analytics firm Parse.ly. “All of the platforms, what they’re really concerned about with fake news is that I think you kind of draw on a bank account of trust with the user. If you come across that stuff too much, you declare it to be a problem, and you stop using it. So they have to play this delicate balance — ‘We can’t show you too many ads or show you too much spammy content.’” Another factor is the fake-news imbroglio that blew up in Facebook’s face in the past year, leading lawmakers to threaten regulation. Facebook responded by trying to police fake news, which has proved to be a challenge. Further de-emphasizing news or taking it out of the feed altogether is one way to deal with the problem. As to the Explore test, Facebook said: “There is no current plan to roll this out beyond these test countries or to charge pages on Facebook to pay for all their distribution in News Feed or Explore.” That was cold comfort to those publishers who depend on the news feed to reach audiences, though. As much as Facebook has declined in reach, it’s still a significant source of traffic for many publishers, which have already seen their direct traffic from Facebook decline in recent months, if not years, as Facebook has prioritized users’ posts and video content in the news feed. Some publishers whose audience strategy is closely tied to Facebook and follow the company closely are starting to consider the possibility of a newsless news feed. An executive at a traditional publishing company said this is “definitely on our minds” given the company gets a “ton of traffic from Facebook,” and it’s a risk the company has to think about in the next few years.
“It would be a seismic shift,” said another publishing exec. “There’s good reason to be concerned if publishers’ content becomes separated out of the main news feed,” said Vivian Schiller, a former Twitter news executive. “Their criteria [for the Explore test] was about user experience. That’s their business. But it’s hard to imagine this not having a deleterious effect on publishers.” There are other reasons for Facebook to go in this direction. Facebook could make an exception for publishers and other commercial content providers that pay to be in the news feed, which could mean more revenue for Facebook. Separating news from the feed also could give Facebook a way to test a potential new product, similar to how it took Messenger out of the site and made it its own app, Schiller said. Of course, none of this is a fait accompli. There’s good reason to think Facebook will keep news in the feed. Scrolling through the news feed is the core daily habit for most Facebook users. It’s what Facebook uses to promote its many other products, like the Watch video tab and Marketplace. It’s hard to get people to toggle from the news feed to other places on Facebook. That said, even if a newsless news feed doesn’t materialize, publishers have to adapt. Facebook, and Google, are here to stay, and Facebook has proven time and time again that it’s not always going to act in publishers’ interests. Publishers have to take matters into their own hands, and take advantage of other audience and revenue opportunities.
Online publishers are considering strategies for a future in which Facebook removes news stories from its main feed, known as the news feed. The social network last year experimented with a separate feed called Explore in six countries outside the US, but said it didn't plan to roll this out further. However, a number of other online platforms including Twitter and Snapchat have separated news stories from user-generated content to some degree, leading to speculation Facebook will follow suit. Publishers have already seen their direct traffic from Facebook decline in recent months as user content and videos have taken precedence.
HANGZHOU, Dec. 23 (Xinhua) -- A team of researchers from Zhejiang University have developed a new type of aluminum-graphene battery that can be charged in seconds, instead of hours. The team, led by professor Gao Chao, from the Department of Polymer Science and Engineering of Zhejiang University, designed a battery using graphene films as anode and metallic aluminum as cathode. The battery could work well after a quarter of a million cycles and can be fully charged in seconds. Experiments show that the battery retains 91 percent of its original capacity after 250,000 recharges, surpassing all previous batteries in terms of cycle life. In quick charge mode, the battery can be fully charged in 1.1 seconds, according to Gao. The finding was detailed in a paper recently published in Science Advances. The assembled battery also works well in a temperature range of minus 40 to 120 degrees Celsius. It can be folded, and does not explode when exposed to fire. However, the aluminum-ion battery cannot compete with commonly used Li-ion batteries in terms of energy density, or the amount of power you can store in a battery relative to its size, according to Gao. "It is still costly to make such a battery. Commercial production of the battery will only be possible once we can find a cheaper electrolyte," Gao said.
Scientists from Zhejiang University have developed an aluminium-graphene battery that can be charged in seconds, rather than hours. The battery is able to maintain 91% of its original capacity after 250,000 charges and could be fully charged in 1.1 seconds. The researchers also claimed the battery can work in temperatures ranging from minus 40C to 120C. However, they added that the battery does not have the energy density of lithium-ion batteries and is at present too expensive for commercial production.
The automation of repetitive tasks in construction and manufacturing has been around for some time. Last month, Panasonic introduced an agricultural robot at Tokyo’s International Robot Exhibition that could have implications for workers in the fruit-picking business. Harvesting tomatoes is more complicated than you might think. Each fruit has to be plucked from the vine once it is ripe enough, not before. It’s also a delicate operation: tomatoes bruise easily and a single scratch in one can lead to a whole box going bad, fast. To handle the perception and dexterity-related challenges that come with fruit picking, Panasonic’s new robot relies on a combination of camera, range image sensor and artificial intelligence technologies. First, it recognizes which tomatoes are ready to be picked. Then, it performs a precise cut-and-catch technique to move each tomato from vine to bucket. The robot can be mounted on a rail, enabling it to slide along one vine from start to finish. In terms of speed, Panasonic expects the robot to perform at least as well as a human, harvesting at an average pace of 10 tomatoes per minute. However, as the robot doesn’t need breaks, pay rises or sick days, it’s easy to see where the attraction might lie in terms of wider efficiency gains. Panasonic has so far only demonstrated its harvesting robot and no announcement has yet been made regarding its readiness for market or cost. With great dexterity comes great responsibility: the rise of computer vision and faster, more agile robots has made complex tasks accessible to automation. Tomato picking is just one example. Last month, Ocado released footage of a new bagging robot, capable of picking products and carefully placing them into shopping bags based on the shape and weight of each item. This level of processing and dexterity could pave the way for applications that go far beyond monotonous tasks in agriculture and retail.
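Panasonic has not published its detection pipeline, so as a rough illustration of the "recognise which tomatoes are ready" step, the sketch below uses plain colour segmentation in OpenCV. The HSV thresholds and the input file name are assumptions made for the example, not values from Panasonic's system, which relies on trained AI models and depth sensing rather than fixed colour rules.

```python
import cv2
import numpy as np

def ripe_tomato_mask(bgr_image: np.ndarray) -> np.ndarray:
    """Return a binary mask of strongly red (ripe-looking) pixels."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis in OpenCV's 0-179 hue range,
    # so combine a low-hue band and a high-hue band.
    lower_red = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    upper_red = cv2.inRange(hsv, (170, 120, 70), (179, 255, 255))
    mask = cv2.bitwise_or(lower_red, upper_red)
    # Remove speckle noise before any contour or shape analysis.
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

if __name__ == "__main__":
    frame = cv2.imread("vine.jpg")  # hypothetical camera frame
    if frame is not None:
        mask = ripe_tomato_mask(frame)
        ripe_fraction = mask.mean() / 255
        print(f"ripe-coloured pixels: {ripe_fraction:.1%}")
```

In a real system a stage like this would at most pre-filter candidate regions; judging ripeness, locating the stem and timing the cut-and-catch motion is where the learned models and range imaging described above come in.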
A robot that can autonomously harvest tomatoes has been developed by Japanese electronics firm Panasonic. The device uses a camera, range-image sensors and artificial intelligence (AI) systems to detect which fruit are ripe for picking and cut them from the plant. Mounted on a rail alongside the vine, the company said the robot can perform as well as a human picker, averaging a pace of 10 tomatoes per minute. The device has been publicly demonstrated but Panasonic has made no announcement about its likely cost or plans for its production.
A British campaign group set up by young venture capitalists to boost diverse representation in VC is expanding to the US. Diversity VC will be backed in the US by partners and associates at Female Founders Fund and Entrepreneur First, and financially by law firm Cooley. 2017 was a shocking year for US venture capital, with high-profile figures such as Uber investor Shervin Pishevar, 500 Startups founder Dave McClure, Binary Capital cofounder Justin Caldbeck and many others accused of sexual harassment. A British initiative set up to tackle venture capital's monoculture will launch in the US, where the industry is currently reeling from multiple sexual harassment scandals. Diversity VC was set up by five young venture capitalists in March to highlight the fact that most British venture capital investors are middle-class white men, with a knock-on effect on which startups get funding. It released a landmark report in May showing that almost half of British VC firms had no women on their investment teams. Diversity VC founder and investor Check Warner told Business Insider that the group had decided to expand to the US after multiple requests from venture capital associates. "What's been lacking in conversations in the US is that basis in fact, [research into] why we have such a big problem," Warner said. "There wasn't anything comparable to what we had done. "People got in touch to say 'It would be great to have something like this, a system and process to start uncovering this.'" Warner and her cofounders set up Diversity VC only a few months before a wave of harassment scandals hit US venture firms. Six women accused Binary Capital cofounder Justin Caldbeck of inappropriate behaviour, resulting in his leave of absence from the firm. Subsequent scandals engulfed Uber investor Shervin Pishevar, 500 Startups founder Dave McClure, and Tesla investor Steve Jurvetson. "It's been interesting to see these things come out," said Warner. "It's reflective of the culture not including people, and having this sense of entitlement. You don't have checks and balances on behaviour. "The lack of female diversity around the table is a contributing factor to people getting away with what they got away with." Like its UK counterpart, the US chapter of Diversity VC will promote greater diversity in venture capital through four initiatives, including helping underrepresented groups build a VC network, getting minority interns into VC firms, and publishing data about diverse representation. The group has backing from Female Founders Fund partner Sutian Dong, Insight Venture Partners associate Juliet Bailin, and Entrepreneur First's US head of funding Matt Wichrowski. Law firm Cooley is giving financial backing.
Diversity VC, a UK initiative established last year by five young venture capitalists to tackle the lack of diversity within their industry, is set to launch in the US. The group released a report in May that revealed almost 50% of UK venture capital firms had no women working on their investment teams. Diversity VC in the US will get financial backing from law firm Cooley and support from Insight Venture Partners, Entrepreneur First and Female Founders Fund. Diversity VC said the US launch was in response to requests from workers in the US industry.
Ocean dead zones with zero oxygen have quadrupled in size since 1950, scientists have warned, while the number of very low oxygen sites near coasts has multiplied tenfold. Most sea creatures cannot survive in these zones and current trends would lead to mass extinction in the long run, risking dire consequences for the hundreds of millions of people who depend on the sea. Climate change caused by fossil fuel burning is the cause of the large-scale deoxygenation, as warmer waters hold less oxygen. The coastal dead zones result from fertiliser and sewage running off the land and into the seas. The analysis, published in the journal Science, is the first comprehensive analysis of these areas; it states: “Major extinction events in Earth’s history have been associated with warm climates and oxygen-deficient oceans.” Denise Breitburg, of the Smithsonian Environmental Research Center in the US, who led the analysis, said: “Under the current trajectory that is where we would be headed. But the consequences to humans of staying on that trajectory are so dire that it is hard to imagine we would go quite that far down that path.” “This is a problem we can solve,” Breitburg said. “Halting climate change requires a global effort, but even local actions can help with nutrient-driven oxygen decline.” She pointed to recoveries in Chesapeake Bay in the US and the Thames river in the UK, where better farm and sewage practices led to dead zones disappearing. However, Prof Robert Diaz at the Virginia Institute of Marine Science, who reviewed the new study, said: “Right now, the increasing expansion of coastal dead zones and decline in open ocean oxygen are not priority problems for governments around the world. Unfortunately, it will take severe and persistent mortality of fisheries for the seriousness of low oxygen to be realised.” The oceans feed more than 500 million people, especially in poorer nations, and provide jobs for 350 million people. But at least 500 dead zones have now been reported near coasts, up from fewer than 50 in 1950. Lack of monitoring in many regions means the true number may be much higher. The open ocean has natural low oxygen areas, usually off the west coast of continents due to the way the rotation of the Earth affects ocean currents. But these dead zones have expanded dramatically, increasing by millions of square kilometres since 1950, roughly equivalent to the area of the European Union. Furthermore, the level of oxygen in all ocean waters is falling, with 2% – 77bn tonnes – being lost since 1950. This can reduce growth, impair reproduction and increase disease, the scientists warn. One irony is that warmer waters not only hold less oxygen but also mean marine organisms have to breathe faster, using up oxygen more quickly. There are also dangerous feedback mechanisms. Microbes that proliferate at very low oxygen levels produce lots of nitrous oxide, a greenhouse gas that is 300 times more potent than carbon dioxide. In coastal regions, fertiliser, manure and sewage pollution cause algal blooms and when the algae decompose oxygen is sucked out of the water. However, in some places, the algae can lead to more food for fish and increase catches around the dead zones.
This may not be sustainable though, said Breitburg: “There is a lot of concern that we are really changing the way these systems function and that the overall resilience of these systems may be reduced.” The new analysis was produced by an international working group created in 2016 by Unesco’s Intergovernmental Oceanographic Commission. The commission’s Kirsten Isensee said: “Ocean deoxygenation is taking place all over the world as a result of the human footprint, therefore we also need to address it globally.” Lucia von Reusner, campaign director of the campaign group, Mighty Earth, which recently exposed a link between the dead zone in the Gulf of Mexico and large scale meat production, said: “These dead zones will continue to expand unless the major meat companies that dominate our global agricultural system start cleaning up their supply chains to keep pollution out of our waters.” Diaz said the speed of ocean suffocation already seen was breathtaking: “No other variable of such ecological importance to coastal ecosystems has changed so drastically in such a short period of time from human activities as dissolved oxygen.” He said the need for urgent action is best summarised by the motto of the American Lung Association: “If you can’t breathe, nothing else matters.”
Ocean dead zones, which contain no oxygen, have become four times larger since 1950, while the number of areas with very low oxygen close to coasts has increased tenfold, according to the first comprehensive analysis of these areas. Most marine species cannot exist in such conditions, and the continuation of such trends would result in mass extinction, endangering the livelihood of millions of people. Large-scale deoxygenation is the result of climate change caused by burning fossil fuels; as waters warm, they contain less oxygen.
Online retail and e-commerce giant Amazon is reportedly on the verge of making its first investment in an insurtech start-up, with the company said to be close to finalising an investment in online-only insurance start-up Acko. Acko wants to disrupt India’s insurance industry through a digital-only platform, having raised $30 million and recently received in-principle approval from the financial market regulators in India. Amazon and Indian rival Flipkart had both been pursuing an investment in Acko, it has been widely reported, but at this stage it is thought that Amazon is close to signing a term-sheet for the investment and a partnership deal with Acko. It’s said that the arrangement will see Amazon acting as an online distributor for Acko’s insurance products, selling a range of financial products. The potential for Amazon to enter the insurance space has been much discussed in recent months, including in our article from November, Incumbents could be relegated, if tech giants come for re/insurance. Now it appears Amazon is close to taking the sensible step of investing in and partnering with an insurtech start-up, in order to gain the ability to add insurance products to its retail offering, seeing the firm stepping into the sale of financial products for the first time. Targeting India first is also a smart move, as the burgeoning financial services market there has a strong focus on technology and take-up rates of insurance products are rising all the time. If Amazon can crack selling insurance online to the Indian market, it will stand it in good stead to break into more established markets such as the United States and Europe. Of course, if Amazon does move into insurance meaningfully, it will likely only be a matter of time before other tech giants such as Google follow suit with their own integrated e-commerce offerings. It’s also been reported that Flipkart is readying its own entry into insurance sales online, with the establishment of a new entity to focus on financial services and venture investing.
Amazon is reportedly finalising an investment in Indian insurtech firm Acko. The arrangement would see Amazon acting as a distribution platform for the online-only insurer. Amazon's biggest rival in India, Flipkart, was also reportedly considering an investment in Acko. The start-up has raised $30m to date and has provisional approval to operate from India's financial markets regulator. 
Some smartphone games have been found to be using software that uses your device's microphone to track users' TV-watching habits and collect data for advertisers. According to a recent New York Times report, more than 250 games on the Google Play Store use software from a company called Alphonso that uses the smartphone's mic to listen for audio signals in TV ads and shows. The data collected is then sold to advertisers for ad targeting and analysis. NYT reports that the software is used in games, some of which are geared towards children. Although the software does not record human conversations, it can detect sounds even when a phone is stowed away in someone's pocket and the apps are running in the background. Alphonso's chief executive Ashish Chordia told NYT that the company has also worked with film studios to analyse viewers' movie-watching habits in theaters as well. "A lot of the folks will go and turn off their phone, but a small portion of people don't and put it in their pocket," Chordia said. "In those cases, we are able to pick up in a small sample who is watching the show or the movie." While most apps seemed to be available in the Google Play Store, the Times noted that some were on Apple's App Store as well. Although the software's activities are creepy, some of the apps do disclose its tracking of "TV viewership details" in their descriptions under the "read more" button and software use policies. Both Apple and Google require apps to get explicit permission from users in order to access certain phone features such as the camera, microphone, location, photo gallery and more. However, most users don't usually read the disclosure and are often unaware they have agreed to let the app access their phone's microphone. "The consumer is opting in knowingly and can opt out any time," Chordia said. He also noted that the company's disclosures comply with Federal Trade Commission guidelines and offer instructions for users to opt out of the software on its website. He added that the firm does not approve of its software being used in apps targeting children. However, it has been found integrated into a number of games such as "Teeth Fixed" and "Zap Balloons" by India-based KLAP Edutainment. A simple search for "Alphonso software" and "Alphonso automated" on the Play Store yields numerous apps that integrate the software. One game called "Dream Run" by Dumadu Games - which has been downloaded and installed by about 5,000 to 10,000 users - discloses under a "Read More" button that it is integrated with Alphonso Automated Content Recognition (ACR) software. "With your permission provided at the time of downloading the app, the ACR software receives short duration audio samples from the microphone on your device," the disclosure reads. "Access to the microphone is allowed only with your consent, and the audio samples do not leave your device but are instead hashed into digital 'audio signatures.' "The audio signatures are compared to commercial content that is playing on your television, including content from set-top-boxes, media players, gaming consoles, broadcast, or another video source (e.g., TV shows, streaming programs, advertisements, etc.)." The revelation does seem to echo the years-long conspiracy theory that apps by major tech giants such as Facebook tap into users' smartphone mics to secretly listen in on conversations and offer up relevant ads. Facebook has long tried to dismiss the speculation.
"We have to be really careful as we have more devices capturing more information in living rooms and bedrooms and on the street and in other people's homes that the public is not blindsided and surprised by things," Dave Morgan, the founder and CEO of Simulmedia that works with marketers on targeted ads, told the Times. "It's not what's legal. It is what's not creepy."
More than 250 games available in the Google Play Store and Apple's App Store have been found to contain software that uses smartphone microphones to track users' TV-watching habits and sell the data on to advertisers. A New York Times report said the software, developed by a company called Alphonso, often goes undetected by users who do not read their phones' software use policies, where it is detailed. Alphonso CEO Ashish Chordia said its activity was compliant with Federal Trade Commission guidelines and users could opt out at any time.
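The disclosure quoted above describes a standard automated content recognition pipeline: short microphone samples are reduced on the device to compact "audio signatures" (fingerprints), which are then matched against fingerprints of known TV content. Alphonso has not published its algorithm, so the following is only a minimal, hypothetical sketch of how such fingerprint matching can work in principle; the sample rate, window size and peak-frequency hashing scheme are illustrative assumptions, not the company's method.

# Toy audio-fingerprinting sketch (illustrative only; not Alphonso's actual algorithm).
# Idea: hash the dominant frequency of short windows, then count how many hashes
# a microphone capture shares with each known piece of reference content.
import numpy as np

SAMPLE_RATE = 8000   # Hz, assumed
WINDOW = 1024        # samples per analysis window, assumed

def fingerprint(signal: np.ndarray) -> set[int]:
    """Reduce a mono audio signal to a set of coarse spectral-peak hashes."""
    hashes = set()
    for start in range(0, len(signal) - WINDOW, WINDOW):
        window = signal[start:start + WINDOW]
        spectrum = np.abs(np.fft.rfft(window * np.hanning(WINDOW)))
        peak_bin = int(np.argmax(spectrum[1:])) + 1   # skip the DC bin
        hashes.add(peak_bin // 4)                     # coarse-quantise the peak
    return hashes

def best_match(mic_sample: np.ndarray, reference_library: dict[str, set[int]]) -> str:
    """Return the reference title whose fingerprint overlaps the sample most."""
    sample_fp = fingerprint(mic_sample)
    return max(reference_library, key=lambda title: len(sample_fp & reference_library[title]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(SAMPLE_RATE * 5) / SAMPLE_RATE
    ad_a = np.sin(2 * np.pi * 440 * t)                    # stand-in for one TV ad
    ad_b = np.sin(2 * np.pi * 880 * t)                    # stand-in for another
    library = {"ad_a": fingerprint(ad_a), "ad_b": fingerprint(ad_b)}
    noisy_capture = ad_a + 0.3 * rng.standard_normal(len(ad_a))  # simulated microphone capture
    print(best_match(noisy_capture, library))             # expected: ad_a

Real systems use far more robust fingerprints (for example, hashed pairs of spectral peaks over time), but the matching step is the same in spirit: set intersection against a library of pre-computed signatures, with only the hashes, not the raw audio, leaving the device.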
In 2007, the tragic case of 12-year-old Deamonte Driver gained national attention when the Maryland boy died from an untreated tooth infection because his family couldn’t find a dentist who would treat him. Instead of an $80 procedure that could have prevented Deamonte’s death, his saga turned into a series of hospital visits that came too late, ending with the needless loss of his life, but also costing taxpayers tens of thousands of dollars through Medicaid. And yet, little has changed across the U.S. since then when it comes to dental care access. There has been a serious market failure, harming lives and raising costs. Fortunately, a market solution now exists, if only states will adopt it: dental therapists.
Poor dental health affects tens of millions of Americans, with 63 million living in places described as dental shortage areas and suffering from decaying teeth, toothaches and chronic dental pain. Not only does this cause widespread suffering; from an economic standpoint it also damages work productivity across all age ranges. Moreover, high costs mean only 1 in 3 dentists actually accept Medicaid patients. However, states are now attempting to overcome the issues that stop both adults and children receiving proper dental care, at little cost to the taxpayer. Political advocates Grover Norquist and Don Berwick see a simple solution, one that received more than 75 percent backing from Democrats and Republicans alike. They want to allow dentists to hire a variety of professionals, from dental therapists to nurse practitioners, who can supply preventive care and perform routine procedures that do not require a dentist. This would harness the free market at little cost, allowing dental therapists to operate without unnecessary government barriers and treat more patients, while giving small practices an opportunity to expand. Tragic cases like that of Deamonte Driver, who lost his life after being unable to obtain timely dental treatment, could be prevented if dental therapists were able to operate. Dental therapists are far cheaper than dentists, provide high-quality care, are well educated and readily available, having existed in more than 50 countries for over a century. Should states turn to dental therapists, it would provide a simple way to help patients and businesses and combat rising health care costs.
Investment into Cardiff's commercial property sector reached record levels in 2017 at more than £400m, according to international real estate advisory firm Savills. In 2016 investment into the city totalled £298m. The research by Savills shows that volumes were heavily skewed this year by activity at the Central Square regeneration scheme, which amounted to a combined £224.7m. In total, investment into the Cardiff office sector surpassed £350m, which represents a record high. Prime yields for Cardiff office investments fell 75 basis points from the beginning of the year to 5.50%, as strong investor interest resulted in downward pressure. The second largest sector in terms of investment levels in 2017 was the leisure sector, with investment reaching £43m. The sector was dominated by the £20.5m acquisition of Stadium Plaza by Naissance Capital Real Estate and the £22.1m purchase of the Clayton hotel by M&G Real Estate. Ross Griffin, director of investment at Savills Cardiff, said: “Cardiff remains a popular investment destination, particularly for those looking to place their money into the regional office market. The development at Central Square has significantly boosted the volumes for this year, providing an attractive opportunity. "Looking ahead to 2018 we expect to see continued activity from institutions on both the buy and sell side as they look for long term income. Overseas capital will also be active, particularly for prime assets at attractive yields."
Investment in Cardiff's commercial property sector reached record levels in 2017, at more than £400m ($542m), compared with £298m in 2016, according to Savills. Ross Griffin, director of investment at Savills Cardiff, said the figure was boosted by the £224.7m invested in the Central Square regeneration scheme. The data also revealed total investment in the office sector reached £350m, while prime yields for office investments dropped 75 basis points to 5.50%.
Office take up in quarter four, 2017 in the central Birmingham office market totalled 354,530 sq ft in 49 deals, as recorded by the Birmingham Office Market Forum. When added to the 81 deals totalling 650,542 sq ft recorded in the first three quarters of the year, the 2017 year-end total take up amounts to 1,005,072 sq ft in 130 deals. This compares with:
- 692,729 sq ft in 139 deals for 2016
- 970,458 sq ft in 132 deals for 2015
- 713,460 sq ft in 148 deals for 2014
- 664,147 sq ft in 128 deals for 2013
The 2017 outcome is a record take up year, beating the previous high seen in 2015. Office take up was boosted by the emergence of HS2-related demand together with the Government committing to the largest prelet seen in the city for a decade. In addition, key larger transactions were concluded in the Professional Services and Serviced Office sectors. Breaking 1 million sq ft of office take up for the first time is extremely positive for Birmingham during the current period of unprecedented development activity and further regeneration, visible across the BOMF area. It is also particularly encouraging bearing in mind the slow first half of the year, following on from the dip seen in the previous year's total. For further information please contact the author of the report, Jonathan Carmalt, Director, Office Agency, JLL on 0121 214 9935 or email [email protected] Birmingham Office Market Forum was established in 2007 to present a co-ordinated voice to investors, developers and occupiers about Birmingham's city centre office market. The Forum brings together the city's leading office agents and Business Birmingham. For a list of member firms or further information visit their website.
Office take-up in Birmingham during 2017 broke the one million sq ft barrier for the first time, in spite of a slow first half to the year, according to data from the city's Office Market Forum. The year saw a total of 1,005,072 sq ft taken up across 130 deals, beating the previous record of 970,458 sq ft set in 2015. The figures were given a fillip by demand linked to the planned HS2 high-speed rail link and by the UK government committing to the city's biggest pre-let in 10 years.
This omission was noted by popular share investing website The Motley Fool, with British analyst GA Chester writing that previously, various figures Purplebricks gave made it possible to at least estimate the number. "My calculations of the average sale price suggested that either the company was cornering the market in trailer park homes sales or that a rather large proportion of instructions weren't being converted to completions," Mr Chester wrote. "Obviously, if you're charging a fixed fee but fail to complete the sale in too many cases, you're not going to have a sustainable business in the longer term," he added. Purplebricks' Australian website shows its agents have sold 2247 homes since September 2016 out of total listings of 3495. This suggests a clearance rate of 70-75 per cent, if recent listings are excluded. Also a concern is that while Purplebricks has continued to ramp up its British advertising spending, UK revenue growth has halved in the past two years, from 154 per cent in the first half of 2016-17 to 77 per cent in the second half of 2017-18. "For me, this trend appears ominous for the market's future top and bottom-line growth expectations," Mr Chester said. Despite these issues, Neil Woodford, Britain's most high-profile fund manager, remains a strong backer of Purplebricks, with his Woodford Investment Management Ltd retaining a 27 per cent stake having bought into the float. However, investors are clearly nervous – the Purplebricks share price dropped 6 per cent in September when it was briefly, incorrectly reported that Woodford had reduced its stake to just 2.99 per cent. Old Mutual is another backer of Purplebricks – the insurance and banking group recently increased its stake in the company to 12.6 per cent from 11.1 per cent. In its latest interim results released in mid-December, Purplebricks reported that total group revenue more than doubled to £46.8 million, with its British business posting a healthy £3.2 million operating profit. However, losses at its Australian business more than doubled to £5.1 million after it spent £5.7 million on marketing the brand locally. Purplebricks upgraded its UK revenue guidance 5 per cent to £84 million as part of its interim results. The company said it was on course to achieve full-year revenue guidance of £12 million ($20.8 million) in Australia. Chief executive and co-founder Michael Bruce said the company's Australian business was "on track" and performing ahead of expectations. "Our progress in Australia has been exciting and encouraging. Our market share in Australia is greater than our market share was in the UK at the same time in its evolution," Mr Bruce said. The Australian Financial Review has reported on a few notable successes by Purplebricks estate agents, including veteran property executive Bryce Mitchelson, the managing director of $500 million childcare trust Arena REIT, selling a three-bedroom Edwardian home in Elsternwick for more than $1.7 million after just two weeks with Purplebricks and saving an estimated $38,000 in commission fees. In another high-profile result, Purplebricks saved property developer David Fam more than $61,000 on the $3.1 million sale of his Sutherland Shire mansion. However, the Financial Review has also highlighted what happens when Purplebricks does not achieve a result, leaving a customer with a big bill and a house that is still for sale.
This was the case for Sydney woman Kerryn Lehmann who ended up owing Purplebricks $12,000 when her four-bedroom riverfront home in Como in the Sutherland Shire failed to sell.
UK hybrid estate agency Purplebricks revealed a £3.2m ($4.3m) operating profit in its mid-December interim results, and has revised up its revenue guidance by 5% to £84m. However, the results did not include data on how many properties had been sold in the six months to October, prompting some to query the company's £4.16 share price and question if its up-front agent fee business model was sustainable. The firm's Australian arm was on target to achieve its full-year revenue guidance of £12m, despite incurring losses of £5.1m, the results showed.
The system came into effect January 1, 2018. The measure has been introduced to improve water quality in the country by limiting phosphate production from dairy cattle manure and promote a shift to land-based farming. The EC said that given the high density of dairy cattle in the Netherlands, the phosphate contained in dairy cattle manure represents a significant environmental concern. In addition to the main environmental objectives, the system also provides support for young farmers and is intended to have a positive effect on grazing and grassland. Trading rights Dairy farms will be awarded phosphate rights for free and will only be allowed to produce phosphate from dairy cattle manure corresponding to the phosphate production rights they hold. At the end of each calendar year, farms will be required to demonstrate that they have sufficient phosphate rights to justify the amount of phosphate produced by their dairy cattle manure. Dairy farms, including new entrants, can acquire phosphate rights on the market, as phosphate rights will be traded. When a transaction occurs, 10% of the traded rights will be withheld and kept in a ‘phosphate bank.’ This is intended to encourage the development of more land-based dairy farming by providing temporary, non-tradable rights to "land-based farms," which can fully absorb on their land all the phosphate from their own manure production. Based on the environmental objectives the system aims to achieve, the European Commission concluded that the system is in line with the EU rules for environmental State aid.
The European Commission has given the go-ahead to a trading system for phosphate rights for dairy cattle in the Netherlands, aimed at improving the country's water quality by limiting phosphate production from dairy cattle manure and encouraging a move to land-based farming. Dairy farmers will receive phosphate rights for free and will be obligated each year to prove they have sufficient rights to justify the quantity of phosphate produced by their manure. Phosphate rights can be obtained on the market, with 10% of the traded rights held back to promote the development of more land-based dairy farming.
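The withholding rule described above lends itself to a very small worked example. The sketch below is a toy model for illustration only (kilograms of phosphate are an assumed unit, and the real scheme's accounting is more detailed); it simply shows how a trade of rights is split between the buyer and the national "phosphate bank".

# Minimal illustration of the trading rule: when rights change hands,
# 10% of the traded amount is withheld into the phosphate bank.
# (Toy model; the actual Dutch scheme's rules are more involved.)
BANK_SHARE = 0.10

def transfer_rights(seller_kg: float, buyer_kg: float, traded_kg: float, bank_kg: float = 0.0):
    """Move traded_kg of phosphate rights from seller to buyer, net of the bank withholding."""
    if traded_kg > seller_kg:
        raise ValueError("seller cannot trade more rights than it holds")
    withheld = traded_kg * BANK_SHARE
    return seller_kg - traded_kg, buyer_kg + (traded_kg - withheld), bank_kg + withheld

print(transfer_rights(seller_kg=10_000, buyer_kg=0, traded_kg=1_000))
# -> (9000, 900.0, 100.0): the buyer receives 900 kg of rights, 100 kg go to the bank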
"But, on the other hand, we would be foolish to rule anything out. We know that Asia-Pacific will be a very important market and we know a lot of the global growth in the future will come from there."
According to the UK's International Trade Secretary, Liam Fox, the UK could feasibly join the Trans-Pacific Partnership (TPP), with Fox saying "we would be foolish to rule anything out". The pact is made up of Australia, Mexico, New Zealand, Canada, Chile, Japan, Singapore, Brunei, Peru, Vietnam and Malaysia - Donald Trump pulled the US out last year - and is currently being renegotiated under the new name of the Comprehensive and Progressive Agreement for Trans-Pacific Partnership. Its aims are to lower both tariff and non-tariff barriers to trade and to provide a forum to settle international disputes.
HONG KONG, Jan. 3, 2018 /PRNewswire/ -- For an even better property searching experience, GoHome.com.hk, part of REA Group, has recently launched a new website with refined functionalities and a layout that improves the user experience. People can now search for their dream property and find the latest property insights about the area, property prices and information about the property through one simple click. The new website has a new search result page and property details page, which offer:
- A new property and serviced apartment section
- Comprehensive secondary property listings
Responding to Hong Kong consumer demand, GoHome.com.hk has introduced a new mobile responsive function which allows layouts to be automatically fitted to multiscreen devices such as desktops, tablets and mobile devices, meaning the search for an ideal home is now even easier when you're on the go. Ms. Kerry Wong, Chief Executive Officer, Greater China Region, REA Group, said: "GoHome.com.hk is the place that people use to find their perfect home. We've focused on improving the experience so people can now effortlessly explore and search for their ideal properties using specific criteria anywhere and anytime they want to." "By providing comprehensive and timely property information, we're changing the way our customers and consumers better understand property insights and trends by giving them access to the latest information in addition to searching for the perfect property through the one portal," said Ms Wong. Across its global network, REA Group's purpose is to change the way people experience property by delivering the best property insights and information on their websites and creating the most engaging consumer experience to help people find their perfect place more quickly and easily. For media queries, please contact:
REA Group (Hong Kong): Ms. Hermia Chan, Tel.: +852 3965 4326 / +852 9386 0166, Email: hermia.chan@rea-group.com
Vis Communications Consultancy Limited: Mr. Felix Poon, Tel.: +852 2804 2388 / +852 9202 2885, Email: felix@vis-pr.com
About GoHome.com.hk GoHome.com.hk is Hong Kong's leading online property platform. Since 1999, GoHome.com.hk has been focused on providing value-added search experiences for the property-related industry and market in Hong Kong, Greater China and ASEAN. GoHome.com.hk was named "Property Portal of the Year" by Marketing Magazine in 2011, 2012 and 2013, "Best Property Developer Partner – Most Comprehensive Property Website" by Capital Magazine in 2013, 2014 and 2015, as well as "Outstanding Online Property Information Platform" at the Hong Kong Digital Brand Awards by Metro Broadcast Corporation Limited and CHKCI in 2017. About REA Group Limited REA Group Limited ACN 068 349 066 (ASX:REA) ("REA Group") is a multinational digital advertising business specialising in property. REA Group operates Australia's leading residential and commercial property websites, realestate.com.au and realcommercial.com.au, Chinese property site myfun.com and a number of property portals in Asia via its ownership of iProperty Group. REA Group also has a significant shareholding in US-based Move, Inc and PropTiger in India. Within Hong Kong, REA Group Asia operates GoHome.com.hk, squarefoot.com.hk and SMART Expo. The brands aim to provide consumers with extensive local and overseas property news, listings and investment opportunities while offering property and home-related advertisers a one-stop, multi-platform solution. SOURCE GoHome.com.hk, part of REA Group
Hong Kong online property platform GoHome.com.hk, part of REA Group, has overhauled its website to improve its customer experience. As well as featuring a new search result and property details page, the site now includes a property and serviced-apartment listings section, and automatically adapts to whatever device is being used to access it.
Two-thirds of Americans believe robots will soon take over the majority of tasks currently done by humans. Swedes, on the other hand, are not concerned about new technology. “No, I’m afraid of old technology,” the Swedish minister for employment and integration, Ylva Johansson, told the New York Times. “The jobs disappear, and then we train people for new jobs. We won’t protect jobs. But we will protect workers.” A recent survey by the European Commission found that 80 percent of Swedes have a positive view of robots and AI. Why such enthusiasm? Swedish citizens tend to trust that their government and the companies they work for will take care of them, and they see automation as a way to improve business efficiency. Since Swedish employees actually do benefit from increased profits by getting higher wages, a win for companies is a win for workers. As the Times points out, the American tendency to worry about robots’ replacing human workers is driven by the severe consequences of losing a job in the U.S. The risk of losing health insurance and a steady income makes people reluctant to leave jobs in favor of new career options or training. Sweden’s free health care, education, and job transition programs dampen the risk of such undertakings—which may be why people in the country are mostly happy to pay income tax rates of up to nearly 60 percent. The U.S., by contrast, provides almost none of these services. The difference is especially stark in the area of employment assistance: the U.S. spends only about 0.1 percent of GDP on programs designed to help people deal with changes in the workplace (see “The Relentless Pace of Automation”).
A majority of Swedish people have a positive view of the rise of robots and artificial intelligence, while the majority of Americans are concerned about it. The European Commission published a survey which found that 80% of respondents from Sweden had a positive view of such technology, higher than the European average of 61%, and lower only than Denmark (82%) and the Netherlands (81%). A separate survey from the Pew Research Centre in the United States found that 72% of adults there were worried about the technology.
In Illinois, researchers from University of Illinois at Urbana-Champaign are engineering sugarcane plants, called lipidcane, to produce more oil as well as sugar. Growing lipidcane containing 20 percent oil would be five times more profitable per acre than soybeans, the main feedstock currently used to make biodiesel in the United States, and twice as profitable per acre as corn, according to their research. They estimate that compared to soybeans, lipidcane containing 5 percent oil could produce four times more jet fuel per acre of land. Lipidcane with 20 percent oil could produce more than 15 times more jet fuel per acre. Researchers estimate that if 23 million acres in the southeastern United States was devoted to lipidcane with 20 percent oil, the crop could produce 65 percent of the U.S. jet fuel supply at a cost to airlines of US$5.31 per gallon, which is less than bio-jet fuel produced from algae or other oil crops.
Researchers at the University of Illinois at Urbana-Champaign are engineering sugarcane plants, dubbed lipidcane, to create a more cost-effective aviation biofuel feedstock than corn or soybeans. According to the team, lipidcane containing 20% oil would be twice as profitable per acre as corn and five times as profitable as soybeans, and could yield more than 15 times more jet fuel per acre. The researchers also estimated that 23 million acres of lipidcane could produce 65% of the jet fuel used to supply US airlines, at a cost of $5.31 per gallon, cheaper than biofuel produced from algae or other oil crops.
by Laurie Sullivan @lauriesullivan, January 2, 2018 Virtual assistants have marketers scrambling to figure out how to optimize content as companies like Amazon begin testing voice-triggered audio search advertisements. Reports surfaced Tuesday that Amazon has been speaking to consumer packaged goods companies such as Clorox and Procter & Gamble to develop advertisements. The CPG companies would promote their products on Echo devices powered by the Alexa voice assistant. Early discussions have centered on whether companies would pay for higher placement if a user searches for a product on the device, similar to how paid search works on Google, according to CNBC, which cited unnamed sources. This should not come as a surprise to marketers preparing to optimize content for voice searches. The ads are being described as what the industry refers to as sponsored content. For example, if someone asks Alexa how to clean up a spill, Amazon's voice assistant would respond to the query by naming a specific brand that paid for the sponsorship or bid a higher price to serve the voice advertisement. Advertisers are focused on search placement on Alexa and on other hubs because voice assistants typically only provide one answer per consumer query. Amazon has hinted at launching a voice-operated advertisement platform for sponsored products. And last week, reports surfaced that Amazon is testing several new advertising units and types. Another offering would allow brands to target users based on past shopping behavior or perhaps shopping behavior at the Whole Foods market. In May 2017, eMarketer estimated that the number of people in the U.S. using voice-enabled speakers would more than double to 36 million, with Amazon capturing nearly 71% of the market.
Amazon is testing audio advertisements for its voice-activated virtual assistant, Alexa. It is reported to have approached a number of companies to develop adverts that would be promoted on its Echo devices, with advertisers able to pay to optimise the placement of their products. The development would mean customers who ask for help with domestic problems could have branded products suggested to them or be played adverts for such products. The use of voice-activated virtual assistants in the US is predicted to grow significantly, making them attractive to advertisers as they often only offer one answer to a query.
Here’s some climate-change news that President Trump will have trouble ignoring: Earth’s junk food is in danger of losing a crucial ingredient. Scientists now predict that chocolate — which POTUS will sometimes eat to celebrate making important military decisions — could become impossible to grow in the coming decades because of hotter temperatures and less rain in regions where cacao plants are cultivated. The year 2050 is when they predict that people will be forced to satisfy a sweet tooth with toffee or caramels seasoned with tears. Like coffee plants and wine grapevines, cacao is a finicky tree. It only grows well in rain-forest land that’s within 20 degrees of the equator. Half of the world’s chocolate is produced in Côte d’Ivoire and Ghana, where the plants thrive at around 300 to 850 feet above sea level and under dependably humid weather conditions. But by 2050, researchers say that rising temperatures could push the optimal cultivation zone “uphill,” to as high as 1,500 feet. Thankfully, a team from UC Berkeley is working on a possible fix. It’s actually part of a new partnership that Mars announced last year. The M&M’s and Snickers maker is investing $1 billion into a variety of efforts to fight climate change, and the scientists in a plant-genomics program at Berkeley hope to develop hardier cacao plants that won’t wilt or rot at their current altitude. Berkeley’s gene-editing technology, called CRISPR, has been in the works for a while, though when it gets attention, it’s almost always for the potential to eliminate genetic diseases or (sort of on the extreme end of this) build “designer babies.” But creator Jennifer Doudna tells Business Insider that the “most profound” application will likely be saving food.
Chocolate could disappear by 2050 because of higher temperatures and less rainfall in the regions where cacao trees are grown, according to projections attributed to the US's National Oceanic and Atmospheric Administration. Cacao plants only grow well in rainforest regions near the equator, with half of the world's chocolate produced in Côte d’Ivoire and Ghana at around 300 to 850 ft above sea level. Rising temperatures could push the optimal growing zone uphill to as high as 1,500 ft, into mountainous terrain. UC Berkeley is working with Mars to develop hardier strains of cacao plant using CRISPR gene-editing technology.
California-based asset management firm Reality Shares Advisors announced on Wednesday that its advisory board now includes six blockchain and cryptocurrency executives. The members are:
- Erik Voorhees, founder of Coinapult and CEO of ShapeShift.
- Dr. Garrick Hileman, a research fellow at the University of Cambridge and researcher at the London School of Economics.
- Jeff Garzik, co-founder and CEO of Bloq, a blockchain enterprise software company.
- Matthew Roszak, founding partner of Tally Capital, a private investment firm focused on blockchain-enabled technology.
- Steve Beauregard, founder and former CEO of leading blockchain payment processor GoCoin.
- Derin Cag, founder of Richtopia and co-founder of Blockchain Age, a research center and digital data consultancy for blockchain technology.
While sharing more details about the board, Eric Ervin, CEO of Reality Shares, stated: “In recognizing the tremendous growth potential for blockchain technology while still in its infancy, this advisory board seeks to infuse our investment products and decisions with the knowledge and research of credible thought leaders in the space.” Ervin then added: “Our newly-formed advisory board is comprised of well-regarded influencers at the forefront of blockchain innovation who are deeply entrenched in the disruptive technologies and ideas propelling the distributed ledger and cryptocurrency revolution.” Founded in 2011, Reality Shares is described as an innovative asset management firm, ETF issuer, and index provider. The firm noted that its goal is to democratize the world’s best investing ideas, using systematic quantitative methods to deliver products and solutions that support a range of investing objectives, such as diversification, lower correlation, risk mitigation, or unique market exposures.
Asset management company Reality Shares Advisors has appointed six blockchain and cryptocurrency executives to its advisory board to "infuse its investment products and decisions with the knowledge and research of credible thought leaders", according to CEO Eric Ervin. The appointees include Jeff Garzik, the CEO of blockchain enterprise software company Bloq, Derin Cag, co-founder of research centre Blockchain Age, and Steve Beauregard, former CEO of payment processor GoCoin. Reality Shares Advisors focuses on ETF and index investments.
CubeSats, low-cost, bite-sized satellites inspired by the tubes used to hold Beanie Babies, were invented in 1999 as educational tools. Their creators — engineering professors Bob Twiggs and Jordi Puig-Suari — hoped building satellites the size and shape of Rubik’s Cubes would help students of all ages learn how to design and engineer efficiently. Now, aerospace suppliers and governments across the globe see the tools as the future of space commercialization and deep space exploration. They want to turn CubeSats into tools for low Earth orbit activities like telecommunications and reconnaissance. Companies like SpaceX, Virgin Galactic, Boeing and Airbus, for instance, want to create a space internet — a network of thousands of CubeSats that provide high speed broadband to remote parts of the world. And people like Paulo Lozano, director of the Space Propulsion Lab at the Massachusetts Institute of Technology, say sending the tiny satellites to asteroids could help improve space research (or even save the planet from an asteroid strike). “Instead of going to an asteroid every five, 10 years the traditional way, release a fleet of these tiny little CubeSats and visit 100 asteroids because it’s so cheap,” he said. “Because some of these asteroids, especially the very small ones, have the potential to collide with the Earth. Detecting them in time is important [for stopping them], but also knowing their composition.” Over the first decade of the CubeSat era, universities dominated the landscape, sending two of every three devices into space. Today, commercial companies and militaries have taken over, launching 70 percent of CubeSats in the last five years. But there’s still one big problem: CubeSats can’t move once they’re in space — which limits their survival to months or years and makes them dangerous. “One of the big limitations in CubeSats is that they are launched as secondary payload. Once they are in space, they cannot move,” Lozano said. Of the 750 or so CubeSats sent into space so far, almost all have lacked their own propulsion systems. The tiny satellites are transported alongside regular cargo, and then flung into space. But without their own rockets, the CubeSats cannot maneuver on their own. Most fall slowly back to Earth, but some remain in orbit for years, where they join the other 100 million pieces of space debris that are at risk of colliding with other satellites and space stations. The U.S. Air Force, whose Joint Space Operations Center monitors more than 23,000 orbiting objects larger than four inches in diameter, issues about 700,000 of these collision warnings to satellite owners per year. Imagine what would happen if thousands of CubeSats were added to the fray. What CubeSats need to stay in space are mini boosters, and scientists like Lozano are racing to build them. Lozano’s early work focused on big chemical rockets — the kind that you see strapped to space shuttles or on SpaceX missions. He knew these conventional rockets require huge fuel tanks — too big to be carried by CubeSats. Meanwhile, government standards limit how much chemical propellant can be carried by secondary cargo like CubeSats in order to prevent accidental explosions.
“You don’t have a lot of leeway in what you’re allowed to bring up because if your satellite blows up and you’re the secondary payload, the primary people are going to be really angry,” said Kristina Lemmer, a mechanical and aerospace engineer at Western Michigan University, who isn’t involved with Lozano’s research. So, Lozano needed an alternative. His inspiration: static electricity and tiny drops of salt water. Static electricity is caused by an imbalance between positive and negative charges in an object. Rub a balloon on your sweater, and its rubber surface becomes covered in negative charge (electrons). Place the balloon near your positively charged hair, and it tugs on the strands until you have a misshapen mohawk. Lozano’s team designed a set of mini thrusters that rely on the same principle. The devices create an electric field that tugs on the charged particles in salt water until they peel off. The result is a spray made of charged molecules called ions. This ion spray doesn’t create a lot of force. It’s always less than a millinewton, which is akin to the force produced when a mosquito lands on your arm. But the spray moves very fast, and even a small action creates a reaction in the frictionless vacuum of outer space. Use this to move ions in one direction, and a CubeSat will move uber fast in the other. Lozano said the best chemical rockets produce a fiery exhaust that moves at about 9,000 miles per hour. His electrospray thrusters can go more than 111,000 miles per hour, he said. The thrusters, which look like computer microchips, are the size of quarters. The chips contain a grid of 500 needles — each a custom-built nozzle for spewing ions. His team tests them inside a large vacuum chamber at their lab in Boston. “In an ideal situation, all of the ions would have the same energy, but the physics of these ion beams makes it so that some ions have less energy than others,” said Catherine Miller, an MIT doctoral student and NASA Space Technology Research Fellow. By studying how energy is distributed among the ion beams, she can calculate and standardize how each thruster will perform. Only three propulsion boosters for CubeSats have been successfully demonstrated in space, Lemmer said. Lozano’s system was one of them, through a partnership with the Aerospace Corporation in 2015. But Lemmer, who published a comprehensive review of CubeSat propulsion systems last year, said Lozano’s ion engines stand out because each one can produce so much thrust. “Dr. Lozano’s system is probably the frontrunner for the possibility for deep space missions,” Lemmer said. “In order to go interplanetary, you’re going to have to have an electric propulsion system because they are so much more efficient.” Folks like NASA have counted on the high efficiency of ion engines in the past, such as with the Dawn mission to the asteroids Vesta and Ceres. That journey would have been impossible without the Dawn’s high velocity ion engine. But the Dawn mission cost half a billion dollars. Commercial CubeSats can cost as little as $100,000 — and this price is dropping. Even children are building CubeSats at their elementary schools. While Lozano’s electrospray thrusters don’t work exactly like the Dawn’s ion engines, Lemmer said the advantages are the same.
You can carry less propellant — Lozano’s fuel tanks are the size of sugar cubes — but move more efficiently. What’s next? With another demonstration scheduled in early 2018, Lozano is dreaming big. He hopes his tiny thrusters can help CubeSats reach Mars or send them on asteroid scouting voyages. “Since they are so small, you can actually land on the asteroid with these rockets and take off again,” Lozano said. Though nobody over the last 1,000 years has died because of an asteroid strike, as far as anyone knows, the chances are still disturbing. Your odds of being killed by an asteroid in your lifetime — one in 700,000 — rival death by flood or earthquake in the U.S. By launching a fleet of CubeSats, scientists could learn the chemical compositions of asteroids, which could be the key to destroying or redirecting them. An asteroid made of silicon, for instance, would be much tougher to stop than one made of iron. Lemmer said CubeSats with propulsion could also provide a cheaper way to test new space technologies. “Right now, if you want to put a new technology on a NASA satellite, it’s years in the making to run sufficient tests,” Lemmer said. Instead, “if you launch a new technology on a CubeSat and show that it works in space without bad things happening, then you can more easily translate into a NASA mission down the road.”
Engineers are racing to create propulsion methods that would allow CubeSats, 10 cm-sided satellite cubes, to move independently and in large numbers through space, enabling cheaper exploration of asteroids and planets. Among the projects is Paulo Lozano's electrospray propulsion system, which uses electricity and tiny drops of salt water to produce exhaust speeds of more than 111,000 mph. The device from Lozano, who is the director of the Space Propulsion Lab at the Massachusetts Institute of Technology, will undergo another demonstration in early 2018 and is one of only three CubeSat propulsion systems to have been successfully tested in space.
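The article notes that the thrust is below a millinewton yet still useful, because in vacuum even a tiny continuous force accumulates. A quick back-of-envelope check makes that concrete; the thrust level and CubeSat mass below are illustrative assumptions (they are not figures from the article, and the 111,000 mph quoted above refers to exhaust velocity, not spacecraft speed).

# Back-of-envelope check (illustrative assumptions, not figures from the article):
# how much velocity a sub-millinewton electrospray thruster can add to a small CubeSat.
thrust_n = 1e-4          # 0.1 mN, i.e. "less than a millinewton" (assumed value)
cubesat_mass_kg = 4.0    # assumed mass of a typical 3U CubeSat (not stated in the article)
burn_days = 30

acceleration = thrust_n / cubesat_mass_kg            # m/s^2
delta_v = acceleration * burn_days * 24 * 3600       # m/s gained over a month of thrusting
print(f"acceleration: {acceleration:.2e} m/s^2")
print(f"delta-v after {burn_days} days: {delta_v:.0f} m/s (~{delta_v * 2.23694:.0f} mph)")

Under these assumed numbers the craft gains roughly 65 m/s over a month, which is ample for orbit maintenance or de-orbiting; longer burns, lighter craft, or higher thrust scale the figure proportionally.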
Approximately one hundred years ago, Erwin "Cannonball" Baker began driving cross-country, as quickly as possible, in anything he could get his hands on. His point: to demonstrate the reliability, range, and ease of refueling internal combustion cars. On Thursday, December 28th, 2017, Alex Roy joined Daniel Zorrilla, a Tesla Model 3 owner, to test the range and reliability of that vehicle—which happens to be one of the first delivered Model 3 customer cars. The pair departed the Portofino Inn in Redondo Beach, California; their final destination was the Red Ball garage in New York City. The two completed the cross-country drive in 50 hours and 16 minutes, setting a new electric Cannonball Run record.
Total time: 50 hours, 16 minutes, 32 seconds
Total mileage: 2,860 miles
Charging cost: $100.95
One of the first Tesla Model 3 customer cars has set a new record for the Cannonball Run in an electric vehicle. The car made the trip from Redondo Beach, California to New York City, a total journey mileage of 2,860 miles, in 50 hours and 16 minutes. The total charging cost for the duration of the Cannonball Run came in at $100.95.
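A couple of figures can be derived directly from the totals reported above; the arithmetic below is mine, not from the article, and ignores any non-charging costs.

# Derived figures from the reported totals (my arithmetic, not from the article).
total_miles = 2860
total_seconds = 50 * 3600 + 16 * 60 + 32
charging_cost_usd = 100.95

hours = total_seconds / 3600
print(f"average speed incl. charging stops: {total_miles / hours:.1f} mph")   # ~56.9 mph
print(f"charging cost per mile: ${charging_cost_usd / total_miles:.3f}")      # ~$0.035/mile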
Pioneering new technology is set to accelerate the global quest for crop improvement in a development which echoes the Green Revolution of the post-war period. The speed breeding platform developed by teams at the John Innes Centre, University of Queensland and University of Sydney uses a glasshouse or an artificial environment with enhanced lighting to create intense day-long regimes to speed up the search for better performing crops. Using the technique, the team has achieved wheat generation from seed to seed in just 8 weeks. These results appear today in Nature Plants. This means that it is now possible to grow as many as 6 generations of wheat every year -- a threefold increase on the shuttle-breeding techniques currently used by breeders and researchers. Dr Brande Wulff of the John Innes Centre, Norwich, a lead author on the paper, explains why speed is of the essence: "Globally, we face a huge challenge in breeding higher yielding and more resilient crops. Being able to cycle through more generations in less time will allow us to more rapidly create and test genetic combinations, looking for the best combinations for different environments." For many years the improvement rates of several staple crops have stalled, leading to a significant impediment in the quest to feed the growing global population and address the impacts of climate change. Speed breeding, says Dr Wulff, offers a potential new solution to a global challenge for the 21st century. "People said you may be able to cycle plants fast, but they will look tiny and insignificant, and only set a few seed. In fact, the new technology creates plants that look better and are healthier than those using standard conditions. One colleague could not believe it when he first saw the results." The exciting breakthrough has the potential to rank, in terms of impact, alongside the shuttle-breeding techniques introduced after the second world war as part of the green revolution. Dr Wulff goes on to say: "I would like to think that in 10 years from now you could walk into a field and point to plants whose attributes and traits were developed using this technology." This technique uses fully controlled growth environments and can also be scaled up to work in a standard glass house. It uses LED lights optimised to aid photosynthesis in intensive regimes of up to 22 hours per day. LED lights significantly reduce the cost compared to sodium vapour lamps, which have long been in widespread use but are ineffective because they generate much heat and emit poor quality light. The international team also proved that the speed breeding technique can be used for a range of important crops. They have achieved up to 6 generations per year for bread wheat, durum wheat, barley, pea, and chickpea; and four generations for canola (a form of rapeseed). This is a significant increase compared with widely used commercial breeding techniques. Speed breeding, when employed alongside conventional field-based techniques, can be an important tool to enable advances in understanding the genetics of crops. "Speed breeding as a platform can be combined with lots of other technologies such as CRISPR gene editing to get to the end result faster," explains Dr Lee Hickey from the University of Queensland. The study shows that traits such as plant pathogen interactions, plant shape and structure, and flowering time can be studied in detail and repeated using the technology.
The speed breeding technology has been welcomed by wheat breeders who have become early adopters. Ruth Bryant, Wheat Pathologist at RAGT Seeds Ltd, Essex, UK, said: "Breeders are always looking for ways to speed up the process of getting a variety to market so we are really interested in the concept of speed breeding. We are working closely with Dr Wulff's group at the John Innes Centre to develop this method in a commercial setting." Dr Allan Rattey, a wheat crop breeder with Australian company Dow AgroSciences, has used the technology to breed wheat with tolerance to pre-harvest sprouting (PHS) a major problem in Australia. "Environmental control for effective PHS screening and the long time taken to cycle through several cycles of recurrent selection were major bottle necks. The speed breeding and targeted selection platform have driven major gains for both of these areas of concerns."
UK and Australian scientists have developed a speed-breeding technique that can take wheat from seed to seed in just eight weeks, according to an article in Nature Plants. LED lights were used to create day-long regimes of up to 22 hours in fully controlled growth environments, allowing up to six generations of bread and durum wheat, as well as barley, pea and chickpea, to be grown each year. The research, by the John Innes Centre in the UK, the University of Queensland and the University of Sydney, is aimed at accelerating crop improvement to help feed the growing global population.
Workspace provider Regus has opened two new business centres in Liverpool, meaning that it now has three locations in the city. The centres are located at Merchants Court on Derby Square and at 1 Mann Island. Regus also has a site in Exchange Flags. Richard Morris, UK chief executive of Regus, said: "Demand for flexible workspace in Liverpool is booming so the city was a natural choice for our expansion plans. "The city is well-connected and offers excellent value and it's increasingly attracting investment and visitors from across the world. "We expect our new centres to be popular with a wide range of users including local small businesses, start-ups and remote workers as well as national firms opening satellite offices and global businesses establishing a footprint in the area."
Shared office provider Regus has added two new business centres to its portfolio in Liverpool. The new sites are located at Merchants Court, on Derby Square, and at 1 Mann Island, alongside the docks – both close to its Exchange Flags city-centre offices. Regus UK Chief Executive Richard Morris said the expansion is in response to growing demand for flexible office space in Liverpool, which is attracting investors from around the world.
Several Democratic-led states are looking to implement state-level individual mandates for insurance coverage in an effort to reduce the prominence of bare counties and failing risk pools resulting from the end of the Affordable Care Act’s individual mandate in 2019 and other instability surrounding the law. California, New York, Maryland, Connecticut, and Washington state are all considering pursuing state individual mandates for insurance coverage when their state legislatures come into session in early 2018 ,...
Following the Republican tax bill, several Democratic-led states are looking to implement state-level individual mandates to address bare counties and the prospect of failing risk pools. California, Connecticut, New York, Maryland and Washington state are all considering the move when their state legislatures come into session in early 2018. The states would likely attempt to implement a model similar to RomneyCare, introduced in Massachusetts in 2006, and the ACA’s individual mandate. Because the mandate penalty is a tax, these states do not require federal approval and can implement their own version of the Obamacare mandate. The move is unlikely to extend much beyond these states, as legislatures elsewhere often skew to the right and partisan pushback is more likely. California in particular is looking at a state individual mandate to counter federal-level uncertainty surrounding Obamacare; Maryland would also be likely to introduce one; Massachusetts could fall back on its original scheme; and Washington state's path looks the most complicated.
McDonald's UK has pledged to give its employees their biggest pay rise in ten years, Mirror Online can reveal. The move comes into force on January 22 this year and is banded by position, region, and age. Only company-owned McDonald's restaurants (about a quarter of branches in Britain) are affected. A staff member at a McDonald's branch in London shared with Mirror Online a company notice put up by management on Tuesday night. The employee, who falls into the 21-24 category and has asked to remain anonymous, said in a private Facebook post: "WE WON THIS. Biggest pay rise for 10 years! If 0.001% going on strike can win this imagine what more can do!" They told Mirror Online: "[We've been told] pay will be raised, with some crew over 25 even getting £10 an hour! "Everyone's pay has gone up. It's not loads, but it's a win! My pay was around £7.45 and now it will be £7.95. It's the biggest raise in ten years." McDonald's recommends starting rates to managers. For perspective, under-18s currently get around £5.10 per hour, while those over 25 usually start on £7.60. McDonald's has confirmed to us that the wages on the pamphlet are correct. Now, 16-17 year-olds will join on a minimum wage of £5.75, while crew over the age of 25 will initially receive £8 per hour. The decision comes after last year's strikes – a British first for McDonald's – that saw staff from two branches stage a 24-hour protest. Workers at a branch in Cambridge, and another in Crayford, south east London, made history on Monday September 4 after repeated claims of poor working conditions, zero-hour contracts, and low pay. Some staff talked about "extreme stress" and even "bullying". Cambridge restaurant crew member Steve Day, who took part in the strikes, told Mirror Online: "Obviously we welcome this. It's brilliant and a step in the right direction. And it's good McDonald's are finally listening to us. "But there's much more to be done. It's still not really enough money to live on. Wages have been stagnant for so long, and this is McDonald's just buckling under a bit of pressure. "When the CEO gets £8,000 per hour [according to Steve] we think we should maybe get a little more. The burgers and fries don't cook themselves – we keep him in a job." The 25-year-old, whose wage will rise from £7.65 to £8 per hour, suggested more could be achieved were a greater proportion of the workforce organised. "It shows what an impact a small number of us can have. A tiny number did this – tiny, but not insignificant. I think we can do more." Steve, who's originally from Yorkshire and has worked for the company for nearly six months, also told us that he works around 35-40 hours a week on a zero-hour contract, and would like to be given better job security. While today is a small victory, the 30 strikers initially wanted to see their [crew member] wages rise to £10 per hour from around £7.50. McDonald's management had earlier in the year promised to give permanent positions to workers on zero-hour contracts. It's not clear whether this has been implemented.
The fast food workers who took action were at the time represented by the Bakers, Food and Allied Workers Union (BFAWU). A representative for the union called the strike a "historic step", said it would give employees its full support, and noted that a ballot had previously seen 95.7 per cent vote in favour of striking. At the time, Lewis Baker, who then worked at the Crayford McDonald's, one of the restaurants at which workers took action, wrote a blog post explaining the strike. He said: "We have been left with no choice but to strike. It’s our only real option. We need to raise awareness over our working conditions and the way we are treated in McDonald’s. "I – like many others – have had [my] grievances ignored by the company, time and time again." Labour leader Jeremy Corbyn said: "Congratulations to McDonald's workers and @BFAWU1 for winning pay rises but the fight for £10 an hour is not over. "We achieve more together than we can alone, which is why we should all join a trade union." McDonald’s employs around 85,000 people in the UK. A spokesman told Mirror Online: "Reward and recognition for our people and their contribution is a key priority, and to ensure we can attract and retain the best people, we regularly review pay and benefits. "While our franchisees set their own pay rates, we have recommended an increase across all age bands for our hourly employees to be implemented from 22 January."
McDonald's UK will introduce its largest pay rise in a decade following the first-ever strikes in the country’s branches in September last year. The increases, which will be implemented from 22 January, mean that 16-17 year-olds will start working for the fast food company on a minimum wage of £5.75 ($7.77), up from £5.10. Those aged over 25 will receive an initial wage of £8 per hour, up from £7.60. The strikes, which took place in two branches in Cambridge and London, were called in protest at poor working conditions, low pay and the use of zero-hour contracts.
Not enough time for recovery Coral bleaching occurs when stressful conditions result in the expulsion of the algal partner from the coral. Before anthropogenic climate warming, such events were relatively rare, allowing for recovery of the reef between events. Hughes et al. looked at 100 reefs globally and found that the average interval between bleaching events is now less than half what it was before. Such narrow recovery windows do not allow for full recovery. Furthermore, warming events such as El Niño are warmer than previously, as are general ocean conditions. Such changes are likely to make it more and more difficult for reefs to recover between stressful events. Science, this issue p. 80 Abstract Tropical reef systems are transitioning to a new era in which the interval between recurrent bouts of coral bleaching is too short for a full recovery of mature assemblages. We analyzed bleaching records at 100 globally distributed reef locations from 1980 to 2016. The median return time between pairs of severe bleaching events has diminished steadily since 1980 and is now only 6 years. As global warming has progressed, tropical sea surface temperatures are warmer now during current La Niña conditions than they were during El Niño events three decades ago. Consequently, as we transition to the Anthropocene, coral bleaching is occurring more frequently in all El Niño–Southern Oscillation phases, increasing the likelihood of annual bleaching in the coming decades. The average surface temperature of Earth has risen by close to 1°C as of the 1880s (1), and global temperatures in 2015 and 2016 were the warmest since instrumental record keeping began in the 19th century (2). Recurrent regional-scale (>1000 km) bleaching and mortality of corals is a modern phenomenon caused by anthropogenic global warming (3–10). Bleaching before the 1980s was recorded only at a local scale of a few tens of kilometers because of small-scale stressors such as freshwater inundation, sedimentation, or unusually cold or hot weather (3–5). The modern emergence of regional-scale bleaching is also evident from the growth bands of old Caribbean corals: synchronous distortions of skeletal deposition (stress bands) along a 400-km stretch of the Mesoamerican Reef have only been found after recent hot conditions, confirming that regional-scale heat stress is a modern phenomenon caused by anthropogenic global warming (10). Bleaching occurs when the density of algal symbionts, or zooxanthellae (Symbiodinium spp.), in the tissues of a coral host diminishes as a result of environmental stress, revealing the underlying white skeleton of the coral (8). Bleached corals are physiologically and nutritionally compromised, and prolonged bleaching over several months leads to high levels of coral mortality (11, 12). Global climate modeling and satellite observations also indicate that the thermal conditions for coral bleaching are becoming more prevalent (13, 14), leading to predictions that localities now considered to be thermal refugia could disappear by midcentury (15). Although several global databases of bleaching records are available (notably ReefBase, reefbase.org), they suffer from intermittent or lapsed maintenance and from uneven sampling effort across both years and locations (7). The time spans of five earlier global studies of coral bleaching range from 1870 to 1990 (3), 1960 to 2002 (4), 1973 to 2006 (5), 1980 to 2005 (6), and 1985 to 2010 (7). 
Here we compiled de novo the history of recurrent bleaching from 1980 to 2016 for 100 globally distributed coral reef locations in 54 countries using a standardized protocol to examine patterns in the timing, recurrence, and intensity of bleaching episodes, including the latest global bleaching event from 2015 to 2016 (table S1). This approach avoids the bias of the continuous addition of new sites in open-access databases and retains the same range of spatial scales through time (fig. S1). A bleaching record in our analysis consists of three elements: the location, from 1 to 100; the year; and the binary presence or absence of bleaching. Our findings reveal that coral reefs have entered the distinctive human-dominated era characterized as the Anthropocene (16–18), in which the frequency and intensity of bleaching events is rapidly approaching unsustainable levels. At the spatial scale we examined (fig. S1), the number of years between recurrent severe bleaching events has diminished fivefold in the past four decades, from once every 25 to 30 years in the early 1980s to once every 5.9 years in 2016. Across the 100 locations, we scored 300 bleaching episodes as severe, i.e., >30% of corals bleached at a scale of tens to hundreds of kilometers, and a further 312 as moderate (<30% of corals bleached). Our analysis indicates that coral reefs have moved from a period before 1980 when regional-scale bleaching was exceedingly rare or absent (3–5) to an intermediary phase beginning in the 1980s when global warming increased the thermal stress of strong El Niño events, leading to global bleaching events. Finally, in the past two decades, many additional regional-scale bleaching events have also occurred outside of El Niño conditions, affecting more and more former spatial refuges and threatening the future viability of coral reefs. Increasingly, climate-driven bleaching is occurring in all El Niño–Southern Oscillation (ENSO) phases, because as global warming progresses, average tropical sea surface temperatures are warmer today under La Niña conditions than they were during El Niño events only three decades ago (Fig. 1). Since 1980, 58% of severe bleaching events have been recorded during four strong El Niño periods (1982–1983, 1997–1998, 2009–2010, and 2015–2016) (Fig. 2A), with the remaining 42% occurring during hot summers in other ENSO phases. Inevitably, the link between El Niño as the predominant trigger of mass bleaching (3–5) is diminishing as global warming continues (Fig. 1) and as summer temperature thresholds for bleaching are increasingly exceeded throughout all ENSO phases. Fig. 1 Global warming throughout ENSO cycles. Sea surface temperature anomalies from 1871 to 2016, relative to a 1961–1990 baseline, averaged across 1670 1° latitude–by–1° longitude boxes containing coral reefs between latitudes of 31°N and 31°S. Data points differentiate El Niño (red triangles), La Niña (blue triangles), and ENSO neutral periods (black squares). Ninety-five percent confidence intervals are shown for nonlinear regression fits for years with El Niño and La Niña conditions (red and blue shading, respectively; overlap is shown in purple). Fig. 2 Temporal patterns of recurrent coral bleaching. (A) Number of 100 pantropical locations that have bleached each year from 1980 to 2016. Black bars indicate severe bleaching affecting >30% of corals, and white bars depict moderate bleaching of <30% of corals. 
(B) Cumulative number of severe and total bleaching events since 1980 (red; right axis) and the depletion of locations that remain free of any bleaching or severe bleaching over time (blue; left axis). (C) Frequency distribution of the number of severe (black) and total bleaching events (red) per location. (D) Frequency distribution of return times (number of years) between successive severe bleaching events from 1980 to 1999 (white bars) and 2000 to 2016 (black bars). The 2015–2016 bleaching event affected 75% of the globally distributed locations we examined (Figs. 2A and 3) and is therefore comparable in scale to the then-unprecedented 1997–1998 event, when 74% of the same 100 locations bleached. In both periods, sea surface temperatures were the warmest on record in all major coral reef regions (2, 19). As the geographic footprint of recurrent bleaching spreads, fewer and fewer potential refuges from global warming remain untouched (Fig. 2B), and only 6 of the 100 locations we examined have escaped severe bleaching so far (Fig. 2B and table S1). This result is conservative because of type 2 errors (false negatives) in our analyses, where bleaching could have occurred but was not recorded. Fig. 3 The global extent of mass bleaching of corals in 2015 and 2016. Symbols show 100 reef locations that were assessed: red circles, severe bleaching affecting >30% of corals; orange circles, moderate bleaching affecting <30% of corals; and blue circles, no substantial bleaching recorded. See table S1 for further details. After the extreme bleaching recorded from 2015 to 2016, the median number of severe bleaching events experienced across our study locations since 1980 is now three (Fig. 2C). Eighty-eight percent of the locations that bleached from 1997 to 1998 have bleached severely at least once again. Since 1980, 31% of reef locations have experienced four or more (up to nine) severe bleaching events (Fig. 2C), as well as many moderate episodes (table S1). Globally, the annual risk of bleaching (both severe and more moderate events) has increased at a rate of approximately 3.9% per annum (fig. S2), from an expected 8% of locations in the early 1980s to 31% in 2016. Similarly, the annual risk of severe bleaching has also increased, at a slightly faster rate of 4.3% per annum, from an expected 4% of locations in the early 1980s to 17% in 2016 (fig. S2). This trend corresponds to a 4.6-fold reduction in estimated return times of severe events, from once every 27 years in the early 1980s to once every 5.9 years in 2016. Thirty-three percent of return times between recurrent severe bleaching events since 2000 have been just 1, 2, or 3 years (Fig. 2D). Our analysis also reveals strong geographic patterns in the timing, severity, and return times of mass bleaching (Fig. 4). The Western Atlantic, which has warmed earlier than elsewhere (13, 19), began to experience regular bleaching sooner, with an average of 4.1 events per location before 1998, compared with 0.4 to 1.6 in other regions (Fig. 4 and fig. S2). Furthermore, widespread bleaching (affecting >50% of locations) has now occurred seven times since 1980 in the Western Atlantic, compared to three times for both Australasia and the Indian Ocean, and only twice in the Pacific. Over the entire period, the number of bleaching events has been highest in the Western Atlantic, with an average of 10 events per location, two to three times more than in other regions (Fig. 4). Fig. 
4 Geographic variation in the timing and intensity of coral bleaching from 1980 to 2016. (A) Australasia (32 locations). (B) Indian Ocean (24 locations). (C) Pacific Ocean (22 locations). (D) Western Atlantic (22 locations). For each region, black bars indicate the percentage of locations that experienced severe bleaching, affecting >30% of corals. White bars indicate the percentage of locations per region with additional moderate bleaching affecting <30% of corals. In the 1980s, bleaching risk was highest in the Western Atlantic followed by the Pacific, with the Indian Ocean and Australasia having the lowest bleaching risk. However, bleaching risk increased most strongly over time in Australasia and the Middle East, at an intermediate rate in the Pacific, and slowly in the Western Atlantic (Fig. 4, fig. S3B, and tables S2 and S3). The return times between pairs of severe bleaching events are declining in all regions (fig. S3C), with the exception of the Western Atlantic, where most locations have escaped a major bleaching event from 2010 to 2016 (Fig. 2D). We tested the hypothesis that the number of bleaching events that have occurred so far at each location is positively related to the level of postindustrial warming of sea surface temperatures that has been experienced there (fig. S4). However, we found no significant relationship for any of the four geographic regions, consistent with each bleaching event being caused by a short-lived episode of extreme heat (12, 19, 20) that is superimposed on much smaller long-term warming trends. Hence, the long-term predictions of future average warming of sea surface temperatures (13) are also unlikely to provide an accurate projection of bleaching risk or the location of spatial refuges over the next century. In the coming years and decades, climate change will inevitably continue to increase the number of extreme heating events on coral reefs and further drive down the return times between them. Our analysis indicates that we are already approaching a scenario in which every hot summer, with or without an El Niño event, has the potential to cause bleaching and mortality at a regional scale. The time between recurrent events is increasingly too short to allow a full recovery of mature coral assemblages, which generally takes from 10 to 15 years for the fastest growing species and far longer for the full complement of life histories and morphologies of older assemblages (21–24). Areas that have so far escaped severe bleaching are likely to decline further in number (Fig. 2B), and the size of spatial refuges will diminish. These impacts are already underway, with an increase in average global temperature of close to 1°C. Hence, 1.5° or 2°C of warming above preindustrial conditions will inevitably contribute to further degradation of the world’s coral reefs (14). The future condition of reefs, and the ecosystem services they provide to people, will depend critically on the trajectory of global emissions and on our diminishing capacity to build resilience to recurrent high-frequency bleaching through management of local stressors (18) before the next bleaching event occurs. Supplementary Materials www.sciencemag.org/content/359/6371/80/suppl/DC1 Materials and Methods Figs. S1 to S4 Tables S1 to S3 References (25–29) http://www.sciencemag.org/about/science-licenses-journal-article-reuse This is an article distributed under the terms of the Science Journals Default License. 
Acknowledgments: Major funding for this research was provided by the Australian Research Council’s Centre of Excellence Program (CE140100020). The contents of this manuscript are solely the opinions of the authors and do not constitute a statement of policy, decision, or position on behalf of the National Oceanic and Atmospheric Administration or the U.S. government. Data reported in this paper are tabulated in the supplementary materials.
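For readers who want to see how return-time figures can be derived from records as simple as those described in the paper, here is a minimal sketch in Python. It assumes only the published data structure of (location, year, severe-or-not) records; the sample values below are invented for illustration and are not the study's data.

from statistics import median

# Each record is (location_id, year, severe): the three elements described in
# the paper's methods. The records below are hypothetical examples.
records = [
    (1, 1983, True), (1, 1998, True), (1, 2016, True),
    (2, 1998, True), (2, 2010, True), (2, 2016, True),
]

def severe_return_times(records):
    """Years elapsed between successive severe events at the same location."""
    years_by_location = {}
    for location, year, severe in records:
        if severe:
            years_by_location.setdefault(location, []).append(year)
    gaps = []
    for years in years_by_location.values():
        years.sort()
        gaps.extend(later - earlier for earlier, later in zip(years, years[1:]))
    return gaps

print("median return time (years):", median(severe_return_times(records)))

Applied to the full set of 100 locations and 37 years of records, the same kind of calculation is what yields the headline shift from return times of roughly 25 to 30 years to under 6 years.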
Tropical coral reefs across the world, on which millions of livelihoods depend and which are home to a third of all marine biodiversity, are under threat from repeated deadly bouts of warmer water, according to new research. The study of 100 reefs reveals that the interval between bleaching events, when unusually warm water causes coral to eject algae with often fatal consequences, has fallen from once in every 25-30 years in the 1980s, to once in every six years. The researchers have called for greater efforts to reduce the emissions of greenhouse gases to combat the warming.
NASA's Flight Opportunities program is already flying experiments on Blue Origin's New Shepard vehicle, but researchers and companies alike want NASA to also fund experiments with people on board. BROOMFIELD, Colo. — As commercial suborbital vehicles capable of carrying both payloads and people prepare to enter service, NASA officials say they're willing to consider allowing agency-funded researchers to fly on those vehicles. In an interview after a speech at the Next-Generation Suborbital Researchers Conference here Dec. 19, Steve Jurczyk, NASA associate administrator for space technology, said the agency would be open to allowing researchers funded by NASA's Flight Opportunities program to fly on suborbital spacecraft to carry out their experiments. "As principal investigators propose, both internal to NASA and external, we'll do the same kind of process that we do with Zero G," he said, referring to the company that performs parabolic aircraft flights. Zero G flies investigations as part of the Flight Opportunities program, with researchers flying on the aircraft with their experiments. Zero G's aircraft, a Boeing 727, is regulated by the Federal Aviation Administration. Jurczyk said that, in addition to the FAA oversight, NASA's Armstrong Flight Research Center performs an evaluation of the aircraft for investigations selected by the Flight Opportunities program for flights on it. "It just ensures that our grantees and contractors are safe to fly, and then we allow them to go fly," he said in a speech at the conference. A similar procedure is not yet in place for suborbital vehicles, but Jurczyk said the agency would be open to finding some process analogous to that used for Zero G. "Moving forward, as these capabilities start coming online, we'll figure it out," he said in the interview. His comments come four and a half years after another agency official opened the door to flying people on commercial suborbital vehicles through the Flight Opportunities program. Speaking at the same conference in June 2013, Lori Garver, NASA deputy administrator at the time, said that past prohibitions about flying people would be lifted. "We absolutely do not want to rule out paying for research that could be done by an individual spaceflight participant — a researcher or payload specialist — on these vehicles in the future," Garver said then. "That could open up a lot more opportunities." That announcement took the program by surprise, with the program's managers saying at the time they had yet to craft a policy for allowing people to fly with their experiments. Development of such a policy suffered years of delays, in part because of Garver's departure from NASA just a few months after her announcement as well as extended delays in the development of commercial suborbital vehicles capable of carrying people. "It mostly resulted in a bunch of ostriches sticking heads in the sand for a few years," said Erika Wagner, business development manager at Blue Origin, during a panel discussion at the conference Dec. 18. Blue Origin's New Shepard vehicle is already carrying research payloads, including for Flight Opportunities, but without people on board. However, the vehicle will be able to support missions carrying payloads and people in the future. Virgin Galactic's SpaceShipTwo vehicle will also fly research payloads accompanied by a payload specialist. 
Wagner said she has seen some progress as both companies' vehicles advance through flight testing. "The heads are back out. They're looking around trying to understand what really are the barriers, what is the liability regime." Those liability issues today, she said, prevent NASA civil servants from flying on the Zero G aircraft, even though outside researchers whose experiments are funded by NASA are able to do so. Jurczyk, in his speech at the conference, said that's because they would have to sign a liability waiver to do so. "Right now, that's just NASA policy. We don't have a strong mission need to do that," he said. "That's current policy. I'm not saying it's going to be policy forever and ever." Scientists who would like to fly experiments on suborbital vehicles argue that such missions are analogous to fieldwork — oftentimes hazardous — performed in other fields. "Marine biologists and marine geologists get to put themselves in that very same operationally risky environment by going to the bottom of the ocean, to a deep sea vent," said Dan Durda, a planetary scientist at the Southwest Research Institute, during the Dec. 18 panel. "These vehicles offer us, as space scientists, that opportunity to get into the field the way that biologists and geologists do." Advocates of commercial suborbital research, such as the Commercial Spaceflight Federation's Suborbital Applications Research Group, have been pushing to allow NASA to fund human-tended experiments. "They're working quietly to get the word out that there are very definite needs for human-tended payloads," said Steven Collicott, a Purdue University professor, in a conference speech Dec. 19. "We've heard some encouraging words and we're working quietly to try and move that ahead." Others at the conference noted a decades-old precedent that suggests existing barriers to flying NASA-funded researchers on commercial suborbital vehicles can be overcome. In the 1980s, several payload specialists flew on the space shuttle, including Charles Walker, a McDonnell Douglas engineer who was part of three shuttle missions. Walker, in the Dec. 18 panel discussion, noted that on those shuttle missions he and his family signed liability waivers. He supported similar approaches to allow researchers to fly on commercial suborbital vehicles. "The environments opened up by suborbital flight and, at a greater scale, orbital flight, are laboratory environments," he said. "You should be there to maximize the answers that are coming out of the conduct in that environment." This story was provided by SpaceNews, dedicated to covering all aspects of the space industry.
NASA is looking at allowing researchers from the agency on board commercial suborbital flights, according to Steve Jurczyk, NASA associate administrator for space technology. Blue Origin's New Shepard craft and Virgin Galactic’s SpaceShipTwo are among the vehicles that could carry researchers on suborbital flights. Liability issues on the relatively dangerous missions have so far deterred NASA from allowing staff aboard them.
People use chatbots to find homes, interact with their favorite brands, and schedule appointments. Many consumers are onboard with using chatbots to gather instant, personalized information. In many cases, chatbots are the first point of contact for individuals who feel unwell and need to decide whether to head to the doctor. As this technology becomes more prominent, people understandably begin to wonder if insurance companies will cover sessions with chatbot doctors. Given their innovative use of chatbots in the health care sector, it's looking like insurers and health organizations in the U.K. could be the first to establish insurance coverage for health consultations with chatbots. How do chatbot doctors work? People who have yet to interact with a chatbot doctor might wonder how they work. Could a bot know as much as physicians who have completed years of medical school and relevant work experience? Sometimes, the chatbot makes a preliminary assessment about a person's health depending on the responses the individual gives to targeted questions. One such chatbot called Ada is available to residents in the United Kingdom. The assessment is free, and the bot avoids providing a set-in-stone diagnosis. Ada uses artificial intelligence to get continually smarter with ongoing use. The developers know Ada won't replace doctors but hope the bot will help more patients understand what their symptoms might mean. However, a person can also supplement the assessment portion by talking to an actual doctor. That option is offered for a fee and includes receiving a prescription if needed. Individuals frequently head to sites such as WebMD and end up with a questionable self-diagnosis. This is why it makes sense that insurers would be open to the idea of paying for patient interactions that begin with chatbots. A bot's ability to personalize its conversations with patients could theoretically yield a smaller margin of error and be less likely to provide misleading information. Reducing emergency room visits The U.K.'s National Health Service (NHS) is also trying out a chatbot that asks people a series of questions when they dial the nationwide emergency number to indicate whether or not they need an ambulance. Representatives hope the service will reduce the number of people dispatchers send to emergency rooms. When too many callers are sent to the emergency room, patients wait in hospitals for several hours or even longer before receiving treatment. Those in favor of chatbots say the technology could keep emergency room visits at more manageable levels. In contrast, critics assert that the NHS previously used a symptom checker app that made highly publicized and dangerous blunders when advising some patients, and they think the same problems could occur with the chatbot. The NHS provides free health care to legal residents of the United Kingdom that covers most needs, including emergency care and visits with general practitioners. However, the NHS assists over 1 million patients in England every 36 hours. People who rely on the NHS for health care often deal with long waits. To compensate, those who can afford them often subscribe to private insurance plans. It will be interesting to gauge the outcome of this NHS program, especially considering how many people rely on the NHS. 
If things go well, the positive result could prompt insurance companies to accept claims from customers who receive advice from chatbots regarding their well-being. How much are chatbots worth? It remains to be seen if insurance companies in the United Kingdom and elsewhere will cover chatbot doctors. The evidence to suggest their willingness — or lack thereof — to do so is not available yet because the technology is too new. One reason insurers might balk at the idea of paying for this kind of coverage is the need to determine the fair market value of compensation for such services. In some areas, health facilities use telehealth providers to reduce the need for on-call personnel. However, the relevant valuators in those locations must remain aware of federal and state laws surrounding telemedicine. If they fail to do so, over- or undercompensation could occur during the billing and collection processes. Some telemedicine laws have not been in place for very long. Not surprisingly, many locations have not even considered chatbots in the equation. Helping people understand their coverage Although insurance companies haven’t made it clear whether they’re completely onboard with covering chatbots, it seems promising that some are already using chatbots to help customers achieve a higher level of understanding about their coverage packages. For example, Now Health International is a company that provides health insurance for expatriates. The headquarters is in Hong Kong, and the establishment has other branches in Asia and the United Kingdom. This summer, the insurance provider launched a chatbot through Facebook Messenger. Regardless of whether users are existing customers or are only thinking about purchasing coverage packages, they can use the chatbot to find physicians in the Now Health International network or get questions answered about filing claims and receiving quotes. The chatbot can also recognize keywords entered by a user into the chat window. When it picks up on those words, the technology automatically directs the person to the proper area of the website for further information. The increasingly widespread use of chatbots for health — including those provided by insurance companies — indicates that some insurers are laying the necessary foundation for covering chatbots in the future. However, it’s likely that before that happens, we must pass legislation clarifying the valuation-related questions that could arise during claims, billing, and other aspects. As consumers become more comfortable with using chatbots to ask questions about their health or insurance, the increased prevalence of the technology in the marketplace could push officials to iron out the details. That would pave the way for insurance companies to clearly mention they accept claims related to treatment that involves chatbot doctors on their websites or insurance documents. Insurers may also stipulate that they will not offer coverage to patients who did not speak to actual doctors during their chatbot-driven conversations and only used the chatbots for preliminary advice. Chatbots could drastically change how people manage their health care needs. Similarly, they could alter how doctors treat patients and how insurers handle the claims. Since the technology is in its infancy, however, it’s only possible to perform research and make educated speculations. Kayla Matthews is senior writer for MakeUseOf. Her work has also appeared on Vice, The Next Web, The Week, and TechnoBuffalo.
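The keyword-recognition behaviour described for Now Health International's Messenger bot amounts to simple intent routing. The sketch below is a toy illustration of that idea in Python, not the company's implementation; the keywords and destination URLs are invented.

# Toy sketch of keyword-based routing in an insurance chatbot.
# Keywords and destination pages are hypothetical examples.
ROUTES = {
    "claim": "https://example.com/claims",
    "quote": "https://example.com/quotes",
    "doctor": "https://example.com/provider-search",
}

def route_message(message: str) -> str:
    """Return the page a user should be directed to, based on keywords."""
    text = message.lower()
    for keyword, url in ROUTES.items():
        if keyword in text:
            return url
    return "https://example.com/help"  # fall back to a general help page

print(route_message("How do I file a claim for my last visit?"))

Production systems replace the keyword lookup with trained intent classifiers, but the routing pattern, detect what the user wants and hand them to the right resource, is the same.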
Machine learning chatbots – such as Ada, which helps UK residents make preliminary assessments about their health – could lead to insurance companies offering coverage for virtual consultations. Industry insiders have suggested that a chatbot's ability to personalise consultations makes them less likely to make mistakes, leading to less risk for insurers. The country's National Health Service is using a chatbot assessment service in an attempt to reduce A&E patient numbers.
The return to work after Christmas is never easy. Unless you’re an estate agent: they love January. Following the pre-Christmas lull, families rush back into wanting to buy and sell their houses (helped in part by the traditional post-festivity spike in family breakdown). But for an increasing number of us, house hunting is becoming little more than an exercise in window shopping (or ‘property porn’ if you’d rather). The share of the population owning a home has been falling since 2003, with particularly profound consequences for younger families. As the chart below shows, today’s 30 year olds (that is, the oldest members of the millennial generation born between 1981 and 2000) are only half as likely to own their house as their parents were at the same age. Like so much of the Christmas TV schedule, this is a story that’s been on repeat for some time. Britons get that their country is no longer a nation of home owners. As research carried out by Ipsos MORI back in the summer for the Intergenerational Commission showed, 71 per cent of people (across all generations) think millennials face worse prospects than their parents in this regard. Just 7 per cent think young adults are better off. Indeed, of all of the questions asked in the survey, it was the one on which respondents were most pessimistic. Yet, despite being pessimistic about the overall picture for millennials, new data shows that a significant share of the generation think they personally will manage to beat the gloom. The next chart takes data from the Bank of England’s latest NMG survey to show that more than half (52 per cent) of non-owning households headed by someone aged under-35 (roughly speaking, the millennial generation) expect to buy at some point in the future. And that proportion holds up even among lower income millennials. If such expectations were borne out, around 75 per cent of millennial households would eventually own a home. That would put the generation on a par with the home ownership rates recorded among baby boomers. It would also be roughly 10 percentage points higher than the ‘optimistic’ scenario we set out in September (our ‘pessimistic’ scenario put the figure under 50 per cent). Short of a significant turnaround in housing trends, the implication is that many members of the younger generation will find their aspirations go unmet. And, while the one-in-four (24 per cent) non-owning millennials who think they’ll never buy a home might have a more realistic outlook of the future, they’re just as likely to be unhappy with their lot. The next chart sets out the factors which this group identify as being among the three most important reasons for not owning. What stands out is that just one-in-ten of them cite positive-sounding reasons: 10 per cent say they like their current home and just 8 per cent prefer the flexibility of renting. The upshot is that as few as 1 per cent of millennials appear to be happy with the idea of never owning a home. It’s this finding that goes a long way to explaining why politicians are so keen to be seen to be offering hope on home ownership. And, with ‘purchase costs’ (such as the deposit, stamp duty and estate agents fees) being cited by millennials as the main barrier to owning, it’s easy to understand the temptation to focus on subsidising buyers. Measures such as the removal of stamp duty for first time buyers of property worth up to £300,000 – which Philip Hammond announced in the Autumn Budget – give the impression of extending home ownership to a wider group. 
But they largely miss the mark. The OBR's assessment of the stamp duty policy was that it would benefit just 3,500 first time buyers who would not otherwise have been able to buy a home, costing roughly £160,000 per additional owner. Supply-based approaches represent a preferable and more sustainable option, but they take time to take effect. That's not to say government should give up, and the Autumn Budget plan for returning housing capital spending to the levels of the 2000s (outside of the fiscal stimulus peak of 2008-10) is a very welcome one. But it's hard to escape the conclusion that, even if we get to grips with the longer-term problem, home ownership will remain off-limits for significant numbers of millennials. Some might expect to benefit from the bank of mum and dad in the near-term and from inheritances as they age. But such support may come too late to cover expensive family-rearing years for many households, and will never arrive for many – mainly lower income – others. That reality raises a number of challenges for today's young people. Over the longer-term, home ownership plays an important role in building wealth (via semi-enforced saving), providing leverage and hedging against costs and location in retirement. In its absence, alternatives are needed. More immediately, the generally higher housing costs associated with renting leave young people with less disposable income and less opportunity to save than earlier cohorts faced. The chart below sets out the share of income allocated to rent among younger respondents to the NMG survey. It shows that 30 per cent of renters in this group spend more than one-third of their pre-tax income on rent – a threshold that is often taken as a sign of housing unaffordability. And that figure jumps to a massive 71 per cent among the poorest fifth of millennials. We'll turn to the question of how the country might rise to these challenges in a forthcoming policy options paper for the Intergenerational Commission. But our politicians – unlike our estate agents – need to be more honest about the housing aspiration gap. It's good to offer hope, but a healthy dose of realism would sharpen the focus on the broader living standards challenge posed by our housing crisis.
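As a rough cross-check on the OBR figures quoted above (and nothing more than that), multiplying the two numbers gives the scale of spending the article implies is attributable to the genuinely additional buyers:

# Back-of-the-envelope check using only the figures quoted above.
additional_buyers = 3_500              # first-time buyers who would not otherwise buy
cost_per_additional_owner = 160_000    # pounds, per the OBR assessment as cited
print(f"implied total cost: £{additional_buyers * cost_per_additional_owner:,}")
# -> implied total cost: £560,000,000 of relief for 3,500 extra owners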
A majority of millennials anticipate they will own a home in the future despite being pessimistic about their overall prospects. A recent survey commissioned by the Bank of England found that 52% of under-35s expect to buy a home at some point in the future, 24% expect never to buy and 25% are unsure. Purchase costs, such as deposits, estate agency fees and stamp duty, were cited as the biggest barriers to home ownership. An earlier Ipsos MORI poll found 71% of respondents thought millennials faced worse prospects than their parents of owning a home. 
With publishers realizing that they can no longer be wholly dependent on ads for their revenue, Purch is getting more serious about selling proprietary technology to other publishers. Purch — a commerce-focused publisher that owns tech and product review sites such as Tom’s Guide, Top Ten Reviews and Live Science — is profitable. It makes about $120 million a year in revenue, with about 20 percent coming from ad tech products that it licenses to 25 publisher clients, said Purch CRO Mike Kisseberth. Over the next year, the company plans to grow its number of publisher clients to roughly 40, and have its tech licensing operation account for about 25 percent of its overall revenue, he said. Purch began developing its own ad platform nearly four years ago. What’s changed is that the company has gotten more serious about licensing it to other publishers. In December, Purch broke out its tech licensing business into a separate unit, called Purch Publisher Services. About 60 of Purch’s 400 employees work on Purch’s tech platform at least part of the time, and 10 work on it full-time, Kisseberth said. These employees are made up of salespeople, engineers, data scientists and support specialists. By licensing software, Purch is aiming to build a revenue stream in an area that most publishers have avoided. This is because most publishers don’t have the resources or patience to build their own ad tech, let alone build tech that can be licensed to other media companies. One exception is The Washington Post, which calls itself a tech company and sells ad products to other publishers. But the Post is an outlier due to the fact that its owner, Amazon CEO Jeff Bezos, is a tech enthusiast who happens to be the richest person in the world. What has driven the growth of Purch’s tech licensing business is that it was an early adopter of server-side bidding. Unlike on-page header bidding — where publishers simultaneously offer inventory to multiple exchanges before making calls to their ad servers — going server-to-server speeds up page-load times since the ad calls are hosted on publishers’ servers and not on users’ browsers. For over a year, Purch has sold all of its programmatic inventory through server-to-server connections. The benefits of server-to-server might sound enticing, but as publisher tech teams are typically stretched thin, few publishers have shifted over to selling the bulk of their programmatic inventory this way. This is where Purch pitches itself as a vendor. Purch’s server-side solution operates on a revenue share, but Kisseberth wouldn’t disclose monetary terms. It is a managed-service product where Purch takes control of the setup and maintenance, ad ops and relationships with the 30 supply-side platforms that are plugged into the product. Purch last summer tested a self-service bidding product with some of its clients but found that it required more tech and service support than was worth it. As self-service gains steam within the ad tech industry, Purch is open to shifting its products to be self-service in the future, but that likely won’t happen in 2018, Kisseberth said. Other tech products that Purch sells publishers include commerce management and CRM platforms. But these products are more geared toward B2B publishers. Purch’s programmatic bidding product is the main driver of its licensing business. Purch doesn’t limit itself to selling its tech to non-competitors; its publisher clients include tech sites like VentureBeat, Mobile Nations and How-To-Geek. 
Purch figures that selling products to other tech sites, as long as they're brand-safe, can bolster the reputation of Purch's ad tech with buyers and in turn help Purch's case when it comes to setting up private marketplace deals, where the ad rates tend to be much higher than on the open exchange. Kisseberth emphasized that Purch isn't looking to simply grow an audience extension business. If a publisher already builds its own ad tech or runs its auctions server-to-server, then it's likely not a fit as a client. Unlike the Washington Post, which licenses self-service products to a wide swath of publishers, Purch is focusing on selling its products to niche sites that want another publisher to control and scale their programmatic selling. "This is not a huge land grab where we are signing thousands of publishers," he said. "It is signing publishers who we think are good additions to our portfolio."
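The server-to-server setup described above can be pictured in a few lines of code: instead of the browser calling each exchange, a server fans the bid request out to the supply-side platforms and returns only the winner to the page. The following Python snippet is a simplified illustration of that pattern, not Purch's actual platform; the SSP endpoints and the fake CPM values are invented.

# Simplified sketch of server-side header bidding: the server queries several
# supply-side platforms (SSPs) concurrently and returns the highest bid to the
# page. Endpoints and payloads are hypothetical stand-ins.
import asyncio

SSP_ENDPOINTS = ["ssp-a.example", "ssp-b.example", "ssp-c.example"]

async def request_bid(ssp: str, ad_slot: str) -> dict:
    """Stand-in for an HTTP bid request to one SSP; returns a pseudo-random CPM."""
    await asyncio.sleep(0.05)  # simulate network latency on the server side
    return {"ssp": ssp, "slot": ad_slot, "cpm": hash((ssp, ad_slot)) % 500 / 100}

async def run_auction(ad_slot: str) -> dict:
    """Fan out to all SSPs in parallel and pick the highest CPM."""
    bids = await asyncio.gather(*(request_bid(s, ad_slot) for s in SSP_ENDPOINTS))
    return max(bids, key=lambda b: b["cpm"])

if __name__ == "__main__":
    winner = asyncio.run(run_auction("homepage_leaderboard"))
    print("winning bid:", winner)

Because all of this happens on the server, the user's browser makes one call instead of one per exchange, which is where the page-load benefit comes from.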
Profitable Utah-based publisher Purch is building up a healthy business from selling licences for its ad tech products to its peers. CRO Mike Kisseberth said the company's managed-service product, which relies on server-side connections, is responsible for 20% of the company's $120m annual revenue. The now separate unit, Purch Publisher Services, plans to increase that to a quarter. Kisseberth said Purch may look to self-serve products in the future.
European Commission gives green light for dairy cattle phosphate system The European Commission has agreed to the introduction of phosphate rights in Dutch dairy farming. This system and the legislation in which it is enshrined satisfy the applicable guidelines for state aid, European Commissioner for Competition Margrethe Vestager reported to the Minister of Agriculture, Nature and Food Quality Carola Schouten. The House of Representatives and the Senate previously assented to the phosphate system. This system will ensure that the quantity of phosphate produced by cattle as a constituent of manure is kept below the European maximum. The system is set to come into force on 1 January 2018. Schouten: "Now that Brussels has confirmed that the legislation does not constitute state aid, we can be certain that the phosphate rights system will go ahead. Moreover, this decision is also key for the purpose of obtaining a new derogation, a special exemption on the basis of which the Netherlands will be entitled to use more animal manure. Dairy farmers are waiting anxiously for this decision. I can now say to the entire sector: 'We're not there yet, but this achievement is already significant'". From 1 January 2018, dairy farms will be allocated an amount of phosphate rights based on the number of cattle kept as at 2 July 2015 (the date on which the system was announced), less the previously announced generic reduction of 8.3%. Land-based farms with plenty of land in proportion to the number of cattle are exempt from this reduction, which is necessary to keep phosphate production below the European maximum. The phosphate rights are tradable. Farmers wishing to keep more cattle will have to purchase rights to this end from dairy farmers who are reducing their livestock or terminating their company. The system of phosphate rights follows the Phosphate Reduction Scheme, which saw trade associations and the Government agree to curb phosphate production in 2017. Considerable reductions in livestock have already been made over the past year through this plan. The most recent figures (October 2017) indicate that the Netherlands is on course with its ambition to reduce phosphate production below the national ceiling again by the end of this year. The Netherlands is striving to secure a decision from the European Commission granting a new derogation for the 2018–2021 period around April 2018. In this regard, it is important for phosphate production to be brought back below the European maximum by the end of 2017 and for the phosphate system to come into force on 1 January 2018. It is also imperative that agreement has been reached on the Sixth Action Programme of the EU Nitrates Directive. In the unfortunate event that the new derogation is not granted, farmers would be forced to incur additional expenses such as for the responsible disposal of manure and for the supply of extra fertiliser.
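Based only on the allocation rule described above — rights tied to the herd kept on 2 July 2015, an 8.3% generic reduction, and an exemption for land-based farms — a farm's starting allocation could be sketched as follows. This is an illustrative Python sketch, and the per-cow phosphate figure is a placeholder assumption, not a number from the scheme, which uses official excretion norms.

# Sketch of the initial phosphate-rights allocation described above.
GENERIC_REDUCTION = 0.083        # 8.3% cut announced for non-land-based farms
PHOSPHATE_PER_COW_KG = 40.0      # placeholder kg of phosphate per dairy cow per year

def initial_rights(cows_on_2_july_2015: int, land_based: bool) -> float:
    """Phosphate rights (kg) allocated to a farm when the system starts."""
    base = cows_on_2_july_2015 * PHOSPHATE_PER_COW_KG
    return base if land_based else base * (1 - GENERIC_REDUCTION)

print(initial_rights(100, land_based=False))  # reduction applied -> 3668.0 kg
print(initial_rights(100, land_based=True))   # exempt land-based farm -> 4000.0 kg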
The European Commission has given the go-ahead to a trading system for phosphate rights for dairy cattle in the Netherlands, aimed at improving the country's water quality by limiting phosphate production from dairy cattle manure and encouraging a move to land-based farming. Dairy farmers will receive phosphate rights for free and will be obligated each year to prove they have sufficient rights to justify the quantity of phosphate produced by their manure. Phosphate rights can be obtained on the market, with 10% of the traded rights held back to promote the development of more land-based dairy farming.
China aims to build a database covering everyone eligible for social insurance, in order to implement targeted beneficiary measures, according to the Ministry of Human Resources and Social Security. Officials will use the latest technology, including big data, to reach those without social insurance and to ensure universal coverage by 2020, according to the ministry's Social Insurance Management Center. China's social insurance system is the largest in the world. About 900 million people are included in the endowment insurance system, and more than 1.3 billion are covered by medical insurance, according to a statement from the center provided exclusively to China Daily.
China's Ministry of Human Resources and Social Security will use big data analytics to ensure 100 million people access the country's social endowment insurance system, and aims to achieve universal coverage by 2020. The ministry's Social Insurance Management Centre plans to reach out to 10% of the population who are not part of the system, including immigrant workers or those in new forms of industry such as e-commerce, using internet platforms and mobile terminals. In 2014, the government launched a four-year campaign to register all eligible people for social insurance into a national database.
Revenues earned by the robo-advice industry could shoot up to $25bn by 2022, more than ten times what revenues were worth in 2017, new research shows. The findings were revealed in a study by Juniper Research. The research predicts that revenues generated by the robo-advice industry will reach as high as $25bn by 2022, up from an estimated $1.7bn in 2017, thanks to automated wealth management services. Robo-advisers will make investing accessible to a wider segment of high net worth individuals (HNWIs) and to lower-income individuals, charging as little as 0.6% of assets under management (AUM), Juniper said. This is due to new disruptive fintechs such as Moneybox and Nutmeg. Juniper also said robo-advisers are making the investment process far more convenient by changing their delivery methods. They are specifically targeting smartphone apps, offering millennials more compelling reasons to invest. According to Juniper this would drive AUM held by robo-advisers to about $4.1trn by 2022, up from $330bn in 2017. Nick Maynard, who authored the research, said: "The technologies powering robo-advisers will mature to such an extent that they move from their current human supervised role to being utilised in a fully automated way. This will be aided by track records of performance automated robo-adviser systems are establishing." The implementation of robo-advice is not restricted to new participants, with even traditional players inching towards the service. BlackRock, currently the world's largest asset manager, and Aberdeen Asset Management have partnered with robo-advisory startups. Juniper said: "The appeal of these [robo-advice] technologies is clear to established players, as automated systems even in a limited role will enable significant cost reductions and therefore increase overall quality of service and profitability [for traditional players]."
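Juniper's revenue and assets-under-management projections are roughly consistent with each other, as a quick back-of-the-envelope check (ours, not Juniper's) shows:

# Back-of-the-envelope check on the figures quoted above.
projected_aum_2022 = 4.1e12     # $4.1 trillion of assets under management
fee_rate = 0.006                # the 0.6% of AUM quoted as the low end of fees
print(f"implied revenue: ${projected_aum_2022 * fee_rate / 1e9:.1f}bn")
# -> implied revenue: $24.6bn, broadly in line with the ~$25bn projection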
Revenues generated by robo-advice platforms could hit $25bn by 2022, according to a study from Juniper research. The figures would represent more than a ten-fold increase on 2017 robo-advice revenues, which accounted for roughly $1.7bn. The study also estimates assets under management on robo-advice platforms could rise to $4.1tn by 2022, up from $330bn in 2017.
Depending on whom you ask, WeWork is either a brilliant company that is re-imagining office space and the modern workplace, or a glorified, overvalued real estate play with no sustainable competitive advantage. One thing is clear: The start-up is making waves in the business world and real estate market as it now commands a valuation of $21 billion, up from $18 billion earlier last year, just eight years after its founding in 2010. It's received investments from the likes of Softbank, Fidelity Investments, and JPMorgan Chase along the way. Last year was a big one for the co-working specialist as the company made a number of acquisitions, taking over Meetup, The Flatiron School, and Spacemob, among others, and making a splashy real estate play with its $850 million purchase of the Lord & Taylor flagship building on Manhattan's Fifth Avenue, which will become the company's new headquarters. With WeWork's valuation north of $20 billion, revenue on track for over $1 billion last year, and a pedigree as a disruptor, it's not a surprise that investors would be anticipating the company's IPO (initial public offering). Co-founder Adam Neumann reiterated last summer that the company plans to go public. Before we explore whether WeWork will debut on the market this year, let's take a look at how far the company has come. What is WeWork? Like Airbnb or Uber, WeWork is a company distinctly of the mobile and digital era. The internet has rewritten the norms of white-collar work, allowing those in industries like media, design, and tech to work from anywhere with an internet connection. WeWork capitalized on this shift by creating workspaces that cater to freelancers, entrepreneurs, and small businesses looking for customizable office space and the community that a WeWork hub fosters. The company leases office space and redesigns it to attract millennials in the knowledge economy as well as others looking for office space. Founders Neumann and Miguel McKelvey started a similar business in 2008 in Brooklyn called Green Desk, as an eco-friendly co-working space, which paved the way for WeWork after they sold Green Desk to their landlord in 2010. Making community the central focus, the pair opened the first WeWork in New York in April 2011. Today, WeWork is growing quickly, with its locations nearly tripling over the last year, and the company now has 275 locations across 59 cities around the world. WeWork is even experimenting with living spaces with WeLive, which has two locations, one in New York and one in Washington, D.C., and offers short- or long-term housing in community-focused microapartments based on the WeWork model. Will 2018 be the year? While the company has given no indication of when it would issue an IPO, there's good reason to think that WeWork could go public this year. The company said last year it would reach the key milestone of $1 billion in annual revenue, nearly doubling sales from a year ago. The stock market -- especially tech stocks -- is strong, as the Nasdaq jumped 28% last year and the valuation of the S&P 500 is as high as it's been since the tech bubble. Not all IPOs have been winners lately, as Snap Inc and Blue Apron were duds, but WeWork looks like the type of company -- the clear leader in a new industry and demonstrably profitable on a unit basis -- that would generate excitement among investors. When it opens a new location, WeWork says it takes a loss for the first few months, but locations are generally profitable nine months after opening. 
Having just scored a $4.4 billion investment from Softbank in August, WeWork may not necessarily need the cash that would come with an IPO, but the company has been expanding fast and spending money to boot with its $850 million purchase of the Lord & Taylor building and its $200 million acquisition of Meetup in November 2017. If the company seeks more such acquisitions, tapping the public markets may be its best option for a cash influx. WeWork may also want to be mindful of the lesson in Uber's travails as a private company; the ride-hailing specialist may have waited too long to go public as it's now saddled with a slew of image problems after a number of scandals, not to mention regulatory challenges. WeWork's reputation in that area remains spotless for now, but it needs to avoid mistakes that would potentially damage that reputation. With the market roaring, investors hungry for new tech IPOs, and a strong economy pushing WeWork's growth, 2018 could be a great year for WeWork stock to go public.
Co-working start-up WeWork may go public this year after its valuation rose to $21bn from $18bn in 2017. WeWork acquired a number of smaller companies in the past year, including Meetup and Spacemob, and also bought the former Lord & Taylor building on New York's Fifth Avenue for $850m to serve as its headquarters. Co-founder Adam Neumann confirmed last year that the company intends to go public, and with the stock market performing well for tech companies, 2018 appears a good time.
As noted in The Drive earlier this month, Lockheed Martin is now working with Aerion Corporation to develop a 12-passenger business jet that could fly at Mach 1.4—nearly twice the speed of most commercial airliners. The plane, dubbed the AS2, is expected to begin deliveries in 2025 at a reported cost of $120 million. Aerion announced that, with the right atmospheric conditions, it'll be able to fly at speeds approaching Mach 1.2 without a sonic boom reaching the ground. The jet is in the vanguard of a rebirth in civilian supersonic travel with promises of speed up to Mach 2.2. A Mach number expresses speed as a multiple of the speed of sound in air, which generally decreases with altitude. At sea level, the sound barrier (Mach 1) equates to 761 mph, but at 50,000 feet of elevation, it's 660 mph. Though military planes have broken the sound barrier since the late 1940s, civilian supersonic travel took a couple of decades to catch up. It was achieved from 1976 to 2003 by the Concorde, which topped out at Mach 2.04, and by the Soviet Union's similar-looking Tupolev TU-144, which flew from 1977 to 1983 and could reach Mach 2.15. Earlier supersonic passenger planes failed because of cost and sound levels. They used more fuel and were much more expensive to operate than sub-sonic planes their size. Plus, the sonic boom they created breaking the sound barrier made them abrasively loud to operate over land, leading to a ban on commercial flights hitting supersonic speeds over the continental U.S. There are currently several business jets that can approach Mach 1, including Gulfstream's flagship G650ER, which hits Mach .925, and Cessna's Citation X+, the fastest business jet currently in production with a top speed of Mach .935. Most commercial airliners top out below Mach .9, including Boeing's 747-8, the largest commercial aircraft built in the U.S. The return of viable supersonic flight is great news for business travelers. It'd make the journey from New York to London or Paris into a viable day trip, for example, and would facilitate business between Asia and North America by knocking more than a dozen hours off round-trip travel times. If claimed speeds are accurate, the AS2 is actually the slowest of the three supersonic civilian planes currently under development.
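Using only the speed-of-sound figure quoted above for cruise altitude (about 660 mph at 50,000 feet), the Mach numbers mentioned translate to roughly the following true airspeeds. This is an illustrative conversion, not a manufacturer specification:

# Convert quoted Mach numbers to approximate mph at 50,000 ft, using the
# article's figure for the speed of sound at that altitude.
SPEED_OF_SOUND_AT_50K_FT_MPH = 660

for mach in (1.2, 1.4, 2.2):  # boomless cruise, AS2 top speed, fastest promised rival
    print(f"Mach {mach} is roughly {mach * SPEED_OF_SOUND_AT_50K_FT_MPH:.0f} mph")
# Mach 1.2 ~ 792 mph, Mach 1.4 ~ 924 mph, Mach 2.2 ~ 1452 mph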
Supersonic commercial aviation may be about to make a comeback, as we highlighted last year, with three planes being developed by relatively new companies. Colorado-based Boom Technology is developing a 55-seat Mach 2.2 commercial jet to go on the market by 2023, and has been backed by Japan Airlines and Virgin Group. Boston-based Spike Aerospace is working on an 18-seat business jet capable of Mach 1.6 and Lockheed Martin and Aerion are planning to release a 12-passenger business jet that could reach Mach 1.4. All the planes are being designed to be much quieter than previous supersonic aircraft.
FarmWise, an agricultural robotics and IoT startup, has raised a $5.7 million seed round to commercialize the company's automated weeding robot. FarmWise has developed a weeding robot that uses computer vision to identify weeds and robotics to remove them from vegetable farms without herbicides. While removing weeds with a mechanical motion similar to a garden hoe, the robot also gathers data about the plants. "The machine can drive itself through the field and use cameras and computer vision to understand the field. It can analyze each plant, gathering info such as size, health status, and growth stage," FarmWise founder and CEO Sebastien Boyer told AgFunderNews. The round was led by hardware-focused VC Playground Global with Felicis Ventures, Basis Set Ventures, and food, agriculture and health investor Valley Oak Investments also participating. "Farmwise is using the latest advances in computer vision and deep learning to build an autonomous platform that not only gathers crop-level data but also carries out actual tasks such as precision weeding and thinning, which are crucial to sustainable organic farming. The confluence of data-sensing, machine intelligence, and robotic actuation being smartly applied to an agricultural setting by a fantastic pair of entrepreneurs is what got us excited about partnering with them," said Mario Malave of Playground Global. Boyer said that he and his cofounder and CTO Thomas Palomares became interested in weeding because it is at the intersection of the farm labor crisis and moving away from chemical applications in agriculture. Palomares has a family history of farming in France and he and Boyer have been working with growers in California for a year to hone their user interface. "We want to have a user interface as simple as a washing machine. On the inside, we want the machine to be as smart as possible. On the outside, it should be as simple as possible," said Boyer. The only element that needs to be entered by the user is a GPS "geofence" so that the machine does not go beyond the field it is supposed to be weeding. The rest of the robot's activities are self-directed. "One of the things that got us excited was that this is something that farms need to do throughout the year multiple times, so there's always a use case as opposed to a picking robot," said Niki Pezeshki, vice president at Felicis Ventures. Pezeshki said that finding a pesticide-free and cheaper method of weeding could be a game-changer for organic growers and consumers. "Taking out herbicides really resonated with us too. If you can crack that nut in a scalable way that won't make organic food more expensive, even better," said Pezeshki. Boyer is confident that the economics will work in growers' favor sooner rather than later in part due to targeting vegetable farmers, who generally benefit from higher profit margins and suffer from higher labor costs — setting FarmWise further apart from well-known automated "see and spray" weeding robotics startup Blue River Technologies, which was acquired by John Deere earlier this year. "The cost of weeding for vegetables on a per acre basis is between two and five times higher than for commodities. That makes our solution more economical at the beginning," said Boyer, who anticipates selling the robots as a service, at least initially. 
FarmWise investor Valley Oak Investments is also invested in Farmers Business Network, biological crop input company Marrone Bio Innovations, food delivery startup Caviar, dairy management software platform Farmeron, ag biotech company Hazel Technologies, and food waste reduction software Spoiler Alert. Bruce Leak, inventor of QuickTime and cofounder of WebTV, has joined FarmWise's board of directors. FarmWise is currently signing pilot contracts with some of the largest vegetable growers in California, with pilots beginning in summer 2018, according to Boyer.
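The GPS "geofence" mentioned above is, at its simplest, a point-in-polygon test: the robot keeps working only while its position lies inside the boundary the user has drawn. The Python sketch below is a minimal illustration of that check, not FarmWise's code, and the field coordinates are invented.

# Minimal point-in-polygon geofence check (ray casting); coordinates are
# illustrative latitude/longitude pairs, not a real field boundary.
def inside_geofence(point, fence):
    """Return True if the (lat, lon) point lies inside the polygon 'fence'."""
    lat, lon = point
    inside = False
    n = len(fence)
    for i in range(n):
        lat1, lon1 = fence[i]
        lat2, lon2 = fence[(i + 1) % n]
        # Count how many polygon edges a ray from the point crosses.
        if (lon1 > lon) != (lon2 > lon):
            crossing_lat = lat1 + (lon - lon1) * (lat2 - lat1) / (lon2 - lon1)
            if lat < crossing_lat:
                inside = not inside
    return inside

field = [(36.60, -121.90), (36.60, -121.88), (36.62, -121.88), (36.62, -121.90)]
print(inside_geofence((36.61, -121.89), field))  # True: keep weeding
print(inside_geofence((36.65, -121.89), field))  # False: outside the field, stop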
California-based agriculture technology start-up FarmWise has raised $5.7m to bring its weeding robot to the market. The technology uses computer vision and artificial intelligence (AI) to identify and remove weeds from vegetable farms and gather data about crops. Users simply set the boundaries for the machine, and it gets on with its work. The funding comes from venture capital companies Playground Global, Felicis Ventures, Basis Set Ventures and Valley Oak Investments. The robot can be used throughout the year and can make organic food more affordable by reducing the use of pesticides, the company said.
Online retailer Amazon has secured the patent for a system which would allow it to 3D-print customers' orders in the same trucks used to deliver the items. In 2014, the company set up an online store for custom 3D printing but it is limited to hardware and supplies. If the company continues with the technology, it could lead to quicker delivery times and increased 3D printing on-the-go.
Huawei has announced signing a strategic agreement to build an open mobile artificial intelligence (AI) ecosystem with Chinese search engine giant Baidu. The strategic cooperation agreement covers AI platforms, technology, internet services, and content ecosystems, Huawei said. The open ecosystem will be built using Huawei's HiAI platform and neural network processing unit (NPU); and Baidu's PaddlePaddle deep-learning framework and Baidu Brain, which contains Baidu's AI services and assets. It will allow AI developers to make use of the technology. Under the partnership, Baidu and Huawei will also work on improved voice and image recognition on smart devices, and will build a consumer augmented reality (AR) software and hardware ecosystem. Content and internet services being explored by the two companies will "strengthen cooperation in areas such as search and feed to bring consumers a wealth of quality content with a more intuitive and convenient service experience", Huawei added. "The future is all about smart devices that will actively serve us, not just respond to what we tell them to do," said Huawei Consumer Business Group CEO Richard Yu. "With a strong background in R&D, Huawei will work with Baidu to accelerate innovation in the industry, develop the next generation of smartphones, and provide global consumers with AI that knows you better." Baidu CEO Robin Li said the search giant is "dedicated" to exploring AI, having last week announced the availability of its AMD EPYC-powered AI, big data, and cloud computing (ABC) services. Also working on developing autonomous driving and autonomous vehicles, Baidu is hoping to utilise Huawei's large customer base for the mobile AI project. "Interactive technologies including voice, machine vision, and AI will drive the [mobile phone] industry forward. Originally developed to be personal tools, mobile phones will become a natural extension of the human body and AI-powered assistants for consumers," Huawei added. "Huawei and Baidu will continue to prioritise consumer needs and leverage each other's strengths to form a partnership that benefits everyone." Huawei head of Consumer Software Engineering and director of Intelligence Engineering Felix Zhang had last month said the addition of AI capabilities to smartphones will bring the next shift in technology, comparing AI to the advent of steam engines in terms of its capacity to fundamentally change people's lives. Mobile AI will change two key aspects of the smartphone, he said: User-machine interaction, and "context-personalised openness". The first aspect will improve efficiencies between the user and their phone across text, voice, image, video, and sensors, while the second will actively provide services and aggregated information across apps, content, third-party features, and native features, he explained. "If you look at the whole ecosystem, the AI will fundamentally change the phone from the smartphone to the intelligent phone," Zhang said. Huawei had unveiled its Kirin 970 chipset with built-in AI in September, at the time calling it the "future of smartphones". Its new mobile AI is made up of a combination of on-device AI and cloud AI. "Huawei is committed to developing smart devices into intelligent devices by building end-to-end capabilities that support coordinated development of chips, devices, and the cloud," Yu said at the time. "The ultimate goal is to provide a significantly better user experience. 
The Kirin 970 is the first in a series of new advances that will bring powerful AI features to our devices and take them beyond the competition." Limitations in cloud AI necessitated improvements across latency, stability, and privacy, Huawei said, with on-device AI providing this as well as adding sensor data to the offering. Its new flagship smartphones, the Mate 10 and Mate 10 Pro, come kitted out with the AI-focused Kirin processor, which has the dedicated NPU that is able to process 2,000 images per minute via image-recognition technology. Huawei additionally provided the Kirin 970 as an open platform for mobile AI developers and partners in order to drive further developments. This followed Huawei saying in August that AI would play a critical role in driving its smartphone innovation, with the tech giant predicting the advent of the "superphone" two years ago, saying it would be developed by 2020 and take advantage of advancements in AI, big data, and cloud computing.
Technology firm Huawei has agreed to construct an open mobile artificial intelligence (AI) ecosystem in partnership with Chinese search engine company Baidu. The strategic cooperation will encompass AI technology, internet services, including search, and content ecosystems. The ecosystem will be built using the Huawei HiAI platform and its neural network processing unit, as well as Baidu's PaddlePaddle deep-learning framework and its Baidu Brain AI services. The partnership will also include the creation of better image and voice recognition for smart devices, as well as a consumer augmented reality software and hardware ecosystem.
Traditional finance houses now being seriously challenged in the wealth management market by emerging internet financial companies
Tencent has been granted a licence from the China Securities Regulatory Commission to sell third-party mutual funds directly. The development will allow the company to make better use of its popular messaging app WeChat to sell investment products, analysts have noted. WeChat has close to one billion users. The firm will now have more control over wealth management services offered via WeChat, rather than relying on third-party partners, analysts added. It's also predicted Tencent could seek to turn its payment service, Tenpay, into a full financial service platform. The company previously obtained licences for mobile payments, insurance and micro-financing. 
The legal cannabis sector is expected to generate $40 billion and more than 400,000 jobs by 2021 in the United States, according to a study released Tuesday. The estimate by consulting firm Arcview includes direct purchases by consumers of $20.8 billion and indirect revenue for growers and various subcontractors as well as money spent with businesses not affiliated with the sector, such as supermarkets. The projection would represent a rise of 150 percent on the $16 billion revenue recorded in 2017, according to the study, released the day after recreational use of marijuana became legal in California. Arcview and its partner in the research, BDS Analytics, expect $4 billion in taxes to be generated within three years. The new regime will lead to the creation of nearly 100,000 cannabis industry jobs in California by 2021, about a third of the nationwide figure and 146,000 jobs overall when indirect effects are considered. Customers and operators in California have complained however about the punitive sales taxes to be applied to cannabis and its derivative products, which can hit 35 percent when state, county and municipal levies are taken into account. California, the most populous US state, became the largest legal market for marijuana in the world on Monday, and public reaction to the law change has been enthusiastic, with long lines and stock shortages reported at clinics already licensed and open. Berkeley mayor Jesse Arreguin hailed the reforms at a ceremony on Monday at Berkeley Patients Group, one of the oldest dispensaries in the United States. "I'm stoked about this historic moment, not just for Berkeley, but for the state of California," Arreguin said, praising the state for "embracing this new economy." Cannabis possession remains illegal under federal law, and Arcview's Tom Adams said fewer than 100 out of the 3,000 outlets and delivery services operating in California were ready to go with the required local and state permits. "Those that were generally report doing multiples of their typical day's business with a far more diverse and less experienced customer base that need a lot of hand-holding and educating from their bud-tenders," he added. "We were very cautious in projecting revenue growth from $3 billion to $3.7 billion in this first year of adult-use legality in California, but we'll have to revise that upwards if, as now appears likely, San Francisco and Los Angeles are going to get permits issued more quickly than we expected."
US Attorney General Jeff Sessions has announced he is dismantling recent policies that help states legalise recreational cannabis, causing pot stocks to plunge, just as California began cannabis sales on 1 January. The legal cannabis market in the US had been predicted to be worth $40bn and to create over 400,000 jobs by 2021. That figure was made up of $20bn in direct sales and the same amount in indirect revenues for growers and others. Last year the market was worth $16bn.
L.S. Fan, Distinguished University Professor in Chemical and Biomolecular Engineering at The Ohio State University, holds samples of materials developed in his laboratory that enable clean energy technologies. Credit: Jo McCulty, The Ohio State University. Engineers at The Ohio State University are developing technologies that have the potential to economically convert fossil fuels and biomass into useful products including electricity without emitting carbon dioxide to the atmosphere. In the first of two papers published in the journal Energy & Environmental Science, the engineers report that they've devised a process that transforms shale gas into products such as methanol and gasoline—all while consuming carbon dioxide. This process can also be applied to coal and biomass to produce useful products. Under certain conditions, the technology consumes all the carbon dioxide it produces plus additional carbon dioxide from an outside source. In the second paper, they report that they've found a way to greatly extend the lifetime of the particles that enable the chemical reaction to transform coal or other fuels to electricity and useful products over a length of time that is useful for commercial operation. Finally, the same team has discovered and patented a way with the potential to lower the capital costs in producing a fuel gas called synthesis gas, or "syngas," by about 50 percent over the traditional technology. The technology, known as chemical looping, uses metal oxide particles in high-pressure reactors to "burn" fossil fuels and biomass without the presence of oxygen in the air. The metal oxide provides the oxygen for the reaction. Chemical looping is capable of acting as a stopgap technology that can provide clean electricity until renewable energies such as solar and wind become both widely available and affordable, the engineers said. "Renewables are the future," said Liang-Shih Fan, Distinguished University Professor in Chemical and Biomolecular Engineering, who leads the effort. "We need a bridge that allows us to create clean energy until we get there—something affordable we can use for the next 30 years or more, while wind and solar power become the prevailing technologies." Five years ago, Fan and his research team demonstrated a technology called coal-direct chemical looping (CDCL) combustion, in which they were able to release energy from coal while capturing more than 99 percent of the resulting carbon dioxide, preventing its emission to the environment. The key advance of CDCL came in the form of iron oxide particles which supply the oxygen for chemical combustion in a moving bed reactor. After combustion, the particles take back the oxygen from air, and the cycle begins again. The challenge then, as now, was how to keep the particles from wearing out, said Andrew Tong, research assistant professor of chemical and biomolecular engineering at Ohio State. While five years ago the particles for CDCL lasted through 100 cycles for more than eight days of continuous operation, the engineers have since developed a new formulation that lasts for more than 3,000 cycles, or more than eight months of continuous use in laboratory tests. A similar formulation has also been tested at sub-pilot and pilot plants. "The particle itself is a vessel, and it's carrying the oxygen back and forth in this process, and it eventually falls apart. Like a truck transporting goods on a highway, eventually it's going to undergo some wear and tear. 
And we're saying we devised a particle that can make the trip 3,000 times in the lab and still maintain its integrity," Tong said. This is the longest lifetime ever reported for the oxygen carrier, he added. The next step is to test the carrier in an integrated coal-fired chemical looping process. Another advancement involves the engineers' development of chemical looping for production of syngas, which in turn provides the building blocks for a host of other useful products including ammonia, plastics or even carbon fibers. This is where the technology really gets interesting: It provides a potential industrial use for carbon dioxide as a raw material for producing useful, everyday products. Today, when carbon dioxide is scrubbed from power plant exhaust, it is intended to be buried to keep it from entering the atmosphere as a greenhouse gas. In this new scenario, some of the scrubbed carbon dioxide wouldn't need to be buried; it could be converted into useful products. Taken together, Fan said, these advancements bring Ohio State's chemical looping technology many steps closer to commercialization. He calls the most recent advances "significant and exciting," and they've been a long time coming. True innovations in science are uncommon, and when they do happen, they're not sudden. They're usually the result of decades of concerted effort—or, in Fan's case, the result of 40 years of research at Ohio State. Throughout some of that time, his work has been supported by the U.S. Department of Energy and the Ohio Development Services Agency. "This is my life's work," Fan said. His co-authors on the first paper include postdoctoral researcher Mandar Kathe; undergraduate researchers Abbey Empfield, Peter Sandvik, Charles Fryer, and Elena Blair; and doctoral student Yitao Zhang. Co-authors on the second paper include doctoral student Cheng Chung, postdoctoral researcher Lang Qin, and master's student Vedant Shah. Collaborators on the pressure adjustment assembly work include Tong, Kathe and senior research associate Dawei Wang. The university would like to partner with industry to further develop the technology. The Linde Group, a provider of hydrogen and synthesis gas supply and plants, has already begun collaborating with the team. Andreas Rupieper, the head of Linde Group R&D at Technology & Innovation said that the ability to capture carbon dioxide in hydrogen production plants and use it downstream to make products at a competitive cost "could bridge the transition towards a decarbonized hydrogen production future." He added that "Linde considers Ohio State's chemical looping platform technology for hydrogen production to be a potential alternative technology for its new-built plants". The Babcock & Wilcox Company (B&W), which produces clean energy technologies for power markets, has been collaborating with Ohio State for the past 10 years on the development of the CDCL technology - an advanced oxy-combustion technology for electricity production from coal with nearly zero carbon emissions. David Kraft, Technical Fellow at B&W, stated "The CDCL process is the most advanced and cost-effective approach to carbon capture we have reviewed to date and are committed to supporting its commercial viability through large-scale pilot plant design and feasibility studies. With the continued success of collaborative development program with Ohio State, B&W believes CDCL has potential to transform the power and petrochemical industries." More information: Mandar Kathe et al. 
Utilization of CO2 as a partial substitute for methane feedstock in chemical looping methane–steam redox processes for syngas production, Energy & Environmental Science (2017). DOI: 10.1039/C6EE03701A. Cheng Chung et al. Chemically and physically robust, commercially-viable iron-based composite oxygen carriers sustainable over 3000 redox cycles at high temperatures for chemical looping applications, Energy & Environmental Science (2017). DOI: 10.1039/C7EE02657A.
Scientists at Ohio State University are creating new technologies that will convert fossil fuels into products, such as electricity, without emitting CO2. The engineers have developed a process through which shale gas is turned into methanol and gasoline, all while consuming carbon dioxide. Under the right conditions, the technology is capable of consuming all the CO2 in the process, and the researchers have found a way to make the process commercially viable.
The tale illustrates the productivity lessons – and skills – Australian construction can take from the local car manufacturing industry. Traditional ways It wasn't always like that, however. Probuild, like many of its competitors in the industry, was suffering under the traditional ways of working and needed to change. It hired engineers with experience of the automotive production chain, who pointed out vast inefficiencies. "We had huge issues with that supply chain," says Luke Stambolis, Probuild's managing director for Victoria. "There wasn't enough time during the day to get materials up." Change required overturning established ways of operating. The first part involved working more closely with suppliers and understanding their processes. For a company used to working with a supplier on a lump-sum payment basis – which put all the financial risk on to the supplier – the head contractor sought to take on some of that risk itself. Probuild's 62-storey Empire Melbourne residential tower under construction for client Mammoth Empire. "It became less of a dictatorship and more of a collaborative approach," Grant says. On Empire Melbourne, the company had worked with the same plumber for 10 years, but had never set foot in his factory, asked about his processes or consulted with him on how to improve things. "We didn't even know where they were getting their materials from," Grant says. Took on board But when they asked the plumber's opinion for a better way to deliver supplies – instead of having products delivered in bulk, cut to size on site and installed – the plumber came up with a stillage, a frame that fitted into the hoists used to lift materials. The plumber delivered materials precut, in the stillage, which he would take back empty and refill. The completed 62-storey Empire Melbourne tower. "It was something we took on board and spread through the entire site. All subcontractors ended up adopting that model for delivery," Grant says. The process cut the volume of waste material needing to be removed from the site by 3000 cubic metres – saving $150,000 in shipping costs alone. For the contractor it resulted in a saving of between 2 per cent and 5 per cent on its bills to subcontractors. Another bottleneck was moving people and materials around. The site had three lifts for people and materials and at peak times, such as the start and end of a shift, it could take an hour to move the 300 people clocking on or off at the same time. But by putting the three lifts on a timetable – with some express to certain floors while others were milk runs to every floor – and showing the waiting workers on screens which lifts were going where, they sped things up. Measure performance "We knew we could load our whole building at peak time in 22 minutes," he says. "If we have control over materials through the warehouse and we're able to get workers to the floor as quickly as we can and know when materials are being dropped off, we know we can build as efficiently and productively as possible." The two-year-old process will keep evolving and the contractor is now trying to make better use of data to measure performance. "I'd like to think by this time next year I'd be able to show in real time how a project's tracking, even how a particular level on a project is tracking," Grant says. "That's where we could be improving, for sure."
Australian building company Probuild has learned from the assembly line process of car manufacturing to significantly speed up the construction of a residential tower in Melbourne, enabling its completion ahead of schedule. Probuild focused on having more control over its supply chain to reduce delays and also on moving workers around the site more efficiently to reduce downtime. It also worked more closely with suppliers so that materials could be delivered pre-cut, which reduced assembly time. The firm relied on data analysis to create further efficiencies and intends to apply what it has learned to future projects.
Most of the new supply to come up in Tseung Kwan O, Tai Po and Tuen Mun
About 35,000 new flats are going on the market in Hong Kong this year, with flats costing less than HKD6m ($768,000) expected to be the most popular with first-time buyers. However, prices are forecast to remain stubbornly high this year, with predictions of further rises of 10% to 20%. Last year, flats near stations on the city's mass transit railway experienced the biggest increases, of between 15% and 20%. Apartment sales volumes are expected to be 5% higher than in 2017.
A new bill passed in the City Council on Dec. 19, 2017 will require that a two-year study be completed to determine how feasible it would be to use renewable fuels and technology to power the city's ferries. Earlier this year, Mayor Bill de Blasio announced that he would expand ferry service to include routes in Astoria, Rockaway and South Brooklyn. Now, a new bill aims to make this expansion more environmentally friendly. Introduced by Astoria Councilman Costa Constantinides, Int. No. 54-A will result in a study to analyze what alternative fuels can be used to power the motorized watercraft. Alternative fuels like biodiesel and hybrid electric, battery electric and fuel-cell electric technologies will all be considered in the study, which should be completed no later than Dec. 31, 2019. The study would also require a review of the types and classes of ferries used, their compatibility with the alternative fuels and alternative fuel technologies, the availability of the fuels and technologies and other issues such as storage and regulatory requirements. The most commonly used fuel, petroleum diesel fuel, generates greenhouse gases when it is burned, as well as harmful pollutants such as sulfur dioxide. The exhaust released by petroleum fuel can also cause respiratory illnesses such as asthma and lung disease. According to the Staten Island Advance, the Staten Island Ferry began to use liquefied natural gas to power one of its ferries in 2013 as opposed to low-sulfur diesel, which was done to reduce carbon dioxide emissions and cut fuel costs. "Over the past four years, our city has made environmental protection a priority – whether through ending our reliance on fossil fuels, cleaning our air quality, building sustainable transit habits, or encouraging use of renewable energy," Constantinides said in a statement. "Int. 54 will help increase use of renewable fuel in one of our city's most sustainable transit options – our ferries. As use of our citywide ferry system has grown exponentially, we must innovate the type of energy we use to fuel the boats." Another bill introduced by Constantinides also passed City Council on Dec. 19. It requires power plant operators in the city to stop burning dirty grades of fuel oil to power their plants sooner than originally proposed.
New York City Council has commissioned a two-year study to determine the feasibility of using renewable fuels and technology to power the city’s ferries. Alternative fuels such as biodiesel and hybrid electric, battery electric and fuel-cell electric technologies will all be studied to assess which options are most compatible with the various types and classes of ferries the city uses.
AMD is big winner from chip flaw fiasco as more than $11 billion in Intel stock value is wiped out Investors are piling into AMD shares and selling Intel stock after major chip security vulnerabilities were revealed earlier this week, and it totally makes sense. Enterprises will likely diversify their chip security architecture risk for mission-critical applications by buying more AMD server chips. The company's chips, thanks to "architecture differences", have proven immune to the more problematic of the two disclosed vulnerabilities. British tech website The Register reported Tuesday that some Intel processors have a "fundamental design flaw" and security issue, which spurred the company to confirm the problem later in the day. AMD shares are up 10.4 percent in the two days through Thursday following the report, while Intel's stock declined 5.2 percent in the period, wiping out $11.3 billion of shareholder value. One of the two vulnerabilities, called Meltdown, affects Intel processors. The other, named Spectre, could affect chips from Intel, AMD and Arm. Intel said Wednesday that performance degradation after security updates for Meltdown "should not be significant" for the average user. But on a call with investors, the company admitted a decrease in performance of up to 30 percent was possible after fixes under some "synthetic workloads." Bank of America Merrill Lynch told clients the big Intel performance hits were "likely for enterprise and server workloads." On the flip side, AMD said any performance hits will be "negligible" after Spectre-related security software updates and there is "near zero risk of exploitation." The company also confirmed it is not affected by Meltdown due to processor "architecture differences." Researchers and Apple said Spectre is more difficult to exploit. Multiple Wall Street analysts predicted AMD will take advantage of Intel's security issues. AMD could use it as "a marketing edge given differing architectures and no vulnerability yet," Mizuho Securities analyst Vijay Rakesh wrote in a note to clients Wednesday. Intel's high-profit data-center business, which sells server chips to cloud computing providers and enterprises, is the chipmaker's crown jewel. Rakesh noted that Intel had 99 percent market share of the data-center market, representing a huge opportunity for AMD. Analysts estimate that Intel's data-center group will generate $18.5 billion in sales and $7.4 billion in operating profit in 2017, according to FactSet. "Longer-term customers could be more motivated to find alternatives at AMD and possibly ARM (CAVM benefits) to diversify the architectural risks," Bank of America Merrill Lynch analyst Vivek Arya wrote Thursday. "AMD appears poised to be the most direct beneficiary." An AMD gain of significant market share in the server market is not unprecedented. The company hit 25 percent share in 2006. If AMD is able to reach 10 percent or 15 percent market share of the data-center business, it could add billions in revenue to the company's financial results. Any increase will be a boon for AMD because the Wall Street consensus for the company's 2017 estimated sales is just $5.25 billion. One leading tech industry analyst says the chipmaker will do just that.
"The news of Intel's processor security issue and the potential performance degradation to correct it comes at an inopportune time as Intel currently faces heavy competitive pressure from its long-time nemesis, AMD," Fred Hickey, editor of High Tech Strategist, wrote in an email Thursday. "AMD's new line of chips is a significant challenger for the first time in many years (since AMD's Opteron chip days)." AMD launched new line Epyc data-center processors to much fanfare last June with design wins at cloud computing providers Microsoft Azure and Baidu. "For Intel, it likely means loss of market share (lower revenues) as well as loss of pricing power (lower gross margins) as the advantage shifts to the buyers and away from Intel, which has totally dominated the PC/computer server processor market in recent years," Hickey said. "AMD's new processor chips already had momentum and that momentum will likely be propelled further by the recent security issue disclosures."
Intel has seen more than $11bn wiped off its market value following a report by The Register of two vulnerabilities in its microprocessors that could allow malware to access data such as stored passwords. In contrast, shares in rival Advanced Micro Devices (AMD) rose 10% after the company said there was a "near zero risk of exploitation" despite its chips being affected by one of the vulnerabilities. Computers using Intel chips from the past 10 years could be affected, including those running Microsoft Windows and Apple OS X. AMD now has a chance to eat into Intel's 99% share of the data-centre market.
OneRNA™: Making RNA Sequencing Actionable in Oncology "This is an amazing project that delivers on our mission to save lives and increase the quality of life for the patients while reducing the cost of care," said Gitte Pedersen, CEO of Genomic Expression. The OneRNA4Bladder project aims to save lives and halve the cost of care for bladder-cancer treatment. Bladder cancer is the most expensive cancer to treat due to expensive diagnostic procedures and ineffective drugs resulting in a recurrence rate of 70%. The OneRNA4Bladder project validates a urine-based diagnostic and liquid-biopsy platform that, in a smaller 500-patient study, has already been shown to outperform cystoscopy in terms of sensitivity and specificity. For this reason, it may replace the invasive, expensive cystoscopy procedure in the diagnostic workup for bladder cancer and for the identification of recurrence. To our knowledge, no other technology has successfully met this endpoint. Most other non-invasive tests add cost to the management of the disease instead of reducing cost simply because, instead of replacing cystoscopy, the test can only provide a supplement to it. The new immune checkpoint inhibitors have recently been approved for bladder cancer. However, with a low overall response rate of 15%, this will soon become a very expensive option, driving up cost. Genomic Expression has developed a way to sequence RNA quickly and inexpensively and then link the statistically significant changes in tumor RNA to drugs that could provide more durable responses. This technology is called OneRNA™, and, applied as a liquid-biopsy platform, it can quickly and effectively diagnose cancer, select treatment, and measure response and recurrence. Overall, this solution will guide patients toward more effective treatments, reduce the number of cystoscopies and biopsies, identify patients with a higher likelihood of responding to new, expensive immunotherapy treatments, and, finally, reduce the cost of bladder-cancer treatment. Bladder cancer is the sixth most common cancer globally, with an estimated 357,000 new cases worldwide every year. Its high incidence coupled with its relapsing nature result in the highest lifetime treatment costs per patient of all cancers, accounting for almost $3.7 billion/year in direct costs. A significant part of this cost burden stems from routine use of highly invasive, expensive procedures such as cystoscopy in diagnosis and recurrence monitoring. Replacing routine cystoscopy with the OneRNA4Bladder system could cut the cost of diagnosing and managing bladder cancer in half. Perhaps even more striking, the costs associated with an untimely death due to bladder cancer (i.e., the "value" of life lost) approach $7 billion annually. OneRNA4Bladder is expected to significantly minimize the bladder-cancer burden on health care. The market opportunity for OneRNA4Bladder is $4.5 billion. Genomic Expression can start commercializing various components of the technologies as tools for clinical development of drugs; for example, OneRNA™ is already commercially available as a tool to stratify patients into clinical trials. However, once the OneRNA4Bladder clinical study is completed, the market for the solution as a diagnostic platform is estimated at $4.5 billion. About Genomic Expression Genomic Expression is finding the best drug for the patient and the best patient for the drug by sequencing RNA and linking changes due to cancer to drugs through its proprietary algorithms and databases.
Analyzing RNA allows us to tell if a tumor will respond to the new immune therapies, which are the only kind of therapies that are potential cures. Right now, only one out of four cancer treatments prolongs life. We spend $100 billion on drugs every year, and eight million patients die. Genomic Expression was started as the diagnostic partner in the $32M Danish “Genome Denmark.” The company now has four in-house clinical programs established in selected cancers with clear unmet needs. The OneRNA™ test can be used on any type of cancer.
Massachusetts-based biotech Genomic Expression has developed the OneRNA4Bladder project, a low-cost method of diagnosing bladder cancer which also halves the cost of care. The OneRNA test, which can be used to diagnose several types of cancer, offers a urine-based diagnostic and liquid biopsy, instead of a traditionally invasive and high-cost cystoscopy. Upon diagnosis, the platform also enables treatment selection, as well as monitoring response and recurrence. Treating bladder cancer in the US costs around $3.7bn per year.
Urban Jungle, a London-based FinTech startup revamping insurance for young people, has secured €1.1 million in seed funding from a group of top angel investors, in order to build a better home insurance experience for the growing population of renters. The financing round was led by Rob Devey, ex-CEO of Prudential UK and HBOS insurance. This capital injection will support the startup's plans to launch several new products, including one targeted at the growing number of house and flat shares. Urban Jungle was founded in 2016. The founders Jimmy Williams and Greg Smyth looked at the insurance industry, and noted both that poor tech was making it difficult to buy and use, and that young people were being excluded from insurance because it was failing to keep pace with their changing lifestyles. They believe they can transform the insurance experience by completely rebuilding the tech stack, using modern technologies, for example through smartphone-first design, and using machine learning to improve risk scoring. Jimmy Williams commented: "This investment marks a big step towards our mission to build an insurance provider that customers love. Through 2017, we focused on building a business that our customers can rely on 100% when something goes wrong. In 2018 we'll vastly increase the choice of products for our customers, and make all of our products even easier to use." Rob Devey, who led both this and the business's previous round of funding, commented: "I'm very excited about the huge opportunity to transform the insurance industry using technology, and think this team is superbly well placed to capitalise on that. They have proven their ability to execute to a very high quality over the past year, and I'm confident that they will only accelerate from here."
UK fintech Urban Jungle has raised €1.1m ($1.3m) in seed funding towards its tech-driven insurance service for young renters. The investment, led by Rob Devey, former CEO of Prudential UK and HBOS insurance, will help the London-based start-up increase the choice of products available to its customers, such as one targeted at house and flat shares. Urban Jungle was founded in 2016 by Jimmy Williams and Greg Smyth, who felt young people were excluded from the insurance market as it failed to keep up with their changing lifestyles.
RANCHO CUCAMONGA, Calif. — Devon Rising shakes his head and tries to cover his face with his hands. It’s time to get his few remaining teeth cleaned, and he fusses for a bit. Gita Aminloo, his dental hygienist, tries to calm him by singing “Itsy Bitsy Spider,” the classic children’s song. Rising, 42, is mentally disabled and blind. He has cerebral palsy and suffers from seizures. It’s hard for him to get to a dentist’s office, so Aminloo brought her dental picks, brushes and other tools to him at the residential care facility he shares with several other people who have developmental disabilities. Rising is among a vulnerable class of patients who are poor and so frail they can’t leave the nursing home or, in his case, the board-and-care home to visit dentists. Instead, they rely on specially trained dental hygienists like Aminloo, who come to them. But this may be the last time Aminloo cleans Rising’s teeth. And it’s not because of his resistance. Hygienists say some of their patients are no longer getting the critical dental care they need because of recent policy changes: The state dramatically slashed payment to providers and created a preauthorization process they call cumbersome. In 2016, Denti-Cal, the publicly funded dental program for the poor, cut the rate for a common cleaning procedure for these fragile patients from $130 to $55. Hygienists say they can’t afford to continue treating many of them for that kind of money. They also claim that half of their requests to perform the cleanings are rejected — an assertion not supported by state data. The Department of Health Care Services, which runs Denti-Cal, said it made the changes to bring the program’s reimbursement policy in line with other states and to reduce “unnecessary dental treatment.” But Aminloo insists the new state regulations victimize the most vulnerable people, who she said are losing their access to routine dental care. “If these patients don’t get preventive oral care, their overall health is going to suffer,” she warned. Dental hygienists are generally allowed to practice without the direct supervision of a dentist in 40 states, including Nevada, Texas, Colorado, Michigan and Florida. But the type of patients they can see varies by state. So do reimbursement and preauthorization rules. Washington state’s Medicaid program pays providers $46 for a similar cleaning procedure, said Anita Rodriguez, a member of the Washington State Dental Hygienists’ Association. Hygienists there don’t have to obtain preauthorization to perform cleanings, but they are required to explain why the cleaning was necessary when they bill Medicaid. “Our state makes access for our independent hygienists relatively uncomplicated though, like other Medicaid providers, we make pennies on the dollar for our care,” she said. Since California reduced payments for “maintenance” cleanings for these patients — usually performed every three months to treat gum disease — many hygienists have stopped seeing them. Eight hygienists, including Aminloo, filed a lawsuit in Los Angeles County Superior Court in 2016, arguing that the health care services department cut the reimbursement rate without first obtaining necessary federal approval. At one point, it appeared as if the department had agreed to settle and cancel its rate change but then backed out, court documents show. The department said it will not comment on pending litigation. 
At the time of the rate reduction, the state also started requiring dental hygienists to obtain prior authorization to treat gum disease in patients who live in special care facilities. Hygienists must submit X-rays along with their authorization requests. But they say it’s almost impossible to take decent X-rays of elderly or disabled patients who have a hard time controlling their head movements, or who refuse to open their mouths widely. When hygienists do manage to get X-rays, their requests are often denied anyway, hygienists from across the state told California Healthline. In a letter to the state legislature last year, the California Dental Hygienists’ Association wrote that more than half of their authorization requests had been denied since the change. “Denti-Cal’s sweeping new rules are destroying the lives of fragile patients and the women who own small businesses providing care at the bedside,” the letter said. But state statistics show a much lower denial rate. From the time the change took effect in July 2016 through June 2017, the health care services department approved 10,000 of nearly 13,000 deep cleanings requested by these dental hygienists to treat gum infections, according to the data. It also approved 31,300 of the nearly 33,000 requests for routine cleanings that follow a deep cleaning. The state said it paid more than $2.5 million to dental hygienists for these procedures. Darla Dale, a hygienist in Eureka and a vice president of the hygienists association, said the department’s denial numbers don’t reflect what her organization is seeing. “There’s no way that’s true,” Dale said. “We’re in contact with these hygienists. … Many have stopped working because we can’t spend our lives trying to get authorization.” Darci Trill, a hygienist working in Alameda and Contra Costa counties, is among those who stopped seeing patients in nursing homes after denial letters piled up. “I lost about 70 percent of my Denti-Cal clients,” she said. State health officials pointed to the American Academy of Periodontology, which considers the new authorization guidelines standard, including X-rays to diagnose gum disease. An April 2016 report by the Little Hoover Commission, an independent state watchdog agency, said the state health services department found it “unusual” that nearly 88,000 out of 100,000 Denti-Cal-eligible patients in nursing homes had received deep cleanings during the 2013-14 fiscal year. This figure and other factors raised “questions about their necessity ­— and hence the new policy requiring X-ray documentation,” the report said. In frail patients, advanced gum disease can cause not only tooth loss, but pneumonia and other respiratory issues, Trill said. Maureen Titus, a hygienist in the San Luis Obispo area, said her clients rely entirely on caregivers for their dental hygiene, and that brushing and flossing is neither easy nor effective. “Most have bleeding gums, inflamed gums and tartar buildup,” she said. Among patients who are attached to feeding tubes, tartar builds up quickly because they don’t chew their food, Aminloo said. “After two or three months, you can’t even see their teeth.” The independent practice of dental hygienists in California dates to 1997, when the state legislature allowed them, with additional training and certification, to work without the direct supervision of dentists. Some started their own mobile businesses. 
This is the first time in the intervening 20 years that they’ve had to obtain preauthorization to perform dental cleanings, Trill said. The California Dental Association, which represents dentists, said dentists have long been required to get prior approval for cleanings for patients in special care facilities. “We supported the department’s decision to equalize requirements for periodontal services, regardless of whether a dentist or hygienist provides the service,” said Alicia Malaby, the association’s spokeswoman. Dr. Leon Assael, the director of community-based education and practice at the University of California-San Francisco’s School of Dentistry, said preauthorization requirements in other states, including Minnesota and Kentucky, where he used to work, have also delayed or limited care for homebound patients. The requirements have driven providers out of the system, he said, leaving patients behind. “If this were toes being lost, this would be a scandal,” Assael said, “but with teeth, it’s been accepted.”
Recent policy changes in California have endangered the dental care of a growing number of patients who are too frail to leave their nursing homes to visit a dentist. The state has slashed payments to dental care providers and hygienists and created a cumbersome preauthorization process, which is hindering the delivery of dental care under Denti-Cal. Denti-Cal, the publicly funded program for the poor, reduced the rate for a common cleaning procedure from $130 to $55, which hygienists say leaves them unable to continue treating many of these patients. The Department of Health Care Services says it made the changes to reduce unnecessary treatment, but hygienists argue that the new regulations victimize the most vulnerable patients. While dental hygienists are allowed to practice without direct supervision in 40 states, the types of patients they can see, reimbursement and preauthorization rules vary by state. Since California reduced payments for maintenance cleanings, many hygienists have stopped seeing these patients, and eight of them filed a lawsuit arguing that the rate was cut without the necessary federal approval. At the same time as the rate cut, the state began requiring X-rays with preauthorization requests, which are especially difficult to take with disabled and elderly patients, and hygienists say the requests are often denied anyway. State data tells a different story: between July 2016 and June 2017, some 10,000 of almost 13,000 requests for deep cleanings were approved, and the state paid hygienists more than $2.5 million for these procedures. These figures are disputed by hygienists, one of whom says she lost 70 percent of her Denti-Cal patients because of the changes. This is the first time since dental hygienists were permitted to practice independently 20 years ago that preauthorization has been required for cleanings.
Belfast is set to see its first build-to-rent (BTR) residential scheme following the submission of a planning application for a 19-storey apartment building in the city's Cathedral Quarter. Should it get the go-ahead, the proposed £15m development by joint venture partners Lacuna and Watkin Jones would see the delivery of 105 one- and two-bed apartments on a site now occupied by a derelict building and surface car park. Build-to-rent has been gaining traction in Dublin for some time now, with numerous developments either under way or in the process of being planned by major real estate companies such as Ires Reit, Kennedy Wilson, Hines Ireland, Marlet Property Group and the Cosgrave Property Group. The increasing appetite for long-term rental over home ownership is being driven by a range of economic and societal factors. Quite apart from the shortage of housing supply and the relative dearth of mortgage finance, a recent report by real estate agents CBRE pointed to the changing nature of Ireland's demographics and living preferences. CBRE noted that the proportion of renters here grew by 4.7pc in the five-year period between 2011 and 2016, to 497,111 households - or nearly 30pc of the population. Looking at occupancy by age group, the report found younger people have a higher propensity to rent, with 65pc of the Dublin population aged 25-39 renting from a landlord. In the same age segment, 26pc of people own their home with the remainder renting from a local authority.
A 19-storey apartment building in Belfast's Cathedral Quarter could become the city's first build-to-rent (BTR) scheme if submitted plans are given approval. The £15m ($20m) development by Lacuna and Watkin Jones is for 105 flats, and would meet an increasing appetite for renting in Northern Ireland. In Dublin, 65% of 25 to 39 year-olds are in rented accommodation, while several companies, including Ires Reit and Marlet Property Group, are planning BTR developments.
Scientists teach robots how to respect personal space by Staff Writers Beijing, China (SPX) Jan 02, 2018 Robots have a lot to learn about humans, including how to respect their personal space. Scientists at the Institute of Automatics of the National University of San Juan in Argentina are giving mobile robots a crash course in avoiding collisions with humans. The researchers published their methods in IEEE/CAA Journal of Automatica Sinica (JAS), a joint publication of the IEEE and Chinese Association of Automation. "Humans respect social zones during different kind[s] of interactions," wrote Daniel Herrera, a postdoctoral researcher at the Institute of Automatics of the National University of San Juan and an author on the study. He notes how the specifics of a task and situation, as well as cultural expectations and personal preferences, influence the distance of social zones. "When a robot follows a human as part of a formation, it is supposed that it must also respect these social zones to improve its social acceptance." Using impedance control, the researchers aimed to regulate the social dynamics between the robot's movements and the interactions of the robot's environment. They did this by first analyzing how a human leader and a human follower interact on a set track with well-defined borders. The feedback humans use to adjust their behaviors - letting someone know they're following too closely, for example - was marked as social forces and treated as defined physical fields. The human interactions (leading and following), including the estimated social forces, were fed to a mobile robot. The programmed robot then followed the human within the same defined borders, but without impeding on the social forces defined by the human interactions. "Under the hypothesis that moving like human will be acceptable by humans, it is believed that the proposed control improves the social acceptance of the robot for this kind of interaction," wrote Herrera. The researchers posit that robots are more likely to be accepted if they can be programmed to respect and respond like humans in social interactions. In this experiment, the robot mimicked the following human, and avoided the leader's personal space. "The results show that the robot is capable of emulating the previously identified impedance and, consequently, it is believed that the proposed control can improve the social acceptance by being able to imitate this human-human dynamic behavior."
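The paper's exact formulation is not reproduced in the article, but the core idea of an impedance (spring-damper) term that tracks a desired following distance, combined with a repulsive "social force" that grows sharply inside the leader's personal zone, can be sketched in a few lines. The one-dimensional simplification, gains and distances below are illustrative assumptions, not the authors' parameters.

```python
# Illustrative 1-D sketch of a follower robot that respects a leader's
# personal space: an impedance (spring-damper) term tracks a nominal
# following gap, while an exponential "social force" pushes the robot
# back if it intrudes on the personal zone. All values are assumptions.
import math

K_IMP, B_IMP = 1.2, 0.8       # impedance stiffness and damping (assumed)
A_SOC, B_SOC = 3.0, 0.15      # social-force amplitude and range (assumed)
FOLLOW_GAP = 1.2              # nominal following distance in metres (assumed)
PERSONAL_ZONE = 0.8           # leader's personal-space radius (assumed)
DT = 0.05                     # control period in seconds

def follower_accel(gap, gap_rate):
    """Acceleration command: impedance tracking plus social repulsion."""
    impedance = K_IMP * (gap - FOLLOW_GAP) + B_IMP * gap_rate
    social = -A_SOC * math.exp((PERSONAL_ZONE - gap) / B_SOC)  # repulsive
    return impedance + social

# Toy simulation: the leader walks at 1 m/s and the robot starts 3 m behind.
leader_x, robot_x, robot_v = 3.0, 0.0, 0.0
for _ in range(400):
    leader_x += 1.0 * DT
    gap = leader_x - robot_x          # distance to the leader
    gap_rate = 1.0 - robot_v          # how fast that distance is changing
    robot_v += follower_accel(gap, gap_rate) * DT
    robot_x += robot_v * DT
print(f"settled gap: {leader_x - robot_x:.2f} m "
      f"(nominal {FOLLOW_GAP} m; the social term keeps it slightly wider)")
```

Run as a script, the follower closes in on the leader but settles at a distance a little beyond the nominal gap, because the repulsive term never lets it press into the personal zone; that trade-off is the behaviour the study tries to make robots imitate.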
Researchers in Argentina have developed systems to teach robots to respect humans' personal space. The team at the National University of San Juan devised techniques to replicate the social signals given by humans when they feel their personal space is being invaded. These were then set as defined physical fields for the robot as it followed a human's movements. The experiments showed the robot was able to successfully emulate the behaviour and movement of a human. Scientists claim this will help increase future social acceptance of robots.
BEIJING, Jan. 3 (Xinhua) -- Land sales increased in Chinese cities last year as the government moved to cool the market with higher supply, according to the China Index Academy, a property research organization. Land sales in 300 Chinese cities totaled 950.36 million square meters in 2017, up 8 percent from 2016, while sales of land for residential projects reached 354.33 million square meters, an increase of 24 percent year on year. Land sales in major cities like Beijing, Shanghai and Guangzhou were particularly robust, as local governments increased land supply to cool down runaway house prices fueled by huge demand and limited supply. In China's first-tier cities, land sales jumped 46 percent year on year to 29.79 million square meters last year, according to China Index Academy. Boosted by surging sales, revenue from land transactions rose 36 percent to 4.01 trillion yuan (about 620 billion U.S. dollars) in a total of 300 Chinese cities. China's property market, once deemed a major risk for the broader economy, cooled in 2017 amid tough curbs such as purchase restrictions and increased downpayment requirements as the government sought to rein in speculation. Due to these efforts, both investment and sales in China's property sector slowed. Real estate investment rose 7.5 percent year on year during January-November, down from 7.8 percent in the first 10 months. Property sales in terms of floor area climbed 7.9 percent in the first 11 months, retreating from 8.2 percent in January-October. With the market holding steady, Chinese authorities are aiming for a "long-term mechanism" for real estate regulation, and a housing system that ensures supply through multiple sources and encourages both housing purchases and rentals. A report from the National Academy of Economic Strategy predicted that the country's property market would remain stable in 2018 if there were no major policy shocks.
Land sales in China's first-tier cities such as Beijing, Shanghai and Guangzhou surged by 46% year on year to 30 million sq metres in 2017, according to the China Index Academy. The academy also revealed that land sales in 300 Chinese cities rose by 8% to 950 million sq metres and sales of land for residential purposes rose 24% to 354 million sq metres. The government has been increasing land supply in order to put a brake on rising house prices. New purchase restrictions and requirements for higher deposits have also had an effect in slowing price rises.
Venture capital firm Life Sciences Partners (LSP) has secured €280m in funding for new medical technologies, in what it describes as the largest fund in Europe dedicated exclusively to healthcare innovation. The money will be invested in private companies developing forward-thinking medical devices, diagnostics and digital health products. Specifically, it will target technologies with the potential to improve the quality of patient care while simultaneously keeping healthcare costs under control. According to the investment group, the LSP Health Economics Fund 2 was oversubscribed and exceeded both its target size and its original ‘hard’ cap. Speaking to Digital Health News, Rudy Dekeyser, LSP partner, said: “In the digital health space we have a keen interest in proprietary and scalable products with both a clear potential to improve the prevention, diagnosis and treatment of major diseases and a straightforward impact on cost reduction for the healthcare system.” Focused on Europe and the US, the fund will look to invest in around 15 private companies. The products of these companies will need to be “on the market or very close to market introduction”. Dekeyser cited drug compliance, remote monitoring, big data analytics and clinical software as areas of particular interest. He added that companies hoping for a share of the fund would have to “convince us that there is a clear path towards the integration of their innovative product in the complicated healthcare ecosystem, has to know who will pay for their product or services and should have access to the necessary partners for broad implementation of their product in the market.” Investors in the fund include the European Investment Fund and a variety of health insurance companies and institutional investors. Dr René Kuijten, managing partner of LSP, added: “With this new and sizable fund, we have now firmly established our health economics and medical technology strategy.” Early in December, the UK government formed a partnership with the life sciences industry as part of a pledge to boost advancements in medical technology in Britain. Earlier in December, Wayra UK and Merck Sharp & Dohme announced a £68,000 healthcare accelerator programme aimed at machine learning start-ups.
European investment firm Life Sciences Partners has raised €280m ($337m) for the LSP Health Economics Fund 2, which aims to support firms developing medical devices which also help reduce healthcare costs. The fund, which the company said was oversubscribed, will invest in around 15 private firms that have products relating to remote monitoring, drug compliance and big data analytics "on the market or very close to market introduction", according to LSP partner Rudy Dekeyser.
In New York state, the City of New York is set to spend the next two years analyzing the use of alternative fuels to power the city’s ferries following a successful City Council vote in mid-December. The study will look at biodiesel and hybrid electric, battery electric and fuel-cell electric technologies, and will also evaluate new ferries for their compatibility with these cleaner fuels. The study should be submitted to the city council by December 31, 2019.
New York City Council has commissioned a two-year study to determine the feasibility of using renewable fuels and technology to power the city’s ferries. Alternative fuels such as biodiesel and hybrid electric, battery electric and fuel-cell electric technologies will all be studied to assess which options are most compatible with the various types and classes of ferries the city uses.
Fluenta has announced the launch of its new range of ultrasonic flow transducers. The two new transducers enable ultrasonic measurement of gas flow in highly challenging environments, providing accurate measurement at temperatures from -200°C to +250°C, across pipe diameters from 6 to 72 in., and in gas mixes of up to 100% methane or 100% carbon dioxide, depending on pipe diameter. The high-temperature transducer can accurately measure gas flow at up to +250°C, allowing Fluenta’s ultrasonic technology to be deployed in a wider range of flare applications, as well as in the chemical processing industry. The cryogenic transducer is designed to work in processes as cold as -200°C, typically found in the liquefied natural gas (LNG) industry and other gas liquefaction and chemical processes. New software and signal processing allows these transducers to function in processes containing up to 100% methane or 100% carbon dioxide, gas mixes which historically have presented challenges to standard ultrasonic flow meters. Fluenta’s non-intrusive transducers do not interrupt gas flow and can be used across a wide range of pipe diameters from 6 in. to 72 in. The new range of transducers and software are compatible with Fluenta’s FGM160, and can be fitted to existing installations. With government regulation of flare gas emissions monitoring becoming increasingly strict, companies are under pressure to accurately measure and record gas flow. “These new transducers greatly increase the capacity of Fluenta to meet the needs of our existing customers, and to move into new markets such as chemical processing and liquefied natural gas,” comments Sigurd Aase, CEO of Fluenta. “We are investing significantly in ultrasonic gas flow monitoring, and this product launch provides customers with an unrivalled combination of flexibility, reliability and accuracy in gas flow monitoring.”
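The announcement does not explain the measurement principle itself, but generic transit-time ultrasonic flow metering, the approach commonly used by instruments of this kind, is simple to illustrate: a pulse travelling with the gas arrives sooner than one travelling against it, and the timing difference yields the axial velocity independently of the speed of sound. The pipe geometry and timing values in the sketch below are illustrative assumptions, not Fluenta specifications.

```python
# Generic transit-time ultrasonic flow calculation (not Fluenta's proprietary
# algorithm): pulses cross the pipe diagonally, faster with the flow than
# against it, and the timing difference gives the axial gas velocity.
# Pipe geometry and example timings are illustrative assumptions.
import math

D = 0.30                      # pipe inner diameter in metres (about 12 in., assumed)
THETA = math.radians(45)      # acoustic path angle to the pipe axis (assumed)
L = D / math.sin(THETA)       # diagonal path length across the pipe

def axial_velocity(t_up, t_down):
    """Gas velocity along the pipe axis from up/downstream transit times."""
    return L / (2 * math.cos(THETA)) * (t_up - t_down) / (t_up * t_down)

def volumetric_flow(v, k_profile=1.0):
    """Flow rate in m^3/s; k_profile corrects path velocity to mean velocity."""
    area = math.pi * (D / 2) ** 2
    return k_profile * v * area

# Example: transit times consistent with ~430 m/s sound speed and ~10 m/s flow.
t_down = L / (430 + 10 * math.cos(THETA))
t_up = L / (430 - 10 * math.cos(THETA))
v = axial_velocity(t_up, t_down)
print(f"axial velocity: {v:.2f} m/s, volumetric flow: {volumetric_flow(v):.3f} m^3/s")
```

Because the speed of sound cancels out of the velocity formula, the method stays accurate as gas composition and temperature change, which is one reason ultrasonic meters suit variable flare and LNG streams.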
Fluenta has launched two ultrasonic flow transducers that enable measurement of gas flow in extremely high and low temperatures. The high-temperature transducer measures gas up to 250C, while the cryogenic transducer functions down to minus 200C, targeting the liquefied natural gas (LNG) sector. The sensors can work in environments of up to 100% methane or 100% carbon dioxide, gas mixes that historically have presented challenges to standard ultrasonic flow meters.